2007-01-01
countries in developing market nations in Asia (such as Korea, Taiwan, Singapore, Malaysia, China and Vietnam). The competition for the knowledge, economic... Intel, Infineon Technologies, STMicroelectronics, Samsung Electronics, Texas Instruments, AMD Spansion, Philips Semiconductor, Freescale... Samsung ($19.7B), #5 Toshiba ($9.8B), #6 TSMC ($9.7B), #7 Hynix ($8.0B) and #8 Renesas ($7.9B) (McGrath, 2007, p. 3). Samsung, headquartered in
2007-01-01
late 1980s, Korean firms began to compete globally on memory chips, with Samsung earning a sales profit in 1987 (Pecht, 1997, p. 10; Mathews, 2000, p...competitive in the 1990s (Lee, 1997, p. 41). Singapore, Malaysia and China have since developed significant chip industries (Beane, 1997, p. 9; Pecht...sales in parentheses): #2 Samsung ($19.7B), #5 Toshiba ($9.8B), #6 TSMC ($9.7B), #7 Hynix ($8.0B) and #8 Renesas ($7.9B) (McGrath, 2007, p. 3
OSCAR API for Real-Time Low-Power Multicores and Its Performance on Multicores and SMP Servers
NASA Astrophysics Data System (ADS)
Kimura, Keiji; Mase, Masayoshi; Mikami, Hiroki; Miyamoto, Takamichi; Shirako, Jun; Kasahara, Hironori
OSCAR (Optimally Scheduled Advanced Multiprocessor) API has been designed for real-time embedded low-power multicores to generate parallel programs for various multicores from different vendors by using the OSCAR parallelizing compiler. The OSCAR API has been developed by Waseda University in collaboration with Fujitsu Laboratory, Hitachi, NEC, Panasonic, Renesas Technology, and Toshiba in a METI/NEDO project entitled "Multicore Technology for Realtime Consumer Electronics." By using the OSCAR API as an interface between the OSCAR compiler and backend compilers, the OSCAR compiler enables hierarchical multigrain parallel processing with memory optimization under capacity restriction for cache memory, local memory, distributed shared memory, and on-chip/off-chip shared memory; data transfer using a DMA controller; and power reduction control using DVFS (Dynamic Voltage and Frequency Scaling), clock gating, and power gating for various embedded multicores. In addition, a parallelized program automatically generated by the OSCAR compiler with the OSCAR API can be compiled by ordinary OpenMP compilers, since the OSCAR API is designed as a subset of OpenMP. This paper describes the OSCAR API and its compatibility with the OSCAR compiler by showing code examples. Performance evaluations of the OSCAR compiler and the OSCAR API are carried out using an IBM Power5+ workstation, an IBM Power6 high-end SMP server, and RP2, a newly developed consumer electronics multicore chip by Renesas, Hitachi and Waseda. The scalability evaluation shows that, on average, the OSCAR compiler with the OSCAR API achieves a 5.8-times speedup over sequential execution on the Power5+ workstation with eight cores and a 2.9-times speedup on RP2 with four cores. In addition, the OSCAR compiler accelerates the IBM XL Fortran compiler by up to 3.3 times on the Power6 SMP server. Thanks to low-power optimization on RP2, the OSCAR compiler with the OSCAR API achieves a maximum power reduction of 84% in the real-time execution mode.
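Because the OSCAR API is described above as a subset of OpenMP, a plain OpenMP program already illustrates the kind of output a backend compiler must accept. The sketch below is not generated by the OSCAR compiler and uses no real OSCAR API directive names; the power-control hint is a hypothetical placeholder mentioned only in a comment.

```c
/* Minimal sketch, not OSCAR compiler output: two independent coarse-grain
 * tasks expressed with plain OpenMP, the subset the OSCAR API builds on. */
#include <omp.h>
#include <stdio.h>

#define N 1024
static double a[N], b[N], c[N];

static void task_scale(void)  { for (int i = 0; i < N; i++) b[i] = 0.5 * a[i]; }
static void task_offset(void) { for (int i = 0; i < N; i++) c[i] = a[i] + 1.0; }

int main(void)
{
    for (int i = 0; i < N; i++) a[i] = (double)i;

    /* Coarse-grain tasks mapped onto different cores. A compiler following the
     * OSCAR approach could additionally attach hints here, e.g. "put idle cores
     * into a low-frequency state" (hypothetical wording, not an actual directive). */
    #pragma omp parallel sections
    {
        #pragma omp section
        task_scale();

        #pragma omp section
        task_offset();
    }

    printf("b[1]=%.1f c[1]=%.1f\n", b[1], c[1]);
    return 0;
}
```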
NASA Astrophysics Data System (ADS)
Hayashi, Akihiro; Wada, Yasutaka; Watanabe, Takeshi; Sekiguchi, Takeshi; Mase, Masayoshi; Shirako, Jun; Kimura, Keiji; Kasahara, Hironori
Heterogeneous multicores have been attracting much attention as a way to attain high performance while keeping power consumption low in a wide range of areas. However, heterogeneous multicores impose very difficult programming on developers, and the resulting long application development periods lower product competitiveness. In order to overcome this situation, this paper proposes a compilation framework which bridges the gap between programmers and heterogeneous multicores. In particular, this paper describes a compilation framework based on the OSCAR compiler. It realizes coarse-grain task parallel processing, data transfer using a DMA controller, and power reduction control from user programs with DVFS and clock gating on various heterogeneous multicores from different vendors. This paper also evaluates processing performance and power reduction by the proposed framework on RP-X, a newly developed 15-core heterogeneous multicore chip integrating 8 general-purpose processor cores and 3 types of accelerator cores, developed by Renesas Electronics, Hitachi, Tokyo Institute of Technology and Waseda University. The framework attains speedups of up to 32x for an optical flow program using eight general-purpose processor cores and four DRP (Dynamically Reconfigurable Processor) accelerator cores against sequential execution on a single processor core, and an 80% power reduction for real-time AAC encoding.
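To make the idea of mapping coarse-grain tasks onto heterogeneous resources concrete, here is a small conceptual sketch. It is not the OSCAR framework's scheduler; the task names, cost numbers, and the "DRP" label are hypothetical illustrations of how a compiler-generated schedule might assign work.

```c
/* Conceptual sketch only: choosing a target core for each coarse-grain task
 * from a static cost table. Names, costs, and the "DRP" accelerator label are
 * hypothetical; this is not the OSCAR scheduler. */
#include <stdio.h>

enum target { CPU_CORE, DRP_ACCEL };

struct task {
    const char *name;
    double cpu_cycles;   /* estimated cost on a general-purpose core */
    double drp_cycles;   /* estimated cost on a DRP-style accelerator */
};

static enum target pick_target(const struct task *t)
{
    /* Prefer the accelerator only when its estimated cost is clearly lower. */
    return (t->drp_cycles < 0.8 * t->cpu_cycles) ? DRP_ACCEL : CPU_CORE;
}

int main(void)
{
    struct task tasks[] = {
        { "optical_flow_block", 1.0e6, 2.0e5 },
        { "bitstream_packing",  4.0e4, 9.0e4 },
    };
    for (int i = 0; i < 2; i++)
        printf("%s -> %s\n", tasks[i].name,
               pick_target(&tasks[i]) == DRP_ACCEL ? "DRP accelerator" : "CPU core");
    return 0;
}
```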
47 CFR 54.680 - Validity of electronic signatures.
Code of Federal Regulations, 2013 CFR
2013-10-01
... 47 Telecommunication 3 2013-10-01 2013-10-01 false Validity of electronic signatures. 54.680... Validity of electronic signatures. (a) For the purposes of this subpart, an electronic signature (defined by the Electronic Signatures in Global and National Commerce Act, as an electronic sound, symbol, or...
47 CFR 54.680 - Validity of electronic signatures.
Code of Federal Regulations, 2014 CFR
2014-10-01
... 47 Telecommunication 3 2014-10-01 2014-10-01 false Validity of electronic signatures. 54.680... Validity of electronic signatures. (a) For the purposes of this subpart, an electronic signature (defined by the Electronic Signatures in Global and National Commerce Act, as an electronic sound, symbol, or...
Lesson 5: Defining Valid Electronic Signatures
A valid electronic signature on an electronic document is one that is created with an electronic signature device that is uniquely entitled to a signatory, not compromised, and used by a signatory who is authorized to sign the electronic document.
47 CFR 54.419 - Validity of electronic signatures.
Code of Federal Regulations, 2013 CFR
2013-10-01
... 47 Telecommunication 3 2013-10-01 2013-10-01 false Validity of electronic signatures. 54.419... electronic signatures. (a) For the purposes of this subpart, an electronic signature, defined by the Electronic Signatures in Global and National Commerce Act, as an electronic sound, symbol, or process...
47 CFR 54.419 - Validity of electronic signatures.
Code of Federal Regulations, 2012 CFR
2012-10-01
... 47 Telecommunication 3 2012-10-01 2012-10-01 false Validity of electronic signatures. 54.419... electronic signatures. (a) For the purposes of this subpart, an electronic signature, defined by the Electronic Signatures in Global and National Commerce Act, as an electronic sound, symbol, or process...
47 CFR 54.419 - Validity of electronic signatures.
Code of Federal Regulations, 2014 CFR
2014-10-01
... 47 Telecommunication 3 2014-10-01 2014-10-01 false Validity of electronic signatures. 54.419... electronic signatures. (a) For the purposes of this subpart, an electronic signature, defined by the Electronic Signatures in Global and National Commerce Act, as an electronic sound, symbol, or process...
NASA Electronic Parts and Packaging Program
NASA Technical Reports Server (NTRS)
Kayali, Sammy
2000-01-01
NEPP program objectives are to: (1) Assess the reliability of newly available electronic parts and packaging technologies for usage on NASA projects through validations, assessments, and characterizations, and the development of test methods/tools; (2) Expedite infusion paths for advanced (emerging) electronic parts and packaging technologies by evaluations of readiness for manufacturability and project usage consideration; (3) Provide NASA projects with technology selection, application, and validation guidelines for electronic parts and packaging hardware and processes; and (4) Retain and disseminate electronic parts and packaging quality assurance, reliability validations, tools, and availability information to the NASA community.
Validation of asthma recording in electronic health records: a systematic review
Nissen, Francis; Quint, Jennifer K; Wilkinson, Samantha; Mullerova, Hana; Smeeth, Liam; Douglas, Ian J
2017-01-01
Objective To describe the methods used to validate asthma diagnoses in electronic health records and summarize the results of the validation studies. Background Electronic health records are increasingly being used for research on asthma to inform health services and health policy. Validation of the recording of asthma diagnoses in electronic health records is essential to use these databases for credible epidemiological asthma research. Methods We searched EMBASE and MEDLINE databases for studies that validated asthma diagnoses detected in electronic health records up to October 2016. Two reviewers independently assessed the full text against the predetermined inclusion criteria. Key data including author, year, data source, case definitions, reference standard, and validation statistics (including sensitivity, specificity, positive predictive value [PPV], and negative predictive value [NPV]) were summarized in two tables. Results Thirteen studies met the inclusion criteria. Most studies demonstrated a high validity using at least one case definition (PPV >80%). Ten studies used a manual validation as the reference standard; each had at least one case definition with a PPV of at least 63%, up to 100%. We also found two studies using a second independent database to validate asthma diagnoses. The PPVs of the best performing case definitions ranged from 46% to 58%. We found one study which used a questionnaire as the reference standard to validate a database case definition; the PPV of the case definition algorithm in this study was 89%. Conclusion Attaining high PPVs (>80%) is possible using each of the discussed validation methods. Identifying asthma cases in electronic health records is possible with high sensitivity, specificity or PPV, by combining multiple data sources, or by focusing on specific test measures. Studies testing a range of case definitions show wide variation in the validity of each definition, suggesting this may be important for obtaining asthma definitions with optimal validity. PMID:29238227
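For readers less familiar with the validation statistics summarized above, the following minimal sketch shows how sensitivity, specificity, PPV and NPV fall out of a 2x2 table comparing a database case definition with a reference standard. The counts are invented for illustration and are not taken from any of the thirteen included studies.

```c
/* Illustrative only: validation statistics from a 2x2 table comparing an
 * electronic-record case definition against a reference standard. */
#include <stdio.h>

int main(void)
{
    double tp = 90.0, fp = 10.0, fn = 15.0, tn = 185.0; /* hypothetical counts */

    double sensitivity = tp / (tp + fn);   /* true cases the algorithm catches  */
    double specificity = tn / (tn + fp);   /* non-cases correctly excluded      */
    double ppv         = tp / (tp + fp);   /* flagged records that are true     */
    double npv         = tn / (tn + fn);   /* unflagged records that are clear  */

    printf("sensitivity=%.2f specificity=%.2f PPV=%.2f NPV=%.2f\n",
           sensitivity, specificity, ppv, npv);
    return 0;
}
```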
A Validity and Reliability Study of the Basic Electronics Skills Self-Efficacy Scale (BESS)
ERIC Educational Resources Information Center
Korkmaz, Ö.; Korkmaz, M. K.
2016-01-01
The aim of this study is to improve a measurement tool to evaluate the self-efficacy of Electrical-Electronics Engineering students through their basic electronics skills. The sample group is composed of 124 Electrical-Electronics engineering students. The validity of the scale is analyzed with two different methods through factor analysis and…
NASA Astrophysics Data System (ADS)
Hidayati, A.; Rahmi, A.; Yohandri; Ratnawulan
2018-04-01
The importance of teaching materials that match the characteristics of students was the main reason for developing a Basic Electronics I module that integrates character values based on the conceptual change teaching model. The module development in this research follows Plomp's development procedure, which includes preliminary research, a prototyping phase and an assessment phase. In the first year of this research, the module was validated. Content validity was judged from the conformity of the module with development theory and the demands of the learning model's characteristics. Construct validity was judged from the linkage and consistency of each developed module component with the characteristics of the learning model integrating character values, as assessed by validators. The average validation score assigned by the validators falls into the very valid category. Based on the validator assessment, the Basic Electronics I module integrating character values based on the conceptual change teaching model was then revised.
Experimental investigations into visual and electronic tooth color measurement.
Ratzmann, Anja; Treichel, Anja; Langforth, Gabriele; Gedrange, Tomasz; Welk, Alexander
2011-04-01
The present study aimed to examine the validity of visual color assessment and of an electronic tooth color measurement system (Shade Inspector™) in comparison with a gold standard. Additionally, the reproducibility of the electronic measurements was demonstrated by means of two reference systems. Ceramic specimens of two thicknesses (h=1.6 mm, h=2.6 mm) were used. Three experienced dental technicians using the VITAPAN Classical® color scale carried out all visual tests. Validity of the visual assessment and of the electronic measurements was confirmed separately for both thicknesses by means of lightness and hue of the VITAPAN Classical® color scale. Reproducibility of the electronic measurements was confirmed by means of the VITAPAN Classical® and 3D-Master® scales. The 3D-Master® data were calculated according to lightness, hue and chroma. The intraclass correlation coefficient (ICC) was used to assess validity/reproducibility for lightness and chroma; Kappa statistics were used for hue. A level ≥0.75 was pre-established for the ICC and ≥0.60 for the Kappa index. Results of the visual color assessment: validity for lightness was good for both thicknesses; agreement rates for hue were inconsistent. Electronic measurement: validity for lightness was fair to good, and hue values were below 0.60. Reproducibility of lightness was good to very good for both reference systems. Hue values (VITAPAN Classical®) were above 0.60 for the 1.6 mm test specimens and below 0.60 for the 2.6 mm specimens; Kappa values for 3D-Master® were ≥0.60 for all measurements, and reproducibility of chroma was very good. Validity was better for visual than for electronic color assessment. Reproducibility of the electronic device (Shade Inspector™) was demonstrated for both the VITAPAN Classical® and 3D-Master® systems.
Prognostics of Power Electronics, Methods and Validation Experiments
NASA Technical Reports Server (NTRS)
Kulkarni, Chetan S.; Celaya, Jose R.; Biswas, Gautam; Goebel, Kai
2012-01-01
Failure of electronic devices is a concern for future electric aircraft that will see an increase of electronics to drive and control safety-critical equipment throughout the aircraft. As a result, investigation of precursors to failure in electronics and prediction of the remaining life of electronic components is of key importance. DC-DC power converters are power electronics systems typically employed as sourcing elements for avionics equipment. Current research efforts in prognostics for these power systems focus on the identification of failure mechanisms and the development of accelerated aging methodologies and systems to accelerate the aging process of test devices while continuously measuring key electrical and thermal parameters. Preliminary model-based prognostics algorithms have been developed making use of empirical degradation models and physics-inspired degradation models, with focus on key components such as electrolytic capacitors and power MOSFETs (metal-oxide-semiconductor field-effect transistors). This paper presents current results on the development of validation methods for prognostics algorithms for power electrolytic capacitors, particularly the use of accelerated aging systems for algorithm validation. Validation of prognostics algorithms presents difficulties in practice due to the lack of run-to-failure experiments in deployed systems. By using accelerated experiments, we circumvent this problem in order to define initial validation activities.
Zbrozek, Arthur; Hebert, Joy; Gogates, Gregory; Thorell, Rod; Dell, Christopher; Molsen, Elizabeth; Craig, Gretchen; Grice, Kenneth; Kern, Scottie; Hines, Sheldon
2013-06-01
Outcomes research literature has many examples of high-quality, reliable patient-reported outcome (PRO) data entered directly by electronic means, ePRO, compared to data entered from original results on paper. Clinical trial managers are increasingly using ePRO data collection for PRO-based end points. Regulatory review dictates the rules to follow with ePRO data collection for medical label claims. A critical component for regulatory compliance is evidence of the validation of these electronic data collection systems. Validation of electronic systems is a process versus a focused activity that finishes at a single point in time. Eight steps need to be described and undertaken to qualify the validation of the data collection software in its target environment: requirements definition, design, coding, testing, tracing, user acceptance testing, installation and configuration, and decommissioning. These elements are consistent with recent regulatory guidance for systems validation. This report was written to explain how the validation process works for sponsors, trial teams, and other users of electronic data collection devices responsible for verifying the quality of the data entered into relational databases from such devices. It is a guide on the requirements and documentation needed from a data collection systems provider to demonstrate systems validation. It is a practical source of information for study teams to ensure that ePRO providers are using system validation and implementation processes that will ensure the systems and services: operate reliably when in practical use; produce accurate and complete data and data files; support management control and comply with any existing regulations. Furthermore, this short report will increase user understanding of the requirements for a technology review leading to more informed and balanced recommendations or decisions on electronic data collection methods. Copyright © 2013 International Society for Pharmacoeconomics and Outcomes Research (ISPOR). Published by Elsevier Inc. All rights reserved.
Validation of ISS Floating Potential Measurement Unit Electron Densities and Temperatures
NASA Technical Reports Server (NTRS)
Coffey, Victoria N.; Minow, Joseph I.; Parker, Linda N.; Bui, Them; Wright, Kenneth, Jr.; Koontz, Steven L.; Schneider, T.; Vaughn, J.; Craven, P.
2007-01-01
Validation of the Floating Potential Measurement Unit (FPMU) electron density and temperature measurements is an important step in the process of evaluating International Space Station spacecraft charging issues, including vehicle arcing and hazards to crew during extravehicular activities. The highest potentials observed on Space Station are due to the combined VxB effects on a large spacecraft and the collection of ionospheric electron and ion currents by the 160 V US solar array modules. Ionospheric electron environments are needed as input to the ISS spacecraft charging models used to predict the severity and frequency of occurrence of ISS charging hazards. Validation of these charging models requires comparing their predictions with measured FPMU values. Of course, the FPMU measurements themselves must also be validated independently for use in manned flight safety work. This presentation compares electron densities and temperatures derived from the FPMU Langmuir probes and Plasma Impedance Probe against independent density and temperature measurements from ultraviolet imagers, ground-based incoherent scatter radar, and ionosonde sites.
Newton, Katherine M; Peissig, Peggy L; Kho, Abel Ngo; Bielinski, Suzette J; Berg, Richard L; Choudhary, Vidhu; Basford, Melissa; Chute, Christopher G; Kullo, Iftikhar J; Li, Rongling; Pacheco, Jennifer A; Rasmussen, Luke V; Spangler, Leslie; Denny, Joshua C
2013-06-01
Genetic studies require precise phenotype definitions, but electronic medical record (EMR) phenotype data are recorded inconsistently and in a variety of formats. To present lessons learned about validation of EMR-based phenotypes from the Electronic Medical Records and Genomics (eMERGE) studies. The eMERGE network created and validated 13 EMR-derived phenotype algorithms. Network sites are Group Health, Marshfield Clinic, Mayo Clinic, Northwestern University, and Vanderbilt University. By validating EMR-derived phenotypes we learned that: (1) multisite validation improves phenotype algorithm accuracy; (2) targets for validation should be carefully considered and defined; (3) specifying time frames for review of variables eases validation time and improves accuracy; (4) using repeated measures requires defining the relevant time period and specifying the most meaningful value to be studied; (5) patient movement in and out of the health plan (transience) can result in incomplete or fragmented data; (6) the review scope should be defined carefully; (7) particular care is required in combining EMR and research data; (8) medication data can be assessed using claims, medications dispensed, or medications prescribed; (9) algorithm development and validation work best as an iterative process; and (10) validation by content experts or structured chart review can provide accurate results. Despite the diverse structure of the five EMRs of the eMERGE sites, we developed, validated, and successfully deployed 13 electronic phenotype algorithms. Validation is a worthwhile process that not only measures phenotype performance but also strengthens phenotype algorithm definitions and enhances their inter-institutional sharing.
ERIC Educational Resources Information Center
Floro, Josh N.; Dunton, Genevieve F.; Delfino, Ralph J.
2009-01-01
Convergent validity of accelerometer and electronic diary physical activity data was assessed in children with asthma. Sixty-two participants, ages 9-18 years, wore an accelerometer and reported their physical activity level in quarter-hour segments every 2 hr using the Ambulatory Diary Assessment (ADA). Moderate validity was found between…
ERIC Educational Resources Information Center
Institute of Electrical and Electronics Engineers, Inc., New York, NY.
The Institute of Electrical and Electronics Engineers (IEEE) validation program is designed to motivate persons practicing in electrical and electronics engineering to pursue quality technical continuing education courses offered by any responsible sponsor. The rapid acceptance of the validation program necessitated the additional development of a…
Hediger, Hannele; Müller-Staub, Maria; Petry, Heidi
2016-01-01
Electronic nursing documentation systems, with standardized nursing terminology, are IT-based systems for recording the nursing processes. These systems have the potential to improve the documentation of the nursing process and to support nurses in care delivery. This article describes the development and initial validation of an instrument (known by its German acronym UEPD) to measure the subjectively-perceived benefits of an electronic nursing documentation system in care delivery. The validity of the UEPD was examined by means of an evaluation study carried out in an acute care hospital (n = 94 nurses) in German-speaking Switzerland. Construct validity was analyzed by principal components analysis. Initial references of validity of the UEPD could be verified. The analysis showed a stable four factor model (FS = 0.89) scoring in 25 items. All factors loaded ≥ 0.50 and the scales demonstrated high internal consistency (Cronbach's α = 0.73 – 0.90). Principal component analysis revealed four dimensions of support: establishing nursing diagnosis and goals; recording a case history/an assessment and documenting the nursing process; implementation and evaluation as well as information exchange. Further testing with larger control samples and with different electronic documentation systems are needed. Another potential direction would be to employ the UEPD in a comparison of various electronic documentation systems.
Faurholt-Jepsen, Maria; Munkholm, Klaus; Frost, Mads; Bardram, Jakob E; Kessing, Lars Vedel
2016-01-15
Various paper-based mood charting instruments are used in the monitoring of symptoms in bipolar disorder. During recent years an increasing number of electronic self-monitoring tools have been developed. The objectives of this systematic review were 1) to evaluate the validity of electronic self-monitoring tools as a method of evaluating mood compared to clinical rating scales for depression and mania and 2) to investigate the effect of electronic self-monitoring tools on clinically relevant outcomes in bipolar disorder. A systematic review of the scientific literature, reported according to the Preferred Reporting Items for Systematic Reviews and Meta-Analyses (PRISMA) guidelines, was conducted. MEDLINE, Embase, PsycINFO and The Cochrane Library were searched and supplemented by hand search of reference lists. Databases were searched for 1) studies on electronic self-monitoring tools in patients with bipolar disorder reporting on the validity of electronically self-reported mood ratings compared to clinical rating scales for depression and mania and 2) randomized controlled trials (RCTs) evaluating electronic mood self-monitoring tools in patients with bipolar disorder. A total of 13 published articles were included. Seven articles were RCTs and six were longitudinal studies. Electronic self-monitoring of mood was considered valid compared to clinical rating scales for depression in six out of six studies, and in two out of seven studies compared to clinical rating scales for mania. The included RCTs primarily investigated the effect of heterogeneous electronically delivered interventions; none of the RCTs investigated the sole effect of electronic mood self-monitoring tools. Methodological issues with risk of bias at different levels limited the evidence in the majority of studies. Electronic self-monitoring of mood in depression appears to be a valid measure of mood, in contrast to self-monitoring of mood in mania. There are as yet few studies on the effect of electronic self-monitoring of mood in bipolar disorder. The evidence for electronic self-monitoring is limited by methodological issues and by a lack of RCTs. Although the idea of electronic self-monitoring of mood seems appealing, studies using rigorous methodology investigating the beneficial as well as possible harmful effects of electronic self-monitoring are needed.
Code of Federal Regulations, 2011 CFR
2011-01-01
...-on requests, from individuals (including individuals in control groups) under treatment or clinical... electronic format), to place the currently valid OMB control number on the front page of the collection of... valid OMB control number in the instructions, near the title of the electronic collection instrument, or...
Code of Federal Regulations, 2012 CFR
2012-01-01
...-on requests, from individuals (including individuals in control groups) under treatment or clinical... electronic format), to place the currently valid OMB control number on the front page of the collection of... valid OMB control number in the instructions, near the title of the electronic collection instrument, or...
ESTEST: A Framework for the Verification and Validation of Electronic Structure Codes
NASA Astrophysics Data System (ADS)
Yuan, Gary; Gygi, Francois
2011-03-01
ESTEST is a verification and validation (V& V) framework for electronic structure codes that supports Qbox, Quantum Espresso, ABINIT, the Exciting Code and plans support for many more. We discuss various approaches to the electronic structure V& V problem implemented in ESTEST, that are related to parsing, formats, data management, search, comparison and analyses. Additionally, an early experiment in the distribution of V& V ESTEST servers among the electronic structure community will be presented. Supported by NSF-OCI 0749217 and DOE FC02-06ER25777.
Pageler, Natalie M; Grazier G'Sell, Max Jacob; Chandler, Warren; Mailes, Emily; Yang, Christine; Longhurst, Christopher A
2016-09-01
The objective of this project was to use statistical techniques to determine the completeness and accuracy of data migrated during electronic health record conversion. Data validation during migration consists of mapped record testing and validation of a sample of the data for completeness and accuracy. We statistically determined a randomized sample size for each data type based on the desired confidence level and error limits. The only error identified in the post go-live period was a failure to migrate some clinical notes, which was unrelated to the validation process. No errors in the migrated data were found during the 12-month post-implementation period. Compared to the typical industry approach, we have demonstrated that a statistical approach to sampling size for data validation can ensure consistent confidence levels while maximizing efficiency of the validation process during a major electronic health record conversion. © The Author 2016. Published by Oxford University Press on behalf of the American Medical Informatics Association. All rights reserved. For Permissions, please email: journals.permissions@oup.com.
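The abstract does not give the exact sample-size calculation used, but a common choice for validating a proportion at a given confidence level and error limit is the normal-approximation formula with a finite-population correction. The sketch below is a generic illustration under that assumption; the parameter values are invented.

```c
/* Illustrative sketch of a sample-size calculation for record validation,
 * using the standard formula for estimating a proportion. This is a generic
 * approximation, not the formula published by the study. */
#include <math.h>
#include <stdio.h>

/* n = z^2 * p * (1 - p) / e^2, optionally corrected for a finite population N */
static double sample_size(double z, double p, double e, double population)
{
    double n = z * z * p * (1.0 - p) / (e * e);
    if (population > 0.0)
        n = n / (1.0 + (n - 1.0) / population);   /* finite-population correction */
    return ceil(n);
}

int main(void)
{
    /* 95% confidence (z = 1.96), assumed 5% error rate, 2% margin, 50,000 records */
    printf("records to sample: %.0f\n", sample_size(1.96, 0.05, 0.02, 50000.0));
    return 0;
}
```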
Ahmed, Adil; Vairavan, Srinivasan; Akhoundi, Abbasali; Wilson, Gregory; Chiofolo, Caitlyn; Chbat, Nicolas; Cartin-Ceba, Rodrigo; Li, Guangxi; Kashani, Kianoush
2015-10-01
Timely detection of acute kidney injury (AKI) facilitates prevention of its progress and potentially therapeutic interventions. The study objective is to develop and validate an electronic surveillance tool (AKI sniffer) to detect AKI in 2 independent retrospective cohorts of intensive care unit (ICU) patients. The primary aim is to compare the sensitivity, specificity, and positive and negative predictive values of AKI sniffer performance against a reference standard. This study is conducted in the ICUs of a tertiary care center. The derivation cohort study subjects were Olmsted County, MN, residents admitted to all Mayo Clinic ICUs from July 1, 2010, through December 31, 2010, and the validation cohort study subjects were all patients admitted to a Mayo Clinic, Rochester, campus medical/surgical ICU from January 12, 2010, through March 23, 2010. All included records were reviewed by 2 independent investigators who adjudicated AKI using the Acute Kidney Injury Network criteria; disagreements were resolved by a third reviewer. This constituted the reference standard. An electronic algorithm was developed; its precision and reliability were assessed in comparison with the reference standard in 2 separate cohorts, derivation and validation. Of 1466 screened patients, a total of 944 patients were included in the study: 482 for derivation and 462 for validation. Compared with the reference standard in the validation cohort, the sensitivity and specificity of the AKI sniffer were 88% and 96%, respectively. The Cohen κ (95% confidence interval) agreement between the electronic algorithm and the reference standard was 0.84 (0.78-0.89) and 0.85 (0.80-0.90) in the derivation and validation cohorts. Acute kidney injury can reliably and accurately be detected electronically in ICU patients. The presented method is applicable for both clinical (decision support) and research (enrollment for clinical trials) settings. Prospective validation is required. Copyright © 2015 Elsevier Inc. All rights reserved.
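The agreement statistic reported above is Cohen's kappa. As a reminder of how it is computed from a 2x2 table, here is a minimal sketch; the counts are hypothetical and not the study's data.

```c
/* Illustrative Cohen's kappa for agreement between an electronic algorithm and
 * a reference standard on a 2x2 table; counts are hypothetical. */
#include <stdio.h>

static double cohens_kappa(double a, double b, double c, double d)
{
    /* a = both positive, b = algorithm+/reference-,
     * c = algorithm-/reference+, d = both negative */
    double n  = a + b + c + d;
    double po = (a + d) / n;                                        /* observed agreement */
    double pe = ((a + b) * (a + c) + (c + d) * (b + d)) / (n * n);  /* chance agreement   */
    return (po - pe) / (1.0 - pe);
}

int main(void)
{
    printf("kappa = %.3f\n", cohens_kappa(80.0, 8.0, 11.0, 363.0));
    return 0;
}
```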
Validation of asthma recording in electronic health records: protocol for a systematic review.
Nissen, Francis; Quint, Jennifer K; Wilkinson, Samantha; Mullerova, Hana; Smeeth, Liam; Douglas, Ian J
2017-05-29
Asthma is a common, heterogeneous disease with significant morbidity and mortality worldwide. It can be difficult to define in epidemiological studies using electronic health records as the diagnosis is based on non-specific respiratory symptoms and spirometry, neither of which are routinely registered. Electronic health records can nonetheless be valuable to study the epidemiology, management, healthcare use and control of asthma. For health databases to be useful sources of information, asthma diagnoses should ideally be validated. The primary objectives are to provide an overview of the methods used to validate asthma diagnoses in electronic health records and summarise the results of the validation studies. EMBASE and MEDLINE will be systematically searched using appropriate search terms. The searches will cover all studies in these databases up to October 2016 with no start date and will yield studies that have validated algorithms or codes for the diagnosis of asthma in electronic health records. At least one test validation measure (sensitivity, specificity, positive predictive value, negative predictive value or other) is necessary for inclusion. In addition, we require the validated algorithms to be compared with an external gold standard, such as a manual review, a questionnaire or an independent second database. We will summarise key data including author, year of publication, country, time period, date, data source, population, case characteristics, clinical events, algorithms, gold standard and validation statistics in a uniform table. This study is a synthesis of previously published studies and, therefore, no ethical approval is required. The results will be submitted to a peer-reviewed journal for publication. Results from this systematic review can be used to study outcome research on asthma and can be used to identify case definitions for asthma. CRD42016041798. © Article author(s) (or their employer(s) unless otherwise stated in the text of the article) 2017. All rights reserved. No commercial use is permitted unless otherwise expressly granted.
Lesson 6: Signature Validation
Checklist items 13 through 17 are grouped under the Signature Validation Process, and represent CROMERR requirements that the system must satisfy as part of ensuring that electronic signatures it receives are valid.
ERIC Educational Resources Information Center
Ritzhaupt, Albert D.; Ndoye, Abdou; Parker, Michele A.
2010-01-01
With the explosive growth of e-portfolios in teacher preparation programs, it is essential for administration and other relevant stakeholders to understand the student perspective of e-portfolios' organizational uses. This article describes the validation of the modified Electronic Portfolio Student Perspective Instrument (EPSPI). The analysis…
Establishes the United States Environmental Protection Agency's approach to adopting electronic signature technology and best practices to ensure electronic signatures applied to official Agency documents are legally valid and enforceable
Marceau, Vincent; Varin, Charles; Piché, Michel
2013-03-15
In the study of laser-driven electron acceleration, it has become customary to work within the framework of paraxial wave optics. Using an exact solution to the Helmholtz equation as well as its paraxial counterpart, we perform numerical simulations of electron acceleration with a high-power TM01 beam. For beam waist sizes at which the paraxial approximation was previously recognized as valid, we highlight significant differences in the angular divergence and energy distribution of the electron bunches produced by the exact and the paraxial solutions. Our results demonstrate that extra care has to be taken when working under the paraxial approximation in the context of electron acceleration with radially polarized laser beams.
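For reference, the two descriptions being compared are the scalar Helmholtz equation and its paraxial approximation for a slowly varying envelope u propagating along z (standard textbook forms, not quoted from the paper); the paraxial equation is obtained by neglecting the second derivative of u with respect to z:

```latex
\nabla^{2}E + k^{2}E = 0,
\qquad
E(x,y,z) = u(x,y,z)\,e^{ikz},
\qquad
\frac{\partial^{2}u}{\partial x^{2}} + \frac{\partial^{2}u}{\partial y^{2}} + 2ik\,\frac{\partial u}{\partial z} = 0 .
```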
Tien, M.; Kashyap, R.; Wilson, G. A.; Hernandez-Torres, V.; Jacob, A. K.; Schroeder, D. R.
2015-01-01
Summary Background With increasing numbers of hospitals adopting electronic medical records, electronic search algorithms for identifying postoperative complications can be invaluable tools to expedite data abstraction and clinical research to improve patient outcomes. Objectives To derive and validate an electronic search algorithm to identify postoperative thromboembolic and cardiovascular complications such as deep venous thrombosis, pulmonary embolism, or myocardial infarction within 30 days of total hip or knee arthroplasty. Methods A total of 34 517 patients undergoing total hip or knee arthroplasty between January 1, 1996 and December 31, 2013 were identified. Using a derivation cohort of 418 patients, several iterations of a free-text electronic search were developed and refined for each complication. Subsequently, the automated search algorithm was validated on an independent cohort of 2 857 patients, and the sensitivity and specificities were compared to the results of manual chart review. Results In the final derivation subset, the automated search algorithm achieved a sensitivity of 91% and specificity of 85% for deep vein thrombosis, a sensitivity of 96% and specificity of 100% for pulmonary embolism, and a sensitivity of 100% and specificity of 95% for myocardial infarction. When applied to the validation cohort, the search algorithm achieved a sensitivity of 97% and specificity of 99% for deep vein thrombosis, a sensitivity of 97% and specificity of 100% for pulmonary embolism, and a sensitivity of 100% and specificity of 99% for myocardial infarction. Conclusions The derivation and validation of an electronic search strategy can accelerate the data abstraction process for research, quality improvement, and enhancement of patient care, while maintaining superb reliability compared to manual review. PMID:26448798
Introduction to Electronic Marketing.
ERIC Educational Resources Information Center
Dilbeck, Lettie
These materials for a five-unit course were developed to introduce secondary and postsecondary students to the use of electronic equipment in marketing. The units cover the following topics: electronic marketing as a valid marketing approach; telemarketing; radio electronic media marketing; television electronic media marketing; and cable TV…
Validation of multisource electronic health record data: an application to blood transfusion data.
Hoeven, Loan R van; Bruijne, Martine C de; Kemper, Peter F; Koopman, Maria M W; Rondeel, Jan M M; Leyte, Anja; Koffijberg, Hendrik; Janssen, Mart P; Roes, Kit C B
2017-07-14
Although data from electronic health records (EHR) are often used for research purposes, systematic validation of these data prior to their use is not standard practice. Existing validation frameworks discuss validity concepts without translating these into practical implementation steps or addressing the potential influence of linking multiple sources. Therefore we developed a practical approach for validating routinely collected data from multiple sources and applied it to a blood transfusion data warehouse to evaluate its usability in practice. The approach consists of identifying existing validation frameworks for EHR data or linked data, selecting validity concepts from these frameworks and establishing quantifiable validity outcomes for each concept. The approach distinguishes external validation concepts (e.g. concordance with external reports, previous literature and expert feedback) and internal consistency concepts which use expected associations within the dataset itself (e.g. completeness, uniformity and plausibility). In an example case, the selected concepts were applied to a transfusion dataset and specified in more detail. Application of the approach to a transfusion dataset resulted in a structured overview of data validity aspects. This allowed improvement of these aspects through further processing of the data and in some cases adjustment of the data extraction. For example, the proportion of transfused products that could not be linked to the corresponding issued products was initially 2.2% but could be reduced to 0.17% by adjusting the data extraction criteria. This stepwise approach for validating linked multisource data provides a basis for evaluating data quality and enhancing interpretation. When the process of data validation is adopted more broadly, this contributes to increased transparency and greater reliability of research based on routinely collected electronic health records.
ERIC Educational Resources Information Center
Coleman, Karen J.; Lutsky, Marta A.; Yau, Vincent; Qian, Yinge; Pomichowski, Magdalena E.; Crawford, Phillip M.; Lynch, Frances L.; Madden, Jeanne M.; Owen-Smith, Ashli; Pearson, John A.; Pearson, Kathryn A.; Rusinak, Donna; Quinn, Virginia P.; Croen, Lisa A.
2015-01-01
To identify factors associated with valid Autism Spectrum Disorder (ASD) diagnoses from electronic sources in large healthcare systems. We examined 1,272 charts from ASD diagnosed youth <18 years old. Expert reviewers classified diagnoses as confirmed, probable, possible, ruled out, or not enough information. A total of 845 were classified with…
DOE Office of Scientific and Technical Information (OSTI.GOV)
Dixon, David A.; Hughes, Henry Grady
In this paper, we expand on previous validation work by Dixon and Hughes. That is, we present a more complete suite of validation results with respect to the well-known Lockwood energy deposition experiment. Lockwood et al. measured energy deposition in materials including beryllium, carbon, aluminum, iron, copper, molybdenum, tantalum, and uranium, for both single- and multi-layer 1-D geometries. Source configurations included mono-energetic, mono-directional electron beams with energies of 0.05 MeV, 0.1 MeV, 0.3 MeV, 0.5 MeV, and 1 MeV, in both normal and off-normal angles of incidence. These experiments are particularly valuable for validating electron transport codes, because they are closely represented by simulating pencil beams incident on 1-D semi-infinite slabs with and without material interfaces. Herein, we include total energy deposition and energy deposition profiles for the single-layer experiments reported by Lockwood et al. (a more complete multi-layer validation will follow in another report).
ERIC Educational Resources Information Center
Aquino, Cesar A.
2014-01-01
This study represents a research validating the efficacy of Davis' Technology Acceptance Model (TAM) by pairing it with the Organizational Change Readiness Theory (OCRT) to develop another extension to the TAM, using the medical Laboratory Information Systems (LIS)--Electronic Health Records (EHR) interface as the medium. The TAM posits that it is…
42 CFR 488.8 - Federal review of accreditation organizations.
Code of Federal Regulations, 2012 CFR
2012-10-01
... through validation surveys, the State survey agency monitors corrections as specified at § 488.7(b)(3... CMS with electronic data in ASCII comparable code and reports necessary for effective validation and...) Validation review. Following the end of a validation review period, CMS will identify any accreditation...
42 CFR 488.8 - Federal review of accreditation organizations.
Code of Federal Regulations, 2013 CFR
2013-10-01
... through validation surveys, the State survey agency monitors corrections as specified at § 488.7(b)(3... CMS with electronic data in ASCII comparable code and reports necessary for effective validation and...) Validation review. Following the end of a validation review period, CMS will identify any accreditation...
42 CFR 488.8 - Federal review of accreditation organizations.
Code of Federal Regulations, 2014 CFR
2014-10-01
... through validation surveys, the State survey agency monitors corrections as specified at § 488.7(b)(3... CMS with electronic data in ASCII comparable code and reports necessary for effective validation and...) Validation review. Following the end of a validation review period, CMS will identify any accreditation...
Quantum Theory of Orbital Magnetization and Its Generalization to Interacting Systems
NASA Astrophysics Data System (ADS)
Shi, Junren; Vignale, G.; Xiao, Di; Niu, Qian
2007-11-01
Based on standard perturbation theory, we present a full quantum derivation of the formula for the orbital magnetization in periodic systems. The derivation is generally valid for insulators with or without a Chern number, for metals at zero or finite temperatures, and at weak as well as strong magnetic fields. The formula is shown to be valid in the presence of electron-electron interaction, provided the one-electron energies and wave functions are calculated self-consistently within the framework of the exact current and spin-density functional theory.
On the validity of the Arrhenius equation for electron attachment rate coefficients.
Fabrikant, Ilya I; Hotop, Hartmut
2008-03-28
The validity of the Arrhenius equation for dissociative electron attachment rate coefficients is investigated. A general analysis allows us to obtain estimates of the upper temperature bound for the range of validity of the Arrhenius equation in the endothermic case and both lower and upper bounds in the exothermic case with a reaction barrier. The results of the general discussion are illustrated by numerical examples whereby the rate coefficient, as a function of temperature for dissociative electron attachment, is calculated using the resonance R-matrix theory. In the endothermic case, the activation energy in the Arrhenius equation is close to the threshold energy, whereas in the case of exothermic reactions with an intermediate barrier, the activation energy is found to be substantially lower than the barrier height.
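For context, the Arrhenius form referred to throughout this abstract is the standard expression (quoted for reference, not from the paper), with A the pre-exponential factor, E_a the activation energy, and k_B the Boltzmann constant:

```latex
k(T) \;=\; A\,\exp\!\left(-\frac{E_{a}}{k_{B}T}\right)
```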
An atomic model of brome mosaic virus using direct electron detection and real-space optimization.
Wang, Zhao; Hryc, Corey F; Bammes, Benjamin; Afonine, Pavel V; Jakana, Joanita; Chen, Dong-Hua; Liu, Xiangan; Baker, Matthew L; Kao, Cheng; Ludtke, Steven J; Schmid, Michael F; Adams, Paul D; Chiu, Wah
2014-09-04
Advances in electron cryo-microscopy have enabled structure determination of macromolecules at near-atomic resolution. However, structure determination, even using de novo methods, remains susceptible to model bias and overfitting. Here we describe a complete workflow for data acquisition, image processing, all-atom modelling and validation of brome mosaic virus, an RNA virus. Data were collected with a direct electron detector in integrating mode and an exposure beyond the traditional radiation damage limit. The final density map has a resolution of 3.8 Å as assessed by two independent data sets and maps. We used the map to derive an all-atom model with a newly implemented real-space optimization protocol. The validity of the model was verified by its match with the density map and a previous model from X-ray crystallography, as well as the internal consistency of models from independent maps. This study demonstrates a practical approach to obtain a rigorously validated atomic resolution electron cryo-microscopy structure.
ERIC Educational Resources Information Center
Appenzellar, Anne B.; Kelley, H. Paul
The Measurement and Evaluation Center of the University of Texas (Austin) conducted a validity study to assist the Department of Management Science and Information (DMSI) at the College of Business Administration in establishing a program of credit by examination for an introductory course in electronic data processing--Data Processing Analysis…
40 CFR 262.25 - Electronic manifest signatures.
Code of Federal Regulations, 2014 CFR
2014-07-01
... 40 Protection of Environment 26 2014-07-01 2014-07-01 false Electronic manifest signatures. 262.25... (CONTINUED) STANDARDS APPLICABLE TO GENERATORS OF HAZARDOUS WASTE The Manifest § 262.25 Electronic manifest signatures. Electronic signature methods for the e-Manifest system shall: (a) Be a legally valid and...
Validity criteria for Fermi's golden rule scattering rates applied to metallic nanowires.
Moors, Kristof; Sorée, Bart; Magnus, Wim
2016-09-14
Fermi's golden rule underpins the investigation of mobile carriers propagating through various solids, being a standard tool to calculate their scattering rates. As such, it provides a perturbative estimate under the implicit assumption that the effect of the interaction Hamiltonian which causes the scattering events is sufficiently small. To check the validity of this assumption, we present a general framework to derive simple validity criteria in order to assess whether the scattering rates can be trusted for the system under consideration, given its statistical properties such as average size, electron density, impurity density et cetera. We derive concrete validity criteria for metallic nanowires with conduction electrons populating a single parabolic band subjected to different elastic scattering mechanisms: impurities, grain boundaries and surface roughness.
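For reference, Fermi's golden rule discussed above gives the transition (scattering) rate from an initial state |i⟩ to final states |f⟩ under a perturbation H' as (standard textbook form, not quoted from the paper), where ρ(E_f) is the density of final states:

```latex
\Gamma_{i \to f} \;=\; \frac{2\pi}{\hbar}\,\bigl|\langle f \,|\, H' \,|\, i \rangle\bigr|^{2}\,\rho(E_{f})
```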
DOE Office of Scientific and Technical Information (OSTI.GOV)
Istomin, V. A.; Kustova, E. V.; Mekhonoshina, M. A.
2014-12-09
In the present work we evaluate the accuracy of the Eucken formula and Stokes’ viscosity relation in high temperature non-equilibrium air species with electronic excitation. The thermal conductivity coefficient calculated using the exact kinetic theory methods is compared with that obtained applying approximate formulas in the temperature range 200–20000 K. A modification of the Eucken formula providing a good agreement with exact calculations is proposed. It is shown that the Stokes viscosity relation is not valid in electronically excited monoatomic gases at temperatures higher than 2000 K.
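For readers who want the explicit relations being tested, the standard textbook statements (not quoted from the paper) are the Eucken formula for the thermal conductivity of a polyatomic gas and Stokes' relation for vanishing bulk viscosity, with η the shear viscosity, c_v the specific heat at constant volume per unit mass, R the universal gas constant, M the molar mass, γ = c_p/c_v, ζ the bulk viscosity, and λ' the second viscosity coefficient:

```latex
\lambda_{\mathrm{Eucken}} \;=\; \eta\left(c_{v} + \frac{9}{4}\,\frac{R}{M}\right)
\;=\; \frac{9\gamma - 5}{4}\,\eta\,c_{v},
\qquad\qquad
\zeta \;=\; \lambda' + \frac{2}{3}\,\eta \;=\; 0 .
```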
ERIC Educational Resources Information Center
Amireault, Steve; Godin, Gaston
2014-01-01
The purpose of this study was to provide three construct validity evidence for using fitness center attendance electronic records to objectively assess the frequency of leisure-time physical activity among adults. One hundred members of a fitness center (45 women and 55 men; aged 18 to 64 years) completed a self-report leisure-time physical…
Modeling the relativistic runaway electron avalanche and the feedback mechanism with GEANT4
Skeltved, Alexander Broberg; Østgaard, Nikolai; Carlson, Brant; Gjesteland, Thomas; Celestin, Sebastien
2014-01-01
This paper presents the first study that uses the GEometry ANd Tracking 4 (GEANT4) toolkit to do quantitative comparisons with other modeling results related to the production of terrestrial gamma ray flashes and high-energy particle emission from thunderstorms. We study the relativistic runaway electron avalanche (RREA) and the relativistic feedback process, as well as the production of bremsstrahlung photons from runaway electrons. The Monte Carlo simulations take into account the effects of electron ionization, electron-electron (Møller) and electron-positron (Bhabha) scattering, as well as the bremsstrahlung process and pair production, in the 250 eV to 100 GeV energy range. Our results indicate that the multiplication of electrons during the development of RREAs and under the influence of feedback is consistent with previous estimates. This is important for validating GEANT4 as a tool to model RREAs and feedback in homogeneous electric fields. We also determine the ratio of bremsstrahlung photons to energetic electrons Nγ/Ne. We then show that the ratio has a dependence on the electric field, which can be expressed by the avalanche time τ(E) and the bremsstrahlung coefficient α(ε). In addition, we present comparisons of GEANT4 simulations performed with a "standard" and a "low-energy" physics list, both validated in the 1 keV to 100 GeV energy range. This comparison shows that the choice of physics list used in GEANT4 simulations has a significant effect on the results. Key points: testing the feedback mechanism with GEANT4; validating the GEANT4 programming toolkit; studying the ratio of bremsstrahlung photons to electrons at TGF source altitude. PMID:26167437
Electron backscattering simulation in Geant4
NASA Astrophysics Data System (ADS)
Dondero, Paolo; Mantero, Alfonso; Ivanchencko, Vladimir; Lotti, Simone; Mineo, Teresa; Fioretti, Valentina
2018-06-01
The backscattering of electrons is a key phenomenon in several physics applications ranging from medical therapy to space, including AREMBES, the new ESA simulation framework for radiation background effects. The importance of properly reproducing this complex interaction has grown considerably in recent years, and the Geant4 Monte Carlo simulation toolkit, recently upgraded to version 10.3, is able to comply with the AREMBES requirements in a wide energy range. In this study a validation of the Geant4 electron backscattering models is performed against several sets of experimental data. In addition, a selection of the most recent validation results on electron scattering processes is also presented. Results of our analysis show a good agreement between simulations and data from several experiments, confirming the Geant4 electron backscattering models to be robust and reliable down to a few tens of electronvolts.
Validity criteria for Fermi’s golden rule scattering rates applied to metallic nanowires
NASA Astrophysics Data System (ADS)
Moors, Kristof; Sorée, Bart; Magnus, Wim
2016-09-01
Fermi’s golden rule underpins the investigation of mobile carriers propagating through various solids, being a standard tool to calculate their scattering rates. As such, it provides a perturbative estimate under the implicit assumption that the effect of the interaction Hamiltonian which causes the scattering events is sufficiently small. To check the validity of this assumption, we present a general framework to derive simple validity criteria in order to assess whether the scattering rates can be trusted for the system under consideration, given its statistical properties such as average size, electron density, impurity density et cetera. We derive concrete validity criteria for metallic nanowires with conduction electrons populating a single parabolic band subjected to different elastic scattering mechanisms: impurities, grain boundaries and surface roughness.
USDA-ARS?s Scientific Manuscript database
Given the unique physical activity patterns of preschoolers, wearable electronic devices for quantitative assessment of physical activity require validation in this population. Study objective was to validate uniaxial and triaxial accelerometers in preschoolers. Room calorimetry was performed over 3...
Minard, Janice P; Thomas, Nicola J; Olajos-Clow, Jennifer G; Wasilewski, Nastasia V; Jenkins, Blaine; Taite, Ann K; Day, Andrew G; Lougheed, M Diane
2016-01-01
To validate electronic versions of the Mini Pediatric and Pediatric Asthma Caregiver's Quality of Life Questionnaires (MiniPAQLQ and PACQLQ, respectively), determine completion times and correlate QOL of children and caregivers. A total of 63 children and 64 caregivers completed the paper and electronic MiniPAQLQ or PACQLQ. Agreement between versions of each questionnaire was summarized by intraclass correlation coefficients (ICC). The correlation between MiniPAQLQ and PACQLQ scores from child-caregiver pairs was assessed using Pearson's correlation coefficient. There was no significant difference (mean difference = 0.1, 95% CI -0.1, 0.2) in MiniPAQLQ Overall Scores between paper (5.9 ± 1.0, mean ± SD) and electronic (5.8 ± 1.0) versions, or any of the domains. ICCs ranged from 0.89 (Overall) to 0.86 (Emotional Function). Overall PACQLQ scores for both versions were comparable (5.9 ± 0.9 and 5.8 ± 1.0; mean difference = 0.0; 95% CI -0.1, 0.2). ICCs ranged from 0.81 (Activity Limitation) to 0.88 (Emotional Function). The electronic PACQLQ took 26 s longer (95% CI 11, 41; p < 0.001). Few participants (3-11%) preferred the paper format. MiniPAQLQ and PACQLQ scores were significantly correlated (all p < 0.05) for Overall (r paper = 0.33, r electronic = 0.27) and Emotional Function domains (r paper = 0.34, r electronic = 0.29). These electronic QOL questionnaires are valid, and asthma-related QOL of children and caregivers is related.
Robust validation of approximate 1-matrix functionals with few-electron harmonium atoms
DOE Office of Scientific and Technical Information (OSTI.GOV)
Cioslowski, Jerzy, E-mail: jerzy@wmf.univ.szczecin.pl; Piris, Mario; Matito, Eduard
2015-12-07
A simple comparison between the exact and approximate correlation components U of the electron-electron repulsion energy of several states of few-electron harmonium atoms with varying confinement strengths provides a stringent validation tool for 1-matrix functionals. The robustness of this tool is clearly demonstrated in a survey of 14 known functionals, which reveals their substandard performance within different electron correlation regimes. Unlike spot-testing that employs dissociation curves of diatomic molecules or more extensive benchmarking against experimental atomization energies of molecules comprising some standard set, the present approach not only uncovers the flaws and patent failures of the functionals but, even more importantly, also allows for pinpointing their root causes. Since the approximate values of U are computed at exact 1-densities, the testing requires minimal programming and thus is particularly suitable for rapid screening of new functionals.
Amra, Sakusic; O'Horo, John C; Singh, Tarun D; Wilson, Gregory A; Kashyap, Rahul; Petersen, Ronald; Roberts, Rosebud O; Fryer, John D; Rabinstein, Alejandro A; Gajic, Ognjen
2017-02-01
Long-term cognitive impairment is a common and important problem in survivors of critical illness. We developed electronic search algorithms to identify cognitive impairment and dementia from the electronic medical records (EMRs) that provide opportunity for big data analysis. Eligible patients met 2 criteria. First, they had a formal cognitive evaluation by The Mayo Clinic Study of Aging. Second, they were hospitalized in intensive care unit at our institution between 2006 and 2014. The "criterion standard" for diagnosis was formal cognitive evaluation supplemented by input from an expert neurologist. Using all available EMR data, we developed and improved our algorithms in the derivation cohort and validated them in the independent validation cohort. Of 993 participants who underwent formal cognitive testing and were hospitalized in intensive care unit, we selected 151 participants at random to form the derivation and validation cohorts. The automated electronic search algorithm for cognitive impairment was 94.3% sensitive and 93.0% specific. The search algorithms for dementia achieved respective sensitivity and specificity of 97% and 99%. EMR search algorithms significantly outperformed International Classification of Diseases codes. Automated EMR data extractions for cognitive impairment and dementia are reliable and accurate and can serve as acceptable and efficient alternatives to time-consuming manual data review. Copyright © 2016 Elsevier Inc. All rights reserved.
The Validity and Reliability of an iPhone App for Measuring Running Mechanics.
Balsalobre-Fernández, Carlos; Agopyan, Hovannes; Morin, Jean-Benoit
2017-07-01
The purpose of this investigation was to analyze the validity of an iPhone application (Runmatic) for measuring running mechanics. To do this, 96 steps from 12 different runs at speeds ranging from 2.77-5.55 m·s -1 were recorded simultaneously with Runmatic, as well as with an opto-electronic device installed on a motorized treadmill to measure the contact and aerial time of each step. Additionally, several running mechanics variables were calculated using the contact and aerial times measured, and previously validated equations. Several statistics were computed to test the validity and reliability of Runmatic in comparison with the opto-electronic device for the measurement of contact time, aerial time, vertical oscillation, leg stiffness, maximum relative force, and step frequency. The running mechanics values obtained with both the app and the opto-electronic device showed a high degree of correlation (r = .94-.99, p < .001). Moreover, there was very close agreement between instruments as revealed by the ICC (2,1) (ICC = 0.965-0.991). Finally, both Runmatic and the opto-electronic device showed almost identical reliability levels when measuring each set of 8 steps for every run recorded. In conclusion, Runmatic has been proven to be a highly reliable tool for measuring the running mechanics studied in this work.
NASA Astrophysics Data System (ADS)
Dixon, David A.; Hughes, H. Grady
2017-09-01
This paper presents a validation test comparing angular distributions from an electron multiple-scattering experiment with those generated using the MCNP6 Monte Carlo code system. In the experiment, 13- and 20-MeV electron pencil beams are deflected by thin foils with atomic numbers from 4 to 79. To determine the angular distribution, the fluence is measured downrange of the scattering foil at various radii orthogonal to the beam line. The characteristic angle (the angle at which the maximum of the distribution is reduced by a factor of 1/e) is then determined from the angular distribution and compared with experiment. The multiple-scattering foils tested here include beryllium, carbon, aluminum, copper, and gold. With the default electron-photon transport settings, the calculated characteristic angles were statistically distinguishable from the measurements, and the calculated distributions were generally broader than the measured ones. The average relative difference ranged from 5.8% to 12.2% over all of the foils, source energies, and physics settings tested. This validation illuminated a well-understood deficiency in the computation of the underlying angular distributions. As a result, code enhancements were made to stabilize the angular distributions in the presence of very small substeps. However, the enhancements only marginally improved the results, indicating that additional algorithmic details should be studied.
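As a rough illustration of the characteristic-angle metric used above (the angle at which the angular distribution falls to 1/e of its maximum), the short Python sketch below extracts that angle from a tabulated distribution by interpolation. The Gaussian-shaped profile is a hypothetical stand-in, not MCNP6 or experimental output.

```python
import numpy as np

theta = np.linspace(0.0, 30.0, 301)          # scattering angle (degrees)
fluence = np.exp(-(theta / 9.0) ** 2)        # hypothetical angular distribution, peaked at 0

target = fluence.max() / np.e                # 1/e of the maximum
# The distribution is monotone beyond the peak, so reverse the arrays to make the
# fluence increasing and interpolate the angle at the 1/e level.
theta_c = np.interp(target, fluence[::-1], theta[::-1])
print(f"Characteristic angle ≈ {theta_c:.2f} degrees")
```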
The accuracy and efficiency of electronic screening for recruitment into a clinical trial on COPD.
Schmickl, Christopher N; Li, Man; Li, Guangxi; Wetzstein, Marnie M; Herasevich, Vitaly; Gajic, Ognjen; Benzo, Roberto P
2011-10-01
Participant recruitment is an important process in successful conduct of randomized controlled trials. To facilitate enrollment into a National Institutes of Health-sponsored clinical trial involving patients with chronic obstructive pulmonary disease (COPD), we developed and prospectively validated an automated electronic screening tool based on boolean free-text search of admission notes in electronic medical records. During a 2-week validation period, all patients admitted to prespecified general medical services were screened for eligibility by both the electronic screening tool and a COPD nurse. Group discussion was the gold standard for confirmation of true-positive results. Compared with the gold standard, electronic screening yielded 100% sensitivity, 92% specificity, 100% negative predictive value, and 72% positive predictive value. Compared with traditional manual screening, electronic screening demonstrated time-saving potential of 76%. Thus, the electronic screening tool accurately identifies potential study subjects and improves efficiency of patient accrual for a clinical trial on COPD. This method may be expanded into other institutional and clinical settings. Copyright © 2011 Elsevier Ltd. All rights reserved.
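The sketch below illustrates, in Python, the general idea of a Boolean free-text screen of admission notes; the keyword pattern and the example notes are hypothetical placeholders and do not reproduce the study's validated search strategy.

```python
import re

# Hypothetical keyword pattern; the study's actual Boolean strategy is not reproduced here.
COPD_PATTERN = re.compile(
    r"\b(copd|chronic obstructive pulmonary disease|emphysema|chronic bronchitis)\b",
    re.IGNORECASE,
)

def flag_note(note_text):
    """Return True if the admission note matches the COPD screen."""
    return bool(COPD_PATTERN.search(note_text))

notes = [
    "72 y/o admitted with COPD exacerbation, on home oxygen.",
    "Community-acquired pneumonia; no chronic lung disease documented.",
]
print([flag_note(n) for n in notes])   # -> [True, False]
```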
DOE Office of Scientific and Technical Information (OSTI.GOV)
Taborda, A; Benabdallah, N; Desbree, A
2015-06-15
Purpose: To perform a dosimetry study at the sub-cellular scale for the Auger-electron emitter 99m-Tc, using a single mouse thyroid cell model, to investigate the contribution of the 99m-Tc Auger electrons to the absorbed dose and a possible link to the thyroid stunning recently reported in the literature for in vivo experiments in mice. Methods: S-values for Auger-electron emitting radionuclides were simulated using both the recent MCNP6 software and the Geant4-DNA extension of the Geant4 toolkit. The dosimetric calculations were validated by comparison with results from the literature, using a simple model of a single cell consisting of two concentric spheres of unit-density water and six Auger-electron emitting radionuclides. Furthermore, the S-values were calculated using a single thyroid follicle model for uniformly distributed 123-I and 125-I radionuclides and compared with published S-values. After validation, the S-values were simulated for the 99m-Tc radionuclide within the several mouse thyroid follicle cellular compartments, considering the radiative and non-radiative transitions of the 99m-Tc radiation spectrum. Results: The S-values calculated with MCNP6 are in good agreement with results from the literature, validating its use for the 99m-Tc S-value calculations. The largest absorbed dose corresponds to the case where the radionuclide is uniformly distributed in the follicular cell's nucleus, with an S-value of 7.8 mGy/disintegration, due mainly to the absorbed Auger electrons. The results show that, at the sub-cellular scale, the emitted X-rays and gamma rays do not contribute significantly to the absorbed dose. Conclusion: In this work, MCNP6 was validated for dosimetric studies at the sub-cellular scale. It was shown that the contribution of the Auger electrons to the absorbed dose is important at this scale compared with the emitted photons' contribution and cannot be neglected. The S-values obtained for the Auger-electron emitting 99m-Tc radionuclide will be presented and discussed.
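For orientation, the following Python sketch shows how a cellular S-value in mGy per disintegration follows from a mean energy deposited per decay and the mass of a spherical unit-density water target. All numbers are hypothetical, and the calculation is a simplification of the MCNP6/Geant4-DNA simulations described above.

```python
import math

KEV_TO_J = 1.602176634e-16           # 1 keV in joules

radius_m = 4.0e-6                    # hypothetical cell-nucleus radius (4 micrometres)
density_kg_m3 = 1000.0               # unit-density water
mass_kg = (4.0 / 3.0) * math.pi * radius_m ** 3 * density_kg_m3

e_dep_kev = 1.2                      # hypothetical mean energy deposited per decay (keV)
s_value_gy = e_dep_kev * KEV_TO_J / mass_kg   # absorbed dose per decay (Gy)
print(f"S-value ≈ {s_value_gy * 1e3:.2f} mGy per disintegration")
```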
Validation study of an electronic method of condensed outcomes tools reporting in orthopaedics.
Farr, Jack; Verma, Nikhil; Cole, Brian J
2013-12-01
Patient-reported outcomes (PRO) instruments are a vital source of data for evaluating the efficacy of medical treatments. Historically, outcomes instruments have been designed, validated, and implemented as paper-based questionnaires. The collection of paper-based outcomes information may result in patients becoming fatigued as they respond to redundant questions. This problem is exacerbated when multiple PRO measures are given to a single patient. In addition, the management and analysis of data collected in paper format involve labor-intensive processes to score and render the data analyzable. Computer-based outcomes systems have the potential to mitigate these problems by reformatting multiple outcomes tools into a single, user-friendly tool. The study aimed to determine whether the electronic outcomes system presented produces results comparable with the test-retest correlations reported for the corresponding orthopaedic paper-based outcomes instruments. The study was designed as a crossover study of consecutive orthopaedic patients arriving at one of two designated orthopaedic knee clinics. Patients were assigned to complete either a paper or a computer-administered questionnaire based on a similar set of questions (Knee injury and Osteoarthritis Outcome Score, International Knee Documentation Committee form, 36-Item Short Form survey, version 1, Lysholm Knee Scoring Scale). Each patient then completed the same surveys using the other instrument, so that all patients had completed both paper and electronic versions. Correlations between the results from the two modes were studied and compared with test-retest data from the original validation studies. The original validation studies established test-retest reliability by computing correlation coefficients for two administrations of the paper instrument; those correlation coefficients were all in the range of 0.7 to 0.9, which was deemed satisfactory. The present study computed correlation coefficients between the paper and electronic modes of administration. These correlation coefficients demonstrated similar results, with an overall value of 0.86. On the basis of the correlation coefficients, the electronic application of commonly used knee outcome scores compares favorably with the traditional paper variants, with a high rate of test-retest correlation. This equivalence supports the use of the condensed electronic outcomes system and validates comparison of scores between electronic and paper modes.
Student Off-Task Electronic Multitasking Predictors: Scale Development and Validation
ERIC Educational Resources Information Center
Qian, Yuxia; Li, Li
2017-01-01
In an attempt to better understand factors contributing to students' off-task electronic multitasking behavior in class, the research included two studies that developed a scale of students' off-task electronic multitasking predictors (the SOTEMP scale), and explored relationships between the scale and various classroom communication processes and…
Optimisation of 12 MeV electron beam simulation using variance reduction technique
NASA Astrophysics Data System (ADS)
Jayamani, J.; Termizi, N. A. S. Mohd; Kamarulzaman, F. N. Mohd; Aziz, M. Z. Abdul
2017-05-01
Monte Carlo (MC) simulation of electron beam radiotherapy requires long computation times. Variance reduction techniques (VRT) were implemented in MC to shorten these calculations. This work focused on optimising the VRT parameters, namely electron range rejection and the number of particle histories. The EGSnrc MC system was used to simulate (BEAMnrc) and validate (DOSXYZnrc) a Siemens Primus linear accelerator model without VRT. The validated MC model was then re-run with electron range rejection controlled by a global electron cut-off energy of 1, 2 and 5 MeV, using 20 × 10^7 particle histories. Range rejection with a 5 MeV cut-off produced the fastest MC simulation, with a 50% reduction in computation time compared with the non-VRT simulation. The 5 MeV cut-off was therefore used in the particle-history analysis, which spanned 7.5 × 10^7 to 20 × 10^7 histories. With a 5 MeV electron cut-off and 10 × 10^7 particle histories, the simulation was four times faster than the non-VRT calculation, with only 1% deviation. Proper understanding and use of VRT can significantly reduce MC electron beam calculation times while preserving accuracy.
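The Python sketch below illustrates the logic of electron range rejection with a global cut-off energy, the VRT parameter studied above: an electron below the cut-off whose residual range cannot reach the region of interest is discarded and its energy deposited locally. The residual_range_cm helper and the numbers are hypothetical; this is not EGSnrc code.

```python
def residual_range_cm(energy_mev):
    """Hypothetical stand-in for a CSDA residual-range lookup (roughly 0.5 cm/MeV in water)."""
    return 0.5 * energy_mev

def range_rejection(electron, dist_to_roi_cm, ecut_mev=5.0):
    """Return 'rejected' if the electron can be discarded, 'transport' otherwise."""
    below_cutoff = electron["energy_mev"] < ecut_mev
    cannot_reach_roi = residual_range_cm(electron["energy_mev"]) < dist_to_roi_cm
    if below_cutoff and cannot_reach_roi:
        electron["deposit_locally"] = True   # dump the remaining energy on the spot
        return "rejected"
    return "transport"

e = {"energy_mev": 2.0, "deposit_locally": False}
print(range_rejection(e, dist_to_roi_cm=3.0))   # -> rejected
```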
EMRinger: side chain–directed model and map validation for 3D cryo-electron microscopy
Barad, Benjamin A.; Echols, Nathaniel; Wang, Ray Yu-Ruei; ...
2015-08-17
Advances in high-resolution cryo-electron microscopy (cryo-EM) require the development of validation metrics to independently assess map quality and model geometry. Here we report EMRinger, a tool that assesses the precise fitting of an atomic model into the map during refinement and shows how radiation damage alters scattering from negatively charged amino acids. EMRinger (https://github.com/fraser-lab/EMRinger) will be useful for monitoring progress in resolving and modeling high-resolution features in cryo-EM.
Validation of ligands in macromolecular structures determined by X-ray crystallography
Horský, Vladimír; Svobodová Vařeková, Radka; Bendová, Veronika
2018-01-01
Crystallographic studies of ligands bound to biological macromolecules (proteins and nucleic acids) play a crucial role in structure-guided drug discovery and design, and also provide atomic level insights into the physical chemistry of complex formation between macromolecules and ligands. The quality with which small-molecule ligands have been modelled in Protein Data Bank (PDB) entries has been, and continues to be, a matter of concern for many investigators. Correctly interpreting whether electron density found in a binding site is compatible with the soaked or co-crystallized ligand or represents water or buffer molecules is often far from trivial. The Worldwide PDB validation report (VR) provides a mechanism to highlight any major issues concerning the quality of the data and the model at the time of deposition and annotation, so the depositors can fix issues, resulting in improved data quality. The ligand-validation methods used in the generation of the current VRs are described in detail, including an examination of the metrics to assess both geometry and electron-density fit. It is found that the LLDF score currently used to identify ligand electron-density fit outliers can give misleading results and that better ligand-validation metrics are required. PMID:29533230
Validation of Multitemperature Nozzle Flow Code
NASA Technical Reports Server (NTRS)
Park, Chul; Lee, Seung-Ho
1994-01-01
A computer code nozzle in n-temperatures (NOZNT), which calculates one-dimensional flows of partially dissociated and ionized air in an expanding nozzle, is tested against three existing sets of experimental data taken in arcjet wind tunnels. The code accounts for the differences among various temperatures, i.e., translational-rotational temperature, vibrational temperatures of individual molecular species, and electron-electronic temperature, and the effects of impurities. The experimental data considered are (1) the spectroscopic emission data; (2) electron beam data on vibrational temperature; and (3) mass-spectrometric species concentration data. It is shown that the impurities are inconsequential for the arcjet flows, and the NOZNT code is validated by numerically reproducing the experimental data.
First Results on the High Energy Cosmic Ray Electron Spectrum from Fermi Lat
NASA Technical Reports Server (NTRS)
Moiseev, Alexander
2009-01-01
This viewgraph presentation addresses energy reconstruction, electron-hadron separation, validation of Monte Carlo with flight data and an assessment of systematic errors from the Fermi Large Area Telescope.
Multimethod Investigation of Interpersonal Functioning in Borderline Personality Disorder
Stepp, Stephanie D.; Hallquist, Michael N.; Morse, Jennifer Q.; Pilkonis, Paul A.
2011-01-01
Even though interpersonal functioning is of great clinical importance for patients with borderline personality disorder (BPD), the comparative validity of different assessment methods for interpersonal dysfunction has not yet been tested. This study examined multiple methods of assessing interpersonal functioning, including self- and other-reports, clinical ratings, electronic diaries, and social cognitions in three groups of psychiatric patients (N=138): patients with (1) BPD, (2) another personality disorder, and (3) Axis I psychopathology only. Using dominance analysis, we examined the predictive validity of each method in detecting changes in symptom distress and social functioning six months later. Across multiple methods, the BPD group often reported higher interpersonal dysfunction scores compared to other groups. Predictive validity results demonstrated that self-report and electronic diary ratings were the most important predictors of distress and social functioning. Our findings suggest that self-report scores and electronic diary ratings have high clinical utility, as these methods appear most sensitive to change. PMID:21808661
Gil Montalbán, Elisa; Ortiz Marrón, Honorato; López-Gay Lucio-Villegas, Dulce; Zorrilla Torrás, Belén; Arrieta Blanco, Francisco; Nogales Aguado, Pedro
2014-01-01
To assess the validity and concordance of diabetes data in the electronic health records of primary care (Madrid-PC) by comparing with those from the PREDIMERC study. The sensitivity, specificity, positive predictive value, negative predictive value and kappa index of diabetes cases recorded in the health records of Madrid-PC were calculated by using data from PREDIMERC as the gold standard. The prevalence of diabetes was also determined according to each data source. The sensitivity of diabetes recorded in Madrid-PC was 74%, the specificity was 98.8%, the positive predictive value was 87.9%, the negative predictive value was 97.3%, and the kappa index was 0.78. The prevalence of diabetes recorded in Madrid-PC was 6.7% versus 8.1% by PREDIMERC, where known diabetes was 6.3%. The electronic health records of primary care are a valid source for epidemiological surveillance of diabetes in Madrid. Copyright © 2013 SESPAS. Published by Elsevier Espana. All rights reserved.
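As a reference for the metrics reported above, the Python sketch below computes sensitivity, specificity, predictive values and Cohen's kappa for a record-versus-gold-standard comparison; the binary vectors are hypothetical and unrelated to the PREDIMERC data.

```python
import numpy as np
from sklearn.metrics import cohen_kappa_score, confusion_matrix

gold = np.array([1, 1, 1, 0, 0, 0, 0, 1, 0, 0])     # gold-standard diabetes status (hypothetical)
record = np.array([1, 1, 0, 0, 0, 0, 0, 1, 0, 1])   # electronic-record flag (hypothetical)

tn, fp, fn, tp = confusion_matrix(gold, record).ravel()
sensitivity = tp / (tp + fn)
specificity = tn / (tn + fp)
ppv = tp / (tp + fp)
npv = tn / (tn + fn)
kappa = cohen_kappa_score(gold, record)
print(f"Se={sensitivity:.2f}  Sp={specificity:.2f}  PPV={ppv:.2f}  NPV={npv:.2f}  kappa={kappa:.2f}")
```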
Gelhorn, Heather L; Skalicky, Anne M; Balantac, Zaneta; Eremenco, Sonya; Cimms, Tricia; Halling, Katarina; Hollen, Patricia J; Gralla, Richard J; Mahoney, Martin C; Sexton, Chris
2018-07-01
Obtaining qualitative data directly from the patient perspective enhances the content validity of patient-reported outcome (PRO) instruments. The objective of this qualitative study was to evaluate the content validity of the Lung Cancer Symptom Scale for Mesothelioma (LCSS-Meso) and its usability on an electronic device. A cross-sectional methodological study, using a qualitative approach, was conducted among patients recruited from four clinical sites. The primary target population included patients with pleural mesothelioma; data were also collected from patients with peritoneal mesothelioma on an exploratory basis. Semi-structured interviews were conducted consisting of concept elicitation, cognitive interviewing, and evaluation of electronic patient-reported outcome (ePRO) usability. Participants (n = 21) were interviewed in person (n = 9) or by telephone (n = 12); 71% were male with a mean age of 69 years (SD = 14). The most common signs and symptoms experienced by participants with pleural mesothelioma (n = 18) were shortness of breath, fluid build-up, pain, fatigue, coughing, and appetite loss. The most commonly described symptoms for those with peritoneal mesothelioma (n = 4) were bloating, changes in appetite, fatigue, fluid build-up, shortness of breath, and pain. Participants with pleural mesothelioma commonly described symptoms assessed by the LCSS-Meso in language consistent with the questionnaire and a majority understood and easily completed each of the items. The ePRO version was easy to use, and there was no evidence that the electronic formatting changed the way participants responded to the questions. Results support the content validity of the LCSS-Meso and the usability of the electronic format for use in assessing symptoms among patients with pleural mesothelioma.
Electron Beam-Cure Polymer Matrix Composites: Processing and Properties
NASA Technical Reports Server (NTRS)
Wrenn, G.; Frame, B.; Jensen, B.; Nettles, A.
2001-01-01
Researchers from NASA and Oak Ridge National Laboratory are evaluating a series of electron beam curable composites for application in reusable launch vehicle airframe and propulsion systems. Objectives are to develop electron beam curable composites that are useful at cryogenic to elevated temperatures (-217 C to 200 C), validate key mechanical properties of these composites, and demonstrate cost-saving fabrication methods at the subcomponent level. Electron beam curing of polymer matrix composites is an enabling capability for production of aerospace structures in a non-autoclave process. Payoffs of this technology will be fabrication of composite structures at room temperature, reduced tooling cost and cure time, and improvements in component durability. This presentation covers the results of material property evaluations for electron beam-cured composites made with either unidirectional tape or woven fabric architectures. Resin systems have been evaluated for performance in ambient, cryogenic, and elevated temperature conditions. Results for electron beam composites and similar composites cured in conventional processes are reviewed for comparison. Fabrication demonstrations were also performed for electron beam-cured composite airframe and propulsion piping subcomponents. These parts have been built to validate manufacturing methods with electron beam composite materials, to evaluate electron beam curing processing parameters, and to demonstrate lightweight, low-cost tooling options.
NASA Astrophysics Data System (ADS)
Mandigo Anggana Raras, Gustav
2018-04-01
This research aims to produce Flash-based interactive learning media for a basic electronic engineering subject that are reliable for classroom use, and to gauge students' responses to the media. The target of this research is class X-TEI 1 at SMK Negeri 1 Driyorejo, Gresik. The method used in this study is R&D, limited to seven stages: (1) potential and problems, (2) data collection, (3) product design, (4) product validation, (5) product revision, (6) field test, and (7) analysis and writing. The result is an interactive learning media package named MELDASH. A validation process was used to produce valid interactive learning media. The media validation results indicate that the interactive learning media received a 90.83% rating. Students' responses to the interactive learning media were very positive, with an 88.89% rating.
21 CFR 1305.21 - Requirements for electronic orders.
Code of Federal Regulations, 2010 CFR
2010-04-01
...) To be valid, the purchaser must sign an electronic order for a Schedule I or II controlled substance... 1311 of this chapter. (b) The following data fields must be included on an electronic order for... either the purchaser or the supplier). (8) The quantity in a single package or container. (9) The number...
21 CFR 1305.21 - Requirements for electronic orders.
Code of Federal Regulations, 2011 CFR
2011-04-01
...) To be valid, the purchaser must sign an electronic order for a Schedule I or II controlled substance... 1311 of this chapter. (b) The following data fields must be included on an electronic order for... either the purchaser or the supplier). (8) The quantity in a single package or container. (9) The number...
Federal Register 2010, 2011, 2012, 2013, 2014
2011-12-13
... validate the identity of individuals who enter Department facilities. The data will be entered into the... electronic transferable records. Working Group IV (international electronic commerce) of the United Nations... http://www.uncitral.org/uncitral/en/commission/working_groups/4Electronic_Commerce.html . The ACPIL...
Electronic astronomical information handling and flexible publishing.
NASA Astrophysics Data System (ADS)
Heck, A.
The current dramatic evolution in information technology is bringing major changes to the way scientists work and communicate. The concept of electronic information handling encompasses the diverse types of information and the different media, as well as the various communication methodologies and technologies. It ranges from the initial collection of data to the final publication of results and the sharing of knowledge. New problems and challenges also result from the new information culture, especially on legal, ethical, and educational grounds. Electronic publishing will have to be more than an electronic version of paper contributions and will be part of a more general flexible-publishing policy. The benefits of private publishing are questioned. The procedures for validating published material and for evaluating scientific activities will also have to be adjusted. Provision of refereed electronic information independently of commercial publishers is now feasible. Scientists and scientific institutions now have the possibility of running an efficient information server with validated (refereed) material without the help of a commercial publisher.
Erickson, Jennifer; Abbott, Kenneth; Susienka, Lucinda
2018-06-01
Homeless patients face a variety of obstacles in pursuit of basic social services. Acknowledging this, the Social Security Administration directs employees to prioritize homeless patients and handle their disability claims with special care. However, under existing manual processes for identification of homelessness, many homeless patients never receive the special service to which they are entitled. In this paper, we explore address validation and automatic annotation of electronic health records to improve identification of homeless patients. We developed a sample of claims containing medical records at the moment of arrival in a single office. Using address validation software, we reconciled patient addresses with public directories of homeless shelters, veterans' hospitals and clinics, and correctional facilities. Other tools annotated electronic health records. We trained random forests to identify homeless patients and validated each model with 10-fold cross validation. For our finished model, the area under the receiver operating characteristic curve was 0.942. The random forest improved sensitivity from 0.067 to 0.879 but decreased positive predictive value to 0.382. Presumed false positive classifications bore many characteristics of homelessness. Organizations could use these methods to prompt early collection of information necessary to avoid labor-intensive attempts to reestablish contact with homeless individuals. Annually, such methods could benefit tens of thousands of patients who are homeless, destitute, and in urgent need of assistance. We were able to identify many more homeless patients through a combination of automatic address validation and natural language processing of unstructured electronic health records. Copyright © 2018. Published by Elsevier Inc.
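A minimal Python sketch of the modelling step described above (a random forest evaluated with 10-fold cross-validation) is given below; the synthetic feature matrix and labels are placeholders for the engineered address-validation and EHR features, and the code is not the authors' pipeline.

```python
import numpy as np
from sklearn.ensemble import RandomForestClassifier
from sklearn.model_selection import cross_val_score

rng = np.random.default_rng(0)
X = rng.normal(size=(500, 12))                       # synthetic stand-in feature matrix
y = (X[:, 0] + 0.5 * X[:, 1] + rng.normal(scale=0.5, size=500) > 1.0).astype(int)

clf = RandomForestClassifier(n_estimators=200, random_state=0)
auc = cross_val_score(clf, X, y, cv=10, scoring="roc_auc")
print(f"10-fold cross-validated ROC AUC: {auc.mean():.3f} ± {auc.std():.3f}")
```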
WINCADRE (COMPUTER-AIDED DATA REVIEW AND EVALUATION)
WinCADRE (Computer-Aided Data Review and Evaluation) is a Windows -based program designed for computer-assisted data validation. WinCADRE is a powerful tool which significantly decreases data validation turnaround time. The electronic-data-deliverable format has been designed ...
Jacqmin, Dustin J; Bredfeldt, Jeremy S; Frigo, Sean P; Smilowitz, Jennifer B
2017-01-01
The AAPM Medical Physics Practice Guideline (MPPG) 5.a provides concise guidance on the commissioning and QA of beam modeling and dose calculation in radiotherapy treatment planning systems. This work discusses the implementation of the validation testing recommended in MPPG 5.a at two institutions. The two institutions worked collaboratively to create a common set of treatment fields and analysis tools to deliver and analyze the validation tests. This included the development of a novel, open-source software tool to compare scanning water tank measurements to 3D DICOM-RT Dose distributions. Dose calculation algorithms in both Pinnacle and Eclipse were tested with MPPG 5.a to validate the modeling of Varian TrueBeam linear accelerators. The validation process resulted in more than 200 water tank scans and more than 50 point measurements per institution, each of which was compared to a dose calculation from the institution's treatment planning system (TPS). Overall, the validation testing recommended in MPPG 5.a took approximately 79 person-hours for a machine with four photon and five electron energies for a single TPS. Of the 79 person-hours, 26 person-hours required time on the machine, and the remainder involved preparation and analysis. The basic photon, electron, and heterogeneity correction tests were evaluated with the tolerances in MPPG 5.a, and the tolerances were met for all tests. The MPPG 5.a evaluation criteria were used to assess the small field and IMRT/VMAT validation tests. Both institutions found the use of MPPG 5.a to be a valuable resource during the commissioning process. The validation testing in MPPG 5.a showed the strengths and limitations of the TPS models. In addition, the data collected during the validation testing is useful for routine QA of the TPS, validation of software upgrades, and commissioning of new algorithms. © 2016 The Authors. Journal of Applied Clinical Medical Physics published by Wiley Periodicals, Inc. on behalf of American Association of Physicists in Medicine.
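The comparison of scanning water-tank measurements with a 3D dose grid can be sketched as trilinear interpolation of the grid at the measurement coordinates, followed by a percent-difference report, as in the hedged Python example below. The synthetic dose grid, scan positions and measured values are placeholders; this is not the open-source tool developed in the study.

```python
import numpy as np
from scipy.interpolate import RegularGridInterpolator

# Synthetic 3D dose grid on a regular x/y/z grid (mm); values in arbitrary units.
x = np.linspace(-100.0, 100.0, 101)
y = np.linspace(-100.0, 100.0, 101)
z = np.linspace(0.0, 300.0, 151)
X, Y, Z = np.meshgrid(x, y, z, indexing="ij")
dose_grid = np.exp(-Z / 120.0) * np.exp(-(X**2 + Y**2) / (2 * 50.0**2))

tps_dose = RegularGridInterpolator((x, y, z), dose_grid)

# Hypothetical central-axis depth-dose scan: depths (mm) and "measured" dose with noise.
depths = np.arange(10.0, 200.0, 10.0)
rng = np.random.default_rng(1)
measured = np.exp(-depths / 120.0) * (1.0 + rng.normal(0.0, 0.005, depths.size))

points = np.column_stack([np.zeros_like(depths), np.zeros_like(depths), depths])
calculated = tps_dose(points)
pct_diff = 100.0 * (calculated - measured) / measured
print(f"max |local difference| = {np.abs(pct_diff).max():.2f}% "
      "(compare against the relevant MPPG 5.a tolerance)")
```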
A peer-nomination assessment of electronic forms of aggression and victimization.
Badaly, Daryaneh; Duong, Mylien T; Ross, Alexandra C; Schwartz, David
2015-10-01
The perpetration and receipt of electronic aggression have largely been assessed with self-report questionnaires. Using a sample of 573 adolescents, the current study compared the psychometric properties of a peer-nomination measure of electronic aggression and victimization to the more widely used self-report approach. Estimates of the reliability, stability, and concordance of peer- and self-report assessments were adequate, mirroring those from research on aggressive exchanges in school. Analyses of validity and utility revealed that peer-nominations, compared to self-reports, provide overlapping and distinct information on adolescents' social, emotional, and academic adjustment. Overall, these findings provide evidence that peer-nominations are a reliable, valid, and useful means for measuring electronic aggression and victimization. Future work will benefit from their incorporation into multi-method assessments. Copyright © 2015 The Foundation for Professionals in Services for Adolescents. Published by Elsevier Ltd. All rights reserved.
NASA Astrophysics Data System (ADS)
Ram, Farangis; De Graef, Marc
2018-04-01
In an electron backscatter diffraction pattern (EBSP), the angular distribution of backscattered electrons (BSEs) depends on their energy. Monte Carlo modeling of their depth and energy distributions suggests that the highest energy BSEs are more likely to hit the bottom of the detector than the top. In this paper, we examine experimental EBSPs to validate the modeled angular BSE distribution. To that end, the Kikuchi bandlet method is employed to measure the width of Kikuchi bands in both modeled and measured EBSPs. The results show that in an EBSP obtained with a 15 keV primary probe, the width of a Kikuchi band varies by about 0.4° from the bottom of the EBSD detector to its top. The same is true for a simulated pattern that is composed of BSEs with 5 keV to 15 keV energies, which validates the Monte Carlo simulations.
Validation of multi-temperature nozzle flow code NOZNT
NASA Technical Reports Server (NTRS)
Park, Chul; Lee, Seung-Ho
1993-01-01
A computer code NOZNT (Nozzle in n-Temperatures), which calculates one-dimensional flows of partially dissociated and ionized air in an expanding nozzle, is tested against five existing sets of experimental data. The code accounts for: a) the differences among various temperatures, i.e., translational-rotational temperature, vibrational temperatures of individual molecular species, and electron-electronic temperature, b) radiative cooling, and c) the effects of impurities. The experimental data considered are: 1) the sodium line reversal and 2) the electron temperature and density data, both obtained in a shock tunnel, and 3) the spectroscopic emission data, 4) electron beam data on vibrational temperature, and 5) mass-spectrometric species concentration data, all obtained in arc-jet wind tunnels. It is shown that the impurities are most likely responsible for the observed phenomena in shock tunnels. For the arc-jet flows, impurities are inconsequential and the NOZNT code is validated by numerically reproducing the experimental data.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Holst, Bastian; French, Martin; Redmer, Ronald
2011-06-15
Using Kubo's linear response theory, we derive expressions for the frequency-dependent electrical conductivity (Kubo-Greenwood formula), thermopower, and thermal conductivity in a strongly correlated electron system. These are evaluated within ab initio molecular dynamics simulations in order to study the thermoelectric transport coefficients in dense liquid hydrogen, especially near the nonmetal-to-metal transition region. We also observe significant deviations from the widely used Wiedemann-Franz law, which is strictly valid only for degenerate systems, and give an estimate for its valid scope of application toward lower densities.
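For reference, the Wiedemann-Franz law mentioned above is commonly written, for a degenerate electron gas, with the Sommerfeld value of the Lorenz number:

```latex
\[
  \frac{\kappa}{\sigma} = L\,T, \qquad
  L = \frac{\pi^{2}}{3}\left(\frac{k_{\mathrm{B}}}{e}\right)^{2}
    \approx 2.44\times 10^{-8}\ \mathrm{W\,\Omega\,K^{-2}}
\]
```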
Interacting charges and the classical electron radius
NASA Astrophysics Data System (ADS)
De Luca, Roberto; Di Mauro, Marco; Faella, Orazio; Naddeo, Adele
2018-03-01
The equation of motion of a point charge q repelled by a fixed point-like charge Q is derived and studied. In solving this problem, useful concepts in classical and relativistic kinematics, Newtonian mechanics, and non-linear ordinary differential equations are reviewed. The validity of the approximations is discussed from the physical point of view. In particular, the classical electron radius emerges naturally from the requirement that the initial distance be large enough for the non-relativistic approximation to be valid. The relevance of this topic for undergraduate physics teaching is pointed out.
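For reference, the classical electron radius discussed above is, in SI form,

```latex
\[
  r_{e} = \frac{e^{2}}{4\pi\varepsilon_{0}\, m_{e} c^{2}}
        \approx 2.82\times 10^{-15}\ \mathrm{m}
\]
```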
Equivalence of electronic and paper-based patient-reported outcome measures.
Campbell, Niloufar; Ali, Faraz; Finlay, Andrew Y; Salek, Sam S
2015-08-01
Electronic formats (ePROs) of paper-based patient-reported outcomes (PROs) should be validated before they can be reliably used. This review aimed to examine studies investigating measurement equivalence between ePROs and their paper originals to identify methodologies used and to determine the extent of such validation. Three databases (OvidSP, Web of Science and PubMed) were searched using a set of keywords. Results were examined for compliance with inclusion criteria. Articles or abstracts that directly compared screen-based electronic versions of PROs with their validated paper-based originals, with regard to their measurement equivalence, were included. Publications were excluded if the only instruments reported were stand-alone visual analogue scales or interactive voice response formats. Papers published before 2007 were excluded, as a previous meta-analysis examined papers published before this time. Fifty-five studies investigating 79 instruments met the inclusion criteria. 53 % of the 79 instruments studied were condition specific. Several instruments, such as the SF-36, were reported in more than one publication. The most frequently reported formats for ePROs were Web-based versions. In 78 % of the publications, there was evidence of equivalence or comparability between the two formats as judged by study authors. Of the 30 publications that provided preference data, 87 % found that overall participants preferred the electronic format. When examining equivalence between paper and electronic versions of PROs, formats are usually judged by authors to be equivalent. Participants prefer electronic formats. This literature review gives encouragement to the further widespread development and use of ePROs.
Al Sallakh, Mohammad A; Vasileiou, Eleftheria; Rodgers, Sarah E; Lyons, Ronan A; Sheikh, Aziz; Davies, Gwyneth A
2017-06-01
There is currently no consensus on approaches to defining asthma or assessing asthma outcomes using electronic health record-derived data. We explored these approaches in the recent literature and examined the clarity of reporting. We systematically searched for asthma-related articles published between January 1, 2014 and December 31, 2015, extracted the algorithms used to identify asthma patients and assess severity, control and exacerbations, and examined how the validity of these outcomes was justified. From 113 eligible articles, we found significant heterogeneity in the algorithms used to define asthma (n=66 different algorithms), severity (n=18), control (n=9) and exacerbations (n=24). For the majority of algorithms (n=106), validity was not justified. In the remaining cases, approaches ranged from using algorithms validated in the same databases to using nonvalidated algorithms that were based on clinical judgement or clinical guidelines. The implementation of these algorithms was suboptimally described overall. Although electronic health record-derived data are now widely used to study asthma, the approaches being used are significantly varied and are often underdescribed, rendering it difficult to assess the validity of studies and compare their findings. Given the substantial growth in this body of literature, it is crucial that scientific consensus is reached on the underlying definitions and algorithms. Copyright ©ERS 2017.
Models of protein–ligand crystal structures: trust, but verify
Deller, Marc C.
2015-01-01
X-ray crystallography provides the most accurate models of protein–ligand structures. These models serve as the foundation of many computational methods including structure prediction, molecular modelling, and structure-based drug design. The success of these computational methods ultimately depends on the quality of the underlying protein–ligand models. X-ray crystallography offers the unparalleled advantage of a clear mathematical formalism relating the experimental data to the protein–ligand model. In the case of X-ray crystallography, the primary experimental evidence is the electron density of the molecules forming the crystal. The first step in the generation of an accurate and precise crystallographic model is the interpretation of the electron density of the crystal, typically carried out by construction of an atomic model. The atomic model must then be validated for fit to the experimental electron density and also for agreement with prior expectations of stereochemistry. Stringent validation of protein–ligand models has become possible as a result of the mandatory deposition of primary diffraction data, and many computational tools are now available to aid in the validation process. Validation of protein–ligand complexes has revealed some instances of overenthusiastic interpretation of ligand density. Fundamental concepts and metrics of protein–ligand quality validation are discussed and we highlight software tools to assist in this process. It is essential that end users select high quality protein–ligand models for their computational and biological studies, and we provide an overview of how this can be achieved. PMID:25665575
Models of protein-ligand crystal structures: trust, but verify.
Deller, Marc C; Rupp, Bernhard
2015-09-01
X-ray crystallography provides the most accurate models of protein-ligand structures. These models serve as the foundation of many computational methods including structure prediction, molecular modelling, and structure-based drug design. The success of these computational methods ultimately depends on the quality of the underlying protein-ligand models. X-ray crystallography offers the unparalleled advantage of a clear mathematical formalism relating the experimental data to the protein-ligand model. In the case of X-ray crystallography, the primary experimental evidence is the electron density of the molecules forming the crystal. The first step in the generation of an accurate and precise crystallographic model is the interpretation of the electron density of the crystal, typically carried out by construction of an atomic model. The atomic model must then be validated for fit to the experimental electron density and also for agreement with prior expectations of stereochemistry. Stringent validation of protein-ligand models has become possible as a result of the mandatory deposition of primary diffraction data, and many computational tools are now available to aid in the validation process. Validation of protein-ligand complexes has revealed some instances of overenthusiastic interpretation of ligand density. Fundamental concepts and metrics of protein-ligand quality validation are discussed and we highlight software tools to assist in this process. It is essential that end users select high quality protein-ligand models for their computational and biological studies, and we provide an overview of how this can be achieved.
Evaluation and implementation of chemotherapy regimen validation in an electronic health record.
Diaz, Amber H; Bubalo, Joseph S
2014-12-01
Computerized provider order entry of chemotherapy regimens is quickly becoming the standard for prescribing chemotherapy in both inpatient and ambulatory settings. One of the difficulties with implementation of chemotherapy regimen computerized provider order entry lies in verifying the accuracy and completeness of all regimens built in the system library. Our goal was to develop, implement, and evaluate a process for validating chemotherapy regimens in an electronic health record. We describe our experience developing and implementing a process for validating chemotherapy regimens in the setting of a standard, commercially available computerized provider order entry system. The pilot project focused on validating chemotherapy regimens in the adult inpatient oncology setting and adult ambulatory hematologic malignancy setting. A chemotherapy regimen validation process was defined as a result of the pilot project. Over a 27-week pilot period, 32 chemotherapy regimens were validated using the process we developed. Results of the study suggest that by validating chemotherapy regimens, the amount of time spent by pharmacists in daily chemotherapy review was decreased. In addition, the number of pharmacist modifications required to make regimens complete and accurate were decreased. Both physician and pharmacy disciplines showed improved satisfaction and confidence levels with chemotherapy regimens after implementation of the validation system. Chemotherapy regimen validation required a considerable amount of planning and time but resulted in increased pharmacist efficiency and improved provider confidence and satisfaction. © The Author(s) 2014 Reprints and permissions: sagepub.co.uk/journalsPermissions.nav.
WINCADRE INORGANIC (WINDOWS COMPUTER-AIDED DATA REVIEW AND EVALUATION)
WinCADRE (Computer-Aided Data Review and Evaluation) is a Windows -based program designed for computer-assisted data validation. WinCADRE is a powerful tool which significantly decreases data validation turnaround time. The electronic-data-deliverable format has been designed in...
Accessing Electronic Journals.
ERIC Educational Resources Information Center
McKay, Sharon Cline
1999-01-01
Discusses issues librarians need to consider when providing access to electronic journals. Topics include gateways; index and abstract services; validation and pay-per-view; title selection; integration with OPACs (online public access catalogs)or Web sites; paper availability; ownership versus access; usage restrictions; and services offered…
Electronic versus paper-pencil methods for assessing chemotherapy-induced peripheral neuropathy.
Knoerl, Robert; Gray, Evan; Stricker, Carrie; Mitchell, Sandra A; Kippe, Kelsey; Smith, Gloria; Dudley, William N; Lavoie Smith, Ellen M
2017-11-01
The aim of this study is to examine and compare with the validated, paper/pencil European Organisation for Research and Treatment of Cancer Quality of Life Questionnaire-Chemotherapy-Induced Peripheral Neuropathy Scale (QLQ-CIPN20), the psychometric properties of three electronically administered patient reported outcome (PRO) measures of chemotherapy-induced peripheral neuropathy (CIPN): (1) the two neuropathy items from the National Cancer Institute's Patient-Reported Outcomes version of the Common Terminology Criteria for Adverse Events (PRO-CTCAE), (2) the QLQ-CIPN20, and (3) the 0-10 Neuropathy Screening Question (NSQ). We employed a descriptive, cross-sectional design and recruited 25 women with breast cancer who were receiving neurotoxic chemotherapy at an academic hospital. Participants completed the paper/pencil QLQ-CIPN20 and electronic versions of the QLQ-CIPN20, PRO-CTCAE, and NSQ. Internal consistency reliability, intraclass correlation, and concurrent and discriminant validity analyses were conducted. The alpha coefficients for the electronic QLQ-CIPN20 sensory and motor subscales were 0.76 and 0.75. Comparison of the electronic and paper/pencil QLQ-CIPN20 subscales supported mode equivalence (intraclass correlation range >0.91). Participants who reported the presence of numbness/tingling via the single-item NSQ reported higher mean QLQ-CIPN20 sensory subscale scores (p < 0.001). PRO-CTCAE neuropathy severity and interference items correlated well with the QLQ-CIPN20 electronic and paper/pencil sensory (r = 0.76; r = 0.70) and motor (r = 0.55; r = 0.62) subscales, and with the NSQ (r = 0.72; r = 0.44). These data support the validity of the electronically administered PRO-CTCAE neuropathy items, NSQ, and QLQ-CIPN20 for neuropathy screening in clinical practice. The electronic and paper/pencil versions of the QLQ-CIPN can be used interchangeably based on evidence of mode equivalence.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Ali, E. S. M.; McEwen, M. R.; Rogers, D. W. O.
2012-11-15
Purpose: In a recent computational study, an improved physics-based approach was proposed for unfolding linac photon spectra and incident electron energies from transmission data. In this approach, energy differentiation is improved by simultaneously using transmission data for multiple attenuators and detectors, and the unfolding robustness is improved by using a four-parameter functional form to describe the photon spectrum. The purpose of the current study is to validate this approach experimentally, and to demonstrate its application on a typical clinical linac. Methods: The validation makes use of the recent transmission measurements performed on the Vickers research linac of National Research Council Canada. For this linac, the photon spectra were previously measured using a NaI detector, and the incident electron parameters are independently known. The transmission data are for eight beams in the range 10-30 MV using thick Be, Al and Pb bremsstrahlung targets. To demonstrate the approach on a typical clinical linac, new measurements are performed on an Elekta Precise linac for 6, 10 and 25 MV beams. The different experimental setups are modeled using EGSnrc, with the newly added photonuclear attenuation included. Results: For the validation on the research linac, the 95% confidence bounds of the unfolded spectra fall within the noise of the NaI data. The unfolded spectra agree with the EGSnrc spectra (calculated using independently known electron parameters) with RMS energy fluence deviations of 4.5%. The accuracy of unfolding the incident electron energy is shown to be approximately 3%. A transmission cutoff of only 10% is suitable for accurate unfolding, provided that the other components of the proposed approach are implemented. For the demonstration on a clinical linac, the unfolded incident electron energies and their 68% confidence bounds for the 6, 10 and 25 MV beams are 6.1 ± 0.1, 9.3 ± 0.1, and 19.3 ± 0.2 MeV, respectively. The unfolded spectra for the clinical linac agree with the EGSnrc spectra (calculated using the unfolded electron energies) with RMS energy fluence deviations of 3.7%. The corresponding measured and EGSnrc-calculated transmission data agree within 1.5%, where the typical transmission measurement uncertainty on the clinical linac is 0.4% (not including the uncertainties on the incident electron parameters). Conclusions: The approach proposed in an earlier study for unfolding photon spectra and incident electron energies from transmission data is accurate and practical for clinical use.
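The unfolding idea can be caricatured as a least-squares fit of a parametric spectrum to transmission-versus-thickness data, as in the hedged Python sketch below. The two-parameter spectral shape, the attenuation curve and the synthetic measurements are hypothetical simplifications of the four-parameter form and the multi-attenuator, multi-detector data used in the paper.

```python
import numpy as np
from scipy.optimize import curve_fit

energies = np.linspace(0.5, 6.0, 56)                        # photon energy grid (MeV)
mu_water = 0.097 * energies ** -0.4                         # hypothetical attenuation curve (1/cm)
thicknesses = np.array([0.0, 5.0, 10.0, 20.0, 30.0, 40.0])  # attenuator thickness (cm)

def transmission(t, a, b):
    """Normalized transmission for a hypothetical two-parameter spectrum E**a * exp(-b*E)."""
    spectrum = energies ** a * np.exp(-b * energies)
    atten = np.exp(-np.outer(t, mu_water))                  # shape (n_thickness, n_energy)
    return atten @ spectrum / spectrum.sum()

# Synthetic "measured" transmission generated from known parameters plus noise.
rng = np.random.default_rng(2)
measured = transmission(thicknesses, 1.3, 0.9) * (1.0 + rng.normal(0.0, 0.003, thicknesses.size))

popt, _ = curve_fit(transmission, thicknesses, measured, p0=[1.0, 0.5])
print("fitted spectrum parameters (a, b):", np.round(popt, 3))
```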
Lee, Jungpyo; Bonoli, Paul; Wright, John
2011-01-01
The quasilinear diffusion coefficient assuming a constant magnetic field along the electron orbit is widely used to describe electron Landau damping of waves in a tokamak where the magnitude of the magnetic field varies on a flux surface. To understand the impact of violating the constant magnetic field assumption, we introduce the effect of a broad-bandwidth wave spectrum which has been used in the past to validate quasilinear theory for the fast decorrelation process between resonances. By the reevaluation of the diffusion coefficient through the level of the phase integral for the tokamak geometry with the broad-band wave effect included, we identify the three acceptable errors for the use of the quasilinear diffusion coefficient.
A new hybrid code (CHIEF) implementing the inertial electron fluid equation without approximation
NASA Astrophysics Data System (ADS)
Muñoz, P. A.; Jain, N.; Kilian, P.; Büchner, J.
2018-03-01
We present a new hybrid algorithm implemented in the code CHIEF (Code Hybrid with Inertial Electron Fluid) for simulations of electron-ion plasmas. The algorithm treats the ions kinetically, modeled by the Particle-in-Cell (PiC) method, and electrons as an inertial fluid, modeled by electron fluid equations without any of the approximations used in most of the other hybrid codes with an inertial electron fluid. This kind of code is appropriate to model a large variety of quasineutral plasma phenomena where the electron inertia and/or ion kinetic effects are relevant. We present here the governing equations of the model, how these are discretized and implemented numerically, as well as six test problems to validate our numerical approach. Our chosen test problems, where the electron inertia and ion kinetic effects play the essential role, are: 0) Excitation of parallel eigenmodes to check numerical convergence and stability, 1) parallel (to a background magnetic field) propagating electromagnetic waves, 2) perpendicular propagating electrostatic waves (ion Bernstein modes), 3) ion beam right-hand instability (resonant and non-resonant), 4) ion Landau damping, 5) ion firehose instability, and 6) 2D oblique ion firehose instability. Our results reproduce successfully the predictions of linear and non-linear theory for all these problems, validating our code. All properties of this hybrid code make it ideal to study multi-scale phenomena between electron and ion scales such as collisionless shocks, magnetic reconnection and kinetic plasma turbulence in the dissipation range above the electron scales.
NASA Astrophysics Data System (ADS)
Hegde, Ganesh; Povolotskyi, Michael; Kubis, Tillmann; Boykin, Timothy; Klimeck, Gerhard
2014-03-01
Semi-empirical Tight Binding (TB) is known to be a scalable and accurate atomistic representation for electron transport in realistically extended nano-scaled semiconductor devices that may contain millions of atoms. In this paper, an environment-aware and transferable TB model suitable for electronic structure and transport simulations in technologically relevant metals, metallic alloys, metal nanostructures, and metallic interface systems is described. Part I of this paper describes the development and validation of the new TB model. The new model incorporates intra-atomic diagonal and off-diagonal elements for implicit self-consistency and greater transferability across bonding environments. The dependence of the on-site energies on strain has been obtained by appealing to the Moments Theorem, which links closed electron paths in the system to energy moments of the angular-momentum-resolved local density of states obtained ab initio. The model matches self-consistent density functional theory electronic structure results for bulk face-centered cubic metals with and without strain, metallic alloys, metallic interfaces, and metallic nanostructures with high accuracy, and can be used in predictive electronic structure and transport problems in metallic systems at realistically extended length scales.
Designing Interactive Electronic Module in Chemistry Lessons
NASA Astrophysics Data System (ADS)
Irwansyah, F. S.; Lubab, I.; Farida, I.; Ramdhani, M. A.
2017-09-01
This research aims to design an electronic module (e-module) oriented toward the development of students' chemical literacy for the solution colligative properties material. The research comprised several stages, including concept analysis, discourse analysis, storyboard design, design development, product packaging, validation, and a feasibility test. Overall, the research followed three main stages: Define (a preliminary study), Design (designing the e-module), and Develop (including validation and a model trial). The concept presentation and visualization used in this e-module are oriented toward chemical literacy skills; the presentation order covers the aspects of scientific context, process, content, and attitude. Chemistry and multimedia experts carried out the validation to test the initial quality of the product and provide feedback for its improvement. The feasibility test results indicate that the content presentation and the display are valid and feasible to use, with values of 85.77% and 87.94%, respectively. These values indicate that this e-module, oriented toward students' chemical literacy skills for the solution colligative properties material, is feasible to use.
Validation of an electronic device for measuring driving exposure.
Huebner, Kyla D; Porter, Michelle M; Marshall, Shawn C
2006-03-01
This study sought to evaluate an on-board diagnostic system (CarChip) for collecting driving exposure data in older drivers. Drivers (N = 20) aged 60 to 86 years from Winnipeg and surrounding communities participated. Information on driving exposure was obtained via the CarChip and global positioning system (GPS) technology on a driving course, and obtained via the CarChip and surveys over a week of driving. Velocities and distances were measured over the road course to validate the accuracy of the CarChip compared to GPS for those parameters. The results show that the CarChip does provide valid distance measurements and slightly lower maximum velocities than GPS measures. From the results obtained in this study, it was determined that retrospective self-reports of weekly driving distances are inaccurate. Therefore, an on-board diagnostic system (OBDII) electronic device like the CarChip can provide valid and detailed information about driving exposure that would be useful for studies of crash rates or driving behavior.
Besenyi, Gina M; Diehl, Paul; Schooley, Benjamin; Turner-McGrievy, Brie M; Wilcox, Sara; Stanis, Sonja A Wilhelm; Kaczynski, Andrew T
2016-12-01
Creation of mobile technology environmental audit tools can provide a more interactive way for youth to engage with communities and facilitate participation in health promotion efforts. This study describes the development and validity and reliability testing of an electronic version of the Community Park Audit Tool (eCPAT). eCPAT consists of 149 items and incorporates a variety of technology benefits. Criterion-related validity and inter-rater reliability were evaluated using data from 52 youth across 47 parks in Greenville County, SC. A large portion of items (>70 %) demonstrated either fair or moderate to perfect validity and reliability. All but six items demonstrated excellent percent agreement. The eCPAT app is a user-friendly tool that provides a comprehensive assessment of park environments. Given the proliferation of smartphones, tablets, and other electronic devices among both adolescents and adults, the eCPAT app has potential to be distributed and used widely for a variety of health promotion purposes.
Non-local electron transport validation using 2D DRACO simulations
NASA Astrophysics Data System (ADS)
Cao, Duc; Chenhall, Jeff; Moll, Eli; Prochaska, Alex; Moses, Gregory; Delettrez, Jacques; Collins, Tim
2012-10-01
Comparison of 2D DRACO simulations, using a modified version [private communications with M. Marinak and G. Zimmerman, LLNL] of the Schurtz, Nicolai and Busquet (SNB) algorithm [Schurtz, Nicolai and Busquet, "A nonlocal electron conduction model for multidimensional radiation hydrodynamics codes," Phys. Plasmas 7, 4238 (2000)] for non-local electron transport, with direct drive shock timing experiments [T. Boehly, et al., "Multiple spherically converging shock waves in liquid deuterium," Phys. Plasmas 18, 092706 (2011)] and with the Goncharov non-local model [V. Goncharov, et al., "Early stage of implosion in inertial confinement fusion: Shock timing and perturbation evolution," Phys. Plasmas 13, 012702 (2006)] in 1D LILAC will be presented. Addition of an improved SNB non-local electron transport algorithm in DRACO allows direct drive simulations with no need for an electron conduction flux limiter. Validation with shock timing experiments that mimic the laser pulse profile of direct drive ignition targets gives a higher confidence level in the predictive capability of the DRACO code. This research was supported by the University of Rochester Laboratory for Laser Energetics.
76 FR 70146 - Proposed Agency Information Collection Activities; Comment Request
Federal Register 2010, 2011, 2012, 2013, 2014
2011-11-10
..., electronically using the Federal Reserve's Internet Electronic Submission (IESUB) application. The IESUB application would validate the report data for mathematical and logical consistency, calculate derived items.... Federal Reserve Board Clearance Officer--Cynthia Ayouch--Division of Research and Statistics, Board of...
NASA Technical Reports Server (NTRS)
Hazelton, R. C.; Yadlowsky, E. J.; Churchill, R. J.; Parker, L. W.; Sellers, B.
1981-01-01
The effect of differential charging of spacecraft thermal control surfaces is assessed by studying the dynamics of the charging process. A program to experimentally validate a computer model of the charging process was established. Time-resolved measurements of the surface potential were obtained for samples of Kapton and Teflon irradiated with a monoenergetic electron beam. Results indicate that the computer model and experimental measurements agree well and that for Teflon, secondary emission is the governing factor. Experimental data indicate that bulk conductivities play a significant role in the charging of Kapton.
Validation of the electronic version of the BREAST-Q in the army of women study.
Fuzesi, Sarah; Cano, Stefan J; Klassen, Anne F; Atisha, Dunya; Pusic, Andrea L
2017-06-01
Women undergoing surgery for primary breast cancer can choose between breast conserving therapy and mastectomy (with or without breast reconstruction). Patients often turn to outcomes data to help guide the decision-making process. The BREAST-Q is a validated breast surgery-specific patient-reported outcome measure that evaluates satisfaction, quality of life, and patient experience. It was originally developed for paper-and-pencil administration. However, the BREAST-Q has increasingly been administered electronically. Therefore, the aim of this study was to evaluate the psychometric properties of an electronic version of the BREAST-Q in a large online survey. Women with a history of breast cancer surgery recruited from the Love/AVON Army of Women program completed an electronic version of the BREAST-Q in addition to the Impact of Cancer Survey and PTSD Checklist. Traditional psychometric analyses were performed on the collected data. BREAST-Q data were collected from 6748 women (3497 Breast Conserving Therapy module, 1295 Mastectomy module, 1956 Breast Reconstruction module). Acceptability was supported by a high response rate (82%), low frequency of missing data (<5%), and maximum endorsement frequencies (<80%) in all but 17 items. Scale reliability was supported by high Cronbach's α coefficients (≥0.78) and item-total correlations (range of means, 0.65-0.91). Validity was supported by interscale correlations, convergent and divergent hypotheses as well as clinical hypotheses. The electronically administered BREAST-Q yields highly reliable, clinically meaningful data for use in clinical outcomes research. The BREAST-Q can be used in the clinical setting, whether administered electronically or using paper-and-pencil, at the choice of the patient and surgeon. Copyright © 2017 Elsevier Ltd. All rights reserved.
Guiding-center equations for electrons in ultraintense laser fields
DOE Office of Scientific and Technical Information (OSTI.GOV)
Moore, J.E.; Fisch, N.J.
1994-01-01
The guiding-center equations are derived for electrons in arbitrarily intense laser fields also subject to external fields and ponderomotive forces. Exhibiting the relativistic mass increase of the oscillating electrons, a simple frame-invariant equation is shown to govern the behavior of the electrons for sufficiently weak background fields and ponderomotive forces. The parameter regime for which such a formulation is valid is made precise, and some predictions of the equation are checked by numerical simulation.
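For context on the kind of relation this abstract refers to, the display below sketches in LaTeX the widely quoted cycle-averaged, relativistic ponderomotive guiding-center force; the normalized vector potential a and the averaged relativistic factor are standard notation introduced here for illustration and are not necessarily the exact variables used in the paper.

\[
a \equiv \frac{eA}{m_e c}, \qquad
\bar{\gamma} = \sqrt{1 + \langle a^2 \rangle}, \qquad
\frac{d\bar{\mathbf{p}}}{dt} \simeq -\,m_e c^2\,\nabla\bar{\gamma}
\;+\; q\left(\mathbf{E}_{\mathrm{ext}} + \bar{\mathbf{v}}\times\mathbf{B}_{\mathrm{ext}}\right),
\]

where the first term is the ponderomotive force on the oscillation center and, for linear polarization, the cycle average gives \(\langle a^2 \rangle = a_0^2/2\). The factor \(\bar{\gamma}\) expresses the relativistic mass increase of the oscillating electron mentioned in the abstract.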
On Electron Beam Ionization of Air and Chemical Reactions for Disturbed Air Deionization
1981-09-22
produced mainly by N + O2 -> NO + O (26) and N(2D) + O2 -> NO + O (27). The NO+ ion, however, is produced by electron impact and photoionization of NO, by charge...third body being an electron or a neutral (reactions 32, 33 and 34, 35, respectively) have not been studied extensively. The theoretical efforts have...concentrated on hydrogen plasmas and are generally valid for low electron temperatures. However, theoretical expressions accurate to within a factor
NASA Astrophysics Data System (ADS)
Anggraini, R.; Darvina, Y.; Amir, H.; Murtiani, M.; Yulkifli, Y.
2018-04-01
Modules are currently in short supply in schools, and learners have not been using them as a resource in the learning process. The 2013 curriculum demands that learning be conducted with a scientific approach, be loaded with character values, and make use of interactive learning resources. The solution proposed here is an interactive module based on a scientific approach and charged with character values, which learners can use inside or outside the classroom. The interactive module covers the straight motion, parabolic motion, and circular motion material of high school physics class X, semester 1. The purpose of this research is to produce such an interactive module and to determine its validity and practicality. The study follows a Research and Development design and was carried out only up to the validity and practicality tests. The validity test was conducted by three lecturers of Physics of FMIPA UNP as experts. The instruments used were a validation sheet and a worksheet. The data were analyzed using a product validity analysis. The object of this research is the electronic module, while the subjects are the three validators.
Hydrodynamic description of transport in strongly correlated electron systems.
Andreev, A V; Kivelson, Steven A; Spivak, B
2011-06-24
We develop a hydrodynamic description of the resistivity and magnetoresistance of an electron liquid in a smooth disorder potential. This approach is valid when the electron-electron scattering length is sufficiently short. In a broad range of temperatures, the dissipation is dominated by heat fluxes in the electron fluid, and the resistivity is inversely proportional to the thermal conductivity, κ. This is in striking contrast to the Stokes flow, in which the resistance is independent of κ and proportional to the fluid viscosity. We also identify a new hydrodynamic mechanism of spin magnetoresistance.
Esteban, Santiago; Rodríguez Tablado, Manuel; Peper, Francisco; Mahumud, Yamila S; Ricci, Ricardo I; Kopitowski, Karin; Terrasa, Sergio
2017-01-01
Precision medicine requires extremely large samples. Electronic health records (EHR) are thought to be a cost-effective source of data for that purpose. Phenotyping algorithms help reduce classification errors, making the EHR a more reliable source of information for research. Four algorithm development strategies for classifying patients according to their diabetes status (diabetic; non-diabetic; inconclusive) were tested: one codes-only algorithm, one Boolean algorithm, four statistical learning algorithms, and six stacked generalization meta-learners. The best-performing algorithms within each strategy were tested on the validation set. The stacked generalization algorithm yielded the highest Kappa coefficient in the validation set (0.95, 95% CI 0.91-0.98). The implementation of these algorithms allows data from thousands of patients to be exploited accurately, greatly reducing the costs of constructing retrospective cohorts for research.
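As an illustration of the stacked-generalization strategy that performed best in this study, the sketch below combines two base classifiers through a logistic-regression meta-learner using scikit-learn; the synthetic features and labels are placeholders, not the study's actual EHR variables or algorithms.

# Minimal sketch of stacked generalization for an EHR-style phenotyping task.
# Assumes scikit-learn >= 0.22; the data are synthetic stand-ins for coded EHR features.
import numpy as np
from sklearn.datasets import make_classification
from sklearn.ensemble import RandomForestClassifier, StackingClassifier
from sklearn.linear_model import LogisticRegression
from sklearn.metrics import cohen_kappa_score
from sklearn.model_selection import train_test_split

X, y = make_classification(n_samples=2000, n_features=20, random_state=0)
X_train, X_valid, y_train, y_valid = train_test_split(X, y, test_size=0.3, random_state=0)

base_learners = [
    ("rf", RandomForestClassifier(n_estimators=200, random_state=0)),
    ("lr", LogisticRegression(max_iter=1000)),
]
stack = StackingClassifier(estimators=base_learners,
                           final_estimator=LogisticRegression(max_iter=1000),
                           cv=5)  # out-of-fold base predictions feed the meta-learner
stack.fit(X_train, y_train)
print("kappa on validation set:", cohen_kappa_score(y_valid, stack.predict(X_valid)))

The key design choice is that the meta-learner is trained on cross-validated predictions of the base models, which is what distinguishes stacking from simply averaging classifiers.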
Correction of electronic record for weighing bucket precipitation gauge measurements
USDA-ARS?s Scientific Manuscript database
Electronic sensors generate valuable streams of forcing and validation data for hydrologic models, but are often subject to noise, which must be removed as part of model input and testing database development. We developed the Automated Precipitation Correction Program (APCP) for weighing bucket preci...
78 FR 45205 - Agency Information Collection Activities: Proposed Collection; Comment Request
Federal Register 2010, 2011, 2012, 2013, 2014
2013-07-26
... associated materials (see ADDRESSES). CMS-10326 Electronic Submission of Medicare Graduate Medical Education... collection; Title of Information Collection: Electronic Submission of Medicare Graduate Medical Education... Education FTE cap slots are valid according to CMS regulations. The affiliation agreements are also used as...
Afshar, Majid; Press, Valerie G; Robison, Rachel G; Kho, Abel N; Bandi, Sindhura; Biswas, Ashvini; Avila, Pedro C; Kumar, Harsha Vardhan Madan; Yu, Byung; Naureckas, Edward T; Nyenhuis, Sharmilee M; Codispoti, Christopher D
2017-10-13
Comprehensive, rapid, and accurate identification of patients with asthma for clinical care and engagement in research efforts is needed. The original development and validation of a computable phenotype for asthma case identification occurred at a single institution in Chicago and demonstrated excellent test characteristics. However, its application in a diverse payer mix, across different health systems and multiple electronic health record vendors, and in both children and adults was not examined. The objective of this study is to externally validate the computable phenotype across diverse Chicago institutions to accurately identify pediatric and adult patients with asthma. A cohort of 900 asthma and control patients was identified from the electronic health record between January 1, 2012 and November 30, 2014. Two physicians at each site independently reviewed the patient chart to annotate cases. The inter-observer reliability between the physician reviewers had a κ-coefficient of 0.95 (95% CI 0.93-0.97). The accuracy, sensitivity, specificity, negative predictive value, and positive predictive value of the computable phenotype were all above 94% in the full cohort. The excellent positive and negative predictive values in this multi-center external validation study establish a useful tool to identify asthma cases in the electronic health record for research and care. This computable phenotype could be used in large-scale comparative-effectiveness trials.
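The test characteristics reported above (accuracy, sensitivity, specificity, and predictive values) follow directly from the 2x2 table of phenotype calls against physician chart review; a minimal sketch of that computation is shown below with made-up counts, not the study's data.

# Sketch: test characteristics of a computable phenotype versus chart review.
# The confusion-matrix counts below are illustrative only.
def test_characteristics(tp, fp, fn, tn):
    return {
        "accuracy": (tp + tn) / (tp + fp + fn + tn),
        "sensitivity": tp / (tp + fn),
        "specificity": tn / (tn + fp),
        "ppv": tp / (tp + fp),   # positive predictive value
        "npv": tn / (tn + fn),   # negative predictive value
    }

print(test_characteristics(tp=430, fp=12, fn=15, tn=443))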
Computer-Based and Paper-Based Measurement of Recognition Performance
1989-03-01
domains (e.g., ship silhouettes, electronic schemata, human anatomy) to ascertain the universality of the validity and reliability results...specific graphic database (e.g., ship silhouettes, human anatomy, electronic circuits, topography), contributes to its wide applicability. The game, then...seek implementation of FLASH and PICTURE in other content areas or subject-matter domains (e.g., ship silhouettes, electronic schemata, human anatomy) to
ERIC Educational Resources Information Center
Shriver, Edgar L.; Foley, John P., Jr.
A battery of criterion referenced Job Task Performance Tests (JTPT) was developed because paper and pencil tests of job knowledge and electronic theory had very poor criterion-related or empirical validity with respect to the ability of electronic maintenance men to perform their job. Although the original JTPT required the use of actual…
Newgard, Craig D.; Zive, Dana; Jui, Jonathan; Weathers, Cody; Daya, Mohamud
2011-01-01
Objectives To compare case ascertainment, agreement, validity, and missing values for clinical research data obtained, processed, and linked electronically from electronic health records (EHR), compared to “manual” data processing and record abstraction in a cohort of out-of-hospital trauma patients. Methods This was a secondary analysis of two sets of data collected for a prospective, population-based, out-of-hospital trauma cohort evaluated by 10 emergency medical services (EMS) agencies transporting to 16 hospitals, from January 1, 2006 through October 2, 2007. Eighteen clinical, operational, procedural, and outcome variables were collected and processed separately and independently using two parallel data processing strategies, by personnel blinded to patients in the other group. The electronic approach included electronic health record data exports from EMS agencies, reformatting and probabilistic linkage to outcomes from local trauma registries and state discharge databases. The manual data processing approach included chart matching, data abstraction, and data entry by a trained abstractor. Descriptive statistics, measures of agreement, and validity were used to compare the two approaches to data processing. Results During the 21-month period, 418 patients underwent both data processing methods and formed the primary cohort. Agreement was good to excellent (kappa 0.76 to 0.97; intraclass correlation coefficient 0.49 to 0.97), with exact agreement in 67% to 99% of cases, and a median difference of zero for all continuous and ordinal variables. The proportions of missing out-of-hospital values were similar between the two approaches, although electronic processing generated more missing outcomes (87 out of 418, 21%, 95% CI = 17% to 25%) than the manual approach (11 out of 418, 3%, 95% CI = 1% to 5%). Case ascertainment of eligible injured patients was greater using electronic methods (n = 3,008) compared to manual methods (n = 629). Conclusions In this sample of out-of-hospital trauma patients, an all-electronic data processing strategy identified more patients and generated values with good agreement and validity compared to traditional data collection and processing methods. PMID:22320373
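The agreement statistics used above (kappa for categorical variables, intraclass correlation for continuous ones) can be reproduced with standard libraries; the sketch below compares two short hypothetical columns of electronically and manually processed values and is not derived from the study data.

# Sketch: agreement between electronic and manual data processing for one categorical field.
# Requires scikit-learn; the paired value lists are hypothetical.
from sklearn.metrics import cohen_kappa_score

manual = ["immobilized", "not_immobilized", "immobilized", "immobilized", "not_immobilized"]
electronic = ["immobilized", "not_immobilized", "immobilized", "not_immobilized", "not_immobilized"]

print("kappa:", round(cohen_kappa_score(manual, electronic), 2))
print("exact agreement:",
      sum(m == e for m, e in zip(manual, electronic)) / len(manual))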
Consumer Understanding of Nutrition Marketing Terms: A Pilot Study
ERIC Educational Resources Information Center
Haroldson, Amber; Yen, Chih-Lun
2016-01-01
The purpose of this pilot study was to examine the validity of a questionnaire developed to assess adult consumer understanding of nutrition marketing terms and the resulting impact on consumer behavior. Participants (n = 40) completed an electronic questionnaire. Efforts to establish validity and reliability suggest that the questionnaire is a…
Myrseth, Helga; Notelaers, Guy; Strand, Leif Åge; Borud, Einar Kristian; Olsen, Olav Kjellevold
2017-09-01
To adapt the four-dimensional Gambling Motives Questionnaire-Revised (GMQ-R) to measure the motivation for engaging in electronic gaming, and to validate the internal structure and investigate the criterion validity of the new Electronic Gaming Motives Questionnaire (EGMQ). The GMQ-R was adapted to measure motivation for playing video games and the new instrument was tested on a sample of Norwegian conscripts selected randomly from the pool of conscripts who started their military service between 2013 and 2015. The questionnaire was administered to all those who had played video games during the last 6 months and consisted of 853 gamers (86.8% men, mean age = 19.4 years). All participants completed the EGMQ, as well as other measures of gaming behaviour, gaming problems, boredom, loneliness and depression. The confirmatory factor analyses showed that the proposed EGMQ (measuring enhancement, coping, social and self-gratification motives) displayed satisfactory fit and internal consistency. Hierarchical regression analyses showed that gender emerged as a significant predictor (P < 0.001) of all the dependent variables (variety, hours weekly gaming, loss of control and gaming problems) and the first step explained between 1 and 6.1% of the variance in the gaming behaviours. In the second step the four motivational dimensions explained an additional 5.8-38.8% of the variance. Coping and self-gratification predicted gaming problems (P < 0.001) and coping alone predicted loss of control (P < 0.001). The four motivational dimensions were also predicted differentially by indicators of psychosocial wellbeing, indicating divergent validity of the four motives. The four-dimensional Electronic Gaming Motives Questionnaire is a valid instrument for measuring motives for gaming. © 2017 Society for the Study of Addiction.
Yazdany, Jinoos; Robbins, Mark; Schmajuk, Gabriela; Desai, Sonali; Lacaille, Diane; Neogi, Tuhina; Singh, Jasvinder A.; Genovese, Mark; Myslinski, Rachel; Fisk, Natalie; Francisco, Melissa; Newman, Eric
2017-01-01
Background Electronic clinical quality measures (eCQMs) rely on computer algorithms to extract data from electronic health records (EHRs). On behalf of the American College of Rheumatology (ACR), we sought to develop and test eCQMs for rheumatoid arthritis (RA). Methods Drawing from published ACR guidelines, a working group developed candidate RA process measures and subsequently assessed face validity through an interdisciplinary panel of health care stakeholders. A public comment period followed. Measures that passed these levels of review were electronically specified using the Quality Data Model, which provides standard nomenclature for data elements (category, datatype, value sets) obtained through an EHR. For each eCQM, 3 clinical sites using different EHR systems tested the scientific feasibility and validity of measures. Measures appropriate for accountability were presented for national endorsement. Results Expert panel validity ratings were high for all measures (median 8–9 out of 9). Health system performance on the eCQMs was 53.6% for RA disease activity assessment, 69.1% for functional status assessment, 93.1% for disease-modifying antirheumatic drug (DMARD) use and 72.8% for tuberculosis screening. Kappa statistics, evaluating whether the eCQM validly captured data obtained from manual EHR chart review, demonstrated moderate to substantial agreement (0.54 for functional status assessment, 0.73 for tuberculosis screening, 0.84 for disease activity, and 0.85 for DMARD use). Conclusion Four eCQMs for RA have achieved national endorsement and are recommended for use in federal quality reporting programs. Implementation and further refinement of these measures is ongoing in the ACR’s registry, the Rheumatology Informatics System for Effectiveness (RISE). PMID:27564778
[SCREENING OF NUTRITIONAL STATUS AMONG ELDERLY PEOPLE AT FAMILY MEDICINE].
Račić, M; Ivković, N; Kusmuk, S
2015-11-01
The prevalence of malnutrition in the elderly is high. Malnutrition or risk of malnutrition can be detected by use of nutritional screening or assessment tools. This systematic review aimed to identify tools that would be reliable, valid, sensitive and specific for nutritional status screening in patients older than 65 in family medicine. The review was performed following the Preferred Reporting Items for Systematic Reviews and Meta-Analyses (PRISMA) statement. Studies were retrieved using the MEDLINE (via Ovid), PubMed and Cochrane Library electronic databases and by manual searching of relevant articles listed in the reference lists of key publications. The electronic databases were searched using defined key words adapted to each database and using MeSH terms. Manual revision of reviews and original articles was performed using the Electronic Journals Library. Included studies involved development and validation of screening tools in the community-dwelling elderly population. The tools subjected to validity and reliability testing for use in the community-dwelling elderly population were the Mini Nutritional Assessment (MNA), Mini Nutritional Assessment-Short Form (MNA-SF), Nutrition Screening Initiative (NSI), which includes the DETERMINE list and Level I and II Screen, Seniors in the Community: Risk Evaluation for Eating and Nutrition (SCREEN I and SCREEN II), Subjective Global Assessment (SGA), Nutritional Risk Index (NRI), and Malaysian and South African tools. The MNA and MNA-SF appear to have the highest reliability and validity for screening of the community-dwelling elderly, while the reliability and validity of SCREEN II are good. The authors conclude that whilst several tools have been developed, most have not undergone extensive testing to demonstrate their ability to identify nutritional risk. The MNA and MNA-SF have the highest reliability and validity for screening of nutritional status in the community-dwelling elderly, and the reliability and validity of SCREEN II are satisfactory. These instruments also contain all three nutritional status indicators and are practical for use in family medicine. However, a gold standard for screening cannot yet be set, because testing of reliability and continuous validation in studies with a higher level of evidence still need to be conducted in family medicine.
Pham, T. Anh; Nguyen, Huy -Viet; Rocca, Dario; ...
2013-04-26
In a recent paper we presented an approach to evaluate quasiparticle energies based on the spectral decomposition of the static dielectric matrix. This method does not require the calculation of unoccupied electronic states or the direct diagonalization of large dielectric matrices, and it avoids the use of plasmon-pole models. The numerical accuracy of the approach is controlled by a single parameter, i.e., the number of eigenvectors used in the spectral decomposition of the dielectric matrix. Here we present a comprehensive validation of the method, encompassing calculations of ionization potentials and electron affinities of various molecules and of band gaps for several crystalline and disordered semiconductors. Lastly, we demonstrate the efficiency of our approach by carrying out GW calculations for systems with several hundred valence electrons.
Sparrow, J M; Taylor, H; Qureshi, K; Smith, R; Johnston, R L
2011-08-01
To develop a methodology for case-mix adjustment of surgical outcomes for individual cataract surgeons using electronically collected multi-centre data conforming to the cataract national data set (CND). Routinely collected anonymised data were remotely extracted from electronic patient record (EPR) systems in 12 participating NHS Trusts undertaking cataract surgery. Following data checks and cleaning, analyses were carried out to risk adjust outcomes for posterior capsule rupture rates for individual surgeons, with stratification by surgical grade. A total of 406 surgeons from 12 NHS Trusts submitted data on 55,567 cataract operations between November 2001 and July 2006 (86% from January 2004). In all, 283 surgeons contributed data on >25 cases, providing 54,319 operations suitable for detailed analysis. Case-mix adjusted results of individual surgeons are presented as funnel plots for all surgeons together, and separately for three different grades of surgeon. Plots include 95 and 99.8% confidence limits around the case-mix adjusted outcomes for detection of surgical outliers. Routinely collected electronic data conforming to the CND provides sufficient detail for case-mix adjustment of cataract surgical outcomes. The validation of these risk indicators should be carried out using fresh data to confirm the validity of the risk model. Once validated this model should provide an equitable approach for peer-to-peer comparisons in the context of revalidation.
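The funnel plots described above place control limits around an expected complication rate as a function of each surgeon's caseload; the sketch below computes such limits under a simple normal approximation. The 2% benchmark rate and the z-values are illustrative assumptions, and the paper's case-mix adjustment itself is not reproduced here.

# Sketch: funnel-plot control limits for a benchmark complication rate.
# Normal approximation to the binomial; the 2% benchmark is an assumed value.
import math

def funnel_limits(benchmark, n_cases, z):
    se = math.sqrt(benchmark * (1.0 - benchmark) / n_cases)
    return max(0.0, benchmark - z * se), min(1.0, benchmark + z * se)

for n in (25, 100, 400, 1600):
    lo95, hi95 = funnel_limits(0.02, n, 1.96)    # ~95% limits
    lo998, hi998 = funnel_limits(0.02, n, 3.09)  # ~99.8% limits
    print(f"n={n:5d}  95%: {lo95:.3f}-{hi95:.3f}  99.8%: {lo998:.3f}-{hi998:.3f}")

A surgeon whose adjusted rate falls outside the 99.8% limits for his or her caseload would be flagged as an outlier, which is the peer-comparison use described in the abstract.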
1990-02-01
ELECTRONICS IN ARMOURED VEHICLES, by T. Cousins and T.J. Jamieson, DEFENCE RESEARCH ESTABLISHMENT OTTAWA, REPORT NO. 1032, February...DISPLACEMENT DAMAGE TO ELECTRONICS IN ARMOURED VEHICLES, by T. Cousins, Nuclear Effects Section, Electronics Division, and T.J. Jamieson, Science Applications...The degree of protection from neutron irradiation afforded to electronics by armoured vehicles is most correctly defined by the outside-to-inside ratio
Thermal analysis of electron gun for travelling wave tubes
NASA Astrophysics Data System (ADS)
Bhat, K. S.; Sreedevi, K.; Ravi, M.
2006-11-01
Thermal analysis of a Pierce-type electron gun using the FEM software ANSYS and its experimental validation are presented in this paper. Thermal analysis of the electron gun structure has been carried out to find the effect of heater power on steady-state temperature and warm-up time. The thermal drain of the supporting structure has also been analyzed for different materials. These results were experimentally verified in an electron gun. The experimental results closely match the ANSYS results.
NASA Astrophysics Data System (ADS)
Soligo, Riccardo
In this work, the insight provided by our sophisticated Full Band Monte Carlo simulator is used to analyze the behavior of state-of-the-art devices such as GaN High Electron Mobility Transistors and Hot Electron Transistors. Chapter 1 is dedicated to the description of the simulation tool used to obtain the results shown in this work. Moreover, a separate section is dedicated to the setup of a procedure to validate the tunneling algorithm recently implemented in the simulator. Chapter 2 introduces High Electron Mobility Transistors (HEMTs), state-of-the-art devices characterized by highly nonlinear transport phenomena that require the use of advanced simulation methods. The techniques for device modeling are described and applied to a recent GaN HEMT, and they are validated with experimental measurements. The main characterization techniques are also described, including the original contribution provided by this work. Chapter 3 focuses on a popular technique to enhance HEMT performance: the down-scaling of the device dimensions. In particular, this chapter is dedicated to lateral scaling and the calculation of a limiting cutoff frequency for a device of vanishing length. Finally, Chapter 4 and Chapter 5 describe the modeling of Hot Electron Transistors (HETs). The simulation approach is validated by matching the current characteristics with the experimental ones before variations of the layouts are proposed to increase the current gain to values suitable for amplification. The frequency response of these layouts is calculated and modeled by a small-signal circuit. For this purpose, a method to directly calculate the capacitance is developed, which provides a graphical picture of the capacitive phenomena that limit the frequency response of the devices. In Chapter 5, the properties of the hot electrons are investigated for different injection energies, which are obtained by changing the layout of the emitter barrier. Moreover, the large-signal characterization of the HET is shown for different layouts, where the collector barrier was scaled.
Federal Register 2010, 2011, 2012, 2013, 2014
2011-02-16
....300) require the following standard operating procedures to assure appropriate use of, and precautions for, systems using electronic records and signatures: (1) Sec. 11.10 specifies procedures and controls... burden associated with the creation of standard operating procedures, validation, and certification. The...
ERIC Educational Resources Information Center
Manpower Administration (DOL), Washington, DC. U.S. Training and Employment Service.
The United States Training and Employment Service General Aptitude Test Battery (GATB), first published in 1947, has been included in a continuing program of research to validate the tests against success in many different occupations. The GATB consists of 12 tests which measure nine aptitudes: General Learning Ability; Verbal Aptitude; Numerical…
Soft, Transparent, Electronic Skin for Distributed and Multiple Pressure Sensing
Levi, Alessandro; Piovanelli, Matteo; Furlan, Silvano; Mazzolai, Barbara; Beccai, Lucia
2013-01-01
In this paper we present a new optical, flexible pressure sensor that can be applied as smart skin to a robot or to consumer electronic devices. We describe a mechano-optical transduction principle that can allow the encoding of information related to an externally applied mechanical stimulus, e.g., contact, pressure and shape of contact. The physical embodiment that we present in this work is an electronic skin consisting of eight infrared emitters and eight photo-detectors coupled together and embedded in a planar PDMS waveguide of 5.5 cm diameter. When a contact occurs on the sensing area, the optical signals reaching the peripheral detectors experience a loss because of the Frustrated Total Internal Reflection and deformation of the material. The light signal is converted to electrical signal through an electronic system and a reconstruction algorithm running on a computer reconstructs the pressure map. Pilot experiments are performed to validate the tactile sensing principle by applying external pressures up to 160 kPa. Moreover, the capabilities of the electronic skin to detect contact pressure at multiple subsequent positions, as well as its function on curved surfaces, are validated. A weight sensitivity of 0.193 gr−1 was recorded, thus making the electronic skin suitable to detect pressures in the order of few grams. PMID:23686140
Validation of nonlinear gyrokinetic simulations of L- and I-mode plasmas on Alcator C-Mod
DOE Office of Scientific and Technical Information (OSTI.GOV)
Creely, A. J.; Howard, N. T.; Rodriguez-Fernandez, P.
New validation of global, nonlinear, ion-scale gyrokinetic simulations (GYRO) is carried out for L- and I-mode plasmas on Alcator C-Mod, utilizing heat fluxes, profile stiffness, and temperature fluctuations. Previous work at C-Mod found that ITG/TEM-scale GYRO simulations can match both electron and ion heat fluxes within error bars in I-mode [White PoP 2015], suggesting that multi-scale (cross-scale coupling) effects [Howard PoP 2016] may be less important in I-mode than in L-mode. New results presented here, however, show that global, nonlinear, ion-scale GYRO simulations are able to match the experimental ion heat flux, but underpredict electron heat flux (at most radii), electron temperature fluctuations, and perturbative thermal diffusivity in both L- and I-mode. Linear addition of electron heat flux from electron scale runs does not resolve this discrepancy. These results indicate that single-scale simulations do not sufficiently describe the I-mode core transport, and that multi-scale (coupled electron- and ion-scale) transport models are needed. In conclusion, a preliminary investigation with multi-scale TGLF, however, was unable to resolve the discrepancy between ion-scale GYRO and experimental electron heat fluxes and perturbative diffusivity, motivating further work with multi-scale GYRO simulations and a more comprehensive study with multi-scale TGLF.
Validation of nonlinear gyrokinetic simulations of L- and I-mode plasmas on Alcator C-Mod
Creely, A. J.; Howard, N. T.; Rodriguez-Fernandez, P.; ...
2017-03-02
New validation of global, nonlinear, ion-scale gyrokinetic simulations (GYRO) is carried out for L- and I-mode plasmas on Alcator C-Mod, utilizing heat fluxes, profile stiffness, and temperature fluctuations. Previous work at C-Mod found that ITG/TEM-scale GYRO simulations can match both electron and ion heat fluxes within error bars in I-mode [White PoP 2015], suggesting that multi-scale (cross-scale coupling) effects [Howard PoP 2016] may be less important in I-mode than in L-mode. New results presented here, however, show that global, nonlinear, ion-scale GYRO simulations are able to match the experimental ion heat flux, but underpredict electron heat flux (at most radii), electron temperature fluctuations, and perturbative thermal diffusivity in both L- and I-mode. Linear addition of electron heat flux from electron scale runs does not resolve this discrepancy. These results indicate that single-scale simulations do not sufficiently describe the I-mode core transport, and that multi-scale (coupled electron- and ion-scale) transport models are needed. In conclusion, a preliminary investigation with multi-scale TGLF, however, was unable to resolve the discrepancy between ion-scale GYRO and experimental electron heat fluxes and perturbative diffusivity, motivating further work with multi-scale GYRO simulations and a more comprehensive study with multi-scale TGLF.
AI and workflow automation: The prototype electronic purchase request system
NASA Technical Reports Server (NTRS)
Compton, Michael M.; Wolfe, Shawn R.
1994-01-01
Automating 'paper' workflow processes with electronic forms and email can dramatically improve the efficiency of those processes. However, applications that involve complex forms that are used for a variety of purposes or that require numerous and varied approvals often require additional software tools to ensure that (1) the electronic form is correctly and completely filled out, and (2) the form is routed to the proper individuals and organizations for approval. The prototype electronic purchase request (PEPR) system, which has been in pilot use at NASA Ames Research Center since December 1993, seamlessly links a commercial electronic forms package and a CLIPS-based knowledge system that first ensures that electronic forms are correct and complete, and then generates an 'electronic routing slip' that is used to route the form to the people who must sign it. The PEPR validation module is context-sensitive, and can apply different validation rules at each step in the approval process. The PEPR system is form-independent, and has been applied to several different types of forms. The system employs a version of CLIPS that has been extended to support AppleScript, a recently-released scripting language for the Macintosh. This 'scriptability' provides both a transparent, flexible interface between the two programs and a means by which a single copy of the knowledge base can be utilized by numerous remote users.
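The two functions of the PEPR knowledge system, checking a form for completeness and consistency and then generating a routing slip of required approvers, can be illustrated with a small rule-based sketch. The field names, thresholds, and approver roles below are invented for illustration and are not the actual CLIPS rules used at NASA Ames.

# Sketch of a context-sensitive form validator plus routing-slip generator.
# All field names, dollar thresholds, and roles are hypothetical.
def validate(form):
    errors = []
    if not form.get("description"):
        errors.append("description is required")
    if form.get("amount", 0) <= 0:
        errors.append("amount must be positive")
    if form.get("amount", 0) > 2500 and not form.get("justification"):
        errors.append("purchases over $2500 need a justification")
    return errors

def routing_slip(form):
    approvers = ["requester's supervisor"]
    if form.get("amount", 0) > 2500:
        approvers.append("branch chief")
    if form.get("category") == "computer_equipment":
        approvers.append("IT property office")
    return approvers

request = {"description": "oscilloscope", "amount": 3400.0,
           "justification": "replacement for failed unit", "category": "lab_equipment"}
print(validate(request) or "form is complete")
print("route to:", routing_slip(request))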
NASA Technical Reports Server (NTRS)
Panek, Joseph W.
2001-01-01
The proper operation of the Electronically Scanned Pressure (ESP) System is critical to accomplishing the following goals: acquisition of highly accurate pressure data for the development of aerospace and commercial aviation systems, and continuous confirmation of data quality to avoid costly, unplanned, repeat wind tunnel or turbine testing. Standard automated setup and checkout routines are necessary to accomplish these goals. Data verification and integrity checks occur at three distinct stages: pretest pressure tubing and system checkouts, daily system validation, and in-test confirmation of critical system parameters. This paper will give an overview of the existing hardware, software and methods used to validate data integrity.
Generalized Spencer-Lewis equation
DOE Office of Scientific and Technical Information (OSTI.GOV)
Filippone, W.L.
The Spencer-Lewis equation, which describes electron transport in homogeneous media when continuous slowing down theory is valid, is derived from the Boltzmann equation. Also derived is a time-dependent generalized Spencer-Lewis equation valid for inhomogeneous media. An independent verification of this last equation is obtained for the one-dimensional case using particle balance considerations.
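For orientation, a commonly written schematic form of the Spencer-Lewis equation, with path length s as the evolution variable under the continuous-slowing-down approximation, is given below in LaTeX; this is standard textbook-style notation rather than the report's own derivation, which extends the equation to inhomogeneous media and time dependence.

\[
\frac{\partial \psi(\mathbf{r},\boldsymbol{\Omega},s)}{\partial s}
+ \boldsymbol{\Omega}\cdot\nabla\psi(\mathbf{r},\boldsymbol{\Omega},s)
+ \sigma(s)\,\psi(\mathbf{r},\boldsymbol{\Omega},s)
= \int_{4\pi}\sigma_s\!\left(s,\boldsymbol{\Omega}\cdot\boldsymbol{\Omega}'\right)
\psi(\mathbf{r},\boldsymbol{\Omega}',s)\,d\Omega',
\]

where s is the electron path length (which fixes the energy through the CSDA range relation), and \(\sigma\) and \(\sigma_s\) denote the total and differential elastic scattering cross sections at that path length.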
Aguirre-Junco, Angel-Ricardo; Colombet, Isabelle; Zunino, Sylvain; Jaulent, Marie-Christine; Leneveut, Laurence; Chatellier, Gilles
2004-01-01
The initial step in the computerization of guidelines is the specification of knowledge from the prose text of the guidelines. We describe a method of knowledge specification based on a structured and systematic analysis of the text, allowing detailed specification of a decision tree. We use decision tables to validate the decision algorithm and decision trees to specify and represent this algorithm, along with elementary messages of recommendation. Editing tools are also necessary to facilitate the process of validation and the workflow between the expert physicians who validate the specified knowledge and the computer scientists who encode it in a guideline model. Applied to eleven different guidelines issued by an official agency, the method allows quick and valid computerization and integration into a larger decision support system called EsPeR (Personalized Estimate of Risks). The quality of the text guidelines, however, still needs further improvement. The method used for computerization could help define a framework usable at the initial step of guideline development in order to produce guidelines ready for electronic implementation.
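A decision table can be checked mechanically for the completeness (and, by construction here, mutual exclusivity) that makes a guideline algorithm safe to encode; the sketch below does this for a toy two-condition table and only illustrates the kind of validation described, not the EsPeR tooling itself.

# Sketch: completeness check for a guideline decision table.
# Conditions and recommendations are toy examples, not real clinical rules.
from itertools import product

# Each rule maps a tuple of condition values (in a fixed order) to a recommendation.
rules = {
    (True, True): "refer to specialist",
    (True, False): "start treatment, recheck in 3 months",
    (False, True): "recheck in 3 months",
    (False, False): "routine follow-up",
}

def check_table(rules, n_conditions):
    combos = set(product([True, False], repeat=n_conditions))
    missing = combos - set(rules)          # condition combinations with no recommendation
    return {"complete": not missing, "missing_combinations": sorted(missing)}

print(check_table(rules, n_conditions=2))

Because the rules are stored as a mapping, each combination of conditions can carry at most one recommendation, so only completeness needs an explicit check.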
Geographic Information Systems to Assess External Validity in Randomized Trials.
Savoca, Margaret R; Ludwig, David A; Jones, Stedman T; Jason Clodfelter, K; Sloop, Joseph B; Bollhalter, Linda Y; Bertoni, Alain G
2017-08-01
To support claims that RCTs can reduce health disparities (i.e., are translational), it is imperative that methodologies exist to evaluate the tenability of external validity in RCTs when probabilistic sampling of participants is not employed. Typically, attempts at establishing post hoc external validity are limited to a few comparisons across convenience variables, which must be available in both sample and population. A Type 2 diabetes RCT was used as an example of a method that uses a geographic information system to assess external validity in the absence of a priori probabilistic community-wide diabetes risk sampling strategy. A geographic information system, 2009-2013 county death certificate records, and 2013-2014 electronic medical records were used to identify community-wide diabetes prevalence. Color-coded diabetes density maps provided visual representation of these densities. Chi-square goodness of fit statistic/analysis tested the degree to which distribution of RCT participants varied across density classes compared to what would be expected, given simple random sampling of the county population. Analyses were conducted in 2016. Diabetes prevalence areas as represented by death certificate and electronic medical records were distributed similarly. The simple random sample model was not a good fit for death certificate record (chi-square, 17.63; p=0.0001) and electronic medical record data (chi-square, 28.92; p<0.0001). Generally, RCT participants were oversampled in high-diabetes density areas. Location is a highly reliable "principal variable" associated with health disparities. It serves as a directly measurable proxy for high-risk underserved communities, thus offering an effective and practical approach for examining external validity of RCTs. Copyright © 2017 American Journal of Preventive Medicine. Published by Elsevier Inc. All rights reserved.
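The chi-square goodness-of-fit comparison described above asks whether trial participants are distributed across diabetes-density classes as simple random sampling of the county would predict; the sketch below runs that test with SciPy on invented counts and county proportions, purely to show the shape of the analysis.

# Sketch: chi-square goodness-of-fit test of RCT enrollment across density classes.
# Observed counts and county-wide proportions are hypothetical.
from scipy.stats import chisquare

observed = [12, 35, 58, 95]                 # RCT participants per density class (low -> high)
county_proportions = [0.25, 0.30, 0.25, 0.20]
expected = [p * sum(observed) for p in county_proportions]

stat, p_value = chisquare(f_obs=observed, f_exp=expected)
print(f"chi-square = {stat:.2f}, p = {p_value:.4f}")

A small p-value, as in the study, indicates that enrollment departs from what simple random sampling of the county would produce, here with oversampling in the high-density classes.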
DOE Office of Scientific and Technical Information (OSTI.GOV)
Vriens, L.; Smeets, A.H.M.
1980-09-01
For electron-induced ionization, excitation, and de-excitation, mainly from excited atomic states, a detailed analysis is presented of the dependence of the cross sections and rate coefficients on electron energy and temperature, and on atomic parameters. A wide energy range is covered, including sudden as well as adiabatic collisions. By combining the available experimental and theoretical information, a set of simple analytical formulas is constructed for the cross sections and rate coefficients of the processes mentioned, for the total depopulation, and for three-body recombination. The formulas account for large deviations from classical and semiclassical scaling, as found for excitation. They agree with experimental data and with the theories in their respective ranges of validity, but have a wider range of validity than the separate theories. The simple analytical form further facilitates the application in plasma modeling.
The Trojan Lifetime Champions Health Survey: development, validity, and reliability.
Sorenson, Shawn C; Romano, Russell; Scholefield, Robin M; Schroeder, E Todd; Azen, Stanley P; Salem, George J
2015-04-01
Self-report questionnaires are an important method of evaluating lifespan health, exercise, and health-related quality of life (HRQL) outcomes among elite, competitive athletes. Few instruments, however, have undergone formal characterization of their psychometric properties within this population. To evaluate the validity and reliability of a novel health and exercise questionnaire, the Trojan Lifetime Champions (TLC) Health Survey. Descriptive laboratory study. A large National Collegiate Athletic Association Division I university. A total of 63 university alumni (age range, 24 to 84 years), including former varsity collegiate athletes and a control group of nonathletes. Participants completed the TLC Health Survey twice at a mean interval of 23 days with randomization to the paper or electronic version of the instrument. Content validity, feasibility of administration, test-retest reliability, parallel-form reliability between paper and electronic forms, and estimates of systematic and typical error versus differences of clinical interest were assessed across a broad range of health, exercise, and HRQL measures. Correlation coefficients, including intraclass correlation coefficients (ICCs) for continuous variables and κ agreement statistics for ordinal variables, for test-retest reliability averaged 0.86, 0.90, 0.80, and 0.74 for HRQL, lifetime health, recent health, and exercise variables, respectively. Correlation coefficients, again ICCs and κ, for parallel-form reliability (ie, equivalence) between paper and electronic versions averaged 0.90, 0.85, 0.85, and 0.81 for HRQL, lifetime health, recent health, and exercise variables, respectively. Typical measurement error was less than the a priori thresholds of clinical interest, and we found minimal evidence of systematic test-retest error. We found strong evidence of content validity, convergent construct validity with the Short-Form 12 Version 2 HRQL instrument, and feasibility of administration in an elite, competitive athletic population. These data suggest that the TLC Health Survey is a valid and reliable instrument for assessing lifetime and recent health, exercise, and HRQL, among elite competitive athletes. Generalizability of the instrument may be enhanced by additional, larger-scale studies in diverse populations.
Nanotechnology with Carbon Nanotubes: Mechanics, Chemistry, and Electronics
NASA Technical Reports Server (NTRS)
Srivastava, Deepak
2003-01-01
This viewgraph presentation reviews the nanotechnology of carbon nanotubes. The contents include: 1) Nanomechanics examples; 2) Experimental validation of nanotubes in composites; 3) Anisotropic plastic collapse; 4) Spatio-temporal scales, yielding single-wall nanotubes; 5) Side-wall functionalization of nanotubes; 6) Multi-wall Y-junction carbon nanotubes; 7) Molecular electronics with nanotube junctions; 8) Single-wall carbon nanotube junctions: welding; 9) Biomimetic dendritic neurons: carbon nanotubes, nanotube electronics (basics), and nanotube junctions for devices.
EXCITATION OF LEVELS IN Li-7 BY INELASTIC ELECTRON SCATTERING
DOE Office of Scientific and Technical Information (OSTI.GOV)
Bernheim, M; Bishop, G R
1963-07-15
Cross sections for the excitation of some levels in Li-7 up to 8-MeV excitation energy were measured by the inelastic scattering of electrons for a variety of incident electron energies and scattering angles. The cross section calculated in first Born approximation is expected to be valid for this nucleus. The calculated angular distribution is given for different spin and parity and for different levels of excitation. (R.E.U.)
Approaches for optimizing the first electronic hyperpolarizability of conjugated organic molecules
NASA Technical Reports Server (NTRS)
Marder, S. R.; Beratan, D. N.; Cheng, L.-T.
1991-01-01
Conjugated organic molecules with electron-donating and -accepting moieties can exhibit large electronic second-order nonlinearities, or first hyperpolarizabilities, beta. The present two-state, four-orbital independent-electron analysis of beta leads to the prediction that its absolute value will be maximized at a combination of donor and acceptor strengths for a given conjugated bridge. Molecular design strategies for beta optimization are proposed which give attention to the energetic manipulations of the bridge states. Experimental results have been obtained which support the validity of this approach.
On the Monte Carlo simulation of electron transport in the sub-1 keV energy range.
Thomson, Rowan M; Kawrakow, Iwan
2011-08-01
The validity of "classic" Monte Carlo (MC) simulations of electron and positron transport at sub-1 keV energies is investigated in the context of quantum theory. Quantum theory dictates that uncertainties on the position and energy-momentum four-vectors of radiation quanta obey Heisenberg's uncertainty relation; however, these uncertainties are neglected in "classical" MC simulations of radiation transport in which position and momentum are known precisely. Using the quantum uncertainty relation and electron mean free path, the magnitudes of uncertainties on electron position and momentum are calculated for different kinetic energies; a validity bound on the classical simulation of electron transport is derived. In order to satisfy the Heisenberg uncertainty principle, uncertainties of 5% must be assigned to position and momentum for 1 keV electrons in water; at 100 eV, these uncertainties are 17 to 20% and are even larger at lower energies. In gaseous media such as air, these uncertainties are much smaller (less than 1% for electrons with energy 20 eV or greater). The classical Monte Carlo transport treatment is questionable for sub-1 keV electrons in condensed water as uncertainties on position and momentum must be large (relative to electron momentum and mean free path) to satisfy the quantum uncertainty principle. Simulations which do not account for these uncertainties are not faithful representations of the physical processes, calling into question the results of MC track structure codes simulating sub-1 keV electron transport. Further, the large difference in the scale at which quantum effects are important in gaseous and condensed media suggests that track structure measurements in gases are not necessarily representative of track structure in condensed materials on a micrometer or a nanometer scale.
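The validity argument summarized above rests on requiring the quantum position and momentum uncertainties to stay small relative to the electron mean free path and momentum; a schematic statement of that bound is written below in LaTeX, with the fractional-uncertainty form chosen here for illustration rather than copied from the paper.

\[
\Delta x\,\Delta p \;\ge\; \frac{\hbar}{2}
\quad\Longrightarrow\quad
\left(\frac{\Delta x}{\lambda}\right)\!\left(\frac{\Delta p}{p}\right)
\;\ge\; \frac{\hbar}{2\,\lambda\,p},
\]

so a classical trajectory treatment (with \(\Delta x \ll \lambda\) and \(\Delta p \ll p\)) is only self-consistent when \(\hbar/(2\lambda p) \ll 1\). In condensed water the short mean free path of sub-keV electrons makes this product large enough that relative uncertainties of several percent or more are unavoidable, consistent with the figures quoted above; in gases the much longer mean free path keeps the product small.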
Predicting neutropenia risk in patients with cancer using electronic data.
Pawloski, Pamala A; Thomas, Avis J; Kane, Sheryl; Vazquez-Benitez, Gabriela; Shapiro, Gary R; Lyman, Gary H
2017-04-01
Clinical guidelines recommending the use of myeloid growth factors are largely based on the prescribed chemotherapy regimen. The guidelines suggest that oncologists consider patient-specific characteristics when prescribing granulocyte-colony stimulating factor (G-CSF) prophylaxis; however, a mechanism to quantify individual patient risk is lacking. Readily available electronic health record (EHR) data can provide patient-specific information needed for individualized neutropenia risk estimation. An evidence-based, individualized neutropenia risk estimation algorithm has been developed. This study evaluated the automated extraction of EHR chemotherapy treatment data and externally validated the neutropenia risk prediction model. A retrospective cohort of adult patients with newly diagnosed breast, colorectal, lung, lymphoid, or ovarian cancer who received the first cycle of a cytotoxic chemotherapy regimen from 2008 to 2013 were recruited from a single cancer clinic. Electronically extracted EHR chemotherapy treatment data were validated by chart review. Neutropenia risk stratification was conducted and risk model performance was assessed using calibration and discrimination. Chemotherapy treatment data electronically extracted from the EHR were verified by chart review. The neutropenia risk prediction tool classified 126 patients (57%) as being low risk for febrile neutropenia, 44 (20%) as intermediate risk, and 51 (23%) as high risk. The model was well calibrated (Hosmer-Lemeshow goodness-of-fit test = 0.24). Discrimination was adequate and slightly less than in the original internal validation (c-statistic 0.75 vs 0.81). Chemotherapy treatment data were electronically extracted from the EHR successfully. The individualized neutropenia risk prediction model performed well in our retrospective external cohort. © The Author 2016. Published by Oxford University Press on behalf of the American Medical Informatics Association. All rights reserved. For Permissions, please email: journals.permissions@oup.com
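Model performance in this study is summarized by calibration (Hosmer-Lemeshow) and discrimination (the c-statistic); the sketch below computes a c-statistic and a coarse observed-versus-expected calibration table for synthetic predictions, as a stand-in for the study's validation procedure rather than its actual risk model.

# Sketch: discrimination (c-statistic) and a coarse calibration check for a risk model.
# Predicted probabilities and outcomes are synthetic.
import numpy as np
from sklearn.metrics import roc_auc_score

rng = np.random.default_rng(0)
predicted = rng.uniform(0.02, 0.6, size=500)   # predicted neutropenia risk per patient
observed = rng.binomial(1, predicted)          # outcomes drawn from those risks

print("c-statistic:", round(roc_auc_score(observed, predicted), 3))

# Observed vs. expected event counts by risk quintile (a rough calibration view).
quintile = np.digitize(predicted, np.quantile(predicted, [0.2, 0.4, 0.6, 0.8]))
for q in range(5):
    mask = quintile == q
    print(f"quintile {q + 1}: expected={predicted[mask].sum():.1f}, observed={observed[mask].sum()}")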
An Empirical Validation of the Effectiveness of a Computerized Game to Teach Troubleshooting.
ERIC Educational Resources Information Center
Simutis, Zita M.; And Others
Forty-two enlisted men and women with no prior knowledge about electronics maintenance or logic diagrams participated in research designed to collect preliminary data on the training effectiveness of a problem solving computerized game for teaching electronics maintenance. Two games available on the University of Illinois PLATO Computer-Based…
8 CFR 217.5 - Electronic System for Travel Authorization.
Code of Federal Regulations, 2013 CFR
2013-01-01
... Waiver Program (VWP) must, within the time specified in paragraph (b) of this section, receive a travel... period of time the travel authorization is valid. An authorization under ESTA is not a determination that... 8 Aliens and Nationality 1 2013-01-01 2013-01-01 false Electronic System for Travel Authorization...
8 CFR 217.5 - Electronic System for Travel Authorization.
Code of Federal Regulations, 2014 CFR
2014-01-01
... Waiver Program (VWP) must, within the time specified in paragraph (b) of this section, receive a travel... period of time the travel authorization is valid. An authorization under ESTA is not a determination that... 8 Aliens and Nationality 1 2014-01-01 2014-01-01 false Electronic System for Travel Authorization...
8 CFR 217.5 - Electronic System for Travel Authorization.
Code of Federal Regulations, 2010 CFR
2010-01-01
... Waiver Program (VWP) must, within the time specified in paragraph (b) of this section, receive a travel... admission under the VWP during the period of time the travel authorization is valid. An authorization under... 8 Aliens and Nationality 1 2010-01-01 2010-01-01 false Electronic System for Travel Authorization...
8 CFR 217.5 - Electronic System for Travel Authorization.
Code of Federal Regulations, 2012 CFR
2012-01-01
... Waiver Program (VWP) must, within the time specified in paragraph (b) of this section, receive a travel... period of time the travel authorization is valid. An authorization under ESTA is not a determination that... 8 Aliens and Nationality 1 2012-01-01 2012-01-01 false Electronic System for Travel Authorization...
8 CFR 217.5 - Electronic System for Travel Authorization.
Code of Federal Regulations, 2011 CFR
2011-01-01
... Waiver Program (VWP) must, within the time specified in paragraph (b) of this section, receive a travel... period of time the travel authorization is valid. An authorization under ESTA is not a determination that... 8 Aliens and Nationality 1 2011-01-01 2011-01-01 false Electronic System for Travel Authorization...
Diesel Technology: Electrical and Electronic Systems. Teacher Edition [and] Student Edition.
ERIC Educational Resources Information Center
Ready, Allan; Kauffman, Ricky; Bogle, Jerry
This document contains the materials for a competency-based course in diesel technology and electrical and electronic systems that is tied to measurable and observable learning outcomes identified and validated by an advisory committee of business and industry representatives and teachers. The competencies addressed align with the medium/heavy…
Electronic Signatures in Global and National Commerce Act. Public Law.
ERIC Educational Resources Information Center
Congress of the U.S., Washington, DC.
This document presents the text of Public Law 106-229, the "Electronic Signatures in Global and National Commerce Act." The act states that, with respect to any transaction in or affecting interstate or foreign commerce: a signature, contract, or other record relating to such transaction may not be denied legal effect, validity, or…
Tojo, H; Yamada, I; Yasuhara, R; Ejiri, A; Hiratsuka, J; Togashi, H; Yatsuka, E; Hatae, T; Funaba, H; Hayashi, H; Takase, Y; Itami, K
2016-09-01
This paper evaluates the accuracy of electron temperature measurements and relative transmissivities of double-pass Thomson scattering diagnostics. The electron temperature (Te) is obtained from the ratio of signals from a double-pass scattering system, then relative transmissivities are calculated from the measured Te and intensity of the signals. How accurate the values are depends on the electron temperature (Te) and scattering angle (θ), and therefore the accuracy of the values was evaluated experimentally using the Large Helical Device (LHD) and the Tokyo spherical tokamak-2 (TST-2). Analyzing the data from the TST-2 indicates that a high Te and a large scattering angle (θ) yield accurate values. Indeed, the errors for scattering angle θ = 135° are approximately half of those for θ = 115°. The method of determining Te in a wide Te range spanning over two orders of magnitude (0.01-1.5 keV) was validated using the experimental results of the LHD and TST-2. A simple method to provide relative transmissivities, which include inputs from collection optics, vacuum window, optical fibers, and polychromators, is also presented. The relative errors were less than approximately 10%. Numerical simulations also indicate that the Te measurements are valid under harsh radiation conditions. This method to obtain Te can be considered for the design of Thomson scattering systems where there is high-performance plasma that generates harsh radiation environments.
Sun, Xiyang; Miao, Jiacheng; Wang, You; Luo, Zhiyuan; Li, Guang
2017-01-01
An estimate on the reliability of prediction in the applications of electronic nose is essential, which has not been paid enough attention. An algorithm framework called conformal prediction is introduced in this work for discriminating different kinds of ginsengs with a home-made electronic nose instrument. Nonconformity measure based on k-nearest neighbors (KNN) is implemented separately as underlying algorithm of conformal prediction. In offline mode, the conformal predictor achieves a classification rate of 84.44% based on 1NN and 80.63% based on 3NN, which is better than that of simple KNN. In addition, it provides an estimate of reliability for each prediction. In online mode, the validity of predictions is guaranteed, which means that the error rate of region predictions never exceeds the significance level set by a user. The potential of this framework for detecting borderline examples and outliers in the application of E-nose is also investigated. The result shows that conformal prediction is a promising framework for the application of electronic nose to make predictions with reliability and validity. PMID:28805721
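To make the conformal-prediction framework concrete, the sketch below implements a transductive conformal classifier with a simple 1NN nonconformity score (distance to the nearest example of the same class divided by distance to the nearest example of a different class) on synthetic "sensor" data; it illustrates the region-prediction idea only and is not the authors' implementation or data.

# Sketch: transductive conformal prediction with a 1NN nonconformity measure.
# The two Gaussian clusters stand in for feature vectors of two ginseng classes.
import numpy as np

def nonconformity(X, y, i):
    # Distance to nearest same-class point over distance to nearest other-class point;
    # larger values mean the example looks stranger for its assigned label.
    d = np.linalg.norm(X - X[i], axis=1)
    d[i] = np.inf
    return d[y == y[i]].min() / d[y != y[i]].min()

def conformal_region(X_train, y_train, x_new, labels, epsilon):
    region = []
    for lab in labels:
        X = np.vstack([X_train, x_new])          # tentatively assign label `lab` to the new object
        y = np.append(y_train, lab)
        scores = np.array([nonconformity(X, y, i) for i in range(len(y))])
        p_value = np.mean(scores >= scores[-1])  # fraction at least as strange as the new object
        if p_value > epsilon:
            region.append(lab)
    return region

rng = np.random.default_rng(1)
X_train = np.vstack([rng.normal(0, 1, (30, 4)), rng.normal(3, 1, (30, 4))])
y_train = np.array([0] * 30 + [1] * 30)
print(conformal_region(X_train, y_train, rng.normal(3, 1, 4), labels=[0, 1], epsilon=0.05))

By construction, labels are only excluded when their p-value falls below the significance level, which is what gives the validity guarantee (error rate bounded by epsilon) described in the abstract.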
Validation Relaxation: A Quality Assurance Strategy for Electronic Data Collection
Gordon, Nicholas; Griffiths, Thomas; Kraemer, John D; Siedner, Mark J
2017-01-01
Background The use of mobile devices for data collection in developing world settings is becoming increasingly common and may offer advantages in data collection quality and efficiency relative to paper-based methods. However, mobile data collection systems can hamper many standard quality assurance techniques due to the lack of a hardcopy backup of data. Consequently, mobile health data collection platforms have the potential to generate datasets that appear valid, but are susceptible to unidentified database design flaws, areas of miscomprehension by enumerators, and data recording errors. Objective We describe the design and evaluation of a strategy for estimating data error rates and assessing enumerator performance during electronic data collection, which we term “validation relaxation.” Validation relaxation involves the intentional omission of data validation features for select questions to allow for data recording errors to be committed, detected, and monitored. Methods We analyzed data collected during a cluster sample population survey in rural Liberia using an electronic data collection system (Open Data Kit). We first developed a classification scheme for types of detectable errors and validation alterations required to detect them. We then implemented the following validation relaxation techniques to enable data error conduct and detection: intentional redundancy, removal of “required” constraint, and illogical response combinations. This allowed for up to 11 identifiable errors to be made per survey. The error rate was defined as the total number of errors committed divided by the number of potential errors. We summarized crude error rates and estimated changes in error rates over time for both individuals and the entire program using logistic regression. Results The aggregate error rate was 1.60% (125/7817). Error rates did not differ significantly between enumerators (P=.51), but decreased for the cohort with increasing days of application use, from 2.3% at survey start (95% CI 1.8%-2.8%) to 0.6% at day 45 (95% CI 0.3%-0.9%; OR=0.969; P<.001). The highest error rate (84/618, 13.6%) occurred for an intentional redundancy question for a birthdate field, which was repeated in separate sections of the survey. We found low error rates (0.0% to 3.1%) for all other possible errors. Conclusions A strategy of removing validation rules on electronic data capture platforms can be used to create a set of detectable data errors, which can subsequently be used to assess group and individual enumerator error rates, their trends over time, and categories of data collection that require further training or additional quality control measures. This strategy may be particularly useful for identifying individual enumerators or systematic data errors that are responsive to enumerator training and is best applied to questions for which errors cannot be prevented through training or software design alone. Validation relaxation should be considered as a component of a holistic data quality assurance strategy. PMID:28821474
Validation Relaxation: A Quality Assurance Strategy for Electronic Data Collection.
Kenny, Avi; Gordon, Nicholas; Griffiths, Thomas; Kraemer, John D; Siedner, Mark J
2017-08-18
The use of mobile devices for data collection in developing world settings is becoming increasingly common and may offer advantages in data collection quality and efficiency relative to paper-based methods. However, mobile data collection systems can hamper many standard quality assurance techniques due to the lack of a hardcopy backup of data. Consequently, mobile health data collection platforms have the potential to generate datasets that appear valid, but are susceptible to unidentified database design flaws, areas of miscomprehension by enumerators, and data recording errors. We describe the design and evaluation of a strategy for estimating data error rates and assessing enumerator performance during electronic data collection, which we term "validation relaxation." Validation relaxation involves the intentional omission of data validation features for select questions to allow for data recording errors to be committed, detected, and monitored. We analyzed data collected during a cluster sample population survey in rural Liberia using an electronic data collection system (Open Data Kit). We first developed a classification scheme for types of detectable errors and validation alterations required to detect them. We then implemented the following validation relaxation techniques to enable data error conduct and detection: intentional redundancy, removal of "required" constraint, and illogical response combinations. This allowed for up to 11 identifiable errors to be made per survey. The error rate was defined as the total number of errors committed divided by the number of potential errors. We summarized crude error rates and estimated changes in error rates over time for both individuals and the entire program using logistic regression. The aggregate error rate was 1.60% (125/7817). Error rates did not differ significantly between enumerators (P=.51), but decreased for the cohort with increasing days of application use, from 2.3% at survey start (95% CI 1.8%-2.8%) to 0.6% at day 45 (95% CI 0.3%-0.9%; OR=0.969; P<.001). The highest error rate (84/618, 13.6%) occurred for an intentional redundancy question for a birthdate field, which was repeated in separate sections of the survey. We found low error rates (0.0% to 3.1%) for all other possible errors. A strategy of removing validation rules on electronic data capture platforms can be used to create a set of detectable data errors, which can subsequently be used to assess group and individual enumerator error rates, their trends over time, and categories of data collection that require further training or additional quality control measures. This strategy may be particularly useful for identifying individual enumerators or systematic data errors that are responsive to enumerator training and is best applied to questions for which errors cannot be prevented through training or software design alone. Validation relaxation should be considered as a component of a holistic data quality assurance strategy. ©Avi Kenny, Nicholas Gordon, Thomas Griffiths, John D Kraemer, Mark J Siedner. Originally published in the Journal of Medical Internet Research (http://www.jmir.org), 18.08.2017.
Quasi-linear analysis of the extraordinary electron wave destabilized by runaway electrons
DOE Office of Scientific and Technical Information (OSTI.GOV)
Pokol, G. I.; Kómár, A.; Budai, A.
2014-10-15
Runaway electrons with strongly anisotropic distributions present in post-disruption tokamak plasmas can destabilize the extraordinary electron (EXEL) wave. The present work investigates the dynamics of the quasi-linear evolution of the EXEL instability for a range of different plasma parameters using a model runaway distribution function valid for highly relativistic runaway electron beams produced primarily by the avalanche process. Simulations show a rapid pitch-angle scattering of the runaway electrons in the high energy tail on the 100–1000 μs time scale. Due to the wave-particle interaction, a modification to the synchrotron radiation spectrum emitted by the runaway electron population is foreseen, exposing a possible experimental detection method for such an interaction.
Validation Results for LEWICE 2.0. [Supplement
NASA Technical Reports Server (NTRS)
Wright, William B.; Rutkowski, Adam
1999-01-01
Two CD-ROMs contain experimental ice shapes and code prediction used for validation of LEWICE 2.0 (see NASA/CR-1999-208690, CASI ID 19990021235). The data include ice shapes for both experiment and for LEWICE, all of the input and output files for the LEWICE cases, JPG files of all plots generated, an electronic copy of the text of the validation report, and a Microsoft Excel(R) spreadsheet containing all of the quantitative measurements taken. The LEWICE source code and executable are not contained on the discs.
NASA Astrophysics Data System (ADS)
Lozano, A. I.; Oller, J. C.; Krupa, K.; Ferreira da Silva, F.; Limão-Vieira, P.; Blanco, F.; Muñoz, A.; Colmenares, R.; García, G.
2018-06-01
A novel experimental setup has been implemented to provide accurate electron scattering cross sections from molecules at low and intermediate impact energies (1-300 eV) by measuring the attenuation of a magnetically confined linear electron beam from a molecular target. High-resolution electron energy is achieved through confinement in a magnetic gas trap where electrons are cooled by successive collisions with N2. Additionally, we developed and present a method to correct systematic errors arising from energy and angular resolution limitations. The accuracy of the entire measurement procedure is validated by comparing the N2 total scattering cross section in the considered energy range with benchmark values available in the literature.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Zlateva, Yana; Seuntjens, Jan; El Naqa, Issam
Purpose: To advance towards clinical Cherenkov emission (CE)-based dosimetry by investigating beam-specific effects on Monte Carlo-calculated electron-beam stopping power-to-CE power ratios (SCRs), addressing electron beam quality specification in terms of CE, and validating simulations with measurements. Methods: The EGSnrc user code SPRRZnrc, used to calculate Spencer-Attix stopping-power ratios, was modified to instead calculate SCRs. SCRs were calculated for 6- to 22-MeV clinical electron beams from Varian TrueBeam, Clinac 21EX, and Clinac 2100C/D accelerators. Experiments were performed with a 20-MeV electron beam from a Varian TrueBeam accelerator, using a diffraction grating spectrometer with optical fiber input and a cooled back-illuminated CCD. A fluorophore was dissolved in the water to remove CE signal anisotropy. Results: It was found that angular spread of the incident beam has little effect on the SCR (≤ 0.3% at d_max), while both the electron spectrum and photon contamination increase the SCR at shallow depths and decrease it at large depths. A universal data fit of R_50 in terms of C_50 (50% CE depth) revealed a strong linear dependence (R² > 0.9999). The SCR was fit with a Burns-type equation (R² = 0.9974, NRMSD = 0.5%). Below-threshold incident radiation was found to have minimal effect on beam quality specification (< 0.1%). Experiments and simulations were in good agreement. Conclusions: Our findings confirm the feasibility of the proposed CE dosimetry method, contingent on computation of SCRs from additional accelerators and on further experimental validation. This work constitutes an important step towards clinical high-resolution out-of-beam CE dosimetry.
NASA Astrophysics Data System (ADS)
Kawakami, Todd Mori
In April of 1995, the launch of the GPS Meteorology Experiment (GPS/MET) onboard the Orbview-1 satellite, formerly known as Microlab-1, provided the first technology demonstration of active limb sounding of the Earth's atmosphere with a low Earth orbiting spacecraft utilizing the signals transmitted by the satellites of the Global Positioning System (GPS). Though the experiment's primary mission was to probe the troposphere and stratosphere, GPS/MET was also capable of making radio occultation observations of the ionosphere. The application of the GPS occultation technique to the upper atmosphere created a unique opportunity to conduct ionospheric research with an unprecedented global distribution of observations. For operational support requirements, the Abel transform could be employed to invert the horizontal TEC profiles computed from the L1 and L2 phase measurements observed by GPS/MET into electron density profiles versus altitude in near real time. The usefulness of the method depends on how effectively the TEC limb profiles can be transformed into vertical electron density profiles. GPS/MET's ability to determine electron density profiles therefore needs to be assessed to validate the significance of the GPS occultation method as a new and complementary ionospheric research tool to enhance observational databases and improve space weather modeling and forecasting. To that end, simulations of the occultation observations and their inversions have been conducted to test the Abel transform algorithm and to provide qualitative information about the type and range of errors that might be experienced during the processing of real data. Electron density profiles inferred from real GPS/MET observations are then compared with coincident in situ measurements from the satellites of the Defense Meteorological Satellite Program (DMSP) and with ground-based remote sensing from digisonde and incoherent scatter radar facilities. The principal focus of this study is the validation of the electron density profiles inferred from GPS occultation observations using the Abel transform.
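Since the study above hinges on the Abel transform that turns limb TEC profiles into vertical electron density, a minimal numerical sketch is given below. It uses the simplest discrete variant (onion peeling under spherical symmetry) with an invented shell geometry and a synthetic density profile; it shows only the structure of the inversion, not the GPS/MET processing chain.

    # Minimal onion-peeling Abel inversion of slant TEC into shell electron densities.
    # Spherically symmetric, illustrative geometry; not the GPS/MET processing chain.
    import numpy as np

    R_E = 6371e3                                   # Earth radius [m]
    r = R_E + np.linspace(800e3, 100e3, 36)        # shell boundaries, outermost first [m]
    p = r[1:]                                      # tangent radii of the occulting rays [m]

    def path_matrix(r, p):
        """Path length of ray i (tangent radius p[i]) through shell j, for j <= i."""
        n_rays, n_shells = len(p), len(r) - 1
        L = np.zeros((n_rays, n_shells))
        for i in range(n_rays):
            for j in range(i + 1):
                L[i, j] = 2.0 * (np.sqrt(r[j]**2 - p[i]**2) - np.sqrt(r[j + 1]**2 - p[i]**2))
        return L

    L = path_matrix(r, p)

    # Synthetic "true" density profile [el/m^3] and the slant TEC it produces.
    h_mid = 0.5 * (r[:-1] + r[1:]) - R_E
    ne_true = 1e12 * np.exp(-((h_mid - 300e3) / 100e3) ** 2)
    tec = L @ ne_true                              # simulated slant TEC [el/m^2]

    # Abel/onion-peel inversion: solve the lower-triangular system from the top down.
    ne_est = np.linalg.solve(L, tec)
    print(np.allclose(ne_est, ne_true))            # True: the inversion recovers the profile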
Accuracy of parent-reported measles-containing vaccination status of children with measles.
Liu, G; Liao, Z; Xu, X; Liang, Y; Xiong, Y; Ni, J
2017-03-01
The validity of parent-reported measles-containing vaccination history in children with measles has not been assessed. This study evaluated the accuracy of parental recall of measles-containing vaccination histories in Shenzhen, China. A retrospective study was performed to compare the data from the electronic records with parental recall. The electronic records were regarded as accurate data about the children's measles-containing vaccination status. We collected data from the National Notifiable Diseases Surveillance System and the Immunization Program Information Management System in Shenzhen city, China. Between 2009 and 2014, there were 163 children with measles who had electronic vaccination records; the vaccination status of these cases was reported by the parents in the field epidemiological investigation. We validated parental recall with electronic records. The agreement between parental recall and electronic records was 78.7%. The kappa value was 0.57. The parent-reported measles-containing vaccination rate was higher than the electronic record (48.5% vs 41.7%, χ² = 53.64, P < 0.001). The true positive rate for parental recall was 82.4%, and the true negative rate was 75.8%. The positive predictive value was 70.9%, and the negative predictive value was 76.6%. In children with measles, parental recall slightly overestimated the measles vaccination rate, and the vaccination status recalled by parents was in moderate agreement with the electronic record. Copyright © 2016 The Royal Society for Public Health. Published by Elsevier Ltd. All rights reserved.
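The agreement statistics quoted above (percent agreement, kappa, sensitivity, specificity, and predictive values) all derive from a single 2x2 table of parental recall against the electronic record. The sketch below shows that arithmetic on a hypothetical table; the counts are invented and are not the study's data.

    # How agreement metrics of the kind quoted above are computed from a 2x2 table of
    # parental recall (rows) vs. electronic record (columns). Counts are hypothetical.
    import numpy as np

    table = np.array([[60, 20],     # recall vaccinated:   [record vaccinated, record not]
                      [15, 85]])    # recall unvaccinated: [record vaccinated, record not]

    n = table.sum()
    agreement = np.trace(table) / n                          # observed agreement
    p_chance = (table.sum(1) * table.sum(0)).sum() / n**2    # agreement expected by chance
    kappa = (agreement - p_chance) / (1 - p_chance)

    sensitivity = table[0, 0] / table[:, 0].sum()   # true-positive rate of recall
    specificity = table[1, 1] / table[:, 1].sum()   # true-negative rate of recall
    ppv = table[0, 0] / table[0, :].sum()
    npv = table[1, 1] / table[1, :].sum()
    print(round(agreement, 3), round(kappa, 2), round(sensitivity, 3), round(ppv, 3), round(npv, 3))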
Worldwide Protein Data Bank validation information: usage and trends.
Smart, Oliver S; Horský, Vladimír; Gore, Swanand; Svobodová Vařeková, Radka; Bendová, Veronika; Kleywegt, Gerard J; Velankar, Sameer
2018-03-01
Realising the importance of assessing the quality of the biomolecular structures deposited in the Protein Data Bank (PDB), the Worldwide Protein Data Bank (wwPDB) partners established Validation Task Forces to obtain advice on the methods and standards to be used to validate structures determined by X-ray crystallography, nuclear magnetic resonance spectroscopy and three-dimensional electron cryo-microscopy. The resulting wwPDB validation pipeline is an integral part of the wwPDB OneDep deposition, biocuration and validation system. The wwPDB Validation Service webserver (https://validate.wwpdb.org) can be used to perform checks prior to deposition. Here, it is shown how validation metrics can be combined to produce an overall score that allows the ranking of macromolecular structures and domains in search results. The ValTrendsDB database provides users with a convenient way to access and analyse validation information and other properties of X-ray crystal structures in the PDB, including investigating trends in and correlations between different structure properties and validation metrics.
Worldwide Protein Data Bank validation information: usage and trends
Horský, Vladimír; Gore, Swanand; Svobodová Vařeková, Radka; Bendová, Veronika
2018-01-01
Realising the importance of assessing the quality of the biomolecular structures deposited in the Protein Data Bank (PDB), the Worldwide Protein Data Bank (wwPDB) partners established Validation Task Forces to obtain advice on the methods and standards to be used to validate structures determined by X-ray crystallography, nuclear magnetic resonance spectroscopy and three-dimensional electron cryo-microscopy. The resulting wwPDB validation pipeline is an integral part of the wwPDB OneDep deposition, biocuration and validation system. The wwPDB Validation Service webserver (https://validate.wwpdb.org) can be used to perform checks prior to deposition. Here, it is shown how validation metrics can be combined to produce an overall score that allows the ranking of macromolecular structures and domains in search results. The ValTrendsDB database provides users with a convenient way to access and analyse validation information and other properties of X-ray crystal structures in the PDB, including investigating trends in and correlations between different structure properties and validation metrics. PMID:29533231
Electronic cooling design and test validation
NASA Astrophysics Data System (ADS)
Murtha, W. B.
1983-07-01
An analytical computer model has been used to design a counterflow air-cooled heat exchanger according to the cooling, structural and geometric requirements of a U.S. Navy shipboard electronics cabinet, emphasizing high reliability performance through the maintenance of electronic component junction temperatures lower than 110°C. Environmental testing of the design obtained has verified that the analytical predictions were conservative. Model correlation to the test data furnishes an upgraded capability for the evaluation of tactical effects, and has established a growth potential of two orders of magnitude for increased electronics capabilities through enhanced heat dissipation. Electronics cabinets of this type are destined for use with Vertical Launching System-type combatant vessel magazines.
Time-saving impact of an algorithm to identify potential surgical site infections.
Knepper, B C; Young, H; Jenkins, T C; Price, C S
2013-10-01
To develop and validate a partially automated algorithm to identify surgical site infections (SSIs) using commonly available electronic data to reduce manual chart review. Retrospective cohort study of patients undergoing specific surgical procedures over a 4-year period from 2007 through 2010 (algorithm development cohort) or over a 3-month period from January 2011 through March 2011 (algorithm validation cohort). A single academic safety-net hospital in a major metropolitan area. Patients undergoing at least 1 included surgical procedure during the study period. Procedures were identified in the National Healthcare Safety Network; SSIs were identified by manual chart review. Commonly available electronic data, including microbiologic, laboratory, and administrative data, were identified via a clinical data warehouse. Algorithms using combinations of these electronic variables were constructed and assessed for their ability to identify SSIs and reduce chart review. The most efficient algorithm identified in the development cohort combined microbiologic data with postoperative procedure and diagnosis codes. This algorithm resulted in 100% sensitivity and 85% specificity. Time savings from the algorithm was almost 600 person-hours of chart review. The algorithm demonstrated similar sensitivity on application to the validation cohort. A partially automated algorithm to identify potential SSIs was highly sensitive and dramatically reduced the amount of manual chart review required of infection control personnel during SSI surveillance.
A patient-centered electronic tool for weight loss outcomes after Roux-en-Y gastric bypass.
Wood, G Craig; Benotti, Peter; Gerhard, Glenn S; Miller, Elaina K; Zhang, Yushan; Zaccone, Richard J; Argyropoulos, George A; Petrick, Anthony T; Still, Christopher D
2014-01-01
BACKGROUND. Current patient education and informed consent regarding weight loss expectations for bariatric surgery candidates are largely based on averages from large patient cohorts. The variation in weight loss outcomes illustrates the need for establishing more realistic weight loss goals for individual patients. This study was designed to develop a simple web-based tool which provides patient-specific weight loss expectations. METHODS. Postoperative weight measurements after Roux-en-Y gastric bypass (RYGB) were collected and analyzed with patient characteristics known to influence weight loss outcomes. Quantile regression was used to create expected weight loss curves (25th, 50th, and 75th %tile) for the 24 months after RYGB. The resulting equations were validated and used to develop web-based tool for predicting weight loss outcomes. RESULTS. Weight loss data from 2986 patients (2608 in the primary cohort and 378 in the validation cohort) were included. Preoperative body mass index (BMI) and age were found to have a high correlation with weight loss accomplishment (P < 0.0001 for each). An electronic tool was created that provides easy access to patient-specific, 24-month weight loss trajectories based on initial BMI and age. CONCLUSIONS. This validated, patient-centered electronic tool will assist patients and providers in patient teaching, informed consent, and postoperative weight loss management.
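The modeling step described above, quantile regression of post-RYGB weight loss on preoperative BMI and age to obtain 25th/50th/75th percentile expectations, can be sketched as below. The data frame, data-generating rule, and example patient are synthetic assumptions; this is not the authors' model, dataset, or web tool.

    # Sketch of percentile (25th/50th/75th) weight-loss expectations via quantile regression.
    # Synthetic data with preoperative BMI and age as predictors; illustrative only.
    import numpy as np
    import pandas as pd
    import statsmodels.formula.api as smf

    rng = np.random.default_rng(2)
    n = 500
    df = pd.DataFrame({
        "bmi0": rng.normal(47, 7, n),     # preoperative BMI (assumed distribution)
        "age": rng.normal(45, 11, n),
    })
    # Assumed data-generating rule: percent weight loss at 24 months.
    df["pct_wl"] = 35 - 0.2 * (df.bmi0 - 47) - 0.1 * (df.age - 45) + rng.normal(0, 6, n)

    # Patient-specific expectations for a hypothetical patient (BMI 50, age 40).
    new = pd.DataFrame({"bmi0": [50.0], "age": [40.0]})
    for q in (0.25, 0.50, 0.75):
        res = smf.quantreg("pct_wl ~ bmi0 + age", df).fit(q=q)
        print(q, float(res.predict(new)[0]))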
Barnett, Stephen; Henderson, Joan; Hodgkins, Adam; Harrison, Christopher; Ghosh, Abhijeet; Dijkmans-Hadley, Bridget; Britt, Helena; Bonney, Andrew
2017-05-01
Electronic medical data (EMD) from electronic health records of general practice computer systems have enormous research potential, yet many variables are unreliable. The aim of this study was to compare selected data variables from general practice EMD with a reliable, representative national dataset (Bettering the Evaluation and Care of Health (BEACH)) in order to validate their use for primary care research. EMD variables were compared with encounter data from the nationally representative BEACH program using χ² tests and robust 95% confidence intervals to test their validity (measure what they reportedly measure). The variables focused on for this study were patient age, sex, smoking status and medications prescribed at the visit. The EMD sample from six general practices in the Illawarra region of New South Wales, Australia, yielded data on 196,515 patient encounters. Details of 90,553 encounters were recorded in the 2013 BEACH dataset from 924 general practitioners. No significant differences in patient age (p = 0.36) or sex (p = 0.39) were found. EMD had a lower rate of current smokers and higher average scripts per visit, but similar prescribing distribution patterns. Validating EMD variables offers avenues for improving primary care delivery and measuring outcomes of care to inform clinical practice and health policy.
USDA-ARS?s Scientific Manuscript database
Background: Dietary intake assessment with diet records (DR) is a standard research and practice tool in nutrition. Manual entry and analysis of DR is time-consuming and expensive. New electronic tools for diet entry by clients and research participants may reduce the cost and effort of nutrient int...
Nonlinear propagation of ion-acoustic waves in electron-positron-ion plasma with trapped electrons
DOE Office of Scientific and Technical Information (OSTI.GOV)
Alinejad, H.; Sobhanian, S.; Mahmoodi, J.
2006-01-15
A theoretical investigation has been made for ion-acoustic waves in an unmagnetized electron-positron-ion plasma. A more realistic situation in which plasma consists of a negatively charged ion fluid, free positrons, and trapped as well as free electrons is considered. The properties of stationary structures are studied by the reductive perturbation method, which is valid for small but finite amplitude limit, and by pseudopotential approach, which is valid for large amplitude. With an appropriate modified form of the electron number density, two new equations for the ion dynamics have been found. When deviations from isothermality are finite, the modified Korteweg-de Vries equation has been found, and for the case that deviations from isothermality are small, calculations lead to a generalized Korteweg-de Vries equation. It is shown from both weakly and highly nonlinear analysis that the presence of the positrons may allow solitary waves to exist. It is found that the effect of the positron density changes the maximum value of the amplitude and M (Mach number) for which solitary waves can exist. The present theory is applicable to analyze arbitrary amplitude ion-acoustic waves associated with positrons which may occur in space plasma.
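For orientation, the generic equation forms mentioned in the abstract are written out below in a common notation; the coefficients a and b are left symbolic because the paper's exact expressions (which depend on the positron concentration and the trapped-electron parameter) are not reproduced here. The square-root nonlinearity is the form usually associated with trapped electrons (Schamel-type), and the second line is the ordinary KdV limit; this is a schematic reminder, not the paper's derivation.

\[
\frac{\partial \phi}{\partial \tau} + a\,\phi^{1/2}\,\frac{\partial \phi}{\partial \xi} + b\,\frac{\partial^{3}\phi}{\partial \xi^{3}} = 0
\qquad \text{(trapped-electron, modified KdV form)}
\]
\[
\frac{\partial \phi}{\partial \tau} + a\,\phi\,\frac{\partial \phi}{\partial \xi} + b\,\frac{\partial^{3}\phi}{\partial \xi^{3}} = 0
\qquad \text{(ordinary KdV limit)}
\]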
Taste sensing systems (electronic tongues) for pharmaceutical applications.
Woertz, Katharina; Tissen, Corinna; Kleinebudde, Peter; Breitkreutz, Jörg
2011-09-30
Electronic tongues are sensor array systems able to detect single substances as well as complex mixtures by means of particular sensor membranes and electrochemical techniques. Two systems are already commercially available, the Insent taste sensing system and the αAstree electronic tongue. In addition, various laboratory prototype versions exist. Besides their successful use in the food industry, their implementation for pharmaceutical purposes has grown strongly in recent years. One reason for this is the increased interest in developing palatable formulations, especially for children. As taste assessment of drugs comes with challenges owing to possible toxicity and the subjectivity of taste assessors, electronic tongues could offer a safe and objective alternative. In order to provide guidance on the use of these systems, possible fields of interest are presented in this review, for example, system qualification, quality control, formulation development, comparison between marketed drug products, and the validation of the methods used. Further, different approaches for solid and liquid dosage forms are summarized. However, the difficulty of obtaining absolute statements regarding taste was also identified, and the need for more validated data was discussed, to offer guidance for the coming years of research on and application of electronic tongues for pharmaceutical purposes. Copyright © 2010 Elsevier B.V. All rights reserved.
Trams, trains, planes and automobiles: logistics of conducting a statewide audit of medical records.
Flood, Margaret; Pollock, Wendy; McDonald, Susan; Davey, Mary-Ann
2016-10-01
This paper reports on the logistics of conducting a validation study of a routinely collected dataset against medical records at hospitals to inform planning of similar studies. A stratified random sample of 15 hospitals and two homebirth practitioners was included. Site visits were arranged following consent. In addition to the validation of perinatal data, information was collected regarding logistics. Records at 14 metropolitan and rural hospitals up to 500 km from the research centre, and two homebirth practitioners, were audited. Obtaining consent to participate took between 5 days and 10 months. Auditors visited sites on 101 days, auditing 737 medical record pairs at 16 sites. Median audit time per record was 51.3 minutes; electronic records each took 36 minutes longer than paper. Travel time accounted for nearly one-quarter of audit time. Delays obtaining consents, long travel times and electronic records prolonged audit duration and expense. Employment of experts maximised use of available audit time. Conducting a validation study is a time-consuming and expensive exercise; however, confidence in the accuracy of public health data is vital. Validation studies are unquestionably important. Three alternative strategies have been proposed to make future studies viable. © 2016 Public Health Association of Australia.
Chen, Ren-Ai; Wang, Cong; Li, Sheng; George, Thomas F.
2013-01-01
With the development of experimental techniques, effective injection and transportation of electrons is proven as a way to obtain polymer light-emitting diodes (PLEDs) with high quantum efficiency. This paper reveals a valid mechanism for the enhancement of quantum efficiency in PLEDs. When an external electric field is applied, the interaction between a negative polaron and triplet exciton leads to an electronic two-transition process, which induces the exciton to emit light and thus improve the emission efficiency of PLEDs. PMID:28809346
NASA Technical Reports Server (NTRS)
Freund, H. P.; Wu, C. S.; Gaffey, J. D., Jr.
1984-01-01
An expression for the spectral emissivity of spontaneous synchrotron radiation for a plasma which consists of both thermal and suprathermal electron components is derived using the complete relativistic cyclotron resonance condition. The expression is valid over all angles of propagation. The result is applied to the study of the emission of radiation from an energetic population of electrons with a loss-cone distribution in a relatively low-density plasma (i.e., the electron plasma frequency is less than the cyclotron frequency).
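For reference, the complete relativistic cyclotron resonance condition underlying the emissivity calculation above is written below in its standard form (the notation may differ from the paper's): k_parallel and v_parallel are the wave number and electron velocity components along the magnetic field, Ω_ce is the nonrelativistic electron cyclotron frequency, n is the harmonic number, and γ is the Lorentz factor.

\[
\omega - k_{\parallel} v_{\parallel} - \frac{n\,\Omega_{ce}}{\gamma} = 0,
\qquad
\gamma = \left(1 - \frac{v^{2}}{c^{2}}\right)^{-1/2}.
\]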
Transferable Pseudo-Classical Electrons for Aufbau of Atomic Ions
Ekesan, Solen; Kale, Seyit; Herzfeld, Judith
2014-01-01
Generalizing the LEWIS reactive force field from electron pairs to single electrons, we present LEWIS• in which explicit valence electrons interact with each other and with nuclear cores via pairwise interactions. The valence electrons are independently mobile particles, following classical equations of motion according to potentials modified from Coulombic as required to capture quantum characteristics. As proof of principle, the aufbau of atomic ions is described for diverse main group elements from the first three rows of the periodic table, using a single potential for interactions between electrons of like spin and another for electrons of unlike spin. The electrons of each spin are found to distribute themselves in a fashion akin to the major lobes of the hybrid atomic orbitals, suggesting a pointillist description of the electron density. The broader validity of the LEWIS• force field is illustrated by predicting the vibrational frequencies of diatomic and triatomic hydrogen species. PMID:24752384
Transferable pseudoclassical electrons for aufbau of atomic ions.
Ekesan, Solen; Kale, Seyit; Herzfeld, Judith
2014-06-05
Generalizing the LEWIS reactive force field from electron pairs to single electrons, we present LEWIS• in which explicit valence electrons interact with each other and with nuclear cores via pairwise interactions. The valence electrons are independently mobile particles, following classical equations of motion according to potentials modified from Coulombic as required to capture quantum characteristics. As proof of principle, the aufbau of atomic ions is described for diverse main group elements from the first three rows of the periodic table, using a single potential for interactions between electrons of like spin and another for electrons of unlike spin. The electrons of each spin are found to distribute themselves in a fashion akin to the major lobes of the hybrid atomic orbitals, suggesting a pointillist description of the electron density. The broader validity of the LEWIS• force field is illustrated by predicting the vibrational frequencies of diatomic and triatomic hydrogen species. Copyright © 2014 Wiley Periodicals, Inc.
Li, Tsung-Lung; Lu, Wen-Cai
2015-10-05
In this work, Koopmans' theorem for Kohn-Sham density functional theory (KS-DFT) is applied to the photoemission spectra (PES) modeling over the entire valence-band. To examine the validity of this application, a PES modeling scheme is developed to facilitate a full valence-band comparison of theoretical PES spectra with experiments. The PES model incorporates the variations of electron ionization cross-sections over atomic orbitals and a linear dispersion of spectral broadening widths. KS-DFT simulations of pristine rubrene (5,6,11,12-tetraphenyltetracene) and potassium-rubrene complex are performed, and the simulation results are used as the input to the PES models. Two conclusions are reached. First, decompositions of the theoretical total spectra show that the dissociated electron of the potassium mainly remains on the backbone and has little effect on the electronic structures of phenyl side groups. This and other electronic-structure results deduced from the spectral decompositions have been qualitatively obtained with the anionic approximation to potassium-rubrene complexes. The qualitative validity of the anionic approximation is thus verified. Second, comparison of the theoretical PES with the experiments shows that the full-scale simulations combined with the PES modeling methods greatly enhance the agreement on spectral shapes over the anionic approximation. This agreement of the theoretical PES spectra with the experiments over the full valence-band can be regarded, to some extent, as a collective validation of the application of Koopmans' theorem for KS-DFT to valence-band PES, at least, for this hydrocarbon and its alkali-adsorbed complex. Copyright © 2015 Elsevier B.V. All rights reserved.
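A stripped-down version of the PES model described above, Kohn-Sham eigenvalues interpreted as Koopmans-like binding energies, weighted by cross-section factors and broadened by Gaussians whose width grows linearly with binding energy, is sketched below. The eigenvalues, weights, and dispersion parameters are invented; the real model additionally resolves atomic-orbital contributions to the cross-sections.

    # Sketch of a valence-band PES model: KS eigenvalues -> weighted, broadened spectrum.
    # Eigenvalues, cross-section weights, and broadening parameters are illustrative.
    import numpy as np

    eps = np.array([-4.8, -6.1, -7.3, -9.0, -11.5])    # KS eigenvalues [eV], hypothetical
    w = np.array([1.0, 0.8, 1.3, 0.6, 0.9])            # cross-section weights, hypothetical

    def pes(E, eps, w, sigma0=0.25, slope=0.03):
        """Sum of Gaussians; width grows linearly with binding energy (linear dispersion)."""
        E = np.asarray(E)[:, None]
        be = -eps[None, :]                              # binding energies
        sigma = sigma0 + slope * be
        g = np.exp(-0.5 * ((E + eps[None, :]) / sigma) ** 2) / (sigma * np.sqrt(2 * np.pi))
        return (w[None, :] * g).sum(axis=1)

    E_bind = np.linspace(2, 14, 600)                    # binding-energy axis [eV]
    spectrum = pes(E_bind, eps, w)
    print(E_bind[spectrum.argmax()])                    # binding energy of the tallest model peak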
DOE Office of Scientific and Technical Information (OSTI.GOV)
Tojo, H., E-mail: tojo.hiroshi@qst.go.jp; Hiratsuka, J.; Yatsuka, E.
2016-09-15
This paper evaluates the accuracy of electron temperature measurements and relative transmissivities of double-pass Thomson scattering diagnostics. The electron temperature (T_e) is obtained from the ratio of signals from a double-pass scattering system, and the relative transmissivities are then calculated from the measured T_e and the intensity of the signals. The accuracy of these values depends on the electron temperature and the scattering angle (θ), and was therefore evaluated experimentally using the Large Helical Device (LHD) and the Tokyo spherical tokamak-2 (TST-2). Analysis of the TST-2 data indicates that a high T_e and a large scattering angle (θ) yield accurate values. Indeed, the errors for scattering angle θ = 135° are approximately half of those for θ = 115°. The method of determining T_e over a wide range spanning two orders of magnitude (0.01–1.5 keV) was validated using the experimental results of the LHD and TST-2. A simple method to provide relative transmissivities, which include inputs from collection optics, vacuum window, optical fibers, and polychromators, is also presented. The relative errors were less than approximately 10%. Numerical simulations also indicate that the T_e measurements are valid under harsh radiation conditions. This method to obtain T_e can be considered for the design of Thomson scattering systems where there is high-performance plasma that generates harsh radiation environments.
Bakshi, Nitya; Stinson, Jennifer N; Ross, Diana; Lukombo, Ines; Mittal, Nonita; Joshi, Saumya V; Belfer, Inna; Krishnamurti, Lakshmanan
2015-06-01
Vaso-occlusive pain, the hallmark of sickle cell disease (SCD), is a major contributor to morbidity, poor health-related quality of life, and health care utilization associated with this disease. There is wide variation in the burden, frequency, and severity of pain experienced by patients with SCD. As compared with health care utilization for pain, a daily pain diary captures the breadth of the pain experience and is a superior measure of pain burden and its impact on patients. Electronic pain diaries based on real-time data capture methods overcome methodological barriers and limitations of paper pain diaries, but their psychometric properties have not been formally established in patients with SCD. To develop and establish the content validity of a web-based multidimensional pain diary for adolescents and young adults with SCD and conduct an end-user review to refine the prototype. Following identification of items, a conceptual model was developed. Interviews with adolescents and young adults with SCD were conducted. Subsequently, end-user review with use of the electronic pain diary prototype was conducted. Two iterative cycles of in-depth cognitive interviews in adolescents and young adults with SCD informed the design and guided the addition, removal, and modification of items in the multidimensional pain diary. Potential end-users provided positive feedback on the design and prototype of the electronic diary. A multidimensional web-based electronic pain diary for adolescents and young adults with SCD has been developed and content validity and initial end-user reviews have been completed.
Han, Myung-Geun; Garlow, Joseph A.; Marshall, Matthew S. J.; ...
2017-03-23
The ability to map out electrostatic potentials in materials is critical for the development and the design of nanoscale electronic and spintronic devices in modern industry. Electron holography has been an important tool for revealing electric and magnetic field distributions in microelectronics and magnetic-based memory devices; however, its utility is hindered by several practical constraints, such as charging artifacts and limitations in sensitivity and in field of view. In this article, we report electron-beam-induced-current (EBIC) and secondary-electron voltage-contrast (SE-VC) measurements with an aberration-corrected electron probe in a transmission electron microscope (TEM), as complementary techniques to electron holography, to measure electric fields and surface potentials, respectively. These two techniques were applied to ferroelectric thin films, multiferroic nanowires, and single crystals. Electrostatic potential maps obtained by off-axis electron holography were compared with EBIC and SE-VC to show that these techniques can be used as a complementary approach to validate quantitative results obtained from electron holography analysis.
Model-Based Verification and Validation of Spacecraft Avionics
NASA Technical Reports Server (NTRS)
Khan, M. Omair; Sievers, Michael; Standley, Shaun
2012-01-01
Verification and Validation (V&V) at JPL is traditionally performed on flight or flight-like hardware running flight software. For some time, the complexity of avionics has increased exponentially while the time allocated for system integration and associated V&V testing has remained fixed. There is an increasing need to perform comprehensive system level V&V using modeling and simulation, and to use scarce hardware testing time to validate models; the norm for thermal and structural V&V for some time. Our approach extends model-based V&V to electronics and software through functional and structural models implemented in SysML. We develop component models of electronics and software that are validated by comparison with test results from actual equipment. The models are then simulated enabling a more complete set of test cases than possible on flight hardware. SysML simulations provide access and control of internal nodes that may not be available in physical systems. This is particularly helpful in testing fault protection behaviors when injecting faults is either not possible or potentially damaging to the hardware. We can also model both hardware and software behaviors in SysML, which allows us to simulate hardware and software interactions. With an integrated model and simulation capability we can evaluate the hardware and software interactions and identify problems sooner. The primary missing piece is validating SysML model correctness against hardware; this experiment demonstrated such an approach is possible.
Validation of the 'Test of the Adherence to Inhalers' (TAI) for Asthma and COPD Patients.
Plaza, Vicente; Fernández-Rodríguez, Concepción; Melero, Carlos; Cosío, Borja G; Entrenas, Luís Manuel; de Llano, Luis Pérez; Gutiérrez-Pereyra, Fernando; Tarragona, Eduard; Palomino, Rosa; López-Viña, Antolín
2016-04-01
To validate the 'Test of Adherence to Inhalers' (TAI), a 12-item questionnaire designed to assess the adherence to inhalers in patients with COPD or asthma. A total of 1009 patients with asthma or COPD participated in a cross-sectional multicenter study. Patients with electronic adherence ≥80% were defined as adherents. Construct validity, internal validity, and criterion validity were evaluated. Self-reported adherence was compared with the Morisky-Green questionnaire. Factor analysis study demonstrated two factors, factor 1 was coincident with TAI patient domain (items 1 to 10) and factor 2 with TAI health-care professional domain (items 11 and 12). The Cronbach's alpha was 0.860 and the test-retest reliability 0.883. TAI scores correlated with electronic adherence (ρ=0.293, p=0.01). According to the best cut-off for 10 items (score 50, area under the ROC curve 0.7), 569 (62.5%) patients were classified as non-adherents. The non-adherence behavior pattern was: erratic 527 (57.9%), deliberate 375 (41.2%), and unwitting 242 (26.6%) patients. As compared to Morisky-Green test, TAI showed better psychometric properties. The TAI is a reliable and homogeneous questionnaire to identify easily non-adherence and to classify from a clinical perspective the barriers related to the use of inhalers in asthma and COPD.
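Of the psychometric quantities reported above, Cronbach's alpha is the one most easily illustrated in a few lines; the sketch below computes it from synthetic item-level responses. The item count, response model, and values are invented and are not the TAI validation sample.

    # Cronbach's alpha for a k-item questionnaire, computed from synthetic responses.
    # Illustrative data only, not the TAI validation dataset.
    import numpy as np

    rng = np.random.default_rng(3)
    latent = rng.normal(0, 1, 300)                            # true adherence tendency
    items = latent[:, None] + rng.normal(0, 0.8, (300, 10))   # 10 correlated item scores

    k = items.shape[1]
    item_vars = items.var(axis=0, ddof=1)
    total_var = items.sum(axis=1).var(ddof=1)
    alpha = k / (k - 1) * (1 - item_vars.sum() / total_var)   # Cronbach's alpha formula
    print(round(alpha, 3))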
NASA Astrophysics Data System (ADS)
Sanford, T. W. L.; Beutler, D. E.; Halbleib, J. A.; Knott, D. P.
1991-12-01
The radiation produced by a 15.5-MeV monoenergetic electron beam incident on optimized and nonoptimized bremsstrahlung targets is characterized using the ITS Monte Carlo code and measurements with equilibrated and nonequilibrated TLD dosimetry. Comparisons between calculations and measurements verify the calculations and demonstrate that the code can be used to predict both bremsstrahlung production and TLD response for radiation fields that are characteristic of those produced by pulsed simulators of gamma rays. The comparisons provide independent confirmation of the validity of the TLD calibration for photon fields characteristic of gamma-ray simulators. The empirical Martin equation, which is often used to calculate radiation dose from optimized bremsstrahlung targets, is examined, and its range of validity is established.
Validation of a quantized-current source with 0.2 ppm uncertainty
DOE Office of Scientific and Technical Information (OSTI.GOV)
Stein, Friederike; Fricke, Lukas, E-mail: lukas.fricke@ptb.de; Scherer, Hansjörg
2015-09-07
We report on high-accuracy measurements of quantized current, sourced by a tunable-barrier single-electron pump at frequencies f up to 1 GHz. The measurements were performed with an ultrastable picoammeter instrument, traceable to the Josephson and quantum Hall effects. Current quantization according to I = ef with e being the elementary charge was confirmed at f = 545 MHz with a total relative uncertainty of 0.2 ppm, improving the state of the art by about a factor of 5. The accuracy of a possible future quantum current standard based on single-electron transport was experimentally validated to be better than the best (indirect) realization of the ampere within the present SI.
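As a quick plausibility check on the quoted operating point, the magnitude of the quantized current at f = 545 MHz follows directly from I = ef with the exact SI value of the elementary charge:

\[
I = e f \approx 1.602\,176\,634 \times 10^{-19}\,\mathrm{C} \times 545 \times 10^{6}\,\mathrm{s^{-1}}
\approx 8.73 \times 10^{-11}\,\mathrm{A} \approx 87.3\,\mathrm{pA}.
\]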
Carbon dioxide electron cooling rates in the atmospheres of Mars and Venus
NASA Astrophysics Data System (ADS)
Campbell, L.; Brunger, M. J.; Rescigno, T. N.
2008-08-01
The cooling of electrons in collisions with carbon dioxide in the atmospheres of Venus and Mars is investigated. Calculations are performed with both previously accepted electron energy transfer rates and with new ones determined using more recent theoretical and experimental cross sections for electron impact on CO2. Emulation of a previous model for Venus confirms the validity of the current model and shows that use of the updated cross sections leads to cooling rates that are lower by one third. Application of the same model to the atmosphere of Mars gives more than double the previous cooling rates at altitudes where the electron temperature is very low.
Electronic Publishing or Electronic Information Handling?
NASA Astrophysics Data System (ADS)
Heck, A.
The current dramatic evolution in information technology is bringing major modifications in the way scientists communicate. The concept of 'electronic publishing' is too restrictive and has often different, sometimes conflicting, interpretations. It is thus giving way to the broader notion of 'electronic information handling' encompassing the diverse types of information, the different media, as well as the various communication methodologies and technologies. New problems and challenges result also from this new information culture, especially on legal, ethical, and educational grounds. The procedures for validating 'published material' and for evaluating scientific activities will have to be adjusted too. 'Fluid' information is becoming a common concept. Electronic publishing cannot be conceived without link to knowledge bases nor without intelligent information retrieval tools.
MATLAB/Simulink Pulse-Echo Ultrasound System Simulator Based on Experimentally Validated Models.
Kim, Taehoon; Shin, Sangmin; Lee, Hyongmin; Lee, Hyunsook; Kim, Heewon; Shin, Eunhee; Kim, Suhwan
2016-02-01
A flexible clinical ultrasound system must operate with different transducers, which have characteristic impulse responses and widely varying impedances. The impulse response determines the shape of the high-voltage pulse that is transmitted and the specifications of the front-end electronics that receive the echo; the impedance determines the specification of the matching network through which the transducer is connected. System-level optimization of these subsystems requires accurate modeling of pulse-echo (two-way) response, which in turn demands a unified simulation of the ultrasonics and electronics. In this paper, this is realized by combining MATLAB/Simulink models of the high-voltage transmitter, the transmission interface, the acoustic subsystem which includes wave propagation and reflection, the receiving interface, and the front-end receiver. To demonstrate the effectiveness of our simulator, the models are experimentally validated by comparing the simulation results with the measured data from a commercial ultrasound system. This simulator could be used to quickly provide system-level feedback for an optimized tuning of electronic design parameters.
Direct drive: Simulations and results from the National Ignition Facility
DOE Office of Scientific and Technical Information (OSTI.GOV)
Radha, P. B., E-mail: rbah@lle.rochester.edu; Hohenberger, M.; Edgell, D. H.
Direct-drive implosion physics is being investigated at the National Ignition Facility. The primary goal of the experiments is twofold: to validate modeling related to implosion velocity and to estimate the magnitude of hot-electron preheat. Implosion experiments indicate that the energetics is well-modeled when cross-beam energy transfer (CBET) is included in the simulation and an overall multiplier to the CBET gain factor is employed; time-resolved scattered light and scattered-light spectra display the correct trends. Trajectories from backlit images are well modeled, although those from measured self-emission images indicate increased shell thickness and reduced shell density relative to simulations. Sensitivity analyses indicate that the most likely cause for the density reduction is nonuniformity growth seeded by laser imprint and not laser-energy coupling. Hot-electron preheat is at tolerable levels in the ongoing experiments, although it is expected to increase after the mitigation of CBET. Future work will include continued model validation, imprint measurements, and mitigation of CBET and hot-electron preheat.
Experimental validation of wireless communication with chaos.
Ren, Hai-Peng; Bai, Chao; Liu, Jian; Baptista, Murilo S; Grebogi, Celso
2016-08-01
The constraints of a wireless physical media, such as multi-path propagation and complex ambient noises, prevent information from being communicated at low bit error rate. Surprisingly, it has only recently been shown that, from a theoretical perspective, chaotic signals are optimal for communication. It maximises the receiver signal-to-noise performance, consequently minimizing the bit error rate. This work demonstrates numerically and experimentally that chaotic systems can in fact be used to create a reliable and efficient wireless communication system. Toward this goal, we propose an impulsive control method to generate chaotic wave signals that encode arbitrary binary information signals and an integration logic together with the match filter capable of decreasing the noise effect over a wireless channel. The experimental validation is conducted by inputting the signals generated by an electronic transmitting circuit to an electronic circuit that emulates a wireless channel, where the signals travel along three different paths. The output signal is decoded by an electronic receiver, after passing through a match filter.
Development and validation of an electronic phenotyping algorithm for chronic kidney disease
Nadkarni, Girish N; Gottesman, Omri; Linneman, James G; Chase, Herbert; Berg, Richard L; Farouk, Samira; Nadukuru, Rajiv; Lotay, Vaneet; Ellis, Steve; Hripcsak, George; Peissig, Peggy; Weng, Chunhua; Bottinger, Erwin P
2014-01-01
Twenty-six million Americans are estimated to have chronic kidney disease (CKD) with increased risk for cardiovascular disease and end stage renal disease. CKD is frequently undiagnosed and patients are unaware, hampering intervention. A tool for accurate and timely identification of CKD from electronic medical records (EMR) could improve healthcare quality and identify patients for research. As members of the eMERGE (electronic medical records and genomics) Network, we developed an automated phenotyping algorithm that can be deployed to rapidly identify diabetic and/or hypertensive CKD cases and controls in health systems with EMRs. It uses diagnostic codes, laboratory results, medication and blood pressure records, and textual information culled from notes. Validation statistics demonstrated a positive predictive value of 96% and a negative predictive value of 93.3%. Similar results were obtained on implementation by two independent eMERGE member institutions. The algorithm dramatically outperformed identification by ICD-9-CM codes, which had positive and negative predictive values of 63% and 54%, respectively. PMID:25954398
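To make the flavor of such a phenotyping rule concrete, the toy sketch below combines a laboratory criterion, a diagnosis-code criterion, and a medication proxy. Every threshold, field name, and medication listed is a hypothetical illustration; this is not the eMERGE algorithm's actual logic.

    # Toy rule-based phenotyping sketch in the spirit of the algorithm described above.
    # Field names, thresholds, and medications are hypothetical, not the eMERGE CKD rules.
    from dataclasses import dataclass, field
    from datetime import date

    @dataclass
    class Patient:
        icd9: set = field(default_factory=set)     # diagnosis codes
        egfr: list = field(default_factory=list)   # [(date, value in mL/min/1.73m2)]
        meds: set = field(default_factory=set)

    def is_ckd_case(p: Patient) -> bool:
        low = sorted(d for d, v in p.egfr if v < 60)
        sustained_low_egfr = len(low) >= 2 and (low[-1] - low[0]).days >= 90
        coded_ckd = any(code.startswith("585") for code in p.icd9)       # ICD-9-CM 585.x
        diabetic_or_htn = bool(p.meds & {"metformin", "lisinopril"})     # crude proxy
        return (sustained_low_egfr or coded_ckd) and diabetic_or_htn

    p = Patient(icd9={"585.3"},
                egfr=[(date(2013, 1, 5), 52.0), (date(2013, 6, 2), 48.0)],
                meds={"lisinopril"})
    print(is_ckd_case(p))   # True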
Experimental validation of wireless communication with chaos
NASA Astrophysics Data System (ADS)
Ren, Hai-Peng; Bai, Chao; Liu, Jian; Baptista, Murilo S.; Grebogi, Celso
2016-08-01
The constraints of a wireless physical media, such as multi-path propagation and complex ambient noises, prevent information from being communicated at low bit error rate. Surprisingly, it has only recently been shown that, from a theoretical perspective, chaotic signals are optimal for communication. It maximises the receiver signal-to-noise performance, consequently minimizing the bit error rate. This work demonstrates numerically and experimentally that chaotic systems can in fact be used to create a reliable and efficient wireless communication system. Toward this goal, we propose an impulsive control method to generate chaotic wave signals that encode arbitrary binary information signals and an integration logic together with the match filter capable of decreasing the noise effect over a wireless channel. The experimental validation is conducted by inputting the signals generated by an electronic transmitting circuit to an electronic circuit that emulates a wireless channel, where the signals travel along three different paths. The output signal is decoded by an electronic receiver, after passing through a match filter.
Experimental validation of wireless communication with chaos
DOE Office of Scientific and Technical Information (OSTI.GOV)
Ren, Hai-Peng; Bai, Chao; Liu, Jian
The constraints of a wireless physical media, such as multi-path propagation and complex ambient noises, prevent information from being communicated at low bit error rate. Surprisingly, it has only recently been shown that, from a theoretical perspective, chaotic signals are optimal for communication. It maximises the receiver signal-to-noise performance, consequently minimizing the bit error rate. This work demonstrates numerically and experimentally that chaotic systems can in fact be used to create a reliable and efficient wireless communication system. Toward this goal, we propose an impulsive control method to generate chaotic wave signals that encode arbitrary binary information signals and an integration logic together with the match filter capable of decreasing the noise effect over a wireless channel. The experimental validation is conducted by inputting the signals generated by an electronic transmitting circuit to an electronic circuit that emulates a wireless channel, where the signals travel along three different paths. The output signal is decoded by an electronic receiver, after passing through a match filter.
Excitation of levels in Li6 by inelastic electron scattering
DOE Office of Scientific and Technical Information (OSTI.GOV)
Bernheim, M.; Bishop, G. R.
1963-07-01
Inelastic scattering of electrons from metallic targets of Li 6 was studied as part of a program to establish the validity of the Born approximation calculation of the cross section. This calculation predicts the separation of the inelastic form factor into two contributions corresponding to the absorption of longitudinal and transverse virtual photons by the bombarded system. (R.E.U.)
Link Between Deployment Factors and Parenting Stress in Navy Families
2016-04-11
eligible participants completed an electronic survey which consisted of demographic information, and eight validated psychosocial scales. Sample: The...military personnel and their families on a daily basis: nurses can identify families at risk and intervene early to prevent harm to the family. 15...variable was parenting stress. Methods: All eligible participants completed an electronic survey which consisted of demographic information, and
NASA Astrophysics Data System (ADS)
Kabachnik, N. M.; Sazhina, I. P.
2001-09-01
New relations between the intrinsic parameters δk which describe the longitudinal spin polarization of Auger electrons and αk which describe the anisotropy of their angular distribution are found. The relations are valid for arbitrary Auger transitions with initial (Ji) and final (Jf) angular momenta satisfying the condition Ji > Jf.
ERIC Educational Resources Information Center
Akussah, Maxwell; Asante, Edward; Adu-Sarkodee, Rosemary
2015-01-01
The study investigates the relationship between impact of electronic resources and its usage in academic libraries in Ghana: evidence from Koforidua Polytechnic & All Nations University College, Ghana. The study was a quantitative approach using questionnaire to gather data and information. A valid response rate of 58.5% was assumed. SPSS…
VALIDATION OF THE CORONAL THICK TARGET SOURCE MODEL
DOE Office of Scientific and Technical Information (OSTI.GOV)
Fleishman, Gregory D.; Xu, Yan; Nita, Gelu N.
2016-01-10
We present detailed 3D modeling of a dense, coronal thick-target X-ray flare using the GX Simulator tool, photospheric magnetic measurements, and microwave imaging and spectroscopy data. The developed model offers a remarkable agreement between the synthesized and observed spectra and images in both X-ray and microwave domains, which validates the entire model. The flaring loop parameters are chosen to reproduce the emission measure, temperature, and the nonthermal electron distribution at low energies derived from the X-ray spectral fit, while the remaining parameters, unconstrained by the X-ray data, are selected such as to match the microwave images and total power spectra. The modeling suggests that the accelerated electrons are trapped in the coronal part of the flaring loop, but away from where the magnetic field is minimal, and, thus, demonstrates that the data are clearly inconsistent with electron magnetic trapping in the weak diffusion regime mediated by the Coulomb collisions. Thus, the modeling supports the interpretation of the coronal thick-target sources as sites of electron acceleration in flares and supplies us with a realistic 3D model with physical parameters of the acceleration region and flaring loop.
Determining Core Plasmaspheric Electron Densities with the Van Allen Probes
NASA Astrophysics Data System (ADS)
De Pascuale, S.; Hartley, D.; Kurth, W. S.; Kletzing, C.; Thaller, S. A.; Wygant, J. R.
2016-12-01
We survey three methods for obtaining electron densities inside of the core plasmasphere region (L < 4) to the perigee of the Van Allen Probes (L ≈ 1.1) from September 2012 to December 2014. Using the EMFISIS instrument on board the Van Allen Probes, electron densities are extracted from the upper hybrid resonance to an uncertainty of 10%. Some measurements are subject to larger errors given interpretational issues, especially at low densities (L > 4) resulting from geomagnetic activity. At high densities EMFISIS is restricted by an upper observable limit near 3000 cm^-3. As this limit is encountered above perigee, we employ two additional methods validated against EMFISIS measurements to determine electron densities deep within the plasmasphere (L < 2). EMFISIS can extrapolate density estimates to lower L by calculating high densities, in good agreement with the upper hybrid technique when applicable, from plasma wave properties. Calibrated measurements, from the Van Allen Probes EFW potential instrument, also extend into this range. In comparison with the published EMFISIS database we provide a metric for the validity of core plasmaspheric density measurements obtained from these methods and an empirical density model for use in wave and particle simulations.
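The density extraction referred to above rests on the upper hybrid resonance relation f_uh^2 = f_pe^2 + f_ce^2, from which the electron density follows from the plasma frequency. The sketch below shows that conversion with illustrative input frequencies (not EMFISIS survey values).

    # Electron density from an upper-hybrid-resonance frequency: f_uh^2 = f_pe^2 + f_ce^2.
    # Illustrative frequencies only; not EMFISIS survey values.
    import math

    eps0 = 8.8541878128e-12     # vacuum permittivity [F/m]
    m_e = 9.1093837015e-31      # electron mass [kg]
    q_e = 1.602176634e-19       # elementary charge [C]

    def density_from_fuh(f_uh_hz, f_ce_hz):
        f_pe2 = f_uh_hz**2 - f_ce_hz**2                       # plasma frequency squared [Hz^2]
        n_m3 = 4 * math.pi**2 * eps0 * m_e * f_pe2 / q_e**2   # electrons per m^3
        return n_m3 * 1e-6                                    # electrons per cm^3

    print(round(density_from_fuh(450e3, 30e3), 1))            # roughly 2500 cm^-3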
Multiprofessional electronic protocol in ophthalmology with emphasis on strabismus.
Ribeiro, Christie Graf; Moreira, Ana Tereza Ramos; Pinto, José Simão DE Paula; Malafaia, Osvaldo
2016-01-01
To create and validate an electronic database in ophthalmology focused on strabismus, to computerize this database in the form of a systematic data collection software named Electronic Protocol, and to incorporate this protocol into the Integrated System of Electronic Protocols (SINPE(c)). This is a descriptive study, with the methodology divided into three phases: (1) development of a theoretical ophthalmologic database with emphasis on strabismus; (2) computerization of this theoretical ophthalmologic database using SINPE(c); and (3) interpretation of the information with demonstration of results to validate the protocol. We inputted data from the charts of fifty patients with known strabismus through the Electronic Protocol for testing and validation. The new electronic protocol was able to store information regarding patient history, physical examination, laboratory exams, imaging results, diagnosis and treatment of patients with ophthalmologic diseases, with emphasis on strabismus. We included 2,141 items in this master protocol and created 20 new specific electronic protocols for strabismus, each with its own specifics. Validation was achieved through correlation and corroboration of the symptoms and confirmed diagnoses of the fifty included patients with the diagnostic criteria for the twenty new strabismus protocols. A new, validated electronic database focusing on ophthalmology, with emphasis on strabismus, was successfully created through the standardized collection of information and computerization of the database using proprietary software. This protocol is ready for deployment to facilitate data collection, sorting and application for practitioners and researchers in numerous specialties.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Ren, Shangjie; Department of Radiation Oncology, Stanford University School of Medicine, Palo Alto, California; Hara, Wendy
Purpose: To develop a reliable method to estimate electron density based on anatomic magnetic resonance imaging (MRI) of the brain. Methods and Materials: We proposed a unifying multi-atlas approach for electron density estimation based on standard T1- and T2-weighted MRI. First, a composite atlas was constructed through a voxelwise matching process using multiple atlases, with the goal of mitigating effects of inherent anatomic variations between patients. Next we computed for each voxel 2 kinds of conditional probabilities: (1) electron density given its image intensity on T1- and T2-weighted MR images; and (2) electron density given its spatial location in a reference anatomy, obtained by deformable image registration. These were combined into a unifying posterior probability density function using the Bayesian formalism, which provided the optimal estimates for electron density. We evaluated the method on 10 patients using leave-one-patient-out cross-validation. Receiver operating characteristic analyses for detecting different tissue types were performed. Results: The proposed method significantly reduced the errors in electron density estimation, with a mean absolute Hounsfield unit error of 119, compared with 140 and 144 (P<.0001) using conventional T1-weighted intensity and geometry-based approaches, respectively. For detection of bony anatomy, the proposed method achieved an 89% area under the curve, 86% sensitivity, 88% specificity, and 90% accuracy, which improved upon intensity and geometry-based approaches (area under the curve: 79% and 80%, respectively). Conclusion: The proposed multi-atlas approach provides robust electron density estimation and bone detection based on anatomic MRI. If validated on a larger population, our work could enable the use of MRI as a primary modality for radiation treatment planning.
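A minimal sketch of the Bayesian fusion step described above, assuming Gaussian forms for both conditional densities (intensity-based and atlas/location-based); the grid, parameters and function name are illustrative, not the authors' implementation.

```python
import numpy as np

def fuse_electron_density(mu_int, sigma_int, mu_loc, sigma_loc,
                          rho_grid=np.linspace(-1000.0, 2000.0, 512)):
    """Combine an intensity-based and an atlas/location-based estimate of
    electron density for one voxel (expressed here on a CT-number-like scale,
    as in the reported HU errors), assuming both conditional densities are
    Gaussian. Returns the posterior mean over the discretized grid."""
    # Gaussian likelihood from the T1/T2 intensity mapping
    p_int = np.exp(-0.5 * ((rho_grid - mu_int) / sigma_int) ** 2)
    # Gaussian term from the voxel's location in the composite atlas
    p_loc = np.exp(-0.5 * ((rho_grid - mu_loc) / sigma_loc) ** 2)
    # Posterior up to a constant; normalize, then take the expectation
    post = p_int * p_loc
    post /= post.sum()
    return float(np.dot(rho_grid, post))

# Example voxel: intensity mapping suggests soft tissue, atlas suggests bone
print(fuse_electron_density(mu_int=40.0, sigma_int=120.0,
                            mu_loc=700.0, sigma_loc=250.0))
```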
Moon, Kyoung-Ja; Jin, Yinji; Jin, Taixian; Lee, Sun-Mi
2018-01-01
A key component of delirium management is prevention and early detection. To develop an automated delirium risk assessment system (Auto-DelRAS) that automatically alerts health care providers of an intensive care unit (ICU) patient's delirium risk based only on data collected in an electronic health record (EHR) system, and to evaluate the clinical validity of this system. Cohort and system development designs were used. Medical and surgical ICUs in two university hospitals in Seoul, Korea. A total of 3284 patients for the development of Auto-DelRAS, 325 for external validation, and 694 for validation after clinical application. A total of 4211 data items were extracted from the EHR system and delirium was measured using the CAM-ICU (Confusion Assessment Method for the Intensive Care Unit). The potential predictors were selected and a logistic regression model was established to create a delirium risk scoring algorithm to construct the Auto-DelRAS. The Auto-DelRAS was evaluated at three months and one year after its application to clinical practice to establish the predictive validity of the system. Eleven predictors were finally included in the logistic regression model. The results of the Auto-DelRAS risk assessment were shown as high/moderate/low risk on a Kardex screen. The predictive validity, analyzed one year after the clinical application of Auto-DelRAS, showed a sensitivity of 0.88, specificity of 0.72, positive predictive value of 0.53, negative predictive value of 0.94, and a Youden index of 0.59. A relatively high level of predictive validity was maintained with the Auto-DelRAS system, even one year after it was applied to clinical practice. Copyright © 2017. Published by Elsevier Ltd.
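The predictive-validity figures quoted above follow directly from a 2x2 confusion matrix of risk alerts against the CAM-ICU reference. A small sketch, with illustrative counts only (not the study's data):

```python
def delirium_validation_metrics(tp, fp, fn, tn):
    """Predictive-validity metrics for a binary risk alert (e.g. high/moderate
    vs. low risk) against the CAM-ICU reference standard."""
    sensitivity = tp / (tp + fn)
    specificity = tn / (tn + fp)
    ppv = tp / (tp + fp)
    npv = tn / (tn + fn)
    youden = sensitivity + specificity - 1.0
    return dict(sensitivity=sensitivity, specificity=specificity,
                ppv=ppv, npv=npv, youden=youden)

# Illustrative counts only, chosen to give figures of similar magnitude
print(delirium_validation_metrics(tp=88, fp=78, fn=12, tn=200))
```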
ERIC Educational Resources Information Center
Hallaq, Thomas G.
2013-01-01
While new technology continues to develop and become increasingly affordable, and students have increased access to electronic media, one might wonder if requiring such technology in the classroom is akin to throwing the car keys to a teen-ager who has not completed a driver's education course. Currently, no validated survey has been created…
The Trojan Lifetime Champions Health Survey: Development, Validity, and Reliability
Sorenson, Shawn C.; Romano, Russell; Scholefield, Robin M.; Schroeder, E. Todd; Azen, Stanley P.; Salem, George J.
2015-01-01
Context Self-report questionnaires are an important method of evaluating lifespan health, exercise, and health-related quality of life (HRQL) outcomes among elite, competitive athletes. Few instruments, however, have undergone formal characterization of their psychometric properties within this population. Objective To evaluate the validity and reliability of a novel health and exercise questionnaire, the Trojan Lifetime Champions (TLC) Health Survey. Design Descriptive laboratory study. Setting A large National Collegiate Athletic Association Division I university. Patients or Other Participants A total of 63 university alumni (age range, 24 to 84 years), including former varsity collegiate athletes and a control group of nonathletes. Intervention(s) Participants completed the TLC Health Survey twice at a mean interval of 23 days with randomization to the paper or electronic version of the instrument. Main Outcome Measure(s) Content validity, feasibility of administration, test-retest reliability, parallel-form reliability between paper and electronic forms, and estimates of systematic and typical error versus differences of clinical interest were assessed across a broad range of health, exercise, and HRQL measures. Results Correlation coefficients, including intraclass correlation coefficients (ICCs) for continuous variables and κ agreement statistics for ordinal variables, for test-retest reliability averaged 0.86, 0.90, 0.80, and 0.74 for HRQL, lifetime health, recent health, and exercise variables, respectively. Correlation coefficients, again ICCs and κ, for parallel-form reliability (ie, equivalence) between paper and electronic versions averaged 0.90, 0.85, 0.85, and 0.81 for HRQL, lifetime health, recent health, and exercise variables, respectively. Typical measurement error was less than the a priori thresholds of clinical interest, and we found minimal evidence of systematic test-retest error. We found strong evidence of content validity, convergent construct validity with the Short-Form 12 Version 2 HRQL instrument, and feasibility of administration in an elite, competitive athletic population. Conclusions These data suggest that the TLC Health Survey is a valid and reliable instrument for assessing lifetime and recent health, exercise, and HRQL, among elite competitive athletes. Generalizability of the instrument may be enhanced by additional, larger-scale studies in diverse populations. PMID:25611315
Electronic health records: what are the most important barriers?
Ayatollahi, Haleh; Mirani, Nader; Haghani, Hamid
2014-01-01
The process of design and adoption of electronic health records may face a number of barriers. This study aimed to compare the importance of the main barriers from the experts' point of views in Iran. This survey study was completed in 2011. The potential participants (62 experts) included faculty members who worked in departments of health information technology and individuals who worked in the Ministry of Health in Iran and were in charge of the development and adoption of electronic health records. No sampling method was used in this study. Data were collected using a Likert-scale questionnaire ranging from 1 to 5. The validity of the questionnaire was established using content and face validity methods, and the reliability was calculated using Cronbach's alpha coefficient. The response rate was 51.6 percent. The participants' perspectives showed that the most important barriers in the process of design and adoption of electronic health records were technical barriers (mean = 3.84). Financial and ethical-legal barriers, with the mean value of 3.80 were other important barriers, and individual and organizational barriers, with the mean values of 3.59 and 3.50 were found to be less important than other barriers from the experts' perspectives. Strategic planning for the creation and adoption of electronic health records in the country, creating a team of experts to assess the potential barriers and develop strategies to eliminate them, and allocating financial resources can help to overcome most important barriers to the adoption of electronic health records.
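Reliability of a Likert-scale questionnaire is commonly reported as Cronbach's alpha, as above. A minimal sketch of the standard formula on illustrative responses:

```python
import numpy as np

def cronbach_alpha(items):
    """Cronbach's alpha for an (n_respondents x n_items) array of Likert scores."""
    items = np.asarray(items, dtype=float)
    k = items.shape[1]
    item_vars = items.var(axis=0, ddof=1).sum()   # sum of per-item variances
    total_var = items.sum(axis=1).var(ddof=1)     # variance of total scores
    return k / (k - 1) * (1.0 - item_vars / total_var)

# Illustrative 1-5 Likert responses from 6 respondents to 4 items
scores = [[4, 5, 4, 4], [3, 3, 4, 3], [5, 5, 5, 4],
          [2, 3, 2, 3], [4, 4, 5, 4], [3, 4, 3, 3]]
print(round(cronbach_alpha(scores), 2))
```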
Render, Marta L; Freyberg, Ron W; Hasselbeck, Rachael; Hofer, Timothy P; Sales, Anne E; Deddens, James; Levesque, Odette; Almenoff, Peter L
2011-06-01
BACKGROUND: Veterans Health Administration (VA) intensive care units (ICUs) develop an infrastructure for quality improvement using information technology and recruiting leadership. METHODS: Setting: Participation by the 183 ICUs in the quality improvement program is required. Infrastructure includes measurement (electronic data extraction, analysis), quarterly web-based reporting and implementation support of evidence-based practices. Leaders prioritise measures based on quality improvement objectives. The electronic extraction is validated manually against the medical record, selecting hospitals whose data elements and measures fall at the extremes (10th, 90th percentile). Results are depicted in graphic, narrative and tabular reports benchmarked by type and complexity of ICU. RESULTS: The VA admits 103,689 ± 1,156 ICU patients/year. Variation in electronic business practices, data location and normal range of some laboratory tests affects data quality. A data management website captures data elements important to ICU performance and not available electronically. A dashboard manages the data overload (quarterly reports ranged 106-299 pages). More than 85% of ICU directors and nurse managers review their reports. Leadership interest is sustained by including ICU targets in executive performance contracts, identification of local improvement opportunities with analytic software, and focused reviews. CONCLUSION: Lessons relevant to non-VA institutions include the: (1) need for ongoing data validation, (2) essential involvement of leadership at multiple levels, (3) supplementation of electronic data when key elements are absent, (4) utility of a good but not perfect electronic indicator to move practice while improving data elements and (5) value of a dashboard.
NASA Astrophysics Data System (ADS)
Gok, Gokhan; Mosna, Zbysek; Arikan, Feza; Arikan, Orhan; Erdem, Esra
2016-07-01
Ionospheric observation is essentially accomplished by specialized radar systems called ionosondes. The time delay between the transmitted and received signals versus frequency is measured by the ionosondes, and the received signals are processed to generate ionogram plots, which show the time delay or reflection height of signals with respect to transmitted frequency. The critical frequencies of ionospheric layers and virtual heights, which provide useful information about ionospheric structure, can be extracted from ionograms. Ionograms also indicate the amount of variability or disturbances in the ionosphere. With special inversion algorithms and tomographical methods, electron density profiles can also be estimated from the ionograms. Although structural pictures of the ionosphere in the vertical direction can be observed from ionosonde measurements, some errors may arise due to inaccuracies in signal propagation, modeling, data processing and tomographic reconstruction algorithms. Recently the IONOLAB group (www.ionolab.org) developed a new algorithm for effective and accurate extraction of ionospheric parameters and reconstruction of electron density profiles from ionograms. The electron density reconstruction algorithm applies advanced optimization techniques to calculate the parameters of any existing analytical function which defines electron density with respect to height, using ionogram measurement data. The process of reconstructing electron density with respect to height is known as ionogram scaling or true height analysis. The IONOLAB-RAY algorithm is a tool to investigate the propagation path and parameters of HF waves in the ionosphere. The algorithm models the wave propagation using ray representation under the geometrical optics approximation. In the algorithm, the structural ionospheric characteristics are represented as realistically as possible, including anisotropy, inhomogeneity and time dependence in a 3-D voxel structure. The algorithm is also used for various purposes including calculation of actual height and generation of ionograms. In this study, the performance of the electron density reconstruction algorithm of the IONOLAB group and the standard electron density profile algorithms of ionosondes are compared with IONOLAB-RAY wave propagation simulation in near vertical incidence. The electron density reconstruction and parameter extraction algorithms of ionosondes are validated with the IONOLAB-RAY results both for quiet and disturbed ionospheric states in Central Europe using ionosonde stations such as Pruhonice and Juliusruh. It is observed that the IONOLAB ionosonde parameter extraction and electron density reconstruction algorithm performs significantly better compared to standard algorithms, especially for disturbed ionospheric conditions. IONOLAB-RAY provides an efficient and reliable tool to investigate and validate ionosonde electron density reconstruction algorithms, especially in determination of the reflection height (true height) of signals and critical parameters of the ionosphere. This study is supported by TUBITAK 114E541, 115E915 and Joint TUBITAK 114E092 and AS CR 14/001 projects.
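The reconstruction step above fits the parameters of an analytical electron-density function to scaled ionogram data. As a hedged stand-in (the abstract does not give IONOLAB's actual parameterization), a single alpha-Chapman layer can be fitted by least squares:

```python
import numpy as np
from scipy.optimize import curve_fit

def chapman(h, nm, hm, H):
    """Alpha-Chapman electron density profile; nm in 1e11 el/m^3, heights in km."""
    z = (h - hm) / H
    return nm * np.exp(0.5 * (1.0 - z - np.exp(-z)))

# Synthetic true-height samples standing in for scaled ionogram data
h = np.linspace(150, 450, 40)                         # km
rng = np.random.default_rng(0)
n_obs = chapman(h, 6.0, 300.0, 45.0) * (1 + 0.03 * rng.standard_normal(h.size))

popt, _ = curve_fit(chapman, h, n_obs, p0=[5.0, 280.0, 50.0])
nm, hm, H = popt
foF2 = np.sqrt(nm * 1e11 / 1.24e10)                   # critical frequency, MHz
print(f"NmF2={nm*1e11:.2e} el/m^3, hmF2={hm:.1f} km, foF2={foF2:.2f} MHz")
```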
Design for low-power and reliable flexible electronics
NASA Astrophysics Data System (ADS)
Huang, Tsung-Ching (Jim)
Flexible electronics are emerging as an alternative to conventional Si electronics for large-area low-cost applications such as e-paper, smart sensors, and disposable RFID tags. By utilizing inexpensive manufacturing methods such as ink-jet printing and roll-to-roll imprinting, flexible electronics can be made on low-cost plastics just like printing a newspaper. However, the key elements of flexible electronics, thin-film transistors (TFTs), have slower operating speeds and less reliability than their Si electronics counterparts. Furthermore, depending on the material property, TFTs are usually mono-type -- either p- or n-type -- devices. Making air-stable complementary TFT circuits is very challenging and not applicable to most TFT technologies. Existing design methodologies for Si electronics, therefore, cannot be directly applied to flexible electronics. Other inhibiting factors such as high supply voltage, large process variation, and lack of trustworthy device modeling also make designing larger-scale and robust TFT circuits a significant challenge. The major goal of this dissertation is to provide a viable solution for robust circuit design in flexible electronics. I will first introduce a reliability simulation framework that can predict the degraded TFT circuits' performance under bias-stress. This framework has been validated using the amorphous-silicon (a-Si) TFT scan driver for TFT-LCD displays. To reuse the existing CMOS design flow for flexible electronics, I propose a Pseudo-CMOS cell library that can make TFT circuits operable under low supply voltage and which has post-fabrication tunability for reliability and performance enhancement. This cell library has been validated using 2V self-assembly-monolayer (SAM) organic TFTs with a low-cost shadow-mask deposition process. I will also demonstrate a 3-bit 1.25KS/s Flash ADC in a-Si TFTs, which is based on the proposed Pseudo-CMOS cell library, and explore more possibilities in display, energy, and sensing applications.
Validation of GC and HPLC systems for residue studies
DOE Office of Scientific and Technical Information (OSTI.GOV)
Williams, M.
1995-12-01
For residue studies, GC and HPLC system performance must be validated prior to and during use. One excellent measure of system performance is the standard curve and associated chromatograms used to construct that curve. The standard curve is a model of system response to an analyte over a specific time period, and is prima facie evidence of system performance beginning at the autosampler and proceeding through the injector, column, detector, electronics, data-capture device, and printer/plotter. This tool measures the performance of the entire chromatographic system; its power negates most of the benefits associated with costly and time-consuming validation of individual system components. Other measures of instrument and method validation will be discussed, including quality control charts and experimental designs for method validation.
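A minimal sketch of building and checking a standard (calibration) curve as a whole-system performance measure; the concentrations, peak areas and acceptance checks below are illustrative:

```python
import numpy as np

# Illustrative calibration standards: concentration (ng/mL) vs. peak area
conc = np.array([0.5, 1.0, 2.0, 5.0, 10.0, 20.0])
area = np.array([12.1, 24.6, 48.9, 121.5, 246.0, 490.2])

# Ordinary least-squares line and coefficient of determination
slope, intercept = np.polyfit(conc, area, 1)
pred = slope * conc + intercept
ss_res = np.sum((area - pred) ** 2)
ss_tot = np.sum((area - area.mean()) ** 2)
r2 = 1.0 - ss_res / ss_tot

# Back-calculated concentrations flag drift anywhere in the system chain
back_calc = (area - intercept) / slope
pct_dev = 100.0 * (back_calc - conc) / conc
print(f"slope={slope:.2f}, intercept={intercept:.2f}, r^2={r2:.4f}")
print("back-calculated % deviation:", np.round(pct_dev, 1))
```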
Multidimensional measures validated for home health needs of older persons: A systematic review.
de Rossi Figueiredo, Daniela; Paes, Lucilene Gama; Warmling, Alessandra Martins; Erdmann, Alacoque Lorenzini; de Mello, Ana Lúcia Schaefer Ferreira
2018-01-01
To conduct a systematic review of the literature on valid and reliable multidimensional instruments to assess home health needs of older persons. Systematic review. Electronic databases, PubMed/Medline, Web of Science, Scopus, Cumulative Index to Nursing and Allied Health Literature, Scientific Electronic Library Online and the Latin American and Caribbean Health Sciences Information. All English, Portuguese and Spanish literature which included studies of reliability and validity of instruments that assessed at least two dimensions: physical, psychological, social support and functional independence, self-rated health behaviors and contextual environment, and if such instruments proposed interventions after evaluation and/or monitoring changes over a period of time. Older persons aged 60 years or older. Of the 2397 studies identified, 32 were considered eligible. Two-thirds of the instruments proposed the physical, psychological, social support and functional independence dimensions. Inter-observer and intra-observer reliability and internal consistency values were 0.7 or above. More than two-thirds of the studies included validity (n=26) and more than one validity was tested in 15% (n=4) of these. Only 7% (n=2) proposed interventions after evaluation and/or monitoring changes over a period of time. Although the multidimensional assessment was performed, and the reliability values of the reviewed studies were satisfactory, different validity tests were not present in several studies. A gap in instrument conception was observed regarding interventions after evaluation and/or the monitoring of changes over a period of time. Further studies with this purpose are needed to address the home health needs of older persons. Copyright © 2017 Elsevier Ltd. All rights reserved.
Clegg, Andrew; Bates, Chris; Young, John; Ryan, Ronan; Nichols, Linda; Ann Teale, Elizabeth; Mohammed, Mohammed A.; Parry, John; Marshall, Tom
2016-01-01
Background: frailty is an especially problematic expression of population ageing. International guidelines recommend routine identification of frailty to provide evidence-based treatment, but currently available tools require additional resource. Objectives: to develop and validate an electronic frailty index (eFI) using routinely available primary care electronic health record data. Study design and setting: retrospective cohort study. Development and internal validation cohorts were established using a randomly split sample of the ResearchOne primary care database. External validation cohort established using THIN database. Participants: patients aged 65–95, registered with a ResearchOne or THIN practice on 14 October 2008. Predictors: we constructed the eFI using the cumulative deficit frailty model as our theoretical framework. The eFI score is calculated by the presence or absence of individual deficits as a proportion of the total possible. Categories of fit, mild, moderate and severe frailty were defined using population quartiles. Outcomes: outcomes were 1-, 3- and 5-year mortality, hospitalisation and nursing home admission. Statistical analysis: hazard ratios (HRs) were estimated using bivariate and multivariate Cox regression analyses. Discrimination was assessed using receiver operating characteristic (ROC) curves. Calibration was assessed using pseudo-R2 estimates. Results: we include data from a total of 931,541 patients. The eFI incorporates 36 deficits constructed using 2,171 CTV3 codes. One-year adjusted HR for mortality was 1.92 (95% CI 1.81–2.04) for mild frailty, 3.10 (95% CI 2.91–3.31) for moderate frailty and 4.52 (95% CI 4.16–4.91) for severe frailty. Corresponding estimates for hospitalisation were 1.93 (95% CI 1.86–2.01), 3.04 (95% CI 2.90–3.19) and 4.73 (95% CI 4.43–5.06) and for nursing home admission were 1.89 (95% CI 1.63–2.15), 3.19 (95% CI 2.73–3.73) and 4.76 (95% CI 3.92–5.77), with good to moderate discrimination but low calibration estimates. Conclusions: the eFI uses routine data to identify older people with mild, moderate and severe frailty, with robust predictive validity for outcomes of mortality, hospitalisation and nursing home admission. Routine implementation of the eFI could enable delivery of evidence-based interventions to improve outcomes for this vulnerable group. PMID:26944937
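A minimal sketch of the cumulative-deficit scoring described above: the eFI is the number of deficits present divided by the 36 possible. The category cut-points in the sketch are illustrative placeholders; the study derived them from population quartiles.

```python
def efi_score(deficits_present, total_deficits=36):
    """Electronic frailty index: proportion of possible deficits recorded."""
    return deficits_present / total_deficits

def efi_category(score, cuts=(0.12, 0.24, 0.36)):
    """Map an eFI score to fit/mild/moderate/severe frailty.
    The cut-points here are illustrative; the study defined categories
    using population quartiles of the development cohort."""
    labels = ("fit", "mild", "moderate", "severe")
    for cut, label in zip(cuts, labels):
        if score <= cut:
            return label
    return labels[-1]

score = efi_score(deficits_present=10)   # e.g. 10 of 36 deficits coded
print(round(score, 3), efi_category(score))
```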
NASA Astrophysics Data System (ADS)
Förster, Matthias; Rashev, Mikhail; Haaland, Stein
2017-04-01
The Electron Drift Instrument (EDI) onboard Cluster can measure 500 eV and 1 keV electron fluxes with high time resolution during passive operation phases in its Ambient Electron (AE) mode. Data from this mode is available in the Cluster Science Archive since October 2004 with a cadence of 16 Hz in the normal mode or 128 Hz for burst mode telemetry intervals. The fluxes are recorded at pitch angles of 0, 90, and 180 degrees. This paper describes the calibration and validation of these measurements. The high resolution AE data allow precise temporal and spatial diagnostics of magnetospheric boundaries and will be used for case studies and statistical studies of low energy electron fluxes in the near-Earth space. We show examples of applications.
Figure Control of Lightweight Optical Structures
NASA Technical Reports Server (NTRS)
Main, John A.; Song, Haiping
2005-01-01
The goal of this paper is to demonstrate the use of fuzzy logic controllers in modifying the figure of a piezoceramic bimorph mirror. Non-contact electron actuation technology is used to actively control a bimorph mirror comprising two PZT-5H wafers by varying the electron flux and electron voltages. Due to electron blooming generated by the electron flux, it is difficult to develop an accurate control model for the bimorph mirror through theoretical analysis alone. The non-contact shape control system with electron flux blooming can be approximately described with a heuristic model based on experimental data. Two fuzzy logic feedback controllers are developed to control the shape of the bimorph mirror according to heuristic fuzzy inference rules generated from previous experimental results. Validation of the proposed fuzzy logic controllers is also discussed.
Quantitative studies on structure-DPPH• scavenging activity relationships of food phenolic acids.
Jing, Pu; Zhao, Shu-Juan; Jian, Wen-Jie; Qian, Bing-Jun; Dong, Ying; Pang, Jie
2012-11-01
Phenolic acids are potent antioxidants, yet the quantitative structure-activity relationships of phenolic acids remain unclear. The purpose of this study was to establish 3D-QSAR models able to predict phenolic acids with high DPPH• scavenging activity and understand their structure-activity relationships. The model has been established by using a training set of compounds with cross-validated q2 = 0.638/0.855, non-cross-validated r2 = 0.984/0.986, standard error of estimate = 0.236/0.216, and F = 139.126/208.320 for the best CoMFA/CoMSIA models. The predictive ability of the models was validated with the correlation coefficient r2(pred) = 0.971/0.996 (>0.6) for each model. Additionally, the contour map results suggested that structural characteristics of phenolics acids favorable for the high DPPH• scavenging activity might include: (1) bulky and/or electron-donating substituent groups on the phenol ring; (2) electron-donating groups at the meta-position and/or hydrophobic groups at the meta-/ortho-position; (3) hydrogen-bond donor/electron-donating groups at the ortho-position. The results have been confirmed based on structural analyses of phenolic acids and their DPPH• scavenging data from eight recent publications. The findings may provide deeper insight into the antioxidant mechanisms and provide useful information for selecting phenolic acids for free radical scavenging properties.
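The cross-validated q² quoted above is the leave-one-out PRESS statistic. A minimal sketch, applied to a plain least-squares model as a stand-in, since the CoMFA/CoMSIA field descriptors cannot be reproduced from the abstract:

```python
import numpy as np

def loo_q2(X, y, fit=lambda X, y: np.linalg.lstsq(X, y, rcond=None)[0]):
    """Leave-one-out cross-validated q^2 = 1 - PRESS / SS_total."""
    X, y = np.asarray(X, float), np.asarray(y, float)
    press = 0.0
    for i in range(len(y)):
        mask = np.arange(len(y)) != i          # hold out compound i
        coef = fit(X[mask], y[mask])
        press += (y[i] - X[i] @ coef) ** 2
    ss_tot = np.sum((y - y.mean()) ** 2)
    return 1.0 - press / ss_tot

# Illustrative descriptor matrix (with intercept column) and activity values
rng = np.random.default_rng(1)
X = np.column_stack([np.ones(12), rng.normal(size=(12, 2))])
y = X @ np.array([0.5, 1.2, -0.8]) + 0.1 * rng.normal(size=12)
print(round(loo_q2(X, y), 3))
```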
Note: Design of transverse electron gun for electron beam based reactive evaporation system.
Maiti, Namita; Barve, U D; Bhatia, M S; Das, A K
2011-05-01
In this paper design of a 10 kV, 10 kW transverse electron gun, suitable for reactive evaporation, supported by simulation and modeling, is presented. Simulation of the electron beam trajectory helps in locating the emergence aperture after 90° bend and also in designing the crucible on which the beam is finally incident after 270° bend. The dimension of emergence aperture plays a vital role in designing the differential pumping system between the gun chamber and the substrate chamber. Experimental validation is done for beam trajectory by piercing a stainless steel plate at 90° position which is kept above the crucible.
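A quick back-of-envelope check of the bend geometry: the radius of the electron trajectory in the transverse field for 10 keV electrons. The field strength used below is an assumed, illustrative value; the abstract gives only the accelerating voltage.

```python
import math

E_K = 10e3 * 1.602176634e-19        # kinetic energy of 10 keV electrons, J
M_E = 9.1093837015e-31              # electron mass, kg
Q_E = 1.602176634e-19               # elementary charge, C
C = 299792458.0                     # speed of light, m/s

# Relativistic momentum from kinetic energy: (pc)^2 = T^2 + 2 T m c^2
pc = math.sqrt(E_K**2 + 2.0 * E_K * M_E * C**2)
p = pc / C

B = 0.01                            # assumed transverse field, tesla (illustrative)
r = p / (Q_E * B)                   # radius of the circular bend
print(f"bend radius ≈ {r*100:.1f} cm for B = {B*1e3:.0f} mT")
```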
Shepard, John; Hadhazy, Eric; Frederick, John; Nicol, Spencer; Gade, Padmaja; Cardon, Andrew; Wilson, Jorge; Vetteth, Yohan; Madison, Sasha
2014-03-01
Streamlining health care-associated infection surveillance is essential for health care facilities owing to the continuing increases in reporting requirements. Stanford Hospital, a 583-bed adult tertiary care center, used its electronic medical record (EMR) to develop an electronic algorithm to reduce the time required to conduct catheter-associated urinary tract infection (CAUTI) surveillance in adults. The algorithm provides inclusion and exclusion criteria, using the National Healthcare Safety Network definitions, for patients with a CAUTI. The algorithm was validated by trained infection preventionists through complete chart review for a random sample of cultures collected during the study period, September 1, 2012, to February 28, 2013. During the study period, a total of 6,379 positive urine cultures were identified. The Stanford Hospital electronic CAUTI algorithm identified 6,101 of these positive cultures (95.64%) as not a CAUTI, 191 (2.99%) as a possible CAUTI requiring further validation, and 87 (1.36%) as a definite CAUTI. Overall, use of the algorithm reduced CAUTI surveillance requirements at Stanford Hospital by 97.01%. The electronic algorithm proved effective in increasing the efficiency of CAUTI surveillance. The data suggest that CAUTI surveillance using the National Healthcare Safety Network definitions can be fully automated. Copyright © 2014 Association for Professionals in Infection Control and Epidemiology, Inc. All rights reserved.
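The abstract does not publish the algorithm itself; the sketch below only illustrates the general pattern of applying NHSN-style inclusion/exclusion rules to a positive urine culture record. Field names and cut-offs are hypothetical placeholders, not Stanford's criteria.

```python
def classify_culture(record):
    """Bucket a positive urine culture as 'not CAUTI', 'possible CAUTI'
    (route to manual review), or 'CAUTI' using simplified NHSN-style rules.
    All field names and thresholds are illustrative placeholders."""
    # Exclusion: no indwelling catheter in place long enough
    if record["catheter_days"] < 2:
        return "not CAUTI"
    # Exclusion: culture below the colony-count threshold
    if record["cfu_per_ml"] < 1e5:
        return "not CAUTI"
    # Inclusion: documented symptom plus qualifying culture -> definite
    if record["fever_or_symptoms"]:
        return "CAUTI"
    # Otherwise ambiguous -> infection preventionist review
    return "possible CAUTI"

print(classify_culture({"catheter_days": 4, "cfu_per_ml": 2e5,
                        "fever_or_symptoms": True}))
```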
DOE Office of Scientific and Technical Information (OSTI.GOV)
Wang, Yi; Park, Yang-Kyun; Doppke, Karen P.
2015-06-15
Purpose: This study evaluated the performance of the electron Monte Carlo dose calculation algorithm in RayStation v4.0 for an Elekta machine with Agility™ treatment head. Methods: The machine has five electron energies (6–18 MeV) and five applicators (6×6 to 25×25 cm²). The dose (cGy/MU at dmax), depth dose and profiles were measured in water using an electron diode at 100 cm SSD for nine square fields ≥2×2 cm² and four complex fields at normal incidence, and a 14×14 cm² field at 15° and 30° incidence. The dose was also measured for three square fields ≥4×4 cm² at 98, 105 and 110 cm SSD. Using selected energies, the EBT3 radiochromic film was used for dose measurements in slab-shaped inhomogeneous phantoms and a breast phantom with surface curvature. The measured and calculated doses were analyzed using a gamma criterion of 3%/3 mm. Results: The calculated and measured doses varied by <3% for 116 of the 120 points, and <5% for the 4×4 cm² field at 110 cm SSD at 9–18 MeV. The gamma analysis comparing the 105 pairs of in-water isodoses passed by >98.1%. The planar doses measured from films placed at 0.5 cm below a lung/tissue layer (12 MeV) and 1.0 cm below a bone/air layer (15 MeV) showed excellent agreement with calculations, with gamma passing by 99.9% and 98.5%, respectively. At the breast-tissue interface, the gamma passing rate is >98.8% at 12–18 MeV. The film results directly validated the accuracy of MU calculation and spatial dose distribution in the presence of tissue inhomogeneity and surface curvature - situations challenging for simpler pencil-beam algorithms. Conclusion: The electron Monte Carlo algorithm in RayStation v4.0 is fully validated for clinical use for the Elekta Agility™ machine. The comprehensive validation included small fields, complex fields, oblique beams, extended distance, tissue inhomogeneity and surface curvature.
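Comparisons of this kind are scored with a gamma criterion (here 3%/3 mm). A simplified 1D global-gamma sketch on synthetic profiles; the actual analysis was performed on measured planar and depth-dose data:

```python
import numpy as np

def gamma_1d(x_ref, d_ref, x_eval, d_eval, dd=0.03, dta=3.0):
    """Global 1D gamma index: for each evaluation point, search the reference
    profile for the minimum combined dose-difference / distance-to-agreement
    metric. Positions in mm, doses in the same (arbitrary) units."""
    d_max = d_ref.max()                      # global dose normalization
    gammas = np.empty_like(d_eval)
    for i, (xe, de) in enumerate(zip(x_eval, d_eval)):
        dose_term = (d_ref - de) / (dd * d_max)
        dist_term = (x_ref - xe) / dta
        gammas[i] = np.sqrt(dose_term**2 + dist_term**2).min()
    return gammas

# Synthetic calculated vs. measured electron depth-dose-like profiles
x = np.linspace(0, 60, 121)                              # mm
calc = np.exp(-((x - 22) / 14) ** 2)
meas = np.exp(-((x - 22.5) / 14) ** 2) * 1.01            # small shift + scaling
g = gamma_1d(x, calc, x, meas)
print(f"gamma pass rate (<=1): {100 * np.mean(g <= 1):.1f}%")
```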
Validation of a Delirium Risk Assessment Using Electronic Medical Record Information.
Rudolph, James L; Doherty, Kelly; Kelly, Brittany; Driver, Jane A; Archambault, Elizabeth
2016-03-01
Identifying patients at risk for delirium allows prompt application of prevention, diagnostic, and treatment strategies; but is rarely done. Once delirium develops, patients are more likely to need posthospitalization skilled care. This study developed an a priori electronic prediction rule using independent risk factors identified in a National Center of Clinical Excellence meta-analysis and validated the ability to predict delirium in 2 cohorts. Retrospective analysis followed by prospective validation. Tertiary VA Hospital in New England. A total of 27,625 medical records of hospitalized patients and 246 prospectively enrolled patients admitted to the hospital. The electronic delirium risk prediction rule was created using data obtained from the patient electronic medical record (EMR). The primary outcome, delirium, was identified 2 ways: (1) from the EMR (retrospective cohort) and (2) clinical assessment on enrollment and daily thereafter (prospective participants). We assessed discrimination of the delirium prediction rule with the C-statistic. Secondary outcomes were length of stay and discharge to rehabilitation. Retrospectively, delirium was identified in 8% of medical records (n = 2343); prospectively, delirium during hospitalization was present in 26% of participants (n = 64). In the retrospective cohort, medical record delirium was identified in 2%, 3%, 11%, and 38% of the low, intermediate, high, and very high-risk groups, respectively (C-statistic = 0.81; 95% confidence interval 0.80-0.82). Prospectively, the electronic prediction rule identified delirium in 15%, 18%, 31%, and 55% of these groups (C-statistic = 0.69; 95% confidence interval 0.61-0.77). Compared with low-risk patients, those at high- or very high delirium risk had increased length of stay (5.7 ± 5.6 vs 3.7 ± 2.7 days; P = .001) and higher rates of discharge to rehabilitation (8.9% vs 20.8%; P = .02). Automatic calculation of delirium risk using an EMR algorithm identifies patients at risk for delirium, which creates a critical opportunity for gaining clinical efficiencies and improving delirium identification, including those needing skilled care. Published by Elsevier Inc.
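The C-statistic reported above is the area under the ROC curve of the risk score against observed delirium, computable directly from the rank (Mann-Whitney) relationship. Illustrative data only:

```python
import numpy as np

def c_statistic(risk_scores, outcomes):
    """C-statistic (ROC AUC) via the Mann-Whitney relationship: the probability
    that a randomly chosen case has a higher score than a randomly chosen
    non-case, counting ties as one half."""
    scores = np.asarray(risk_scores, float)
    y = np.asarray(outcomes, int)
    cases, controls = scores[y == 1], scores[y == 0]
    greater = (cases[:, None] > controls[None, :]).sum()
    ties = (cases[:, None] == controls[None, :]).sum()
    return (greater + 0.5 * ties) / (len(cases) * len(controls))

# Illustrative risk scores (e.g. counts of EMR risk factors) and outcomes
scores = [0, 1, 1, 2, 2, 3, 3, 4, 5, 6]
delirium = [0, 0, 0, 0, 1, 0, 1, 1, 1, 1]
print(round(c_statistic(scores, delirium), 2))
```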
Automated quality checks on repeat prescribing.
Rogers, Jeremy E; Wroe, Christopher J; Roberts, Angus; Swallow, Angela; Stables, David; Cantrill, Judith A; Rector, Alan L
2003-01-01
BACKGROUND: Good clinical practice in primary care includes periodic review of repeat prescriptions. Markers of prescriptions that may need review have been described, but manually checking all repeat prescriptions against the markers would be impractical. AIM: To investigate the feasibility of computerising the application of repeat prescribing quality checks to electronic patient records in United Kingdom (UK) primary care. DESIGN OF STUDY: Software performance test against benchmark manual analysis of cross-sectional convenience sample of prescribing documentation. SETTING: Three general practices in Greater Manchester, in the north west of England, during a 4-month period in 2001. METHOD: A machine-readable drug information resource, based on the British National Formulary (BNF) as the 'gold standard' for valid drug indications, was installed in three practices. Software raised alerts for each repeat prescribed item where the electronic patient record contained no valid indication for the medication. Alerts raised by the software in two practices were analysed manually. Clinical reaction to the software was assessed by semi-structured interviews in three practices. RESULTS: There was no valid indication in the electronic medical records for 14.8% of repeat prescribed items. Sixty-two per cent of all alerts generated were incorrect. Forty-three per cent of all incorrect alerts were the result of errors in the drug information resource, 44% due to locally idiosyncratic clinical coding, 8% due to the use of the BNF without adaptation as a gold standard, and 5% due to the inability of the system to infer diagnoses that, although unrecorded, would be 'obvious' to a clinician reading the record. The interviewed clinicians supported the goals of the software. CONCLUSION: Using electronic records for secondary decision support purposes will benefit from (and may require) both more consistent electronic clinical data collection across multiple sites, and reconciling clinicians' willingness to infer unstated but 'obvious' diagnoses with the machine's inability to do the same. PMID:14702902
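A minimal sketch of the core check: raise an alert for each repeat-prescribed item with no valid indication coded in the electronic record. The drug-to-indication mapping and codes below are hypothetical placeholders for the BNF-derived resource used in the study.

```python
# Illustrative drug -> acceptable indication codes mapping (placeholder for
# the machine-readable BNF-based resource described in the study)
VALID_INDICATIONS = {
    "levothyroxine": {"C04"},          # hypothyroidism (illustrative code)
    "metformin": {"C10F"},             # type 2 diabetes (illustrative code)
}

def indication_alerts(repeat_items, problem_codes):
    """Return the repeat-prescribed drugs with no valid indication coded
    in the patient's electronic record."""
    alerts = []
    for drug in repeat_items:
        accepted = VALID_INDICATIONS.get(drug, set())
        if not accepted & set(problem_codes):
            alerts.append(drug)
    return alerts

# Patient on two repeats, but only a diabetes code present in the record
print(indication_alerts(["levothyroxine", "metformin"], ["C10F"]))
```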
Sebrow, Dov; Lavery, Hugh J; Brajtbord, Jonathan S; Hobbs, Adele; Levinson, Adam W; Samadi, David B
2012-02-01
To describe a novel, low-cost, online health-related quality of life (HRQOL) survey that allows for automated follow-up and convenient access for patients in geographically diverse locations. Clinicians and investigators have been encouraged to use validated HRQOL instruments when reporting outcomes after radical prostatectomy. The institutional review board approved our protocol and the use of a secure web site (http://www.SurveyMonkey.com) to send patients a collection of validated postprostatectomy HRQOL instruments by electronic mail. To assess compliance with the electronic mail format, a pilot study of cross-sectional surveys was sent to patients who presented for follow-up after robotic-assisted laparoscopic prostatectomy. The response data were transmitted in secure fashion in compliance with the Health Insurance Portability and Accountability Act. After providing written informed consent, 514 patients who presented for follow-up after robotic-assisted laparoscopic prostatectomy from March 2010 to February 2011 were sent the online survey. A total of 293 patients (57%) responded, with an average age of 60 years and a median interval from surgery of 12 months. Of the respondents, 75% completed the survey within 4 days of receiving the electronic mail, with a median completion time of 15 minutes. The total survey administration costs were limited to the web site's $200 annual fee-for-service. An online survey can be a low-cost, efficient, and confidential modality for assessing validated HRQOL outcomes in patients who undergo treatment of localized prostate cancer. This method could be especially useful for those who cannot return for follow-up because of geographic reasons. Copyright © 2012 Elsevier Inc. All rights reserved.
Wright, J M; Mattu, G S; Perry, T L; Gelferc, M E; Strange, K D; Zorn, A; Chen, Y
2001-06-01
To test the accuracy of a new algorithm for the BPM-100, an automated oscillometric blood pressure (BP) monitor, using stored data from an independently conducted validation trial comparing the BPM-100(Beta) with a mercury sphygmomanometer. Raw pulse wave and cuff pressure data were stored electronically using embedded software in the BPM-100(Beta), during the validation trial. The 391 sets of measurements were separated objectively into two subsets. A subset of 136 measurements was used to develop a new algorithm to enhance the accuracy of the device when reading higher systolic pressures. The larger subset of 255 measurements (three readings for 85 subjects) was used as test data to validate the accuracy of the new algorithm. Differences between the new algorithm BPM-100 and the reference (mean of two observers) were determined and expressed as the mean difference +/- SD, plus the percentage of measurements within 5, 10, and 15 mmHg. The mean difference between the BPM-100 and reference systolic BP was -0.16 +/- 5.13 mmHg, with 73.7% < or = 5 mmHg, 94.9% < or = 10 mmHg and 98.8% < or = 15 mmHg. The mean difference between the BPM-100 and reference diastolic BP was -1.41 +/- 4.67 mmHg, with 78.4% < or = 5 mmHg, 92.5% < or = 10 mmHg, and 99.2% < or = 15 mmHg. These data improve upon that of the BPM-100(Beta) and pass the AAMI standard, and 'A' grade BHS protocol. This study illustrates a new method for developing and testing a change in an algorithm for an oscillometric BP monitor utilizing collected and stored electronic data and demonstrates that the new algorithm meets the AAMI standard and BHS protocol.
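A sketch of how the accuracy figures above are derived: paired device-minus-reference differences summarized as mean ± SD and cumulative percentages within 5/10/15 mmHg, with a check against the AAMI limits (mean ≤ 5 mmHg, SD ≤ 8 mmHg). The readings below are synthetic.

```python
import numpy as np

def bp_validation_summary(device, reference):
    """Mean difference +/- SD, cumulative % within 5/10/15 mmHg, and a check
    against the AAMI limits (|mean| <= 5 mmHg, SD <= 8 mmHg)."""
    diff = np.asarray(device, float) - np.asarray(reference, float)
    mean, sd = diff.mean(), diff.std(ddof=1)
    within = {t: 100.0 * np.mean(np.abs(diff) <= t) for t in (5, 10, 15)}
    passes_aami = abs(mean) <= 5.0 and sd <= 8.0
    return mean, sd, within, passes_aami

# Synthetic paired readings (device vs. mean of two observers), mmHg
rng = np.random.default_rng(3)
ref = rng.normal(130, 18, 255)
dev = ref + rng.normal(-0.2, 5.0, 255)
mean, sd, within, ok = bp_validation_summary(dev, ref)
print(f"{mean:+.2f} +/- {sd:.2f} mmHg, within 5/10/15 mmHg: "
      f"{within[5]:.1f}/{within[10]:.1f}/{within[15]:.1f}%, AAMI pass: {ok}")
```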
Wierzbicki, W; Nicol, S; Furstoss, C; Brunet-Benkhoucha, M; Leduc, V
2012-07-01
A newly acquired nanoDot In-Light system was compared with TLD-100 dosimeters to confirm the treatment dose in multiple cases: an electron eye treatment, H&N IMRT, and VMAT validation for small targets. Eye tumour treatment with 9 MeV electrons: a dose of 1.8 Gy per fraction was prescribed to the 85% isodose. The average dose measured by three TLDs and three Dots was 1.90 and 1.97 Gy. Both detectors overestimated dose, by 2.9% and 6.7% respectively. H&N IMRT treatment of skin cancer with 6 MV photons: dose per fraction was 2.5 Gy. The average doses measured by two TLDs and two Dots were 2.48 and 2.56 Gy, which represent errors of -0.8% and 2.2%, respectively. VMAT validation for small targets using an Agarose phantom, dose 15 Gy: a single-tumour brain treatment was delivered using two coplanar arcs to an Agarose phantom containing a large plastic insert holding 3 nanoDots and 4 TLDs. The difference between the average Pinnacle dose and the average dose of the corresponding detectors was -0.6% for Dots and -1.7% for TLDs. A two-tumour brain treatment was delivered using three non-coplanar arcs. Small and large plastic inserts separated by 5 cm were used to validate the dose. The difference between the average Pinnacle dose and the average dose of the corresponding detectors was the following: small phantom, 0.7% for Dots and 0.3% for TLDs; large phantom, -1.9% for Dots and -0.6% for TLDs. In conclusion, nanoDot detectors are suitable for in-vivo dosimetry with photon and electron beams. © 2012 American Association of Physicists in Medicine.
PheKB: a catalog and workflow for creating electronic phenotype algorithms for transportability.
Kirby, Jacqueline C; Speltz, Peter; Rasmussen, Luke V; Basford, Melissa; Gottesman, Omri; Peissig, Peggy L; Pacheco, Jennifer A; Tromp, Gerard; Pathak, Jyotishman; Carrell, David S; Ellis, Stephen B; Lingren, Todd; Thompson, Will K; Savova, Guergana; Haines, Jonathan; Roden, Dan M; Harris, Paul A; Denny, Joshua C
2016-11-01
Health care generated data have become an important source for clinical and genomic research. Often, investigators create and iteratively refine phenotype algorithms to achieve high positive predictive values (PPVs) or sensitivity, thereby identifying valid cases and controls. These algorithms achieve the greatest utility when validated and shared by multiple health care systems. Materials and Methods: We report the current status and impact of the Phenotype KnowledgeBase (PheKB, http://phekb.org), an online environment supporting the workflow of building, sharing, and validating electronic phenotype algorithms. We analyze the most frequent components used in algorithms and their performance at authoring institutions and secondary implementation sites. As of June 2015, PheKB contained 30 finalized phenotype algorithms and 62 algorithms in development spanning a range of traits and diseases. Phenotypes have had over 3500 unique views in a 6-month period and have been reused by other institutions. International Classification of Disease codes were the most frequently used component, followed by medications and natural language processing. Among algorithms with published performance data, the median PPV was nearly identical when evaluated at the authoring institutions (n = 44; case 96.0%, control 100%) compared to implementation sites (n = 40; case 97.5%, control 100%). These results demonstrate that a broad range of algorithms to mine electronic health record data from different health systems can be developed with high PPV, and algorithms developed at one site are generally transportable to others. By providing a central repository, PheKB enables improved development, transportability, and validity of algorithms for research-grade phenotypes using health care generated data. © The Author 2016. Published by Oxford University Press on behalf of the American Medical Informatics Association. All rights reserved. For Permissions, please email: journals.permissions@oup.com.
Śliwińska, Magdalena; Garcia-Hernandez, Celia; Kościński, Mikołaj; Dymerski, Tomasz; Wardencki, Waldemar; Namieśnik, Jacek; Śliwińska-Bartkowiak, Małgorzata; Jurga, Stefan; Garcia-Cabezon, Cristina; Rodriguez-Mendez, Maria Luz
2016-01-01
The capability of a phthalocyanine-based voltammetric electronic tongue to analyze strong alcoholic beverages has been evaluated and compared with the performance of spectroscopic techniques coupled to chemometrics. Nalewka Polish liqueurs prepared from five apple varieties have been used as a model of strong liqueurs. Principal Component Analysis has demonstrated that the best discrimination between liqueurs prepared from different apple varieties is achieved using the e-tongue and UV-Vis spectroscopy. Raman spectra coupled to chemometrics have not been efficient in discriminating liqueurs. The calculated Euclidean distances and the k-Nearest Neighbors algorithm (kNN) confirmed these results. The main advantage of the e-tongue is that, using PLS-1, good correlations have been found simultaneously with the phenolic content measured by the Folin–Ciocalteu method (R2 of 0.97 in calibration and R2 of 0.93 in validation) and also with the density, a marker of the alcoholic content method (R2 of 0.93 in calibration and R2 of 0.88 in validation). UV-Vis coupled with chemometrics has shown good correlations only with the phenolic content (R2 of 0.99 in calibration and R2 of 0.99 in validation) but correlations with the alcoholic content were low. Raman coupled with chemometrics has shown good correlations only with density (R2 of 0.96 in calibration and R2 of 0.85 in validation). In summary, from the three holistic methods evaluated to analyze strong alcoholic liqueurs, the voltammetric electronic tongue using phthalocyanines as sensing elements is superior to Raman or UV-Vis techniques because it shows an excellent discrimination capability and remarkable correlations with both antioxidant capacity and alcoholic content—the most important parameters to be measured in this type of liqueurs. PMID:27735832
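A minimal sketch of the PLS-1 calibration step: regressing a single reference property (e.g. Folin-Ciocalteu phenolic content) on multivariate e-tongue voltammograms. The data are synthetic and the component count is illustrative.

```python
import numpy as np
from sklearn.cross_decomposition import PLSRegression
from sklearn.model_selection import train_test_split
from sklearn.metrics import r2_score

# Synthetic stand-in for e-tongue voltammograms (samples x potential steps)
rng = np.random.default_rng(7)
X = rng.normal(size=(40, 120))
true_w = rng.normal(size=120)
y = (X @ true_w) * 0.05 + rng.normal(scale=0.3, size=40)   # "phenolic content"

X_cal, X_val, y_cal, y_val = train_test_split(X, y, test_size=0.3,
                                              random_state=0)
pls = PLSRegression(n_components=3)          # component count is illustrative
pls.fit(X_cal, y_cal)
print("R2 calibration:", round(r2_score(y_cal, pls.predict(X_cal).ravel()), 3))
print("R2 validation:", round(r2_score(y_val, pls.predict(X_val).ravel()), 3))
```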
Validation of an electronic version of the Mini Asthma Quality of Life Questionnaire.
Olajos-Clow, J; Minard, J; Szpiro, K; Juniper, E F; Turcotte, S; Jiang, X; Jenkins, B; Lougheed, M D
2010-05-01
The Mini Asthma Quality of Life Questionnaire (MiniAQLQ) is a validated disease-specific quality of life (QOL) paper (p) questionnaire. Electronic (e) versions enable inclusion of asthma QOL in electronic medical records and research databases. To validate an e-version of the MiniAQLQ, compare the time required for completion of the e- and p-versions, and determine which version participants prefer. Adults with stable asthma were randomized to complete either the e- or p-MiniAQLQ, followed by a 2-h rest period before completing the other version. Agreement between versions was measured using the intraclass correlation coefficient (ICC) and Bland-Altman analysis. Two participants with incomplete p-MiniAQLQ responses were excluded. Forty participants (85% female; age 47.7 ± 14.9 years; asthma duration 22.6 ± 16.1 years; FEV(1) 87.1 ± 21.6% predicted) with both AQLQ scores <6.0 completed the study. Agreement between e- and p-versions for the overall score was acceptable (ICC=0.95) with no bias (difference (Delta) p-e=0.1; P=0.21). ICCs for the symptom, activity limitation, emotional function and environmental stimuli domains were 0.94, 0.89, 0.90, and 0.91 respectively. A small but significant bias (Delta=0.3; P=0.004) was noted in the activity limitation domain. Completion time was significantly longer for the e-version (3.8 ± 1.9 min versus 2.7 ± 1.1 min; P<0.0001). The majority of patients (57.5%) preferred the e-MiniAQLQ; 35% had no preference. This e-version of the MiniAQLQ is valid and was preferred by most participants despite taking slightly longer to complete. Generalizability may be limited in younger (12-17) and older (>65) adults.
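A minimal sketch of the Bland-Altman agreement analysis used above, on synthetic paper and electronic overall scores:

```python
import numpy as np

def bland_altman(paper, electronic):
    """Bias (mean paper-minus-electronic difference) and 95% limits of agreement."""
    diff = np.asarray(paper, float) - np.asarray(electronic, float)
    bias = diff.mean()
    sd = diff.std(ddof=1)
    return bias, (bias - 1.96 * sd, bias + 1.96 * sd)

# Synthetic overall MiniAQLQ scores (1-7 scale) for 40 participants
rng = np.random.default_rng(11)
paper = rng.uniform(2.0, 6.0, 40)
electronic = paper + rng.normal(0.05, 0.25, 40)
bias, (lo, hi) = bland_altman(paper, electronic)
print(f"bias {bias:+.2f}, 95% limits of agreement ({lo:+.2f}, {hi:+.2f})")
```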
Development of a Hand Held Thromboelastograph
2015-01-01
documents will be referenced during the Entegrion PCM System design, verification and validation activities: EN 61010-1:2010 (Edition 3.0), Safety requirements for electrical equipment for measurement, control, and laboratory use – Part 1: General requirements; EN 61010-2-101:2002, Safety...; IPC-A-610E, Acceptability of Electronic Assemblies; IPC 7711/21B, Rework, Modification and Repair of Electronic Assemblies; IEC 62304:2006/AC:2008.
An Experiment on the Limits of Quantum Electro-dynamics
DOE R&D Accomplishments Database
Barber, W. C.; Richter, B.; Panofsky, W. K. H.; O'Neill, G. K.; Gittelman, B.
1959-06-01
The limitations of previously performed or suggested electrodynamic cutoff experiments are reviewed, and an electron-electron scattering experiment to be performed with storage rings to investigate further the limits of the validity of quantum electrodynamics is described. The foreseen experimental problems are discussed, and the results of the associated calculations are given. The parameters and status of the equipment are summarized. (D.C.W.)
Electronic structure and electron momentum densities of Ag2CrO4
NASA Astrophysics Data System (ADS)
Meena, Seema Kumari; Ahuja, B. L.
2018-05-01
We present the first-ever experimental electron momentum density of Ag2CrO4, measured using 661.65 keV γ-rays from a 20 Ci 137Cs source. To validate our experimental data, we have also deduced theoretical Compton profiles, energy bands and densities of states using the linear combination of atomic orbitals (LCAO) method in the framework of density functional theory. It is seen that DFT-LDA gives better agreement with the experimental data than the free-atom model. The energy bands and density of states are also discussed.
Partial Wave Dispersion Relations: Application to Electron-Atom Scattering
NASA Technical Reports Server (NTRS)
Temkin, A.; Drachman, Richard J.
1999-01-01
In this Letter we propose the use of partial wave dispersion relations (DR's) as the way of solving the long-standing problem of correctly incorporating exchange in a valid DR for electron-atom scattering. In particular a method is given for effectively calculating the contribution of the discontinuity and/or poles of the partial wave amplitude which occur in the negative E plane. The method is successfully tested in three cases: (i) the analytically solvable exponential potential, (ii) the Hartree potential, and (iii) the S-wave exchange approximation for electron-hydrogen scattering.
NASA Technical Reports Server (NTRS)
Temkin, A.
1984-01-01
Temkin (1982) has derived the ionization threshold law based on a Coulomb-dipole theory of the ionization process. The present investigation is concerned with a reexamination of several aspects of the Coulomb-dipole threshold law. Attention is given to the energy scale of the logarithmic denominator, the spin-asymmetry parameter, and an estimate of alpha and the energy range of validity of the threshold law, taking into account the result of the two-electron photodetachment experiment conducted by Donahue et al. (1984).
Experimental validation of a phenomenological model of the plasma contacting process
NASA Technical Reports Server (NTRS)
Williams, John D.; Wilbur, Paul J.; Monheiser, Jeff M.
1988-01-01
A preliminary model of the plasma coupling process is presented which describes the phenomena observed in ground-based experiments using a hollow cathode plasma contactor to collect electrons from a dilute ambient plasma under conditions where magnetic field effects can be neglected. The locations of the double-sheath region boundaries are estimated and correlated with experimental results. Ion production mechanisms in the plasma plume caused by discharge electrons from the contactor cathode and by electrons streaming into the plasma plume through the double-sheath from the ambient plasma are also discussed.
Modification and benchmarking of MCNP for low-energy tungsten spectra.
Mercier, J R; Kopp, D T; McDavid, W D; Dove, S B; Lancaster, J L; Tucker, D M
2000-12-01
The MCNP Monte Carlo radiation transport code was modified for diagnostic medical physics applications. In particular, the modified code was thoroughly benchmarked for the production of polychromatic tungsten x-ray spectra in the 30-150 kV range. Validating the modified code for coupled electron-photon transport with benchmark spectra was supplemented with independent electron-only and photon-only transport benchmarks. Major revisions to the code included the proper treatment of characteristic K x-ray production and scoring, new impact ionization cross sections, and new bremsstrahlung cross sections. Minor revisions included updated photon cross sections, electron-electron bremsstrahlung production, and K x-ray yield. The modified MCNP code is benchmarked to electron backscatter factors, x-ray spectra production, and primary and scatter photon transport.
Convergent Validity of the Arab Teens Lifestyle Study (ATLS) Physical Activity Questionnaire
Al-Hazzaa, Hazzaa M.; Al-Sobayel, Hana I.; Musaiger, Abdulrahman O.
2011-01-01
The Arab Teens Lifestyle Study (ATLS) is a multicenter project for assessing the lifestyle habits of Arab adolescents. This study reports on the convergent validity of the physical activity questionnaire used in ATLS against an electronic pedometer. Participants were 39 males and 36 females randomly selected from secondary schools, with a mean age of 16.1 ± 1.1 years. ATLS self-reported questionnaire was validated against the electronic pedometer for three consecutive weekdays. Mean steps counts were 6,866 ± 3,854 steps/day with no significant gender difference observed. Questionnaire results showed no significant gender differences in time spent on total or moderate-intensity activities. However, males spent significantly more time than females on vigorous-intensity activity. The correlation of steps counts with total time spent on all activities by the questionnaire was 0.369. Relationship of steps counts was higher with vigorous-intensity (r = 0.338) than with moderate-intensity activity (r = 0.265). Pedometer steps counts showed higher correlations with time spent on walking (r = 0.350) and jogging (r = 0.383) than with the time spent on other activities. Active participants, based on pedometer assessment, were also most active by the questionnaire. It appears that ATLS questionnaire is a valid instrument for assessing habitual physical activity among Arab adolescents. PMID:22016718
Springate, David A; Kontopantelis, Evangelos; Ashcroft, Darren M; Olier, Ivan; Parisi, Rosa; Chamapiwa, Edmore; Reeves, David
2014-01-01
Lists of clinical codes are the foundation for research undertaken using electronic medical records (EMRs). If clinical code lists are not available, reviewers are unable to determine the validity of research, full study replication is impossible, researchers are unable to make effective comparisons between studies, and the construction of new code lists is subject to much duplication of effort. Despite this, the publication of clinical codes is rarely if ever a requirement for obtaining grants, validating protocols, or publishing research. In a representative sample of 450 EMR primary research articles indexed on PubMed, we found that only 19 (5.1%) were accompanied by a full set of published clinical codes and 32 (8.6%) stated that code lists were available on request. To help address these problems, we have built an online repository where researchers using EMRs can upload and download lists of clinical codes. The repository will enable clinical researchers to better validate EMR studies, build on previous code lists and compare disease definitions across studies. It will also assist health informaticians in replicating database studies, tracking changes in disease definitions or clinical coding practice through time and sharing clinical code information across platforms and data sources as research objects.
A validation procedure for a LADAR system radiometric simulation model
NASA Astrophysics Data System (ADS)
Leishman, Brad; Budge, Scott; Pack, Robert
2007-04-01
The USU LadarSIM software package is a ladar system engineering tool that has recently been enhanced to include modeling of the radiometry of ladar beam footprints. This paper discusses our validation of the radiometric model and presents a practical approach to future validation work. In order to validate complicated and interrelated factors affecting radiometry, a systematic approach had to be developed. Data for known parameters were gathered first; unknown parameters of the system were then determined from simulation test scenarios. This was done so as to isolate as many unknown variables as possible and then build on the previously obtained results. First, the appropriate voltage threshold levels of the discrimination electronics were set by analyzing the number of false alarms seen in actual data sets. With this threshold set, the system noise was then adjusted to achieve the appropriate number of dropouts. Once a suitable noise level was found, the range errors of the simulated and actual data sets were compared and studied. Predicted errors in range measurements were analyzed using two methods: first by examining the range error of a surface with known reflectivity, and second by examining the range errors for specific detectors with known responsivities. This provided insight into the discrimination method and receiver electronics used in the actual system.
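The stepwise calibration described above (set the threshold from observed false alarms, tune the system noise to match dropouts, then compare range errors) can be sketched as below; the function names, the simulator hook and all data are hypothetical placeholders, not part of USU LadarSIM:

# Hypothetical sketch of the three-step radiometric validation procedure.
import numpy as np

def pick_threshold(noise_only_returns, observed_false_alarms):
    # Step 1: choose the discrimination threshold that reproduces the observed
    # false-alarm count in returns known to contain no targets.
    candidates = np.linspace(noise_only_returns.min(), noise_only_returns.max(), 1000)
    return min(candidates,
               key=lambda t: abs(int(np.sum(noise_only_returns > t)) - observed_false_alarms))

def pick_noise_scale(simulate_dropouts, threshold, observed_dropouts):
    # Step 2: with the threshold fixed, scale the simulated system noise until
    # the simulated dropout count matches the actual data.
    # simulate_dropouts(noise_scale, threshold) -> int is a placeholder hook.
    return min(np.linspace(0.5, 2.0, 31),
               key=lambda s: abs(simulate_dropouts(s, threshold) - observed_dropouts))

def range_error_stats(simulated_ranges, measured_ranges):
    # Step 3: compare range errors of the simulated and actual data sets.
    err = np.asarray(simulated_ranges) - np.asarray(measured_ranges)
    return float(err.mean()), float(err.std())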
McCoy, A B; Wright, A; Krousel-Wood, M; Thomas, E J; McCoy, J A; Sittig, D F
2015-01-01
Clinical knowledge bases of problem-medication pairs are necessary for many informatics solutions that improve patient safety, such as clinical summarization. However, developing these knowledge bases can be challenging. We sought to validate a previously developed crowdsourcing approach for generating a knowledge base of problem-medication pairs in a large, non-university health care system with a widely used, commercially available electronic health record. We first retrieved medications and problems entered in the electronic health record by clinicians during routine care during a six month study period. Following the previously published approach, we calculated the link frequency and link ratio for each pair then identified a threshold cutoff for estimated problem-medication pair appropriateness through clinician review; problem-medication pairs meeting the threshold were included in the resulting knowledge base. We selected 50 medications and their gold standard indications to compare the resulting knowledge base to the pilot knowledge base developed previously and determine its recall and precision. The resulting knowledge base contained 26,912 pairs, had a recall of 62.3% and a precision of 87.5%, and outperformed the pilot knowledge base containing 11,167 pairs from the previous study, which had a recall of 46.9% and a precision of 83.3%. We validated the crowdsourcing approach for generating a knowledge base of problem-medication pairs in a large non-university health care system with a widely used, commercially available electronic health record, indicating that the approach may be generalizable across healthcare settings and clinical systems. Further research is necessary to better evaluate the knowledge, to compare crowdsourcing with other approaches, and to evaluate if incorporating the knowledge into electronic health records improves patient outcomes.
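A minimal sketch of the pipeline described above. The definitions used here are assumptions for illustration (link frequency taken as the raw co-occurrence count of a problem-medication pair, link ratio as that count normalized by the medication's total link count); the previously published definitions and the threshold selection by clinician review may differ:

# Threshold-based knowledge-base construction and its evaluation (sketch).
from collections import Counter

def build_knowledge_base(observed_links, threshold):
    # observed_links: iterable of (problem, medication) pairs from routine care.
    link_freq = Counter(observed_links)                     # assumed "link frequency"
    med_totals = Counter(med for _, med in observed_links)
    def link_ratio(pair):
        return link_freq[pair] / med_totals[pair[1]]        # assumed "link ratio"
    return {pair for pair in link_freq if link_ratio(pair) >= threshold}

def recall_precision(kb_pairs, gold_pairs):
    true_pos = len(kb_pairs & gold_pairs)
    recall = true_pos / len(gold_pairs) if gold_pairs else 0.0
    precision = true_pos / len(kb_pairs) if kb_pairs else 0.0
    return recall, precision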
On the Use of Topside RO-Derived Electron Density for Model Validation
NASA Astrophysics Data System (ADS)
Shaikh, M. M.; Nava, B.; Haralambous, H.
2018-05-01
In this work, the standard Abel inversion has been exploited as a powerful observation tool, which may be helpful to model the topside of the ionosphere and therefore to validate ionospheric models. A thorough investigation of the behavior of radio occultation (RO)-derived topside electron density (Ne(h))-profiles has therefore been performed with the main purpose of understanding whether it is possible to predict the accuracy of a single RO-retrieved topside by comparing the peak density and height of the retrieved profile to the true values. As a first step, a simulation study based on the use of the NeQuick2 model has been performed to show that when the RO-derived electron density peak and height match the true peak values, the full topside Ne(h)-profile may be considered accurate. In order to validate this hypothesis with experimental data, electron density profiles obtained from four different incoherent scatter radars have therefore been considered together with co-located RO-derived Ne(h)-profiles. The evidence presented in this paper shows that in all cases examined, if the incoherent scatter radar and the corresponding co-located RO profile have matching peak parameter values, their topsides are in very good agreement. The simulation results presented in this work also highlighted the importance of considering the occultation plane azimuth while inverting RO data to obtain the Ne(h)-profile. In particular, they have indicated that there is a preferred range of azimuths of the occultation plane (80°-100°) for which the difference between the "true" and the RO-retrieved Ne(h)-profile in the topside is generally minimal.
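For reference, the standard Abel inversion referred to above relates the measured bending angle to the refractive index under the usual spherical-symmetry assumption, and the electron density then follows from the ionospheric refractivity relation; a sketch in LaTeX (the operational retrieval chain may differ in detail):

% Inverse Abel transform of the bending angle alpha(a), impact parameter a,
% followed by the ionospheric refractivity relation (N_e in m^-3, f in Hz).
\ln n(a_1) = \frac{1}{\pi}\int_{a_1}^{\infty}\frac{\alpha(a)}{\sqrt{a^{2}-a_1^{2}}}\,\mathrm{d}a,
\qquad
n(h)-1 \simeq -\frac{40.3\,N_e(h)}{f^{2}}
\;\Rightarrow\;
N_e(h) \simeq \frac{f^{2}}{40.3}\bigl[1-n(h)\bigr].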
Validation of Ground-based Optical Estimates of Auroral Electron Precipitation Energy Deposition
NASA Astrophysics Data System (ADS)
Hampton, D. L.; Grubbs, G. A., II; Conde, M.; Lynch, K. A.; Michell, R.; Zettergren, M. D.; Samara, M.; Ahrns, M. J.
2017-12-01
One of the major energy inputs into the high latitude ionosphere and mesosphere is auroral electron precipitation. Not only is the kinetic energy deposited, but the ensuing ionization in the E and F-region ionosphere modulates parallel and horizontal currents that can dissipate in the form of Joule heating. Global models to simulate these interactions typically use electron precipitation models that produce a poor representation of the spatial and temporal complexity of auroral activity as observed from the ground. This is largely due to these precipitation models being based on averages of multiple satellite overpasses separated by periods much longer than typical auroral feature durations. With the development of regional and continental observing networks (e.g. THEMIS ASI), ground-based optical observations producing quantitative estimates of energy deposition with temporal and spatial scales comparable to those known to be exhibited in auroral activity become a real possibility. Like empirical precipitation models based on satellite overpasses, such optics-based estimates are subject to assumptions and uncertainties, and therefore require validation. Three recent sounding rocket missions offer such an opportunity. The MICA (2012), GREECE (2014) and Isinglass (2017) missions involved detailed ground-based observations of auroral arcs simultaneously with extensive on-board instrumentation. These have afforded an opportunity to examine the results of three optical methods of determining auroral electron energy flux, namely 1) ratio of auroral emissions, 2) green line temperature vs. emission altitude, and 3) parametric estimates using white-light images. We present comparisons from all three methods for all three missions and summarize the temporal and spatial scales and coverage over which each is valid.
Paty, Jean; Elash, Celeste A; Turner-Bowker, Diane M
2017-02-01
Varicose veins are common and can impact patients' quality of life, but consensus regarding the evaluation of varicose vein symptoms is lacking and existing measures have limitations. This research aimed to develop and establish the content validity of a new electronic patient-reported outcome (PRO) measure, the VVSymQ ® instrument, to assess symptoms of superficial venous insufficiency (varicose veins) in clinical trials. The development of the VVSymQ ® instrument began with qualitative interviews with patients based on the symptom domain of the VEINES-QOL/Sym, an existing PRO instrument for chronic venous disorders of the leg. Three phases of qualitative research were conducted to examine the relevance and importance of the symptoms to patients with varicose veins, and the patients' ability to understand and use the VVSymQ ® instrument. The development included evaluating questions that had 1-week and 24-h recall periods, and paper and electronic versions of the new instrument. Five symptoms (heaviness, achiness, swelling, throbbing, and itching [HASTI™]) were consistently reported by patients across all sources of qualitative data. The final version of the VVSymQ ® instrument queries patients on the HASTI™ symptoms using a 24-h recall period and a 6-point duration-based response scale ranging from "None of the time" to "All of the time," and is administered daily via an electronic diary. Cognitive interviews demonstrated varicose vein patients' understanding of and their ability to use the final version of the VVSymQ ® instrument. Content validity was established for the VVSymQ ® instrument, which assesses the five HASTI™ symptoms of varicose veins daily via an electronic diary and has promise for use in research and practice.
Cave, Andrew J; Davey, Christina; Ahmadi, Elaheh; Drummond, Neil; Fuentes, Sonia; Kazemi-Bajestani, Seyyed Mohammad Reza; Sharpe, Heather; Taylor, Matt
2016-01-01
An accurate estimation of the prevalence of paediatric asthma in Alberta and elsewhere is hampered by uncertainty regarding disease definition and diagnosis. Electronic medical records (EMRs) provide a rich source of clinical data from primary-care practices that can be used in better understanding the occurrence of the disease. The Canadian Primary Care Sentinel Surveillance Network (CPCSSN) database includes cleaned data extracted from the EMRs of primary-care practitioners. The purpose of the study was to develop and validate a case definition of asthma in children 1–17 who consult family physicians, in order to provide primary-care estimates of childhood asthma in Alberta as accurately as possible. The validation involved the comparison of the application of a theoretical algorithm (to identify patients with asthma) to a physician review of records included in the CPCSSN database (to confirm an accurate diagnosis). The comparison yielded 87.4% sensitivity, 98.6% specificity and a positive and negative predictive value of 91.2% and 97.9%, respectively, in the age group 1–17 years. The algorithm was also run for ages 3–17 and 6–17 years, and was found to have comparable statistical values. Overall, the case definition and algorithm yielded strong sensitivity and specificity metrics and was found valid for use in research in CPCSSN primary-care practices. The use of the validated asthma algorithm may improve insight into the prevalence, diagnosis, and management of paediatric asthma in Alberta and Canada. PMID:27882997
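The reported validation statistics follow from the 2x2 comparison of the algorithm output against physician chart review; a minimal sketch (the counts are placeholders, not the CPCSSN figures):

# Sensitivity, specificity, PPV and NPV from a 2x2 validation table.
def validation_metrics(tp, fp, fn, tn):
    sensitivity = tp / (tp + fn)
    specificity = tn / (tn + fp)
    ppv = tp / (tp + fp)   # positive predictive value
    npv = tn / (tn + fn)   # negative predictive value
    return sensitivity, specificity, ppv, npv

# Placeholder counts for illustration only:
print(validation_metrics(tp=90, fp=10, fn=15, tn=885))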
NASA Astrophysics Data System (ADS)
Koda, Daniel S.; Bechstedt, Friedhelm; Marques, Marcelo; Teles, Lara K.
2018-04-01
Van der Waals (vdW) heterostructures are promising candidates for building blocks in novel electronic and optoelectronic devices with tailored properties, since their electronic action is dominated by the band alignments upon their contact. In this work, we analyze 10 vdW heterobilayers based on tin dichalcogenides by first-principles calculations. Structural studies show that all systems are stable, and that commensurability leads to smaller interlayer distances. Using hybrid functional calculations, we derive electronic properties and band alignments for all the heterosystems and isolated two-dimensional (2D) crystals. Natural band offsets are derived from calculated electron affinities and ionization energies of 11 freestanding 2D crystals. They are compared with band alignments in true heterojunctions, using a quantum mechanical criterion, and available experimental data. For the hBN/SnSe2 system, we show that hBN suffers an increase in band gap, while leaving almost unchanged the electronic properties of SnSe2. Similarly, MX2 (M = Mo, W; X = S, Se) over SnX2 preserve the natural discontinuities from each side of the heterobilayer. Significant charge transfer occurs in junctions with graphene, which becomes p-doped and forms an Ohmic contact with SnX2. Zirconium and hafnium dichalcogenides display stronger interlayer interactions, leading to larger shifts in band alignments with tin dichalcogenides. Significant orbital overlap is found, which creates zero conduction band offset systems. The validity of the Anderson electron affinity rule is discussed. Failures of this model are traced back to interlayer interaction, band hybridization, and quantum dipoles. The systematic work sheds light on interfacial engineering for future vdW electronic and optoelectronic devices.
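The natural band offsets and the Anderson electron affinity rule discussed above amount to aligning the vacuum-referenced band edges of the two layers: the conduction offset is the electron-affinity difference and the valence offset the ionization-energy difference. A minimal sketch with invented numbers (not the hybrid-functional values of this work):

# Anderson electron-affinity rule for a vdW heterobilayer (sketch).
# chi = electron affinity (eV), eg = band gap (eV); edges referenced to vacuum.
def band_alignment(chi_a, eg_a, chi_b, eg_b):
    cbm_a, vbm_a = -chi_a, -(chi_a + eg_a)   # CBM = -chi, VBM = -(chi + Eg) = -IP
    cbm_b, vbm_b = -chi_b, -(chi_b + eg_b)
    d_ec, d_ev = cbm_b - cbm_a, vbm_b - vbm_a
    if vbm_b >= cbm_a or vbm_a >= cbm_b:
        kind = "type III (broken gap)"
    elif d_ec * d_ev > 0:
        kind = "type II (staggered)"
    else:
        kind = "type I (straddling)"
    return d_ec, d_ev, kind

# Illustrative electron affinities and gaps only:
print(band_alignment(chi_a=4.2, eg_a=1.6, chi_b=4.9, eg_b=1.1))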
DOE Office of Scientific and Technical Information (OSTI.GOV)
Kang, Nam Lyong; Lee, Sang-Seok; Graduate School of Engineering, Tottori University, 4-101 Koyama-Minami, Tottori
2013-07-15
The projection-reduction method introduced by the present authors is known to give a validated theory for optical transitions in the systems of electrons interacting with phonons. In this work, using this method, we derive the linear and first order nonlinear optical conductivities for an electron-impurity system and examine whether the expressions faithfully satisfy the quantum mechanical philosophy, in the same way as for the electron-phonon systems. The result shows that the Fermi distribution function for electrons, energy denominators, and electron-impurity coupling factors are contained properly in organized manners along with absorption of photons for each electron transition process in the final expressions. Furthermore, the result is shown to be represented properly by schematic diagrams, as in the formulation of electron-phonon interaction. Therefore, in conclusion, we claim that this method can be applied in modeling optical transitions of electrons interacting with both impurities and phonons.
Zhang, Hai-Bo; Zhang, Xiang-Liang; Wang, Yong; Takaoka, Akio
2007-01-01
The possibility of utilizing high-energy electron tomography to characterize the micron-scale three dimensional (3D) structures of integrated circuits has been demonstrated experimentally. First, electron transmission through a tilted SiO2 film was measured with an ultrahigh-voltage electron microscope (ultra-HVEM) and analyzed from the point of view of elastic scattering of electrons, showing that linear attenuation of the logarithmic electron transmission still holds valid for effective specimen thicknesses up to 5 μm under 2 MV accelerating voltages. Electron tomography of a micron-order thick integrated circuit specimen including the Cu/via interconnect was then tried with 3 MeV electrons in the ultra-HVEM. Serial projection images of the specimen tilted at different angles over the range of ±90° were acquired, and 3D reconstruction was performed with the images by means of the IMOD software package. Consequently, the 3D structures of the Cu lines, via and void, were revealed by cross sections and surface rendering.
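The linear attenuation of the logarithmic transmission mentioned above is a Beer-Lambert-type relation, ln(I/I0) = -t_eff/lambda with t_eff = t/cos(tilt) for a tilted slab; a small sketch, where the mean free path value is an assumed placeholder rather than a measured quantity:

# Beer-Lambert-type attenuation of electron transmission through a tilted film.
import math

def transmission(thickness_um, tilt_deg, mfp_um=1.5):     # mfp_um is a placeholder
    t_eff = thickness_um / math.cos(math.radians(tilt_deg))
    return math.exp(-t_eff / mfp_um)

def thickness_from_transmission(T, tilt_deg, mfp_um=1.5):
    return -mfp_um * math.log(T) * math.cos(math.radians(tilt_deg))

print(transmission(2.0, 60.0))                 # tilting doubles the effective path at 60 deg
print(thickness_from_transmission(0.05, 0.0))  # recover thickness from a measured transmission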
Molybdenum electron impact width parameter measurement by laser-induced breakdown spectroscopy
NASA Astrophysics Data System (ADS)
Sternberg, E. M. A.; Rodrigues, N. A. S.; Amorim, J.
2016-01-01
In this work, we suggest a method for calculating the electron impact width parameter based on Stark broadening of emission lines of a laser-ablated plasma plume. First, the electron density and temperature must be evaluated by means of the Saha-Boltzmann plot method for neutral and ionized species of the plasma. The method was applied to a laser-ablated molybdenum plasma plume. For a molybdenum plasma electron temperature varying around 10,000 K and an electron density reaching values around 10^18 cm^-3, and considering that the total measured line broadening was due mainly to experimental and Stark broadening, the electron impact width parameter of the molybdenum emission lines was determined as (0.01 ± 0.02) nm. To validate the presented method, the laser-ablated aluminum plasma plume was analyzed and the results obtained were in agreement with values predicted in the literature.
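If the Stark (Lorentzian) contribution is isolated by subtracting the experimental broadening, the electron impact width parameter follows from the usual linear scaling of the Stark FWHM with electron density. The reference density of 10^16 cm^-3 and the simple width subtraction in this sketch are assumptions, not necessarily the paper's exact treatment:

# Electron impact width parameter w from Stark broadening (sketch), assuming
# FWHM_Stark ~ 2 * w * (n_e / 1e16 cm^-3) and a simple subtraction of the
# experimental (instrumental) width.
def impact_width_parameter(fwhm_total_nm, fwhm_instr_nm, n_e_cm3, n_ref=1e16):
    fwhm_stark = fwhm_total_nm - fwhm_instr_nm
    return fwhm_stark / (2.0 * n_e_cm3 / n_ref)

# Illustrative numbers only:
print(impact_width_parameter(fwhm_total_nm=0.45, fwhm_instr_nm=0.05, n_e_cm3=2e18))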
Fully relativistic form factor for Thomson scattering.
Palastro, J P; Ross, J S; Pollock, B; Divol, L; Froula, D H; Glenzer, S H
2010-03-01
We derive a fully relativistic form factor for Thomson scattering in unmagnetized plasmas valid to all orders in the normalized electron velocity β = v/c. The form factor is compared to a previously derived expression in which only the lowest-order corrections in the electron velocity β are included [J. Sheffield (Academic Press, New York, 1975)]. The β-expansion approach is sufficient for electrostatic waves with small phase velocities such as ion-acoustic waves, but for electron-plasma waves the phase velocities can be near luminal. At high phase velocities, the electron motion acquires relativistic corrections including effective electron mass, relative motion of the electrons and electromagnetic wave, and polarization rotation. These relativistic corrections alter the scattered emission of thermal plasma waves, which manifest as changes in both the peak power and width of the observed Thomson-scattered spectra.
40 CFR 1045.115 - What other requirements apply?
Code of Federal Regulations, 2014 CFR
2014-07-01
... area networks. Your broadcasting protocol must allow for valid measurements using the field-testing... information broadcast by an engine's on-board computers and electronic control modules. If you broadcast a...
NASA Astrophysics Data System (ADS)
Pietrella, M.; Pignalberi, A.; Pezzopane, M.; Pignatelli, A.; Azzarone, A.; Rizzi, R.
2018-05-01
Three-dimensional (3-D) electron density matrices, computed in the Mediterranean area by the IRI climatological model and IRIEup and ISP nowcasting models, during some intense and severe geomagnetic-ionospheric storms, were ingested by the ray tracing software tool IONORT, to synthesize quasi-vertical ionograms. IRIEup model was run in different operational modes: (1) assimilating validated autoscaled electron density profiles only from a limited area which, in our case, is the Mediterranean sector (IRIEup_re(V) mode); (2) assimilating electron density profiles from a larger region including several stations spread across Europe: (a) without taking care of validating the autoscaled data in the assimilation process (IRIEup(NV)); (b) validating carefully the autoscaled electron density profiles before their assimilation (IRIEup(V)). The comparative analysis was carried out comparing IRI, IRIEup_re(V), ISP, IRIEup(NV), and IRIEup(V) foF2 synthesized values, with corresponding foF2 measurements autoscaled by ARTIST, and then validated, at the truth sites of Roquetes (40.80°N, 0.50°E, Spain), San Vito (40.60°N, 17.80°E, Italy), Athens (38.00°N, 23.50°E, Greece), and Nicosia, (35.03°N, 33.16°E, Cyprus). The outcomes demonstrate that: (1) IRIEup_re(V), performs better than ISP in the western Mediterranean (around Roquetes); (2) ISP performs slightly better than IRIEup_re(V) in the central part of Mediterranean (around Athens and San Vito); (3) ISP performance is better than the IRIEup_re(V) one in the eastern Mediterranean (around Nicosia); (4) IRIEup(NV) performance is worse than the IRIEup(V) one; (5) in the central Mediterranean area, IRIEup(V) performance is better than the IRIEup_re(V) one, and it is practically the same for the western and eastern sectors. Concerning the overall performance, nowcasting models proved to be considerably more reliable than the climatological IRI model to represent the ionosphere behaviour during geomagnetic-ionospheric storm conditions; ISP and IRIEup(V) provided the best performance, but neither of them has clearly prevailed over the other one.
Burfeind, O; Bruins, M; Bos, A; Sannmann, I; Voigtsberger, R; Heuwieser, W
2014-07-01
The objective of this study was to estimate the diagnostic accuracy of an electronic nose device using vaginal discharge samples to diagnose acute puerperal metritis (APM) in dairy cows. Uterine fluid was sampled manually with a gloved hand and under sterile conditions for electronic nose device analysis (day in milk (DIM) 2, 5, and 10) and bacteriologic examination (DIM 5), respectively, and on additional days, if APM was diagnosed during the daily clinical examinations. A dataset containing samples from 70 cows was used to create a model and to validate the APM status predicted by this model, respectively. Half of the dataset (n = 35; 14 healthy and 21 metritic cows) was provided with information regarding the APM diagnosis and contained all three measurements (DIM 2, 5, and 10) for each cow and was used as a training set whereas the second half was blinded (n = 35; 14 healthy and 21 metritic cows) and contained only the samples collected on DIM 5 of each cow and was used to validate the created prediction model. A receiver operating characteristic curve was calculated using the prediction results of the validation test. The best observed sensitivity was 100% with specificity of 91.6% when using a threshold value of 0.3. The calculated P-value for the receiver operating characteristic curve was less than 0.01. Overall, Escherichia coli was isolated in eight of 28 (28.6%) and 22 of 42 (52.4%) samples collected from healthy and metritic cows, respectively. Trueperella pyogenes and Fusobacterium necrophorum were isolated in 14 and six of 28 (50.0% and 21.4%) and 17 and 16 of 42 (40.5% and 38.1%) samples collected from healthy and metritic cows, respectively. The prevalence of Escherichia coli and Trueperella pyogenes was similar in the samples obtained from metritic cows used for the training set and the validation test. The results are promising especially because of the objective nature of the measurements obtained by the electronic nose device. Copyright © 2014 Elsevier Inc. All rights reserved.
Energy regeneration model of self-consistent field of electron beams into electric power*
NASA Astrophysics Data System (ADS)
Kazmin, B. N.; Ryzhov, D. R.; Trifanov, I. V.; Snezhko, A. A.; Savelyeva, M. V.
2016-04-01
We consider physico-mathematical models of electric processes in electron beams, the conversion of beam parameters into electric power values, and their transformation into the users' electric power grid (the onboard spacecraft network). We perform computer simulations validating the high energy efficiency of the studied processes for application in electric power technology for power production, as well as in electric power plants and propulsion installations on board spacecraft.
Electronics: Mott Transistor: Fundamental Studies and Device Operation Mechanisms
2016-03-21
Report documentation form excerpt (Harvard University Office for Sponsored Programs). Reported publication: "Limited kinetics of electron doping in correlated oxides," Applied Physics Letters (07/2015); TOTAL: 1.
1995-12-01
Report excerpt: a figure (not reproduced here) shows the relationship of common electron acceptors with regard to their redox potential at pH = 7. Table-of-contents fragments list sections including: Model; Fuel-Spill Plume Profile; Hydrocarbon Biodegradation; Oxygen; Anaerobic Electron Acceptors; Redox Potential; Contaminants of...; Biodegradation Reactions (Oxygen Reactions, Nitrate, Manganese (IV), Iron (III), Sulfate); and Intrinsic Bioremediation Model.
A model of electron collecting plasma contactors
NASA Technical Reports Server (NTRS)
Davis, V. A.; Katz, I.; Mandell, M. J.; Parks, D. E.
1989-01-01
A model of plasma contactors is being developed, which can be used to describe electron collection in a laboratory test tank and in the space environment. To validate the model development, laboratory experiments are conducted in which the source plasma is separated from the background plasma by a double layer. Model calculations show that an increase in ionization rate with potential produces a steep rise in collected current with increasing potential.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Kaufman, H.R.
Bohm diffusion has been found to be approximately valid for many plasmas in strong magnetic fields. Assuming Bohm diffusion describes electron diffusion directly (H. R. Kaufman, AIAA J. 23, 78 (1985)), with an equal ion loss possible from the ambipolar field that is generated (F. F. Chen, Introduction to Plasma Physics (Plenum, New York, 1974), p. 169), an order-of-magnitude analysis can show why such electron diffusion should be expected.
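For reference, the Bohm diffusion coefficient assumed in such estimates is D_B = k_B*T_e/(16*e*B); a quick numerical sketch (the temperature and field values are illustrative, not taken from the cited works):

# Bohm diffusion coefficient D_B = k_B*T_e / (16*e*B).
# With T_e given in eV, k_B*T_e/e is numerically T_e[eV], so D_B = T_e[eV]/(16*B).
def bohm_diffusion(te_ev, b_tesla):
    return te_ev / (16.0 * b_tesla)        # m^2/s

print(bohm_diffusion(te_ev=5.0, b_tesla=0.1))   # example values -> 3.125 m^2/s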
Electron microscopy using the genetically encoded APEX2 tag in cultured mammalian cells
Martell, Jeffrey D; Deerinck, Thomas J; Lam, Stephanie S; Ellisman, Mark H; Ting, Alice Y
2018-01-01
Electron microscopy (EM) is the premiere technique for high-resolution imaging of cellular ultrastructure. Unambiguous identification of specific proteins or cellular compartments in electron micrographs, however, remains challenging because of difficulties in delivering electron-dense contrast agents to specific subcellular targets within intact cells. We recently reported enhanced ascorbate peroxidase 2 (APEX2) as a broadly applicable genetic tag that generates EM contrast on a specific protein or subcellular compartment of interest. This protocol provides guidelines for designing and validating APEX2 fusion constructs, along with detailed instructions for cell culture, transfection, fixation, heavy-metal staining, embedding in resin, and EM imaging. Although this protocol focuses on EM in cultured mammalian cells, APEX2 is applicable to many cell types and contexts, including intact tissues and organisms, and is useful for numerous applications beyond EM, including live-cell proteomic mapping. This protocol, which describes procedures for sample preparation from cell monolayers and cell pellets, can be completed in 10 d, including time for APEX2 fusion construct validation, cell growth, and solidification of embedding resins. Notably, the only additional steps required relative to a standard EM sample preparation are cell transfection and a 2- to 45-min staining period with 3,3′-diaminobenzidine (DAB) and hydrogen peroxide (H2O2). PMID:28796234
Berger, Cezar; Freitas, Renato; Malafaia, Osvaldo; Pinto, José Simão de Paula; Mocellin, Marcos; Macedo, Evaldo; Fagundes, Marina Serrato Coelho
2012-01-01
Summary Introduction: In the health field, computerization has become increasingly necessary in professional practice, since it facilitates data retrieval and assists in the development of research with greater scientific rigor. Objective: The present work aimed to develop, apply, and validate specific electronic protocols for patients referred for rhinoplasty. Methods: The prospective research had 3 stages: (1) preparation of theoretical data bases; (2) creation of a master protocol using the Integrated System of Electronic Protocols (SINPE©); and (3) elaboration, application, and validation of a specific protocol for the nose and sinuses regarding rhinoplasty. Results: After the preparation of the master protocol, which dealt with the entire field of otorhinolaryngology, we designed a specific protocol containing all matters related to the patient. In particular, the aesthetic and functional nasal complaints referred for surgical treatment (i.e., rhinoplasty) were organized into 6 main hierarchical categories: anamnesis, physical examination, complementary exams, diagnosis, treatment, and outcome. This protocol utilized these categories and their sub-items: purpose; access; surgical maneuvers on the nasal dorsum, tip, and base; clinical evolution after 3, 6, and 12 months; revisional surgery; and quantitative and qualitative evaluations. Conclusion: The developed specific electronic protocol is feasible and important for registering information from patients referred for rhinoplasty. PMID:25991979
Counterfeit Electronic Parts Controls in the Department of Defense Supply Chain
2015-06-01
Acronym list excerpt: ...Equipment Manufacturer; RFID, Radio Frequency Identification; SASC, Senate Armed Services Committee; SECDEF, Secretary of Defense; SWPaC, Space, Weight... Text excerpt: ...radio frequency identification (RFID). Upon the use of that printer, the RFID is checked to ensure it is a valid platform, and will not work if the... RFID is not confirmed to be valid (Richetto, 2011). Greater collaboration with industry would also help with one of the primary drivers of
NASA Astrophysics Data System (ADS)
Couturier, C.; Riffard, Q.; Sauzet, N.; Guillaudin, O.; Naraghi, F.; Santos, D.
2017-11-01
Low-pressure gaseous TPCs are well-suited detectors for correlating the directions of nuclear recoils with the galactic Dark Matter (DM) halo. Indeed, in addition to providing a measure of the energy deposition due to the elastic scattering of a DM particle on a nucleus in the target gas, they allow for the reconstruction of the track of the recoiling nucleus. In order to exclude background events originating from radioactive decays on the surfaces of the detector materials within the drift volume, efforts are ongoing to precisely localize the nuclear recoil track in the drift volume along the axis perpendicular to the cathode plane. We report here the implementation of the measurement of the signal induced on the cathode by the motion of the primary electrons toward the anode in a MIMAC chamber. As a validation, we performed an independent measurement of the drift velocity of the electrons in the considered gas mixture, correlating in time the cathode signal with the measured arrival times of the electrons on the anode.
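Correlating the cathode-induced signal with the electron arrival times on the anode gives the drift velocity as the drift length divided by the time difference; a trivial sketch with placeholder geometry and timing (not MIMAC values):

# Drift velocity from cathode-signal / anode-arrival timing (sketch).
def drift_velocity(drift_length_cm, t_cathode_us, t_anode_us):
    return drift_length_cm / (t_anode_us - t_cathode_us)   # cm/us

print(drift_velocity(drift_length_cm=25.0, t_cathode_us=0.0, t_anode_us=8.0))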
Direct drive: Simulations and results from the National Ignition Facility
DOE Office of Scientific and Technical Information (OSTI.GOV)
Radha, P. B.; Hohenberger, M.; Edgell, D. H.
Here, the direct-drive implosion physics is being investigated at the National Ignition Facility. The primary goal of the experiments is twofold: to validate modeling related to implosion velocity and to estimate the magnitude of hot-electron preheat. Implosion experiments indicate that the energetics is well-modeled when cross-beam energy transfer (CBET) is included in the simulation and an overall multiplier to the CBET gain factor is employed; time-resolved scattered light and scattered-light spectra display the correct trends. Trajectories from backlit images are well modeled, although those from measured self-emission images indicate increased shell thickness and reduced shell density relative to simulations. Sensitivity analyses indicate that the most likely cause for the density reduction is nonuniformity growth seeded by laser imprint and not laser-energy coupling. Hot-electron preheat is at tolerable levels in the ongoing experiments, although it is expected to increase after the mitigation of CBET. Future work will include continued model validation, imprint measurements, and mitigation of CBET and hot-electron preheat.
Mapping Perinatal Nursing Process Measurement Concepts to Standard Terminologies.
Ivory, Catherine H
2016-07-01
The use of standard terminologies is an essential component for using data to inform practice and conduct research; perinatal nursing data standardization is needed. This study explored whether 76 distinct process elements important for perinatal nursing were present in four American Nurses Association-recognized standard terminologies. The 76 process elements were taken from a valid paper-based perinatal nursing process measurement tool. Using terminology-supported browsers, the elements were manually mapped to the selected terminologies by the researcher. A five-member expert panel validated 100% of the mapping findings. The majority of the process elements (n = 63, 83%) were present in SNOMED-CT, 28% (n = 21) in LOINC, 34% (n = 26) in ICNP, and 15% (n = 11) in CCC. SNOMED-CT and LOINC are terminologies currently recommended for use to facilitate interoperability in the capture of assessment and problem data in certified electronic medical records. Study results suggest that SNOMED-CT and LOINC contain perinatal nursing process elements and are useful standard terminologies to support perinatal nursing practice in electronic health records. Terminology mapping is the first step toward incorporating traditional paper-based tools into electronic systems.
GYROKINETIC PARTICLE SIMULATION OF TURBULENT TRANSPORT IN BURNING PLASMAS
DOE Office of Scientific and Technical Information (OSTI.GOV)
Horton, Claude Wendell
2014-06-10
The SciDAC project at the IFS advanced the state of high performance computing for turbulent structures and turbulent transport. The team project with Prof. Zhihong Lin [PI] at the University of California, Irvine produced new understanding of turbulent electron transport. The simulations were performed at the Texas Advanced Computing Center (TACC) and the NERSC facility by Wendell Horton, Lee Leonard and the IFS graduate students working in that group. The research included a validation of the electron turbulent transport code using data from a steady-state university experiment at Columbia University, in which detailed probe measurements of the turbulence in steady state were used over a wide range of temperature gradients for comparison with the simulation data. These results were published in a joint paper with Texas graduate student Dr. Xiangrong Fu using the work in his PhD dissertation: X.R. Fu, W. Horton, Y. Xiao, Z. Lin, A.K. Sen and V. Sokolov, "Validation of electron temperature gradient turbulence in the Columbia Linear Machine," Phys. Plasmas 19, 032303 (2012).
Analytic solution of the Spencer-Lewis angular-spatial moments equations
DOE Office of Scientific and Technical Information (OSTI.GOV)
Filippone, W.L.
A closed-form solution for the angular-spatial moments of the Spencer-Lewis equation is presented that is valid for infinite homogeneous media. From the moments, the electron density distribution as a function of position and path length (energy) is reconstructed for several sample problems involving plane isotropic sources of electrons in aluminium. The results are in excellent agreement with those determined numerically using the streaming ray method. The primary use of the closed form solution will most likely be to generate accurate electron transport benchmark solutions. In principle, the electron density as a function of space, path length, and direction can be determined for planar sources of arbitrary angular distribution.
Tavares, Leoberto Costa; do Amaral, Antonia Tavares
2004-03-15
The carbonyl group stretching frequency in the infrared region was determined systematically for 4-substituted N-[(dimethylamine)methyl]benzamides (set A) and their hydrochlorides (set B), whose local anesthetic activity had been evaluated. Application of the Hammett equation to the carbonyl absorption frequencies, ν(C=O), using the electronic constants σ, σ(I), σ(R), I and R, leads to meaningful correlations. The nature and the contribution of substituent group electronic effects on the polarity of the carbonyl group were also analyzed. The use of ν(C=O) as an experimental electronic parameter for QSPR studies was validated.
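The Hammett-type treatment above amounts to a linear regression of the carbonyl stretching frequency on the substituent constant, nu(C=O) = rho*sigma + nu0; a minimal sketch with invented frequencies (not the measured values for sets A or B):

# Hammett-type correlation of carbonyl stretching frequency versus sigma (sketch).
import numpy as np

sigma = np.array([-0.27, -0.17, 0.00, 0.23, 0.45, 0.78])              # substituent constants
nu_co = np.array([1630.5, 1631.8, 1633.0, 1635.1, 1637.0, 1639.9])    # cm^-1, invented

rho, nu0 = np.polyfit(sigma, nu_co, 1)      # slope (rho) and intercept (nu0)
r = np.corrcoef(sigma, nu_co)[0, 1]
print(f"nu(C=O) = {rho:.2f}*sigma + {nu0:.1f} cm^-1, r = {r:.3f}")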
DOE Office of Scientific and Technical Information (OSTI.GOV)
Bernstein, A. M., E-mail: bernstein@mit.edu
Small angle electron scattering with intense electron beams opens up the possibility of performing almost real photon induced reactions with thin, polarized hydrogen and few body targets, allowing for the detection of low energy charged particles. This promises to be much more effective than conventional photon tagging techniques. For photo-pion reactions some fundamental new possibilities include: tests of charge symmetry in the N-N system by measurement of the neutron-neutron scattering length a_nn in the γD → π⁺nn reaction; tests of isospin breaking due to the mass difference of the up and down quarks; measurements with polarized targets are sensitive to πN phase shifts and will test the validity of the Fermi-Watson (final state interaction) theorem. All of these experiments will test the accuracy and energy region of validity of chiral effective theories.
Accelerating electron tomography reconstruction algorithm ICON with GPU.
Chen, Yu; Wang, Zihao; Zhang, Jingrong; Li, Lun; Wan, Xiaohua; Sun, Fei; Zhang, Fa
2017-01-01
Electron tomography (ET) plays an important role in studying in situ cell ultrastructure in three-dimensional space. Due to limited tilt angles, ET reconstruction always suffers from the "missing wedge" problem. With a validation procedure, iterative compressed-sensing optimized NUFFT reconstruction (ICON) demonstrates its power in the restoration of validated missing information for low SNR biological ET dataset. However, the huge computational demand has become a major problem for the application of ICON. In this work, we analyzed the framework of ICON and classified the operations of major steps of ICON reconstruction into three types. Accordingly, we designed parallel strategies and implemented them on graphics processing units (GPU) to generate a parallel program ICON-GPU. With high accuracy, ICON-GPU has a great acceleration compared to its CPU version, up to 83.7×, greatly relieving ICON's dependence on computing resource.
NASA Astrophysics Data System (ADS)
Syha, M.; Rheinheimer, W.; Loedermann, B.; Graff, A.; Trenkle, A.; Baeurer, M.; Weygand, D.; Ludwig, W.; Gumbsch, P.
The microstructural evolution of polycrystalline strontium titanate was investigated in three dimensions (3D) using X-ray diffraction contrast tomography (DCT) before and after ex-situ annealing at 1600°C. Post-annealing, the specimen was additionally subjected to phase contrast tomography (PCT) in order to finely resolve the porosities. The resulting microstructure reconstructions were studied with special emphasis on morphology and interface orientation during microstructure evolution. Subsequently, cross-sections of the specimen were studied using electron backscatter diffraction (EBSD). Corresponding cross-sections through the 3D reconstruction were identified and the quality of the reconstruction is validated with special emphasis on the spatial resolution at the grain boundaries, the size and location of pores contained in the material and the accuracy of the orientation determination.
Peissig, Peggy L; Rasmussen, Luke V; Berg, Richard L; Linneman, James G; McCarty, Catherine A; Waudby, Carol; Chen, Lin; Denny, Joshua C; Wilke, Russell A; Pathak, Jyotishman; Carrell, David; Kho, Abel N; Starren, Justin B
2012-01-01
There is increasing interest in using electronic health records (EHRs) to identify subjects for genomic association studies, due in part to the availability of large amounts of clinical data and the expected cost efficiencies of subject identification. We describe the construction and validation of an EHR-based algorithm to identify subjects with age-related cataracts. We used a multi-modal strategy consisting of structured database querying, natural language processing on free-text documents, and optical character recognition on scanned clinical images to identify cataract subjects and related cataract attributes. Extensive validation on 3657 subjects compared the multi-modal results to manual chart review. The algorithm was also implemented at participating electronic MEdical Records and GEnomics (eMERGE) institutions. An EHR-based cataract phenotyping algorithm was successfully developed and validated, resulting in positive predictive values (PPVs) >95%. The multi-modal approach increased the identification of cataract subject attributes by a factor of three compared to single-mode approaches while maintaining high PPV. Components of the cataract algorithm were successfully deployed at three other institutions with similar accuracy. A multi-modal strategy incorporating optical character recognition and natural language processing may increase the number of cases identified while maintaining similar PPVs. Such algorithms, however, require that the needed information be embedded within clinical documents. We have demonstrated that algorithms to identify and characterize cataracts can be developed utilizing data collected via the EHR. These algorithms provide a high level of accuracy even when implemented across multiple EHRs and institutional boundaries.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Oudini, N.; Sirse, N.; Ellingboe, A. R.
2015-07-15
This paper presents a critical assessment of the theory of the photo-detachment diagnostic method used to probe the negative ion density and electronegativity α = n−/ne. In this method, a laser pulse is used to photo-detach all negative ions located within the electropositive channel (laser spot region). The negative ion density is estimated based on the assumption that the increase of the current collected by an electrostatic probe biased positively with respect to the plasma results only from the creation of photo-detached electrons. In parallel, the background electron density and temperature are considered constant during this diagnostic. However, the numerical experiments performed here show that the background electron density and temperature increase due to the formation of an electrostatic potential barrier around the electropositive channel. The time scale of the potential barrier rise is about 2 ns, which is comparable to the time required to completely photo-detach the negative ions in the electropositive channel (∼3 ns). We find that neglecting the effect of the potential barrier on the background plasma leads to an erroneous determination of the negative ion density. Moreover, the background electron velocity distribution function within the electropositive channel is not Maxwellian. This is due to the acceleration of these electrons through the electrostatic potential barrier. In this work, the validity of the photo-detachment diagnostic assumptions is questioned and our results illustrate the weakness of these assumptions.
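The conventional estimate whose assumptions are questioned above takes the fractional rise of the collected electron current during the laser pulse as the electronegativity; a one-line sketch of that assumption (currents are placeholders):

# Conventional laser photodetachment estimate examined in the paper above:
# alpha = n_minus/n_e ~ delta_I/I, the fractional increase of the probe electron
# current during the pulse (probe biased positive to the plasma). Placeholder currents.
def electronegativity(i_background_mA, i_peak_mA):
    return (i_peak_mA - i_background_mA) / i_background_mA

print(electronegativity(i_background_mA=2.0, i_peak_mA=2.9))   # alpha ~ 0.45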
Glauber exchange amplitudes. [electron scattering from H atoms
NASA Technical Reports Server (NTRS)
Madan, R. N.
1975-01-01
The extrapolation method of Ochkur, valid for intermediate energies (about 50 eV), is applied to the exchange form of the Glauber amplitudes. In the case of elastic scattering of electrons from hydrogen atoms at 54.4 eV, the 'post' and 'prior' forms of the exchange amplitude are equivalent, whereas for the case of inelastic scattering there is a minute discrepancy between the two forms of the amplitude. The results are compared with the close-coupling calculation. The investigation is expected to be useful for optically forbidden exchange-allowed transitions due to electron impact at intermediate energies.
A unitary convolution approximation for the impact-parameter dependent electronic energy loss
NASA Astrophysics Data System (ADS)
Schiwietz, G.; Grande, P. L.
1999-06-01
In this work, we propose a simple method to calculate the impact-parameter dependence of the electronic energy loss of bare ions for all impact parameters. This perturbative convolution approximation (PCA) is based on first-order perturbation theory, and thus, it is only valid for fast particles with low projectile charges. Using Bloch's stopping-power result and a simple scaling, we get rid of the restriction to low charge states and derive the unitary convolution approximation (UCA). Results of the UCA are then compared with full quantum-mechanical coupled-channel calculations for the impact-parameter dependent electronic energy loss.
Electronic labelling in recycling of manufactured articles.
Olejnik, Lech; Krammer, Alfred
2002-12-01
The concept of a recycling system aiming at the recovery of resources from manufactured articles is proposed. The system integrates electronic labels for product identification and internet for global data exchange. A prototype for the recycling of electric motors has been developed, which implements a condition-based recycling decision system to automatically select the environmentally and economically appropriate recycling strategy, thereby opening a potential market for second-hand motors and creating a profitable recycling process itself. The project has been designed to evaluate the feasibility of electronic identification applied on a large number of motors and to validate the system in real field conditions.
Polarization of photons scattered by electrons in any spectral distribution
DOE Office of Scientific and Technical Information (OSTI.GOV)
Chang, Zhe; Lin, Hai-Nan; Jiang, Yunguo, E-mail: jiangyg@ihep.ac.cn
On the basis of quantum electrodynamics, we present a generic formalism of the polarization for beamed monochromatic photons scattered by electrons in any spectral distribution. The formulae reduce to the components of the Fano matrix when electrons are at rest. We mainly investigate the polarization in three scenarios, i.e., electrons at rest, isotropic electrons with a power-law spectrum, and thermal electrons. If the incident beam is polarized, the polarization is reduced significantly by isotropic electrons at large viewing angles; the degree of polarization caused by thermal electrons is about half of that caused by power-law electrons. If the incident beam is unpolarized, soft γ-rays can lead to about 15% polarization at viewing angles around π/4. For isotropic electrons, one remarkable feature is that the polarization as a function of the incident photon energy always peaks roughly at 1 MeV; this is valid for both the thermal and power-law cases. This feature can be used to distinguish the model of the inverse Compton scattering from that of the synchrotron radiation.
NASA Astrophysics Data System (ADS)
Herrera-Basurto, R.; Mercader-Trejo, F.; Muñoz-Madrigal, N.; Juárez-García, J. M.; Rodriguez-López, A.; Manzano-Ramírez, A.
2016-07-01
The main goal of method validation is to demonstrate that the method is suitable for its intended purpose. One advantage of analytical method validation is the level of confidence it provides in the measurement results reported for a specific objective. Elemental composition determination by wavelength dispersive spectrometer (WDS) microanalysis has been used over extremely wide areas, mainly in the field of materials science and in impurity determinations in geological, biological and food samples. However, little information is reported about the validation of the applied methods. Herein, results of the in-house method validation for elemental composition determination by WDS are shown. SRM 482, a binary Cu-Au alloy of different compositions, was used during the validation protocol, following the recommendations for method validation proposed by Eurachem. This paper can be taken as a reference for the evaluation of the validation parameters most frequently requested for accreditation under the requirements of the ISO/IEC 17025 standard: selectivity, limit of detection, linear interval, sensitivity, precision, trueness and uncertainty. A model for uncertainty estimation was proposed that includes systematic and random errors. In addition, parameters evaluated during the validation process were also considered as part of the uncertainty model.
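A combined uncertainty model of the kind described, with systematic and random contributions added in quadrature and expanded with a coverage factor, can be sketched as follows (the component list and values are illustrative, not the paper's budget):

# GUM-style combined and expanded uncertainty (sketch); components illustrative.
import math

def combined_uncertainty(components):
    # components: standard uncertainties in the same units (here, mass fraction %)
    return math.sqrt(sum(u * u for u in components))

u_random = 0.12          # e.g. repeatability of the WDS measurement
u_systematic = [0.08,    # e.g. SRM 482 certified-value contribution
                0.05]    # e.g. calibration / matrix-correction contribution

u_c = combined_uncertainty([u_random, *u_systematic])
U = 2.0 * u_c            # expanded uncertainty, coverage factor k = 2
print(f"u_c = {u_c:.3f} %, U(k=2) = {U:.3f} %")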
Development, validation and utilisation of food-frequency questionnaires - a review.
Cade, Janet; Thompson, Rachel; Burley, Victoria; Warm, Daniel
2002-08-01
The purpose of this review is to provide guidance on the development, validation and use of food-frequency questionnaires (FFQs) for different study designs. It does not include any recommendations about the most appropriate method for dietary assessment (e.g. food-frequency questionnaire versus weighed record). A comprehensive search of electronic databases was carried out for publications from 1980 to 1999. Findings from the review were then commented upon and added to by a group of international experts. Recommendations have been developed to aid in the design, validation and use of FFQs. Specific details of each of these areas are discussed in the text. FFQs are being used in a variety of ways and different study designs. There is no gold standard for directly assessing the validity of FFQs. Nevertheless, the outcome of this review should help those wishing to develop or adapt an FFQ to validate it for its intended use.
Steele, John C; Clark, Hadleigh J; Hong, Catherine H L; Jurge, Sabine; Muthukrishnan, Arvind; Kerr, A Ross; Wray, David; Prescott-Clements, Linda; Felix, David H; Sollecito, Thomas P
2015-08-01
To explore international consensus for the validation of clinical competencies for advanced training in Oral Medicine. An electronic survey of clinical competencies was designed. The survey was sent to and completed by identified international stakeholders during a 10-week period. To be validated, an individual competency had to achieve 90% or greater consensus to keep it in its current format. Stakeholders from 31 countries responded. High consensus agreement was achieved with 93 of 101 (92%) competencies exceeding the benchmark for agreement. Only 8 warranted further attention and were reviewed by a focus group. No additional competencies were suggested. This is the first international validated study of clinical competencies for advanced training in Oral Medicine. These validated clinical competencies could provide a model for countries developing an advanced training curriculum for Oral Medicine and also inform review of existing curricula. Copyright © 2015 Elsevier Inc. All rights reserved.
Radial Profiles of the Plasma Electron Characteristics in a 30 kW Arc Jet
NASA Technical Reports Server (NTRS)
Codron, Douglas A.; Nawaz, Anuscheh
2013-01-01
The present effort aims to strengthen modeling work conducted at the NASA Ames Research Center by measuring the critical plasma electron characteristics within and slightly outside of an arc jet plasma column. These characteristics are intended to give physical insights while assisting in the formulation of boundary conditions to validate full scale simulations. Single and triple Langmuir probes have been used to achieve estimates of the electron temperature (T_e), electron number density (n_e) and plasma potential (outside of the plasma column) as the probing location is varied radially from the flow centerline. Both the electron temperature and electron number density measurements show a large dependence on radial distance from the plasma column centerline, with T_e approx. 3-12 eV and n_e approx. 10^12-10^14/cu cm.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Zhang, Yu, E-mail: zhy@yangtze.hku.hk; Chen, GuanHua, E-mail: ghc@everest.hku.hk; Yam, ChiYung
2015-04-28
A time-dependent inelastic electron transport theory for strong electron-phonon interaction is established via the equations of motion method combined with the small polaron transformation. In this work, the dissipation via electron-phonon coupling is taken into account in the strong coupling regime, which validates the small polaron transformation. The corresponding equations of motion are developed, which are used to study the quantum interference effect and phonon-induced decoherence dynamics in molecular junctions. Numerical studies clearly show the quantum interference effect of the transport electrons through two quasi-degenerate states with different couplings to the leads. We also found that the quantum interference can be suppressed by the electron-phonon interaction, where the phase coherence is destroyed by phonon scattering. This indicates the importance of electron-phonon interaction in systems with prominent quantum interference effects.
Jet production in the CoLoRFulNNLO method: Event shapes in electron-positron collisions
NASA Astrophysics Data System (ADS)
Del Duca, Vittorio; Duhr, Claude; Kardos, Adam; Somogyi, Gábor; Szőr, Zoltán; Trócsányi, Zoltán; Tulipánt, Zoltán
2016-10-01
We present the CoLoRFulNNLO method to compute higher order radiative corrections to jet cross sections in perturbative QCD. We apply our method to the computation of event shape observables in electron-positron collisions at NNLO accuracy and validate our code by comparing our predictions to previous results in the literature. We also calculate for the first time jet cone energy fraction at NNLO.
DC and small-signal physical models for the AlGaAs/GaAs high electron mobility transistor
NASA Technical Reports Server (NTRS)
Sarker, J. C.; Purviance, J. E.
1991-01-01
Analytical and numerical models are developed for the microwave small-signal performance, such as the transconductance, gate-to-source capacitance, current-gain cut-off frequency and optimum cut-off frequency, of the AlGaAs/GaAs High Electron Mobility Transistor (HEMT) in both the normal and compressed transconductance regions. The validated I-V characteristics and the small-signal performance of four HEMTs are presented.
Elaboration and Validation of the Medication Prescription Safety Checklist 1
Pires, Aline de Oliveira Meireles; Ferreira, Maria Beatriz Guimarães; do Nascimento, Kleiton Gonçalves; Felix, Márcia Marques dos Santos; Pires, Patrícia da Silva; Barbosa, Maria Helena
2017-01-01
Objective: To elaborate and validate a checklist to identify compliance with the recommendations for the structure of medication prescriptions, based on the Protocol of the Ministry of Health and the Brazilian Health Surveillance Agency. Method: methodological research, conducted through the validation and reliability analysis process, using a sample of 27 electronic prescriptions. Results: the analyses confirmed the content validity and reliability of the tool. The content validity, obtained by expert assessment, was considered satisfactory as it covered items that represent the compliance with the recommendations regarding the structure of the medication prescriptions. The reliability, assessed through interrater agreement, was excellent (ICC=1.00) and showed perfect agreement (K=1.00). Conclusion: the Medication Prescription Safety Checklist proved to be a valid and reliable tool for the group studied. We hope that this study can contribute to the prevention of adverse events, as well as to the improvement of care quality and safety in medication use. PMID:28793128
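As a reminder of how the quoted agreement statistic is computed, the following Python sketch evaluates Cohen's kappa for two raters scoring a binary checklist item; the ratings are invented for the example (the study itself reported perfect agreement, K = 1.00).

from collections import Counter

# Hypothetical ratings by two evaluators of the same prescriptions
rater_a = ["yes", "yes", "no", "yes", "no", "yes", "yes", "no"]
rater_b = ["yes", "yes", "no", "yes", "no", "yes", "yes", "no"]

n = len(rater_a)
observed = sum(a == b for a, b in zip(rater_a, rater_b)) / n      # observed agreement
ca, cb = Counter(rater_a), Counter(rater_b)
expected = sum(ca[c] * cb[c] for c in set(ca) | set(cb)) / n**2   # chance agreement

kappa = (observed - expected) / (1 - expected)
print(f"observed = {observed:.2f}, expected = {expected:.2f}, kappa = {kappa:.2f}")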
1987-08-01
STRATEGIC DEFENSE INITIATIVE ORGANIZATION, WASHINGTON DC...facilities where Demonstration/Validation activities are planned. Ten areas of environmental consideration are addressed: (1) air quality; (2) water...air quality, water quality, and hazardous waste (63). 2.2 ELECTRONIC SYSTEMS DIVISION: The Electronic Systems Division administrative offices are located
1997-09-01
Illinois Institute of Technology Research Institute (IITRI) calibrated seven parametric models including SPQR/20, the forerunner of CHECKPOINT. The...a semicolon); thus, SPQR/20 was calibrated using SLOC sizing data (IITRI, 1989: 3-4). The results showed only slight overall improvements in accuracy...even when validating the calibrated models with the same data sets. The IITRI study demonstrated SPQR/20 to be one of two models that were most
2012-08-01
The first phase consisted of Shared Services, Threat Detection and Reporting, and the Remote Weapon Station (RWS) build up and validation. The...Awareness build up and validation. The first phase consisted of the development of the shared services or core services that are required by many...C4ISR/EW systems. The shared services include: time synchronization, position, direction of travel, and orientation. Time synchronization is
Validation of the openEHR archetype library by using OWL reasoning.
Menárguez-Tortosa, Marcos; Fernández-Breis, Jesualdo Tomás
2011-01-01
Electronic Health Record architectures based on the dual model architecture use archetypes for representing clinical knowledge. Therefore, ensuring their correctness and consistency is a fundamental research goal. In this work, we explore how an approach based on OWL technologies can be used for such purpose. This method has been applied to the openEHR archetype repository, which is the largest available one nowadays. The results of this validation are also reported in this study.
Maier, Jürgen; Hampe, J Felix; Jahn, Nico
2016-01-01
Real-time response (RTR) measurement is an important technique for analyzing human processing of electronic media stimuli. Although it has been demonstrated that RTR data are reliable and internally valid, some argue that they lack external validity. The reason for this is that RTR measurement is restricted to a laboratory environment due to its technical requirements. This paper introduces a smartphone app that 1) captures real-time responses using the dial technique and 2) provides a solution for one of the most important problems in RTR measurement, the (automatic) synchronization of RTR data. In addition, it explores the reliability and validity of mobile RTR measurement by comparing the real-time reactions of two samples of young and well-educated voters to the 2013 German televised debate. Whereas the first sample participated in a classical laboratory study, the second sample was equipped with our mobile RTR system and watched the debate at home. Results indicate that the mobile RTR system yields similar results to the lab-based RTR measurement, providing evidence that laboratory studies using RTR are externally valid. In particular, the argument that the artificial reception situation creates artificial results has to be questioned. In addition, we conclude that RTR measurement outside the lab is possible. Hence, mobile RTR opens the door for large-scale studies to better understand the processing and impact of electronic media content.
Theodoros, Deborah G.; Russell, Trevor G.
2015-01-01
Background: Usability is an emerging domain of outcomes measurement in assistive technology provision. Currently, no questionnaires exist to test the usability of mobile shower commodes (MSCs) used by adults with spinal cord injury (SCI). Objective: To describe the development, construction, and initial content validation of an electronic questionnaire to test mobile shower commode usability for this population. Methods: The questionnaire was constructed using a mixed-methods approach in 5 phases: determining user preferences for the questionnaire’s format, developing an item bank of usability indicators from the literature and judgement of experts, constructing a preliminary questionnaire, assessing content validity with a panel of experts, and constructing the final questionnaire. Results: The electronic Mobile Shower Commode Assessment Tool Version 1.0 (eMAST 1.0) questionnaire tests MSC features and performance during activities identified using a mixed-methods approach and in consultation with users. It confirms that usability is complex and multidimensional. The final questionnaire contains 25 questions in 3 sections. The eMAST 1.0 demonstrates excellent content validity as determined by a small sample of expert clinicians. Conclusion: The eMAST 1.0 tests usability of MSCs from the perspective of adults with SCI and may be used to solicit feedback during MSC design, assessment, prescription, and ongoing use. Further studies assessing the eMAST’s psychometric properties, including studies with users of MSCs, are needed. PMID:25762862
Xiao, Lan; Lv, Nan; Rosas, Lisa G; Au, David; Ma, Jun
2017-02-01
To validate clinic weights in electronic health records against researcher-measured weights for outcome assessment in weight loss trials. Clinic and researcher-measured weights from a published trial (BE WELL) were compared using Lin's concordance correlation coefficient, Bland and Altman's limits of agreement, and polynomial regression model. Changes in clinic and researcher-measured weights in BE WELL and another trial, E-LITE, were analyzed using growth curve modeling. Among BE WELL (n = 330) and E-LITE (n = 241) participants, 96% and 90% had clinic weights (mean [SD] of 5.8 [6.1] and 3.7 [3.9] records) over 12 and 15 months of follow-up, respectively. The concordance correlation coefficient was 0.99, and limits of agreement plots showed no pattern between or within treatment groups, suggesting overall good agreement between researcher-measured and nearest-in-time clinic weights up to 3 months. The 95% confidence intervals for predicted percent differences fell within ±3% for clinic weights within 3 months of the researcher-measured weights. Furthermore, the growth curve slopes for clinic and researcher-measured weights by treatment group did not differ significantly, suggesting similar inferences about treatment effects over time, in both trials. Compared with researcher-measured weights, close-in-time clinic weights showed high agreement and inference validity. Clinic weights could be a valid pragmatic outcome measure in weight loss studies. © 2017 The Obesity Society.
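The two agreement statistics named above can be reproduced in a few lines; the Python sketch below computes Lin's concordance correlation coefficient and Bland-Altman limits of agreement for a handful of invented paired weights (not trial data).

import numpy as np

clinic     = np.array([92.1, 85.4, 77.8, 101.2, 68.9, 88.3])   # kg, illustrative
researcher = np.array([91.8, 85.9, 77.5, 100.6, 69.2, 88.0])   # kg, illustrative

# Lin's concordance correlation coefficient
mx, my = clinic.mean(), researcher.mean()
sxy = ((clinic - mx) * (researcher - my)).mean()
ccc = 2 * sxy / (clinic.var() + researcher.var() + (mx - my) ** 2)

# Bland-Altman bias and 95% limits of agreement
diff = clinic - researcher
bias, sd = diff.mean(), diff.std()
loa = (bias - 1.96 * sd, bias + 1.96 * sd)

print(f"CCC = {ccc:.3f}, bias = {bias:+.2f} kg, LoA = ({loa[0]:+.2f}, {loa[1]:+.2f}) kg")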
Cohen-Stavi, Chandra; Leventer-Roberts, Maya; Balicer, Ran D
2017-01-01
Objective To directly compare the performance and externally validate the three most studied prediction tools for osteoporotic fractures—QFracture, FRAX, and Garvan—using data from electronic health records. Design Retrospective cohort study. Setting Payer provider healthcare organisation in Israel. Participants 1 054 815 members aged 50 to 90 years for comparison between tools and cohorts of different age ranges, corresponding to those in each tools’ development study, for tool specific external validation. Main outcome measure First diagnosis of a major osteoporotic fracture (for QFracture and FRAX tools) and hip fractures (for all three tools) recorded in electronic health records from 2010 to 2014. Observed fracture rates were compared to probabilities predicted retrospectively as of 2010. Results The observed five year hip fracture rate was 2.7% and the rate for major osteoporotic fractures was 7.7%. The areas under the receiver operating curve (AUC) for hip fracture prediction were 82.7% for QFracture, 81.5% for FRAX, and 77.8% for Garvan. For major osteoporotic fractures, AUCs were 71.2% for QFracture and 71.4% for FRAX. All the tools underestimated the fracture risk, but the average observed to predicted ratios and the calibration slopes of FRAX were closest to 1. Tool specific validation analyses yielded hip fracture prediction AUCs of 88.0% for QFracture (among those aged 30-100 years), 81.5% for FRAX (50-90 years), and 71.2% for Garvan (60-95 years). Conclusions Both QFracture and FRAX had high discriminatory power for hip fracture prediction, with QFracture performing slightly better. This performance gap was more pronounced in previous studies, likely because of broader age inclusion criteria for QFracture validations. The simpler FRAX performed almost as well as QFracture for hip fracture prediction, and may have advantages if some of the input data required for QFracture are not available. However, both tools require calibration before implementation. PMID:28104610
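Two of the performance measures used above are easy to illustrate in code. The Python sketch below computes a rank-based (Mann-Whitney) AUC and the observed-to-predicted event ratio from invented predicted risks and outcomes; it is a sketch of the general measures, not of the exact procedures used in the study.

import numpy as np

pred = np.array([0.02, 0.10, 0.05, 0.30, 0.08, 0.22, 0.01, 0.15])  # predicted 5-year risk (illustrative)
obs  = np.array([0,    1,    0,    1,    0,    1,    0,    0])     # fracture observed (illustrative)

def auc(y_true, y_score):
    pos, neg = y_score[y_true == 1], y_score[y_true == 0]
    # probability that a randomly chosen case is ranked above a randomly chosen non-case
    greater = (pos[:, None] > neg[None, :]).sum()
    ties = (pos[:, None] == neg[None, :]).sum()
    return (greater + 0.5 * ties) / (len(pos) * len(neg))

print(f"AUC = {auc(obs, pred):.3f}")
print(f"observed/predicted ratio = {obs.mean() / pred.mean():.2f}")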
Kern, David M; Davis, Jill; Williams, Setareh A; Tunceli, Ozgur; Wu, Bingcao; Hollis, Sally; Strange, Charlie; Trudo, Frank
2015-01-01
Objective To estimate the accuracy of claims-based pneumonia diagnoses in COPD patients using clinical information in medical records as the reference standard. Methods Selecting from a repository containing members’ data from 14 regional United States health plans, this validation study identified pneumonia diagnoses within a group of patients initiating treatment for COPD between March 1, 2009 and March 31, 2012. Patients with ≥1 claim for pneumonia (International Classification of Diseases Version 9-CM code 480.xx–486.xx) were identified during the 12 months following treatment initiation. A subset of 800 patients was randomly selected to abstract medical record data (paper based and electronic) for a target sample of 400 patients, to estimate validity within 5% margin of error. Positive predictive value (PPV) was calculated for the claims diagnosis of pneumonia relative to the reference standard, defined as a documented diagnosis in the medical record. Results A total of 388 records were reviewed; 311 included a documented pneumonia diagnosis, indicating 80.2% (95% confidence interval [CI]: 75.8% to 84.0%) of claims-identified pneumonia diagnoses were validated by the medical charts. Claims-based diagnoses in inpatient or emergency departments (n=185) had greater PPV versus outpatient settings (n=203), 87.6% (95% CI: 81.9%–92.0%) versus 73.4% (95% CI: 66.8%–79.3%), respectively. Claims-diagnoses verified with paper-based charts had similar PPV as the overall study sample, 80.2% (95% CI: 71.1%–87.5%), and higher PPV than those linked to electronic medical records, 73.3% (95% CI: 65.5%–80.2%). Combined paper-based and electronic records had a higher PPV, 87.6% (95% CI: 80.9%–92.6%). Conclusion Administrative claims data indicating a diagnosis of pneumonia in COPD patients are supported by medical records. The accuracy of a medical record diagnosis of pneumonia remains unknown. With increased use of claims data in medical research, COPD researchers can study pneumonia with confidence that claims data are a valid tool when studying the safety of COPD therapies that could potentially lead to increased pneumonia susceptibility or severity. PMID:26229461
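The headline validation statistic is a positive predictive value with a 95% confidence interval. The Python sketch below reproduces it from the reported counts (311 confirmed of 388 reviewed); the Wilson score interval is an assumption for illustration, since the paper does not state which interval formula was used.

import math

confirmed, reviewed = 311, 388
ppv = confirmed / reviewed

z = 1.96  # 95% confidence
centre = (ppv + z**2 / (2 * reviewed)) / (1 + z**2 / reviewed)
half = (z / (1 + z**2 / reviewed)) * math.sqrt(ppv * (1 - ppv) / reviewed + z**2 / (4 * reviewed**2))

print(f"PPV = {ppv:.1%}, 95% CI = ({centre - half:.1%}, {centre + half:.1%})")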
Zachariah, Marianne; Seidling, Hanna M; Neri, Pamela M; Cresswell, Kathrin M; Duke, Jon; Bloomrosen, Meryl; Volk, Lynn A; Bates, David W
2011-01-01
Background Medication-related decision support can reduce the frequency of preventable adverse drug events. However, the design of current medication alerts often results in alert fatigue and high over-ride rates, thus reducing any potential benefits. Methods The authors previously reviewed human-factors principles for relevance to medication-related decision support alerts. In this study, instrument items were developed for assessing the appropriate implementation of these human-factors principles in drug–drug interaction (DDI) alerts. User feedback regarding nine electronic medical records was considered during the development process. Content validity, construct validity through correlation analysis, and inter-rater reliability were assessed. Results The final version of the instrument included 26 items associated with nine human-factors principles. Content validation on three systems resulted in the addition of one principle (Corrective Actions) to the instrument and the elimination of eight items. Additionally, the wording of eight items was altered. Correlation analysis suggests a direct relationship between system age and performance of DDI alerts (p=0.0016). Inter-rater reliability indicated substantial agreement between raters (κ=0.764). Conclusion The authors developed and gathered preliminary evidence for the validity of an instrument that measures the appropriate use of human-factors principles in the design and display of DDI alerts. Designers of DDI alerts may use the instrument to improve usability and increase user acceptance of medication alerts, and organizations selecting an electronic medical record may find the instrument helpful in meeting their clinicians' usability needs. PMID:21946241
DOE Office of Scientific and Technical Information (OSTI.GOV)
Kasten, C. P., E-mail: ckasten@alum.mit.edu; White, A. E.; Irby, J. H.
2014-04-15
Accurately predicting the turbulent transport properties of magnetically confined plasmas is a major challenge of fusion energy research. Validation of transport models is typically done by applying so-called “synthetic diagnostics” to the output of nonlinear gyrokinetic simulations, and the results are compared to experimental data. As part of the validation process, comparing two independent turbulence measurements to each other provides the opportunity to test the synthetic diagnostics themselves; a step which is rarely possible due to the limited availability of redundant fluctuation measurements on magnetic confinement experiments. At Alcator C-Mod, phase-contrast imaging (PCI) is a commonly used turbulence diagnostic. PCI measures line-integrated electron density fluctuations with high sensitivity and wavenumber resolution (1.6 cm^-1 ≲ |k_R| ≲ 11 cm^-1). A new fast two-color interferometry (FTCI) diagnostic on the Alcator C-Mod tokamak measures long-wavelength (|k_R| ≲ 3.0 cm^-1) line-integrated electron density fluctuations. Measurements of coherent and broadband fluctuations made by PCI and FTCI are compared here for the first time. Good quantitative agreement is found between the two measurements. This provides experimental validation of the low-wavenumber region of the PCI calibration, and also helps validate the low-wavenumber portions of the synthetic PCI diagnostic that has been used in gyrokinetic model validation work in the past. We discuss possibilities to upgrade FTCI, so that a similar comparison could be done at higher wavenumbers in the future.
77 FR 22707 - Electronic Reporting Under the Toxic Substances Control Act
Federal Register 2010, 2011, 2012, 2013, 2014
2012-04-17
... completes metadata information, the web-based tool validates the submission by performing a basic error... uploading PDF attachments or other file types, such as XML, and completing metadata information would be...
WISESight : a multispectral smart video-track intrusion monitor.
DOT National Transportation Integrated Search
2015-05-01
International Electronic Machines Corporation (IEM) developed, tested, and validated a unique smart video-based intrusion monitoring system for use at highway-rail grade crossings. The system used both thermal infrared (IR) and visible/ne...
Validation of COSMIC radio occultation electron density profiles by incoherent scatter radar data
NASA Astrophysics Data System (ADS)
Cherniak, Iurii; Zakharenkova, Irina
COSMIC/FORMOSAT-3 is a joint US/Taiwan radio occultation (RO) mission consisting of six identical micro-satellites. Each microsatellite carries a GPS Occultation Experiment payload to perform ionospheric RO measurements. FS3/COSMIC data can make a positive impact on global ionosphere studies by providing essential information about the height distribution of electron density. To use RO electron density profiles correctly for geophysical analysis, modeling and other applications, these data must be validated against electron density distributions obtained by other measurement techniques, such as proven ground-based facilities: ionosondes and incoherent scatter (IS) radars. Since ionosondes provide no direct information on the profile above the electron density maximum, and the topside ionosonde profile is obtained by fitting a model to the peak electron density value, COSMIC RO measurements can make an important contribution to the investigation of the topside part of the ionosphere. IS radars provide information about the whole electron density profile, so the agreement of the topside parts of two independent measurements can be estimated. To validate the reliability of COSMIC data we used the ionospheric electron density profiles derived from the IS radar located near Kharkiv, Ukraine (geographic coordinates: 49.6N, 36.3E; geomagnetic coordinates: 45.7N, 117.8E). The Kharkiv radar is the sole incoherent scatter facility at middle latitudes of the European region. It operates with a 100-m zenith parabolic antenna at 158 MHz with a peak transmitted power of 2.0 MW and is able to determine the height-temporal distribution of ionospheric parameters in the height range of 70-1500 km. In ionospheric investigation by the incoherent scatter method, the directly measured quantity is the power spectrum (or autocorrelation function) of the scattered signal; with a rather complex signal-processing procedure it is possible to estimate most of the ionospheric parameters: the density and kinetic temperature of electrons and main ions, the plasma drift velocity and others. The comparison reveals that COSMIC RO profiles are usually in rather good agreement with ISR profiles, both in the F2 layer peak electron density (NmF2) and in the form of the profiles. The agreement is better when the projection of the ray path tangent points is closer to the ISR location. It is necessary to note that retrieved electron density profiles should not be interpreted as actual vertical profiles: the geographical locations of the ray path tangent points at the top and at the bottom of a profile may differ by several hundred kilometers, so the data are spatially smeared and the RO technique represents an image of vertical and horizontal ionospheric structure. That is why the comparison with ground-based data is rather relative in character. We derived quantitative parameters to characterize the differences between the compared profiles: the peak height difference and the relative peak density difference. Most of the compared profiles agree within error limits, depending on the accuracy of the occultation- and radar-derived profiles.
COSMIC measurements can be used efficiently to study the topside part of the ionospheric electron density distribution. Fully validating the reliability of COSMIC ionospheric observations will require extensive analysis and statistical generalization of the huge data array (the total number of ionospheric occultations to date exceeds 2,300,000), but the technique is very promising for retrieving accurate ionospheric electron density profiles, cross-checked against ground-based measurements, on a global scale. We acknowledge Taiwan's National Space Organization (NSPO) and the University Corporation for Atmospheric Research (UCAR) for providing the COSMIC data.
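The two profile-comparison metrics mentioned above (peak height difference and relative peak density difference) can be illustrated with synthetic profiles; the Python sketch below uses Chapman-like layers purely as stand-ins for an RO and an ISR profile.

import numpy as np

h = np.arange(100.0, 1000.0, 10.0)                       # height grid, km

def chapman(h, nmax, hmax, scale):
    z = (h - hmax) / scale
    return nmax * np.exp(0.5 * (1.0 - z - np.exp(-z)))

ro  = chapman(h, nmax=5.0e11, hmax=300.0, scale=60.0)    # synthetic RO profile, m^-3
isr = chapman(h, nmax=5.4e11, hmax=290.0, scale=55.0)    # synthetic ISR profile, m^-3

dh = h[np.argmax(ro)] - h[np.argmax(isr)]                # peak height difference, km
dn = (ro.max() - isr.max()) / isr.max()                  # relative NmF2 difference

print(f"peak height difference = {dh:+.0f} km, relative NmF2 difference = {dn:+.1%}")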
Ahmadi, Maryam; Ghazisaeidi, Marjan; Bashiri, Azadeh
2015-03-18
For a better design of the electronic health record system in Iran, health information systems must be integrated on the basis of a common language so that information can be interpreted and exchanged with that system. This study provides a conceptual model of a radiology reporting system using the Unified Modeling Language. The proposed model can solve the problem of integrating this information system with the electronic health record system; by using the model and designing the system as services, it can readily connect to the electronic health record in Iran and facilitate the transfer of radiology report data. This cross-sectional study was conducted in 2013. The study population was 22 experts working at the Imaging Center of Imam Khomeini Hospital in Tehran, and the sample coincided with the population. The research tool was a questionnaire prepared by the researcher to determine the information requirements. Content validity and the test-retest method were used to measure the validity and reliability of the questionnaire, respectively. Data were analyzed with an average index using SPSS, and Visual Paradigm software was used to design the conceptual model. Based on the requirements assessment of the experts and the related literature, administrative, demographic and clinical data, radiological examination results and, if an anesthesia procedure was performed, anesthesia data were proposed as the minimum data set for the radiology report, and the class diagram was designed accordingly. The use case diagram was drawn by identifying the radiology reporting system processes. Given the role of radiology reports in the electronic health record system for diagnosing and managing patients' clinical problems, providing a conceptual model for the radiology reporting system, and thereby designing it systematically, would eliminate the problem of data sharing between this system and the electronic health record system.
Influence of the angular scattering of electrons on the runaway threshold in air
NASA Astrophysics Data System (ADS)
Chanrion, O.; Bonaventura, Z.; Bourdon, A.; Neubert, T.
2016-04-01
The runaway electron mechanism is of great importance for the understanding of the generation of x- and gamma rays in atmospheric discharges. In 1991, terrestrial gamma-ray flashes (TGFs) were discovered by the Compton Gamma-Ray Observatory. Those emissions are bremsstrahlung from high energy electrons that run away in electric fields associated with thunderstorms. In this paper, we discuss the runaway threshold definition with a particular interest in the influence of the angular scattering for electron energy close to the threshold. In order to understand the mechanism of runaway, we compare the outcome of different Fokker-Planck and Monte Carlo models with increasing complexity in the description of the scattering. The results show that the inclusion of the stochastic nature of collisions smooths the probability to run away around the threshold. Furthermore, we observe that a significant number of electrons diffuse out of the runaway regime when we take into account the diffusion in angle due to the scattering. Those results suggest using a runaway threshold energy based on the Fokker-Planck model assuming the angular equilibrium that is 1.6 to 1.8 times higher than the one proposed by [1, 2], depending on the magnitude of the ambient electric field. The threshold also is found to be 5 to 26 times higher than the one assuming forward scattering. We give a fitted formula for the threshold field valid over a large range of electric fields. Furthermore, we have shown that the assumption of forward scattering is not valid below 1 MeV where the runaway threshold usually is defined. These results are important for the thermal runaway and the runaway electron avalanche discharge mechanisms suggested to participate in the TGF generation.
Design and dosimetry of a few leaf electron collimator for energy modulated electron therapy
DOE Office of Scientific and Technical Information (OSTI.GOV)
Al-Yahya, Khalid; Verhaegen, Frank; Seuntjens, Jan
2007-12-15
Despite the capability of energy modulated electron therapy (EMET) to achieve highly conformal dose distributions in superficial targets, it has not been widely implemented due to problems inherent in electron beam radiotherapy, such as planning, dosimetry accuracy and verification, as well as a lack of systems for automated delivery. In previous work we proposed a novel technique to deliver EMET using an automated 'few leaf electron collimator' (FLEC) that consists of four motor-driven leaves fitted in a standard clinical electron beam applicator. Integrated with a Monte Carlo based optimization algorithm that utilizes patient-specific dose kernels, treatment delivery was incorporated within the linear accelerator operation. The FLEC was envisioned to work as an accessory tool added to the clinical accelerator. In this article the design and construction of the FLEC prototype matching our compact design goals are presented. It is controlled by an in-house developed EMET controller. The structure of the software and the hardware characteristics of the EMET controller are described. Using a parallel-plate ionization chamber, output measurements were obtained to validate the Monte Carlo calculations for a range of fields with different energies and sizes. Further verifications comparing 1-D and 2-D dose distributions were performed using energy-independent radiochromic films. Comparisons between Monte Carlo calculations and measurements of complex intensity map deliveries show an overall agreement to within ±3%. This work confirms the design objectives of the FLEC that allow for automated delivery of EMET. Furthermore, the Monte Carlo dose calculation engine required for EMET planning was validated. The result supports the potential of the prototype FLEC for the planning and delivery of EMET.
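The quoted ±3% agreement is a simple percent-difference check between calculation and measurement; the Python sketch below shows the form of such a check with invented relative outputs (not the paper's data).

mc_calc  = [1.000, 0.962, 0.874, 0.791]   # Monte Carlo relative output, illustrative
measured = [1.000, 0.955, 0.890, 0.805]   # chamber-measured relative output, illustrative

for i, (c, m) in enumerate(zip(mc_calc, measured)):
    diff = 100.0 * (c - m) / m
    status = "within" if abs(diff) <= 3.0 else "outside"
    print(f"field {i}: MC = {c:.3f}, measured = {m:.3f}, diff = {diff:+.1f}% ({status} +/-3%)")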
A strain-isolation design for stretchable electronics
NASA Astrophysics Data System (ADS)
Wu, Jian; Li, Ming; Chen, Wei-Qiu; Kim, Dae-Hyeong; Kim, Yun-Soung; Huang, Yong-Gang; Hwang, Keh-Chih; Kang, Zhan; Rogers, John A.
2010-12-01
Stretchable electronics represents a direction of recent development in next-generation semiconductor devices. Such systems have the potential to offer the performance of conventional wafer-based technologies, but they can be stretched like a rubber band, twisted like a rope, bent over a pencil, and folded like a piece of paper. Isolating the active devices from strains associated with such deformations is an important aspect of design. One strategy involves the shielding of the electronics from deformation of the substrate through insertion of a compliant adhesive layer. This paper establishes a simple, analytical model and validates the results by the finite element method. The results show that a relatively thick, compliant adhesive is effective to reduce the strain in the electronics, as is a relatively short film.
Revision of the criterion to avoid electron heating during laser aided plasma diagnostics (LAPD)
NASA Astrophysics Data System (ADS)
Carbone, E. A. D.; Palomares, J. M.; Hübner, S.; Iordanova, E.; van der Mullen, J. J. A. M.
2012-01-01
A criterion is given for the laser fluence (in J/m2) such that, when it is satisfied, disturbance of the plasma by the laser is avoided. This criterion accounts for laser heating of the electron gas mediated by electron-ion (ei) and electron-atom (ea) interactions. The first heating mechanism is well known and was dealt with extensively in the past. The second is often overlooked but is important for plasmas with a low degree of ionization; it is especially important for cold atmospheric plasmas, which nowadays stand at the focus of attention. The new criterion, based on the concerted action of both ei and ea interactions, is validated by Thomson scattering experiments performed on four different plasmas.
Electronic publishing and information handling: Plenty of roses, but also some thorns
NASA Astrophysics Data System (ADS)
Heck, André
The current dramatic evolution in information technology is bringing major modifications to the way scientists communicate. The concept of 'electronic publishing' is too restrictive and often has different, sometimes conflicting, interpretations. It is giving way to the broader notion of 'electronic information handling', encompassing the diverse types of information, the different media, as well as the various communication methodologies and technologies. New problems and challenges also result from this new information culture, especially on legal, ethical, and educational grounds. The procedures for validating 'published material' and for evaluating scientific activities will have to be adjusted too. 'Fluid' information is becoming an omnipresent reality. Electronic publishing cannot be conceived without links to knowledge bases and information resources, nor without intelligent information retrieval tools.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Peebles, J.; Wei, M. S.; Arefiev, A. V.
A series of experiments studying pre-plasma's effect on electron generation and transport due to a high intensity laser were conducted on the OMEGA-EP laser facility. A controlled pre-plasma was produced in front of an aluminum foil target prior to the arrival of the high intensity short pulse beam. Energetic electron spectra were characterized with magnetic and bremsstrahlung spectrometers. Preplasma and pulse length were shown to have a large impact on the temperature of lower energy, ponderomotive scaling electrons. Furthermore, super-ponderomotive electrons, seen in prior pre-plasma experiments with shorter pulses, were observed without any initial pre-plasma in our experiment. 2D particle-in-cell and radiation-hydrodynamic simulations shed light on and validate these experimental results.
Modelling of electron beam induced nanowire attraction
DOE Office of Scientific and Technical Information (OSTI.GOV)
Bitzer, Lucas A.; Benson, Niels, E-mail: niels.benson@uni-due.de; Schmechel, Roland
2016-04-14
Scanning electron microscope (SEM) induced nanowire (NW) attraction or bundling is a well known effect, which is mainly ascribed to structural or material dependent properties. However, there have also been recent reports of electron beam induced nanowire bending by SEM imaging, which is not fully explained by the current models, especially when considering the electro-dynamic interaction between NWs. In this article, we contribute to the understanding of this phenomenon by introducing an electro-dynamic model based on capacitor and Lorentz force interaction, where the active NW bending is stimulated by an electromagnetic force between individual wires. The model includes geometrical, electrical, and mechanical NW parameters, as well as the influence of the electron beam source parameters, and is validated using in-situ observations of electron beam induced GaAs NW bending by SEM imaging.
NASA Astrophysics Data System (ADS)
Amami, Sadek; Ozer, Zehra N.; Dogan, Mevlut; Yavuz, Murat; Varol, Onur; Madison, Don
2016-09-01
There have been several studies of electron-impact ionization of inert gases for asymmetric final-state energy sharing, where normally one electron has an energy significantly higher than the other. However, there have been relatively few studies examining equal-energy final-state electrons. Here we report experimental and theoretical triple differential cross sections for electron-impact ionization of Ar (3p) for equal energy sharing of the outgoing electrons. Previous experimental results combined with some new measurements are compared with distorted-wave Born approximation (DWBA) results, DWBA results using the Ward-Macek (WM) approximation for the post-collision interaction (PCI), and three-body distorted wave (3DW) results, which include PCI without approximation. The results show that it is crucially important to include PCI in the calculation, particularly for lower energies, and that the WM approximation is valid only for high energies. The 3DW, on the other hand, is in reasonably good agreement with the data down to fairly low energies.
Raymond, Louis; Paré, Guy; Marchand, Marie
2017-04-01
The deployment of electronic health record systems is deemed to play a decisive role in the transformations currently being implemented in primary care medical practices. This study aims to characterize electronic health record systems from the perspective of family physicians. To achieve this goal, we conducted a survey of physicians practising in private clinics located in Quebec, Canada. We used valid responses from 331 respondents who were found to be representative of the larger population. Data provided by the physicians using the top three electronic health record software products were analysed in order to obtain statistically adequate sub-sample sizes. Significant differences were observed among the three products with regard to their functional capability. The extent to which each of the electronic health record functionalities are used by physicians also varied significantly. Our results confirm that the electronic health record artefact 'does matter', its clinical functionalities explaining why certain physicians make more extended use of their system than others.
Electron emission and plasma generation in a modulator electron gun using ferroelectric cathode
NASA Astrophysics Data System (ADS)
Chen, Shutao; Zheng, Shuxin; Zhu, Ziqiu; Dong, Xianlin; Tang, Chuanxiang
2006-10-01
Strong electron emission and dense plasma generation have been observed in a modulator electron gun with a Ba0.67Sr0.33TiO3 ferroelectric cathode. The parameters of the modulator electron gun and the lifetime of the ferroelectric cathode were investigated. It was shown that electron emission from the Ba0.67Sr0.33TiO3 cathode with a positive triggering pulse is a form of plasma emission: electrons are emitted by the combined effect of the surface plasma and the non-compensated negative polarization charges at the surface of the ferroelectric. Elemental analysis of the graphite collector after the emission process showed that the plasma consists of Ba, Ti and Cu heavy cations from the ceramic compound and the electrode. The validity of the Child-Langmuir law was demonstrated by taking into account the decrease of the vacuum gap and the increase of the emission area caused by the expansion of the surface plasma.
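The Child-Langmuir law referred to above gives the space-charge-limited current density J = (4*eps0/9)*sqrt(2e/m_e)*V^(3/2)/d^2; the Python sketch below evaluates it while shrinking the gap d and growing the emission area A to mimic plasma expansion. The voltage, geometry and expansion speed are assumptions for illustration, not values from the experiment.

import math

eps0, e, m_e = 8.854e-12, 1.602e-19, 9.109e-31
K = (4.0 * eps0 / 9.0) * math.sqrt(2.0 * e / m_e)   # Child-Langmuir prefactor

V = 10e3              # extraction voltage, V (assumed)
d0, A0 = 5e-3, 1e-4   # initial gap (m) and emission area (m^2) (assumed)
v_p = 2e4             # plasma expansion speed, m/s (assumed)

for t_ns in (0, 50, 100, 150):
    t = t_ns * 1e-9
    d = max(d0 - v_p * t, 1e-4)                     # gap closes as the plasma expands
    A = A0 * (1.0 + v_p * t / d0) ** 2              # emission area grows (assumed quadratic)
    I = K * V**1.5 / d**2 * A                       # space-charge-limited current
    print(f"t = {t_ns:3d} ns: d = {d * 1e3:.2f} mm, I = {I:6.1f} A")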
One-electron reduced density matrices of strongly correlated harmonium atoms.
Cioslowski, Jerzy
2015-03-21
Explicit asymptotic expressions are derived for the reduced one-electron density matrices (the 1-matrices) of strongly correlated two- and three-electron harmonium atoms in the ground and first excited states. These expressions, which are valid in the limit of small confinement strength ω, yield electron densities and kinetic energies in agreement with the published values. In addition, they reveal the ω^(5/6) asymptotic scaling of the exchange components of the electron-electron repulsion energies, which differs from the ω^(2/3) scaling of their Coulomb and correlation counterparts. The natural orbitals of the totally symmetric ground state of the two-electron harmonium atom are found to possess collective occupancies that follow a mixed power/Gaussian dependence on the angular momentum, at variance with the simple power-law prediction of Hill's asymptotics. Providing rigorous constraints on energies as functionals of 1-matrices, these results are expected to facilitate the development of approximate implementations of the density matrix functional theory and ensure their proper description of strongly correlated systems.
Twilight reloaded: the peptide experience
Weichenberger, Christian X.; Pozharski, Edwin; Rupp, Bernhard
2017-01-01
The de facto commoditization of biomolecular crystallography as a result of almost disruptive instrumentation automation and continuing improvement of software allows any sensibly trained structural biologist to conduct crystallographic studies of biomolecules with reasonably valid outcomes: that is, models based on properly interpreted electron density. Robust validation has led to major mistakes in the protein part of structure models becoming rare, but some depositions of protein–peptide complex structure models, which generally carry significant interest to the scientific community, still contain erroneous models of the bound peptide ligand. Here, the protein small-molecule ligand validation tool Twilight is updated to include peptide ligands. (i) The primary technical reasons and potential human factors leading to problems in ligand structure models are presented; (ii) a new method used to score peptide-ligand models is presented; (iii) a few instructive and specific examples, including an electron-density-based analysis of peptide-ligand structures that do not contain any ligands, are discussed in detail; (iv) means to avoid such mistakes and the implications for database integrity are discussed and (v) some suggestions as to how journal editors could help to expunge errors from the Protein Data Bank are provided. PMID:28291756
Baker, Matthew L.; Hryc, Corey F.; Zhang, Qinfen; Wu, Weimin; Jakana, Joanita; Haase-Pettingell, Cameron; Afonine, Pavel V.; Adams, Paul D.; King, Jonathan A.; Jiang, Wen; Chiu, Wah
2013-01-01
High-resolution structures of viruses have made important contributions to modern structural biology. Bacteriophages, the most diverse and abundant organisms on earth, replicate and infect all bacteria and archaea, making them excellent potential alternatives to antibiotics and therapies for multidrug-resistant bacteria. Here, we improved upon our previous electron cryomicroscopy structure of Salmonella bacteriophage epsilon15, achieving a resolution sufficient to determine the tertiary structures of both gp7 and gp10 protein subunits that form the T = 7 icosahedral lattice. This study utilizes recently established best practice for near-atomic to high-resolution (3–5 Å) electron cryomicroscopy data evaluation. The resolution and reliability of the density map were cross-validated by multiple reconstructions from truly independent data sets, whereas the models of the individual protein subunits were validated adopting the best practices from X-ray crystallography. Some sidechain densities are clearly resolved and show the subunit–subunit interactions within and across the capsomeres that are required to stabilize the virus. The presence of the canonical phage and jellyroll viral protein folds, gp7 and gp10, respectively, in the same virus suggests that epsilon15 may have emerged more recently relative to other bacteriophages. PMID:23840063
NASA Astrophysics Data System (ADS)
Salomir, Rares; Rata, Mihaela; Lafon, Cyril; Melodelima, David; Chapelon, Jean-Yves; Mathias, Adrien; Cotton, François; Bonmartin, Alain; Cathignol, Dominique
2006-05-01
Contact application of high intensity ultrasound has been demonstrated to be suitable for thermal ablation of sectorial tumours of the digestive duct. Experimental validation of a new MR compatible ultrasonic device dedicated to the minimally invasive therapy of localized colorectal cancer is described here. The device is a cylindrical 1D 64-element phased array transducer of 14 mm diameter and 25 mm height (Imasonic, France) allowing electronic rotation of the acoustic beam. The operating frequency ranges from 3.5 to 4.0 MHz and up to 5 effective electrical watts per element are available. A plane wave is reconstructed by simultaneous excitation of eight adjacent elements with an appropriate phase law. The driving electronics operates outside the Faraday cage of the scanner and provides fast switching capabilities. Excellent passive and active compatibility with the MRI data acquisition has been demonstrated. In addition, the feasibility of active temperature control has been demonstrated, based on real-time data export out of the MR scanner and a PID feedback algorithm. Further studies will address the in-vivo validation and the integration of a miniature NMR coil for increased SNR in the near field.
Anonymization of electronic medical records for validating genome-wide association studies
Loukides, Grigorios; Gkoulalas-Divanis, Aris; Malin, Bradley
2010-01-01
Genome-wide association studies (GWAS) facilitate the discovery of genotype–phenotype relations from population-based sequence databases, which is an integral facet of personalized medicine. The increasing adoption of electronic medical records allows large amounts of patients’ standardized clinical features to be combined with the genomic sequences of these patients and shared to support validation of GWAS findings and to enable novel discoveries. However, disseminating these data “as is” may lead to patient reidentification when genomic sequences are linked to resources that contain the corresponding patients’ identity information based on standardized clinical features. This work proposes an approach that provably prevents this type of data linkage and furnishes a result that helps support GWAS. Our approach automatically extracts potentially linkable clinical features and modifies them so that they can no longer be used to link a genomic sequence to a small number of patients, while preserving the associations between genomic sequences and specific sets of clinical features corresponding to GWAS-related diseases. Extensive experiments with real patient data derived from the Vanderbilt University Medical Center verify that our approach generates data that eliminate the threat of individual reidentification, while supporting GWAS validation and clinical case analysis tasks. PMID:20385806
Jin, Yinji; Jin, Taixian; Lee, Sun-Mi
Pressure injury risk assessment is the first step toward preventing pressure injuries, but traditional assessment tools are time-consuming, resulting in work overload and fatigue for nurses. The objectives of the study were to build an automated pressure injury risk assessment system (Auto-PIRAS) that can assess pressure injury risk using data, without requiring nurses to collect or input additional data, and to evaluate the validity of this assessment tool. A retrospective case-control study and a system development study were conducted in a 1,355-bed university hospital in Seoul, South Korea. A total of 1,305 pressure injury patients and 5,220 nonpressure injury patients participated for the development of a risk scoring algorithm: 687 and 2,748 for the validation of the algorithm and 237 and 994 for validation after clinical implementation, respectively. A total of 4,211 pressure injury-related clinical variables were extracted from the electronic health record (EHR) systems to develop a risk scoring algorithm, which was validated and incorporated into the EHR. That program was further evaluated for predictive and concurrent validity. Auto-PIRAS, incorporated into the EHR system, assigned a risk assessment score of high, moderate, or low and displayed this on the Kardex nursing record screen. Risk scores were updated nightly according to 10 predetermined risk factors. The predictive validity measures of the algorithm validation stage were as follows: sensitivity = .87, specificity = .90, positive predictive value = .68, negative predictive value = .97, Youden index = .77, and the area under the receiver operating characteristic curve = .95. The predictive validity measures of the Braden Scale were as follows: sensitivity = .77, specificity = .93, positive predictive value = .72, negative predictive value = .95, Youden index = .70, and the area under the receiver operating characteristic curve = .85. The kappa of the Auto-PIRAS and Braden Scale risk classification result was .73. The predictive performance of the Auto-PIRAS was similar to Braden Scale assessments conducted by nurses. Auto-PIRAS is expected to be used as a system that assesses pressure injury risk automatically without additional data collection by nurses.
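The validity measures listed above all follow from a 2x2 confusion matrix; the Python sketch below computes them from counts that are illustrative (chosen only to be of the same order as the reported implementation-stage sample), not study data.

tp, fn, fp, tn = 206, 31, 99, 895    # illustrative counts: cases vs. controls

sensitivity = tp / (tp + fn)
specificity = tn / (tn + fp)
ppv = tp / (tp + fp)                 # positive predictive value
npv = tn / (tn + fn)                 # negative predictive value
youden = sensitivity + specificity - 1

print(f"sensitivity = {sensitivity:.2f}, specificity = {specificity:.2f}")
print(f"PPV = {ppv:.2f}, NPV = {npv:.2f}, Youden index = {youden:.2f}")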
Performance Analysis and Electronics Packaging of the Optical Communications Demonstrator
NASA Technical Reports Server (NTRS)
Jeganathan, M.; Monacos, S.
1998-01-01
The Optical Communications Demonstrator (OCD), under development at the Jet Propulsion Laboratory (JPL), is a laboratory-based lasercomm terminal designed to validate several key technologies, primarily precision beam pointing, high bandwidth tracking, and beacon acquisition.
Code of Federal Regulations, 2010 CFR
2010-04-01
... Aviation Administration (FAA) Aviation Safety Inspectors with valid credentials and authorization are not... TSA for compliance with an aviation security program, emergency amendment, or security directive...
Towards an electronic national injury surveillance system in Saudi Arabia.
Alanazi, F; Hussain, S A; Mandil, A; Alamro, N
2015-04-02
Given the need for a uniform, comprehensive, electronic nationwide surveillance system for injuries in Saudi Arabia, a system was designed with the objectives of establishing an epidemiologic profile of injuries in the country; evaluating injury indicators on an ongoing basis; identifying high-risk groups requiring specific interventions; monitoring and evaluating interventions for effectiveness; and producing reports to assist in planning and resource allocation. A special form for this purpose was designed, modified from validated forms used elsewhere for injury surveillance. This initiative of the Ministry of Health is also expected to help validate data collected by other sectors, such as the Ministry of Interior. This paper reviews the milestones of building the system and aims to prompt a debate within the scientific community, especially within the Eastern Mediterranean Region, about the best way to design injury surveillance systems for the Region in order to fine-tune the proposed system before its full-scale implementation.
Iterative Stable Alignment and Clustering of 2D Transmission Electron Microscope Images
Yang, Zhengfan; Fang, Jia; Chittuluru, Johnathan; Asturias, Francisco J.; Penczek, Pawel A.
2012-01-01
SUMMARY Identification of homogeneous subsets of images in a macromolecular electron microscopy (EM) image data set is a critical step in single-particle analysis. The task is handled by iterative algorithms, whose performance is compromised by the compounded limitations of image alignment and K-means clustering. Here we describe an approach, iterative stable alignment and clustering (ISAC) that, relying on a new clustering method and on the concepts of stability and reproducibility, can extract validated, homogeneous subsets of images. ISAC requires only a small number of simple parameters and, with minimal human intervention, can eliminate bias from two-dimensional image clustering and maximize the quality of group averages that can be used for ab initio three-dimensional structural determination and analysis of macromolecular conformational variability. Repeated testing of the stability and reproducibility of a solution within ISAC eliminates heterogeneous or incorrect classes and introduces critical validation to the process of EM image clustering. PMID:22325773
Fabrello, Amandine; Dinoi, Chiara; Perrin, Lionel; Kalck, Philippe; Maron, Laurent; Urrutigoity, Martine; Dechy-Cabaret, Odile
2010-11-01
(103)Rh NMR represents a powerful tool to assess the global electronic and steric contribution of diphosphine ligands on [Rh(COD)(diphosphine)](+) complexes. In the case of DIOP, BINAP and MeDUPHOS, this approach proved to be more informative than classical CO-stretching frequency measurements. After validation, this method has been extended to a set of seven diphosphines. (103)Rh NMR measurements on [Rh(COD)(diphosphine)]PF(6) lead to the following order of donor properties: dppe > MeBPE > MeDUPHOS > dppb > DIOP > BINAP > Tol-BINAP. This trend has been validated by DFT in the case of DIOP, BINAP and MeDUPHOS. In conjunction, (31)P NMR chemical shift has been shown to reflect the ring constraints of the Rh-diphosphine scaffold. This contribution is a step towards a mechanistic investigation of the catalytic hydrogenation of unsaturated substrates by (103)Rh NMR and DFT. 2010 John Wiley & Sons, Ltd.
Implementation of a low-cost Interim 21CFR11 compliance solution for laboratory environments.
Greene, Jack E
2003-01-01
In the recent past, compliance with 21CFR11 has become a major buzzword within the pharmaceutical and biotechnology industries. While commercial solutions exist, implementation and validation are expensive and cumbersome. Frequent implementation of new features via point releases further complicates purchasing decisions by making it difficult to weigh the risk of non-compliance against the costs of too frequent upgrades. This presentation discusses a low-cost interim solution to the problem. While this solution does not address 100% of the issues raised by 21CFR11, it does implement and validate: (1) computer system security; (2) backup and restore ability on the electronic records store; and (3) an automated audit trail mechanism that captures the date, time and user identification whenever electronic records are created, modified or deleted. When coupled with enhanced procedural controls, this solution provides an acceptable level of compliance at extremely low cost.
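A minimal sketch of the third element, the automated audit trail, is shown below. The use of a SQLite record store and the operating-system user name are illustrative assumptions, not the presenter's implementation.

    # Minimal audit-trail sketch: every create/modify/delete of an electronic
    # record is logged with date, time, and user identification.
    import sqlite3, datetime, getpass

    con = sqlite3.connect("records.db")
    con.execute("CREATE TABLE IF NOT EXISTS records (id INTEGER PRIMARY KEY, payload TEXT)")
    con.execute("CREATE TABLE IF NOT EXISTS audit_trail (ts TEXT, user TEXT, action TEXT, record_id INTEGER)")

    def audited(action, record_id):
        con.execute("INSERT INTO audit_trail VALUES (?,?,?,?)",
                    (datetime.datetime.now().isoformat(), getpass.getuser(), action, record_id))

    def create_record(payload):
        cur = con.execute("INSERT INTO records (payload) VALUES (?)", (payload,))
        audited("create", cur.lastrowid)
        con.commit()
        return cur.lastrowid

    def modify_record(record_id, payload):
        con.execute("UPDATE records SET payload=? WHERE id=?", (payload, record_id))
        audited("modify", record_id)
        con.commit()

    def delete_record(record_id):
        con.execute("DELETE FROM records WHERE id=?", (record_id,))
        audited("delete", record_id)
        con.commit()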
Perimal-Lewis, Lua; Teubner, David; Hakendorf, Paul; Horwood, Chris
2016-12-01
Effective and accurate use of routinely collected health data to produce Key Performance Indicator reporting is dependent on the underlying data quality. In this research, Process Mining methodology and tools were leveraged to assess the data quality of time-based Emergency Department data sourced from electronic health records. This research was done working closely with the domain experts to validate the process models. The hospital patient journey model was used to assess flow abnormalities which resulted from incorrect timestamp data used in time-based performance metrics. The research demonstrated process mining as a feasible methodology to assess data quality of time-based hospital performance metrics. The insight gained from this research enabled appropriate corrective actions to be put in place to address the data quality issues. © The Author(s) 2015.
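A simplified illustration of the kind of timestamp check such a process-mining assessment relies on is sketched below; the event names and the expected ordering are hypothetical placeholders for the hospital's actual patient-journey model.

    # Flag ED visits whose timestamps violate the expected patient-journey order
    # (event names are hypothetical).
    import datetime as dt

    EXPECTED_ORDER = ["arrival", "triage", "seen_by_doctor", "departure"]

    def flag_timestamp_anomalies(visit):
        """visit: dict mapping event name -> datetime (or None if missing)."""
        issues = []
        known = [(e, visit.get(e)) for e in EXPECTED_ORDER if visit.get(e) is not None]
        for (e1, t1), (e2, t2) in zip(known, known[1:]):
            if t2 < t1:
                issues.append(f"{e2} recorded before {e1}")
        missing = [e for e in EXPECTED_ORDER if visit.get(e) is None]
        if missing:
            issues.append("missing: " + ", ".join(missing))
        return issues

    visit = {"arrival": dt.datetime(2015, 3, 1, 10, 0),
             "triage": dt.datetime(2015, 3, 1, 9, 55),   # out of order -> flagged
             "seen_by_doctor": dt.datetime(2015, 3, 1, 11, 5),
             "departure": dt.datetime(2015, 3, 1, 13, 40)}
    print(flag_timestamp_anomalies(visit))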
Quantitative Determination of Spring Water Quality Parameters via Electronic Tongue.
Carbó, Noèlia; López Carrero, Javier; Garcia-Castillo, F Javier; Tormos, Isabel; Olivas, Estela; Folch, Elisa; Alcañiz Fillol, Miguel; Soto, Juan; Martínez-Máñez, Ramón; Martínez-Bisbal, M Carmen
2017-12-25
The use of a voltammetric electronic tongue for the quantitative analysis of quality parameters in spring water is proposed here. The voltammetric electronic tongue consisted of a set of four noble electrodes (iridium, rhodium, platinum, and gold) housed inside a stainless steel cylinder. These noble metals have a high durability and are undemanding in terms of maintenance, features required for the development of future automated equipment. A pulse voltammetry study was conducted in 83 spring water samples to determine concentrations of nitrate (range: 6.9-115 mg/L), sulfate (32-472 mg/L), fluoride (0.08-0.26 mg/L), chloride (17-190 mg/L), and sodium (11-94 mg/L) as well as pH (7.3-7.8). These parameters were also determined by routine analytical methods in spring water samples. A partial least squares (PLS) analysis was run to obtain a model to predict these parameters. Orthogonal signal correction (OSC) was applied in the preprocessing step. Calibration (67%) and validation (33%) sets were selected randomly. The electronic tongue showed good predictive power to determine the concentrations of nitrate, sulfate, chloride, and sodium as well as pH and displayed a lower R² and slope in the validation set for fluoride. Nitrate and fluoride concentrations were estimated with errors lower than 15%, whereas chloride, sulfate, and sodium concentrations as well as pH were estimated with errors below 10%.
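The chemometric step can be sketched roughly as follows; the orthogonal signal correction preprocessing used in the study is omitted, the number of latent variables is arbitrary, and the arrays are placeholders standing in for the pulse-voltammetry responses and the reference analyses.

    # PLS calibration/validation sketch with a random 67/33 split (OSC omitted).
    import numpy as np
    from sklearn.cross_decomposition import PLSRegression
    from sklearn.model_selection import train_test_split
    from sklearn.metrics import r2_score

    rng = np.random.default_rng(0)
    X = rng.normal(size=(83, 200))   # placeholder: 83 samples x pulse-voltammetry features
    Y = rng.normal(size=(83, 6))     # placeholder: nitrate, sulfate, fluoride, chloride, sodium, pH

    X_cal, X_val, Y_cal, Y_val = train_test_split(X, Y, test_size=0.33, random_state=0)
    pls = PLSRegression(n_components=8).fit(X_cal, Y_cal)
    print(r2_score(Y_val, pls.predict(X_val), multioutput="raw_values"))  # one R^2 per parameter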
Domain of validity of the perturbative approach to femtosecond optical spectroscopy
DOE Office of Scientific and Technical Information (OSTI.GOV)
Gelin, Maxim F.; Rao, B. Jayachander; Nest, Mathias
2013-12-14
We have performed numerical nonperturbative simulations of transient absorption pump-probe responses for a series of molecular model systems. The resulting signals as a function of the laser field strength and the pump-probe delay time are compared with those obtained in the perturbative response function formalism. The simulations and their theoretical analysis indicate that the perturbative description remains valid up to moderately strong laser pulses, corresponding to a rather substantial depopulation (population) of the initial (final) electronic states.
Photodetachment cross sections of negative ions - The range of validity of the Wigner threshold law
NASA Technical Reports Server (NTRS)
Farley, John W.
1989-01-01
The threshold behavior of the photodetachment cross section of negative ions as a function of photon frequency is usually described by the Wigner law. This paper reports the results of a model calculation using the zero-core-contribution (ZCC) approximation. Theoretical expressions for the leading correction to the Wigner law are developed, giving the range of validity of the Wigner law and the expected accuracy. The results are relevant to extraction of electron affinities from experimental photodetachment data.
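For context, the Wigner law referred to above has the schematic form below, where E_th is the detachment threshold (electron affinity), k the photoelectron momentum, and l the orbital angular momentum of the outgoing electron; the size of the leading correction to this form is what the ZCC calculation quantifies.

    \sigma(E) \;\propto\; k^{2l+1} \;\propto\; \left(E - E_{\mathrm{th}}\right)^{\,l+1/2},
    \qquad E \to E_{\mathrm{th}}^{+}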
Escaño, Mary Clare Sison; Arevalo, Ryan Lacdao; Gyenge, Elod; Kasai, Hideaki
2014-09-03
The electrocatalysis of borohydride oxidation is a complex, up-to-eight-electron transfer process, which is essential for development of efficient direct borohydride fuel cells. Here we review the progress achieved by density functional theory (DFT) calculations in explaining the adsorption of BH4(-) on various catalyst surfaces, with implications for electrocatalyst screening and selection. Wherever possible, we correlate the theoretical predictions with experimental findings, in order to validate the proposed models and to identify potential directions for further advancements.
NASA Astrophysics Data System (ADS)
Sison Escaño, Mary Clare; Lacdao Arevalo, Ryan; Gyenge, Elod; Kasai, Hideaki
2014-09-01
The electrocatalysis of borohydride oxidation is a complex, up-to-eight-electron transfer process, which is essential for development of efficient direct borohydride fuel cells. Here we review the progress achieved by density functional theory (DFT) calculations in explaining the adsorption of BH4- on various catalyst surfaces, with implications for electrocatalyst screening and selection. Wherever possible, we correlate the theoretical predictions with experimental findings, in order to validate the proposed models and to identify potential directions for further advancements.
2012-04-01
Systems Concepts and Integration; SET, Sensors and Electronics Technology; SISO, Simulation Interoperability Standards Organization; SIW, Simulation Interoperability Workshop. Milestone fragment: September 2006, held in conjunction with the 2006 Fall SIW, SISO Standards Activity Committee approved beginning IEEE balloting; October 2006, IEEE Project ...019 published; June 2008, Edinburgh, UK, held in conjunction with the 2008 Euro-SIW; September 2008, Laurel, MD, US, work on Composite Model; December 2008.
Energy deposition dynamics of femtosecond pulses in water
DOE Office of Scientific and Technical Information (OSTI.GOV)
Minardi, Stefano, E-mail: stefano@stefanominardi.eu; Pertsch, Thomas; Milián, Carles
2014-12-01
We exploit inverse Raman scattering and solvated electron absorption to perform a quantitative characterization of the energy loss and ionization dynamics in water with tightly focused near-infrared femtosecond pulses. A comparison between experimental data and numerical simulations suggests that the ionization energy of water is 8 eV, rather than the commonly used value of 6.5 eV. We also introduce an equation for the Raman gain valid for ultra-short pulses that validates our experimental procedure.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Ward, Robert Cameron; Steiner, Don
2004-06-15
The generation of runaway electrons during a thermal plasma disruption is a concern for the safe and economical operation of a tokamak power system. Runaway electrons have high energy, 10 to 300 MeV, and may potentially cause extensive damage to plasma-facing components (PFCs) through large temperature increases, melting of metallic components, surface erosion, and possible burnout of coolant tubes. The EPQ code system was developed to simulate the thermal response of PFCs to a runaway electron impact. The EPQ code system consists of several parts: UNIX scripts that control the operation of an electron-photon Monte Carlo code to calculate the interaction of the runaway electrons with the plasma-facing materials; a finite difference code to calculate the thermal response, melting, and surface erosion of the materials; a code to process, scale, transform, and convert the electron Monte Carlo data to volumetric heating rates for use in the thermal code; and several minor and auxiliary codes for the manipulation and postprocessing of the data. The electron-photon Monte Carlo code used was Electron-Gamma-Shower (EGS), developed and maintained by the National Research Center of Canada. The Quick-Therm-Two-Dimensional-Nonlinear (QTTN) thermal code solves the two-dimensional cylindrical modified heat conduction equation using the Quickest third-order accurate and stable explicit finite difference method and is capable of tracking melting or surface erosion. The EPQ code system is validated using a series of analytical solutions and simulations of experiments. The verification of the QTTN thermal code with analytical solutions shows that the code with the Quickest method is better than 99.9% accurate. The benchmarking of the EPQ code system and QTTN versus experiments showed that QTTN's erosion tracking method is accurate within 30% and that EPQ is able to predict the occurrence of melting within the proper time constraints. QTTN and EPQ are verified and validated as able to calculate the temperature distribution, phase change, and surface erosion successfully.
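As a rough illustration of the explicit time-marching idea behind the thermal solver (QTTN itself uses the third-order QUICKEST scheme on a cylindrical grid with volumetric heating supplied by the Monte Carlo step), the sketch below is only a first-order Cartesian analogue with periodic boundaries and placeholder material values.

    # Explicit finite-difference step for rho*cp*dT/dt = k*laplacian(T) + q_vol.
    import numpy as np

    def step(T, q_vol, dx, dt, k, rho, cp):
        alpha = k / (rho * cp)
        assert dt <= dx**2 / (4 * alpha), "explicit stability limit violated"
        lap = (np.roll(T, 1, 0) + np.roll(T, -1, 0) +
               np.roll(T, 1, 1) + np.roll(T, -1, 1) - 4 * T) / dx**2
        return T + dt * (alpha * lap + q_vol / (rho * cp))

    T = 300.0 * np.ones((64, 64))                 # initial temperature field, K
    q = np.zeros((64, 64)); q[30:34, 30:34] = 1e8  # localized heating, W/m^3 (placeholder)
    for _ in range(100):
        T = step(T, q, dx=1e-3, dt=1e-4, k=170.0, rho=2700.0, cp=900.0)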
NASA Technical Reports Server (NTRS)
Paquette, Beth; Samuels, Margaret; Chen, Peng
2017-01-01
Direct-write printing techniques will enable new detector assemblies that were not previously possible with traditional assembly processes. Detector concepts were manufactured using this technology to validate repeatability. Additional detector applications and printed wires on a 3-dimensional magnetometer bobbin will be designed for print. This effort focuses on evaluating performance for direct-write manufacturing techniques on 3-dimensional surfaces. Direct-write manufacturing has the potential to reduce mass and volume for fabrication and assembly of advanced detector concepts by reducing trace widths down to 10 microns, printing on complex geometries, allowing new electronic concept production, and reducing production times for those complex electronics.
Note: Characteristic beam parameter for the line electron gun
DOE Office of Scientific and Technical Information (OSTI.GOV)
Iqbal, M.; Institute of High Energy Physics, Chinese Academy of Sciences, Beijing 100049; Islam, G. U.
We have optimized the beam parameters of line source electron gun using Stanford Linear Accelerator Centre electron beam trajectory program (EGUN), utilizing electrostatic focusing only. We measured minimum beam diameter as 0.5 mm that corresponds to power density of 68.9 kW/cm² at 13.5 mm in the post-anode region which is more than two-fold (33 kW/cm²), of the previously reported results. The gun was operated for the validation of the theoretical results and found in good agreement. The gun is now without any magnetic and electrostatic focusing thus much simpler and more powerful.
Note: Characteristic beam parameter for the line electron gun.
Iqbal, M; Islam, G U; Zhou, Z; Chi, Y
2013-11-01
We have optimized the beam parameters of line source electron gun using Stanford Linear Accelerator Centre electron beam trajectory program (EGUN), utilizing electrostatic focusing only. We measured minimum beam diameter as 0.5 mm that corresponds to power density of 68.9 kW/cm(2) at 13.5 mm in the post-anode region which is more than two-fold (33 kW/cm(2)), of the previously reported results. The gun was operated for the validation of the theoretical results and found in good agreement. The gun is now without any magnetic and electrostatic focusing thus much simpler and more powerful.
Note: Characteristic beam parameter for the line electron gun
NASA Astrophysics Data System (ADS)
Iqbal, M.; Islam, G. U.; Zhou, Z.; Chi, Y.
2013-11-01
We have optimized the beam parameters of line source electron gun using Stanford Linear Accelerator Centre electron beam trajectory program (EGUN), utilizing electrostatic focusing only. We measured minimum beam diameter as 0.5 mm that corresponds to power density of 68.9 kW/cm2 at 13.5 mm in the post-anode region which is more than two-fold (33 kW/cm2), of the previously reported results. The gun was operated for the validation of the theoretical results and found in good agreement. The gun is now without any magnetic and electrostatic focusing thus much simpler and more powerful.
NASA Technical Reports Server (NTRS)
1993-01-01
Electronic Imagery, Inc.'s ImageScale Plus software, developed through a Small Business Innovation Research (SBIR) contract with Kennedy Space Center for use on the space shuttle orbiter in 1991, enables astronauts to conduct image processing, prepare electronic still camera images in orbit, display them and downlink images to ground based scientists for evaluation. Electronic Imagery, Inc.'s ImageCount, a spin-off product of ImageScale Plus, is used to count trees in Florida orange groves. Other applications include x-ray and MRI imagery, textile designs and special effects for movies. As of 1/28/98, the company could not be located; therefore contact/product information is no longer valid.
NASA Astrophysics Data System (ADS)
Drachta, Jürgen T.; Kreil, Dominik; Hobbiger, Raphael; Böhm, Helga M.
2018-03-01
Correlations, highly important in low-dimensional systems, are known to decrease the plasmon dispersion of two-dimensional electron liquids. Here we calculate the plasmon properties, applying the 'Dynamic Many-Body Theory', accounting for correlated two-particle-two-hole fluctuations. These dynamic correlations are found to significantly lower the plasmon's energy. For the data obtained numerically, we provide an analytic expression that is valid across a wide range both of densities and of wave vectors. Finally, we demonstrate how this can be invoked in determining the actual electron densities from measurements on an AlGaAs quantum well.
Armendáriz-Vidales, G; Frontana, C
2015-11-21
In this work, electrogenerated anion and dianion species from shikonin and its ester derivative isovalerylshikonin were characterized by means of ESR/UV-Vis spectroelectrochemistry. Analysis of the spectra supported the proposal that stepwise dissociative electron transfer (DET) takes place during the second reduction process of the esterified compound. Quantum chemical calculations were performed for validating the occurrence of this mechanistic pathway and for obtaining thermodynamic information on the electron transfer process; ΔG(cleavage)(0) was estimated to be -0.45 eV, considering that the two possible products of the overall reaction scheme are both a quinone and carboxylate anions.
A flexible, on-line magnetic spectrometer for ultra-intense laser produced fast electron measurement
NASA Astrophysics Data System (ADS)
Ge, Xulei; Yuan, Xiaohui; Yang, Su; Deng, Yanqing; Wei, Wenqing; Fang, Yuan; Gao, Jian; Liu, Feng; Chen, Min; Zhao, Li; Ma, Yanyun; Sheng, Zhengming; Zhang, Jie
2018-04-01
We have developed an on-line magnetic spectrometer to measure energy distributions of fast electrons generated from ultra-intense laser-solid interactions. The spectrometer consists of a sheet of plastic scintillator, a bundle of non-scintillating plastic fibers, and an sCMOS camera recording system. The design advantages include on-line capturing ability, versatility of detection arrangement, and resistance to harsh in-chamber environment. The validity of the instrument was tested experimentally. This spectrometer can be applied to the characterization of fast electron source for understanding fundamental laser-plasma interaction physics and to the optimization of high-repetition-rate laser-driven applications.
Nonequilibrium itinerant-electron magnetism: A time-dependent mean-field theory
NASA Astrophysics Data System (ADS)
Secchi, A.; Lichtenstein, A. I.; Katsnelson, M. I.
2016-08-01
We study the dynamical magnetic susceptibility of a strongly correlated electronic system in the presence of a time-dependent hopping field, deriving a generalized Bethe-Salpeter equation that is valid also out of equilibrium. Focusing on the single-orbital Hubbard model within the time-dependent Hartree-Fock approximation, we solve the equation in the nonequilibrium adiabatic regime, obtaining a closed expression for the transverse magnetic susceptibility. From this, we provide a rigorous definition of nonequilibrium (time-dependent) magnon frequencies and exchange parameters, expressed in terms of nonequilibrium single-electron Green's functions and self-energies. In the particular case of equilibrium, we recover previously known results.
Thermal Testing and Quality Assurance of BGA LCC & QFN Electronic Packages
DOE Office of Scientific and Technical Information (OSTI.GOV)
Kuper, Cameron Mathias
The purpose of this project is to experimentally validate the thermal fatigue life of solder interconnects for a variety of surface mount electronic packages. Over the years, there has been a significant amount of research and analysis into the fracture of solder joints on printed circuit boards. Solder is important in the mechanical and electronic functionality of the component. It is important throughout the life of the product that the solder remains crack and fracture free. The specific type of solder used in this experiment is a 63Sn37Pb eutectic alloy. Each package was surrounded by conformal coating or underfill material.
Nayor, Jennifer; Borges, Lawrence F; Goryachev, Sergey; Gainer, Vivian S; Saltzman, John R
2018-07-01
Adenoma detection rate (ADR) is a widely used colonoscopy quality indicator. Calculation of ADR is labor-intensive and cumbersome using current electronic medical databases. Natural language processing (NLP) is a method used to extract meaning from unstructured or free text data. (1) To develop and validate an accurate automated process for calculation of ADR and serrated polyp detection rate (SDR) on data stored in widely used electronic health record systems, specifically the Epic electronic health record system, Provation® endoscopy reporting system, and Sunquest PowerPath pathology reporting system. Screening colonoscopies performed between June 2010 and August 2015 were identified using the Provation® reporting tool. An NLP pipeline was developed to identify adenomas and sessile serrated polyps (SSPs) on pathology reports corresponding to these colonoscopy reports. The pipeline was validated using a manual search. Precision, recall, and effectiveness of the natural language processing pipeline were calculated. ADR and SDR were then calculated. We identified 8032 screening colonoscopies that were linked to 3821 pathology reports (47.6%). The NLP pipeline had an accuracy of 100% for adenomas and 100% for SSPs. Mean total ADR was 29.3% (range 14.7-53.3%); mean male ADR was 35.7% (range 19.7-62.9%); and mean female ADR was 24.9% (range 9.1-51.0%). Mean total SDR was 4.0% (0-9.6%). We developed and validated an NLP pipeline that accurately and automatically calculates ADRs and SDRs using data stored in Epic, Provation® and Sunquest PowerPath. This NLP pipeline can be used to evaluate colonoscopy quality parameters at both individual and practice levels.
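The reporting step can be sketched as below, with a toy keyword matcher standing in for the validated NLP pipeline; the regular expressions and the input structure are illustrative assumptions only.

    # Toy sketch: flag adenomas/SSPs in pathology text and compute ADR/SDR over
    # screening colonoscopies (one pathology-report string per exam, '' if none).
    import re

    ADENOMA = re.compile(r"\b(tubular|tubulovillous|villous)\s+adenoma\b", re.I)
    SSP = re.compile(r"\bsessile serrated (polyp|adenoma|lesion)\b", re.I)

    def detection_rates(screening_exams):
        n = len(screening_exams)
        adr = sum(bool(ADENOMA.search(t)) for t in screening_exams) / n
        sdr = sum(bool(SSP.search(t)) for t in screening_exams) / n
        return adr, sdr

    adr, sdr = detection_rates(["Tubular adenoma, 5 mm.", "", "Sessile serrated polyp."])
    print(f"ADR={adr:.1%}  SDR={sdr:.1%}")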
Rasmussen, Luke V; Berg, Richard L; Linneman, James G; McCarty, Catherine A; Waudby, Carol; Chen, Lin; Denny, Joshua C; Wilke, Russell A; Pathak, Jyotishman; Carrell, David; Kho, Abel N; Starren, Justin B
2012-01-01
Objective There is increasing interest in using electronic health records (EHRs) to identify subjects for genomic association studies, due in part to the availability of large amounts of clinical data and the expected cost efficiencies of subject identification. We describe the construction and validation of an EHR-based algorithm to identify subjects with age-related cataracts. Materials and methods We used a multi-modal strategy consisting of structured database querying, natural language processing on free-text documents, and optical character recognition on scanned clinical images to identify cataract subjects and related cataract attributes. Extensive validation on 3657 subjects compared the multi-modal results to manual chart review. The algorithm was also implemented at participating electronic MEdical Records and GEnomics (eMERGE) institutions. Results An EHR-based cataract phenotyping algorithm was successfully developed and validated, resulting in positive predictive values (PPVs) >95%. The multi-modal approach increased the identification of cataract subject attributes by a factor of three compared to single-mode approaches while maintaining high PPV. Components of the cataract algorithm were successfully deployed at three other institutions with similar accuracy. Discussion A multi-modal strategy incorporating optical character recognition and natural language processing may increase the number of cases identified while maintaining similar PPVs. Such algorithms, however, require that the needed information be embedded within clinical documents. Conclusion We have demonstrated that algorithms to identify and characterize cataracts can be developed utilizing data collected via the EHR. These algorithms provide a high level of accuracy even when implemented across multiple EHRs and institutional boundaries. PMID:22319176
NASA Astrophysics Data System (ADS)
Scherliess, L.; Schunk, R. W.; Sojka, J. J.; Thompson, D. C.; Zhu, L.
2006-11-01
The Utah State University Gauss-Markov Kalman Filter (GMKF) was developed as part of the Global Assimilation of Ionospheric Measurements (GAIM) program. The GMKF uses a physics-based model of the ionosphere and a Gauss-Markov Kalman filter as a basis for assimilating a diverse set of real-time (or near real-time) observations. The physics-based model is the Ionospheric Forecast Model (IFM), which accounts for five ion species and covers the E region, F region, and the topside from 90 to 1400 km altitude. Within the GMKF, the IFM derived ionospheric densities constitute a background density field on which perturbations are superimposed based on the available data and their errors. In the current configuration, the GMKF assimilates slant total electron content (TEC) from a variable number of global positioning satellite (GPS) ground sites, bottomside electron density (Ne) profiles from a variable number of ionosondes, in situ Ne from four Defense Meteorological Satellite Program (DMSP) satellites, and nighttime line-of-sight ultraviolet (UV) radiances measured by satellites. To test the GMKF for real-time operations and to validate its ionospheric density specifications, we have tested the model performance for a variety of geophysical conditions. During these model runs various combination of data types and data quantities were assimilated. To simulate real-time operations, the model ran continuously and automatically and produced three-dimensional global electron density distributions in 15 min increments. In this paper we will describe the Gauss-Markov Kalman filter model and present results of our validation study, with an emphasis on comparisons with independent observations.
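The assimilation cycle described above can be caricatured with a small Gauss-Markov Kalman filter sketch: the state is the perturbation of electron density about the IFM background, it relaxes toward zero between updates, and slant-TEC observations are assimilated through a linear observation operator. Dimensions, time constants, and covariances below are arbitrary placeholders, not GAIM values.

    # Gauss-Markov predict step followed by a standard Kalman measurement update.
    import numpy as np

    def gm_predict(x, P, dt, tau, Q):
        phi = np.exp(-dt / tau)              # Gauss-Markov decay toward the background
        return phi * x, phi**2 * P + Q

    def kf_update(x, P, y, H, R):
        S = H @ P @ H.T + R
        K = P @ H.T @ np.linalg.inv(S)       # Kalman gain
        return x + K @ (y - H @ x), (np.eye(len(x)) - K @ H) @ P

    rng = np.random.default_rng(1)
    n, m = 50, 8                             # density grid points, slant-TEC links (placeholders)
    x, P = np.zeros(n), np.eye(n)            # perturbation state and covariance (arbitrary units)
    H = np.abs(rng.normal(size=(m, n)))      # placeholder path-geometry observation operator
    y = H @ rng.normal(scale=0.3, size=n)    # placeholder slant-TEC residuals
    x, P = gm_predict(x, P, dt=900.0, tau=7200.0, Q=0.05 * np.eye(n))
    x, P = kf_update(x, P, y, H, R=0.01 * np.eye(m))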
NASA Astrophysics Data System (ADS)
Badavi, Francis F.; Nealy, John E.; Wilson, John W.
2011-10-01
The International Space Station (ISS) provides the proving ground for future long duration human activities in space. Ionizing radiation measurements in ISS form the ideal tool for the experimental validation of radiation environmental models, nuclear transport code algorithms and nuclear reaction cross sections. Indeed, prior measurements on the Space Transportation System (STS; Shuttle) have provided vital information impacting both the environmental models and the nuclear transport code development by requiring dynamic models of the Low Earth Orbit (LEO) environment. Previous studies using Computer Aided Design (CAD) models of the evolving ISS configurations with Thermo-Luminescent Detector (TLD) area monitors demonstrated that computational dosimetry requires environmental models with accurate non-isotropic as well as dynamic behavior, detailed information on rack loading, and an accurate six degree of freedom (DOF) description of ISS trajectory and orientation. It is imperative that we understand ISS exposures dynamically for crew career planning, and ensure that the regulatory requirements of keeping exposure as low as reasonably achievable (ALARA) are adequately implemented. This is especially true as ISS nears some form of completion with increasing complexity, resulting in a larger drag coefficient, and requiring operation at higher altitudes with increased exposure rates. In this paper, the ISS environmental model is configured for 11A (circa mid-2005) and uses non-isotropic and dynamic geomagnetic transmission and trapped proton models. ISS 11A and LEO model validations are important steps in preparation for the design and validation of the next generation of manned vehicles. While the described cutoff rigidity, trapped proton and electron formalisms as coded in a package named GEORAD (GEOmagnetic RADiation) and a web interface named OLTARIS (On-line Tool for the Assessment of Radiation in Space) are applicable to the LEO, Medium Earth Orbit (MEO) and Geosynchronous Earth Orbit (GEO) at quiet solar periods, in this report, the validation of the models using available measurements is limited to the STS and ISS nominal operational altitude range (300-400 km) at LEO where the dominant fields within the vehicle are the trapped proton and attenuated Galactic Cosmic Ray (GCR) ions. The described formalism applies to trapped electrons at LEO, MEO and GEO as well. Due to the scarcity of available electron measurements, the trapped electron capabilities of GEORAD are not discussed in this report, but are accessible through the OLTARIS web interface. GEORAD and OLTARIS interests are in the study of long term effects (i.e., a meaningful portion of the solar cycle). Therefore, GEORAD does not incorporate any short term external field contribution due to solar activity. Finally, we apply these environmental models to selected target points within ISS 6A (circa early 2001), 7A (circa late 2001), and 11A during its passage through the South Atlantic Anomaly (SAA) to assess the validity of the environmental models at ISS altitudes.
Oxygen sensor signal validation for the safety of the rebreather diver.
Sieber, Arne; L'abbate, Antonio; Bedini, Remo
2009-03-01
In electronically controlled, closed-circuit rebreather diving systems, the partial pressure of oxygen inside the breathing loop is controlled with three oxygen sensors, a microcontroller and a solenoid valve - critical components that may fail. State-of-the-art detection of sensor failure, based on a voting algorithm, may fail under circumstances where two or more sensors show the same but incorrect values. The present paper details a novel rebreather controller that offers true sensor-signal validation, thus allowing efficient and reliable detection of sensor failure. The core components of this validation system are two additional solenoids, which allow an injection of oxygen or diluent gas directly across the sensor membrane.
NASA Astrophysics Data System (ADS)
Ivantchenko, Vladimir
Geant4 is a toolkit for Monte Carlo simulation of particle transport originally developed for applications in high-energy physics with the focus on experiments at the Large Hadron Collider (CERN, Geneva). The transparency and flexibility of the code has spread its use to other fields of research, e.g. radiotherapy and space science. The tool provides the possibility to simulate complex geometry, transportation in electric and magnetic fields, and a variety of physics models of the interaction of particles with media. Geant4 has been used for simulation of radiation effects for a number of space missions. Recent upgrades of the toolkit released in December 2009 include a new model for ion electronic stopping power based on the revised version of ICRU Report 73, increasing the accuracy of simulation of ion transport. In the current work we present the status of the Geant4 electromagnetic package for simulation of particle energy loss, ranges and transmission. This has a direct implication for simulation of ground testing setups at existing European facilities and for simulation of radiation effects in space. A number of improvements were introduced for electron and proton transport, followed by a thorough validation. It was the aim of the present study to validate the range against reference data from the United States National Institute of Standards and Technology (NIST) ESTAR, PSTAR and ASTAR databases. We compared Geant4 and NIST ranges of electrons using different Geant4 models. The best agreement was found for Penelope, except at very low energies in heavy materials, where the Standard package gave better results. Geant4 proton ranges in water agreed with NIST within 1%. The validation of the new ion model is performed against recent data on Bragg peak position in water. The data from transmission of carbon ions via various absorbers following the Bragg peak in water demonstrate that the new Geant4 model significantly improves the precision of the ion range. The absolute accuracy of the ion range achieved is on the level of 1%.
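The range comparison itself is a post-processing exercise of the following kind, assuming simulated and NIST (ESTAR/PSTAR/ASTAR) CSDA ranges have already been tabulated on a common energy grid; no Geant4 interfaces are invoked here.

    # Report the percent deviation of simulated ranges from reference (e.g. NIST) values.
    import numpy as np

    def relative_deviation(energy_mev, range_sim, range_ref):
        dev = 100.0 * (np.asarray(range_sim) - np.asarray(range_ref)) / np.asarray(range_ref)
        for e, d in zip(energy_mev, dev):
            print(f"{e:8.3f} MeV  {d:+6.2f} %")
        return dev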
Thermally Driven One-Fluid Electron-Proton Solar Wind: Eight-Moment Approximation
NASA Astrophysics Data System (ADS)
Olsen, Espen Lyngdal; Leer, Egil
1996-05-01
In an effort to improve the "classical" solar wind model, we study an eight-moment approximation hydrodynamic solar wind model, in which the full conservation equation for the heat conductive flux is solved together with the conservation equations for mass, momentum, and energy. We consider two different cases: In one model the energy flux needed to drive the solar wind is supplied as heat flux from a hot coronal base, where both the density and temperature are specified. In the other model, the corona is heated. In that model, the coronal base density and temperature are also specified, but the temperature increases outward from the coronal base due to a specified energy flux that is dissipated in the corona. The eight-moment approximation solutions are compared with the results from a "classical" solar wind model in which the collision-dominated gas expression for the heat conductive flux is used. It is shown that the "classical" expression for the heat conductive flux is generally not valid in the solar wind. In collisionless regions of the flow, the eight-moment approximation gives a larger thermalization of the heat conductive flux than the models using the collision-dominated gas approximation for the heat flux, but the heat flux is still larger than the "saturation heat flux." This leads to a breakdown of the electron distribution function, which turns negative in the collisionless region of the flow. By increasing the interaction between the electrons, the heat flux is reduced, and a reasonable shape is obtained on the distribution function. By solving the full set of equations consistent with the eight-moment distribution function for the electrons, we are thus able to draw inferences about the validity of the eight-moment description of the solar wind as well as the validity of the very commonly used collision-dominated gas approximation for the heat conductive flux in the solar wind.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Lewis, Nicholas H. C.; Dong, Hui; Oliver, Thomas A. A.
2015-09-28
Two dimensional electronic spectroscopy has proven to be a valuable experimental technique to reveal electronic excitation dynamics in photosynthetic pigment-protein complexes, nanoscale semiconductors, organic photovoltaic materials, and many other types of systems. It does not, however, provide direct information concerning the spatial structure and dynamics of excitons. 2D infrared spectroscopy has become a widely used tool for studying structural dynamics but is incapable of directly providing information concerning electronic excited states. 2D electronic-vibrational (2DEV) spectroscopy provides a link between these domains, directly connecting the electronic excitation with the vibrational structure of the system under study. In this work, we derive response functions for the 2DEV spectrum of a molecular dimer and propose a method by which 2DEV spectra could be used to directly measure the electronic site populations as a function of time following the initial electronic excitation. We present results from the response function simulations which show that our proposed approach is substantially valid. This method provides, to our knowledge, the first direct experimental method for measuring the electronic excited state dynamics in the spatial domain, on the molecular scale.
Lewis, Nicholas H C; Dong, Hui; Oliver, Thomas A A; Fleming, Graham R
2015-09-28
Two dimensional electronic spectroscopy has proved to be a valuable experimental technique to reveal electronic excitation dynamics in photosynthetic pigment-protein complexes, nanoscale semiconductors, organic photovoltaic materials, and many other types of systems. It does not, however, provide direct information concerning the spatial structure and dynamics of excitons. 2D infrared spectroscopy has become a widely used tool for studying structural dynamics but is incapable of directly providing information concerning electronic excited states. 2D electronic-vibrational (2DEV) spectroscopy provides a link between these domains, directly connecting the electronic excitation with the vibrational structure of the system under study. In this work, we derive response functions for the 2DEV spectrum of a molecular dimer and propose a method by which 2DEV spectra could be used to directly measure the electronic site populations as a function of time following the initial electronic excitation. We present results from the response function simulations which show that our proposed approach is substantially valid. This method provides, to our knowledge, the first direct experimental method for measuring the electronic excited state dynamics in the spatial domain, on the molecular scale.
Electron temperature critical gradient and transport stiffness in DIII-D
Smith, Sterling P.; Petty, Clinton C.; White, Anne E.; ...
2015-07-06
The electron energy flux has been probed as a function of electron temperature gradient on the DIII-D tokamak, in a continuing effort to validate turbulent transport models. In the scan of gradient, a critical electron temperature gradient has been found in the electron heat fluxes and stiffness at various radii in L-mode plasmas. The TGLF reduced turbulent transport model [G.M. Staebler et al., Phys. Plasmas 14, 055909 (2007)] and full gyrokinetic GYRO model [J. Candy and R.E. Waltz, J. Comput. Phys. 186, 545 (2003)] recover the general trend of increasing electron energy flux with increasing electron temperature gradient scale length, but they do not predict the absolute level of transport at all radii and gradients. Comparing the experimental observations of incremental (heat pulse) diffusivity and stiffness to the models' predictions reveals that TGLF reproduces the trends in increasing diffusivity and stiffness with increasing electron temperature gradient scale length with a critical gradient behavior. Furthermore, the critical gradient of TGLF is found to have a dependence on q95, contrary to the independence of the experimental critical gradient from q95.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Shi, Lei
Magnetic confinement fusion is one of the most promising approaches to achieve fusion energy. With the rapid increase of computational power over the past decades, numerical simulations have become an important tool to study fusion plasmas. Eventually, the numerical models will be used to predict the performance of future devices, such as the International Thermonuclear Experimental Reactor (ITER) or DEMO. However, the reliability of these models needs to be carefully validated against experiments before the results can be trusted. The validation between simulations and measurements is hard particularly because the quantities directly available from both sides are different. While the simulations have the information of the plasma quantities calculated explicitly, the measurements are usually in the form of diagnostic signals. The traditional way of making the comparison relies on the diagnosticians to interpret the measured signals as plasma quantities. The interpretation is in general very complicated and sometimes not even unique. In contrast, given the plasma quantities from the plasma simulations, we can unambiguously calculate the generation and propagation of the diagnostic signals. These calculations are called synthetic diagnostics, and they enable an alternate way to compare the simulation results with the measurements. In this dissertation, we present a platform for developing and applying synthetic diagnostic codes. Three diagnostics on the platform are introduced. The reflectometry and beam emission spectroscopy diagnostics measure the electron density, and the electron cyclotron emission diagnostic measures the electron temperature. The theoretical derivation and numerical implementation of a new two dimensional Electron cyclotron Emission Imaging code is discussed in detail. This new code has shown the potential to address many challenging aspects of the present ECE measurements, such as runaway electron effects, and detection of the cross phase between the electron temperature and density fluctuations.
White, A E; Schmitz, L; Peebles, W A; Carter, T A; Rhodes, T L; Doyle, E J; Gourdain, P A; Hillesheim, J C; Wang, G; Holland, C; Tynan, G R; Austin, M E; McKee, G R; Shafer, M W; Burrell, K H; Candy, J; DeBoo, J C; Prater, R; Staebler, G M; Waltz, R E; Makowski, M A
2008-10-01
A correlation electron cyclotron emission (CECE) diagnostic has been used to measure local, turbulent fluctuations of the electron temperature in the core of DIII-D plasmas. This paper describes the hardware and testing of the CECE diagnostic and highlights the importance of measurements of multifield fluctuation profiles for the testing and validation of nonlinear gyrokinetic codes. The process of testing and validating such codes is critical for extrapolation to next-step fusion devices. For the first time, the radial profiles of electron temperature and density fluctuations are compared to nonlinear gyrokinetic simulations. The CECE diagnostic at DIII-D uses correlation radiometry to measure the rms amplitude and spectrum of the electron temperature fluctuations. Gaussian optics are used to produce a poloidal spot size with w(o) approximately 1.75 cm in the plasma. The intermediate frequency filters and the natural linewidth of the EC emission determine the radial resolution of the CECE diagnostic, which can be less than 1 cm. Wavenumbers resolved by the CECE diagnostic are k(theta) < or = 1.8 cm(-1) and k(r) < or = 4 cm(-1), relevant for studies of long-wavelength turbulence associated with the trapped electron mode and the ion temperature gradient mode. In neutral beam heated L-mode plasmas, core electron temperature fluctuations in the region 0.5 < r/a < 0.9, increase with radius from approximately 0.5% to approximately 2%, similar to density fluctuations that are measured simultaneously with beam emission spectroscopy. After incorporating "synthetic diagnostics" to effectively filter the code output, the simulations reproduce the characteristics of the turbulence and transport at one radial location r/a = 0.5, but not at a second location, r/a = 0.75. These results illustrate that measurements of the profiles of multiple fluctuating fields can provide a significant constraint on the turbulence models employed by the code.
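The correlation-radiometry estimate at the heart of the CECE technique can be sketched as follows: two closely spaced spectral channels see the same temperature fluctuation but statistically independent thermal noise, so their cross-covariance isolates the correlated fluctuation amplitude. The signals below are synthetic placeholders with an assumed 1% fluctuation level and arbitrary noise amplitude.

    # Cross-covariance of two radiometer channels rejects uncorrelated noise.
    import numpy as np

    rng = np.random.default_rng(2)
    n_samples, te0 = 2_000_000, 1.0                              # mean Te (arbitrary units)
    te_fluct = 0.01 * te0 * rng.standard_normal(n_samples)       # assumed 1% fluctuation
    s1 = te0 + te_fluct + 0.2 * rng.standard_normal(n_samples)   # channel 1: fluctuation + noise
    s2 = te0 + te_fluct + 0.2 * rng.standard_normal(n_samples)   # channel 2: same fluctuation, independent noise

    cross_cov = np.mean((s1 - s1.mean()) * (s2 - s2.mean()))
    print(f"estimated Te_rms/Te = {100 * np.sqrt(max(cross_cov, 0.0)) / te0:.2f}%")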
NASA Astrophysics Data System (ADS)
Wayson, Michael B.; Bolch, Wesley E.
2018-04-01
Various computational tools are currently available that facilitate patient organ dosimetry in diagnostic nuclear medicine, yet they are typically restricted to reporting organ doses to ICRP-defined reference phantoms. The present study, while remaining computational phantom based, provides straightforward tools to adjust reference phantom organ dose for both internal photon and electron sources. A wide variety of monoenergetic specific absorbed fractions were computed using radiation transport simulations for tissue spheres of varying size and separation distance. Scaling methods were then constructed for both photon and electron self-dose and cross-dose, with data validation provided from patient-specific voxel phantom simulations, as well as via comparison to the scaling methodology given in MIRD Pamphlet No. 11. Photon and electron self-dose was found to be dependent on both radiation energy and sphere size. Photon cross-dose was found to be mostly independent of sphere size. Electron cross-dose was found to be dependent on sphere size when the spheres were in close proximity, owing to differences in electron range. The validation studies showed that this dataset was more effective than the MIRD 11 method at predicting patient-specific photon doses at both high and low energies, but gave similar results at photon energies between 100 keV and 1 MeV. The MIRD 11 method for electron self-dose scaling was accurate for lower energies but began to break down at higher energies. The photon cross-dose scaling methodology developed in this study showed gains in accuracy of up to 9% for actual patient studies, and the electron cross-dose scaling methodology showed gains in accuracy up to 9% as well when only the bremsstrahlung component of the cross-dose was scaled. These dose scaling methods are readily available for incorporation into internal dosimetry software for diagnostic phantom-based organ dosimetry.
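As a point of reference for the self-dose scaling, the simplest limit (the MIRD-style assumption that electron emissions are fully absorbed in the source region, which the study shows breaks down at higher energies) reduces to a pure mass scaling:

    % Full-absorption limit: specific absorbed fraction is the reciprocal of the
    % source-region mass, so the reference self-dose S-value scales by mass ratio.
    \Phi(r \leftarrow r) = \frac{1}{m_r}
    \quad\Longrightarrow\quad
    S_{\mathrm{patient}}(r \leftarrow r) \;\approx\; S_{\mathrm{ref}}(r \leftarrow r)\,
    \frac{m_{r,\mathrm{ref}}}{m_{r,\mathrm{patient}}}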
Validating an electronic health literacy scale in an older hispanic population.
Aponte, Judith; Nokes, Kathleen M
2017-09-01
To examine the validity of the Spanish version of an instrument used to measure electronic health literacy (eHEALS) with an older Hispanic population from a number of Spanish-language countries living in New York City in the United States (US). Although the Internet is available globally, complex skills are needed to use this source of valuable health-related information effectively. Electronic health literacy is a multifactorial concept that includes health literacy but also requires technology skills. Cross-sectional. Recruitment occurred at a Senior Organization located in a largely Hispanic neighbourhood in New York City (N = 100). Participants completed eHEALS and selected items from the Health Information National Trends Survey (HINTS) which assesses how adults use different communication channels, including the Internet, to obtain vital health information. Data from the US HINTS sample (N = 162) were matched to the Senior Organization sample on age range and Hispanic ethnicity. The average Senior Organization participant was 68 years old, female, born in one of six different Spanish-language countries, and completed high school while the average HINTS participant was 67 years old, female and had high school or less education. Although there was no relationship with the two HINTS subscales and electronic health literacy, there were significant relationships between electronic health literacy and health status and confidence in self-care. Inadequate electronic health literacy is a barrier to positive health outcomes. The Spanish version of eHEALS could be used as a screening instrument to identify gaps and tailored interventions could be developed to increase consumer confidence in using the Internet for reliable health-related information. Knowledge in self-management is related to positive health outcomes; all persons irrespective of their electronic health literacy should be able to use all sources of health information to enhance their self-care. © 2017 John Wiley & Sons Ltd.
Wayson, Michael B; Bolch, Wesley E
2018-04-13
Various computational tools are currently available that facilitate patient organ dosimetry in diagnostic nuclear medicine, yet they are typically restricted to reporting organ doses to ICRP-defined reference phantoms. The present study, while remaining computational phantom based, provides straightforward tools to adjust reference phantom organ dose for both internal photon and electron sources. A wide variety of monoenergetic specific absorbed fractions were computed using radiation transport simulations for tissue spheres of varying size and separation distance. Scaling methods were then constructed for both photon and electron self-dose and cross-dose, with data validation provided from patient-specific voxel phantom simulations, as well as via comparison to the scaling methodology given in MIRD Pamphlet No. 11. Photon and electron self-dose was found to be dependent on both radiation energy and sphere size. Photon cross-dose was found to be mostly independent of sphere size. Electron cross-dose was found to be dependent on sphere size when the spheres were in close proximity, owing to differences in electron range. The validation studies showed that this dataset was more effective than the MIRD 11 method at predicting patient-specific photon doses at both high and low energies, but gave similar results at photon energies between 100 keV and 1 MeV. The MIRD 11 method for electron self-dose scaling was accurate for lower energies but began to break down at higher energies. The photon cross-dose scaling methodology developed in this study showed gains in accuracy of up to 9% for actual patient studies, and the electron cross-dose scaling methodology showed gains in accuracy up to 9% as well when only the bremsstrahlung component of the cross-dose was scaled. These dose scaling methods are readily available for incorporation into internal dosimetry software for diagnostic phantom-based organ dosimetry.
Progress & Frontiers in PV Performance
DOE Office of Scientific and Technical Information (OSTI.GOV)
Deline, Chris; DiOrio, Nick; Jordan, Dirk
2016-09-12
PowerPoint slides for a presentation given at Solar Power International 2016. Presentation includes System Advisor Model (SAM) introduction and battery modeling, bifacial PV modules and modeling, shade modeling and module level power electronics (MLPE), degradation rates, and PVWatts updates and validation.
DOT National Transportation Integrated Search
2009-03-20
International Electronic Machines Corporation (IEM) has developed and is now marketing a state-of-the-art Wheel Inspection System Environment (WISE). WISE provides wheel profile and dimensional measurements, i.e. rim thickness, flange height, flange ...
Power generator driven by Maxwell's demon
NASA Astrophysics Data System (ADS)
Chida, Kensaku; Desai, Samarth; Nishiguchi, Katsuhiko; Fujiwara, Akira
2017-05-01
Maxwell's demon is an imaginary entity that reduces the entropy of a system and generates free energy in the system. About 150 years after its proposal, theoretical studies explained the physical validity of Maxwell's demon in the context of information thermodynamics, and there have been successful experimental demonstrations of energy generation by the demon. The demon's next task is to convert the generated free energy to work that acts on the surroundings. Here, we demonstrate that Maxwell's demon can generate and output electric current and power with individual randomly moving electrons in small transistors. Real-time monitoring of electron motion shows that the two transistors function as gates that control an electron's trajectory so that the electron moves directionally. A numerical calculation reveals that power generation is increased by miniaturizing the room in which the electrons are partitioned. These results suggest that evolving transistor-miniaturization technology can increase the demon's power output.
Inductive voltage adder (IVA) for submillimeter radius electron beam
DOE Office of Scientific and Technical Information (OSTI.GOV)
Mazarakis, M.G.; Poukey, J.W.; Maenchen, J.E.
The authors have already demonstrated the utility of inductive voltage adder accelerators for production of small-size electron beams. In this approach, the inductive voltage adder drives a magnetically immersed foilless diode to produce high-energy (10-20 MeV), high-brightness pencil electron beams. This concept was first demonstrated with the successful experiments which converted the linear induction accelerator RADLAC II into an IVA fitted with a small 1-cm radius cathode magnetically immersed foilless diode (RADLAC II/SMILE). They present here the first validations of extending this idea to mm-scale electron beams using the SABRE and HERMES-III inductive voltage adders as test beds. The SABRE experiments are already completed and have produced 30-kA, 9-MeV electron beams with an envelope diameter of 1.5 mm FWHM. The HERMES-III experiments are currently underway.
NASA Astrophysics Data System (ADS)
Wang, RuLin; Zheng, Xiao; Kwok, YanHo; Xie, Hang; Chen, GuanHua; Yam, ChiYung
2015-04-01
Understanding electronic dynamics on material surfaces is fundamentally important for applications including nanoelectronics, inhomogeneous catalysis, and photovoltaics. Practical approaches based on time-dependent density functional theory for open systems have been developed to characterize the dissipative dynamics of electrons in bulk materials. The accuracy and reliability of such approaches depend critically on how the electronic structure and memory effects of surrounding material environment are accounted for. In this work, we develop a novel squared-Lorentzian decomposition scheme, which preserves the positive semi-definiteness of the environment spectral matrix. The resulting electronic dynamics is guaranteed to be both accurate and convergent even in the long-time limit. The long-time stability of electronic dynamics simulation is thus greatly improved within the current decomposition scheme. The validity and usefulness of our new approach are exemplified via two prototypical model systems: quasi-one-dimensional atomic chains and two-dimensional bilayer graphene.
NASA Astrophysics Data System (ADS)
Zhou, Chen; Lei, Yong; Li, Bofeng; An, Jiachun; Zhu, Peng; Jiang, Chunhua; Zhao, Zhengyu; Zhang, Yuannong; Ni, Binbin; Wang, Zemin; Zhou, Xuhua
2015-12-01
Global Positioning System (GPS) computerized ionosphere tomography (CIT) and ionospheric sky wave ground backscatter radar are both capable of measuring the large-scale, two-dimensional (2-D) distributions of ionospheric electron density (IED). Here we report the spatial and temporal electron density results obtained by GPS CIT and backscatter ionogram (BSI) inversion for three individual experiments. Both the GPS CIT and BSI inversion techniques demonstrate the capability and the consistency of reconstructing large-scale IED distributions. To validate the results, electron density profiles obtained from GPS CIT and BSI inversion are quantitatively compared to the vertical ionosonde data, which clearly manifests that both methods output accurate information on ionospheric electron density and thereby provide reliable approaches to ionospheric soundings. Our study can improve current understanding of the capability and insufficiency of these two methods on the large-scale IED reconstruction.
Johnston-Peck, Aaron C; Winterstein, Jonathan P; Roberts, Alan D; DuChene, Joseph S; Qian, Kun; Sweeny, Brendan C; Wei, Wei David; Sharma, Renu; Stach, Eric A; Herzing, Andrew A
2016-03-01
Low-angle annular dark field (LAADF) scanning transmission electron microscopy (STEM) imaging is presented as a method that is sensitive to the oxidation state of cerium ions in CeO2 nanoparticles. This relationship was validated through electron energy loss spectroscopy (EELS), in situ measurements, and multislice image simulations. Static displacements caused by the increased ionic radius of Ce(3+) influence the electron channeling process and increase electron scattering to low angles while reducing scatter to high angles. This process manifests itself by reducing the high-angle annular dark field (HAADF) signal intensity while increasing the LAADF signal intensity in close proximity to Ce(3+) ions. This technique can supplement STEM-EELS and, in so doing, relax the experimental challenges associated with acquiring oxidation state information at high spatial resolutions. Published by Elsevier B.V.
NASA Technical Reports Server (NTRS)
LeBel, Kenneth A.; Poivey, Christian; Barth, Janet L.
2003-01-01
This viewgraph presentation presents an overview of the use of in-flight science data to review the radiation effects on commercial off the shelf (COTS) electronics used in recent spacecraft missions. The authors review the hazards that the space radiation environment poses for spacecraft electronics. They specifically discuss long-term effects such as total ionizing dose (TID) and short-term effects like single particle events (SEE). The advantages of using COTS electronics despite not being radiation hardened are mentioned. The reasons cited for tracking in-flight performance of COTS electronics include: anomaly resolution, validation of ground tests and environmental predictions, and lessons for future designers. Sample radiation impacts on science data from the following missions are analyzed: SOHO/LASCO 3 Coronagraph, Microwave Anisotropy Probe, Hubble Space Telescope and Chandra X-Ray Observatory.
Bogdan Neculaes, V.; Zou, Yun; Zavodszky, Peter; Inzinna, Louis; Zhang, Xi; Conway, Kenneth; Caiafa, Antonio; Frutschy, Kristopher; Waters, William; De Man, Bruno
2014-01-01
A novel electron beam focusing scheme for medical X-ray sources is described in this paper. Most vacuum based medical X-ray sources today employ a tungsten filament operated in temperature limited regime, with electrostatic focusing tabs for limited range beam optics. This paper presents the electron beam optics designed for the first distributed X-ray source in the world for Computed Tomography (CT) applications. This distributed source includes 32 electron beamlets in a common vacuum chamber, with 32 circular dispenser cathodes operated in space charge limited regime, where the initial circular beam is transformed into an elliptical beam before being collected at the anode. The electron beam optics designed and validated here are at the heart of the first Inverse Geometry CT system, with potential benefits in terms of improved image quality and dramatic X-ray dose reduction for the patient. PMID:24826066
STOPP/START Medication Criteria Modified for US Nursing Home Setting
Khodyakov, Dmitry; Ochoa, Aileen; Olivieri-Mui, Brianne L.; Bouwmeester, Carla; Zarowitz, Barbara J.; Patel, Meenakshi; Ching, Diana; Briesacher, Becky
2016-01-01
STRUCTURED ABSTRACT BACKGROUND/OBJECTIVES A barrier to assessing the quality of prescribing in nursing homes (NH) is the lack of explicit criteria for this setting. Our objective was to develop a set of prescribing indicators measurable with available data from electronic nursing home databases by adapting the European-based 2014 STOPP/START criteria of potentially inappropriate and underused medications for the US setting. DESIGN A two-stage expert panel process. In the first stage, the investigator team reviewed 114 criteria for compatibility and measurability. In the second stage, we convened an online modified e-Delphi (OMD) panel to rate the validity of the criteria and two webinars to identify the criteria with the highest relevance to US NHs. PARTICIPANTS Seventeen experts with recognized reputations in NH care participated in the e-Delphi panel and 12 in the webinar. MEASUREMENTS Compatibility and measurability were assessed by comparing criteria to US terminology/setting standards and data elements in NH databases. Validity was rated on a 9-point Likert-type scale (1=not valid at all, 9=highly valid). Mean, median, interpercentile ranges, and agreement were determined for each criterion score. Relevance was determined by ranking the mean panel ratings on criteria that reached agreement; the half of the criteria with the highest mean values were reviewed and approved by the webinar participants. RESULTS Fifty-three STOPP/START criteria were deemed compatible with the US setting and measurable using data from electronic NH databases. E-Delphi panelists rated 48 criteria as valid for US NHs. Twenty-four criteria were deemed most relevant, consisting of 22 measures of potentially inappropriate medications and 2 measures of underused medications. CONCLUSION This study created the first explicit criteria for assessing the quality of prescribing in US NHs. PMID:28008599
A new e-beam application in the pharmaceutical industry
NASA Astrophysics Data System (ADS)
Sadat, Theo; Malcolm, Fiona
2005-10-01
The paper presents a new electron beam application in the pharmaceutical industry: an in-line self-shielded aseptic transfer system using electron beam for surface decontamination of products entering a pharmaceutical filling line. The unit was developed by Linac Technologies in response to the specifications of a multi-national pharmaceutical company, to solve the risk of microbial contamination entering a filling line housed inside an isolator. In order to fit the sterilization unit inside the pharmaceutical plant, a "miniature" low-energy (200 keV) electron beam accelerator and e-beam tunnel were designed, all conforming to the pharmaceutical good manufacturing practice (GMP) regulations. Process validation using biological indicators is described, with reference to the regulations governing the pharmaceutical industry. Other industrial applications of a small-sized self-shielded electron beam sterilization unit are mentioned.
Population kinetics on K alpha lines of partially ionized Cl atoms.
Kawamura, Tohru; Nishimura, Hiroaki; Koike, Fumihiro; Ochi, Yoshihiro; Matsui, Ryoji; Miao, Wen Yong; Okihara, Shinichiro; Sakabe, Shuji; Uschmann, Ingo; Förster, Eckhart; Mima, Kunioki
2002-07-01
A population kinetics code was developed to analyze K alpha emission from partially ionized chlorine atoms in hydrocarbon plasmas. Atomic processes are solved under collisional-radiative equilibrium for two-temperature plasmas. It is shown that the fast electrons dominantly contribute to ionize the K-shell bound electrons (i.e., inner-shell ionization) and the cold electrons to the outer-shell bound ones. Ratios of K alpha lines of partially ionized atoms are presented as a function of cold-electron temperature. The model was validated by observation of the K alpha lines from a chlorinated plastic target irradiated with 1 TW Ti:sapphire laser pulses at 1.5 x 10(17) W/cm(2), inferring a plasma temperature of about 100 eV on the target surface.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Breuer, Marian; Zarzycki, Piotr P.; Shi, Liang
2012-12-01
The free energy profile for electron flow through the bacterial deca-heme cytochrome MtrF has been computed using thermodynamic integration and classical molecular dynamics. The extensive calculations on two versions of the structure help validate the method and results, because differences in the profiles can be related to differences in the charged amino acids local to specific heme groups. First estimates of reorganization free energies λ yield a range consistent with expectations for partially solvent exposed cofactors, and reveal an activation energy range surmountable for electron flow. Future work will aim at increasing the accuracy of λ with polarizable force field dynamics and quantum chemical energy gap calculations, as well as quantum chemical computation of electronic coupling matrix elements.
NASA Astrophysics Data System (ADS)
Winter, Thomas G.; Alston, Steven G.
1992-02-01
Cross sections have been determined for electron transfer and ionization in collisions between protons and He+ ions at proton energies from several hundred kilo-electron-volts to 2 MeV. A coupled-Sturmian approach is taken, extending the work of Winter [Phys. Rev. A 35, 3799 (1987)] and Stodden et al. [Phys. Rev. A 41, 1281 (1990)] to high energies where perturbative approaches are expected to be valid. An explicit connection is made with the first-order Born approximation for ionization and the impulse version of the distorted, strong-potential Born approximation for electron transfer. The capture cross section is shown to be affected by the presence of target basis functions of positive energy near v²/2, corresponding to the Thomas mechanism.
Development of a PDXP platform on NIF
NASA Astrophysics Data System (ADS)
Whitley, Heather; Schneider, Marilyn; Garbett, Warren; Pino, Jesse; Shepherd, Ronnie; Brown, Colin; Castor, John; Scott, Howard; Ellison, C. Leland; Benedict, Lorin; Sio, Hong; Lahmann, Brandon; Petrasso, Richard; Graziani, Frank
2016-10-01
Over the past several years, we have conducted theoretical investigations of electron-ion coupling and electronic transport in plasmas. In the regime of weakly coupled plasmas, we have identified models that we believe describe the physics well, but experimental measurements are still needed to validate the models. We are developing spectroscopic experiments to study electron-ion equilibration and electron heat transport using a polar direct drive exploding pusher (PDXP) platform at the National Ignition Facility (NIF). Initial measurements are focused on characterizing the laser-target coupling, symmetry of the PDXP implosion, and overall neutron and x-ray signals. We present images from the first set of shots and make comparisons with simulations from ARES and discuss next steps in the platform development. Prepared by LLNL under Contract DE-AC52-07NA27344. LLNL-ABS-697489.
Fundamental edge broadening effects during focused electron beam induced nanosynthesis
Schmied, Roland; Fowlkes, Jason Davidson; Winkler, Robert; ...
2015-02-16
In this study, we explore lateral broadening effects of 3D structures fabricated through focused electron beam induced deposition using the MeCpPt(IV)Me3 precursor. In particular, the scaling behavior of proximity effects as a function of the primary electron energy and the deposit height is investigated through experiments and validated through simulations. Correlated Kelvin force microscopy and conductive atomic force microscopy measurements identified conductive and non-conductive proximity regions. It was determined that the highest primary electron energies enable the highest edge sharpness while lower energies contain a complex convolution of broadening effects. In addition, it is demonstrated that intermediate energies lead to even more complex proximity effects that significantly reduce lateral edge sharpness and thus should be avoided if desiring high lateral resolution.
2008-01-01
...microscopy (AEM), to characterize a variety of III-V semiconductor thin films. The materials investigated include superlattices based on the InAs-GaSb...technique. TEM observations were performed using a Philips-CM 200 FEG transmission electron microscope equipped with a field emission gun, operated at an...
2015-02-01
...the impact of an electronic innovation must include a description of the sociotechnical context as well as the process and outcome metrics for...dissemination, will have a positive effect on nursing knowledge, use of evidence-based practices, and the achievement of nurse-sensitive patient outcomes...
DOE Office of Scientific and Technical Information (OSTI.GOV)
Bacon, L. D.
Hybrid Band™ (H-band) is a Lockheed Martin Missiles and Fire Control (LMMFC) designation for a specific RF modulation that causes disruption of select electronic components and circuits. H-Band enables conventional high-power microwave (HPM) effects (with a center frequency of 1 to 2 GHz, for example) using a higher frequency carrier signal. The primary technical objective of this project was to understand the fundamental physics of Hybrid Band™ Radio Frequency effects on electronic systems. The follow-on objective was to develop and validate a Hybrid Band™ effects analysis process.
On the lattice dynamics of metallic hydrogen and other Coulomb systems
NASA Technical Reports Server (NTRS)
Beck, H.; Straus, D.
1975-01-01
Numerical results for the phonon spectra of metallic hydrogen and other Coulomb systems in cubic lattices are presented. In second order in the electron-ion interaction, the behavior of the dielectric function of the interacting electron gas for arguments around the second Fermi harmonic leads to drastic Kohn anomalies and even to imaginary phonon frequencies. Third-order band-structure corrections are also calculated. Properties of self-consistent phonons and the validity of the adiabatic approximation are discussed.
Three-Jet Production in Electron-Positron Collisions at Next-to-Next-to-Leading Order Accuracy
NASA Astrophysics Data System (ADS)
Del Duca, Vittorio; Duhr, Claude; Kardos, Adam; Somogyi, Gábor; Trócsányi, Zoltán
2016-10-01
We introduce a completely local subtraction method for fully differential predictions at next-to-next-to-leading order (NNLO) accuracy for jet cross sections and use it to compute event shapes in three-jet production in electron-positron collisions. We validate our method on two event shapes, thrust and C parameter, which are already known in the literature at NNLO accuracy and compute for the first time oblateness and the energy-energy correlation at the same accuracy.
Widlansky, Michael E.; Wang, Jingli; Shenouda, Sherene M.; Hagen, Tory M.; Smith, Anthony R.; Kizhakekuttu, Tinoy J.; Kluge, Matthew A.; Weihrauch, Dorothee; Gutterman, David D.; Vita, Joseph A.
2010-01-01
Mitochondrial membrane hyperpolarization and morphological changes are important in inflammatory cell activation. Despite the pathophysiological relevance, no valid and reproducible method for measuring mitochondrial homeostasis in human inflammatory cells is currently available. This study's purpose was to define and validate reproducible methods for measuring relevant mitochondrial perturbations and to determine whether these methods could discern mitochondrial perturbations in type 2 diabetes mellitus (T2DM), a condition associated with altered mitochondrial homeostasis. We employed 5,5',6,6'-tetrachloro-1,1',3,3'-tetraethylbenzimidazolylcarbocyanine (JC-1) to estimate mitochondrial membrane potential (ψm) and acridine orange 10-nonyl bromide (NAO) to assess mitochondrial mass in human mononuclear cells isolated from blood. Both assays were reproducible. We validated our findings by electron microscopy and pharmacological manipulation of ψm. We measured JC-1 and NAO fluorescence in the mononuclear cells of 27 T2DM patients and 32 controls. Mitochondria were more polarized (P=0.02) and mitochondrial mass was lower in T2DM (P=0.008). Electron microscopy demonstrated that diabetic mitochondria were smaller, more spherical, and occupied less cellular area in T2DM. Mitochondrial superoxide production was higher in T2DM (P=0.01). Valid and reproducible measurements of mitochondrial homeostasis can be made in human mononuclear cells using these fluorophores. Further, potential clinically relevant perturbations in mitochondrial homeostasis in T2DM human mononuclear cells can be detected. PMID:20621033
Cost-effectiveness of electronic training in domestic violence risk assessment: ODARA 101.
Hilton, N Zoe; Ham, Elke
2015-03-01
The need for domestic violence training has increased with the development of evidence-based risk assessment tools, which must be scored correctly for valid application. Emerging research indicates that training in domestic violence risk assessment can increase scoring accuracy, but despite the increasing popularity of electronic training, it is not yet known whether it can be an effective method of risk assessment training. In the present study, 87 assessors from various professions had training in the Ontario Domestic Assault Risk Assessment either face-to-face or using an electronic training program. The two conditions were equally effective, as measured by performance on a post-training skill acquisition test. Completion rates were 100% for face-to-face and 86% for electronic training, an improvement over a previously evaluated manual-only condition. The estimated per-trainee cost of electronic training was one third that of face-to-face training and expected to decrease. More rigorous evaluations of electronic training for risk assessment are recommended. © The Author(s) 2014.
Computational Study of Primary Electrons in the Cusp Region of an Ion Engine's Discharge Chamber
NASA Technical Reports Server (NTRS)
Stueber, Thomas J. (Technical Monitor); Deshpande, Shirin S.; Mahalingam, Sudhakar; Menart, James A.
2004-01-01
In this work a computer code called PRIMA is used to study the motion of primary electrons in the magnetic cusp region of the discharge chamber of an ion engine. Even though the amount of wall area covered by the cusps is very small, the cusp regions are important because prior computational analyses have indicated that most primary electrons leave the discharge chamber through the cusps. The analysis presented here focuses on the cusp region only. The effects of the shape and size of the cusp region on primary electron travel are studied, as well as the angle and location at which the electron enters the cusp region. These effects are quantified using the confinement length and the number density distributions of the primary electrons. In addition, the results from PRIMA are compared to experimental results for a cylindrical discharge chamber with two magnetic rings. These comparisons indicate the validity of the computer code called PRIMA.
Edge roughness evaluation method for quantifying at-size beam blur in electron-beam lithography
NASA Astrophysics Data System (ADS)
Yoshizawa, Masaki; Moriya, Shigeru
2000-07-01
At-size beam blur at any given pattern size of an electron beam (EB) direct writer, HL800D, was quantified using the new edge roughness evaluation (ERE) method to optimize the electron-optical system. We characterized the two-dimensional beam-blur dependence on the electron deflection length of the EB direct writer. The results indicate that the beam blur ranged from 45 nm to 56 nm in a deflection field 2520 µm square. The new ERE method is based on the experimental finding that the line edge roughness of a resist pattern is inversely proportional to the slope of the Gaussian-distributed quasi-beam-profile (QBP) proposed in this paper. The QBP includes effects of the beam blur, electron forward scattering, acid diffusion in chemically amplified resist (CAR), the development process, and aperture mask quality. The application of the ERE method to investigating the beam-blur fluctuation demonstrates the validity of the ERE method in characterizing the electron-optical column conditions of EB projections such as SCALPEL and PREVAIL.
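The ERE premise above, that resist line-edge roughness varies inversely with the edge slope of a Gaussian-blurred exposure profile, can be illustrated with a small fit. The sketch below (Python, not from the paper) models a measured edge profile as an error function and extracts the blur sigma; the synthetic data, noise level, and starting guesses are assumptions for illustration only.

    import numpy as np
    from scipy.special import erf
    from scipy.optimize import curve_fit

    def edge_profile(x, x0, sigma, lo, hi):
        """Step edge convolved with a Gaussian beam of standard deviation sigma."""
        return lo + (hi - lo) * 0.5 * (1.0 + erf((x - x0) / (sigma * np.sqrt(2.0))))

    # Hypothetical 'measured' edge: 50 nm blur plus noise (a stand-in for real data).
    x = np.linspace(-200.0, 200.0, 201)            # position across the edge, nm
    rng = np.random.default_rng(0)
    measured = edge_profile(x, 0.0, 50.0, 0.05, 0.95) + rng.normal(0.0, 0.01, x.size)

    # Fit the erf model; the fitted sigma is the at-size beam-blur estimate.
    (x0, sigma, lo, hi), _ = curve_fit(edge_profile, x, measured, p0=(0.0, 30.0, 0.0, 1.0))
    print(f"estimated beam blur sigma = {sigma:.1f} nm")

    # The maximum edge slope of the fitted profile is (hi - lo) / (sigma * sqrt(2 * pi)),
    # so a rougher (lower-slope) edge implies a larger blur, as the ERE method assumes.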
DOE Office of Scientific and Technical Information (OSTI.GOV)
Bajaj, Sanyam, E-mail: bajaj.10@osu.edu; Shoron, Omor F.; Park, Pil Sung
We report on the direct measurement of two-dimensional sheet charge density dependence of electron transport in AlGaN/GaN high electron mobility transistors (HEMTs). Pulsed IV measurements established increasing electron velocities with decreasing sheet charge densities, resulting in a saturation velocity of 1.9 × 10⁷ cm/s at a low sheet charge density of 7.8 × 10¹¹ cm⁻². An optical phonon emission-based electron velocity model for GaN is also presented. It accommodates stimulated longitudinal optical (LO) phonon emission which clamps the electron velocity with strong electron-phonon interaction and long LO phonon lifetime in GaN. A comparison with the measured density-dependent saturation velocity shows that it captures the dependence rather well. Finally, the experimental result is applied in a TCAD-based device simulator to predict DC and small signal characteristics of a reported GaN HEMT. Good agreement between the simulated and reported experimental results validated the measurement presented in this report and established accurate modeling of GaN HEMTs.
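As a rough, back-of-the-envelope cross-check of the phonon-emission picture above (this is not the authors' model), the ballistic limit v = sqrt(2*hbar*omega_LO/m*) gives the velocity an electron reaches just before emitting one LO phonon. The short calculation below uses commonly quoted GaN parameters (an LO phonon energy near 92 meV and an effective mass near 0.2 m0); both numbers are assumptions, and the estimate deliberately ignores the phonon-lifetime and carrier-density effects that bring the measured value down to about 1.9 × 10⁷ cm/s.

    import math

    E_LO  = 0.092 * 1.602e-19   # LO phonon energy in GaN, J (assumed ~92 meV)
    m_eff = 0.20 * 9.109e-31    # electron effective mass, kg (assumed 0.2 m0)

    # Velocity gained from the field before a single LO phonon emission resets it.
    v_ballistic = math.sqrt(2.0 * E_LO / m_eff)   # m/s
    print(f"ballistic LO-phonon-limited velocity ~ {v_ballistic * 1e2:.2e} cm/s")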
Dissipative time-dependent quantum transport theory.
Zhang, Yu; Yam, Chi Yung; Chen, GuanHua
2013-04-28
A dissipative time-dependent quantum transport theory is developed to treat the transient current through molecular or nanoscopic devices in the presence of electron-phonon interaction. The dissipation via phonons is taken into account by introducing a self-energy for the electron-phonon coupling in addition to the self-energy caused by the electrodes. Based on this, a numerical method is proposed. For practical implementation, the lowest order expansion is employed for the weak electron-phonon coupling case and the wide-band limit approximation is adopted for the device-electrode coupling. The corresponding hierarchical equation of motion is derived, which leads to an efficient and accurate time-dependent treatment of the inelastic effect on transport for weak electron-phonon interaction. The resulting method is applied to a one-level model system and a gold wire described by a tight-binding model to demonstrate its validity and the importance of electron-phonon interaction for quantum transport. As it is based on the effective single-electron model, the method can be readily extended to time-dependent density functional theory.
November 2013 Analysis of High Energy Electrons on the Japan Experimental Module (JEM: Kibo)
NASA Technical Reports Server (NTRS)
Badavi, Francis F.; Matsumoto, Haruhisa; Koga, Kiyokazu; Mertens, Christopher J.; Slaba, Tony C.; Norbury, John W.
2015-01-01
Albedo (precipitating/splash) electrons, created by galactic cosmic ray (GCR) interactions with the upper atmosphere, move upwards away from the surface of the earth. In past validation work these particles were often considered to make a negligible contribution to astronaut radiation exposure on the International Space Station (ISS). Estimates of astronaut exposure based on the available Computer Aided Design (CAD) models of ISS consistently underestimated measurements onboard ISS when the contribution of albedo particles to exposure was neglected. Recent measurements of high energy electrons outside the ISS Japan Experimental Module (JEM) using the Exposed Facility (EF), Space Environment Data Acquisition Equipment - Attached Payload (SEDA-AP) and Standard DOse Monitor (SDOM) indicate the presence of high energy electrons at ISS altitude. In this presentation the status of these energetic electrons is reviewed and the mechanism for the creation of these particles inside/outside the South Atlantic Anomaly (SAA) region is explained. In addition, a limited dosimetric evaluation of these electrons at 600 MeV and 10 GeV is presented.
Coleman, Nathan; Halas, Gayle; Peeler, William; Casaclang, Natalie; Williamson, Tyler; Katz, Alan
2015-02-05
Electronic Medical Records (EMRs) are increasingly used in the provision of primary care and have been compiled into databases which can be utilized for surveillance, research and informing practice. The primary purpose of these records is for the provision of individual patient care; validation and examination of underlying limitations is crucial for use for research and data quality improvement. This study examines and describes the validity of chronic disease case definition algorithms and factors affecting data quality in a primary care EMR database. A retrospective chart audit of an age stratified random sample was used to validate and examine diagnostic algorithms applied to EMR data from the Manitoba Primary Care Research Network (MaPCReN), part of the Canadian Primary Care Sentinel Surveillance Network (CPCSSN). The presence of diabetes, hypertension, depression, osteoarthritis and chronic obstructive pulmonary disease (COPD) was determined by review of the medical record and compared to algorithm identified cases to identify discrepancies and describe the underlying contributing factors. The algorithm for diabetes had high sensitivity, specificity and positive predictive value (PPV) with all scores being over 90%. Specificities of the algorithms were greater than 90% for all conditions except for hypertension at 79.2%. The largest deficits in algorithm performance included poor PPV for COPD at 36.7% and limited sensitivity for COPD, depression and osteoarthritis at 72.0%, 73.3% and 63.2% respectively. Main sources of discrepancy included missing coding, alternative coding, inappropriate diagnosis detection based on medications used for alternate indications, inappropriate exclusion due to comorbidity and loss of data. Comparison to medical chart review shows that at MaPCReN the CPCSSN case finding algorithms are valid with a few limitations. This study provides the basis for the validated data to be utilized for research and informs users of its limitations. Analysis of underlying discrepancies provides the ability to improve algorithm performance and facilitate improved data quality.
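Validation statistics of the kind reported above reduce to a 2x2 comparison between the chart review (taken as the reference standard) and the algorithm. The sketch below (Python) computes sensitivity, specificity, PPV and NPV from such a table; the counts are hypothetical and are not MaPCReN data.

    def validation_metrics(tp, fp, fn, tn):
        """Chart review is the reference standard; the algorithm supplies the test result."""
        sensitivity = tp / (tp + fn)   # algorithm-positive among true cases
        specificity = tn / (tn + fp)   # algorithm-negative among true non-cases
        ppv = tp / (tp + fp)           # true cases among algorithm positives
        npv = tn / (tn + fn)           # true non-cases among algorithm negatives
        return sensitivity, specificity, ppv, npv

    # Hypothetical audit of 500 charts for one condition (illustrative numbers only).
    sens, spec, ppv, npv = validation_metrics(tp=90, fp=10, fn=15, tn=385)
    print(f"sensitivity={sens:.1%} specificity={spec:.1%} PPV={ppv:.1%} NPV={npv:.1%}")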
Small Portable Analyzer Diagnostic Equipment (SPADE) Program -- Diagnostic Software Validation
1984-07-01
Electronic Equipment Electromagnetic Emission and Susceptibility Requirements for the Control of Electromagnetic Interference Electromagnetic...Significant efforts were expended to simulate spalling failures associated with naturally...
DoD STINFO Manager Training Course. Training Manual
1993-02-01
The Export Control Classification Number (ECCN) 2. Types of controls, e.g., COCOM 3. Requirements, such as: a. Country groups for which a validated...see Export Administration Act EAR - see Export Administration Regulations ECCN - Export Control Classification Number ELINT - Electronic...
19 CFR 132.18 - License for certain worsted wool fabric subject to tariff-rate quota.
Code of Federal Regulations, 2014 CFR
2014-04-01
... consumption (Customs Form 7501, column 34), or its electronic equivalent (see paragraph (c)(1) of this section... suits, suit-type jackets, or trousers, as required under these subheadings. (c) Validity of license—(1...
19 CFR 132.18 - License for certain worsted wool fabric subject to tariff-rate quota.
Code of Federal Regulations, 2013 CFR
2013-04-01
... consumption (Customs Form 7501, column 34), or its electronic equivalent (see paragraph (c)(1) of this section... suits, suit-type jackets, or trousers, as required under these subheadings. (c) Validity of license—(1...
19 CFR 132.18 - License for certain worsted wool fabric subject to tariff-rate quota.
Code of Federal Regulations, 2010 CFR
2010-04-01
... consumption (Customs Form 7501, column 34), or its electronic equivalent (see paragraph (c)(1) of this section... suits, suit-type jackets, or trousers, as required under these subheadings. (c) Validity of license—(1...
19 CFR 132.18 - License for certain worsted wool fabric subject to tariff-rate quota.
Code of Federal Regulations, 2012 CFR
2012-04-01
... consumption (Customs Form 7501, column 34), or its electronic equivalent (see paragraph (c)(1) of this section... suits, suit-type jackets, or trousers, as required under these subheadings. (c) Validity of license—(1...
19 CFR 132.18 - License for certain worsted wool fabric subject to tariff-rate quota.
Code of Federal Regulations, 2011 CFR
2011-04-01
... consumption (Customs Form 7501, column 34), or its electronic equivalent (see paragraph (c)(1) of this section... suits, suit-type jackets, or trousers, as required under these subheadings. (c) Validity of license—(1...
NASA Astrophysics Data System (ADS)
Sawlani, Kapil; Herzog, Joshua M.; Kwak, Joowon; Foster, John
2012-10-01
The electron energy distribution function (EEDF) plays a very important role in thruster efficiency, as it determines the various gas-phase reaction rates. In Hall thrusters, secondary electron emission derived from the interaction of energetic electrons with ceramic channel surfaces influences the overall shape of the EEDF as well as determines the potential difference between the plasma and wall. The role of secondary electrons in the discharge operation of Hall thrusters is poorly understood. Experimentally, determining this effect is even more taxing as the secondary electron yield (SEY) varies drastically based on many parameters such as incident electron energies, flux and impact angle, and also on surface properties such as temperature and roughness. The electron transport is also affected by the profile of the magnetic field, which is not uniform across the length of the accelerating channel. The goal of this work is to map out the variation of the EEDF and potential profile in response to the controlled introduction of secondary electrons. These data are expected to serve as a tool to validate and improve existing numerical models by providing boundary conditions and SEY for various situations that are encountered in Hall thrusters.
NASA Astrophysics Data System (ADS)
Hu, Yuan; Wang, Joseph
2017-03-01
This paper presents a fully kinetic particle-in-cell simulation study on the emission of a collisionless plasma plume consisting of cold beam ions and thermal electrons. Results are presented for both the two-dimensional macroscopic plume structure and the microscopic electron kinetic characteristics. We find that the macroscopic plume structure exhibits several distinctive regions, including an undisturbed core region, an electron cooling expansion region, and an electron isothermal expansion region. The properties of each region are determined by microscopic electron kinetic characteristics. The division between the undisturbed region and the cooling expansion region approximately matches the Mach line generated at the edge of the emission surface, and that between the cooling expansion region and the isothermal expansion region approximately matches the potential well established in the beam. The interactions between electrons and the potential well lead to a new, near-equilibrium state different from the initial distribution for the electrons in the isothermal expansion region. The electron kinetic characteristics in the plume are also very anisotropic. As the electron expansion process is mostly non-equilibrium and anisotropic, the commonly used assumption that the electrons in a collisionless, mesothermal plasma plume may be treated as a single equilibrium fluid is in general not valid.
2011-01-01
Background Organizational context has the potential to influence the use of new knowledge. However, despite advances in understanding the theoretical base of organizational context, its measurement has not been adequately addressed, limiting our ability to quantify and assess context in healthcare settings and thus, advance development of contextual interventions to improve patient care. We developed the Alberta Context Tool (the ACT) to address this concern. It consists of 58 items representing 10 modifiable contextual concepts. We reported the initial validation of the ACT in 2009. This paper presents the second stage of the psychometric validation of the ACT. Methods We used the Standards for Educational and Psychological Testing to frame our validity assessment. Data from 645 English speaking healthcare aides from 25 urban residential long-term care facilities (nursing homes) in the three Canadian Prairie Provinces were used for this stage of validation. In this stage we focused on: (1) advanced aspects of internal structure (e.g., confirmatory factor analysis) and (2) relations with other variables validity evidence. To assess reliability and validity of scores obtained using the ACT we conducted: Cronbach's alpha, confirmatory factor analysis, analysis of variance, and tests of association. We also assessed the performance of the ACT when individual responses were aggregated to the care unit level, because the instrument was developed to obtain unit-level scores of context. Results Item-total correlations exceeded acceptable standards (> 0.3) for the majority of items (51 of 58). We ran three confirmatory factor models. Model 1 (all ACT items) displayed unacceptable fit overall and for five specific items (1 item on adequate space for resident care in the Organizational Slack-Space ACT concept and 4 items on use of electronic resources in the Structural and Electronic Resources ACT concept). This prompted specification of two additional models. Model 2 used the 7 scaled ACT concepts while Model 3 used the 3 count-based ACT concepts. Both models displayed substantially improved fit in comparison to Model 1. Cronbach's alpha for the 10 ACT concepts ranged from 0.37 to 0.92 with 2 concepts performing below the commonly accepted standard of 0.70. Bivariate associations between the ACT concepts and instrumental research utilization levels (which the ACT should predict) were statistically significant at the 5% level for 8 of the 10 ACT concepts. The majority (8/10) of the ACT concepts also showed a statistically significant trend of increasing mean scores when arrayed across the lowest to the highest levels of instrumental research use. Conclusions The validation process in this study demonstrated additional empirical support for construct validity of the ACT, when completed by healthcare aides in nursing homes. The overall pattern of the data was consistent with the structure hypothesized in the development of the ACT and supports the ACT as an appropriate measure for assessing organizational context in nursing homes. Caution should be applied in using the one space and four electronic resource items that displayed misfit in this study with healthcare aides until further assessments are made. PMID:21767378
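Several of the reliability figures quoted above are Cronbach's alpha values. A minimal computation is sketched below (Python); the 6-respondent by 5-item response matrix is invented purely to show the formula alpha = k/(k-1) * (1 - sum of item variances / variance of the summed score), and is not ACT data.

    import numpy as np

    def cronbach_alpha(items):
        """items: 2-D array, rows = respondents, columns = items of one scale/concept."""
        items = np.asarray(items, dtype=float)
        k = items.shape[1]
        item_vars = items.var(axis=0, ddof=1)       # variance of each item
        total_var = items.sum(axis=1).var(ddof=1)   # variance of the summed scale score
        return (k / (k - 1)) * (1.0 - item_vars.sum() / total_var)

    # Hypothetical Likert responses (6 respondents x 5 items), for illustration only.
    responses = [[4, 5, 4, 4, 5],
                 [3, 3, 4, 3, 3],
                 [5, 5, 5, 4, 5],
                 [2, 3, 2, 2, 3],
                 [4, 4, 5, 4, 4],
                 [3, 2, 3, 3, 2]]
    print(f"Cronbach's alpha = {cronbach_alpha(responses):.2f}")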
DOE Office of Scientific and Technical Information (OSTI.GOV)
Emritte, Mohammad Shehzad; Colafrancesco, Sergio; Marchegiani, Paolo, E-mail: Sergio.Colafrancesco@wits.ac.za, E-mail: emrittes@yahoo.com, E-mail: Paolo.Marchegiani@wits.ac.za
2016-07-01
Inverse Compton (IC) scattering of the anisotropic CMB fluctuations off cosmic electron plasmas generates a polarization of the associated Sunyaev-Zel'dovich (SZ) effect. The polarized SZ effect has important applications in cosmology and in the astrophysics of galaxy clusters. However, this signal has been studied so far mostly in the non-relativistic regime, which is valid only in the very low electron temperature limit for a thermal electron population and, as such, has limited astrophysical applications. Partial attempts to extend this calculation to the IC scattering of a thermal electron plasma in the relativistic regime have been made, but these cannot be applied to a more general or mildly relativistic electron distribution. In this paper we derive a general form of the SZ effect polarization that is valid in the full relativistic approach for both thermal and non-thermal electron plasmas, as well as for a generic combination of various electron populations which can be co-spatially distributed in the environments of galaxy clusters or radiogalaxy lobes. We derive the spectral shape of the Stokes parameters induced by the IC scattering of every CMB multipole for both thermal and non-thermal electron populations, focussing in particular on the CMB quadrupole and octupole that provide the largest detectable signals in cosmic structures (like galaxy clusters). We found that the CMB quadrupole induced Stokes parameter Q is always positive with a maximum amplitude at a frequency ≈ 216 GHz which increases non-linearly with increasing cluster temperature. On the contrary, the CMB octupole induced Q spectrum shows a cross-over frequency which depends on the cluster electron temperature in a linear way, while it shows a non-linear dependence on the minimum momentum p₁ of a non-thermal power-law spectrum as well as a linear dependence on the power-law spectral index of the non-thermal electron population. We discuss some of the possibilities to disentangle the quadrupole-induced Q spectrum from the octupole-induced one, which will allow these important cosmological quantities to be measured through the SZ effect polarization at different cluster locations in the universe. We finally apply our model to the Bullet cluster and derive the visibility windows of the total, quadrupole-induced and octupole-induced Stokes parameter Q in the frequency ranges accessible to the SKA, ALMA, MILLIMETRON and CORE++ experiments.
An analytical method for computing voxel S values for electrons and photons.
Amato, Ernesto; Minutoli, Fabio; Pacilio, Massimiliano; Campenni, Alfredo; Baldari, Sergio
2012-11-01
The use of voxel S values (VSVs) is perhaps the most common approach to radiation dosimetry for nonuniform distributions of activity within organs or tumors. However, VSVs are currently available only for a limited number of voxel sizes and radionuclides. The objective of this study was to develop a general method to evaluate them for any spectrum of electrons and photons in any cubic voxel dimension of practical interest for clinical dosimetry in targeted radionuclide therapy. The authors developed a Monte Carlo simulation in Geant4 in order to evaluate the energy deposited per disintegration (E_dep) in a voxelized region of soft tissue from monoenergetic electrons (10-2000 keV) or photons (10-1000 keV) homogeneously distributed in the central voxel, considering voxel dimensions ranging from 3 mm to 10 mm. E_dep was represented as a function of a dimensionless quantity termed the "normalized radius," R_n = R/l, where l is the voxel size and R is the distance from the origin. The authors introduced two parametric functions in order to fit the electron and photon results, and they interpolated the parameters to derive VSVs for any energy and voxel side within the ranges mentioned above. In order to validate the results, the authors determined VSVs for two radionuclides (131I and 89Sr) and two voxel dimensions and compared them with reference data. A validation study in a simple sphere model, accounting for tissue inhomogeneities, is presented. The E_dep(R_n) for both monoenergetic electrons and photons exhibits a smooth variation with energy and voxel size, implying that VSVs for monoenergetic electrons or photons may be derived by interpolation over the range of energies and dimensions considered. By integration, S values for continuous emission spectra from β- decay may be derived as well. The approach allows the determination of VSVs for monoenergetic (Auger or conversion) electrons and (x-ray or gamma-ray) photons by means of two functions whose parameters can be interpolated from the tabular data provided. Through integration, it is possible to generalize the method to any continuous (beta) spectrum, allowing VSVs to be calculated for any electron and photon emitter in a voxelized structure.
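The interpolation step described above, deriving a voxel S value for an arbitrary energy and voxel side from tabulated results, can be sketched as follows (Python). The tabulated self-absorbed energies below are invented placeholders rather than the authors' Geant4 output, and the soft-tissue density is an assumption; only the interpolate-and-convert pattern is illustrated.

    import numpy as np
    from scipy.interpolate import RegularGridInterpolator

    # Hypothetical self-absorbed energy per decay E_dep (MeV) for monoenergetic electrons,
    # tabulated over electron energy (MeV) and cubic voxel side (mm). Placeholder values.
    energies_mev = np.array([0.1, 0.5, 1.0, 2.0])
    voxel_mm     = np.array([3.0, 4.6, 6.0, 10.0])
    e_dep_mev = np.array([[0.099, 0.100, 0.100, 0.100],
                          [0.430, 0.460, 0.480, 0.495],
                          [0.700, 0.790, 0.850, 0.940],
                          [0.900, 1.100, 1.300, 1.700]])

    interp = RegularGridInterpolator((energies_mev, voxel_mm), e_dep_mev)

    def self_dose_s_value(energy_mev, side_mm, density_kg_m3=1040.0):
        """Self-dose voxel S value (Gy per decay) for a cubic soft-tissue voxel."""
        e_dep = float(interp([[energy_mev, side_mm]])[0])    # MeV per decay
        mass_kg = density_kg_m3 * (side_mm * 1e-3) ** 3      # voxel mass
        return e_dep * 1.602e-13 / mass_kg                   # MeV -> J, then J/kg = Gy

    print(f"S(0.8 MeV, 5 mm voxel) ~ {self_dose_s_value(0.8, 5.0):.3e} Gy per decay")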
Duracinsky, Martin; Lalanne, Christophe; Goujard, Cécile; Herrmann, Susan; Cheung-Lung, Christian; Brosseau, Jean-Paul; Schwartz, Yannick; Chassany, Olivier
2014-04-25
Electronic patient-reported outcomes (PRO) provide quick and usually reliable assessments of patients' health-related quality of life (HRQL). An electronic version of the Patient-Reported Outcomes Quality of Life-human immunodeficiency virus (PROQOL-HIV) questionnaire was developed, and its face validity and reliability were assessed using standard psychometric methods. A sample of 80 French outpatients (66% male, 52/79; mean age 46.7 years, SD 10.9) were recruited. Paper-based and electronic questionnaires were completed in a randomized crossover design (2-7 day interval). Biomedical data were collected. Questionnaire version and order effects were tested on full-scale scores in a 2-way ANOVA with patients as random effects. Test-retest reliability was evaluated using Pearson and intraclass correlation coefficients (ICC, with 95% confidence interval) for each dimension. Usability testing was carried out from patients' survey reports, specifically, general satisfaction, ease of completion, quality and clarity of user interface, and motivation to participate in follow-up PROQOL-HIV electronic assessments. Questionnaire version and administration order effects (N=59 complete cases) were not significant at the 5% level, and no interaction was found between these 2 factors (P=.94). Reliability indexes were acceptable, with Pearson correlations greater than .7 and ICCs ranging from .708 to .939; scores were not statistically different between the two versions. A total of 63 (79%) complete patients' survey reports were available, and 55% of patients (30/55) reported being satisfied and interested in electronic assessment of their HRQL in clinical follow-up. Individual ratings of PROQOL-HIV user interface (85%-100% of positive responses) confirmed user interface clarity and usability. The electronic PROQOL-HIV introduces minor modifications to the original paper-based version, following International Society for Pharmacoeconomics and Outcomes Research (ISPOR) ePRO Task Force guidelines, and shows good reliability and face validity. Patients can complete the computerized PROQOL-HIV questionnaire and the scores from the paper or electronic versions share comparable accuracy and interpretation.
Lalanne, Christophe; Goujard, Cécile; Herrmann, Susan; Cheung-Lung, Christian; Brosseau, Jean-Paul; Schwartz, Yannick; Chassany, Olivier
2014-01-01
Background Electronic patient-reported outcomes (PRO) provide quick and usually reliable assessments of patients’ health-related quality of life (HRQL). Objective An electronic version of the Patient-Reported Outcomes Quality of Life-human immunodeficiency virus (PROQOL-HIV) questionnaire was developed, and its face validity and reliability were assessed using standard psychometric methods. Methods A sample of 80 French outpatients (66% male, 52/79; mean age 46.7 years, SD 10.9) were recruited. Paper-based and electronic questionnaires were completed in a randomized crossover design (2-7 day interval). Biomedical data were collected. Questionnaire version and order effects were tested on full-scale scores in a 2-way ANOVA with patients as random effects. Test-retest reliability was evaluated using Pearson and intraclass correlation coefficients (ICC, with 95% confidence interval) for each dimension. Usability testing was carried out from patients’ survey reports, specifically, general satisfaction, ease of completion, quality and clarity of user interface, and motivation to participate in follow-up PROQOL-HIV electronic assessments. Results Questionnaire version and administration order effects (N=59 complete cases) were not significant at the 5% level, and no interaction was found between these 2 factors (P=.94). Reliability indexes were acceptable, with Pearson correlations greater than .7 and ICCs ranging from .708 to .939; scores were not statistically different between the two versions. A total of 63 (79%) complete patients’ survey reports were available, and 55% of patients (30/55) reported being satisfied and interested in electronic assessment of their HRQL in clinical follow-up. Individual ratings of PROQOL-HIV user interface (85%-100% of positive responses) confirmed user interface clarity and usability. Conclusions The electronic PROQOL-HIV introduces minor modifications to the original paper-based version, following International Society for Pharmacoeconomics and Outcomes Research (ISPOR) ePRO Task Force guidelines, and shows good reliability and face validity. Patients can complete the computerized PROQOL-HIV questionnaire and the scores from the paper or electronic versions share comparable accuracy and interpretation. PMID:24769643
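The core of the test-retest analysis above is a paired comparison of paper and electronic scale scores. A minimal sketch in Python is shown below; the paired scores are invented, and a Pearson correlation plus a paired t-test stand in for the fuller ICC/ANOVA analysis the authors performed.

    import numpy as np
    from scipy import stats

    # Hypothetical dimension scores from the same patients on the two versions.
    paper      = np.array([62.0, 71.0, 55.0, 80.0, 47.0, 68.0, 74.0, 59.0])
    electronic = np.array([60.0, 73.0, 57.0, 78.0, 45.0, 70.0, 75.0, 61.0])

    r, _ = stats.pearsonr(paper, electronic)    # test-retest (parallel-forms) correlation
    t, p = stats.ttest_rel(paper, electronic)   # paired test for a systematic version effect

    print(f"Pearson r = {r:.3f}; paired t-test p = {p:.3f} (no version effect if p is large)")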
Simulating Pressure Profiles for the Free-Electron Laser Photoemission Gun Using Molflow+
NASA Astrophysics Data System (ADS)
Song, Diego; Hernandez-Garcia, Carlos
2012-10-01
The Jefferson Lab Free Electron Laser (FEL) generates tunable laser light by passing a relativistic electron beam, generated in a high-voltage DC electron gun with a semiconducting photocathode, through a magnetic undulator. The electron gun is kept under stringent vacuum conditions in order to guarantee photocathode longevity. In view of an upgrade of the electron gun, this project consists of simulating pressure profiles to determine if the novel design meets the electron gun vacuum requirements. The method of simulation employs the software Molflow+, developed by R. Kersevan at the Organisation Européenne pour la Recherche Nucléaire (CERN), which uses the test-particle Monte Carlo method to simulate molecular flows in 3D structures. Pressure is obtained along specified chamber axes. Results are then compared to measured pressure values from the existing gun for validation. Outgassing rates, surface area, and pressure were found to be proportionally related. The simulations indicate that the upgrade gun vacuum chamber requires more pumping compared to its predecessor, while it holds similar vacuum conditions. The ability to simulate pressure profiles through tools like Molflow+ allows researchers to optimize vacuum systems during the engineering process.
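Molflow+ performs a full 3-D test-particle Monte Carlo, but the basic relationship among outgassing, pumping and pressure that the profiles reflect can be illustrated with a 1-D analytic estimate. The sketch below (Python) treats a chamber as a uniform tube with distributed outgassing q per unit length, specific conductance c, and a pump at one end; all parameter values are invented for illustration and are not those of the FEL gun.

    import numpy as np

    L = 1.0         # tube length, m (assumed)
    c = 20.0        # specific conductance of the tube, l*m/s (assumed)
    q = 1e-9        # outgassing per unit length, mbar*l/(s*m) (assumed)
    p_pump = 1e-11  # pressure at the pump end z = 0, mbar (assumed)

    z = np.linspace(0.0, L, 101)

    # Mass balance: the flow passing z toward the pump equals the outgassing collected
    # from z to L, so dP/dz = q*(L - z)/c, a parabola peaking at the closed end z = L.
    p = p_pump + (q / c) * (L * z - 0.5 * z**2)

    print(f"pressure at closed end ~ {p[-1]:.2e} mbar "
          f"(rise of {p[-1] - p_pump:.2e} mbar above the pump)")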
Feist, Adam M; Nagarajan, Harish; Rotaru, Amelia-Elena; Tremblay, Pier-Luc; Zhang, Tian; Nevin, Kelly P; Lovley, Derek R; Zengler, Karsten
2014-04-01
Geobacter species are of great interest for environmental and biotechnology applications as they can carry out direct electron transfer to insoluble metals or other microorganisms and have the ability to assimilate inorganic carbon. Here, we report on the capability and key enabling metabolic machinery of Geobacter metallireducens GS-15 to carry out CO2 fixation and direct electron transfer to iron. An updated metabolic reconstruction was generated, growth screens on targeted conditions of interest were performed, and constraint-based analysis was utilized to characterize and evaluate critical pathways and reactions in G. metallireducens. The novel capability of G. metallireducens to grow autotrophically with formate and Fe(III) was predicted and subsequently validated in vivo. Additionally, the energetic cost of transferring electrons to an external electron acceptor was determined through analysis of growth experiments carried out using three different electron acceptors (Fe(III), nitrate, and fumarate) by systematically isolating and examining different parts of the electron transport chain. The updated reconstruction will serve as a knowledgebase for understanding and engineering Geobacter and similar species.
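Constraint-based analysis of the kind used above boils down to a linear program over the stoichiometric matrix (flux balance analysis). The sketch below (Python/SciPy) maximizes a 'biomass' flux for a deliberately tiny toy network; the three-reaction network and the bounds are invented and bear no relation to the actual G. metallireducens reconstruction.

    import numpy as np
    from scipy.optimize import linprog

    # Toy network: R1 uptake -> A, R2 A -> B, R3 B -> biomass (objective).
    # Rows are metabolites A and B; columns are reactions R1..R3.
    S = np.array([[1.0, -1.0,  0.0],
                  [0.0,  1.0, -1.0]])

    # Steady state S*v = 0; uptake capped at 10 units; maximize v3 (linprog minimizes -v3).
    bounds = [(0.0, 10.0), (0.0, None), (0.0, None)]
    res = linprog(c=[0.0, 0.0, -1.0], A_eq=S, b_eq=np.zeros(2), bounds=bounds, method="highs")

    print("optimal fluxes v1..v3:", np.round(res.x, 3))   # expected [10, 10, 10]
    print("maximal biomass flux:", -res.fun)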
Criticality of the electron-nucleus cusp condition to local effective potential-energy theories
DOE Office of Scientific and Technical Information (OSTI.GOV)
Pan Xiaoyin; Sahni, Viraht; Graduate School of the City University of New York, 360 Fifth Avenue, New York, New York 10016
2003-01-01
Local (multiplicative) effective potential-energy theories of electronic structure comprise the transformation of the Schroedinger equation for interacting Fermi systems to model noninteracting Fermi or Bose systems whereby the equivalent density and energy are obtained. By employing the integrated form of the Kato electron-nucleus cusp condition, we prove that the effective electron-interaction potential energy of these model fermions or bosons is finite at a nucleus. The proof is general and valid for an arbitrary system, whether it be atomic, molecular, or solid state, and for an arbitrary state and symmetry. This then provides justification for all prior work in the literature based on the assumption of finiteness of this potential energy at a nucleus. We further demonstrate the criticality of the electron-nucleus cusp condition to such theories by an example of the hydrogen molecule. We show thereby that both model-system effective electron-interaction potential energies, as determined from densities derived from accurate wave functions, will be singular at the nucleus unless the wave function satisfies the electron-nucleus cusp condition.
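For reference, the electron-nucleus cusp condition invoked in the proof above is usually quoted, in atomic units and for a nucleus of charge Z at the origin, in terms of the spherically averaged density. The paper works with its integrated form; the familiar differential statement (a textbook result, not a quotation from the paper) reads:

    \left.\frac{\partial \bar{\rho}(r)}{\partial r}\right|_{r=0} = -2Z\,\bar{\rho}(0),
    \qquad
    \bar{\rho}(r) \equiv \frac{1}{4\pi}\int \rho(r,\Omega)\, d\Omega .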
Impact of state-specific flowfield modeling on atomic nitrogen radiation
NASA Astrophysics Data System (ADS)
Johnston, Christopher O.; Panesi, Marco
2018-01-01
A hypersonic flowfield model that treats the electronic levels of the dominant afterbody radiator N as individual species is presented. This model allows electron-ion recombination rate and two-temperature modeling improvements, the latter of which are shown to decrease afterbody radiative heating by up to 30%. This decrease is primarily due to the addition of the electron-impact excitation energy-exchange term to the energy equation governing the vibrational-electronic electron temperature. This model also allows the validity of the often applied quasi-steady-state (QSS) approximation to be assessed. The QSS approximation is shown to fail throughout most of the afterbody region for lower electronic states, although this impacts the radiative intensity reaching the surface by less than 15%. By computing the electronic-state populations of N within the flowfield solver, instead of through the QSS approximation in the radiation solver, the coupling of nonlocal radiative transition rates to the species continuity equations becomes feasible. Implementation of this higher-fidelity level of coupling between the flowfield and radiation solvers is shown to increase the afterbody radiation by up to 50% relative to the conventional model.
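The quasi-steady-state (QSS) approximation discussed above amounts to setting dn_i/dt = 0 for the excited electronic states and solving a small linear system once the ground-state and electron densities are known. The sketch below (Python) does this for an invented three-level system; the rate coefficients, A-values and densities are placeholders chosen only to show the structure of the calculation and are not nitrogen data.

    import numpy as np

    ne = 1.0e13   # electron number density, cm^-3 (assumed)
    n0 = 1.0e14   # ground-state number density, cm^-3 (assumed)

    # Hypothetical electron-impact rate coefficients k[i][j] (cm^3/s) and radiative
    # rates A[i][j] (1/s) for transitions i -> j among levels 0 (ground), 1 and 2.
    k = np.array([[0.0,  2e-9, 5e-10],
                  [8e-9, 0.0,  1e-9 ],
                  [3e-9, 4e-9, 0.0  ]])
    A = np.array([[0.0, 0.0, 0.0],
                  [5e7, 0.0, 0.0],
                  [2e7, 1e7, 0.0]])

    # QSS for excited level i: sum_j n_j*(ne*k[j,i] + A[j,i]) - n_i*sum_j(ne*k[i,j] + A[i,j]) = 0.
    # Unknowns are n1 and n2; the ground-state source terms go to the right-hand side.
    M = np.zeros((2, 2))
    rhs = np.zeros(2)
    for i in (1, 2):
        for j in (1, 2):
            if i == j:
                M[i - 1, j - 1] = -(ne * k[i].sum() + A[i].sum())   # total depopulation of level i
            else:
                M[i - 1, j - 1] = ne * k[j, i] + A[j, i]            # gain from the other excited level
        rhs[i - 1] = -n0 * (ne * k[0, i] + A[0, i])                 # gain from the ground state

    n_excited = np.linalg.solve(M, rhs)
    print("QSS populations n1, n2 (cm^-3):", n_excited)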
A Comparative Review of Electronic Prescription Systems: Lessons Learned from Developed Countries
Samadbeik, Mahnaz; Ahmadi, Maryam; Sadoughi, Farahnaz; Garavand, Ali
2017-01-01
This review study aimed to compare the electronic prescription systems of five selected countries (Denmark, Finland, Sweden, England, and the United States). The developed countries compared were selected through a defined selection process from among countries that have electronic prescription systems. The required data were collected by searching valid databases and widely used search engines, visiting websites related to each country's national electronic prescription system, and sending e-mails with specifically designed data collection forms to the related organizations. The findings showed that electronic prescription systems were used at the national, state, local, and area levels in the studied countries and covered all or part of the prescription process. All studied countries provided capabilities for creating electronic prescriptions, decision support, electronic transmission of prescriptions from prescriber systems to pharmacies, retrieval of electronic prescriptions at the pharmacy, and electronic prescription refills. The patient, prescriber, and dispenser were the main human actors, and the prescribing and dispensing providers were the main system actors of the electronic prescription service. The selected countries have accurate, regular, and systematic plans for using electronic prescription systems, and the health ministries of these countries were responsible for coordinating and leading electronic health. It is suggested that the experiences and programs of the leading countries be used to design and develop electronic prescription systems. PMID:28331859
Attainment of Electron Beam Suitable for Medium Energy Electron Cooling
DOE Office of Scientific and Technical Information (OSTI.GOV)
Seletskiy, Sergei M.
Electron cooling of charged particle beams is a well-established technique at electron energies of up to 300 keV. However, up to the present time the advance of electron cooling to the MeV-range energies has remained a purely theoretical possibility. The electron cooling project at Fermilab has recently demonstrated the first cooling of 8.9 GeV/c antiprotons in the Recycler ring, and therefore, has proved the validity of the idea of relativistic electron cooling. The Recycler Electron Cooler (REC) is the key component of the Tevatron Run II luminosity upgrade project. Its performance depends critically on the quality of electron beam. A stable electron beam of 4.3 MeV carrying 0.5 A of DC current is required. The beam suitable for the Recycler Electron Cooler must have an angular spread not exceeding 200 µrad. The full-scale prototype of the REC was designed, built and tested at Fermilab in the Wideband laboratory to study the feasibility of attaining the high-quality electron beam. In this thesis I describe various aspects of development of the Fermilab electron cooling system, and the techniques used to obtain the electron beam suitable for the cooling process. In particular I emphasize those aspects of the work for which I was principally responsible.
Testing the Predictive Validity of the Hendrich II Fall Risk Model.
Jung, Hyesil; Park, Hyeoun-Ae
2018-03-01
Cumulative data on patient fall risk have been compiled in electronic medical records systems, and it is possible to test the validity of fall-risk assessment tools using these data between the times of admission and occurrence of a fall. Hendrich II Fall Risk Model scores assessed at three time points during hospital stays were extracted and used for testing the predictive validity: (a) the score upon admission, (b) the maximum fall-risk score between admission and falling or discharge, and (c) the score immediately before falling or discharge. Predictive validity was examined using seven predictive indicators. In addition, logistic regression analysis was used to identify factors that significantly affect the occurrence of a fall. Among the different time points, the maximum fall-risk score assessed between admission and falling or discharge showed the best predictive performance. Confusion or disorientation and a poor ability to rise from a sitting position were significant risk factors for a fall.
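The logistic-regression step mentioned above, used to identify risk factors such as confusion/disorientation and a poor ability to rise, can be sketched with scikit-learn as follows. The patient table is synthetic and the resulting odds ratios are meaningless; only the modelling pattern is shown.

    import numpy as np
    from sklearn.linear_model import LogisticRegression

    rng = np.random.default_rng(1)
    n = 500

    # Synthetic predictors: confusion/disorientation and poor ability to rise (0/1 flags).
    confusion = rng.integers(0, 2, n)
    poor_rise = rng.integers(0, 2, n)

    # Synthetic outcome: falls made more likely by either factor (a toy data-generating rule).
    logit = -3.0 + 1.2 * confusion + 0.9 * poor_rise
    fell = rng.random(n) < 1.0 / (1.0 + np.exp(-logit))

    X = np.column_stack([confusion, poor_rise])
    model = LogisticRegression().fit(X, fell)

    for name, coef in zip(["confusion/disorientation", "poor ability to rise"], model.coef_[0]):
        print(f"{name}: odds ratio ~ {np.exp(coef):.2f}")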
NASA Technical Reports Server (NTRS)
1998-01-01
A local electronics manufacturer, the Sterling Manufacturing Company, was presented with the opportunity to supply 30,000 automotive cellular antennas to a European subsidiary of a large U.S. auto manufacturer. Although the company built an antenna that they believed would meet the auto manufacturer's specifications, they were unable to conduct the necessary validation tests in-house. They decided to work with NASA Lewis Research Center's Space Electronics Division, which, as part of its technology development program, evaluates the performance of antennas in its Microwave Systems Lab to assess their capabilities for space communications applications. Data measured in Lewis' Microwave Systems Lab proved that Sterling's antenna performed better than specified by the auto manufacturer.
Langmuir-Probe Measurements in Flowing-Afterglow Plasmas
NASA Technical Reports Server (NTRS)
Johnsen, R.; Shunko, E. V.; Gougousi, T.; Golde, M. F.
1994-01-01
The validity of the orbital-motion theory for cylindrical Langmuir probes immersed in flowing-afterglow plasmas is investigated experimentally. It is found that the probe currents scale linearly with probe area only for electron-collecting but not for ion-collecting probes. In general, no agreement is found between the ion and electron densities derived from the probe currents. Measurements in recombining plasmas support the conclusion that only the electron densities derived from probe measurements can be trusted to be of acceptable accuracy. This paper also includes a brief derivation of the orbital-motion theory, a discussion of perturbations of the plasma by the probe current, and the interpretation of plasma velocities obtained from probe measurements.
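The electron densities that the authors find trustworthy come from the electron branch of the probe characteristic. A minimal sketch of that step is shown below (Python): a probe collecting the random thermal electron flux draws I_e = e*A*n_e*sqrt(kT_e/(2*pi*m_e)), which can be inverted for n_e. The current, probe area and electron temperature used here are invented illustration values, and the expression ignores the orbital-motion and sheath-expansion corrections discussed in the paper.

    import math

    e   = 1.602e-19   # elementary charge, C
    m_e = 9.109e-31   # electron mass, kg
    k_B = 1.381e-23   # Boltzmann constant, J/K

    def electron_density(i_e_sat, area_m2, t_e_ev):
        """Electron density from the electron saturation current (random thermal flux)."""
        t_e = t_e_ev * e / k_B                                            # eV -> K
        flux_per_density = math.sqrt(k_B * t_e / (2.0 * math.pi * m_e))   # one-way flux / n_e
        return i_e_sat / (e * area_m2 * flux_per_density)

    # Hypothetical cylindrical probe: 0.1 mm diameter, 5 mm long, 1 mA at saturation, T_e = 0.3 eV.
    area = math.pi * 0.1e-3 * 5e-3
    print(f"n_e ~ {electron_density(1.0e-3, area, 0.3):.2e} m^-3")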
On the relativistic large-angle electron collision operator for runaway avalanches in plasmas
NASA Astrophysics Data System (ADS)
Embréus, O.; Stahl, A.; Fülöp, T.
2018-02-01
Large-angle Coulomb collisions lead to an avalanching generation of runaway electrons in a plasma. We present the first fully conservative large-angle collision operator, derived from the relativistic Boltzmann operator. The relation to previous models for large-angle collisions is investigated, and their validity assessed. We present a form of the generalized collision operator which is suitable for implementation in a numerical kinetic equation solver, and demonstrate the effect on the runaway-electron growth rate. Finally we consider the reverse avalanche effect, where runaways are slowed down by large-angle collisions, and show that the choice of operator is important if the electric field is close to the avalanche threshold.
Zhu, Hong-Ming; Chen, Jin-Wang; Pan, Xiao-Yin; Sahni, Viraht
2014-01-14
We derive via the interaction "representation" the many-body wave function for harmonically confined electrons in the presence of a magnetostatic field and perturbed by a spatially homogeneous time-dependent electric field-the Generalized Kohn Theorem (GKT) wave function. In the absence of the harmonic confinement - the uniform electron gas - the GKT wave function reduces to the Kohn Theorem wave function. Without the magnetostatic field, the GKT wave function is the Harmonic Potential Theorem wave function. We further prove the validity of the connection between the GKT wave function derived and the system in an accelerated frame of reference. Finally, we provide examples of the application of the GKT wave function.
An analysis of electronic document management in oncology care.
Poulter, Thomas; Gannon, Brian; Bath, Peter A
2012-06-01
In this research in progress, a reference model for the use of electronic patient record (EPR) systems in oncology is described. The model, termed CICERO, comprises technical and functional components, and emphasises usability, clinical safety and user acceptance. One of the functional components of the model, an electronic document and records management (EDRM) system, is monitored in the course of its deployment at a leading oncology centre in the UK. Specifically, the user requirements and design of the EDRM solution are described. The study is interpretative and forms part of a wider research programme to define and validate the CICERO model. Preliminary conclusions confirm the importance of a socio-technical perspective in Onco-EPR system design.
Lippolis, Vincenzo; Ferrara, Massimo; Cervellieri, Salvatore; Damascelli, Anna; Epifani, Filomena; Pascale, Michelangelo; Perrone, Giancarlo
2016-02-02
The availability of rapid diagnostic methods for monitoring ochratoxigenic species during the seasoning processes for dry-cured meats is crucial and constitutes a key stage in order to prevent the risk of ochratoxin A (OTA) contamination. A rapid, easy-to-perform and non-invasive method using an electronic nose (e-nose) based on metal oxide semiconductors (MOS) was developed to discriminate dry-cured meat samples in two classes based on the fungal contamination: class P (samples contaminated by OTA-producing Penicillium strains) and class NP (samples contaminated by OTA non-producing Penicillium strains). Two OTA-producing strains of Penicillium nordicum and two OTA non-producing strains of Penicillium nalgiovense and Penicillium salamii were tested. The feasibility of this approach was initially evaluated by e-nose analysis of 480 samples of both yeast extract sucrose (YES) and meat-based agar media inoculated with the tested Penicillium strains and incubated up to 14 days. The high recognition percentages (higher than 82%) obtained by Discriminant Function Analysis (DFA), both in calibration and in cross-validation (leave-more-out approach), for YES and meat-based samples demonstrated the validity of the approach used. The e-nose method was subsequently developed and validated for the analysis of dry-cured meat samples. A total of 240 e-nose analyses were carried out using inoculated sausages, seasoned by a laboratory-scale process and sampled at 5, 7, 10 and 14 days. DFA provided calibration models that permitted discrimination of dry-cured meat samples after only 5 days of seasoning with mean recognition percentages in calibration and cross-validation of 98 and 88%, respectively. A further validation of the developed e-nose method was performed using 60 dry-cured meat samples produced by an industrial-scale seasoning process, showing a total recognition percentage of 73%. The pattern of volatile compounds of dry-cured meat samples was identified and characterized by a developed HS-SPME/GC-MS method. Seven volatile compounds (2-methyl-1-butanol, octane, 1R-α-pinene, d-limonene, undecane, tetradecanal, 9-(Z)-octadecenoic acid methyl ester) allowed discrimination between dry-cured meat samples of classes P and NP. These results demonstrate that a MOS-based electronic nose can be a useful tool for rapid screening to prevent OTA contamination in the cured meat supply chain. Copyright © 2015 Elsevier B.V. All rights reserved.
Hebert, Courtney; Flaherty, Jennifer; Smyer, Justin; Ding, Jing; Mangino, Julie E
2018-03-01
Surveillance is an important tool for infection control; however, this task can often be time-consuming and take away from infection prevention activities. With the increasing availability of comprehensive electronic health records, there is an opportunity to automate these surveillance activities. The objective of this article is to describe the implementation of an electronic algorithm for ventilator-associated events (VAEs) at a large academic medical center. This article reports on a 6-month manual validation of a dashboard for VAEs. We developed a computerized algorithm for automatically detecting VAEs and compared the output of this algorithm to the traditional, manual method of VAE surveillance. Manual surveillance by the infection preventionists identified 13 possible and 11 probable ventilator-associated pneumonias (VAPs), and the VAE dashboard identified 16 possible and 13 probable VAPs. The dashboard had 100% sensitivity and 100% accuracy when compared with manual surveillance for possible and probable VAP. We report on the successfully implemented VAE dashboard. Workflow of the infection preventionists was simplified after implementation of the dashboard, with subjective time savings reported. Implementing a computerized dashboard for VAE surveillance at a medical center with a comprehensive electronic health record is feasible; however, this required significant initial and ongoing work on the part of data analysts and infection preventionists. Copyright © 2018 Association for Professionals in Infection Control and Epidemiology, Inc. Published by Elsevier Inc. All rights reserved.
Quantitative Determination of Spring Water Quality Parameters via Electronic Tongue
Carbó, Noèlia; López Carrero, Javier; Garcia-Castillo, F. Javier; Olivas, Estela; Folch, Elisa; Alcañiz Fillol, Miguel; Soto, Juan
2017-01-01
The use of a voltammetric electronic tongue for the quantitative analysis of quality parameters in spring water is proposed here. The voltammetric electronic tongue consisted of a set of four noble electrodes (iridium, rhodium, platinum, and gold) housed inside a stainless steel cylinder. These noble metals are highly durable and require little maintenance, features required for the development of future automated equipment. A pulse voltammetry study was conducted in 83 spring water samples to determine concentrations of nitrate (range: 6.9–115 mg/L), sulfate (32–472 mg/L), fluoride (0.08–0.26 mg/L), chloride (17–190 mg/L), and sodium (11–94 mg/L) as well as pH (7.3–7.8). These parameters were also determined by routine analytical methods in spring water samples. A partial least squares (PLS) analysis was run to obtain a model to predict these parameters. Orthogonal signal correction (OSC) was applied in the preprocessing step. Calibration (67%) and validation (33%) sets were selected randomly. The electronic tongue showed good predictive power to determine the concentrations of nitrate, sulfate, chloride, and sodium as well as pH and displayed a lower R² and slope in the validation set for fluoride. Nitrate and fluoride concentrations were estimated with errors lower than 15%, whereas chloride, sulfate, and sodium concentrations as well as pH were estimated with errors below 10%. PMID:29295592
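As a rough illustration of the calibration/validation workflow described above, the following Python sketch fits a PLS model on a random 67/33 split. The data are synthetic, the number of latent variables is arbitrary, and the OSC preprocessing step is omitted, so this is only an assumed reconstruction rather than the authors' procedure.

```python
# Minimal sketch: PLS calibration of e-tongue pulse-voltammetry signals against a
# reference concentration, with a random 67/33 calibration/validation split.
import numpy as np
from sklearn.cross_decomposition import PLSRegression
from sklearn.model_selection import train_test_split
from sklearn.metrics import r2_score

rng = np.random.default_rng(0)
X = rng.normal(size=(83, 120))                     # 83 samples x voltammetric features (synthetic)
w = 0.3 * rng.normal(size=(120, 1))                # synthetic linear response
y = 60 + X @ w + rng.normal(scale=1.0, size=(83, 1))  # synthetic "nitrate" concentration

X_cal, X_val, y_cal, y_val = train_test_split(X, y, test_size=0.33, random_state=0)

pls = PLSRegression(n_components=8)                # latent-variable count is a tuning choice
pls.fit(X_cal, y_cal)
print("R2 (validation):", r2_score(y_val, pls.predict(X_val)))
```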
Casper, T. A.; Meyer, W. H.; Jackson, G. L.; ...
2010-12-08
We are exploring characteristics of ITER startup scenarios in similarity experiments conducted on the DIII-D Tokamak. In these experiments, we have validated scenarios for the ITER current ramp up to full current and developed methods to control the plasma parameters to achieve stability. Predictive simulations of ITER startup using 2D free-boundary equilibrium and 1D transport codes rely on accurate estimates of the electron and ion temperature profiles that determine the electrical conductivity and pressure profiles during the current rise. Here we present results of validation studies that apply the transport model used by the ITER team to DIII-D discharge evolution and comparisons with data from our similarity experiments.
ESTEST: An Open Science Platform for Electronic Structure Research
ERIC Educational Resources Information Center
Yuan, Gary
2012-01-01
Open science platforms in support of data generation, analysis, and dissemination are becoming indispensable tools for conducting research. These platforms use informatics and information technologies to address significant problems in open science data interoperability, verification & validation, comparison, analysis, post-processing,…
77 FR 34003 - Submission for OMB Review; Comment Request
Federal Register 2010, 2011, 2012, 2013, 2014
2012-06-08
... is to improve, develop, or finance businesses, industries, and employment and improve the economic..., electronic, mechanical, or other technological collection techniques or other forms of information technology... it displays a currently valid OMB control number. 30-Day Federal Register Notice Rural Business...
Code of Federal Regulations, 2010 CFR
2010-04-01
... pharmacy may process electronic prescriptions for controlled substances only if all of the following conditions are met: (1) The pharmacy uses a pharmacy application that meets all of the applicable... pharmacy, specified in part 1306 of this chapter, to ensure the validity of a controlled substance...
78 FR 11135 - Submission for OMB Review; Comment Request
Federal Register 2010, 2011, 2012, 2013, 2014
2013-02-15
..., electronic, mechanical or other technological collection techniques or other forms of information technology... unless it displays a currently valid OMB control number. National Agricultural Statistics Service Title... National Agricultural Statistics Service (NASS) is to prepare and issue State and national estimates of...
Real-time Automated Sampling of Electronic Medical Records Predicts Hospital Mortality
Khurana, Hargobind S.; Groves, Robert H.; Simons, Michael P.; Martin, Mary; Stoffer, Brenda; Kou, Sherri; Gerkin, Richard; Reiman, Eric; Parthasarathy, Sairam
2016-01-01
Background Real-time automated continuous sampling of electronic medical record data may expeditiously identify patients at risk for death and enable prompt life-saving interventions. We hypothesized that a real-time electronic medical record-based alert could identify hospitalized patients at risk for mortality. Methods An automated alert was developed and implemented to continuously sample electronic medical record data and trigger when at least two of four systemic inflammatory response syndrome criteria plus at least one of 14 acute organ dysfunction parameters were detected. The SIRS/OD alert was applied in real time to 312,214 patients in 24 hospitals and analyzed in two phases: training and validation datasets. Results In the training phase, 29,317 (18.8%) triggered the alert and 5.2% of such patients died, whereas only 0.2% without the alert died (unadjusted odds ratio 30.1; 95% confidence interval [95%CI] 26.1, 34.5; P<0.0001). In the validation phase, the sensitivity, specificity, area under curve (AUC), positive and negative likelihood ratios for predicting mortality were 0.86, 0.82, 0.84, 4.9, and 0.16, respectively. A multivariate Cox proportional hazards regression model revealed greater hospital mortality when the alert was triggered (adjusted Hazards Ratio 4.0; 95%CI 3.3, 4.9; P<0.0001). Triggering the alert was associated with additional hospitalization days (+3.0 days) and ventilator days (+1.6 days; P<0.0001). Conclusion An automated alert system that continuously samples electronic medical record data can be implemented, has excellent test characteristics, and can assist in the real-time identification of hospitalized patients at risk for death. PMID:27019043
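The trigger logic described above (at least two of four SIRS criteria plus at least one organ dysfunction parameter) can be sketched as follows. The thresholds, field names, and the handful of organ-dysfunction examples are assumptions for illustration, not the study's exact definitions.

```python
# Illustrative sketch of the SIRS/OD alert rule; all thresholds are assumed.
def sirs_count(v):
    """v: dict of the latest vitals/labs sampled from the EMR."""
    return sum([
        v.get("temp_c", 37.0) > 38.3 or v.get("temp_c", 37.0) < 36.0,
        v.get("heart_rate", 0) > 90,
        v.get("resp_rate", 0) > 20,
        v.get("wbc", 8.0) > 12.0 or v.get("wbc", 8.0) < 4.0,
    ])

def organ_dysfunction_count(v):
    # Only a few of the 14 parameters are sketched here as examples.
    return sum([
        v.get("sbp", 120) < 90,          # hypotension
        v.get("creatinine", 1.0) > 2.0,  # renal dysfunction
        v.get("lactate", 1.0) > 2.0,     # hypoperfusion
    ])

def alert_triggered(v):
    return sirs_count(v) >= 2 and organ_dysfunction_count(v) >= 1

print(alert_triggered({"temp_c": 38.9, "heart_rate": 112, "lactate": 3.1}))  # True
```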
Relativistic electron kinetic effects on laser diagnostics in burning plasmas
NASA Astrophysics Data System (ADS)
Mirnov, V. V.; Den Hartog, D. J.
2018-02-01
Toroidal interferometry/polarimetry (TIP), poloidal polarimetry (PoPola), and Thomson scattering (TS) systems are major optical diagnostics being designed and developed for ITER. Each of them relies upon a sophisticated quantitative understanding of the electron response to laser light propagating through a burning plasma. A review of the theoretical results for two different applications is presented: interferometry/polarimetry (I/P) and polarization of Thomson scattered light, unified by the importance of relativistic (quadratic in vTe/c) electron kinetic effects. For I/P applications, rigorous analytical results are obtained perturbatively by expansion in powers of the small parameter τ = Te/(me c²), where Te is the electron temperature and me is the electron rest mass. Experimental validation of the analytical models has been made by analyzing data from more than 1200 pulses collected from high-Te JET discharges. Based on this validation, the relativistic analytical expressions are included in the error analysis and design projects of the ITER TIP and PoPola systems. The polarization properties of incoherent Thomson scattered light are being examined as a method of Te measurement relevant to ITER operational regimes. The theory is based on the Stokes vector transformation and Mueller matrix formalism. The general approach is subdivided into frequency-integrated and frequency-resolved cases. For each of them, the exact analytical relativistic solutions are presented in the form of Mueller matrix elements averaged over the relativistic Maxwellian distribution function. New results related to the detailed verification of the frequency-resolved solutions are reported. The precise analytic expressions provide output much more rapidly than relativistic kinetic numerical codes, allowing for direct real-time feedback control of ITER device operation.
NASA Astrophysics Data System (ADS)
Freethy, S. J.; Görler, T.; Creely, A. J.; Conway, G. D.; Denk, S. S.; Happel, T.; Koenen, C.; Hennequin, P.; White, A. E.; ASDEX Upgrade Team
2018-05-01
Measurements of turbulent electron temperature fluctuation amplitudes, δTe⊥/Te, frequency spectra, and radial correlation lengths, Lr(Te⊥), have been performed at ASDEX Upgrade using a newly upgraded Correlation ECE diagnostic in the range of scales k⊥ < 1.4 cm⁻¹, kr < 3.5 cm⁻¹ (k⊥ρs < 0.28 and krρs < 0.7). The phase angle between turbulent temperature and density fluctuations, αnT, has also been measured by using an ECE radiometer coupled to a reflectometer along the same line of sight. These quantities are used simultaneously to constrain a set of ion-scale non-linear gyrokinetic turbulence simulations of the outer core (ρtor = 0.75) of a low density, electron heated L-mode plasma, performed using the gyrokinetic simulation code, GENE. The ion and electron temperature gradients were scanned within uncertainties. It is found that gyrokinetic simulations are able to match simultaneously the electron and ion heat flux at this radius within the experimental uncertainties. The simulations were performed based on a reference discharge for which δTe⊥/Te measurements were available, and Lr(Te⊥) and αnT were then predicted using synthetic diagnostics prior to measurements in a repeat discharge. While temperature fluctuation amplitudes are overestimated by >50% for all simulations within the sensitivity scans performed, good quantitative agreement is found for Lr(Te⊥) and αnT. A validation metric is used to quantify the level of agreement of individual simulations with experimental measurements, and the best agreement is found close to the experimental gradient values.
Xu, Stanley; Clarke, Christina L; Newcomer, Sophia R; Daley, Matthew F; Glanz, Jason M
2018-05-16
Vaccine safety studies are often electronic health record (EHR)-based observational studies. These studies often face significant methodological challenges, including confounding and misclassification of adverse events. Vaccine safety researchers use the self-controlled case series (SCCS) study design to handle confounding and employ medical chart review to ascertain cases that are identified using EHR data. However, for common adverse events, limited resources often make it impossible to adjudicate all adverse events observed in electronic data. In this paper, we considered four approaches for analyzing SCCS data with confirmation rates estimated from an internal validation sample: (1) observed cases, (2) confirmed cases only, (3) known confirmation rate, and (4) multiple imputation (MI). We conducted a simulation study to evaluate these four approaches using type I error rates, percent bias, and empirical power. Our simulation results suggest that when misclassification of adverse events is present, approaches such as observed cases, confirmed cases only, and known confirmation rate may inflate the type I error, yield biased point estimates, and affect statistical power. The multiple imputation approach considers the uncertainty of estimated confirmation rates from an internal validation sample and yields a proper type I error rate, a largely unbiased point estimate, a proper variance estimate, and adequate statistical power. © 2018 WILEY-VCH Verlag GmbH & Co. KGaA, Weinheim.
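A conceptual sketch of the multiple imputation idea, under the assumption that the confirmation rate comes from an internal validation sample and that its uncertainty is propagated by drawing from a Beta posterior; the paper's actual implementation may differ.

```python
# Conceptual sketch (assumptions throughout, not the paper's code): multiple
# imputation of true case status when only a validation subsample was reviewed.
import numpy as np

rng = np.random.default_rng(1)

m, k = 100, 80        # internal validation sample: m reviewed events, k confirmed
n_unreviewed = 500    # flagged adverse events that were not chart-reviewed
M = 20                # number of imputations

imputed_true_counts = []
for _ in range(M):
    p_star = rng.beta(k + 1, m - k + 1)          # draw a plausible confirmation rate
    imputed_true_counts.append(rng.binomial(n_unreviewed, p_star))

est = np.mean(imputed_true_counts)
between_var = np.var(imputed_true_counts, ddof=1)
print(f"imputed true cases among unreviewed events: {est:.1f} "
      f"(between-imputation variance {between_var:.1f})")
# Each imputed dataset would then be analyzed with the SCCS model and the
# resulting estimates combined with Rubin's rules.
```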
Lima, Gustavo F; Freitas, Victor C G; Araújo, Renan P; Maitelli, André L; Salazar, Andrés O
2017-09-15
Pipeline inspection using a device called a Pipeline Inspection Gauge (PIG) is safe and reliable when the PIG travels at low speeds during inspection. We built a Testing Laboratory, containing a testing loop and supervisory system, to study speed control techniques for PIGs. The objective of this work is to present and validate the Testing Laboratory, which will allow development of a speed controller for PIGs and solve an existing problem in the oil industry. The experimental methodology used throughout the project is also presented. We installed pressure transducers on pipeline outer walls to detect the PIG's movement and, with data from the supervisory system, calculated an average speed of 0.43 m/s. At the same time, the electronic board inside the PIG received data from the odometer and calculated an average speed of 0.45 m/s. We found an error of 4.44%, which is experimentally acceptable. The results showed that it is possible to successfully build a Testing Laboratory to detect the PIG's passage and estimate its speed. The validation of the Testing Laboratory using data from the odometer and its auxiliary electronics was very successful. Lastly, we hope to develop more research in the oil industry area using this Testing Laboratory.
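For reference, the reported 4.44% figure is consistent with taking the odometer speed as the denominator, as in this small check.

```python
# Quick check (assuming the error is taken relative to the odometer speed):
v_supervisory = 0.43  # m/s, from pressure-transducer passage times
v_odometer = 0.45     # m/s, from the PIG's onboard odometer electronics
print(f"relative error: {abs(v_odometer - v_supervisory) / v_odometer:.2%}")  # -> 4.44%
```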
Wright, A; McCoy, A; Henkin, S; Flaherty, M; Sittig, D
2013-01-01
In a prior study, we developed methods for automatically identifying associations between medications and problems using association rule mining on a large clinical data warehouse and validated these methods at a single site that used a self-developed electronic health record. The objective of this study was to demonstrate the generalizability of these methods by validating them at an external site. We received data on medications and problems for 263,597 patients from the University of Texas Health Science Center at Houston Faculty Practice, an ambulatory practice that uses the Allscripts Enterprise commercial electronic health record product. We then conducted association rule mining to identify associated pairs of medications and problems and characterized these associations with five measures of interestingness: support, confidence, chi-square, interest, and conviction, and compared the top-ranked pairs to a gold standard. 25,088 medication-problem pairs were identified that exceeded our confidence and support thresholds. An analysis of the top 500 pairs according to each measure of interestingness showed a high degree of accuracy for highly ranked pairs. The same technique was successfully employed at the University of Texas and accuracy was comparable to our previous results. Top associations included many medications that are highly specific for a particular problem as well as a large number of common, accurate medication-problem pairs that reflect practice patterns.
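The five interestingness measures can be computed directly from co-occurrence counts. The sketch below uses standard definitions (interest is also known as lift) and illustrative counts rather than study data.

```python
# Sketch of the five interestingness measures for one medication-problem pair.
def interestingness(n_both, n_med, n_prob, n_total):
    """n_both: patients with both the medication and the problem;
    n_med / n_prob: patients with the medication / the problem; n_total: all patients."""
    p_med, p_prob, p_both = n_med / n_total, n_prob / n_total, n_both / n_total
    support = p_both
    confidence = n_both / n_med                       # P(problem | medication)
    interest = p_both / (p_med * p_prob)              # a.k.a. lift
    conviction = (1 - p_prob) / (1 - confidence) if confidence < 1 else float("inf")
    # Full 2x2 chi-square statistic for the medication/problem contingency table
    observed = [[n_both, n_med - n_both],
                [n_prob - n_both, n_total - n_med - n_prob + n_both]]
    rows, cols = [n_med, n_total - n_med], [n_prob, n_total - n_prob]
    chi_square = sum((observed[i][j] - rows[i] * cols[j] / n_total) ** 2
                     / (rows[i] * cols[j] / n_total)
                     for i in range(2) for j in range(2))
    return dict(support=support, confidence=confidence, chi_square=chi_square,
                interest=interest, conviction=conviction)

print(interestingness(n_both=800, n_med=1000, n_prob=5000, n_total=100000))
```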
Freitas, Victor C. G.; Araújo, Renan P.; Maitelli, André L.; Salazar, Andrés O.
2017-01-01
Pipeline inspection using a device called a Pipeline Inspection Gauge (PIG) is safe and reliable when the PIG travels at low speeds during inspection. We built a Testing Laboratory, containing a testing loop and supervisory system, to study speed control techniques for PIGs. The objective of this work is to present and validate the Testing Laboratory, which will allow development of a speed controller for PIGs and solve an existing problem in the oil industry. The experimental methodology used throughout the project is also presented. We installed pressure transducers on pipeline outer walls to detect the PIG’s movement and, with data from the supervisory system, calculated an average speed of 0.43 m/s. At the same time, the electronic board inside the PIG received data from the odometer and calculated an average speed of 0.45 m/s. We found an error of 4.44%, which is experimentally acceptable. The results showed that it is possible to successfully build a Testing Laboratory to detect the PIG’s passage and estimate its speed. The validation of the Testing Laboratory using data from the odometer and its auxiliary electronics was very successful. Lastly, we hope to develop more research in the oil industry area using this Testing Laboratory. PMID:28914757
Zhou, Ting; Yang, Kaixiang; Thapa, Sudip; Fu, Qiang; Jiang, Yongsheng; Yu, Shiying
2017-04-01
The assessment of quality of life (QOL) is an important part of cachexia management for cancer patients. The Functional Assessment of Anorexia-Cachexia Therapy (FAACT), a specific QOL instrument for cachexia patients, has not been validated in the Chinese population. The aim of this study was to validate the FAACT scale in Chinese cancer patients for its future use. Eligible cancer patients were included in our study. Patients' demographic and clinical characteristics were collected from the electronic medical records. Patients were asked to complete the Chinese version of the FAACT scale and the MD Anderson Symptom Inventory (MDASI), and then the reliability and validity were analyzed. A total of 285 patients were enrolled in our study, and data from 241 patients were evaluated. Coefficients of Cronbach's alpha, test-retest and split-half analyses were all greater than 0.8, which indicated excellent reliability for the FAACT scale. In item-subscale correlation analysis and factor analysis, good construct validity for the FAACT scale was found. The correlation between the FAACT and the MDASI interference subscale showed reasonable criterion-related validity, and in further clinical validation, the FAACT scale showed excellent discriminative validity in distinguishing patients with different cachexia status and different performance status. The Chinese version of the FAACT scale has good reliability and validity and is suitable for measuring the QOL of cachexia patients in the Chinese population.
Mani, Suresh; Sharma, Shobha; Omar, Baharudin; Paungmali, Aatit; Joseph, Leonard
2017-04-01
Purpose The purpose of this review is to systematically explore and summarise the validity and reliability of telerehabilitation (TR)-based physiotherapy assessment for musculoskeletal disorders. Method A comprehensive systematic literature review was conducted using a number of electronic databases: PubMed, EMBASE, PsycINFO, Cochrane Library and CINAHL, covering the period between January 2000 and May 2015. Studies that examined the validity and the inter- and intra-rater reliability of TR-based physiotherapy assessment for musculoskeletal conditions were included. Two independent reviewers used the Quality Appraisal Tool for studies of diagnostic Reliability (QAREL) and the Quality Assessment of Diagnostic Accuracy Studies (QUADAS) tool to assess the methodological quality of the reliability and validity studies, respectively. Results A total of 898 hits were retrieved, of which 11 articles meeting the inclusion criteria were reviewed. Nine studies explored the concurrent validity and the inter- and intra-rater reliabilities, while two studies examined only the concurrent validity. The reviewed studies were moderate to good in methodological quality. Physiotherapy assessments such as pain, swelling, range of motion, muscle strength, balance, gait and functional assessment demonstrated good concurrent validity. However, the reported concurrent validity of lumbar spine posture, special orthopaedic tests, neurodynamic tests and scar assessments ranged from low to moderate. Conclusion TR-based physiotherapy assessment was technically feasible with overall good concurrent validity and excellent reliability, except for lumbar spine posture, orthopaedic special tests, neurodynamic tests and scar assessment.
Organic High Electron Mobility Transistors Realized by 2D Electron Gas.
Zhang, Panlong; Wang, Haibo; Yan, Donghang
2017-09-01
A key breakthrough in modern inorganic electronics is energy-band engineering, which plays an important role in improving device performance and developing novel functional devices. A typical application is high electron mobility transistors (HEMTs), which utilize a 2D electron gas (2DEG) as the transport channel and exhibit much higher electron mobility than traditional field-effect transistors (FETs). Recently, organic electronics has made very rapid progress, and the band transport model has been demonstrated to be more suitable for explaining carrier behavior in high-mobility crystalline organic materials. Therefore, there is an opportunity to apply energy-band engineering in organic semiconductors to tailor their optoelectronic properties. Here, the idea of energy-band engineering is introduced and a novel device configuration is constructed, i.e., using quantum well structures as active layers in organic FETs, to realize an organic 2DEG. Under the control of the gate voltage, electron carriers are accumulated and confined at quantized energy levels and show efficient 2D transport. The electron mobility is up to 10 cm² V⁻¹ s⁻¹, and the operation mechanisms of organic HEMTs are also discussed. Our results demonstrate the validity of tailoring the optoelectronic properties of organic semiconductors by energy-band engineering, offering a promising way forward for organic electronics. © 2017 WILEY-VCH Verlag GmbH & Co. KGaA, Weinheim.
NASA Astrophysics Data System (ADS)
Miliordos, Evangelos; Xantheas, Sotiris S.
2015-03-01
We report the variation of the binding energy of the Formic Acid Dimer with the size of the basis set at the Coupled Cluster with iterative Singles, Doubles and perturbatively connected Triple replacements [CCSD(T)] level of theory, estimate the Complete Basis Set (CBS) limit, and examine the validity of the Basis Set Superposition Error (BSSE) correction for this quantity, which was previously challenged by Kalescky, Kraka, and Cremer (KKC) [J. Chem. Phys. 140, 084315 (2014)]. Our results indicate that the BSSE correction, including terms that account for the substantial geometry change of the monomers due to the formation of two strong hydrogen bonds in the dimer, is indeed valid for obtaining accurate estimates of the binding energy of this system, as it exhibits the expected decrease with increasing basis set size. We attribute the discrepancy between our current results and those of KKC to their use of a valence basis set in conjunction with the correlation of all electrons (i.e., including the 1s of C and O). We further show that the use of a core-valence set in conjunction with all-electron correlation converges faster to the CBS limit, as the BSSE correction is less than half that of the valence electron/valence basis set case. The uncorrected and BSSE-corrected binding energies were found to produce the same (within 0.1 kcal/mol) CBS limits. We obtain CCSD(T)/CBS best estimates of De = -16.1 ± 0.1 kcal/mol and D0 = -14.3 ± 0.1 kcal/mol, the latter in excellent agreement with the experimental value of -14.22 ± 0.12 kcal/mol.
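For reference, a counterpoise-corrected binding energy that includes monomer deformation (geometry relaxation) terms of the kind described above can be written in the following standard form; the notation (superscripts for basis sets, arguments for geometries) is ours, not necessarily the paper's.

```latex
% Counterpoise-corrected binding energy with monomer deformation terms
% (standard form; superscripts denote the basis set, arguments the geometry).
\begin{equation}
\Delta E^{\mathrm{CP}}
  = E_{AB}^{AB}(G_{AB})
  - E_{A}^{AB}(G_{AB})
  - E_{B}^{AB}(G_{AB})
  + \sum_{X=A,B} \left[ E_{X}^{X}(G_{AB}) - E_{X}^{X}(G_{X}) \right]
\end{equation}
```

Here the first bracketed group is the counterpoise-corrected interaction energy at the dimer geometry, and the final sum adds the energy cost of deforming each monomer from its relaxed geometry to its geometry within the dimer.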
Cai, Tianxi; Karlson, Elizabeth W.
2013-01-01
Objectives To test whether data extracted from full text patient visit notes from an electronic medical record (EMR) would improve the classification of PsA compared to an algorithm based on codified data. Methods From the > 1,350,000 adults in a large academic EMR, all 2318 patients with a billing code for PsA were extracted and 550 were randomly selected for chart review and algorithm training. Using codified data and phrases extracted from narrative data using natural language processing, 31 predictors were extracted and three random forest algorithms trained using coded, narrative, and combined predictors. The receiver operator curve (ROC) was used to identify the optimal algorithm and a cut point was chosen to achieve the maximum sensitivity possible at a 90% positive predictive value (PPV). The algorithm was then used to classify the remaining 1768 charts and finally validated in a random sample of 300 cases predicted to have PsA. Results The PPV of a single PsA code was 57% (95%CI 55%–58%). Using a combination of coded data and NLP the random forest algorithm reached a PPV of 90% (95%CI 86%–93%) at sensitivity of 87% (95% CI 83% – 91%) in the training data. The PPV was 93% (95%CI 89%–96%) in the validation set. Adding NLP predictors to codified data increased the area under the ROC (p < 0.001). Conclusions Using NLP with text notes from electronic medical records improved the performance of the prediction algorithm significantly. Random forests were a useful tool to accurately classify psoriatic arthritis cases to enable epidemiological research. PMID:20701955
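A hedged sketch of the cut-point selection step described above: choosing the probability threshold that maximizes sensitivity subject to a positive predictive value of at least 90%. The data are synthetic and the random forest settings are arbitrary; this is not the study's code.

```python
# Sketch: random forest + PPV-constrained cut-point selection on training data.
import numpy as np
from sklearn.ensemble import RandomForestClassifier
from sklearn.metrics import precision_recall_curve

rng = np.random.default_rng(0)
X = rng.normal(size=(550, 31))                        # 31 predictors (synthetic stand-in)
y = (X[:, 0] + 0.5 * X[:, 1] + rng.normal(size=550) > 1).astype(int)

rf = RandomForestClassifier(n_estimators=500, random_state=0).fit(X, y)
prob = rf.predict_proba(X)[:, 1]

precision, recall, thresholds = precision_recall_curve(y, prob)
ok = precision[:-1] >= 0.90                            # thresholds align with precision[:-1]
best = np.argmax(np.where(ok, recall[:-1], -1))        # max sensitivity among qualifying cut-offs
print(f"cut-off {thresholds[best]:.2f}: PPV {precision[best]:.2f}, "
      f"sensitivity {recall[best]:.2f}")
```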
Ye, Tiantian; Wei, Zongsu; Spinney, Richard; Tang, Chong-Jian; Luo, Shuang; Xiao, Ruiyang; Dionysiou, Dionysios D
2017-06-01
Second-order rate constants, k(SO4•−), for the reaction of the sulfate radical anion (SO4•−) with trace organic contaminants (TrOCs) are of scientific and practical importance for assessing their environmental fate and removal efficiency in water treatment systems. Here, we developed a chemical structure-based model for predicting k(SO4•−) using 32 molecular fragment descriptors, as this type of model provides a quick estimate at low computational cost. The model was constructed using the multiple linear regression (MLR) and artificial neural network (ANN) methods. The MLR method yielded an adequate fit for the training set (R² = 0.88, n = 75) and reasonable predictability for the validation set (R² = 0.62, n = 38). In contrast, the ANN method produced greater statistical robustness but rather poor predictability (training R² = 0.99 and validation R² = 0.42). The reaction mechanisms of SO4•− reactivity with TrOCs were elucidated. Our results show that the coefficients of the functional groups reflect their electron-donating/withdrawing character. For example, electron-donating groups typically exhibit positive coefficients, indicating enhanced SO4•− reactivity, whereas electron-withdrawing groups exhibit negative values, indicating reduced reactivity. Given its speed and accuracy, we applied this structure-based model to 55 discrete TrOCs culled from the Contaminant Candidate List 4, and quantitatively compared their removal efficiency with SO4•− and OH in the presence of environmental matrices. This high-throughput model helps prioritize TrOCs that are persistent to SO4•−-based oxidation technologies at the screening level, and provides diagnostics of SO4•− reaction mechanisms. Copyright © 2017 Elsevier Ltd. All rights reserved.
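A minimal sketch of a fragment-count MLR model of this kind, using synthetic descriptors and rate constants purely to illustrate the fitting and validation split; the actual descriptors and data are not reproduced here.

```python
# Sketch: multiple linear regression of log k on molecular fragment counts.
import numpy as np
from sklearn.linear_model import LinearRegression
from sklearn.metrics import r2_score

rng = np.random.default_rng(0)
n_train, n_val, n_frag = 75, 38, 32               # sizes quoted in the abstract
true_coefs = rng.normal(scale=0.3, size=n_frag)   # synthetic fragment contributions

def make_set(n):
    frag_counts = rng.integers(0, 4, size=(n, n_frag)).astype(float)
    log_k = 9.0 + frag_counts @ true_coefs + rng.normal(scale=0.2, size=n)
    return frag_counts, log_k

X_tr, y_tr = make_set(n_train)
X_va, y_va = make_set(n_val)

mlr = LinearRegression().fit(X_tr, y_tr)
print("R2 train:", r2_score(y_tr, mlr.predict(X_tr)))
print("R2 valid:", r2_score(y_va, mlr.predict(X_va)))
# Positive fitted coefficients would correspond to electron-donating groups
# (enhanced reactivity), negative ones to electron-withdrawing groups.
```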
The good, the bad and the dubious: VHELIBS, a validation helper for ligands and binding sites
2013-01-01
Background Many Protein Data Bank (PDB) users assume that the deposited structural models are of high quality but forget that these models are derived from the interpretation of experimental data. The accuracy of atom coordinates is not homogeneous between models or throughout the same model. To avoid basing a research project on a flawed model, we present a tool for assessing the quality of ligands and binding sites in crystallographic models from the PDB. Results The Validation HElper for LIgands and Binding Sites (VHELIBS) is software that aims to ease the validation of binding site and ligand coordinates for non-crystallographers (i.e., users with little or no crystallography knowledge). Using a convenient graphical user interface, it allows one to check how ligand and binding site coordinates fit to the electron density map. VHELIBS can use models from either the PDB or the PDB_REDO databank of re-refined and re-built crystallographic models. The user can specify threshold values for a series of properties related to the fit of coordinates to electron density (Real Space R, Real Space Correlation Coefficient and average occupancy are used by default). VHELIBS will automatically classify residues and ligands as Good, Dubious or Bad based on the specified limits. The user is also able to visually check the quality of the fit of residues and ligands to the electron density map and reclassify them if needed. Conclusions VHELIBS allows inexperienced users to examine the binding site and the ligand coordinates in relation to the experimental data. This is an important step to evaluate models for their fitness for drug discovery purposes such as structure-based pharmacophore development and protein-ligand docking experiments. PMID:23895374
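The threshold-based classification can be sketched as follows; the cut-off values shown are placeholders, not VHELIBS's actual defaults.

```python
# Sketch of a Good/Dubious/Bad classification from fit-to-density statistics.
def classify(rsr, rscc, occupancy,
             good=(0.2, 0.9, 1.0), dubious=(0.4, 0.8, 0.5)):
    """Classify a residue/ligand from Real Space R, RSCC and average occupancy."""
    if rsr <= good[0] and rscc >= good[1] and occupancy >= good[2]:
        return "Good"
    if rsr <= dubious[0] and rscc >= dubious[1] and occupancy >= dubious[2]:
        return "Dubious"
    return "Bad"

print(classify(rsr=0.15, rscc=0.95, occupancy=1.0))  # Good
print(classify(rsr=0.30, rscc=0.85, occupancy=1.0))  # Dubious
print(classify(rsr=0.50, rscc=0.60, occupancy=0.3))  # Bad
```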
Evaluating a Dental Diagnostic Terminology in an Electronic Health Record
White, Joel M.; Kalenderian, Elsbeth; Stark, Paul C.; Ramoni, Rachel L.; Vaderhobli, Ram; Walji, Muhammad F.
2011-01-01
Standardized treatment procedure codes and terms are routinely used in dentistry. Utilization of a diagnostic terminology is common in medicine, but there is not a satisfactory or commonly standardized dental diagnostic terminology available at this time. Recent advances in dental informatics have provided an opportunity for inclusion of diagnostic codes and terms as part of treatment planning and documentation in the patient treatment history. This article reports the results of the use of a diagnostic coding system in a large dental school's predoctoral clinical practice. A list of diagnostic codes and terms, called Z codes, was developed by dental faculty members. The diagnostic codes and terms were implemented into an electronic health record (EHR) for use in a predoctoral dental clinic. The utilization of diagnostic terms was quantified. The validity of Z code entry was evaluated by comparing the diagnostic term entered to the procedure performed, where valid diagnosis-procedure associations were determined by consensus among three calibrated academically based dentists. A total of 115,004 dental procedures were entered into the EHR during the year sampled. Of those, 43,053 were excluded from this analysis because they represented diagnostic or other procedures unrelated to treatment. Among the 71,951 treatment procedures, 27,973 had diagnoses assigned to them, an overall utilization of 38.9 percent. Of the 147 available Z codes, ninety-three were used (63.3 percent). There were 335 unique procedures provided and 2,127 procedure/diagnosis pairs captured in the EHR. Overall, 76.7 percent of the diagnoses entered were valid. We conclude that dental diagnostic terminology can be incorporated within an electronic health record and utilized in an academic clinical environment. Challenges remain in the development of terms, implementation, and ease of use that, if resolved, would improve utilization. PMID:21546594
Fitmunk: improving protein structures by accurate, automatic modeling of side-chain conformations.
Porebski, Przemyslaw Jerzy; Cymborowski, Marcin; Pasenkiewicz-Gierula, Marta; Minor, Wladek
2016-02-01
Improvements in crystallographic hardware and software have allowed automated structure-solution pipelines to approach a near 'one-click' experience for the initial determination of macromolecular structures. However, in many cases the resulting initial model requires a laborious, iterative process of refinement and validation. A new method has been developed for the automatic modeling of side-chain conformations that takes advantage of rotamer-prediction methods in a crystallographic context. The algorithm, which is based on deterministic dead-end elimination (DEE) theory, uses new dense conformer libraries and a hybrid energy function derived from experimental data and prior information about rotamer frequencies to find the optimal conformation of each side chain. In contrast to existing methods, which incorporate the electron-density term into protein-modeling frameworks, the proposed algorithm is designed to take advantage of the highly discriminatory nature of electron-density maps. This method has been implemented in the program Fitmunk, which uses extensive conformational sampling. This improves the accuracy of the modeling and makes it a versatile tool for crystallographic model building, refinement and validation. Fitmunk was extensively tested on over 115 new structures, as well as a subset of 1100 structures from the PDB. It is demonstrated that the ability of Fitmunk to model more than 95% of side chains accurately is beneficial for improving the quality of crystallographic protein models, especially at medium and low resolutions. Fitmunk can be used for model validation of existing structures and as a tool to assess whether side chains are modeled optimally or could be better fitted into electron density. Fitmunk is available as a web service at http://kniahini.med.virginia.edu/fitmunk/server/ or at http://fitmunk.bitbucket.org/.
Cox, Zachary L; Lewis, Connie M; Lai, Pikki; Lenihan, Daniel J
2017-01-01
We aim to validate the diagnostic performance of the first fully automatic, electronic heart failure (HF) identification algorithm and evaluate the implementation of an HF Dashboard system with 2 components: real-time identification of decompensated HF admissions and accurate characterization of disease characteristics and medical therapy. We constructed an HF identification algorithm requiring 3 of 4 identifiers: B-type natriuretic peptide >400 pg/mL; admitting HF diagnosis; history of HF International Classification of Disease, Ninth Revision, diagnosis codes; and intravenous diuretic administration. We validated the diagnostic accuracy of the components individually (n = 366) and combined in the HF algorithm (n = 150) compared with a blinded provider panel in 2 separate cohorts. We built an HF Dashboard within the electronic medical record characterizing the disease and medical therapies of HF admissions identified by the HF algorithm. We evaluated the HF Dashboard's performance over 26 months of clinical use. Individually, the algorithm components displayed variable sensitivity and specificity, respectively: B-type natriuretic peptide >400 pg/mL (89% and 87%); diuretic (80% and 92%); and International Classification of Disease, Ninth Revision, code (56% and 95%). The HF algorithm achieved a high specificity (95%), positive predictive value (82%), and negative predictive value (85%) but achieved limited sensitivity (56%) secondary to missing provider-generated identification data. The HF Dashboard identified and characterized 3147 HF admissions over 26 months. Automated identification and characterization systems can be developed and used with a substantial degree of specificity for the diagnosis of decompensated HF, although sensitivity is limited by clinical data input. Copyright © 2016 Elsevier Inc. All rights reserved.
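The "3 of 4 identifiers" rule lends itself to a simple sketch; the field names below are assumptions about the underlying EMR data, not the authors' schema.

```python
# Sketch of the HF identification rule: at least 3 of the 4 identifiers present.
def hf_algorithm(admission):
    criteria = [
        admission.get("bnp_pg_ml", 0) > 400,            # B-type natriuretic peptide
        admission.get("admitting_dx_hf", False),        # admitting HF diagnosis
        admission.get("history_hf_icd9", False),        # HF ICD-9 history codes
        admission.get("iv_diuretic_given", False),      # intravenous diuretic
    ]
    return sum(criteria) >= 3

print(hf_algorithm({"bnp_pg_ml": 850, "admitting_dx_hf": True,
                    "iv_diuretic_given": True}))        # True
print(hf_algorithm({"bnp_pg_ml": 300, "history_hf_icd9": True}))  # False
```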
NASA Astrophysics Data System (ADS)
Komjathy, A.; Wilson, B.; Akopian, V.; Pi, X.; Mannucci, A.; Wang, C.
2008-12-01
We seem to be in the midst of a revolution in ionospheric remote sensing driven by the abundance of ground- and space-based GPS receivers, new UV remote sensing satellites, and the advent of data assimilation techniques for space weather. In particular, the COSMIC 6-satellite constellation was launched in April 2006. COSMIC now provides unprecedented global coverage of GPS occultation measurements, each of which yields electron density information with ~1 km vertical resolution. Calibrated measurements of ionospheric delay (total electron content or TEC) suitable for input into assimilation models are currently made available in near real time (NRT) from COSMIC with a latency of 30 to 120 minutes. The University of Southern California (USC) and the Jet Propulsion Laboratory (JPL) have jointly developed a real-time Global Assimilative Ionospheric Model (GAIM) to monitor space weather, study storm effects, and provide ionospheric calibration for DoD customers and NASA flight projects. JPL/USC GAIM is a physics-based 3D data assimilation model that uses both 4DVAR and Kalman filter techniques to solve for the ion and electron density state and key drivers such as equatorial electrodynamics, neutral winds, and production terms. Daily (delayed) GAIM runs can accept as input ground GPS TEC data from 1200+ sites, occultation links from CHAMP, SAC-C, and the COSMIC constellation, UV limb and nadir scans from the TIMED and DMSP satellites, and in situ data from a variety of satellites (DMSP and C/NOFS). Real-Time GAIM (RTGAIM) ingests multiple data sources in real time, updates the 3D electron density grid every 5 minutes, and solves for improved drivers every 1-2 hours. Since our forward physics model and the adjoint model were expressly designed for data assimilation and computational efficiency, all of this can be accomplished on a single dual-processor Unix workstation. Customers are currently evaluating the accuracy of JPL/USC GAIM 'nowcasts' for ray tracing applications and trans-ionospheric path delay calibration. In the presentation, we will discuss the expected impact of NRT COSMIC occultation and NRT ground-based measurements and present validation results for the ingest of COSMIC data into GAIM using measurements from World Days. We will quality-check our COSMIC-derived products by comparing Abel profiles and JPL-processed results. Furthermore, we will validate GAIM assimilation results using incoherent scatter radar measurements from the Arecibo, Jicamarca and Millstone Hill datasets. We will conclude by characterizing the improved electron density states using dual-frequency altimeter-derived Jason vertical TEC measurements.
An ab initio electronic transport database for inorganic materials.
Ricci, Francesco; Chen, Wei; Aydemir, Umut; Snyder, G Jeffrey; Rignanese, Gian-Marco; Jain, Anubhav; Hautier, Geoffroy
2017-07-04
Electronic transport in materials is governed by a series of tensorial properties such as conductivity, Seebeck coefficient, and effective mass. These quantities are paramount to the understanding of materials in many fields from thermoelectrics to electronics and photovoltaics. Transport properties can be calculated from a material's band structure using the Boltzmann transport theory framework. We present here the largest computational database of electronic transport properties based on a large set of 48,000 materials originating from the Materials Project database. Our results were obtained through the interpolation approach developed in the BoltzTraP software, assuming a constant relaxation time. We present the workflow to generate the data, the data validation procedure, and the database structure. Our aim is to target the large community of scientists developing materials selection strategies and performing studies involving transport properties.
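For context, the constant relaxation-time quantities evaluated in this kind of Boltzmann transport calculation can be written in the following standard form (textbook notation, ours rather than the paper's).

```latex
% Constant relaxation-time conductivity from the band structure
% (standard Boltzmann transport theory).
\begin{align}
\sigma_{\alpha\beta}(T,\mu) &= e^{2}\tau \int d\varepsilon
  \left(-\frac{\partial f_{\mu}(T,\varepsilon)}{\partial \varepsilon}\right)
  \Sigma_{\alpha\beta}(\varepsilon), \\
\Sigma_{\alpha\beta}(\varepsilon) &= \frac{1}{\Omega}\sum_{n,\mathbf{k}}
  v_{\alpha}(n,\mathbf{k})\, v_{\beta}(n,\mathbf{k})\,
  \delta\!\left(\varepsilon-\varepsilon_{n,\mathbf{k}}\right),
\end{align}
```

where τ is the constant relaxation time, Ω the cell volume, f the Fermi-Dirac distribution at chemical potential μ, and v the band velocities; the Seebeck coefficient follows from the analogous energy-weighted integral.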
Aguilera Díaz, Jerónimo; Arias, Antonio Eduardo; Budalich, Cintia Mabel; Benítez, Sonia Elizabeth; López, Gastón; Borbolla, Damián; Plazzotta, Fernando; Luna, Daniel; de Quirós, Fernán González Bernaldo
2010-01-01
This paper describes the development and implementation of a web-based electronic health record for the Homecare Service program at the Hospital Italiano de Buenos Aires. It reviews the process of integrating the new electronic health record with the hospital information system, allowing physicians to access the clinical data repository from their PCs at home and to consult the patient's past and present health care history, orders, tests, and referrals with other professionals through the new electronic health record. We also discuss how workflow processes were changed and improved for the physicians, nurses, and administrative personnel of the Homecare Services, and the educational methods used to improve acceptance and adoption of these new technologies. We also briefly describe the validation of physicians and their field work with electronic signatures.
Electron-less negative ion extraction from ion-ion plasmas
DOE Office of Scientific and Technical Information (OSTI.GOV)
Rafalskyi, Dmytro; Aanesland, Ane
2015-03-09
This paper presents experimental results showing that continuous negative ion extraction, without co-extracted electrons, is possible from highly electronegative SF6 ion-ion plasma at low gas pressure (1 mTorr). The ratio between the negative ion and electron densities is more than 3000 in the vicinity of the two-grid extraction and acceleration system. The measurements are conducted by both magnetized and non-magnetized energy analyzers attached to the external grid. With these two analyzers, we show that the extracted negative ion flux is almost electron-free and has the same magnitude as the positive ion flux extracted and accelerated when the grids are biased oppositely. The results presented here can be used for validation of numerical and analytical models of ion extraction from ion-ion plasma.
Ab initio quantum chemical calculation of electron transfer matrix elements for large molecules
NASA Astrophysics Data System (ADS)
Zhang, Linda Yu; Friesner, Richard A.; Murphy, Robert B.
1997-07-01
Using a diabatic state formalism and pseudospectral numerical methods, we have developed an efficient ab initio quantum chemical approach to the calculation of electron transfer matrix elements for large molecules. The theory is developed at the Hartree-Fock level and validated by comparison with results in the literature for small systems. As an example of the power of the method, we calculate the electronic coupling between two bacteriochlorophyll molecules in various intermolecular geometries. Only a single self-consistent field (SCF) calculation on each of the monomers is needed to generate coupling matrix elements for all of the molecular pairs. The largest calculations performed, utilizing 1778 basis functions, required ~14 h on an IBM 390 workstation. This is considerably less CPU time than would be required with a supermolecule adiabatic state calculation and a conventional electronic structure code.
A Robust High Current Density Electron Gun
NASA Astrophysics Data System (ADS)
Mako, F.; Peter, W.; Shiloh, J.; Len, L. K.
1996-11-01
Proof-of-principle experiments are proposed to validate a new concept for a robust, high-current density Pierce electron gun (RPG) for use in klystrons and high brightness electron sources for accelerators. This rugged, long-life electron gun avoids the difficulties associated with plasma cathodes, thermionic emitters, and field emission cathodes. The RPG concept employs the emission of secondary electrons in a transmission mode as opposed to the conventional mode of reflection, i.e., electrons exit from the back face of a thin negative electron affinity (NEA) material, and in the same direction as the incident beam. Current amplification through one stage of a NEA material could be over 50 times. The amplification is accomplished in one or more stages consisting of one primary emitter and one or more secondary emitters. The primary emitter is a low current density robust emitter (e.g., thoriated tungsten). The secondary emitters are thin NEA electrodes which emit secondary electrons in the same direction as the incident beam. Specific application is targeted for a klystron gun to be used by SLAC with a cold cathode at 30-40 amps/cm^2 output from the secondary emission stage, a ~2 μs pulse length, and ~200 pulses/second.
Active control of bright electron beams with RF optics for femtosecond microscopy
Williams, J.; Zhou, F.; Sun, T.; ...
2017-08-01
A frontier challenge in implementing femtosecond electron microscopy is to gain precise optical control of intense beams to mitigate collective space charge effects for significantly improving the throughput. In this paper, we explore the flexible uses of an RF cavity as a longitudinal lens in a high-intensity beam column for condensing the electron beams both temporally and spectrally, relevant to the design of ultrafast electron microscopy. Through the introduction of a novel atomic grating approach for characterization of electron bunch phase space and control optics, we elucidate the principles for predicting and controlling the phase space dynamics to reach optimal compressions at various electron densities and generating conditions. We provide strategies to identify high-brightness modes, achieving ~100 fs and ~1 eV resolutions with 10⁶ electrons per bunch, and establish the scaling of performance for different bunch charges. These results benchmark the sensitivity and resolution from the fundamental beam brightness perspective and also validate the adaptive optics concept to enable delicate control of the density-dependent phase space structures to optimize the performance, including delivering ultrashort, monochromatic, high-dose, or coherent electron bunches.
Gu, Jiande; Wang, Jing; Leszczynski, Jerzy
2014-01-30
A computational chemistry approach was applied to explore the nature of electron attachment to cytosine-rich DNA single strands. An oligomer dinucleoside phosphate, deoxycytidylyl-3',5'-deoxycytidine (dCpdC), was selected as a model system for investigation by density functional theory. Electron distribution patterns for the radical anions of dCpdC in aqueous solution were explored. The excess electron may reside on the nucleobase at the 5' position (dC(•-)pdC) or at the 3' position (dCpdC(•-)). From comparison with electron attachment to cytosine-related DNA fragments, the electron affinity for the formation of the cytosine-centered radical anion in DNA is estimated to be around 2.2 eV. Electron attachment to cytosine sites in DNA single strands might cause perturbations of local structural characteristics. Visible absorption spectroscopy may be applied to validate the computational results and determine experimentally the existence of the base-centered radical anion. The time-dependent DFT study shows absorption around 550-600 nm for the cytosine-centered radical anions of DNA oligomers. This indicates that if such species are detected experimentally, they would be characterized by a distinctive color.
Active control of bright electron beams with RF optics for femtosecond microscopy
Williams, J.; Zhou, F.; Sun, T.; Tao, Z.; Chang, K.; Makino, K.; Berz, M.; Duxbury, P. M.; Ruan, C.-Y.
2017-01-01
A frontier challenge in implementing femtosecond electron microscopy is to gain precise optical control of intense beams to mitigate collective space charge effects for significantly improving the throughput. Here, we explore the flexible uses of an RF cavity as a longitudinal lens in a high-intensity beam column for condensing the electron beams both temporally and spectrally, relevant to the design of ultrafast electron microscopy. Through the introduction of a novel atomic grating approach for characterization of electron bunch phase space and control optics, we elucidate the principles for predicting and controlling the phase space dynamics to reach optimal compressions at various electron densities and generating conditions. We provide strategies to identify high-brightness modes, achieving ∼100 fs and ∼1 eV resolutions with 106 electrons per bunch, and establish the scaling of performance for different bunch charges. These results benchmark the sensitivity and resolution from the fundamental beam brightness perspective and also validate the adaptive optics concept to enable delicate control of the density-dependent phase space structures to optimize the performance, including delivering ultrashort, monochromatic, high-dose, or coherent electron bunches. PMID:28868325
Understanding metallic bonding: Structure, process and interaction by Rasch analysis
NASA Astrophysics Data System (ADS)
Cheng, Maurice M. W.; Oon, Pey-Tee
2016-08-01
This paper reports the results of a survey of 3006 Year 10-12 students on their understandings of metallic bonding. The instrument was developed based on Chi's ontological categories of scientific concepts and students' understanding of metallic bonding as reported in the literature. The instrument has two parts. Part one probed students' understanding of metallic bonding as (a) a submicro structure of metals, (b) a process in which individual metal atoms lose their outermost shell electrons to form a 'sea of electrons' and octet metal cations, or (c) an all-directional electrostatic force between delocalized electrons and metal cations, that is, an interaction. Part two assessed students' explanations of the malleability of metals, for example (a) as a submicro structural rearrangement of metal atoms/cations or (b) based on an all-directional electrostatic force. The instrument was validated with the Rasch model, and psychometric assessment showed that it possessed reasonably good measurement properties. Results revealed that it was reliable and valid for measuring students' understanding of metallic bonding. Analysis revealed that the structure, process and interaction understandings were unidimensional and in an increasing order of difficulty. Implications for the teaching of metallic bonding, particularly through the use of diagrams, critiques and model-based learning, are discussed.
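For reference, the dichotomous Rasch model underlying such an analysis expresses the probability that student n answers item i correctly in terms of a person ability θn and an item difficulty bi; the study's instrument may use a polytomous variant, so this is only the basic form.

```latex
% Dichotomous Rasch model (basic form).
\begin{equation}
P(X_{ni}=1 \mid \theta_n, b_i)
  = \frac{\exp(\theta_n - b_i)}{1 + \exp(\theta_n - b_i)}
\end{equation}
```

The increasing order of difficulty reported for the structure, process and interaction understandings corresponds to increasing item difficulty parameters bi on this common logit scale.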
Experimental and analytical investigation of a modified ring cusp NSTAR engine
NASA Technical Reports Server (NTRS)
Sengupta, Anita
2005-01-01
A series of experimental measurements on a modified laboratory NSTAR engine was used to validate a zero-dimensional analytical discharge performance model of a ring cusp ion thruster. The model predicts the discharge performance of a ring cusp NSTAR thruster as a function of the magnetic field configuration, thruster geometry, and throttle level. Analytical formalisms for electron and ion confinement are used to predict the ionization efficiency for a given thruster design. Explicit determination of discharge loss and volume-averaged plasma parameters is also obtained. The model was used to predict the performance of the nominal and modified three- and four-ring cusp 30-cm ion thruster configurations operating at the full power (2.3 kW) NSTAR throttle level. Experimental measurements of the modified engine configuration discharge loss compare well with the predicted values for propellant utilizations from 80 to 95%. The theory, as validated by experiment, indicates that increasing the magnetic field strength of the minimum closed contour reduces Maxwellian electron diffusion and electrostatically confines the ion population, reducing its subsequent loss to the anode wall. The theory also indicates that increasing the cusp strength and minimizing the cusp area improve primary electron confinement, increasing the probability of an ionization collision prior to loss at the cusp.
NASA Astrophysics Data System (ADS)
Aa, Ercha; Liu, Siqing; Huang, Wengeng; Shi, Liqin; Gong, Jiancun; Chen, Yanhong; Shen, Hua; Li, Jianyong
2016-06-01
In this paper, a regional 3-D ionospheric electron density specification over China and adjacent areas (70°E-140°E in longitude, 15°N-55°N in latitude, and 100-900 km in altitude) is developed on the basis of a data assimilation technique. The International Reference Ionosphere (IRI) is used as the background model, and a three-dimensional variational technique is used to assimilate both the ground-based Global Navigation Satellite System (GNSS) observations from the Crustal Movement Observation Network of China (CMONOC) and the International GNSS Service (IGS) and the ionospheric radio occultation (RO) data from FORMOSAT-3/COSMIC (F3/C) satellites. The regional 3-D gridded ionospheric electron densities can be generated with a temporal resolution of 5 min in universal time, a horizontal resolution of 2° × 2° in latitude and longitude, and a vertical resolution of 20 km between 100 and 500 km and 50 km between 500 and 900 km. The data assimilation results are validated through extensive comparison with several sources of electron density information, including (1) ionospheric total electron content (TEC); (2) Abel-retrieved F3/C electron density profiles (EDPs); (3) ionosonde foF2 and bottomside EDPs; and (4) the Utah State University Global Assimilation of Ionospheric Measurements (USU-GAIM), under both geomagnetically quiet and disturbed conditions. The validation results show that the data assimilation procedure pushes the climatological IRI model toward the observations, and a general accuracy improvement of 15-30% can be expected. The comparisons also indicate that the data assimilation results are closer to the Center for Orbit Determination in Europe (CODE) TEC and Madrigal TEC products than USU-GAIM. These initial results might demonstrate the effectiveness of the data assimilation technique in improving the specification of local ionospheric morphology.
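For context, a generic three-dimensional variational (3DVAR) scheme of this type minimizes a cost function of the following standard form; this is the textbook formulation, not necessarily the authors' exact implementation.

```latex
% Generic 3DVAR cost function (standard form).
\begin{equation}
J(\mathbf{x}) =
  \tfrac{1}{2}\,(\mathbf{x}-\mathbf{x}_b)^{\mathsf T}\mathbf{B}^{-1}(\mathbf{x}-\mathbf{x}_b)
  + \tfrac{1}{2}\,\bigl(\mathbf{y}-H(\mathbf{x})\bigr)^{\mathsf T}\mathbf{R}^{-1}\bigl(\mathbf{y}-H(\mathbf{x})\bigr)
\end{equation}
```

Here x is the gridded electron density state, x_b the IRI background, y the GNSS and radio occultation observations, H the observation operator, and B and R the background and observation error covariance matrices.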
76 FR 75520 - Submission for OMB Review; Comment Request
Federal Register 2010, 2011, 2012, 2013, 2014
2011-12-02
..., electronic, mechanical, or other technological collection techniques or other forms of information technology... it displays a currently valid OMB control number. Office of Procurement and Property Management Title... Information: The Office of Procurement and Property Management (OPPM) and the Center for Industrial Research...
ERIC Educational Resources Information Center
Wilson, Thomas F.
1999-01-01
Y2K problems include software programming issues involving chronology and microchips embedded in every conceivable piece of electronic equipment. Procrastination is not in schools' best interest. Administrators should initiate five conversion stages: awareness, assessment, renovation, validation, and implementation. A sample equipment checklist…
40 CFR 1043.41 - EIAPP certification process.
Code of Federal Regulations, 2014 CFR
2014-07-01
... test engine you provide must include appropriate manifolds, aftertreatment devices, electronic control... CONTROLS CONTROL OF NOX, SOX, AND PM EMISSIONS FROM MARINE ENGINES AND VESSELS SUBJECT TO THE MARPOL... application for an EIAPP certificate for each engine family. An EIAPP certificate is valid starting with the...
40 CFR 1043.41 - EIAPP certification process.
Code of Federal Regulations, 2010 CFR
2010-07-01
... test engine you provide must include appropriate manifolds, aftertreatment devices, electronic control... CONTROLS CONTROL OF NOX, SOX, AND PM EMISSIONS FROM MARINE ENGINES AND VESSELS SUBJECT TO THE MARPOL... application for an EIAPP certificate for each engine family. An EIAPP certificate is valid starting with the...
40 CFR 1043.41 - EIAPP certification process.
Code of Federal Regulations, 2012 CFR
2012-07-01
... test engine you provide must include appropriate manifolds, aftertreatment devices, electronic control... CONTROLS CONTROL OF NOX, SOX, AND PM EMISSIONS FROM MARINE ENGINES AND VESSELS SUBJECT TO THE MARPOL... application for an EIAPP certificate for each engine family. An EIAPP certificate is valid starting with the...
40 CFR 1043.41 - EIAPP certification process.
Code of Federal Regulations, 2013 CFR
2013-07-01
... test engine you provide must include appropriate manifolds, aftertreatment devices, electronic control... CONTROLS CONTROL OF NOX, SOX, AND PM EMISSIONS FROM MARINE ENGINES AND VESSELS SUBJECT TO THE MARPOL... application for an EIAPP certificate for each engine family. An EIAPP certificate is valid starting with the...
40 CFR 1043.41 - EIAPP certification process.
Code of Federal Regulations, 2011 CFR
2011-07-01
... test engine you provide must include appropriate manifolds, aftertreatment devices, electronic control... CONTROLS CONTROL OF NOX, SOX, AND PM EMISSIONS FROM MARINE ENGINES AND VESSELS SUBJECT TO THE MARPOL... application for an EIAPP certificate for each engine family. An EIAPP certificate is valid starting with the...
Al-Mamun, Mohammad; Zhu, Zhengju; Yin, Huajie; Su, Xintai; Zhang, Haimin; Liu, Porun; Yang, Huagui; Wang, Dan; Tang, Zhiyong; Wang, Yun; Zhao, Huijun
2016-08-04
A novel surface sulfur (S) doped cobalt (Co) catalyst for the oxygen evolution reaction (OER) is theoretically designed through the optimisation of the electronic structure of highly reactive surface atoms which is also validated by electrocatalytic OER experiments.
Validation of an Electronic System for Recording Medical Student Patient Encounters
Nkoy, Flory L.; Petersen, Sarah; Matheny Antommaria, Armand H.; Maloney, Christopher G.
2008-01-01
The Liaison Committee on Medical Education requires monitoring of the students' clinical experiences. Student logs, typically used for this purpose, have a number of limitations. We used an electronic system called Patient Tracker to passively generate student encounter data. The data contained in Patient Tracker was compared to the information reported on student logs and data abstracted from the patients' charts. Patient Tracker identified 30% more encounters than the student logs. Compared to the student logs, Patient Tracker contained a higher average number of diagnoses per encounter (2.28 vs. 1.03, p<0.01). The diagnostic data contained in Patient Tracker was also more accurate under 4 different definitions of accuracy. Only 1.3% (9/677) of diagnoses in Patient Tracker vs. 16.9% (102/601) of diagnoses in the logs could not be validated in patients' charts (p<0.01). Patient Tracker is a more effective and accurate tool for documenting student clinical encounters than the conventional student logs. PMID:18999155
VLF Trimpi modelling on the path NWC-Dunedin using both finite element and 3D Born modelling
NASA Astrophysics Data System (ADS)
Nunn, D.; Hayakawa, K. B. M.
1998-10-01
This paper investigates the numerical modelling of VLF Trimpis, produced by a D region inhomogeneity on the great circle path. Two different codes are used to model Trimpis on the path NWC-Dunedin. The first is a 2D Finite Element Method Code (FEM), whose solutions are rigorous and valid in the strong scattering or non-Born limit. The second code is a 3D model that invokes the Born approximation. The predicted Trimpis from these codes compare very closely, thus confirming the validity of both models. The modal scattering matrices for both codes are analysed in some detail and are found to have a comparable structure. They indicate strong scattering between the dominant TM modes. Analysis of the scattering matrix from the FEM code shows that departure from linear Born behaviour occurs when the inhomogeneity has a horizontal scale size of about 100 km and a maximum electron density enhancement at 75 km altitude of about 6 electrons.
Design of high-strength refractory complex solid-solution alloys
Singh, Prashant; Sharma, Aayush; Smirnov, A. V.; ...
2018-03-28
Nickel-based superalloys and near-equiatomic high-entropy alloys containing molybdenum are known for higher temperature strength and corrosion resistance. Yet, complex solid-solution alloys offer a huge design space to tune for optimal properties at slightly reduced entropy. For refractory Mo-W-Ta-Ti-Zr, we showcase KKR electronic structure methods via the coherent-potential approximation to identify alloys over five-dimensional design space with improved mechanical properties and necessary global (formation enthalpy) and local (short-range order) stability. Deformation is modeled with classical molecular dynamic simulations, validated from our first-principle data. We predict complex solid-solution alloys of improved stability with greatly enhanced modulus of elasticity (3× at 300 K) over near-equiatomic cases, as validated experimentally, and with higher moduli above 500 K over commercial alloys (2.3× at 2000 K). We also show that optimal complex solid-solution alloys are not described well by classical potentials due to critical electronic effects.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Curceanu, C.; Bragadireanu, M.; Sirghi, D.
The Pauli Exclusion Principle (PEP) is one of the basic principles of modern physics and, even if there are no compelling reasons to doubt its validity, it is still debated today because an intuitive, elementary explanation is still missing, and because of its unique stand among the basic symmetries of physics. We present an experimental test of the validity of the Pauli Exclusion Principle for electrons based on a straightforward idea put forward a few years ago by Ramberg and Snow (E. Ramberg and G. A. Snow 1990 Phys. Lett. B 238 438). We performed a very accurate search of X-rays from the Pauli-forbidden atomic transitions of electrons in the already filled 1S shells of copper atoms. Although the experiment has a very simple structure, it poses deep conceptual and interpretational problems. Here we describe the experimental method and recent experimental results interpreted as an upper limit for the probability to violate the Pauli Exclusion Principle. We also present future plans to upgrade the experimental apparatus.
Design of high-strength refractory complex solid-solution alloys
DOE Office of Scientific and Technical Information (OSTI.GOV)
Singh, Prashant; Sharma, Aayush; Smirnov, A. V.
Nickel-based superalloys and near-equiatomic high-entropy alloys containing molybdenum are known for higher temperature strength and corrosion resistance. Yet, complex solid-solution alloys offer a huge design space to tune for optimal properties at slightly reduced entropy. For refractory Mo-W-Ta-Ti-Zr, we showcase KKR electronic structure methods via the coherent-potential approximation to identify alloys over five-dimensional design space with improved mechanical properties and necessary global (formation enthalpy) and local (short-range order) stability. Deformation is modeled with classical molecular dynamic simulations, validated from our first-principle data. We predict complex solid-solution alloys of improved stability with greatly enhanced modulus of elasticity (3× at 300 K) over near-equiatomic cases, as validated experimentally, and with higher moduli above 500 K over commercial alloys (2.3× at 2000 K). We also show that optimal complex solid-solution alloys are not described well by classical potentials due to critical electronic effects.
Molecular dynamics-based refinement and validation for sub-5 Å cryo-electron microscopy maps.
Singharoy, Abhishek; Teo, Ivan; McGreevy, Ryan; Stone, John E; Zhao, Jianhua; Schulten, Klaus
2016-07-07
Two structure determination methods, based on the molecular dynamics flexible fitting (MDFF) paradigm, are presented that resolve sub-5 Å cryo-electron microscopy (EM) maps with either single structures or ensembles of such structures. The methods, denoted cascade MDFF and resolution exchange MDFF, sequentially re-refine a search model against a series of maps of progressively higher resolutions, which ends with the original experimental resolution. Application of sequential re-refinement enables MDFF to achieve a radius of convergence of ~25 Å demonstrated with the accurate modeling of β-galactosidase and TRPV1 proteins at 3.2 Å and 3.4 Å resolution, respectively. The MDFF refinements uniquely offer map-model validation and B-factor determination criteria based on the inherent dynamics of the macromolecules studied, captured by means of local root mean square fluctuations. The MDFF tools described are available to researchers through an easy-to-use and cost-effective cloud computing resource on Amazon Web Services.
Wang, Ning; Björvell, Catrin; Hailey, David; Yu, Ping
2014-12-01
To develop an Australian nursing documentation in aged care instrument, the Quality of Australian Nursing Documentation in Aged Care (QANDAC), to measure the quality of paper-based and electronic resident records. The instrument was based on the nursing process model and on three attributes of documentation quality identified in a systematic review. The development process involved five phases following approaches to designing criterion-referenced measures. The face and content validities and the inter-rater reliability of the instrument were estimated using a focus group approach and consensus model. The instrument contains 34 questions in three sections: completion of nursing history and assessment, description of the care process and meeting the requirements of data entry. Estimates of the validity and inter-rater reliability of the instrument gave satisfactory results. The QANDAC instrument may be a useful audit tool for quality improvement and research in aged care documentation. © 2013 ACOTA.
Lin, Jou-Wei; Yang, Chen-Wei
2010-01-01
The objective of this study was to develop and validate an automated acquisition system to assess quality of care (QC) measures for cardiovascular diseases. This system, combining searching and retrieval algorithms, was designed to extract QC measures from electronic discharge notes and to estimate the attainment rates relative to the current standards of care. It was developed on patients with ST-segment elevation myocardial infarction and tested on patients with unstable angina/non-ST-segment elevation myocardial infarction, two diseases sharing almost the same QC measures. The system was able to reach reasonable agreement (κ value) with medical experts, from 0.65 (early reperfusion rate) to 0.97 (β-blockers and lipid-lowering agents before discharge), for different QC measures in the test set, and was then applied to evaluate QC in patients who underwent coronary artery bypass grafting surgery. The results validated a new tool to reliably extract QC measures for cardiovascular diseases. PMID:20442141
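The agreement statistic quoted above (κ) can be computed directly from a 2×2 table of automated versus expert judgments. The snippet below is a generic illustration of Cohen's kappa; the counts are invented and are not taken from the study.

```python
def cohens_kappa(a, b, c, d):
    """Cohen's kappa for a 2x2 agreement table.

    a = both raters 'yes', b = rater1 yes / rater2 no,
    c = rater1 no / rater2 yes, d = both raters 'no'.
    """
    n = a + b + c + d
    p_o = (a + d) / n                        # observed agreement
    p_yes = ((a + b) / n) * ((a + c) / n)    # chance agreement on 'yes'
    p_no = ((c + d) / n) * ((b + d) / n)     # chance agreement on 'no'
    p_e = p_yes + p_no
    return (p_o - p_e) / (1 - p_e)

# Hypothetical counts: algorithm vs. expert on whether a QC measure was attained.
print(round(cohens_kappa(80, 5, 7, 108), 2))
```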
NASA Astrophysics Data System (ADS)
Min, Qi; Su, Maogen; Wang, Bo; Cao, Shiquan; Sun, Duixiong; Dong, Chenzhong
2018-05-01
The radiation and dynamics properties of laser-produced carbon plasma in vacuum were studied experimentally with the aid of a spatio-temporally resolved emission spectroscopy technique. In addition, a radiation hydrodynamics model based on the fluid dynamic equations and the radiative transfer equation was presented, and calculation of the charge states was performed within the time-dependent collisional radiative model. The detailed temporal and spatial evolution of plasma parameters, such as velocity, electron temperature, charge state distribution, energy level population, and various atomic processes, has been analyzed. At the same time, the effects of different atomic processes on the charge state distribution were examined. Finally, the validity of assuming local thermodynamic equilibrium in the carbon plasma expansion was checked, and the results clearly indicate that the assumption was valid only at the initial (<80 ns) stage of plasma expansion. At longer delay times, it was not applicable near the plasma boundary because of a sharp drop in plasma temperature and electron density.
McCormick, Jessica; Delfabbro, Paul; Denson, Linley A
2012-12-01
The aim of this study was to conduct an empirical investigation of the validity of Jacobs' (in J Gambl Behav 2:15-31, 1986) general theory of addictions in relation to gambling problems associated with electronic gaming machines (EGMs). Regular EGM gamblers (n = 190) completed a series of standardised measures relating to psychological and physiological vulnerability, substance use, dissociative experiences, early childhood trauma and abuse, and problem gambling (the Problem Gambling Severity Index). Statistical analysis using structural equation modelling revealed clear relationships between childhood trauma and life stressors and psychological vulnerability, dissociative-like experiences and problem gambling. These findings confirm and extend a previous model validated by Gupta and Derevensky (in J Gambl Stud 14: 17-49, 1998) using an adolescent population. The significance of these findings is discussed for existing pathway models of problem gambling, for Jacobs' theory, and for clinicians engaged in assessment and intervention.
Chang, Yuanhan; Tambe, Abhijit Anil; Maeda, Yoshinobu; Wada, Masahiro; Gonda, Tomoya
2018-03-08
A literature review of finite element analysis (FEA) studies of dental implants with their model validation process was performed to establish the criteria for evaluating validation methods with respect to their similarity to biological behavior. An electronic literature search of PubMed was conducted up to January 2017 using the Medical Subject Headings "dental implants" and "finite element analysis." After accessing the full texts, the context of each article was searched using the words "valid" and "validation" and articles in which these words appeared were read to determine whether they met the inclusion criteria for the review. Of 601 articles published from 1997 to 2016, 48 that met the eligibility criteria were selected. The articles were categorized according to their validation method as follows: in vivo experiments in humans (n = 1) and other animals (n = 3), model experiments (n = 32), others' clinical data and past literature (n = 9), and other software (n = 2). Validation techniques with a high level of sufficiency and efficiency are still rare in FEA studies of dental implants. High-level validation, especially using in vivo experiments tied to an accurate finite element method, needs to become an established part of FEA studies. The recognition of a validation process should be considered when judging the practicality of an FEA study.
Analytic model of a magnetically insulated transmission line with collisional flow electrons
NASA Astrophysics Data System (ADS)
Stygar, W. A.; Wagoner, T. C.; Ives, H. C.; Corcoran, P. A.; Cuneo, M. E.; Douglas, J. W.; Gilliland, T. L.; Mazarakis, M. G.; Ramirez, J. J.; Seamen, J. F.; Seidel, D. B.; Spielman, R. B.
2006-09-01
We have developed a relativistic-fluid model of the flow-electron plasma in a steady-state one-dimensional magnetically insulated transmission line (MITL). The model assumes that the electrons are collisional and, as a result, drift toward the anode. The model predicts that in the limit of fully developed collisional flow, the relation between the voltage Va, anode current Ia, cathode current Ik, and geometric impedance Z0 of a 1D planar MITL can be expressed as Va = Ia Z0 h(χ), where h(χ) ≡ [(χ+1)/4(χ−1)]^(1/2) − ln[χ + (χ²−1)^(1/2)]/2χ(χ−1) and χ ≡ Ia/Ik. The relation is valid when Va ≳ 1 MV. In the minimally insulated limit, the anode current Ia,min = 1.78 Va/Z0, the electron-flow current If,min = 1.25 Va/Z0, and the flow impedance Zf,min = 0.588 Z0. {The electron-flow current If ≡ Ia − Ik. Following Mendel and Rosenthal [Phys. Plasmas 2, 1332 (1995)], we define the flow impedance Zf as Va/(Ia²−Ik²)^(1/2).} In the well-insulated limit (i.e., when Ia ≫ Ia,min), the electron-flow current If = 9Va²/8IaZ0² and the flow impedance Zf = 2Z0/3. Similar results are obtained for a 1D collisional MITL with coaxial cylindrical electrodes, when the inner conductor is at a negative potential with respect to the outer, and Z0 ≲ 40 Ω. We compare the predictions of the collisional model to those of several MITL models that assume the flow electrons are collisionless. We find that at given values of Va and Z0, collisions can significantly increase both Ia,min and If,min above the values predicted by the collisionless models, and decrease Zf,min. When Ia ≫ Ia,min, we find that, at given values of Va, Z0, and Ia, collisions can significantly increase If and decrease Zf. Since the steady-state collisional model is valid only when the drift of electrons toward the anode has had sufficient time to establish fully developed collisional flow, and collisionless models assume there is no net electron drift toward the anode, we expect these two types of models to provide theoretical bounds on Ia, If, and Zf.
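The closed-form relation quoted above is straightforward to evaluate numerically. The sketch below implements h(χ) and the flow impedance Zf = Va/(Ia²−Ik²)^(1/2) exactly as written in the abstract and checks that Zf approaches the stated well-insulated value 2Z0/3 as χ → 1; the numerical values of Z0 and Ik are illustrative only.

```python
import numpy as np

def h(chi):
    """h(chi) from the 1-D planar collisional MITL relation Va = Ia * Z0 * h(chi)."""
    return (np.sqrt((chi + 1.0) / (4.0 * (chi - 1.0)))
            - np.log(chi + np.sqrt(chi**2 - 1.0)) / (2.0 * chi * (chi - 1.0)))

def flow_impedance(Va, Ia, Ik):
    """Zf = Va / sqrt(Ia^2 - Ik^2), following Mendel and Rosenthal."""
    return Va / np.sqrt(Ia**2 - Ik**2)

Z0 = 5.0                                   # illustrative geometric impedance, ohms
Ik = 1.0e6                                 # illustrative cathode current, A
for chi in (1.01, 1.001, 1.0001):          # well-insulated limit: Ia -> Ik
    Ia = chi * Ik
    Va = Ia * Z0 * h(chi)                  # voltage implied by the model relation
    Zf = flow_impedance(Va, Ia, Ik)
    print(f"chi = {chi:<7} Zf/Z0 = {Zf / Z0:.4f}  (expected -> 2/3)")
```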
Limitations on the upconversion of ion sound to Langmuir turbulence
NASA Technical Reports Server (NTRS)
Vlahos, L.; Papadopoulos, K.
1982-01-01
The weak turbulence theory of Tsytovich, Stenflo and Wilhelmsson (1981) for evaluating the nonlinear transfer of ion acoustic waves to Langmuir waves is shown to be limited in its region of validity with respect to the level of ion acoustic waves. It is also demonstrated that, in applying the upconversion of ion sound to Langmuir waves to electron acceleration, nonlinear scattering should be included self-consistently, which results in a suppression of the upconversion process. The impossibility of accelerating electrons by such a process in any reasonable physical system is thereby reaffirmed.
Nonequilibrium Nonideal Nanoplasma Generated by a Fast Single Ion in Condensed Matter
DOE Office of Scientific and Technical Information (OSTI.GOV)
Faenov, A. Ya.; Kansai Photon Science Institut, Japan Atomic Energy Agency; Lankin, A. V.
A plasma model of the relaxation of a medium in heavy-ion tracks in condensed matter is proposed. The model is based on three assumptions: a Maxwellian distribution of the plasma electrons, localization of the plasma inside the track nanochannel, and constant values of the plasma electron density and temperature during the X-ray irradiation. It is demonstrated that the plasma relaxation model adequately describes the X-ray spectra observed upon interaction of a fast ion with a condensed target. The assumptions of the plasma relaxation model are validated by molecular dynamics modeling and simulation.
Multiple scattering calculations of relativistic electron energy loss spectra
NASA Astrophysics Data System (ADS)
Jorissen, K.; Rehr, J. J.; Verbeeck, J.
2010-04-01
A generalization of the real-space Green’s-function approach is presented for ab initio calculations of relativistic electron energy loss spectra (EELS) which are particularly important in anisotropic materials. The approach incorporates relativistic effects in terms of the transition tensor within the dipole-selection rule. In particular, the method accounts for relativistic corrections to the magic angle in orientation resolved EELS experiments. The approach is validated by a study of the graphite CK edge, for which we present an accurate magic angle measurement consistent with the predicted value.
NASA Astrophysics Data System (ADS)
Dowling, J. A.; Burdett, N.; Greer, P. B.; Sun, J.; Parker, J.; Pichler, P.; Stanwell, P.; Chandra, S.; Rivest-Hénault, D.; Ghose, S.; Salvado, O.; Fripp, J.
2014-03-01
Our group have been developing methods for MRI-alone prostate cancer radiation therapy treatment planning. To assist with clinical validation of the workflow, we are investigating a cloud platform solution for research purposes. Benefits of cloud computing can include increased scalability, performance and extensibility while reducing total cost of ownership. In this paper we demonstrate the generation of DICOM-RT directories containing an automatic average-atlas-based electron density image and fast pelvic organ contouring from whole-pelvis MR scans.
Three-dimensional reconstruction of the giant mimivirus particle with an x-ray free-electron laser.
Ekeberg, Tomas; Svenda, Martin; Abergel, Chantal; Maia, Filipe R N C; Seltzer, Virginie; Claverie, Jean-Michel; Hantke, Max; Jönsson, Olof; Nettelblad, Carl; van der Schot, Gijs; Liang, Mengning; DePonte, Daniel P; Barty, Anton; Seibert, M Marvin; Iwan, Bianca; Andersson, Inger; Loh, N Duane; Martin, Andrew V; Chapman, Henry; Bostedt, Christoph; Bozek, John D; Ferguson, Ken R; Krzywinski, Jacek; Epp, Sascha W; Rolles, Daniel; Rudenko, Artem; Hartmann, Robert; Kimmel, Nils; Hajdu, Janos
2015-03-06
We present a proof-of-concept three-dimensional reconstruction of the giant mimivirus particle from experimentally measured diffraction patterns from an x-ray free-electron laser. Three-dimensional imaging requires the assembly of many two-dimensional patterns into an internally consistent Fourier volume. Since each particle is randomly oriented when exposed to the x-ray pulse, relative orientations have to be retrieved from the diffraction data alone. We achieve this with a modified version of the expand, maximize and compress algorithm and validate our result using new methods.
Validation of heart and lung teleauscultation on an Internet-based system.
Fragasso, Gabriele; De Benedictis, Marialuisa; Palloshi, Altin; Moltrasio, Marco; Cappelletti, Alberto; Carlino, Mauro; Marchisi, Angelo; Pala, Mariagrazia; Alfieri, Ottavio; Margonato, Alberto
2003-11-01
The feasibility and accuracy of an Internet-based system for teleauscultation was evaluated in 103 cardiac patients, who were auscultated by the same cardiologist with a conventional stethoscope and with an Internet-based method, using an electronic stethoscope and transmitting heart and lung sounds between computer work stations. In 92% of patients, the results of electronic and acoustic auscultation coincided, indicating that teleauscultation may be considered a reliable method for assessing cardiac patients and could, therefore, be adopted in the context of comprehensive telecare programs.
The Trojan Horse Method in nuclear astrophysics
DOE Office of Scientific and Technical Information (OSTI.GOV)
Spitaleri, C., E-mail: spitaleri@lns.infn.it; Mukhamedzhanov, A. M.; Blokhintsev, L. D.
2011-12-15
The study of energy production and nucleosynthesis in stars requires an increasingly precise knowledge of the nuclear reaction rates at the energies of interest. To overcome the experimental difficulties arising from the small cross sections at those energies and from the presence of the electron screening, the Trojan Horse Method has been introduced. The method provides a valid alternative path to measure unscreened low-energy cross sections of reactions between charged particles, and to retrieve information on the electron screening potential when ultra-low energy direct measurements are available.
MaRIE Undulator & XFEL Systems
DOE Office of Scientific and Technical Information (OSTI.GOV)
Nguyen, Dinh Cong; Marksteiner, Quinn R.; Anisimov, Petr Mikhaylovich
The 22 slides in this presentation treat the subject under the following headings: MaRIE XFEL Performance Parameters, Input Electron Beam Parameters, Undulator Design, Genesis Simulations, Risks, and Summary. It is concluded that time-dependent Genesis simulations show the MaRIE XFEL can deliver the number of photons within the required bandwidth, provided a number of assumptions are met; that the highest risks are associated with the electron beam driving the XFEL undulator; and that risks associated with the undulator and/or distributed seeding technique may be evaluated or retired by performing early validation experiments.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Waye, Scot
Power electronics that use high-temperature devices pose a challenge for thermal management. With the devices running at higher temperatures and having a smaller footprint, the heat fluxes increase relative to previous power electronic designs. This project overview presents an approach to examining and designing thermal management strategies through cooling technologies that keep devices within temperature limits, dissipate the heat generated by the devices, and protect electrical interconnects and other components for inverter, converter, and charger applications. This analysis, validation, and demonstration effort takes a multi-scale approach over the device, module, and system levels to reduce size, weight, and cost.
Non-perturbative aspects of particle acceleration in non-linear electrodynamics
DOE Office of Scientific and Technical Information (OSTI.GOV)
Burton, David A.; Flood, Stephen P.; Wen, Haibao
2015-04-15
We undertake an investigation of particle acceleration in the context of non-linear electrodynamics. We deduce the maximum energy that an electron can gain in a non-linear density wave in a magnetised plasma, and we show that an electron can “surf” a sufficiently intense Born-Infeld electromagnetic plane wave and be strongly accelerated by the wave. The first result is valid for a large class of physically reasonable modifications of the linear Maxwell equations, whilst the second result exploits the special mathematical structure of Born-Infeld theory.
Barnado, April; Casey, Carolyn; Carroll, Robert J; Wheless, Lee; Denny, Joshua C; Crofford, Leslie J
2017-05-01
To study systemic lupus erythematosus (SLE) in the electronic health record (EHR), we must accurately identify patients with SLE. Our objective was to develop and validate novel EHR algorithms that use International Classification of Diseases, Ninth Revision (ICD-9), Clinical Modification codes, laboratory testing, and medications to identify SLE patients. We used Vanderbilt's Synthetic Derivative, a de-identified version of the EHR, with 2.5 million subjects. We selected all individuals with at least 1 SLE ICD-9 code (710.0), yielding 5,959 individuals. To create a training set, 200 subjects were randomly selected for chart review. A subject was defined as a case if diagnosed with SLE by a rheumatologist, nephrologist, or dermatologist. Positive predictive values (PPVs) and sensitivity were calculated for combinations of code counts of the SLE ICD-9 code, a positive antinuclear antibody (ANA), ever use of medications, and a keyword of "lupus" in the problem list. The algorithms with the highest PPV were each internally validated using a random set of 100 individuals from the remaining 5,759 subjects. The algorithm with the highest PPV at 95% in the training set and 91% in the validation set was 3 or more counts of the SLE ICD-9 code, ANA positive (≥1:40), and ever use of both disease-modifying antirheumatic drugs and steroids, while excluding individuals with systemic sclerosis and dermatomyositis ICD-9 codes. We developed and validated the first EHR algorithm that incorporates laboratory values and medications with the SLE ICD-9 code to identify patients with SLE accurately. © 2016, American College of Rheumatology.
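A rule such as the best-performing algorithm described above is essentially a boolean filter over per-patient features. The sketch below shows one way such a filter could be expressed; the record structure and field names are hypothetical, and the thresholds simply restate the rule given in the abstract (3 or more SLE ICD-9 codes, ANA ≥ 1:40, ever-use of both DMARDs and steroids, and no systemic sclerosis or dermatomyositis codes).

```python
# Hypothetical per-patient feature records; field names are invented for illustration.
patients = [
    {"id": 1, "sle_icd9_count": 4, "ana_titer": 160, "ever_dmard": True,
     "ever_steroid": True, "has_ssc_icd9": False, "has_dm_icd9": False},
    {"id": 2, "sle_icd9_count": 1, "ana_titer": 0, "ever_dmard": False,
     "ever_steroid": True, "has_ssc_icd9": False, "has_dm_icd9": False},
]

def meets_sle_algorithm(p):
    """Restates the highest-PPV rule from the abstract as a boolean predicate."""
    return (p["sle_icd9_count"] >= 3          # 3+ counts of ICD-9 710.0
            and p["ana_titer"] >= 40          # ANA positive at >= 1:40
            and p["ever_dmard"]               # ever use of a DMARD
            and p["ever_steroid"]             # ever use of a steroid
            and not p["has_ssc_icd9"]         # exclude systemic sclerosis codes
            and not p["has_dm_icd9"])         # exclude dermatomyositis codes

cases = [p["id"] for p in patients if meets_sle_algorithm(p)]
print(cases)   # -> [1]
```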
Ban, Jong-Wook; Emparanza, José Ignacio; Urreta, Iratxe; Burls, Amanda
2016-01-01
Background Many new clinical prediction rules are derived and validated. But the design and reporting quality of clinical prediction research has been less than optimal. We aimed to assess whether design characteristics of validation studies were associated with the overestimation of clinical prediction rules’ performance. We also aimed to evaluate whether validation studies clearly reported important methodological characteristics. Methods Electronic databases were searched for systematic reviews of clinical prediction rule studies published between 2006 and 2010. Data were extracted from the eligible validation studies included in the systematic reviews. A meta-analytic meta-epidemiological approach was used to assess the influence of design characteristics on predictive performance. From each validation study, it was assessed whether 7 design and 7 reporting characteristics were properly described. Results A total of 287 validation studies of clinical prediction rule were collected from 15 systematic reviews (31 meta-analyses). Validation studies using case-control design produced a summary diagnostic odds ratio (DOR) 2.2 times (95% CI: 1.2–4.3) larger than validation studies using cohort design and unclear design. When differential verification was used, the summary DOR was overestimated by twofold (95% CI: 1.2 -3.1) compared to complete, partial and unclear verification. The summary RDOR of validation studies with inadequate sample size was 1.9 (95% CI: 1.2 -3.1) compared to studies with adequate sample size. Study site, reliability, and clinical prediction rule was adequately described in 10.1%, 9.4%, and 7.0% of validation studies respectively. Conclusion Validation studies with design shortcomings may overestimate the performance of clinical prediction rules. The quality of reporting among studies validating clinical prediction rules needs to be improved. PMID:26730980
Ban, Jong-Wook; Emparanza, José Ignacio; Urreta, Iratxe; Burls, Amanda
2016-01-01
Many new clinical prediction rules are derived and validated. But the design and reporting quality of clinical prediction research has been less than optimal. We aimed to assess whether design characteristics of validation studies were associated with the overestimation of clinical prediction rules' performance. We also aimed to evaluate whether validation studies clearly reported important methodological characteristics. Electronic databases were searched for systematic reviews of clinical prediction rule studies published between 2006 and 2010. Data were extracted from the eligible validation studies included in the systematic reviews. A meta-analytic meta-epidemiological approach was used to assess the influence of design characteristics on predictive performance. From each validation study, it was assessed whether 7 design and 7 reporting characteristics were properly described. A total of 287 validation studies of clinical prediction rule were collected from 15 systematic reviews (31 meta-analyses). Validation studies using case-control design produced a summary diagnostic odds ratio (DOR) 2.2 times (95% CI: 1.2-4.3) larger than validation studies using cohort design and unclear design. When differential verification was used, the summary DOR was overestimated by twofold (95% CI: 1.2 -3.1) compared to complete, partial and unclear verification. The summary RDOR of validation studies with inadequate sample size was 1.9 (95% CI: 1.2 -3.1) compared to studies with adequate sample size. Study site, reliability, and clinical prediction rule was adequately described in 10.1%, 9.4%, and 7.0% of validation studies respectively. Validation studies with design shortcomings may overestimate the performance of clinical prediction rules. The quality of reporting among studies validating clinical prediction rules needs to be improved.
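The diagnostic odds ratio (DOR) on which these comparisons are built is computed from a 2×2 table of rule prediction versus outcome, and the relative DOR (RDOR) is then a ratio of two summary DORs. The snippet below is a generic illustration with invented counts, not data from the review.

```python
def diagnostic_odds_ratio(tp, fp, fn, tn):
    """DOR = (TP/FN) / (FP/TN) = (TP*TN) / (FP*FN)."""
    return (tp * tn) / (fp * fn)

# Hypothetical validation-study counts: rule-positive/negative vs. event yes/no.
dor_cohort = diagnostic_odds_ratio(tp=90, fp=30, fn=10, tn=70)
dor_case_control = diagnostic_odds_ratio(tp=95, fp=20, fn=5, tn=80)

# A relative DOR > 1 would indicate the second design yields larger apparent accuracy.
print(round(dor_cohort, 1), round(dor_case_control, 1),
      round(dor_case_control / dor_cohort, 1))
```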
Seçkin, Gül; Yeatts, Dale; Hughes, Susan; Hudson, Cassie; Bell, Valarie
2016-07-11
The Internet, with its capacity to provide information that transcends time and space barriers, continues to transform how people find and apply information to their own lives. With the current explosion in electronic sources of health information, including thousands of websites and hundreds of mobile phone health apps, electronic health literacy is gaining an increasing prominence in health and medical research. An important dimension of electronic health literacy is the ability to appraise the quality of information that will facilitate everyday health care decisions. Health information seekers explore their care options by gathering information from health websites, blogs, Web-based forums, social networking websites, and advertisements, despite the fact that information quality on the Internet varies greatly. Nonetheless, research has lagged behind in establishing multidimensional instruments, in part due to the evolving construct of health literacy itself. The purpose of this study was to examine psychometric properties of a new electronic health literacy (ehealth literacy) measure in a national sample of Internet users with specific attention to older users. Our paper is motivated by the fact that ehealth literacy is an underinvestigated area of inquiry. Our sample was drawn from a panel of more than 55,000 participants maintained by Knowledge Networks, the largest national probability-based research panel for Web-based surveys. We examined the factor structure of a 19-item electronic Health Literacy Scale (e-HLS) through exploratory factor analysis (EFA) and confirmatory factor analysis, internal consistency reliability, and construct validity on sample of adults (n=710) and a subsample of older adults (n=194). The AMOS graphics program 21.0 was used to construct a measurement model, linking latent factors obtained from EFA with 19 indicators to determine whether this factor structure achieved a good fit with our entire sample and the subsample (age ≥ 60 years). Linear regression analyses were performed in separate models to examine: (1) the construct validity of the e-HLS and (2) its association with respondents' demographic characteristics and health variables. The EFA produced a 3-factor solution: communication (2 items), trust (4 items), and action (13 items). The 3-factor structure of the e-HLS was found to be invariant for the subsample. Fit indices obtained were as follows: full sample: χ(2) (710)=698.547, df=131, P<.001, comparative fit index (CFI)=0.94, normed fit index (NFI)=0.92, root mean squared error of approximation (RMSEA)=0.08; and for the older subsample (age ≥ 60 years): χ(2) (194)=275.744, df=131, P<.001, CFI=0.95, NFI=0.90, RMSEA=0.08. The analyses supported the e-HLS validity and internal reliability for the full sample and subsample. The overwhelming majority of our respondents reported a great deal of confidence in their ability to appraise the quality of information obtained from the Internet, yet less than half reported performing quality checks contained on the e-HLS.
Yeatts, Dale; Hughes, Susan; Hudson, Cassie; Bell, Valarie
2016-01-01
Background The Internet, with its capacity to provide information that transcends time and space barriers, continues to transform how people find and apply information to their own lives. With the current explosion in electronic sources of health information, including thousands of websites and hundreds of mobile phone health apps, electronic health literacy is gaining an increasing prominence in health and medical research. An important dimension of electronic health literacy is the ability to appraise the quality of information that will facilitate everyday health care decisions. Health information seekers explore their care options by gathering information from health websites, blogs, Web-based forums, social networking websites, and advertisements, despite the fact that information quality on the Internet varies greatly. Nonetheless, research has lagged behind in establishing multidimensional instruments, in part due to the evolving construct of health literacy itself. Objective The purpose of this study was to examine psychometric properties of a new electronic health literacy (ehealth literacy) measure in a national sample of Internet users with specific attention to older users. Our paper is motivated by the fact that ehealth literacy is an underinvestigated area of inquiry. Methods Our sample was drawn from a panel of more than 55,000 participants maintained by Knowledge Networks, the largest national probability-based research panel for Web-based surveys. We examined the factor structure of a 19-item electronic Health Literacy Scale (e-HLS) through exploratory factor analysis (EFA) and confirmatory factor analysis, internal consistency reliability, and construct validity on sample of adults (n=710) and a subsample of older adults (n=194). The AMOS graphics program 21.0 was used to construct a measurement model, linking latent factors obtained from EFA with 19 indicators to determine whether this factor structure achieved a good fit with our entire sample and the subsample (age ≥ 60 years). Linear regression analyses were performed in separate models to examine: (1) the construct validity of the e-HLS and (2) its association with respondents’ demographic characteristics and health variables. Results The EFA produced a 3-factor solution: communication (2 items), trust (4 items), and action (13 items). The 3-factor structure of the e-HLS was found to be invariant for the subsample. Fit indices obtained were as follows: full sample: χ2 (710)=698.547, df=131, P<.001, comparative fit index (CFI)=0.94, normed fit index (NFI)=0.92, root mean squared error of approximation (RMSEA)=0.08; and for the older subsample (age ≥ 60 years): χ2 (194)=275.744, df=131, P<.001, CFI=0.95, NFI=0.90, RMSEA=0.08. Conclusions The analyses supported the e-HLS validity and internal reliability for the full sample and subsample. The overwhelming majority of our respondents reported a great deal of confidence in their ability to appraise the quality of information obtained from the Internet, yet less than half reported performing quality checks contained on the e-HLS. PMID:27400726
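The RMSEA values reported above follow directly from the model chi-square, its degrees of freedom, and the sample size. The sketch below applies the standard formula RMSEA = sqrt(max(χ²−df, 0)/(df·(N−1))); it assumes the parenthesized numbers in the abstract (710 and 194) are the sample sizes, and under that assumption it reproduces the reported value of about 0.08.

```python
from math import sqrt

def rmsea(chi2, df, n):
    """Root mean square error of approximation from a model chi-square."""
    return sqrt(max(chi2 - df, 0.0) / (df * (n - 1)))

# Values quoted in the abstract (assuming n is the parenthesized sample size).
print(round(rmsea(698.547, 131, 710), 3))   # full sample     -> ~0.078
print(round(rmsea(275.744, 131, 194), 3))   # older subsample -> ~0.076
```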
Liu, Chung-Feng; Cheng, Tain-Junn
2015-02-07
With respect to information management, most of the previous studies on the acceptance of healthcare information technologies were analyzed from "positive" perspectives. However, such acceptance is always influenced by both positive and negative factors and it is necessary to validate both in order to get a complete understanding. This study aims to explore physicians' acceptance of mobile electronic medical records based on the dual-factor model, which is comprised of inhibitors and enablers, to explain an individual's technology usage. Following an earlier healthcare study in the USA, the researchers conducted a similar survey for an Eastern country (Taiwan) to validate whether perceived threat to professional autonomy acts as a critical inhibitor. In addition, perceived mobility, which is regarded as a critical feature of mobile services, was also evaluated as a common antecedent variable in the model. Physicians from three branch hospitals of a medical group were invited to participate and complete questionnaires. Partial least squares, a structural equation modeling technique, was used to evaluate the proposed model for explanatory power and hypotheses testing. 158 valid questionnaires were collected, yielding a response rate of 33.40%. As expected, the inhibitor of perceived threat has a significant impact on the physicians' perceptions of usefulness as well as their intention to use. The enablers of perceived ease of use and perceived usefulness were also significant. In addition, as expected, perceived mobility was confirmed to have a significant impact on perceived ease of use, perceived usefulness and perceived threat. It was confirmed that the dual-factor model is a comprehensive method for exploring the acceptance of healthcare information technologies, both in Western and Eastern countries. Furthermore, perceived mobility was proven to be an effective antecedent variable in the model. The researchers believe that the results of this study will contribute to the research on the acceptance of healthcare information technologies, particularly with regards to mobile electronic medical records, based on the dual-factor viewpoints of academia and practice.
Sperandio, Naiara; Morais, Dayane de Castro; Priore, Silvia Eloiza
2018-02-01
The scope of this systematic review was to compare the food insecurity scales validated and used in countries of Latin America and the Caribbean, and to analyze the methods used in the validation studies. A search was conducted in the Lilacs, SciELO and Medline electronic databases. The publications were pre-selected by titles and abstracts, and subsequently by a full reading. Of the 16,325 studies reviewed, 14 were selected. Twelve validated scales were identified for the following countries: Venezuela, Brazil, Colombia, Bolivia, Ecuador, Costa Rica, Mexico, Haiti, the Dominican Republic, Argentina and Guatemala. Besides these, there is the Latin American and Caribbean scale, whose scope is regional. The scales differed in the standard reference used, the number of questions and the diagnosis of insecurity. The methods used by the studies for internal validation were calculation of Cronbach's alpha and the Rasch model; for external validation the authors calculated association and/or correlation with socioeconomic and food consumption variables. The successful experience of Latin America and the Caribbean in the development of national and regional scales can be an example for other countries that do not have this important indicator capable of measuring the phenomenon of food insecurity.
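One of the internal-validation statistics named here, Cronbach's alpha, has a simple closed form: α = k/(k−1) · (1 − Σ s_i² / s_total²), where k is the number of items, s_i² the variance of item i, and s_total² the variance of the summed score. The sketch below computes it for an invented response matrix; it illustrates the statistic only and does not reconstruct any of the reviewed validations.

```python
import numpy as np

def cronbach_alpha(items):
    """Cronbach's alpha for an (n_respondents x k_items) response matrix."""
    items = np.asarray(items, dtype=float)
    k = items.shape[1]
    item_vars = items.var(axis=0, ddof=1)          # per-item variances
    total_var = items.sum(axis=1).var(ddof=1)      # variance of the scale total
    return (k / (k - 1)) * (1.0 - item_vars.sum() / total_var)

# Hypothetical yes/no answers of 6 households to a 4-item food-insecurity scale.
responses = [[1, 1, 1, 0],
             [1, 1, 0, 0],
             [0, 0, 0, 0],
             [1, 1, 1, 1],
             [0, 1, 0, 0],
             [1, 0, 0, 0]]
print(round(cronbach_alpha(responses), 2))
```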
Scaling laws for positron production in laser-electron beam collisions
NASA Astrophysics Data System (ADS)
Blackburn, Tom; Ilderton, Anton; Murphy, Christopher; Marklund, Mattias
2017-10-01
Showers of gamma rays and positrons are produced when a multi-GeV electron beam collides with a super-intense laser pulse. All-optical realisation of this geometry, where the electron beam is generated by laser-wakefield acceleration, is currently attracting much experimental interest as a probe of radiation reaction and QED effects. These interactions may be modelled theoretically in the framework of strong-field QED or numerically by large-scale PIC simulation. To complement these, we present analytical scaling laws for the electron beam energy loss, gamma ray spectrum, and the positron yield and energy that are valid in the radiation-reaction-dominated regime. These indicate that by employing the collision of a 2 GeV electron beam with a laser pulse of intensity 5 ×1021Wcm-2 , it is possible to produce 10,000 positrons in a single shot at currently available laser facilities. The authors acknowledge support from the Knut and Alice Wallenberg Foundation.
NASA Technical Reports Server (NTRS)
Berman, A. L.
1977-01-01
Observations of Viking differenced S-band/X-band (S-X) range are shown to correlate strongly with Viking Doppler noise. A ratio of proportionality between downlink S-band plasma-induced range error and two-way Doppler noise is calculated. A new parameter (similar to the parameter epsilon which defines the ratio of local electron density fluctuations to mean electron density) is defined as a function of observed data sample interval (Tau) where the time-scale of the observations is 15 Tau. This parameter is interpreted to yield the ratio of net observed phase (or electron density) fluctuations to integrated electron density (in RMS meters/meter). Using this parameter and the thin phase-changing screen approximation, a value for the scale size L is calculated. To be consistent with Doppler noise observations, it is seen necessary for L to be proportional to closest approach distance a, and a strong function of the observed data sample interval, and hence the time-scale of the observations.
Electron Beam Charge Diagnostics for Laser Plasma Accelerators
DOE Office of Scientific and Technical Information (OSTI.GOV)
Nakamura, Kei; Gonsalves, Anthony; Lin, Chen
2011-06-27
A comprehensive study of charge diagnostics is conducted to verify their validity for measuring electron beams produced by laser plasma accelerators (LPAs). First, a scintillating screen (Lanex) was extensively studied using subnanosecond electron beams from the Advanced Light Source booster synchrotron at the Lawrence Berkeley National Laboratory. The Lanex was cross calibrated with an integrating current transformer (ICT) for electron energies up to 1.5 GeV, and the linear response of the screen was confirmed for charge density and intensity up to 160 pC/mm² and 0.4 pC/(ps mm²), respectively. After the radio-frequency accelerator based cross calibration, a series of measurements was conducted using electron beams from an LPA. Cross calibrations were carried out using an activation-based measurement that is immune to electromagnetic pulse noise, ICT, and Lanex. The diagnostics agreed within ±8%, showing that they all can provide accurate charge measurements for LPAs.
NASA Astrophysics Data System (ADS)
Mrigakshi, Alankrita; Hajdas, Wojtek; Marcinkowski, Radoslaw; Xiao, Hualin; Goncalves, Patricia; Pinto, Marco; Pinto, Costa; Marques, Arlindo; Meier, Dirk
2016-04-01
The RADEM instrument will serve as the radiation monitor for the JUICE spacecraft. It will characterize the highly dynamic radiation environment of the Jovian system by measuring the energy spectra of energetic electrons and protons up to 40 MeV and 250 MeV, respectively. It will also determine the directionality of 0.3-10 MeV electrons. Further goals include the detection of heavy ions, and the determination of the corresponding LET spectra and dose rates. Here, the tests of the Electron and Proton Telescopes, and the Directionality Detector of the RADEM Bread-Board model are described. The objective of these tests is to validate RADEM design and physical concept applied therein. The tests were performed at various irradiation facilities at the Paul Scherrer Institute (PSI) where energy ranges relevant for space applications can be covered (electrons: ≤100 MeV and protons: ≤230 MeV). The measured values are also compared with GEANT4 Monte-Carlo Simulation results.
Electronic structure of aqueous solutions: Bridging the gap between theory and experiments.
Pham, Tuan Anh; Govoni, Marco; Seidel, Robert; Bradforth, Stephen E; Schwegler, Eric; Galli, Giulia
2017-06-01
Predicting the electronic properties of aqueous liquids has been a long-standing challenge for quantum mechanical methods. However, it is a crucial step in understanding and predicting the key role played by aqueous solutions and electrolytes in a wide variety of emerging energy and environmental technologies, including battery and photoelectrochemical cell design. We propose an efficient and accurate approach to predict the electronic properties of aqueous solutions, on the basis of the combination of first-principles methods and experimental validation using state-of-the-art spectroscopic measurements. We present results of the photoelectron spectra of a broad range of solvated ions, showing that first-principles molecular dynamics simulations and electronic structure calculations using dielectric hybrid functionals provide a quantitative description of the electronic properties of the solvent and solutes, including excitation energies. The proposed computational framework is general and applicable to other liquids, thereby offering great promise in understanding and engineering solutions and liquid electrolytes for a variety of important energy technologies.
Model for intensity calculation in electron guns
NASA Astrophysics Data System (ADS)
Doyen, O.; De Conto, J. M.; Garnier, J. P.; Lefort, M.; Richard, N.
2007-04-01
The calculation of the current in an electron gun structure is one of the main problems in understanding electron gun physics. In particular, various simulation codes exist but often show important discrepancies with experiments. Moreover, those differences cannot be reduced because of the lack of physical information in these codes. We present a simple physical three-dimensional model, valid for all kinds of gun geometries. This model offers better precision than the other simulation codes and models encountered and allows a real understanding of electron gun physics. It is based only on the calculation of the Laplace electric field at the cathode, the use of the classical Child-Langmuir current density, and a geometrical correction to this law. Finally, the intensity versus voltage characteristic curve can be precisely described with only a few physical parameters. Indeed, we have shown that the electron gun current generation is governed mainly by the shape of the electric field at the cathode without beam and by an equivalent infinite planar diode gap distance.
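The classical Child-Langmuir current density referenced above is J = (4ε0/9)·√(2e/m)·V^(3/2)/d² for an infinite planar diode of gap d at voltage V. The sketch below evaluates this textbook expression only; it does not include the paper's geometrical correction or the Laplace-field calculation at the cathode, and the gap and voltage values are illustrative.

```python
from math import sqrt

EPS0 = 8.8541878128e-12       # vacuum permittivity, F/m
E_CHARGE = 1.602176634e-19    # elementary charge, C
M_E = 9.1093837015e-31        # electron mass, kg

def child_langmuir_density(voltage, gap):
    """Space-charge-limited current density (A/m^2) of an ideal planar diode."""
    return (4.0 * EPS0 / 9.0) * sqrt(2.0 * E_CHARGE / M_E) * voltage**1.5 / gap**2

# Illustrative operating point: 10 kV across a 5 mm gap.
J = child_langmuir_density(10e3, 5e-3)
print(f"J = {J:.3e} A/m^2")
# A real gun current would multiply J by an effective emitting area and apply
# the geometrical correction discussed in the paper.
```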
Superconducting parity effect across the Anderson limit
Vlaic, Sergio; Pons, Stéphane; Zhang, Tianzhen; Assouline, Alexandre; Zimmers, Alexandre; David, Christophe; Rodary, Guillemin; Girard, Jean-Christophe; Roditchev, Dimitri; Aubin, Hervé
2017-01-01
How small can superconductors be? For isolated nanoparticles subject to quantum size effects, P.W. Anderson in 1959 conjectured that superconductivity could only exist when the electronic level spacing δ is smaller than the superconducting gap energy Δ. Here we report a scanning tunnelling spectroscopy study of superconducting lead (Pb) nanocrystals grown on the (110) surface of InAs. We find that for nanocrystals of lateral size smaller than the Fermi wavelength of the 2D electron gas at the surface of InAs, the electronic transmission of the interface is weak; this leads to Coulomb blockade and enables the extraction of electron addition energy of the nanocrystals. For large nanocrystals, the addition energy displays superconducting parity effect, a direct consequence of Cooper pairing. Studying this parity effect as a function of nanocrystal volume, we find the suppression of Cooper pairing when the mean electronic level spacing overcomes the superconducting gap energy, thus demonstrating unambiguously the validity of the Anderson criterion. PMID:28240294
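The Anderson criterion tested here, δ < Δ, can be estimated for a given nanocrystal volume from the bulk density of states via δ ≈ 1/(g(E_F)·V). The sketch below uses a free-electron estimate g(E_F) = 3n/(2E_F) with illustrative textbook-style values for Pb (four valence electrons per atom, E_F ≈ 9.5 eV, Δ ≈ 1.35 meV); these numbers are assumptions for illustration and are not parameters taken from the study.

```python
# Free-electron estimate of the Anderson limit for a superconducting nanoparticle.
N_ATOMS_PER_M3 = 3.3e28        # assumed atomic density of Pb, atoms/m^3
VALENCE = 4                    # assumed valence electrons per Pb atom
E_F_EV = 9.5                   # assumed free-electron Fermi energy of Pb, eV
GAP_EV = 1.35e-3               # assumed bulk superconducting gap of Pb, eV

n = VALENCE * N_ATOMS_PER_M3                   # conduction-electron density, m^-3
dos = 3.0 * n / (2.0 * E_F_EV)                 # g(E_F), states per eV per m^3

def level_spacing_ev(volume_m3):
    """Mean electronic level spacing delta = 1 / (g(E_F) * V), in eV."""
    return 1.0 / (dos * volume_m3)

# Critical volume where delta equals the gap (Anderson limit), plus example crystals.
v_critical = 1.0 / (dos * GAP_EV)
print(f"critical volume ~ {v_critical * 1e27:.0f} nm^3")
for side_nm in (3.0, 5.0, 10.0):
    v = (side_nm * 1e-9) ** 3
    print(f"cube of side {side_nm} nm: delta = {level_spacing_ev(v)*1e3:.2f} meV "
          f"vs gap = {GAP_EV*1e3:.2f} meV")
```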
Electronic structure of aqueous solutions: Bridging the gap between theory and experiments
DOE Office of Scientific and Technical Information (OSTI.GOV)
Pham, Tuan Anh; Govoni, Marco; Seidel, Robert
Predicting the electronic properties of aqueous liquids has been a long-standing challenge for quantum mechanical methods. However, it is a crucial step in understanding and predicting the key role played by aqueous solutions and electrolytes in a wide variety of emerging energy and environmental technologies, including battery and photoelectrochemical cell design. We propose an efficient and accurate approach to predict the electronic properties of aqueous solutions, on the basis of the combination of first-principles methods and experimental validation using state-of-the-art spectroscopic measurements. We present results of the photoelectron spectra of a broad range of solvated ions, showing that first-principles molecular dynamics simulations and electronic structure calculations using dielectric hybrid functionals provide a quantitative description of the electronic properties of the solvent and solutes, including excitation energies. The proposed computational framework is general and applicable to other liquids, thereby offering great promise in understanding and engineering solutions and liquid electrolytes for a variety of important energy technologies.
Simulation-Based Approach to Determining Electron Transfer Rates Using Square-Wave Voltammetry.
Dauphin-Ducharme, Philippe; Arroyo-Currás, Netzahualcóyotl; Kurnik, Martin; Ortega, Gabriel; Li, Hui; Plaxco, Kevin W
2017-05-09
The efficiency with which square-wave voltammetry differentiates faradaic and charging currents makes it a particularly sensitive electroanalytical approach, as evidenced by its ability to measure nanomolar or even picomolar concentrations of electroactive analytes. Because of the relative complexity of the potential sweep it uses, however, the extraction of detailed kinetic and mechanistic information from square-wave data remains challenging. In response, we demonstrate here a numerical approach by which square-wave data can be used to determine electron transfer rates. Specifically, we have developed a numerical approach in which we fit the height and the shape of voltammograms collected over a range of square-wave frequencies and amplitudes to simulated voltammograms computed as functions of the heterogeneous rate constant and the electron transfer coefficient. As validation of the approach, we have used it to determine electron transfer kinetics in both freely diffusing and diffusionless surface-tethered species, obtaining electron transfer kinetics in all cases in good agreement with values derived using non-square-wave methods.
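For a surface-tethered (diffusionless) species, the kind of simulation described above reduces to integrating first-order Butler-Volmer kinetics under the square-wave potential program and sampling the current at the end of each half-cycle. The sketch below is a deliberately simplified version of that idea with invented waveform parameters; it is not the authors' simulator, and a production code would also handle the freely diffusing case.

```python
import numpy as np

F, R, T = 96485.0, 8.314, 298.0
f = F / (R * T)                              # 1/V

def simulate_swv_surface(k0, alpha, E0=0.0, E_start=0.3, E_end=-0.3,
                         dE_step=0.005, E_sw=0.025, freq=25.0):
    """Square-wave difference current (arbitrary units) for a surface-confined couple.

    Within each half-cycle the potential is constant, so the coverage of the oxidized
    form relaxes exponentially toward its Butler-Volmer steady state; the current is
    sampled at the end of each half-cycle, as in a real instrument.
    """
    n_steps = int(round(abs(E_end - E_start) / dE_step))
    direction = np.sign(E_end - E_start)
    tau = 1.0 / (2.0 * freq)                 # half-period = pulse duration
    theta_o = 1.0                            # start fully oxidized
    E_axis, delta_i = [], []
    for k in range(n_steps):
        E_base = E_start + direction * k * dE_step
        sampled = []
        for pulse_sign in (+1.0, -1.0):      # forward then reverse pulse
            E = E_base + direction * pulse_sign * E_sw
            kf = k0 * np.exp(-alpha * f * (E - E0))          # reduction rate, 1/s
            kb = k0 * np.exp((1.0 - alpha) * f * (E - E0))   # oxidation rate, 1/s
            theta_eq = kb / (kf + kb)
            decay = np.exp(-(kf + kb) * tau)
            current = (kf + kb) * (theta_o - theta_eq) * decay   # i at end of pulse
            theta_o = theta_eq + (theta_o - theta_eq) * decay    # coverage update
            sampled.append(current)
        E_axis.append(E_base)
        delta_i.append(sampled[0] - sampled[1])   # net (difference) current
    return np.array(E_axis), np.array(delta_i)

# The peak height and shape as functions of frequency and amplitude are what get
# matched against experimental voltammograms to extract k0 and alpha.
E, di = simulate_swv_surface(k0=50.0, alpha=0.5)
print(f"peak near {E[np.argmax(np.abs(di))]:+.3f} V")
```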
Fermi LAT observations of cosmic-ray electrons from 7 GeV to 1 TeV
DOE Office of Scientific and Technical Information (OSTI.GOV)
Ackermann, M.
We present the results of our analysis of cosmic-ray electrons using about 8 × 10⁶ electron candidates detected in the first 12 months on-orbit by the Fermi Large Area Telescope. This work extends our previously published cosmic-ray electron spectrum down to 7 GeV, giving a spectral range of approximately 2.5 decades up to 1 TeV. We describe in detail the analysis and its validation using beam-test and on-orbit data. In addition, we describe the spectrum measured via a subset of events selected for the best energy resolution as a cross-check on the measurement using the full event sample. Our electron spectrum can be described with a power law ∝ E^(−3.08 ± 0.05) with no prominent spectral features within systematic uncertainties. Within the limits of our uncertainties, we can accommodate a slight spectral hardening at around 100 GeV and a slight softening above 500 GeV.
Fermi LAT observations of cosmic-ray electrons from 7 GeV to 1 TeV
Ackermann, M.
2010-11-01
We present the results of our analysis of cosmic-ray electrons using about 8 × 10⁶ electron candidates detected in the first 12 months on-orbit by the Fermi Large Area Telescope. This work extends our previously published cosmic-ray electron spectrum down to 7 GeV, giving a spectral range of approximately 2.5 decades up to 1 TeV. We describe in detail the analysis and its validation using beam-test and on-orbit data. In addition, we describe the spectrum measured via a subset of events selected for the best energy resolution as a cross-check on the measurement using the full event sample. Our electron spectrum can be described with a power law ∝ E^(−3.08 ± 0.05) with no prominent spectral features within systematic uncertainties. Within the limits of our uncertainties, we can accommodate a slight spectral hardening at around 100 GeV and a slight softening above 500 GeV.
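A spectral index like the E^(−3.08) quoted above is obtained by fitting a power law to the measured flux per energy bin; in its simplest illustrative form this is a straight-line fit in log-log space. The sketch below is a generic demonstration on synthetic data, not the collaboration's likelihood fit, which also folds in the instrument response and systematic uncertainties.

```python
import numpy as np

rng = np.random.default_rng(0)

# Synthetic spectrum: flux ~ E^-3.08 with a little multiplicative noise.
energies = np.logspace(np.log10(7.0), 3.0, 40)          # 7 GeV .. 1 TeV
flux = energies ** -3.08 * rng.lognormal(0.0, 0.05, energies.size)

# Straight-line fit in log-log space: log(flux) = -gamma * log(E) + const.
slope, intercept = np.polyfit(np.log10(energies), np.log10(flux), 1)
print(f"fitted spectral index: {-slope:.2f}")           # expect ~3.08
```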
Zhang, Peng; Lau, Y. Y.
2016-01-01
Laser-driven ultrafast electron emission offers the possibility of manipulation and control of coherent electron motion in ultrashort spatiotemporal scales. Here, an analytical solution is constructed for the highly nonlinear electron emission from a dc biased metal surface illuminated by a single frequency laser, by solving the time-dependent Schrödinger equation exactly. The solution is valid for arbitrary combinations of dc electric field, laser electric field, laser frequency, metal work function and Fermi level. Various emission mechanisms, such as multiphoton absorption or emission, optical or dc field emission, are all included in this single formulation. The transition between different emission processes is analyzed in detail. The time-dependent emission current reveals that intense current modulation may be possible even with a low intensity laser, by merely increasing the applied dc bias. The results provide insights into the electron pulse generation and manipulation for many novel applications based on ultrafast laser-induced electron emission. PMID:26818710