Science.gov

Sample records for computer criticality assessments

  1. Computer-Assisted Instruction Research: A Critical Assessment.

    ERIC Educational Resources Information Center

    Colorado, Rafael J.

    1988-01-01

    Reviews, analyzes, and critically assesses a sample of articles reporting research findings related to the instructional effectiveness of computer-assisted instruction (CAI). Problems in CAI research are described, including validity of the experimental research design, and current trends in CAI research are discussed. (31 references) (LRW)

  2. Making Student Thinking Visible through a Concept Map in Computer-Based Assessment of Critical Thinking

    ERIC Educational Resources Information Center

    Rosen, Yigal; Tager, Maryam

    2014-01-01

    Major educational initiatives in the world place great emphasis on fostering rich computer-based environments of assessment that make student thinking and reasoning visible. Using thinking tools engages students in a variety of critical and complex thinking, such as evaluating, analyzing, and decision making. The aim of this study was to explore…

  4. Content Analysis in Computer-Mediated Communication: Analyzing Models for Assessing Critical Thinking through the Lens of Social Constructivism

    ERIC Educational Resources Information Center

    Buraphadeja, Vasa; Dawson, Kara

    2008-01-01

This article reviews content analysis studies aimed at assessing critical thinking in computer-mediated communication. It also discusses theories and content analysis models that encourage critical thinking skills in asynchronous learning environments and reviews theories and factors that may foster critical thinking skills and new knowledge…

  6. Critical assessment of nucleic acid electrostatics via experimental and computational investigation of an unfolded state ensemble

    PubMed Central

    Bai, Yu; Chu, Vincent B.; Lipfert, Jan; Pande, Vijay S.; Herschlag, Daniel; Doniach, Sebastian

    2010-01-01

    Electrostatic forces, acting between helices and modulated by the presence of the ion atmosphere, are key determinants in the energetic balance that governs RNA folding. Previous studies have employed Poisson-Boltzmann (PB) theory to compute the energetic contribution of these forces in RNA folding. However, the complex interaction of these electrostatic forces with RNA features such as tertiary contact formation, specific ion-binding, and complex interhelical junctions present in prior studies precluded a rigorous evaluation of PB theory, especially in physiologically important Mg2+ solutions. To critically assess PB theory, we developed a model system that isolates these electrostatic forces. The model system, composed of two DNA duplexes tethered by a polyethylene glycol junction, is an analog for the unfolded state of canonical helix-junction-helix motifs found in virtually all structured RNAs. This model system lacks the complicating features that have precluded a critical assessment of PB in prior studies, ensuring that interhelical electrostatic forces dominate the behavior of the system. The system’s simplicity allows PB predictions to be directly compared with small angle x-ray scattering experiments over a range of monovalent and divalent ion concentrations. These comparisons indicate that PB is a reasonable description of the underlying electrostatic energies for monovalent ions, but large deviations are observed for divalent ions. The validation of PB for monovalent solutions allows analysis of the change in the conformational ensemble of this simple motif as salt concentration is changed. Addition of ions allows the motif to sample more compact microstates, increasing its conformational entropy. The increase of conformational entropy presents an additional barrier to folding by stabilizing the unfolded state. Neglecting this effect will adversely impact the accuracy of folding analyses and models. PMID:18722445
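For context, the Poisson-Boltzmann (PB) theory assessed in this record solves a mean-field equation for the electrostatic potential around the nucleic acid. The standard textbook form is sketched below (symbols as conventionally defined, not reproduced from the paper itself):

```latex
% Nonlinear Poisson-Boltzmann equation (standard mean-field form):
%   \phi = electrostatic potential, \varepsilon = dielectric permittivity,
%   \rho_{\mathrm{fixed}} = fixed charge density of the nucleic acid,
%   c_i^{\infty}, z_i = bulk concentration and valence of ion species i.
\nabla \cdot \left[ \varepsilon(\mathbf{r}) \, \nabla \phi(\mathbf{r}) \right]
  = -\rho_{\mathrm{fixed}}(\mathbf{r})
    - \sum_i z_i e \, c_i^{\infty}
      \exp\!\left( - \frac{z_i e \, \phi(\mathbf{r})}{k_B T} \right)
```

The deviations for divalent ions reported in the abstract are commonly attributed to ion-ion correlations, which this mean-field treatment neglects.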

  7. Conversion of Input Data between KENO and MCNP File Formats for Computer Criticality Assessments

    SciTech Connect

Schwarz, Randolph A.; Carter, Leland L.; Schwarz, Alysia L.

    2006-11-30

KENO is a Monte Carlo criticality code that is maintained by Oak Ridge National Laboratory (ORNL). KENO is included in the SCALE (Standardized Computer Analysis for Licensing Evaluation) package. KENO is often used because it was specifically designed for criticality calculations. Because KENO has convenient geometry input, including the treatment of lattice arrays of materials, it is frequently used for production calculations. Monte Carlo N-Particle (MCNP) is a Monte Carlo transport code maintained by Los Alamos National Laboratory (LANL). MCNP has a powerful 3D geometry package and an extensive cross section database. It is a general-purpose code and may be used for calculations involving shielding or medical facilities, for example, but can also be used for criticality calculations. MCNP is becoming increasingly popular for performing production criticality calculations. Both codes have their own specific advantages. After a criticality calculation has been performed with one of the codes, it is often desirable (or may be a safety requirement) to repeat the calculation with the other code to compare the important parameters using a different geometry treatment and cross section database. This manual conversion of input files between the two codes is labor intensive. The industry needs the capability to convert geometry models between MCNP and KENO without a large investment in manpower. The proposed conversion package will aid the user in converting between the codes. It is not intended to be used as a “black box”. The resulting input file will need to be carefully inspected by criticality safety personnel to verify that the intent of the calculation is preserved in the conversion. The purpose of this package is to help the criticality specialist in the conversion process by converting the geometry, materials, and pertinent data cards.
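As a toy illustration of the kind of translation such a package performs, the sketch below converts one deliberately simplified MCNP surface card (an origin-centered sphere, "j so R") into a KENO V.a-style sphere record. The card formats are simplified, and the function name and the placeholder mixture/bias-ID values are hypothetical; none of this is taken from the actual ORNL/LANL conversion package:

```python
def mcnp_sphere_to_keno(card: str) -> str:
    """Convert a simplified MCNP 'so' (sphere at origin) surface card,
    e.g. '1 so 5.0', into a KENO V.a-style sphere geometry record.

    Raises ValueError for any card that is not an origin-centered sphere.
    """
    fields = card.split()
    if len(fields) != 3 or fields[1].lower() != "so":
        raise ValueError(f"not an origin-centered sphere card: {card!r}")
    radius = float(fields[2])
    # KENO V.a-style sphere record: keyword, mixture, bias ID, radius.
    # The mixture and bias ID (here '1 1') are placeholders that the
    # criticality specialist must assign by hand after conversion.
    return f"sphere 1 1 {radius:.4f}"
```

Even in this trivial case the output must be inspected: material assignments and unit structure do not carry over mechanically, which is exactly why the abstract warns against treating conversion as a black box.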

  8. Handheld computers in critical care

    PubMed Central

    Lapinsky, Stephen E; Weshler, Jason; Mehta, Sangeeta; Varkul, Mark; Hallett, Dave; Stewart, Thomas E

    2001-01-01

    Background Computing technology has the potential to improve health care management but is often underutilized. Handheld computers are versatile and relatively inexpensive, bringing the benefits of computers to the bedside. We evaluated the role of this technology for managing patient data and accessing medical reference information, in an academic intensive-care unit (ICU). Methods Palm III series handheld devices were given to the ICU team, each installed with medical reference information, schedules, and contact numbers. Users underwent a 1-hour training session introducing the hardware and software. Various patient data management applications were assessed during the study period. Qualitative assessment of the benefits, drawbacks, and suggestions was performed by an independent company, using focus groups. An objective comparison between a paper and electronic handheld textbook was achieved using clinical scenario tests. Results During the 6-month study period, the 20 physicians and 6 paramedical staff who used the handheld devices found them convenient and functional but suggested more comprehensive training and improved search facilities. Comparison of the handheld computer with the conventional paper text revealed equivalence. Access to computerized patient information improved communication, particularly with regard to long-stay patients, but changes to the software and the process were suggested. Conclusions The introduction of this technology was well received despite differences in users' familiarity with the devices. Handheld computers have potential in the ICU, but systems need to be developed specifically for the critical-care environment. PMID:11511337

  9. Computer-Based Assessment in Safety-Critical Industries: The Case of Shipping

    ERIC Educational Resources Information Center

    Gekara, Victor Oyaro; Bloor, Michael; Sampson, Helen

    2011-01-01

    Vocational education and training (VET) concerns the cultivation and development of specific skills and competencies, in addition to broad underpinning knowledge relating to paid employment. VET assessment is, therefore, designed to determine the extent to which a trainee has effectively acquired the knowledge, skills, and competencies required by…

  11. Assessing Critical Thinking.

    ERIC Educational Resources Information Center

    Cromwell, Lucy S.

    1992-01-01

    Offers guidelines for the assessment of critical thinking ability, suggesting that assessment be integral to learning, involve a range of behaviors, emphasize expected course, program, or institutional outcomes, incorporate structured feedback and an external dimension, and be cumulative. Lists steps in developing an assessment plan. (DMM)

  12. AVLIS Criticality risk assessment

    SciTech Connect

    Brereton, S.J., LLNL

    1998-04-29

Product and tails material (uranium enriched in U-235 and uranium depleted in U-235) are cooled and accumulated in solid metallic form in canisters. The collected product and tails material is weighed and transferred into certified, criticality-safe shipping containers (DOT specification 6M with 2R containment vessel). These will be temporarily stored and then shipped offsite, either for use by a fuel fabricator or for disposal. Tails material will be packaged for disposal. A criticality risk assessment was performed for AVLIS IPD runs. In this analysis, the likelihood of occurrence of a criticality was examined. For the AVLIS process, a number of areas were specifically examined to assess whether or not the frequency of occurrence of a criticality is credible (frequency of occurrence > 10^-6/yr). In this paper, we discuss only two of these areas: the separator and canister operations.

  13. Carahunge - A Critical Assessment

    NASA Astrophysics Data System (ADS)

    González-García, A. César

Carahunge is a megalithic monument in southern Armenia that has often been acclaimed as the oldest observatory. The monument, composed of dozens of standing stones, includes some perforated stones. The directions of the holes have been measured and their orientations related to the sun, moon, and stars, yielding a proposed construction date for these devices. After a critical review of the methods and conclusions, these claims are shown to be untenable.

  14. Critical care procedure logging using handheld computers.

    PubMed

    Martinez-Motta, J Carlos; Walker, Robin; Stewart, Thomas E; Granton, John; Abrahamson, Simon; Lapinsky, Stephen E

    2004-10-01

    We conducted this study to evaluate the feasibility of implementing an internet-linked handheld computer procedure logging system in a critical care training program. Subspecialty trainees in the Interdepartmental Division of Critical Care at the University of Toronto received and were trained in the use of Palm handheld computers loaded with a customized program for logging critical care procedures. The procedures were entered into the handheld device using checkboxes and drop-down lists, and data were uploaded to a central database via the internet. To evaluate the feasibility of this system, we tracked the utilization of this data collection system. Benefits and disadvantages were assessed through surveys. All 11 trainees successfully uploaded data to the central database, but only six (55%) continued to upload data on a regular basis. The most common reason cited for not using the system pertained to initial technical problems with data uploading. From 1 July 2002 to 30 June 2003, a total of 914 procedures were logged. Significant variability was noted in the number of procedures logged by individual trainees (range 13-242). The database generated by regular users provided potentially useful information to the training program director regarding the scope and location of procedural training among the different rotations and hospitals. A handheld computer procedure logging system can be effectively used in a critical care training program. However, user acceptance was not uniform, and continued training and support are required to increase user acceptance. Such a procedure database may provide valuable information that may be used to optimize trainees' educational experience and to document clinical training experience for licensing and accreditation.

  15. Critical care procedure logging using handheld computers

    PubMed Central

    Carlos Martinez-Motta, J; Walker, Robin; Stewart, Thomas E; Granton, John; Abrahamson, Simon; Lapinsky, Stephen E

    2004-01-01

    Introduction We conducted this study to evaluate the feasibility of implementing an internet-linked handheld computer procedure logging system in a critical care training program. Methods Subspecialty trainees in the Interdepartmental Division of Critical Care at the University of Toronto received and were trained in the use of Palm handheld computers loaded with a customized program for logging critical care procedures. The procedures were entered into the handheld device using checkboxes and drop-down lists, and data were uploaded to a central database via the internet. To evaluate the feasibility of this system, we tracked the utilization of this data collection system. Benefits and disadvantages were assessed through surveys. Results All 11 trainees successfully uploaded data to the central database, but only six (55%) continued to upload data on a regular basis. The most common reason cited for not using the system pertained to initial technical problems with data uploading. From 1 July 2002 to 30 June 2003, a total of 914 procedures were logged. Significant variability was noted in the number of procedures logged by individual trainees (range 13–242). The database generated by regular users provided potentially useful information to the training program director regarding the scope and location of procedural training among the different rotations and hospitals. Conclusion A handheld computer procedure logging system can be effectively used in a critical care training program. However, user acceptance was not uniform, and continued training and support are required to increase user acceptance. Such a procedure database may provide valuable information that may be used to optimize trainees' educational experience and to document clinical training experience for licensing and accreditation. PMID:15469577

  16. Computer Resources Handbook for Flight Critical Systems.

    DTIC Science & Technology

    1985-01-01

…in avionic systems are suspected of being due to software. In a study of software reliability for digital flight controls conducted by SoHaR for the… …aircraft and flight crew -- the use of computers in flight critical applications. Special reliability and fault tolerance (RAFT) techniques are being used… …tolerance in flight critical systems. Conventional reliability techniques and analysis and reliability improvement techniques at the system level are…

  17. Assessing the Criticality of Metals

    NASA Astrophysics Data System (ADS)

    Graedel, T. E.; Harper, E. M.; Nassar, N.

Today's technology employs virtually the entire periodic table. The stocks and flows of the major metals, essentially unknown a decade ago, are now reasonably well quantified. Those cycles can be used to generate an overview of societal metal use. A key issue is whether scarcity implies long-term shortages or unavailability. To address this issue, a detailed methodology for generating a reliable assessment of the criticality of metals has been completed, making extensive use of peer-reviewed datasets and analytical approaches from the fields of geology, international trade, political science, and international policy, among others. This criticality evaluation has three components: Supply Risk, Environmental Implications, and Vulnerability to Supply Restriction, each of which is itself a composite of several metrics.

  18. NASA Critical Facilities Maintenance Assessment

    NASA Technical Reports Server (NTRS)

    Oberhettinger, David J.

    2006-01-01

Critical Facilities Maintenance Assessment (CFMA) was first implemented by NASA following the March 2000 overtest of the High Energy Solar Spectroscopic Imager (HESSI) spacecraft. A sine burst dynamic test using a 40-year-old shaker failed. Mechanical binding/slippage of the slip table imparted 10 times the planned force to the test article. There was major structural damage to HESSI. The mechanical "health" of the shaker had not been assessed and tracked to assure the test equipment was in good working order. Similar incidents have occurred at NASA facilities due to inadequate maintenance (e.g., rainwater from a leaky roof contaminated an assembly facility that housed a spacecraft). The HESSI incident alerted NASA to the urgent need to identify inadequacies in ground facility readiness and maintenance practices. The consequences of failures of ground facilities that service these NASA systems are severe due to the high unit value of NASA products.

  20. Mission critical cloud computing in a week

    NASA Astrophysics Data System (ADS)

    George, B.; Shams, K.; Knight, D.; Kinney, J.

NASA's vision is to “reach for new heights and reveal the unknown so that what we do and learn will benefit all humankind.” While our missions provide large volumes of unique and invaluable data to the scientific community, they also serve to inspire and educate the next generation of engineers and scientists. One critical aspect of “benefiting all humankind” is to make our missions as visible and accessible as possible to facilitate the transfer of scientific knowledge to the public. The recent successful landing of the Curiosity rover on Mars exemplified this vision: we shared the landing event via live video streaming and web experiences with millions of people around the world. The video stream on Curiosity's website was delivered by a highly scalable stack of computing resources in the cloud to cache and distribute the video stream to our viewers. While this work was done in the context of public outreach, it has extensive implications for the development of mission critical, highly available, and elastic applications in the cloud for a diverse set of use cases across NASA.

  1. Software Safety Risk in Legacy Safety-Critical Computer Systems

    NASA Technical Reports Server (NTRS)

    Hill, Janice; Baggs, Rhoda

    2007-01-01

Safety-critical computer systems must be engineered to meet system and software safety requirements. For legacy safety-critical computer systems, software safety requirements may not have been formally specified during development. When process-oriented software safety requirements are levied on a legacy system after the fact, where software development artifacts don't exist or are incomplete, the question becomes 'how can this be done?' The risks associated with meeting only certain software safety requirements in a legacy safety-critical computer system must be addressed should such systems be selected as candidates for reuse. This paper proposes a method for formally deriving a software safety risk assessment that provides measurements of software safety for legacy systems, which may or may not have the suite of software engineering documentation now normally required. It relies upon the NASA Software Safety Standard, risk assessment methods based upon the Taxonomy-Based Questionnaire, and the application of reverse-engineering CASE tools to produce original design documents for legacy systems.

  2. Formative Assessment: A Critical Review

    ERIC Educational Resources Information Center

    Bennett, Randy Elliot

    2011-01-01

    This paper covers six interrelated issues in formative assessment (aka, "assessment for learning"). The issues concern the definition of formative assessment, the claims commonly made for its effectiveness, the limited attention given to domain considerations in its conceptualisation, the under-representation of measurement principles in…

  3. Climate Modeling Computing Needs Assessment

    NASA Astrophysics Data System (ADS)

    Petraska, K. E.; McCabe, J. D.

    2011-12-01

    This paper discusses early findings of an assessment of computing needs for NASA science, engineering and flight communities. The purpose of this assessment is to document a comprehensive set of computing needs that will allow us to better evaluate whether our computing assets are adequately structured to meet evolving demand. The early results are interesting, already pointing out improvements we can make today to get more out of the computing capacity we have, as well as potential game changing innovations for the future in how we apply information technology to science computing. Our objective is to learn how to leverage our resources in the best way possible to do more science for less money. Our approach in this assessment is threefold: Development of use case studies for science workflows; Creating a taxonomy and structure for describing science computing requirements; and characterizing agency computing, analysis, and visualization resources. As projects evolve, science data sets increase in a number of ways: in size, scope, timelines, complexity, and fidelity. Generating, processing, moving, and analyzing these data sets places distinct and discernable requirements on underlying computing, analysis, storage, and visualization systems. The initial focus group for this assessment is the Earth Science modeling community within NASA's Science Mission Directorate (SMD). As the assessment evolves, this focus will expand to other science communities across the agency. We will discuss our use cases, our framework for requirements and our characterizations, as well as our interview process, what we learned and how we plan to improve our materials after using them in the first round of interviews in the Earth Science Modeling community. We will describe our plans for how to expand this assessment, first into the Earth Science data analysis and remote sensing communities, and then throughout the full community of science, engineering and flight at NASA.

  4. Assessment of Critical-Analytic Thinking

    ERIC Educational Resources Information Center

    Brown, Nathaniel J.; Afflerbach, Peter P.; Croninger, Robert G.

    2014-01-01

    National policy and standards documents, including the National Assessment of Educational Progress frameworks, the "Common Core State Standards" and the "Next Generation Science Standards," assert the need to assess critical-analytic thinking (CAT) across subject areas. However, assessment of CAT poses several challenges for…

  6. Equivalent damage: A critical assessment

    NASA Technical Reports Server (NTRS)

    Laflen, J. R.; Cook, T. S.

    1982-01-01

    Concepts in equivalent damage were evaluated to determine their applicability to the life prediction of hot path components of aircraft gas turbine engines. Equivalent damage was defined as being those effects which influence the crack initiation life-time beyond the damage that is measured in uniaxial, fully-reversed sinusoidal and isothermal experiments at low homologous temperatures. Three areas of equivalent damage were examined: mean stress, cumulative damage, and multiaxiality. For each area, a literature survey was conducted to aid in selecting the most appropriate theories. Where possible, data correlations were also used in the evaluation process. A set of criteria was developed for ranking the theories in each equivalent damage regime. These criteria considered aspects of engine utilization as well as the theoretical basis and correlative ability of each theory. In addition, consideration was given to the complex nature of the loading cycle at fatigue critical locations of hot path components; this loading includes non-proportional multiaxial stressing, combined temperature and strain fluctuations, and general creep-fatigue interactions. Through applications of selected equivalent damage theories to some suitable data sets it was found that there is insufficient data to allow specific recommendations of preferred theories for general applications. A series of experiments and areas of further investigations were identified.

  7. Assessing Postgraduate Students' Critical Thinking Ability

    ERIC Educational Resources Information Center

    Javed, Muhammad; Nawaz, Muhammad Atif; Qurat-Ul-Ain, Ansa

    2015-01-01

This paper addresses the assessment of postgraduate students' critical thinking ability. The target population was male and female university students in Pakistan. A small sample of 45 male and 45 female students was selected randomly from The Islamia University of Bahawalpur, Pakistan. Cornell Critical Thinking Test Series, The…

  9. Food labels: a critical assessment.

    PubMed

    Temple, Norman J; Fraser, Joy

    2014-03-01

    Foods sold in packages have both front-of-package (FOP) labels and back-of-package (BOP) labels. The aim of this review is to determine the role they play in informing consumers as to the composition of foods in order to help select a healthy diet. Recent literature was evaluated and findings combined with assessments made by the authors of food labels used in the United States and Canada. Research shows that most consumers have difficulty understanding the information provided by both FOP and BOP food labels used in the United States and Canada. Research has evaluated the merits of alternative designs. FOP labels should be based on a clear and simple design. They should present information on key nutrients (total fat, saturated fat, sugar, and sodium or salt) and also energy value. They should have color and words that indicate "high," "medium," and "low" levels. Labels can also state quantity per serving. The traffic light system is the best example of this design. An extra traffic light indicating the overall health value of the food should be added. A clearer BOP label also is needed. Implementation of a new food labeling system will probably be opposed by the food industry. More research is needed into which food label designs are most effective, especially for persuading consumers to select healthier food. Both FOP and BOP food labels used in the United States and Canada need to be redesigned using a traffic light system. Copyright © 2014 Elsevier Inc. All rights reserved.

  10. Critical care computing. Past, present, and future.

    PubMed

    Seiver, A

    2000-10-01

    With rapidly increasing processing power, networks, and bandwidth, we have ever more powerful tools for ICU computing. The challenge is to use these tools to build on the work of the Innovators and Early Adopters, who pioneered the first three generations of systems, and extend computing to the Majority, who still rely on paper. What is needed is compelling evidence that these systems reduce cost and improve quality. The experience of other industries suggests that we need to address fundamental issues, such as clinical organization, roles, behavior, and incentives, before we will be able to prove the benefits of computing technology. When these preconditions are met, the promise of computing will be realized, perhaps with the upcoming fourth-generation systems. ICU computing can then finally cross the chasm and become the standard of care.

  11. Critical Problems in Very Large Scale Computer Systems

    DTIC Science & Technology

    1990-03-31

…CRITICAL PROBLEMS IN VERY LARGE SCALE COMPUTER SYSTEMS. Semiannual Technical Report for the Period October 1, 1989 to… …suitability for supporting popular models of parallel computation. During the reporting period they have developed an interface definition. A simulator has… …queries in computational geometry. Range queries are a fundamental problem in computational geometry with applications to computer graphics and…

  12. Assessment of critical thinking: a Delphi study.

    PubMed

    Paul, Sheila A

    2014-11-01

Nurse educators are responsible for preparing nurses who critically analyze patient information and provide meaningful interventions in today's complex health care system. Using the Delphi research method, this study drew on the specialized and experiential knowledge of Certified Nurse Educators. This original Delphi study asked Certified Nurse Educators how to assess the critical-thinking ability of nursing students in the clinical setting. The results showed that nurse educators need time, during the clinical experience, to accurately assess each individual nursing student. This study demonstrated the need for extended student clinical time and a variety of clinical learning assessment tools.

  13. To assess the reparative ability of differentiated mesenchymal stem cells in a rat critical size bone repair defect model using high frequency co-registered photoacoustic/ultrasound imaging and micro computed tomography

    NASA Astrophysics Data System (ADS)

    Zafar, Haroon; Gaynard, Sean; O'Flatharta, Cathal; Doroshenkova, Tatiana; Devine, Declan; Sharif, Faisal; Barry, Frank; Hayes, Jessica; Murphy, Mary; Leahy, Martin J.

    2016-03-01

Stem cell based treatments hold great potential and promise to address many unmet clinical needs. Non-invasive imaging techniques for monitoring transplanted stem cells qualitatively and quantitatively are therefore crucial. The objective of this study was to create a critical size bone defect in the rat femur and then assess the ability of differentiated mesenchymal stem cells (MSCs) to repair the defect using high frequency co-registered photoacoustic (PA)/ultrasound (US) imaging and micro computed tomography (μCT) over an 8 week period. Combined PA and US imaging was performed using a 256-element, 21 MHz linear-array transducer combined with a multichannel collecting system. In vivo 3D PA and US images of the defect in the rat femur were acquired 4 and 8 weeks after surgery. Co-registered 3D structural images, such as microvasculature, and functional images, such as total haemoglobin concentration (HbT) and haemoglobin oxygen saturation (sO2), were obtained using PA and US imaging. Bone formation was assessed 4 and 8 weeks after surgery by μCT. High frequency linear-array based co-registered PA/US imaging was found promising in terms of non-invasiveness, sensitivity, adaptability, and high spatial and temporal resolution at sufficient depths for assessing the reparative ability of MSCs in a rat critical size bone repair defect model.
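The functional readouts named in this abstract (HbT and sO2) are conventionally derived by spectrally unmixing photoacoustic absorption measured at two wavelengths. A minimal sketch of that calculation follows; the extinction-coefficient values and the `unmix_hemoglobin` helper are illustrative assumptions, not taken from the study.

```python
import numpy as np

# Molar extinction coefficients [cm^-1 / M] for deoxy- (Hb) and
# oxy-haemoglobin (HbO2) at two wavelengths. These numbers are
# illustrative placeholders, not calibrated literature values.
EPS = np.array([[1405.0,  518.0],   # 750 nm: [Hb, HbO2]
                [ 691.0, 1058.0]])  # 850 nm: [Hb, HbO2]

def unmix_hemoglobin(mu_a_750, mu_a_850):
    """Solve EPS @ [Hb, HbO2] = mu_a for the two chromophore
    concentrations, then derive HbT and sO2."""
    hb, hbo2 = np.linalg.solve(EPS, np.array([mu_a_750, mu_a_850]))
    hbt = hb + hbo2          # total haemoglobin concentration
    so2 = hbo2 / hbt         # haemoglobin oxygen saturation
    return hbt, so2
```

Given absorption at the two wavelengths, the 2x2 linear solve recovers the two concentrations directly; real pipelines add fluence correction and more wavelengths.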

  14. Reliability of assessment of critical thinking.

    PubMed

    Allen, George D; Rubenfeld, M Gaie; Scheffer, Barbara K

    2004-01-01

    Although clinical critical thinking skills and behaviors are among the most highly sought characteristics of BSN graduates, they remain among the most difficult to teach and assess. Three reasons for this difficulty have been (1) lack of agreement among nurse educators as to the definition of critical thinking, (2) low correlation between clinical critical thinking and existing standardized tests of critical thinking, and (3) poor reliability in scoring other evidences of critical thinking, such as essays. This article first describes a procedure for teaching critical thinking that is based on a consensus definition of 17 dimensions of critical thinking in clinical nursing practice. This procedure is easily taught to nurse educators and can be flexibly and inexpensively incorporated into any undergraduate nursing curriculum. We then show that students' understanding and use of these dimensions can be assessed with high reliability (coefficient alpha between 0.7 and 0.8) and with great time efficiency for both teachers and students. By using this procedure iteratively across semesters, students can develop portfolios demonstrating attainment of competence in clinical critical thinking, and educators can obtain important summary evaluations of the degree to which their graduates have succeeded in this important area of their education.
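A reliability figure such as the reported coefficient alpha (0.7 to 0.8) is conventionally computed as Cronbach's alpha over a respondents-by-items score matrix. The sketch below is a generic implementation of that formula, not code from the article.

```python
import numpy as np

def cronbach_alpha(scores):
    """Cronbach's alpha for an (n_respondents x k_items) score matrix:
    alpha = k/(k-1) * (1 - sum of item variances / variance of totals)."""
    scores = np.asarray(scores, dtype=float)
    k = scores.shape[1]
    item_vars = scores.var(axis=0, ddof=1)       # per-item sample variance
    total_var = scores.sum(axis=1).var(ddof=1)   # variance of summed scores
    return k / (k - 1) * (1 - item_vars.sum() / total_var)
```

Perfectly correlated items yield alpha = 1; the 0.7 to 0.8 range reported above indicates strong but imperfect internal consistency across the 17 scored dimensions.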

  15. Recent Use of Covariance Data for Criticality Safety Assessment

    SciTech Connect

    Rearden, Bradley T; Mueller, Don

    2008-01-01

    The TSUNAMI codes of the Oak Ridge National Laboratory SCALE code system were applied to a burnup credit application to demonstrate the use of sensitivity and uncertainty analysis with recent cross section covariance data for criticality safety code and data validation. The use of sensitivity and uncertainty analysis provides for the assessment of a defensible computational bias, bias uncertainty, and gap analysis for a complex system that otherwise could be assessed only through the use of expert judgment and conservative assumptions.

  16. Recent Use of Covariance Data for Criticality Safety Assessment

    NASA Astrophysics Data System (ADS)

    Rearden, B. T.; Mueller, D. E.

    2008-12-01

    The TSUNAMI codes of the Oak Ridge National Laboratory SCALE code system were applied to a burnup credit application to demonstrate the use of sensitivity and uncertainty analysis with recent cross section covariance data for criticality safety code and data validation. The use of sensitivity and uncertainty analysis provides for the assessment of a defensible computational bias, bias uncertainty, and gap analysis for a complex system that otherwise could be assessed only through the use of expert judgment and conservative assumptions.
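The computational bias and bias uncertainty discussed in these two records are, at their core, statistics over calculated-minus-experimental k-eff differences across a benchmark suite. The toy sketch below uses invented benchmark values and an illustrative administrative margin; a real TSUNAMI/SCALE validation involves far more machinery (trending, similarity weighting, tolerance limits).

```python
import statistics

# Invented benchmark results: calculated vs. measured k-eff for five
# critical experiments (a real validation suite uses many more).
k_calc = [0.9962, 1.0018, 0.9985, 1.0003, 0.9991]
k_exp  = [1.0000, 1.0000, 1.0000, 1.0000, 1.0000]

diffs = [c - e for c, e in zip(k_calc, k_exp)]
bias = statistics.mean(diffs)       # average (calculated - experimental)
bias_unc = statistics.stdev(diffs)  # scatter of the differences

# Example upper subcritical limit: correct for bias, then subtract two
# standard deviations and an illustrative 0.05 administrative margin.
usl = 1.0 + bias - 2.0 * bias_unc - 0.05
```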

  17. Mission Critical Computer Resources Management Guide

    DTIC Science & Technology

    1988-09-01

Excerpt from the guide's table of contents: Chapter 6, Software Test and Evaluation (6.1 Test Planning; 6.1.1 System Support Computer Resources; ...); Chapter 8, Planning for Computer Software (8.1 Introduction; 8.2 Plans and Documentation; 8.2.1 Program Management Plan (PMP); 8.2.2 Test and Evaluation Master Plan (TEMP); 8.2.3 ...)

  18. Risk-Assessment Computer Program

    NASA Technical Reports Server (NTRS)

    Dias, William C.; Mittman, David S.

    1993-01-01

RISK D/C is a prototype computer program that assists in program risk modeling for Space Exploration Initiative (SEI) architectures proposed in the Synthesis Group Report. Risk assessment is performed with respect to risk events, probabilities, and severities of potential results. The program enables ranking, with respect to effectiveness, of risk-mitigation strategies proposed for an exploration program architecture, and it allows for the fact that risk assessment in the early phases of planning is subjective. Although specific to SEI in its present form, it can also be used as a software framework for developing risk-assessment programs for other specific uses. Developed for the Macintosh(TM) series of computers, it requires HyperCard(TM) 2.0 or later, 2 MB of random-access memory, and System 6.0.8 or later.
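The ranking of mitigation strategies described above reduces to expected-risk bookkeeping: score each event as probability times severity, then rank strategies by how much expected risk they remove. The event names, numbers, and `rank_mitigations` helper below are hypothetical illustrations, not the RISK D/C implementation.

```python
# Hypothetical risk events: (name, probability, severity on a 1-10 scale).
events = [
    ("launch delay", 0.30, 4),
    ("life support failure", 0.02, 10),
    ("cost overrun", 0.50, 3),
]

def expected_risk(evts):
    """Sum of probability * severity over all risk events."""
    return sum(p * s for _, p, s in evts)

def rank_mitigations(evts, mitigations):
    """Rank strategies by expected risk removed; each mitigation maps
    an event name to its (new probability, new severity)."""
    base = expected_risk(evts)
    scored = []
    for name, changes in mitigations.items():
        adjusted = [(n, *changes.get(n, (p, s))) for n, p, s in evts]
        scored.append((name, base - expected_risk(adjusted)))
    return sorted(scored, key=lambda t: t[1], reverse=True)
```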

  19. Assessing individual frontline nurse critical thinking.

    PubMed

    Berkow, Steven; Virkstis, Katherine; Stewart, Jennifer; Aronson, Sarah; Donohue, Meghan

    2011-04-01

    Enhancing the critical thinking skills of frontline nurses has been a longstanding concern for hospital nursing leaders, but increased complexity of care now lends renewed urgency to this challenge. Rising patient acuity and decreasing length of stay contribute to an environment that challenges even tenured nurses long recognized as strong critical thinkers. To ensure safe patient care in a fast-paced care environment, nurse leaders must invest in more individualized development of key critical-thinking competencies. As part of a broader research initiative on elevating frontline critical thinking, the Nursing Executive Center has developed a diagnostic tool for assessing individual performance on 25 critical thinking skills. The authors discuss the tool's development, methodology, and potential applications.

  20. Critical eigenvalue in LMFBRs: a physics assessment

    SciTech Connect

    McKnight, R.D.; Collins, P.J.; Olsen, D.N.

    1984-01-01

    This paper summarizes recent work to put the analysis of past critical eigenvalue measurements from the US critical experiments program on a consistent basis. The integral data base includes 53 configurations built in 11 ZPPR assemblies which simulate mixed oxide LMFBRs. Both conventional and heterogeneous designs representing 350, 700, and 900 MWe sizes and with and without simulated control rods and/or control rod positions have been studied. The review of the integral data base includes quantitative assessment of experimental uncertainties in the measured excess reactivity. Analyses have been done with design level and higher-order methods using ENDF/B-IV data. Comparisons of these analyses with the experiments are used to generate recommended bias factors for criticality predictions. Recommended methods for analysis of LMFBR fast critical assemblies and LMFBR design calculations are presented. Unresolved issues and areas which require additional experimental or analytical study are identified.

  1. A Synthesis and Survey of Critical Success Factors for Computer Technology Projects

    ERIC Educational Resources Information Center

    Baker, Ross A.

    2012-01-01

    The author investigated the existence of critical success factors for computer technology projects. Current research literature and a survey of experienced project managers indicate that there are 23 critical success factors (CSFs) that correlate with project success. The survey gathered an assessment of project success and the degree to which…

  3. Critical Reflection on Cultural Difference in the Computer Conference

    ERIC Educational Resources Information Center

    Ziegahn, Linda

    2005-01-01

    Adult educators have a strong interest in designing courses that stimulate learning toward critical, more inclusive cultural perspectives. Critical reflection is a key component of both intercultural learning and a growing medium of instruction, the asynchronous computer conference (CC). This study combined qualitative methodology with a framework…

  4. Mission Critical Computer Resources Management Guide.

    DTIC Science & Technology

    1990-01-01

Excerpt from the guide's table of contents: 2.1 Historical Perspective; 3.5.2 Assembly Language; 3.5.3 High Order Language; 3.5.4 Application Generators (4GL); 4.10 Software Support; 4.11 Top Level Service Directives and Guidelines; 5.4 Software Development; 5.4.1 Software Requirements Analysis; 6.4.3 Integration Testing; 6.4.3.1 Hot Bench Testing; 7.7 Improving the PDSS Process; 7.8 Management Guidance

  5. Problem Solving and Critical Thinking for Computer Science Educators.

    ERIC Educational Resources Information Center

    Norris, Cathleen A., Ed.; Poirot, James L., Ed.

    The eight papers presented in this monograph are a result of the Problem Solving and Critical Thinking Research Workshop that was held in conjunction with the 1990 National Educational Computing Conference (NECC). The intent of the workshop was to provide a unique forum for researchers to share ideas in a special area of educational computing. The…

  6. Radiation exposure and risk assessment for critical female body organs

    NASA Technical Reports Server (NTRS)

    Atwell, William; Weyland, Mark D.; Hardy, Alva C.

    1991-01-01

    Space radiation exposure limits for astronauts are based on recommendations of the National Council on Radiation Protection and Measurements. These limits now include the age at exposure and sex of the astronaut. A recently-developed computerized anatomical female (CAF) model is discussed in detail. Computer-generated, cross-sectional data are presented to illustrate the completeness of the CAF model. By applying ray-tracing techniques, shield distribution functions have been computed to calculate absorbed dose and dose equivalent values for a variety of critical body organs (e.g., breasts, lungs, thyroid gland, etc.) and mission scenarios. Specific risk assessments, i.e., cancer induction and mortality, are reviewed.

  8. DOE/EM Criticality Safety Needs Assessment

    SciTech Connect

    Westfall, Robert Michael; Hopper, Calvin Mitchell

    2011-02-01

    The issue of nuclear criticality safety (NCS) in Department of Energy Environmental Management (DOE/EM) fissionable material operations presents challenges because of the large quantities of material present in the facilities and equipment that are committed to storage and/or material conditioning and dispositioning processes. Given the uncertainty associated with the material and conditions for many DOE/EM fissionable material operations, ensuring safety while maintaining operational efficiency requires the application of the most-effective criticality safety practices. In turn, more-efficient implementation of these practices can be achieved if the best NCS technologies are utilized. In 2002, DOE/EM-1 commissioned a survey of criticality safety technical needs at the major EM sites. These needs were documented in the report Analysis of Nuclear Criticality Safety Technology Supporting the Environmental Management Program, issued May 2002. Subsequent to this study, EM safety management personnel made a commitment to applying the best and latest criticality safety technology, as described by the DOE Nuclear Criticality Safety Program (NCSP). Over the past 7 years, this commitment has enabled the transfer of several new technologies to EM operations. In 2008, it was decided to broaden the basis of the EM NCS needs assessment to include not only current needs for technologies but also NCS operational areas with potential for improvements in controls, analysis, and regulations. A series of NCS workshops has been conducted over the past years, and needs have been identified and addressed by EM staff and contractor personnel. These workshops were organized and conducted by the EM Criticality Safety Program Manager with administrative and technical support by staff at Oak Ridge National Laboratory (ORNL). This report records the progress made in identifying the needs, determining the approaches for addressing these needs, and assimilating new NCS technologies into EM

  9. CRITICAL ISSUES IN HIGH END COMPUTING - FINAL REPORT

    SciTech Connect

    Corones, James

    2013-09-23

High-end computing (HEC) has been a driver for advances in science and engineering for the past four decades. Increasingly, HEC has become a significant element in the national security, economic vitality, and competitiveness of the United States. Advances in HEC provide results that cut across traditional disciplinary and organizational boundaries. This program provides opportunities to share information about HEC systems and computational techniques across multiple disciplines and organizations through conferences and exhibitions of HEC advances held in Washington, DC, so that mission agency staff, scientists, and industry can come together with White House, Congressional, and Legislative staff in an environment conducive to the sharing of technical information, accomplishments, goals, and plans. A common thread across this series of conferences is the understanding of computational science and applied mathematics techniques across a diverse set of application areas of interest to the Nation. The specific objectives of this program are: (1) to provide opportunities to share information about advances in high-end computing systems and computational techniques between mission critical agencies, agency laboratories, academics, and industry; (2) to gather pertinent data and address specific topics of wide interest to mission critical agencies; (3) to promote a continuing discussion of critical issues in high-end computing; and (4) to provide a venue where a multidisciplinary scientific audience can discuss the difficulties of applying computational science techniques to specific problems and can specify future research that, if successful, will eliminate these problems.

  10. Microdosing: a critical assessment of human data.

    PubMed

    Rowland, Malcolm

    2012-11-01

    Ultrasensitive analytical methodologies have now made possible the ability to characterize the pharmacokinetics (PK) of compounds following administration to humans of a minute, subpharmacologic dose, a microdose. This has the potential to provide pre-IND information to help in early candidate selection, but only if such information is reasonably predictive of PK at pharmacologic doses. The published clinical data in this area are critically assessed and perspectives drawn. The place of microdosing, alone and coupled with other innovative methodologies, both pre-IND and during clinical development, is considered as a way forward to improve the efficiency and informativeness of drug development.

  11. Computer-Assisted Assessment: Highlights and Challenges

    ERIC Educational Resources Information Center

    Tahmasebi, Soheila; Rahimi, Ali

    2013-01-01

Due to the unquestionable role of technology in language classes, it may be necessary to use computers in assessing language knowledge. This study aimed to examine how computers may be used to assess the language ability of English for Specific Purposes (ESP) students. Sixty computer-major university students at Abadan University are the…

  12. Assessing Terrorist Motivations for Attacking Critical Infrastructure

    SciTech Connect

    Ackerman, G; Abhayaratne, P; Bale, J; Bhattacharjee, A; Blair, C; Hansell, L; Jayne, A; Kosal, M; Lucas, S; Moran, K; Seroki, L; Vadlamudi, S

    2006-12-04

Certain types of infrastructure--critical infrastructure (CI)--play vital roles in underpinning our economy, security and way of life. These complex and often interconnected systems have become so ubiquitous and essential to day-to-day life that they are easily taken for granted. Often it is only when the important services provided by such infrastructure are interrupted--when we lose easy access to electricity, health care, telecommunications, transportation or water, for example--that we are conscious of our great dependence on these networks and of the vulnerabilities that stem from such dependence. Unfortunately, it must be assumed that many terrorists are all too aware that CI facilities pose high-value targets that, if successfully attacked, have the potential to dramatically disrupt the normal rhythm of society, cause public fear and intimidation, and generate significant publicity. Indeed, revelations emerging at the time of this writing about Al Qaida's efforts to prepare for possible attacks on major financial facilities in New York, New Jersey, and the District of Columbia remind us just how real and immediate such threats to CI may be. Simply being aware that our nation's critical infrastructure presents terrorists with a plethora of targets, however, does little to mitigate the dangers of CI attacks. In order to prevent and preempt such terrorist acts, better understanding of the threats and vulnerabilities relating to critical infrastructure is required. The Center for Nonproliferation Studies (CNS) presents this document as both a contribution to the understanding of such threats and an initial effort at "operationalizing" its findings for use by analysts who work on issues of critical infrastructure protection. Specifically, this study focuses on a subsidiary aspect of CI threat assessment that has thus far remained largely unaddressed by contemporary terrorism research: the motivations and related factors that determine whether a terrorist

  13. Nutritional Assessment in Critically Ill Patients

    PubMed Central

    Hejazi, Najmeh; Mazloom, Zohreh; Zand, Farid; Rezaianzadeh, Abbas; Amini, Afshin

    2016-01-01

Background: Malnutrition is an important factor in the survival of critically ill patients. The purpose of the present study was to assess the nutritional status of patients in the intensive care unit (ICU) on the days of admission and discharge via a detailed nutritional assessment. Methods: In total, 125 patients were followed up from admission to discharge at 8 ICUs in Shiraz, Iran. The patients’ nutritional status was assessed using subjective global assessment (SGA), anthropometric measurements, biochemical indices, and body composition indicators. Diet prescription and intake were also evaluated. Results: Malnutrition prevalence significantly increased on the day of discharge (58.62%) compared to the day of admission (28.8%) according to SGA (P<0.001). The patients’ weight, mid-upper-arm circumference, mid-arm muscle circumference, triceps skinfold thickness, and calf circumference decreased significantly as well (P<0.001). Lean mass weight and body cell mass also decreased significantly (P<0.001). Biochemical indices showed no notable changes except for magnesium, which decreased significantly (P=0.013). A significant negative correlation was observed between malnutrition on discharge day and anthropometric measurements. Significant positive correlations were observed between malnutrition on discharge day and the number of days without enteral feeding, the delay from ICU admission to the commencement of enteral feeding, and the length of ICU stay. Energy and protein intakes were significantly less than the prescribed diet (26.26% and 26.48%, respectively). Conclusion: Malnutrition on discharge day increased in the patients in the ICU according to SGA. Anthropometric measurements were better predictors of the nutritional outcome of our critically ill patients than were biochemical tests. PMID:27217600

  14. Critical Emergency Medicine Procedural Skills: A Comparative Study of Methods for Teaching and Assessment.

    ERIC Educational Resources Information Center

    Chapman, Dane M.; And Others

    Three critical procedural skills in emergency medicine were evaluated using three assessment modalities--written, computer, and animal model. The effects of computer practice and previous procedure experience on skill competence were also examined in an experimental sequential assessment design. Subjects were six medical students, six residents,…

  15. An Assessment of Student Computer Ergonomic Knowledge.

    ERIC Educational Resources Information Center

    Alexander, Melody W.

    1997-01-01

    Business students (n=254) were assessed on their knowledge of computers, health and safety, radiation, workstations, and ergonomic techniques. Overall knowledge was low in all categories. In particular, they had not learned computer-use techniques. (SK)

  16. Computer Interview Problem Assessment of Psychiatric Patients

    PubMed Central

    Angle, Hugh V.; Ellinwood, Everett H.; Carroll, Judith

    1978-01-01

Behavioral Assessment information, a more general form of Problem-Oriented Record data, appears to have many useful clinical qualities and was selected as the information content for a computer interview system. This interview system was designed to assess problematic behaviors of psychiatric patients. The computer interview covered 29 life problem areas and took patients from four to eight hours to complete. In two reliability studies, the computer interview was compared to human interviews. A greater number of general and specific patient problems were identified in the computer interview than in the human interviews. The attitudes of the interviewed patients and of the clinicians receiving the computer reports were surveyed.

  17. HSE's safety assessment principles for criticality safety.

    PubMed

    Simister, D N; Finnerty, M D; Warburton, S J; Thomas, E A; Macphail, M R

    2008-06-01

    The Health and Safety Executive (HSE) published its revised Safety Assessment Principles for Nuclear Facilities (SAPs) in December 2006. The SAPs are primarily intended for use by HSE's inspectors when judging the adequacy of safety cases for nuclear facilities. The revised SAPs relate to all aspects of safety in nuclear facilities including the technical discipline of criticality safety. The purpose of this paper is to set out for the benefit of a wider audience some of the thinking behind the final published words and to provide an insight into the development of UK regulatory guidance. The paper notes that it is HSE's intention that the Safety Assessment Principles should be viewed as a reflection of good practice in the context of interpreting primary legislation such as the requirements under site licence conditions for arrangements for producing an adequate safety case and for producing a suitable and sufficient risk assessment under the Ionising Radiations Regulations 1999 (SI1999/3232 www.opsi.gov.uk/si/si1999/uksi_19993232_en.pdf).

  18. Computer Science and Engineering Students Addressing Critical Issues Regarding Gender Differences in Computing: A Case Study

    ERIC Educational Resources Information Center

    Tsagala, Evrikleia; Kordaki, Maria

    2008-01-01

    This study focuses on how Computer Science and Engineering Students (CSESs) of both genders address certain critical issues for gender differences in the field of Computer Science and Engineering (CSE). This case study is based on research conducted on a sample of 99 Greek CSESs, 43 of which were women. More specifically, these students were asked…

  19. Predictive Dynamic Security Assessment through Advanced Computing

    SciTech Connect

    Huang, Zhenyu; Diao, Ruisheng; Jin, Shuangshuang; Chen, Yousu

    2014-11-30

Traditional dynamic security assessment is limited by several factors and thus falls short of providing predictive, real-time information for power system operation. These factors include the steady-state assumption of current operating points, static transfer limits, and low computational speed. This paper addresses these factors and frames predictive dynamic security assessment. The primary objective of predictive dynamic security assessment is to enhance the functionality and computational process of dynamic security assessment through the use of high-speed phasor measurements and the application of advanced computing technologies for faster-than-real-time simulation. This paper presents algorithms, computing platforms, and simulation frameworks that constitute the predictive dynamic security assessment capability. Examples of phasor application and fast computation for dynamic security assessment are included to demonstrate the feasibility and speed enhancement for real-time applications.

  20. Critical infrastructure systems of systems assessment methodology.

    SciTech Connect

    Sholander, Peter E.; Darby, John L.; Phelan, James M.; Smith, Bryan; Wyss, Gregory Dane; Walter, Andrew; Varnado, G. Bruce; Depoy, Jennifer Mae

    2006-10-01

    Assessing the risk of malevolent attacks against large-scale critical infrastructures requires modifications to existing methodologies that separately consider physical security and cyber security. This research has developed a risk assessment methodology that explicitly accounts for both physical and cyber security, while preserving the traditional security paradigm of detect, delay, and respond. This methodology also accounts for the condition that a facility may be able to recover from or mitigate the impact of a successful attack before serious consequences occur. The methodology uses evidence-based techniques (which are a generalization of probability theory) to evaluate the security posture of the cyber protection systems. Cyber threats are compared against cyber security posture using a category-based approach nested within a path-based analysis to determine the most vulnerable cyber attack path. The methodology summarizes the impact of a blended cyber/physical adversary attack in a conditional risk estimate where the consequence term is scaled by a ''willingness to pay'' avoidance approach.
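The conditional risk estimate sketched in this abstract combines attack likelihood, defeat of both physical and cyber protection layers, the chance of recovering before serious consequences, and a consequence term scaled by willingness to pay. The functional form and all parameter values below are illustrative assumptions, not the Sandia methodology itself.

```python
def conditional_risk(p_attack, p_defeat_physical, p_defeat_cyber,
                     p_recover, consequence, willingness_to_pay=1.0):
    """Illustrative conditional risk for a blended cyber/physical attack:
    both protection layers must be defeated, the facility must fail to
    recover in time, and the consequence is scaled by a
    willingness-to-pay factor. All parameters are hypothetical."""
    p_success = p_defeat_physical * p_defeat_cyber
    return (p_attack * p_success * (1.0 - p_recover)
            * consequence * willingness_to_pay)
```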

  1. Radiation exposure and risk assessment for critical female body organs

    SciTech Connect

    Atwell, W.; Weyland, M.D.; Hardy, A.C. NASA, Johnson Space Center, Houston, TX )

    1991-07-01

    Space radiation exposure limits for astronauts are based on recommendations of the National Council on Radiation Protection and Measurements. These limits now include the age at exposure and sex of the astronaut. A recently-developed computerized anatomical female (CAF) model is discussed in detail. Computer-generated, cross-sectional data are presented to illustrate the completeness of the CAF model. By applying ray-tracing techniques, shield distribution functions have been computed to calculate absorbed dose and dose equivalent values for a variety of critical body organs (e.g., breasts, lungs, thyroid gland, etc.) and mission scenarios. Specific risk assessments, i.e., cancer induction and mortality, are reviewed. 13 refs.

  2. Computational Methods for Sensitivity and Uncertainty Analysis in Criticality Safety

    SciTech Connect

    Broadhead, B.L.; Childs, R.L.; Rearden, B.T.

    1999-09-20

    Interest in the sensitivity methods that were developed and widely used in the 1970s (the FORSS methodology at ORNL among others) has increased recently as a result of potential use in the area of criticality safety data validation procedures to define computational bias, uncertainties and area(s) of applicability. Functional forms of the resulting sensitivity coefficients can be used as formal parameters in the determination of applicability of benchmark experiments to their corresponding industrial application areas. In order for these techniques to be generally useful to the criticality safety practitioner, the procedures governing their use had to be updated and simplified. This paper will describe the resulting sensitivity analysis tools that have been generated for potential use by the criticality safety community.
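For readers unfamiliar with the technique, a sensitivity coefficient relates a relative change in k-eff to a relative change in a cross section, and the covariance "sandwich rule" propagates nuclear-data uncertainties through those coefficients. The sketch below is a generic illustration of both steps, not the FORSS or TSUNAMI implementation.

```python
import numpy as np

def sensitivity_coefficient(k_plus, k_minus, k0, rel_pert):
    """Central-difference estimate of S = (sigma/k) * dk/dsigma from two
    calculations at +/- rel_pert relative cross-section perturbation."""
    return (k_plus - k_minus) / (2.0 * rel_pert * k0)

def keff_relative_uncertainty(S, cov_rel):
    """Propagate a relative covariance matrix through the sensitivities:
    (dk/k)^2 = S^T C S (the standard sandwich rule)."""
    S = np.asarray(S)
    return float(np.sqrt(S @ np.asarray(cov_rel) @ S))
```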

  3. Cryptographic Key Management and Critical Risk Assessment

    SciTech Connect

    Abercrombie, Robert K

    2014-05-01

The Department of Energy Office of Electricity Delivery and Energy Reliability (DOE-OE) CyberSecurity for Energy Delivery Systems (CSEDS) industry led program (DE-FOA-0000359) entitled "Innovation for Increasing CyberSecurity for Energy Delivery Systems (12CSEDS)," awarded a contract to Sypris Electronics LLC to develop a Cryptographic Key Management System for the smart grid (Scalable Key Management Solutions for Critical Infrastructure Protection). As a result of that award, Oak Ridge National Laboratory (ORNL) and Sypris Electronics, LLC entered into a CRADA (NFE-11-03562). ORNL provided its Cyber Security Econometrics System (CSES) as a tool to be modified and used as a metric to address risks and vulnerabilities in the management of cryptographic keys within the Advanced Metering Infrastructure (AMI) domain of the electric sector. ORNL concentrated its analysis on the AMI domain, for which the National Electric Sector Cybersecurity Organization Resource (NESCOR) Working Group 1 (WG1) has documented 29 failure scenarios. The computational infrastructure of this metric involves system stakeholders, security requirements, system components, and security threats. To compute this metric, we estimated the stakes that each stakeholder associates with each security requirement, as well as stochastic matrices that represent the probability of a threat causing a component failure and the probability of a component failure causing a security requirement violation. We applied this model to estimate the security of the AMI by leveraging the recently established National Institute of Standards and Technology Interagency Report (NISTIR) 7628 guidelines for smart grid security and the International Electrotechnical Commission (IEC) 62351, Part 9 to identify the life cycle for cryptographic key management, resulting in a vector that assigned to each stakeholder an estimate of their average loss in terms of dollars per day of system
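The metric structure described here (stakeholder stakes, threat-to-failure probabilities, and failure-to-violation probabilities) chains naturally as matrix products yielding a per-stakeholder cost vector. The toy dimensions and numbers below are invented purely to show the shape of such a computation, not CSES data.

```python
import numpy as np

# Toy dimensions: 2 stakeholders, 2 security requirements,
# 2 components, 2 threats (all numbers invented for illustration).
stakes = np.array([[300.0, 120.0],     # $/day lost per requirement violation
                   [ 50.0, 400.0]])
viol_given_comp = np.array([[0.20, 0.05],   # P(req violated | component fails)
                            [0.10, 0.30]])
fail_given_threat = np.array([[0.4, 0.1],   # P(component fails | threat)
                              [0.2, 0.5]])
p_threat = np.array([0.01, 0.03])           # P(threat materializes per day)

# Chain the conditional probabilities back to the stakes to get an
# expected loss ($/day) for each stakeholder.
mean_failure_cost = stakes @ viol_given_comp @ fail_given_threat @ p_threat
```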

  4. WSRC approach to validation of criticality safety computer codes

    SciTech Connect

    Finch, D.R.; Mincey, J.F.

    1991-12-31

    Recent hardware and operating system changes at the Westinghouse Savannah River Site (WSRC) have necessitated review of the validation of the JOSHUA criticality safety computer codes. As part of the planning for this effort, a policy for validation of JOSHUA and other criticality safety codes has been developed; it is illustrated here with the steps being taken at WSRC. The objective in validating a specific computational method is to reliably correlate its calculated neutron multiplication factor (K{sub eff}) with known values over a well-defined set of neutronic conditions. Said another way, such correlations should be (1) repeatable, (2) demonstrated with defined confidence, and (3) valid over an identified range of neutronic conditions (area of applicability). The general approach to validation of computational methods at WSRC must encompass a large number of diverse types of fissile material processes in different operations. Special problems arise in validating computational methods when very few experiments are available (such as for enriched uranium systems whose principal second isotope is {sup 236}U). To cover all process conditions at WSRC, a broad validation approach has been used. Broad validation is based upon calculation of many experiments that span all possible ranges of reflection, nuclide concentrations, moderation ratios, etc. Narrow validation, in comparison, relies on calculations of a few experiments very near anticipated worst-case process conditions. The methods and problems of broad validation are discussed.
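
    The bias-with-defined-confidence idea behind this kind of validation can be illustrated with a minimal statistical sketch. The benchmark k-eff values below are invented, and the simple two-sigma margin is only a rough stand-in for the formal tolerance-limit statistics used in real validation work (e.g., the ANSI/ANS-8 family of standards).

```python
import statistics

# Calculated k_eff for a set of critical benchmark experiments
# (true k_eff = 1.0 by construction; these values are invented).
calculated_keff = [0.9962, 1.0018, 0.9945, 0.9981, 1.0003,
                   0.9957, 0.9990, 0.9972, 1.0010, 0.9948]

bias = statistics.mean(calculated_keff) - 1.0   # negative => code under-predicts
sigma = statistics.stdev(calculated_keff)       # scatter about the mean

# A crude single-sided margin: bias minus ~2 sigma (illustrative only).
upper_subcritical_limit = 1.0 + bias - 2.0 * sigma

print(f"bias = {bias:+.4f}, sigma = {sigma:.4f}, "
      f"USL ~ {upper_subcritical_limit:.4f}")
```

    The "broad validation" approach in the abstract amounts to making this benchmark list span all reflection, concentration, and moderation conditions of interest, so the derived bias applies across the whole area of applicability.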

  6. Adapting the Critical Thinking Assessment Test for Palestinian Universities

    ERIC Educational Resources Information Center

    Basha, Sami; Drane, Denise; Light, Gregory

    2016-01-01

    Critical thinking is a key learning outcome for Palestinian students. However, there are no validated critical thinking tests in Arabic. Suitability of the US developed Critical Thinking Assessment Test (CAT) for use in Palestine was assessed. The test was piloted with university students in English (n = 30) and 4 questions were piloted in Arabic…

  7. Inequalities, Assessment and Computer Algebra

    ERIC Educational Resources Information Center

    Sangwin, Christopher J.

    2015-01-01

    The goal of this paper is to examine single variable real inequalities that arise as tutorial problems and to examine the extent to which current computer algebra systems (CAS) can (1) automatically solve such problems and (2) determine whether students' own answers to such problems are correct. We review how inequalities arise in contemporary…

  9. Computing Critical Properties with Yang-Yang Anomalies

    NASA Astrophysics Data System (ADS)

    Orkoulas, Gerassimos; Cerdeirina, Claudio; Fisher, Michael

    2017-01-01

    Computation of the thermodynamics of fluids in the critical region is a challenging task owing to the divergence of the correlation length and the lack of the particle-hole symmetry found in Ising or lattice-gas models. In addition, analysis of experiments and simulations reveals a Yang-Yang (YY) anomaly, which entails sharing of the specific-heat singularity between the pressure and the chemical potential. The size of the YY anomaly is measured by the YY ratio R{sub μ} = C{sub μ}/C{sub V} of the amplitudes of C{sub μ} = -T d{sup 2}μ/dT{sup 2} and of the total specific heat C{sub V}. A ``complete scaling'' theory, in which the pressure mixes into the scaling fields, accounts for the YY anomaly. In Phys. Rev. Lett. 116, 040601 (2016), compressible cell gas (CCG) models that exhibit YY and singular-diameter anomalies were advanced for near-critical fluids. In such models, the individual cell volumes are allowed to fluctuate. The thermodynamics of CCGs can be computed through mapping onto the Ising model via the seldom-used great grand canonical ensemble. The computations indicate that local free-volume fluctuations are the origin of the YY effects. Furthermore, local energy-volume coupling (to model water) is another crucial factor underlying the phenomena.

  10. A Review of Computer-Assisted Assessment

    ERIC Educational Resources Information Center

    Conole, Grainne; Warburton, Bill

    2005-01-01

    Pressure for better measurement of stated learning outcomes has resulted in a demand for more frequent assessment. The resources available are seen to be static or dwindling, but Information and Communications Technology is seen to increase productivity by automating assessment tasks. This paper reviews computer-assisted assessment (CAA) and…

  11. Computer Proficiency Questionnaire: Assessing Low and High Computer Proficient Seniors

    PubMed Central

    Boot, Walter R.; Charness, Neil; Czaja, Sara J.; Sharit, Joseph; Rogers, Wendy A.; Fisk, Arthur D.; Mitzner, Tracy; Lee, Chin Chin; Nair, Sankaran

    2015-01-01

    Purpose of the Study: Computers and the Internet have the potential to enrich the lives of seniors and aid in the performance of important tasks required for independent living. A prerequisite for reaping these benefits is having the skills needed to use these systems, which is highly dependent on proper training. One prerequisite for efficient and effective training is being able to gauge current levels of proficiency. We developed a new measure (the Computer Proficiency Questionnaire, or CPQ) to measure computer proficiency in the domains of computer basics, printing, communication, Internet, calendaring software, and multimedia use. Our aim was to develop a measure appropriate for individuals with a wide range of proficiencies from noncomputer users to extremely skilled users. Design and Methods: To assess the reliability and validity of the CPQ, a diverse sample of older adults, including 276 older adults with no or minimal computer experience, was recruited and asked to complete the CPQ. Results: The CPQ demonstrated excellent reliability (Cronbach’s α = .98), with subscale reliabilities ranging from .86 to .97. Age, computer use, and general technology use all predicted CPQ scores. Factor analysis revealed three main factors of proficiency related to Internet and e-mail use; communication and calendaring; and computer basics. Based on our findings, we also developed a short-form CPQ (CPQ-12) with similar properties but 21 fewer questions. Implications: The CPQ and CPQ-12 are useful tools to gauge computer proficiency for training and research purposes, even among low computer proficient older adults. PMID:24107443
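
    The reliability figure reported here (Cronbach's α = .98) comes from a standard formula that is easy to reproduce. The tiny item-response matrix below is invented purely to show the computation; it is not the CPQ data.

```python
import statistics

# Hypothetical item-response matrix:
# rows = respondents, columns = questionnaire items (invented scores).
scores = [
    [3, 4, 3, 4],
    [2, 2, 3, 2],
    [5, 5, 4, 5],
    [4, 4, 4, 3],
    [1, 2, 1, 2],
]

k = len(scores[0])  # number of items
item_vars = [statistics.variance(col) for col in zip(*scores)]
total_var = statistics.variance([sum(row) for row in scores])

# Cronbach's alpha: k/(k-1) * (1 - sum of item variances / variance of totals)
alpha = (k / (k - 1)) * (1 - sum(item_vars) / total_var)
print(f"Cronbach's alpha = {alpha:.3f}")
```

    Highly correlated items make the variance of the total scores dominate the sum of the item variances, pushing α toward 1, as in the CPQ result.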

  12. Assessment tool for nursing student computer competencies.

    PubMed

    Elder, Betty L; Koehn, Mary L

    2009-01-01

    Computer skills have been established as important for nursing students and for graduate nurses. No current research was found on the best method to evaluate the skills of incoming nursing students. The purpose of this descriptive, correlational study was to compare students' ratings of their computer competency with their performance of those skills on a computer-graded assessment. A convenience sample of 87 nursing students was used. There was a low but significant correlation between the scores on the survey and the assessment. The results suggest that students rate their skills higher than their actual performance of computer skills. Implications for educators are presented, and the value of using a computer-graded assessment is discussed.

  13. Self-organized criticality in a computer network model

    PubMed

    Yuan; Ren; Shan

    2000-02-01

    We study the collective behavior of computer network nodes using a cellular automaton model. The results show that when the load of the network is constant, the throughputs and buffer contents of the nodes are power-law distributed in both space and time. The feature of 1/f noise also appears in the power spectrum of the change in the number of nodes that bear a fixed part of the system load. This can be seen as yet another example of self-organized criticality. The power-law decay in the distribution of buffer contents implies that heavy network congestion occurs with small probability. The temporal power-law distribution for throughput might be a reasonable explanation for the observed self-similarity in computer network traffic.
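
    Power-law distributions like those reported in this abstract are usually characterized by an exponent. The sketch below draws a synthetic continuous power-law (Pareto) sample and recovers its exponent with the standard maximum-likelihood (Hill-type) estimator; all parameters are illustrative and unrelated to the paper's model.

```python
import math
import random

random.seed(42)

# Synthetic power-law sample: tail P(X > x) = (x / xmin)^-(alpha - 1),
# i.e., density p(x) ~ x^-alpha for x >= xmin.
alpha_true, xmin, n = 2.5, 1.0, 5000
sample = [xmin * (1.0 - random.random()) ** (-1.0 / (alpha_true - 1.0))
          for _ in range(n)]

# Maximum-likelihood (Hill-type) estimate of the density exponent alpha.
alpha_hat = 1.0 + n / sum(math.log(x / xmin) for x in sample)
print(f"true alpha = {alpha_true}, estimated alpha = {alpha_hat:.2f}")
```

    Log-log regression on a histogram is a common alternative, but the maximum-likelihood estimator is less biased for heavy-tailed data of this kind.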

  14. CriTi-CAL: A computer program for Critical Coiled Tubing Calculations

    SciTech Connect

    He, X.

    1995-12-31

    A computer software package for simulating coiled tubing operations has been developed at Rogaland Research. The software is named CriTi-CAL, for Critical Coiled Tubing Calculations. It is a PC program running under Microsoft Windows. CriTi-CAL is designed for predicting force, stress, torque, lockup, circulation pressure losses, and along-hole-depth corrections for coiled tubing workover and drilling operations. CriTi-CAL features a user-friendly interface, integrated work string and survey editors, flexible input units and output formats, on-line documentation, and extensive error trapping. CriTi-CAL was developed using a combination of Visual Basic and C. Such an approach is an effective way to quickly develop high-quality small- to medium-size software for the oil industry. The software is based on the results of intensive experimental and theoretical studies of buckling and post-buckling of coiled tubing at Rogaland Research. The software has been validated against full-scale test results and field data.

  15. Computer Applications in Assessment and Counseling.

    ERIC Educational Resources Information Center

    Veldman, Donald J.; Menaker, Shirley L.

    Public school counselors and psychologists can expect valuable assistance from computer-based assessment and counseling techniques within a few years, as programs now under development become generally available for the typical computers now used by schools for grade-reporting and class-scheduling. Although routine information-giving and gathering…

  16. Critical assessment of automated flow cytometry data analysis techniques.

    PubMed

    Aghaeepour, Nima; Finak, Greg; Hoos, Holger; Mosmann, Tim R; Brinkman, Ryan; Gottardo, Raphael; Scheuermann, Richard H

    2013-03-01

    Traditional methods for flow cytometry (FCM) data processing rely on subjective manual gating. Recently, several groups have developed computational methods for identifying cell populations in multidimensional FCM data. The Flow Cytometry: Critical Assessment of Population Identification Methods (FlowCAP) challenges were established to compare the performance of these methods on two tasks: (i) mammalian cell population identification, to determine whether automated algorithms can reproduce expert manual gating and (ii) sample classification, to determine whether analysis pipelines can identify characteristics that correlate with external variables (such as clinical outcome). This analysis presents the results of the first FlowCAP challenges. Several methods performed well as compared to manual gating or external variables using statistical performance measures, which suggests that automated methods have reached a sufficient level of maturity and accuracy for reliable use in FCM data analysis.

  17. CRITICAL ASSESSMENT OF AUTOMATED FLOW CYTOMETRY DATA ANALYSIS TECHNIQUES

    PubMed Central

    Aghaeepour, Nima; Finak, Greg; Hoos, Holger; Mosmann, Tim R.; Gottardo, Raphael; Brinkman, Ryan; Scheuermann, Richard H.

    2013-01-01

    Traditional methods for flow cytometry (FCM) data processing rely on subjective manual gating. Recently, several groups have developed computational methods for identifying cell populations in multidimensional FCM data. The Flow Cytometry: Critical Assessment of Population Identification Methods (FlowCAP) challenges were established to compare the performance of these methods on two tasks – mammalian cell population identification to determine if automated algorithms can reproduce expert manual gating, and sample classification to determine if analysis pipelines can identify characteristics that correlate with external variables (e.g., clinical outcome). This analysis presents the results of the first of these challenges. Several methods performed well compared to manual gating or external variables using statistical performance measures, suggesting that automated methods have reached a sufficient level of maturity and accuracy for reliable use in FCM data analysis. PMID:23396282

  18. Nutritional assessment and enteral support of critically ill children.

    PubMed

    Ista, Erwin; Joosten, Koen

    2005-12-01

    Critical care nurses play an important role in feeding of critically ill children. Many procedures and caregiving interventions, such as placement of feeding tubes, registration of gastric retention, observation and care of the mouth, and administration of nutrition (enteral or parenteral), are within the nursing domain. This article discusses nutritional assessment techniques and enteral nutrition in critically ill children.

  19. Computer-based consultation in "care" of the critically ill patient.

    PubMed

    Siegel, J H; Fichthorn, J; Monteferrante, J; Moody, E; Box, N; Nolan, C; Ardrey, R

    1976-09-01

    Despite far-reaching progress in all areas of surgery, methods of medical data analysis and communication have not kept pace with the increased rate of data acquisition. The needs to organize and communicate these data and to provide a medium for continuing education are great in critical-care areas, where the amount and the diversity of data collected are enormous, and the number of surgical team members involved in patient care has grown proportionately. The computer-based Clinical Assessment, Research, and Education System (CARE) is a time-shared computer system now available on a national basis designed to provide a management and education aid for the treatment of critically ill surgical patients. An initial clinical assessment and operative note are entered by the surgeon, from which an estimation of the initial fluid, blood, and electrolyte deficits is calculated. Daily doctor's progress notes, shift nurses' summaries of vital signs, clinical information, intake and output data, and drug administration, biochemical, cardiovascular, blood gas, and respiratory information are entered for each shift. From these, a metabolic balance is calculated; fluid, electrolyte, and caloric requirements are determined; cardiorespiratory parameters are computed; and various therapeutic suggestions and cautions are given to alert the physician to problems that may be arising. The surgeon-user is assisted in making the best critical-care decisions through computer-directed, interactive prompting which focuses on the most important clinical conditions, correlations, and metabolic considerations and relates the important problem to the relevant literature.

  20. The change in critical technologies for computational physics

    NASA Technical Reports Server (NTRS)

    Watson, Val

    1990-01-01

    It is noted that the types of technology required for computational physics are changing as the field matures. Emphasis has shifted from computer technology to algorithm technology and, finally, to visual analysis technology as areas of critical research for this field. High-performance graphical workstations tied to a supercomputer by high-speed communications, along with the development of specially tailored visualization software, have enabled analysis of highly complex fluid-dynamics simulations. Particular reference is made here to the development of visual analysis tools at NASA's Numerical Aerodynamics Simulation Facility. The next technology this field requires is one that would eliminate visual clutter by extracting the key features of physics simulations in order to create displays that clearly portray those features. Research on tuning visual displays to human cognitive abilities is proposed. The immediate transfer of technology to all levels of computers, specifically the inclusion of visualization primitives in basic software developments for all workstations and PCs, is recommended.

  1. A Novel Instrument for Assessing Students' Critical Thinking Abilities

    ERIC Educational Resources Information Center

    White, Brian; Stains, Marilyne; Escriu-Sune, Marta; Medaglia, Eden; Rostamnjad, Leila; Chinn, Clark; Sevian, Hannah

    2011-01-01

    Science literacy involves knowledge of both science content and science process skills. In this study, we describe the Assessment of Critical Thinking Ability survey and its preliminary application to assess the critical thinking skills of undergraduate students, graduate students, and postdoctoral fellows. This survey is based on a complex and…

  2. Assessing Quality of Critical Thought in Online Discussion

    ERIC Educational Resources Information Center

    Weltzer-Ward, Lisa; Baltes, Beate; Lynn, Laura Knight

    2009-01-01

    Purpose: The purpose of this paper is to describe a theoretically based coding framework for an integrated analysis and assessment of critical thinking in online discussion. Design/methodology/approach: The critical thinking assessment framework (TAF) is developed through review of theory and previous research, verified by comparing results to…

  3. Computer proficiency questionnaire: assessing low and high computer proficient seniors.

    PubMed

    Boot, Walter R; Charness, Neil; Czaja, Sara J; Sharit, Joseph; Rogers, Wendy A; Fisk, Arthur D; Mitzner, Tracy; Lee, Chin Chin; Nair, Sankaran

    2015-06-01

    Computers and the Internet have the potential to enrich the lives of seniors and aid in the performance of important tasks required for independent living. A prerequisite for reaping these benefits is having the skills needed to use these systems, which is highly dependent on proper training. One prerequisite for efficient and effective training is being able to gauge current levels of proficiency. We developed a new measure (the Computer Proficiency Questionnaire, or CPQ) to measure computer proficiency in the domains of computer basics, printing, communication, Internet, calendaring software, and multimedia use. Our aim was to develop a measure appropriate for individuals with a wide range of proficiencies from noncomputer users to extremely skilled users. To assess the reliability and validity of the CPQ, a diverse sample of older adults, including 276 older adults with no or minimal computer experience, was recruited and asked to complete the CPQ. The CPQ demonstrated excellent reliability (Cronbach's α = .98), with subscale reliabilities ranging from .86 to .97. Age, computer use, and general technology use all predicted CPQ scores. Factor analysis revealed three main factors of proficiency related to Internet and e-mail use; communication and calendaring; and computer basics. Based on our findings, we also developed a short-form CPQ (CPQ-12) with similar properties but 21 fewer questions. The CPQ and CPQ-12 are useful tools to gauge computer proficiency for training and research purposes, even among low computer proficient older adults.

  4. The Collegiate Learning Assessment: A Critical Perspective

    ERIC Educational Resources Information Center

    Shermis, Mark D.

    2008-01-01

    This article describes the Collegiate Learning Assessment (CLA), a postsecondary assessment tool designed to evaluate the "value-added" component of institutional contributions to student learning outcomes. Developed by the Council for Aid to Education (CAE), the instrument ostensibly focuses on the contributions of general education coursework…

  5. Perspectives on sedation assessment in critical care.

    PubMed

    Olson, Daiwai M; Thoyre, Suzanne M; Auyong, David B

    2007-01-01

    Multiple studies have been undertaken to show that neurofunction monitors can correlate with objective sedation assessments. Showing a correlation between these 2 patient assessment tools may not be the correct approach to validation of neurofunction monitors: 2 different methods of assessing 2 different modes of the patient's response to sedation should not be expected to correlate precisely unless the desire is to replace one method with the other. We provide a brief summary of several sedation scales, physiologic measures, and neurofunction monitoring tools, and of the literature correlating bispectral index monitoring with the Ramsay Scale and the Sedation Agitation Scale. Neurofunction monitors provide near-continuous information about a different domain of the sedation response than intermittent observational assessments. Further research should focus on contributions from this technology to the improvement of patient outcomes when neurofunction monitoring is used as a complement to, not a replacement for, observational methods of sedation assessment.

  6. Bad Actors Criticality Assessment for Pipeline system

    NASA Astrophysics Data System (ADS)

    Nasir, Meseret; Chong, Kit wee; Osman, Sabtuni; Siaw Khur, Wee

    2015-04-01

    Failure of a pipeline system can bring huge economic loss. To mitigate such catastrophic loss, it is necessary to evaluate and rank the impact of each bad actor in the pipeline system. In this study, bad actors are the root causes or any potential factors leading to system downtime. Fault Tree Analysis (FTA) is used to analyze the probability of occurrence of each bad actor. Birnbaum's importance and criticality measure (BICM) is also employed to rank the impact of each bad actor on pipeline system failure. The results demonstrate that internal corrosion, external corrosion, and construction damage are critical and contribute most to pipeline system failure, at 48.0%, 12.4%, and 6.0%, respectively. Thus, a minor improvement in internal corrosion, external corrosion, or construction damage would bring significant changes in pipeline system performance and reliability. These results could also be used to develop an efficient maintenance strategy by identifying the critical bad actors.
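
    Birnbaum's importance measure is the partial derivative of the top-event probability with respect to each basic-event probability, and the criticality importance rescales it by p_i/P(system). For an OR-gate fault tree with independent basic events this is easy to sketch; the bad actors and probabilities below are invented and are not the values from the study.

```python
from math import prod

# Hypothetical bad actors with invented failure probabilities;
# the top event (pipeline failure) is an OR gate over them.
p = {
    "internal corrosion": 0.10,
    "external corrosion": 0.04,
    "construction damage": 0.02,
}

p_sys = 1.0 - prod(1.0 - q for q in p.values())  # P(top event)

results = {}
for name, pi in p.items():
    # Birnbaum importance: dP_sys/dp_i = product of the others' survival
    birnbaum = prod(1.0 - q for other, q in p.items() if other != name)
    results[name] = birnbaum * pi / p_sys        # criticality importance

for name, crit in sorted(results.items(), key=lambda kv: -kv[1]):
    print(f"{name}: criticality importance = {crit:.3f}")
```

    The ranking tracks both how sensitive the system is to each event (Birnbaum) and how likely the event is, which is why a frequent cause such as internal corrosion dominates even when the gate structure treats all events symmetrically.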

  7. [Pain management for cancer patients with critical pathway on computer].

    PubMed

    Hori, Natsuki; Konishi, Toshiro

    2005-02-01

    For relief from cancer pain, we developed a critical pathway (CP) as an effective strategy for the medical staff treating cancer patients. The CP was built in Microsoft Excel and used on personal computers. "Good sleeping" was set as the first goal and "No pain in rest position" as the second. To achieve this, physicians and nurses evaluate efficacy and complications (including nausea/vomiting, constipation, somnolence, and hallucination) every day, using controlled-release oxycodone in addition to NSAIDs, with prochlorperazine, a stool softener, and a peristaltic stimulant for adverse effects. These outcomes drive the next day's medication change, calculated with a Visual Basic function according to opioid titration theory. In twelve patients this CP was acceptable, and all of them achieved the second goal within a week without severe adverse effects other than constipation.

  8. Software Safety Risk in Legacy Safety-Critical Computer Systems

    NASA Technical Reports Server (NTRS)

    Hill, Janice L.; Baggs, Rhoda

    2007-01-01

    Safety standards contain technical and process-oriented safety requirements. Technical requirements specify "must work" and "must not work" functions in the system; process-oriented requirements address software engineering and safety management processes. Some standards address the system perspective, while others cover only the software in the system. NASA-STD-8719.13B, the Software Safety Standard, is the current standard of interest, and NASA programs/projects derive their own sets of safety requirements from it. A safety case is a documented demonstration that a system complies with the specified safety requirements: evidence is gathered on the integrity of the system and put forward as an argued case [Gardener (ed.)]. Problems occur when trying to meet safety standards, and thus to make retrospective safety cases, for legacy safety-critical computer systems.

  9. NUMERICAL COMPUTATIONS OF CO-EXISTING SUPER-CRITICAL AND SUB-CRITICAL FLOWS BASED UPON CRD SCHEMES

    NASA Astrophysics Data System (ADS)

    Horie, Katsuya; Okamura, Seiji; Kobayashi, Yusuke; Hyodo, Makoto; Hida, Yoshihisa; Nishimoto, Naoshi; Mori, Akio

    Stream flows in steep-gradient beds form complicated flow configurations in which super-critical and sub-critical flows co-exist. Computing such flows numerically is the key to successful river management. This study applied CRD schemes to 1D and 2D stream flow computations and proposed ways to eliminate expansion shock waves. Through the various cases of stream flow computed, the CRD schemes showed that i) conservativeness of discharge and accuracy to four significant figures are ensured, ii) artificial viscosity is not explicitly used for computational stabilization, and thus iii) 1D and 2D computations based upon CRD schemes are applicable to evaluating complicated stream flows for river management.

  10. Criticality of Water: Aligning Water and Mineral Resources Assessment.

    PubMed

    Sonderegger, Thomas; Pfister, Stephan; Hellweg, Stefanie

    2015-10-20

    The concept of criticality has been used to assess whether a resource may become a limiting factor to economic activities. It has been primarily applied to nonrenewable resources, in particular to metals. However, renewable resources such as water may also be overused and become a limiting factor. In this paper, we therefore developed a water criticality method that allows for a new, user-oriented assessment of water availability and accessibility. Comparability of criticality across resources is desirable, which is why the presented adaptation of the criticality approach to water is based on a metal criticality method, whose basic structure is maintained. With respect to the necessary adaptations to the water context, a transparent water criticality framework is proposed that may pave the way for future integrated criticality assessment of metals, water, and other resources. Water criticality scores were calculated for 159 countries subdivided into 512 geographic units for the year 2000. Results allow for a detailed analysis of criticality profiles, revealing locally specific characteristics of water criticality. This is useful for the screening of sites and their related water criticality, for indication of water related problems and possible mitigation options and water policies, and for future water scenario analysis.

  11. Assessing the physical loading of wearable computers.

    PubMed

    Knight, James F; Baber, Chris

    2007-03-01

    Wearable computers enable workers to interact with computer equipment in situations where previously they could not. Attaching a computer to the body, though, has an unknown physical effect. This paper reports a methodology for addressing this by assessing postural effects and the effect of added weight. Using the example of arm-mounted computers (AMCs), the paper shows that adopting a posture to interact with an AMC generates fatiguing levels of stress, and that a load of 0.54 kg results in an increased level of stress and an increased rate of fatigue. The paper shows that, owing to the poor postures adopted when wearing and interacting with computers and the weight of the device attached to the body, one possible outcome of prolonged exposure is the development of musculoskeletal disorders.

  12. RHIC CRITICAL POINT SEARCH: ASSESSING STARs CAPABILITIES.

    SciTech Connect

    SORENSEN,P.

    2006-07-03

    In this report we discuss the capabilities and limitations of the STAR detector to search for signatures of the QCD critical point in a low energy scan at RHIC. We find that a RHIC low energy scan will cover a broad region of interest in the nuclear matter phase diagram and that the STAR detector--a detector designed to measure the quantities that will be of interest in this search--will provide new observables and improve on previous measurements in this energy range.

  13. A Critical Evaluation of Cognitive Style Assessment.

    ERIC Educational Resources Information Center

    Richter, Ricka

    This document reviews theories of cognitive style and methods of cognitive style assessment as they relate to the context of South Africa, where sociopolitical changes call for reassessment of theoretical assumptions in education and training. The report consists of six chapters. After a brief introductory chapter, the second chapter gives an…

  14. Fuzzy architecture assessment for critical infrastructure resilience

    SciTech Connect

    Muller, George

    2012-12-01

    This paper presents an approach for the selection of alternative architectures in a connected infrastructure system to increase resilience of the overall infrastructure system. The paper begins with a description of resilience and critical infrastructure, then summarizes existing approaches to resilience, and presents a fuzzy-rule based method of selecting among alternative infrastructure architectures. This methodology includes considerations which are most important when deciding on an approach to resilience. The paper concludes with a proposed approach which builds on existing resilience architecting methods by integrating key system aspects using fuzzy memberships and fuzzy rule sets. This novel approach aids the systems architect in considering resilience for the evaluation of architectures for adoption into the final system architecture.

  15. A COMPUTER-ASSIST MATERIAL TRACKING SYSTEM AS A CRITICALITY SAFETY AID TO OPERATORS

    SciTech Connect

    Claybourn, R V; Huang, S T

    2007-03-30

In today's compliance-driven environment, fissionable material handlers are inundated with work control rules and procedures in carrying out nuclear operations. Historically, human errors are one of the key contributors to various criticality accidents. Since moving and handling fissionable materials are key components of their job functions, any means that can be provided to assist operators in facilitating fissionable material moves will help improve operational efficiency and enhance criticality safety implementation. From the criticality safety perspective, operational issues have been encountered in Lawrence Livermore National Laboratory (LLNL) plutonium operations. Those issues included a lack of adequate historical record keeping for the fissionable material stored in containers, a need for a better way of accommodating operations in a research and development setting, and better means of helping material handlers in carrying out various criticality safety controls. Through the years, effective measures were implemented, including a better work control process, standardized criticality control conditions (SCCC), and relocation of criticality safety engineers to the plutonium facility. Another important measure taken was to develop a computer data acquisition system for criticality safety assessment, which is the subject of this paper. The purpose of the Criticality Special Support System (CSSS) is to integrate many of the proven operational support protocols into a software system to assist operators with assessing compliance to procedures during the handling and movement of fissionable materials. Many nuclear facilities utilize mass cards or a computer program to track fissionable material mass data in operations. Additional item-specific data, such as the presence of moderators or close-fitting reflectors, could be helpful to fissionable material handlers in assessing compliance to SCCCs. Computer-assist checking of a workstation material inventory against the

  16. Assessing Vulnerabilities, Risks, and Consequences of Damage to Critical Infrastructure

    SciTech Connect

    Suski, N; Wuest, C

    2011-02-04

The Pre-Assessment Phase brings together infrastructure owners and operators to identify critical assets and help the team create a structured information request. During this phase, we gain information about the critical assets from those who are most familiar with operations and interdependencies, making the time we spend on the ground conducting the assessment much more productive and enabling the team to make actionable recommendations. The Assessment Phase analyzes 10 areas: threat environment, cyber architecture, cyber penetration, physical security, physical penetration, operations security, policies and procedures, interdependencies, consequence analysis, and risk characterization. Each of these individual tasks uses direct and indirect data collection, site inspections, and structured and facilitated workshops to gather data. Because of the importance of understanding the cyber threat, LLNL has built both fixed and mobile cyber penetration, wireless penetration, and supporting tools that can be tailored to fit customer needs. The Post-Assessment Phase brings vulnerability and risk assessments to the customer in a format that facilitates implementation of mitigation options. Often the assessment findings and recommendations are briefed and discussed with several levels of management and, if appropriate, across jurisdictional boundaries. The end result is enhanced awareness and informed protective measures. Over the last 15 years, we have continued to refine our methodology and capture lessons learned and best practices. The resulting risk and decision framework thus takes into consideration real-world constraints, including regulatory, operational, and economic realities. In addition to 'on the ground' assessments focused on mitigating vulnerabilities, we have integrated our computational and atmospheric dispersion capability with easy-to-use geo-referenced visualization tools to support emergency planning and response operations.
LLNL is home to the National Atmospheric Release

  17. Nurses' critical event risk assessments: a judgement analysis.

    PubMed

    Thompson, Carl; Bucknall, Tracey; Estabrookes, Carole A; Hutchinson, Alison; Fraser, Kim; de Vos, Rien; Binnecade, Jan; Barrat, Gez; Saunders, Jane

    2009-02-01

To explore and explain nurses' use of readily available clinical information when deciding whether a patient is at risk of a critical event. Half of inpatients who suffer a cardiac arrest have documented but unacted upon clinical signs of deterioration in the 24 hours prior to the event. Nurses appear to be both misinterpreting and mismanaging the nursing-knowledge 'basics' such as heart rate, respiratory rate and oxygenation. Whilst many medical interventions originate from nurses, up to 26% of nurses' responses to abnormal signs result in delays of between one and three hours. A double system judgement analysis using Brunswik's lens model of cognition was undertaken with 245 Dutch, UK, Canadian and Australian acute care nurses. Nurses were asked to judge the likelihood of a critical event, 'at-risk' status, and whether they would intervene in response to 50 computer-presented clinical scenarios in which data on heart rate, systolic blood pressure, urine output, oxygen saturation, conscious level and oxygenation support were varied. Nurses were also presented with a protocol recommendation and also placed under time pressure for some of the scenarios. The ecological criterion was the predicted level of risk from the Modified Early Warning Score assessments of 232 UK acute care inpatients. Despite receiving identical information, nurses varied considerably in their risk assessments. The differences can be partly explained by variability in weightings given to information. Time and protocol recommendations were given more weighting than clinical information for key dichotomous choices such as classifying a patient as 'at risk' and deciding to intervene. Nurses' weighting of cues did not mirror the same information's contribution to risk in real patients. Nurses synthesized information in non-linear ways that contributed little to decisional accuracy. The low-moderate achievement (Ra) statistics suggest that nurses' assessments of risk were largely inaccurate
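The lens-model "policy capturing" step in a judgement analysis like this one can be sketched with simulated data: each judgement is regressed on the presented cues to recover the weights the judge actually applied, and achievement is the correlation between judgements and the ecological criterion. The cue weights, noise level, and criterion weighting below are hypothetical illustrations, not values from the study.

```python
import numpy as np

rng = np.random.default_rng(0)

# Six cues per scenario, mirroring those varied in the study: heart rate,
# systolic BP, urine output, oxygen saturation, conscious level, O2 support.
n_scenarios, n_cues = 50, 6
cues = rng.normal(size=(n_scenarios, n_cues))

# Hypothetical "true" judgement policy: this simulated nurse leans
# heavily on conscious level and oxygen saturation.
true_weights = np.array([0.2, 0.1, 0.05, 0.8, 1.0, 0.3])
judgements = cues @ true_weights + rng.normal(scale=0.1, size=n_scenarios)

# Policy capturing: regress judgements on cues to estimate the weights
# the judge actually applied.
w_hat, *_ = np.linalg.lstsq(cues, judgements, rcond=None)

# Achievement (Ra): correlation between judgements and the ecological
# criterion; here an assumed alternative weighting stands in for true risk.
eco_weights = np.array([0.5, 0.5, 0.3, 0.5, 0.5, 0.5])
criterion = cues @ eco_weights
r_a = np.corrcoef(judgements, criterion)[0, 1]
```

Comparing `w_hat` against the criterion's weighting is what lets a study like this say that nurses' cue weights "did not mirror the same information's contribution to risk in real patients."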

  18. Assessment of (Computer-Supported) Collaborative Learning

    ERIC Educational Resources Information Center

    Strijbos, J. -W.

    2011-01-01

    Within the (Computer-Supported) Collaborative Learning (CS)CL research community, there has been an extensive dialogue on theories and perspectives on learning from collaboration, approaches to scaffold (script) the collaborative process, and most recently research methodology. In contrast, the issue of assessment of collaborative learning has…

  19. Assessing Moderator Variables: Two Computer Simulation Studies.

    ERIC Educational Resources Information Center

    Mason, Craig A.; And Others

    1996-01-01

    A strategy is proposed for conceptualizing moderating relationships based on their type (strictly correlational and classically correlational) and form, whether continuous, noncontinuous, logistic, or quantum. Results of computer simulations comparing three statistical approaches for assessing moderator variables are presented, and advantages of…

  20. Computer Competence: The First National Assessment.

    ERIC Educational Resources Information Center

    Martinez, Michael E.; Mead, Nancy A.

    This report contains the results of a national survey conducted by the National Assessment of Educational Progress (NAEP) during the 1985-86 school year. The report, which attempts to capture the interacting forces influencing computer competence among students, is presented in six chapters: (1) Overview (major findings, significance of this…

  1. Assessment of critical-fluid extractions in the process industries

    NASA Technical Reports Server (NTRS)

    1982-01-01

The potential for critical-fluid extraction as a separation process for improving the productive use of energy in the process industries is assessed. Critical-fluid extraction involves the use of fluids, normally gaseous at ambient conditions, as extraction solvents at temperatures and pressures around the critical point. Equilibrium and kinetic properties in this regime are very favorable for solvent applications, and generally allow major reductions in the energy requirements for separating and purifying the chemical components of a mixture.

  2. Assessment of community contamination: a critical approach.

    PubMed

    Clark, Lauren; Barton, Judith A; Brown, Nancy J

    2002-01-01

    The purpose of this paper is to review data from two Superfund sites and describe the latitude of interpretation of "environmental risk" by residents living in the area, governmental agencies, and the media. The first community was located within a 5-mi perimeter of the Rocky Flats Environmental Technology Site (RFETS) outside Denver, Colorado. The second community was located on the south side of Tucson, Arizona, adjacent to the Tucson International Airport area (TIAA) Superfund site. Critical theory was the perspective used in this analysis and proposal of public health actions to attain social justice. Differences between the two populations' experiences with risk and contamination coincided with divergent levels of trust in government. RFETS residents demanded monitoring, whereas the minority residents at TIAA were ambivalent about their trust in government cleanup activities. Unraveling the purpose of "facts" and the social force of "truth" can direct nurses to address environmental justice issues. By policing governmental and business activities in halting or cleaning up environmental contamination, nurses may become mouthpieces for the concerns underlying the fragile surface of "virtual trust" in contaminated communities. Cutting through competing rhetoric to police environmental safety, the core function of assurance becomes what nurses do, not what they say.

  3. Cyber Security: Critical Infrastructure Controls Assessment Framework

    DTIC Science & Technology

    2011-05-01

Briefing-slide fragments; the recoverable content lists security standards and bodies referenced by the controls assessment framework: SANS CAG, OASIS (Organization for the Advancement of Structured Information Standards), OWASP (Open Web Application Security Project), ISA-99, SOX, NERC-CIP, NIST cyber standards, PCI (Payment Card Industry), and PCS standards, spanning grid, chemical, nuclear, and transportation sectors.

  4. Critical temperature: A quantitative method of assessing cold tolerance

    Treesearch

    D.H. DeHayes; M.W. Williams, Jr.

    1989-01-01

    Critical temperature (Tc), defined as the highest temperature at which freezing injury to plant tissues can be detected, provides a biologically meaningful and statistically defined assessment of the relative cold tolerance of plant tissues. A method is described for calculating critical temperatures in laboratory freezing studies that use...

  5. Assessment of Critical Thinking Ability in Medical Students

    ERIC Educational Resources Information Center

    Macpherson, Karen; Owen, Cathy

    2010-01-01

    In this study conducted with 80 first-year students in a graduate medical course at the Australian National University, Canberra, students' critical thinking skills were assessed using the Watson-Glaser Critical Thinking Appraisal (Forms A and B) in a test-retest design. Results suggested that overall subjects retained consistent patterns of…

  6. Research on computer aided testing of pilot response to critical in-flight events

    NASA Technical Reports Server (NTRS)

    Giffin, W. C.; Rockwell, T. H.; Smith, P. J.

    1984-01-01

Experiments on pilot decision making are described. The development of models of pilot decision making in critical in-flight events (CIFE) is emphasized. Progress is reported on the development of: (1) a frame system representation describing how pilots use their knowledge in a fault diagnosis task; (2) assessment of script norms, distance measures, and Markov models developed from computer-aided testing (CAT) data; and (3) performance ranking of subject data. It is demonstrated that interactive computer-aided testing, either by touch CRTs or personal computers, is a useful research and training device for measuring pilot information management in diagnosing system failures in simulated flight situations. Performance is dictated by knowledge of aircraft subsystems, initial pilot structuring of the failure symptoms, and efficient testing of plausible causal hypotheses.

  7. Computer assisted blast design and assessment tools

    SciTech Connect

    Cameron, A.R.; Kleine, T.H.; Forsyth, W.W.

    1995-12-31

In general, the software required by a blast designer includes tools that graphically present blast designs (surface and underground), can analyze a design or predict its result, and can assess blasting results. As computers develop and computer literacy continues to rise, the development and use of such tools will spread. Examples of the tools that are becoming available include: automatic blast pattern generation and underground ring design; blast design evaluation in terms of explosive distribution and detonation simulation; fragmentation prediction; blast vibration prediction and minimization; blast monitoring for assessment of dynamic performance; vibration measurement, display and signal processing; evaluation of blast results in terms of fragmentation; and risk- and reliability-based blast assessment. The authors have identified a set of criteria that are essential in choosing appropriate software blasting tools.

  8. Critical thinking traits of top-tier experts and implications for computer science education

    NASA Astrophysics Data System (ADS)

    Bushey, Dean E.

The findings of this study suggest a need to examine how critical-thinking abilities are learned in the undergraduate computer science curriculum and the need to foster these abilities in order to produce the high-level, critical-thinking professionals necessary to fill the growing need for these experts. Because current measures of academic performance do not adequately depict students' cognitive abilities, assessment of these skills must be incorporated into existing curricula.

  9. Computed Tomography: Image and Dose Assessment

    SciTech Connect

    Valencia-Ortega, F.; Ruiz-Trejo, C.; Rodriguez-Villafuerte, M.; Buenfil, A. E.; Mora-Hernandez, L. A.

    2006-09-08

In this work an experimental evaluation of image quality and dose imparted during a computed tomography study in a public hospital in Mexico City is presented. The measurements required the design and construction of two phantoms at the Institute of Physics, UNAM, according to the recommendations of the American Association of Physicists in Medicine (AAPM). Image assessment was performed in terms of spatial resolution and image contrast. Dose measurements were carried out using LiF:Mg,Ti (TLD-100) dosemeters and a pencil-shaped ionisation chamber. The results for a computed tomography head study in single- and multiple-detector modes are presented.

  10. Criticality safety assessment of tank 241-C-106 remediation

    SciTech Connect

    Waltar, A.E., Westinghouse Hanford

    1996-07-19

A criticality safety assessment was performed in support of Project 320 for the retrieval of waste from tank 241-C-106 to tank 241-AY-102. The assessment was performed by a multidisciplinary team with expertise covering nuclear engineering, plutonium and nuclear waste chemistry, and physical mixing hydraulics. Technical analysis was performed to evaluate the physical and chemical behavior of fissile material in neutralized Hanford waste as well as modeling of the fluid dynamics for the retrieval activity. The team has not found evidence of any credible mechanism to attain neutronic criticality in either tank and has concluded that a criticality accident is incredible.

  11. Accessible high performance computing solutions for near real-time image processing for time critical applications

    NASA Astrophysics Data System (ADS)

    Bielski, Conrad; Lemoine, Guido; Syryczynski, Jacek

    2009-09-01

High Performance Computing (HPC) hardware solutions such as grid computing and General Processing on a Graphics Processing Unit (GPGPU) are now accessible to users with general computing needs. Grid computing infrastructures in the form of computing clusters or blades are becoming commonplace, and GPGPU solutions that leverage the processing power of the video card are quickly being integrated into personal workstations. Our interest in these HPC technologies stems from the need to produce near real-time maps from a combination of pre- and post-event satellite imagery in support of post-disaster management. Faster processing provides a twofold gain in this situation: (1) critical information can be provided faster, and (2) more elaborate automated processing can be performed prior to providing the critical information. In our particular case, we test the use of the PANTEX index, which is based on analysis of image textural measures extracted using anisotropic, rotation-invariant GLCM statistics. The use of this index, applied in a moving window, has been shown to successfully identify built-up areas in remotely sensed imagery. Built-up index image masks are important input to the structuring of damage assessment interpretation because they help optimise the workload. The performance of computing the PANTEX workflow is compared on two different HPC hardware architectures: (1) a blade server with 4 blades, each having dual quad-core CPUs, and (2) a CUDA-enabled GPU workstation. The reference platform is a dual-CPU quad-core workstation, and the PANTEX workflow total computing time is measured. Furthermore, as part of a qualitative evaluation, the differences in setting up and configuring various hardware solutions and the related software coding effort are presented.
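The texture measure at the heart of this workflow can be illustrated generically. PANTEX's exact anisotropic formulation is not reproduced here; the sketch below computes one standard GLCM statistic (contrast), averaged over four displacement directions for rotation invariance, on a quantised image patch.

```python
import numpy as np

def glcm(img, dx, dy, levels):
    """Normalised grey-level co-occurrence matrix for one displacement (dx, dy)."""
    h, w = img.shape
    m = np.zeros((levels, levels))
    for y in range(max(0, -dy), min(h, h - dy)):
        for x in range(max(0, -dx), min(w, w - dx)):
            m[img[y, x], img[y + dy, x + dx]] += 1
    return m / m.sum()

def contrast(img, levels=8):
    """Rotation-invariant GLCM contrast: sum of (i-j)^2 weighted co-occurrence
    probabilities, averaged over 4 displacement directions."""
    i, j = np.indices((levels, levels))
    dirs = [(1, 0), (0, 1), (1, 1), (1, -1)]
    return np.mean([np.sum(glcm(img, dx, dy, levels) * (i - j) ** 2)
                    for dx, dy in dirs])

# Toy patches quantised to 8 grey levels: a textured ("built-up") patch
# scores much higher contrast than a homogeneous one.
rng = np.random.default_rng(1)
textured = rng.integers(0, 8, size=(16, 16))
flat = np.full((16, 16), 4)
```

Applied in a moving window over a satellite scene, a threshold on such a measure yields a built-up mask; the per-window GLCM loops are exactly the embarrassingly parallel workload that maps well onto the blade and GPGPU platforms compared in the paper.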

  12. Critical Assessment of Correction Methods for Fisheye Lens Distortion

    NASA Astrophysics Data System (ADS)

    Liu, Y.; Tian, C.; Huang, Y.

    2016-06-01

A fisheye lens is widely used to create a wide panoramic or hemispherical image. It is an ultra wide-angle lens that produces strong visual distortion. The distortion modeling and estimation of the fisheye lens are the crucial steps for fisheye lens calibration and image rectification in computer vision and close-range photography. There are two kinds of distortion: radial and tangential. Radial distortion is large for fisheye imaging and critical for the subsequent image processing. Although many researchers have developed calibration algorithms for radial distortion of fisheye lenses, quantitative evaluation of the correction performance has remained a challenge. This is the first paper that intuitively and objectively evaluates the performance of five different calibration algorithms. Up-to-date research on fisheye lens calibration is comprehensively reviewed to identify the research need. To differentiate their performance in terms of precision and ease of use, five methods are then tested using a diverse set of actual images of the checkerboard that are taken at Wuhan University, China under varying lighting conditions, shadows, and shooting angles. The method of the rational function model, which was generally used for wide-angle lens correction, outperforms the other methods. However, the one-parameter division model is easy for practical use without compromising too much precision, because it depends on the linear structure in the image and requires no preceding calibration. It is a tradeoff between correction precision and ease of use. By critically assessing the strengths and limitations of the existing algorithms, the paper provides valuable insight and guidelines for future practice and algorithm development that are important for fisheye lens calibration. It is promising for the optimal design of lens correction models that are suitable for the millions of portable imaging devices.
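The one-parameter division model favoured above for practical use is simple enough to sketch: a distorted point at radius r_d from the distortion centre maps to undistorted radius r_u = r_d / (1 + λ·r_d²). The λ value and centre below are assumed calibration outputs, used only for illustration.

```python
import numpy as np

def undistort_division(pts, center, lam):
    """One-parameter division model: map distorted image points to
    undistorted positions via r_u = r_d / (1 + lam * r_d**2)."""
    pts = np.asarray(pts, float) - center       # coordinates relative to distortion centre
    r2 = np.sum(pts ** 2, axis=1, keepdims=True)  # squared distorted radius per point
    return center + pts / (1.0 + lam * r2)      # scale each point radially, shift back

# Example: with lam > 0, points are pulled toward the centre,
# straightening the barrel distortion typical of fisheye imagery.
corrected = undistort_division([[3.0, 4.0]], np.array([0.0, 0.0]), 0.01)
```

A single scalar λ is the whole model, which is why the paper calls it a tradeoff: it is far easier to estimate than a rational function model, at some cost in correction precision.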

  13. Mobile sources critical review: 1998 NARSTO assessment

    NASA Astrophysics Data System (ADS)

    Sawyer, R. F.; Harley, R. A.; Cadle, S. H.; Norbeck, J. M.; Slott, R.; Bravo, H. A.

Mobile sources of air pollutants encompass a range of vehicle, engine, and fuel combinations. They emit both of the photochemical ozone precursors, hydrocarbons and oxides of nitrogen. The most important sources of hydrocarbons and oxides of nitrogen are light- and heavy-duty on-road vehicles and heavy-duty off-road vehicles, utilizing spark and compression ignition engines burning gasoline and diesel, respectively. Fuel consumption data provide a convenient starting point for assessing current and future emissions. Modern light-duty gasoline vehicles, when new, have very low emissions. The in-use fleet, due largely to emissions from a small "high emitter" fraction, has significantly larger emissions. Hydrocarbons and carbon monoxide are higher than reported in current inventories. Other gasoline-powered mobile sources (motorcycles, recreational vehicles, lawn, garden, and utility equipment, and light aircraft) have high emissions on a per quantity of fuel consumed basis, but their contribution to total emissions is small. Additional uncertainties in spatial and temporal distribution of emissions exist. Heavy-duty diesel vehicles are becoming the dominant mobile source of oxides of nitrogen. Oxides of nitrogen emissions may be greater than reported in current inventories, but the evidence for this is mixed. Oxides of nitrogen emissions on a fuel-consumed basis are much greater from diesel mobile sources than from gasoline mobile sources. This is largely the result of stringent control of gasoline vehicle emissions and lesser (heavy-duty trucks) or no (construction equipment, locomotives, ships) control of heavy-duty mobile sources. The use of alternative fuels, natural gas, propane, alcohols, and oxygenates in motor vehicles is increasing but remains small. Vehicles utilizing these fuels can be but are not necessarily cleaner than their gasoline or diesel counterparts.
Historical vehicle kilometers traveled growth rates of about 2% annually in both the United States

  14. Computer Software Training and HRD: What Are the Critical Issues?

    ERIC Educational Resources Information Center

    Altemeyer, Brad

    2005-01-01

The paper explores critical issues for HRD practice from a Parsonian framework across the HRD legs of organizational development, adult learning, and training and development. Insights into the critical issues emerge from this approach. Successful transfer of training is identified as critical for organizational, group, and individual success.…

  15. Integrating Critical Thinking into the Assessment of College Writing

    ERIC Educational Resources Information Center

    McLaughlin, Frost; Moore, Miriam

    2012-01-01

    When writing teachers at any level get together to assess student essays, they often disagree in their evaluations of the writing at hand. This is no surprise as writing is a complex process, and in evaluating it, teachers go through a complex sequence of thoughts before emerging with an overall assessment. Critical thinking, or the complexity of…

  16. Criticism and Assessment Applied to New Media Art

    ERIC Educational Resources Information Center

    Ursyn, Anna

    2015-01-01

This text examines educational criticism and assessment with an emphasis on the new media arts. The article shares with readers a versatile set of criteria, abridged to four points, based on research on assessment conducted with students, faculty, and non-art-related professionals, thus providing a preliminary tool for use in the classroom environment.…

  17. Establishing the Critical Elements That Determine Authentic Assessment

    ERIC Educational Resources Information Center

    Ashford-Rowe, Kevin; Herrington, Janice; Brown, Christine

    2014-01-01

    This study sought to determine the critical elements of an authentic learning activity, design them into an applicable framework and then use this framework to guide the design, development and application of work-relevant assessment. Its purpose was to formulate an effective model of task design and assessment. The first phase of the study…

  18. Guidelines for a Scientific Approach to Critical Thinking Assessment

    ERIC Educational Resources Information Center

    Bensley, D. Alan; Murtagh, Michael P.

    2012-01-01

    Assessment of student learning outcomes can be a powerful tool for improvement of instruction when a scientific approach is taken; unfortunately, many educators do not take full advantage of this approach. This article examines benefits of taking a scientific approach to critical thinking assessment and proposes guidelines for planning,…

  2. Transfer matrix computation of generalized critical polynomials in percolation

    NASA Astrophysics Data System (ADS)

    Scullard, Christian R.; Lykke Jacobsen, Jesper

    2012-12-01

Percolation thresholds have recently been studied by means of a graph polynomial PB(p), henceforth referred to as the critical polynomial, that may be defined on any periodic lattice. The polynomial depends on a finite subgraph B, called the basis, and the way in which the basis is tiled to form the lattice. The unique root of PB(p) in [0, 1] either gives the exact percolation threshold for the lattice, or provides an approximation that becomes more accurate with appropriately increasing size of B. Initially PB(p) was defined by a contraction-deletion identity, similar to that satisfied by the Tutte polynomial. Here, we give an alternative probabilistic definition of PB(p), which allows for much more efficient computations, by using the transfer matrix, than was previously possible with contraction-deletion. We present bond percolation polynomials for the (4, 8²), kagome, and (3, 12²) lattices for bases of up to respectively 96, 162 and 243 edges, much larger than the previous limit of 36 edges using contraction-deletion. We discuss in detail the role of the symmetries and the embedding of B. For the largest bases, we obtain the thresholds pc(4, 8²) = 0.676 803 329…, pc(kagome) = 0.524 404 998…, pc(3, 12²) = 0.740 420 798…, comparable to the best simulation results. We also show that the alternative definition of PB(p) can be applied to study site percolation problems. This article is part of ‘Lattice models and integrability’, a special issue of Journal of Physics A: Mathematical and Theoretical in honour of F Y Wu's 80th birthday.
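The final step of the method, finding the unique root of PB(p) in [0, 1], is straightforward once the polynomial is known. The sketch below uses simple bisection on the well-known smallest-basis critical polynomial for bond percolation on the triangular lattice, 1 − 3p + p³, where contraction-deletion already gives the exact threshold pc = 2 sin(π/18); the paper's transfer-matrix bases are of course far larger than this toy example.

```python
def critical_poly_root(P, lo=0.0, hi=1.0, tol=1e-12):
    """Bisection for the unique root of a critical polynomial in [0, 1]."""
    assert P(lo) * P(hi) < 0  # root is bracketed
    while hi - lo > tol:
        mid = 0.5 * (lo + hi)
        if P(lo) * P(mid) <= 0:
            hi = mid
        else:
            lo = mid
    return 0.5 * (lo + hi)

# Smallest-basis critical polynomial for bond percolation on the
# triangular lattice: P_B(p) = 1 - 3p + p^3, whose root in [0, 1]
# is the exact threshold 2*sin(pi/18) ~ 0.347296.
pc = critical_poly_root(lambda p: 1 - 3 * p + p ** 3)
```

For lattices like (4, 8²) the polynomial from a large basis is only an approximant, but the root-finding step is identical; accuracy comes from growing the basis, not from the solver.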

  3. Computer-Based Cognitive Rehabilitation Interventions for Traumatic Brain Injury: A Critical Review of the Literature.

    PubMed

    Fetta, Joseph; Starkweather, Angela; Gill, Jessica M

    2017-08-01

    Computer-based interventions have been developed to improve cognitive performance after mild traumatic brain injury; however, a thorough evaluation of this body of research has not been addressed in the literature. The aim of this study was to provide a synthesis and critical review of current research studies that have tested the efficacy of computer-based interventions on cognitive performance after mild traumatic brain injury. A critical review was conducted by identifying relevant studies in the electronic databases PubMed/MEDLINE, PsycInfo, and CINAHL from 2011 to the present. Because of the limited number of publications focused exclusively on mild traumatic brain injury, research studies that assessed the impact of computer-based interventions on cognitive outcomes in populations with acquired brain injury were included. Of the 58 studies identified, only 10 publications included participants with mild traumatic brain injury. Overall, the identified studies did not use a standard method for assessing the severity of traumatic brain injury, and many studies included participants with a wide variety of etiologies for acquired brain injury and used multiple measures of cognitive performance, which made comparisons difficult across studies. In addition to small sample sizes, the study samples were heterogeneous in regard to the number of previous traumatic brain injuries, time elapsed since injury, and age and gender distributions. Preinjury comorbidities that may affect cognitive performance, such as depression, anxiety, or learning disabilities, were often not assessed. There is weak evidence that computer-based interventions can improve working memory and cognitive function in individuals after mild traumatic brain injury. Because of the low-quality evidence, seminal questions remain regarding the optimal format, dosage, timing, and duration of computer-based intervention for improving cognitive performance. Future studies should focus on using a strong

  4. An Exploration of Three-Dimensional Integrated Assessment for Computational Thinking

    ERIC Educational Resources Information Center

    Zhong, Baichang; Wang, Qiyun; Chen, Jie; Li, Yi

    2016-01-01

    Computational thinking (CT) is a fundamental skill for students, and assessment is a critical factor in education. However, there is a lack of effective approaches to CT assessment. Therefore, we designed the Three-Dimensional Integrated Assessment (TDIA) framework in this article. The TDIA has two aims: one was to integrate three dimensions…

  6. Effects of Computer-Aided Personalized System of Instruction in Developing Knowledge and Critical Thinking in Blended Learning Courses

    ERIC Educational Resources Information Center

    Svenningsen, Louis; Pear, Joseph J.

    2011-01-01

    Two experiments were conducted to assess an online version of Keller's personalized system of instruction, called computer-aided personalized system of instruction (CAPSI), as part of a blended learning design with regard to course knowledge and critical thinking development. In Experiment 1, two lecture sections of an introduction to University…

  7. Critical issues using brain-computer interfaces for augmentative and alternative communication.

    PubMed

    Hill, Katya; Kovacs, Thomas; Shin, Sangeun

    2015-03-01

    Brain-computer interfaces (BCIs) may potentially be of significant practical value to patients in advanced stages of amyotrophic lateral sclerosis and locked-in syndrome for whom conventional augmentative and alternative communication (AAC) systems, which require some measure of consistent voluntary muscle control, are not satisfactory options. However, BCIs have primarily been used for communication in laboratory research settings. This article discusses 4 critical issues that should be addressed as BCIs are translated out of laboratory settings to become fully functional BCI/AAC systems that may be implemented clinically. These issues include (1) identification of primary, secondary, and tertiary system features; (2) integrating BCI/AAC systems in the World Health Organization's International Classification of Functioning, Disability and Health framework; (3) implementing language-based assessment and intervention; and (4) performance measurement. A clinical demonstration project is presented as an example of research beginning to address these critical issues.

  8. Alcohol assessment using wireless handheld computers

    PubMed Central

    Bernhardt, Jay M.; Usdan, Stuart; Mays, Darren; Arriola, Kimberly Jacob; Martin, Ryan J.; Cremeens, Jennifer; McGill, Tia; Weitzel, Jessica Aungst

    2007-01-01

    The present study sought to test the feasibility of measuring quantity and frequency of self-reported alcohol consumption among college students using the Handheld Assisted Network Diary (HAND) by comparing results to a retrospective Timeline Followback (TLFB). A total of 40 undergraduate college students completed a HAND assessment during the two-week study period and completed a TLFB at follow-up. The HAND recorded similar levels of alcohol consumption compared to the TLFB. There were no significant differences in overall alcohol consumption, drinks per drinking day, or heavy drinking days between the two methods of assessment. Handheld computers may represent a useful tool for assessing daily alcohol use among college students. PMID:17499442

  9. None but Ourselves Can Free Our Minds: Critical Computational Literacy as a Pedagogy of Resistance

    ERIC Educational Resources Information Center

    Lee, Clifford H.; Soep, Elisabeth

    2016-01-01

    Critical computational literacy (CCL) is a new pedagogical and conceptual framework that combines the strengths of critical literacy and computational thinking. Through CCL, young people conceptualize, create, and disseminate digital projects that break silences, expose important truths, and challenge unjust systems, all the while building skills…

  11. VOXMAT: Hybrid Computational Phantom for Dose Assessment

    SciTech Connect

    Akkurt, Hatice; Eckerman, Keith F

    2007-01-01

    The Oak Ridge National Laboratory (ORNL) computational phantoms have been the standard for assessing the radiation dose due to internal and external exposure over the past three decades. In these phantoms, the body surface and each organ are approximated by mathematical equations; hence, some of the organs are not necessarily realistic in their shape. Over the past two decades, these phantoms have been revised and updated: some of the missing internal organs have been added and the locations of the existing organs have been revised (e.g., thyroid). In the original phantom, only three elemental compositions were used to describe all body tissues. Recently, the compositions of the organs have been updated based on ICRP-89 standards. During the past decade, phantoms based on CT scans were developed for use in dose assessment. Although their shapes are realistic, some computational challenges are noted, including increased computation times and memory requirements. For good spatial resolution, more than several million voxels are used to represent the human body. Moreover, when CT scans are obtained, the subject is in a supine position with arms at the side. In some occupational exposure cases, it is necessary to evaluate the dose with the arms and legs in different positions. It would be very difficult and inefficient to reposition the voxels defining the arms and legs to simulate these exposure geometries. In this paper, a new approach for computational phantom development is presented. This approach utilizes the combination of a mathematical phantom and a voxelized phantom for the representation of the anatomy.

  12. Computed tomography assessment of reinforced concrete

    SciTech Connect

    Martz, H.E.; Schneberk, D.J.; Roberson, G.P.; Monteiro, J.M.

    1991-05-24

    Gamma-ray computed tomography (CT) is a potentially powerful nondestructive method for assessing the degree of distress that exists in reinforced-concrete structures. In a study to determine the feasibility of using CT to inspect reinforced-concrete specimens, we verified that CT can quantitatively image the internal details of reinforced concrete. To assess the accuracy of CT in determining voids and cracks, we inspected two fiber-reinforced concrete cylinders (one loaded and one unloaded) and a third cylinder containing a single reinforcing bar (rebar). To evaluate the accuracy of CT in establishing the location of reinforcing bars, we also inspected a concrete block containing rebars with different diameters. The results indicate that CT was able to resolve the many different phases in reinforced concrete (voids, cracks, rebars, and concrete) with great accuracy. 15 refs., 10 figs.

  13. Assessing knowledge change in computer science

    NASA Astrophysics Data System (ADS)

    Gradwohl Nash, Jane; Bravaco, Ralph J.; Simonson, Shai

    2006-03-01

    The purpose of this study was to assess structural knowledge change across a two-week workshop designed to provide high-school teachers with training in Java and Object Oriented Programming. Both before and after the workshop, teachers assigned relatedness ratings to pairs of key concepts regarding Java and Object Oriented Programming. Their ratings were submitted to the Pathfinder network-scaling algorithm, which uses distance estimates to generate an individual's knowledge structure representation composed of nodes that are connected by links. Results showed that significant change in teachers' knowledge structure occurred during the workshop, both in terms of individual teacher networks and their averaged networks. Moreover, these changes were significantly related to performance in the workshop. The results of this study suggest several implications for teaching and assessment in computer science.
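    The Pathfinder scaling step described above can be made concrete: a link between two concepts survives only if no indirect path offers a smaller distance, where (under the common r = infinity, q = n - 1 parameterization) the distance of a path is the maximum link weight along it. A minimal sketch, with an illustrative function and made-up ratings rather than the authors' implementation:

```python
import numpy as np

def pathfinder_network(dist):
    """Keep edge (i, j) only if no indirect path has a smaller
    maximum-link distance (r = infinity, q = n - 1)."""
    n = len(dist)
    d = np.array(dist, dtype=float)
    best = d.copy()
    # Floyd-Warshall variant: a path's "length" is its largest edge weight
    for k in range(n):
        for i in range(n):
            for j in range(n):
                best[i, j] = min(best[i, j], max(best[i, k], best[k, j]))
    # An edge survives iff its direct distance matches the best path distance
    return (d <= best) & (d < np.inf) & ~np.eye(n, dtype=bool)

# Distances derived from relatedness ratings (lower = more related)
d = [[0, 1, 3],
     [1, 0, 1],
     [3, 1, 0]]
adj = pathfinder_network(d)
```

Here the direct link between concepts 0 and 2 (distance 3) is pruned, because the path through concept 1 has a maximum link weight of only 1.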

  14. Antiracist Education in Theory and Practice: A Critical Assessment

    ERIC Educational Resources Information Center

    Niemonen, Jack

    2007-01-01

    "Antiracist Education in Theory and Practice: A Critical Assessment" As a set of pedagogical, curricular, and organizational strategies, antiracist education claims to be the most progressive way today to understand race relations. Constructed from whiteness studies and the critique of colorblindness, its foundational core is located in…

  15. What Is a Good School? Critical Thoughts about Curriculum Assessments

    ERIC Educational Resources Information Center

    Zierer, Klaus

    2013-01-01

    Within the educational field, measurements such as the Programme for International Student Assessment (PISA), the Trends in International Mathematics and Science Study (TIMSS), and the Progress in International Reading Literacy Study (PIRLS) suggest we are living in a time of competition. This article takes a critical view of the modern drive to…

  16. Fostering and Assessing Critical Listening Skills in the Speech Course

    ERIC Educational Resources Information Center

    Ferrari-Bridgers, Franca; Vogel, Rosanne; Lynch, Barbara

    2017-01-01

    In this article we present the results of two listening assessments conducted in spring 2013 and fall 2013. Our primary goal is of a pedagogical nature and is concerned with the design and the testing of a tool that could measure students' critical listening skill improvement during the span of a semester. A total of N = 370 students participated…

  17. Complicating Canons: A Critical Literacy Challenge to Common Core Assessment

    ERIC Educational Resources Information Center

    Peel, Anne

    2017-01-01

    The widespread adoption of the Common Core State Standards in the US has prioritized rigorous reading of complex texts. The emphasis on text complexity has led to instructional and assessment materials that constrain critical literacy practices by emphasizing quantitative features of text, such as sentence length, and a static list of text…

  18. Implementation and Critical Assessment of the Flipped Classroom Experience

    ERIC Educational Resources Information Center

    Scheg, Abigail G., Ed.

    2015-01-01

    In the past decade, traditional classroom teaching models have been transformed in order to better promote active learning and learner engagement. "Implementation and Critical Assessment of the Flipped Classroom Experience" seeks to capture the momentum of non-traditional teaching methods and provide a necessary resource for individuals…

  2. Computational Tools to Assess Turbine Biological Performance

    SciTech Connect

    Richmond, Marshall C.; Serkowski, John A.; Rakowski, Cynthia L.; Strickler, Brad; Weisbeck, Molly; Dotson, Curtis L.

    2014-07-24

    Public Utility District No. 2 of Grant County (GCPUD) operates the Priest Rapids Dam (PRD), a hydroelectric facility on the Columbia River in Washington State. The dam contains 10 Kaplan-type turbine units that are now more than 50 years old. Plans are underway to refit these aging turbines with new runners. The Columbia River at PRD is a migratory pathway for several species of juvenile and adult salmonids, so passage of fish through the dam is a major consideration when upgrading the turbines. In this paper, a method for turbine biological performance assessment (BioPA) is demonstrated. Using this method, a suite of biological performance indicators is computed based on simulated data from a CFD model of a proposed turbine design. Each performance indicator is a measure of the probability of exposure to a certain dose of an injury mechanism. Using known relationships between the dose of an injury mechanism and frequency of injury (dose–response) from laboratory or field studies, the likelihood of fish injury for a turbine design can be computed from the performance indicator. By comparing the values of the indicators from proposed designs, the engineer can identify the more-promising alternatives. We present an application of the BioPA method for baseline risk assessment calculations for the existing Kaplan turbines at PRD that will be used as the minimum biological performance that a proposed new design must achieve.
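    The core BioPA calculation lends itself to a one-line sketch: weight the probability of exposure to each dose of an injury mechanism by the injury frequency that laboratory dose–response studies report for that dose. The bins and numbers below are invented for illustration, not taken from the PRD study:

```python
def injury_likelihood(exposure_probs, dose_response):
    """Combine the probability of exposure to each dose bin with the
    injury frequency (dose-response) for that bin to estimate the
    overall likelihood of fish injury for a turbine design."""
    return sum(p * r for p, r in zip(exposure_probs, dose_response))

# Hypothetical dose bins: P(exposure to dose) and P(injury | dose)
exposure = [0.70, 0.25, 0.05]   # low, medium, high dose
response = [0.01, 0.10, 0.60]   # injury frequency from lab dose-response data
risk = injury_likelihood(exposure, response)  # 0.7*0.01 + 0.25*0.1 + 0.05*0.6
```

Comparing this expected injury rate across candidate runner designs is what allows the engineer to rank the more-promising alternatives.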

  3. Computer Assessment of Mild Cognitive Impairment

    PubMed Central

    Saxton, Judith; Morrow, Lisa; Eschman, Amy; Archer, Gretchen; Luther, James; Zuccolotto, Anthony

    2009-01-01

    Many older individuals experience cognitive decline with aging. The causes of cognitive dysfunction range from the devastating effects of Alzheimer’s disease (AD) to treatable causes of dysfunction and the normal mild forgetfulness described by many older individuals. Even mild cognitive dysfunction can impact medication adherence, impair decision making, and affect the ability to drive or work. However, primary care physicians do not routinely screen for cognitive difficulties and many older patients do not report cognitive problems. Identifying cognitive impairment at an office visit would permit earlier referral for diagnostic work-up and treatment. The Computer Assessment of Mild Cognitive Impairment (CAMCI) is a self-administered, user-friendly computer test that scores automatically and can be completed independently in a quiet space, such as a doctor’s examination room. The goal of this study was to compare the sensitivity and specificity of the CAMCI and the Mini Mental State Examination (MMSE) to identify mild cognitive impairment (MCI) in 524 nondemented individuals > 60 years old who completed a comprehensive neuropsychological and clinical assessment together with the CAMCI and MMSE. We hypothesized that the CAMCI would exhibit good sensitivity and specificity and would be superior compared with the MMSE in these measures. The results indicated that the MMSE was relatively insensitive to MCI. In contrast, the CAMCI was highly sensitive (86%) and specific (94%) for the identification of MCI in a population of community-dwelling nondemented elderly individuals. PMID:19332976

  4. An Analysis of Mission Critical Computer Software in Naval Aviation

    DTIC Science & Technology

    1991-03-01

    Configuration Control - Engineering Changes, Deviations and Waivers, 15 July 1988; Department of the Navy Tactical Digital Standard A, Standard Definitions...for Embedded Computer Resources in Tactical Digital Systems, 2 July 1980; Department of the Navy Tactical Digital Standard D Revision 1, Reserve...Computer Resources in Department of the Navy Systems, 11 June 1979. 68 Department of the Navy, Tactical Digital Standard (TADSTANDS) B, Standard

  5. Critical Problems in Very Large Scale Computer Systems

    DTIC Science & Technology

    1990-09-30

    Paper . I [23] William J. Dally. Network and processor architecture for message-driven computers. In VLSI and Parallel Computation, pages 140-222...minimize the number of widely-shared objects. This paper proposes the LimitLESS cache coherence protocol, which realizes the performance I 2 I of the...nodes communicate via messages through a direct network [21] with a mesh topology using wormhole routing [22]. A single-chip controller on each node

  6. A critical review of population health literacy assessment.

    PubMed

    Guzys, Diana; Kenny, Amanda; Dickson-Swift, Virginia; Threlkeld, Guinever

    2015-03-04

    Defining health literacy from a public health perspective places greater emphasis on the knowledge and skills required to prevent disease and for promoting health in everyday life. Addressing health literacy at the community level provides great potential for improving health knowledge, skills and behaviours resulting in better health outcomes. Yet there is a notable absence of discussion in the literature of what a health literate population looks like, or how this is best assessed. The emphasis in assessing health literacy has predominantly focused on the functional health literacy of individuals in clinical settings. This review examines currently available health literacy assessment tools to identify how well suited they are in addressing health literacy beyond clinical care settings and beyond the individual. Although public health literature appears to place greater emphasis on conceptualizing critical health literacy, the focus continues to remain on assessing individuals, rather than on health literacy within the context of families, communities and population groups. When a population approach is adopted, an aggregate of individual health literacy assessment is generally used. Aggregation of individual health literacy fails to capture the dynamic and often synergistic relationships within communities, and fails to reflect societal influences on health knowledge, beliefs and behaviours. We hypothesise that a different assessment framework is required to adequately address the complexities of community health literacy. We assert that a public health approach, founded on health promotion theories, provides a useful scaffold to assess the critical health literacy of population groups. It is proposed that inclusion of community members in the research process is a necessary requirement to coproduce such an appropriate assessment framework. We contend that health literacy assessment and potential interventions need to shift to promoting the knowledge and skills

  7. Computer-Supported Development of Critical Reasoning Skills

    ERIC Educational Resources Information Center

    Spurrett, David

    2005-01-01

    Thinking skills are important and education is expected to develop them. Empirical results suggest that formal education makes a modest and largely indirect difference. This paper will describe the early stages of an ongoing curriculum initiative in the teaching of critical reasoning skills in the philosophy curriculum on the Howard College Campus…

  8. Critical fault patterns determination in fault-tolerant computer systems

    NASA Technical Reports Server (NTRS)

    Mccluskey, E. J.; Losq, J.

    1978-01-01

    The method proposed tries to enumerate all the critical fault-patterns (successive occurrences of failures) without analyzing every single possible fault. The conditions for the system to be operating in a given mode can be expressed in terms of the static states. Thus, one can find all the system states that correspond to a given critical mode of operation. The next step consists in analyzing the fault-detection mechanisms, the diagnosis algorithm and the process of switch control. From them, one can find all the possible system configurations that can result from a failure occurrence. Thus, one can list all the characteristics, with respect to detection, diagnosis, and switch control, that failures must have to constitute critical fault-patterns. Such an enumeration of the critical fault-patterns can be directly used to evaluate the overall system tolerance to failures. Present research is focused on how to efficiently make use of these system-level characteristics to enumerate all the failures that verify these characteristics.

  9. Laptop Computer - Based Facial Recognition System Assessment

    SciTech Connect

    R. A. Cain; G. B. Singleton

    2001-03-01

    The objective of this project was to assess the performance of the leading commercial-off-the-shelf (COTS) facial recognition software package when used as a laptop application. We performed the assessment to determine the system's usefulness for enrolling facial images in a database from remote locations and conducting real-time searches against a database of previously enrolled images. The assessment involved creating a database of 40 images and conducting 2 series of tests to determine the product's ability to recognize and match subject faces under varying conditions. This report describes the test results and includes a description of the factors affecting the results. After an extensive market survey and a review of the Facial Recognition Vendor Test 2000 (FRVT 2000), we selected Visionics' FaceIt® software package for evaluation. The FRVT 2000 was co-sponsored by the US Department of Defense (DOD) Counterdrug Technology Development Program Office, the National Institute of Justice, and the Defense Advanced Research Projects Agency (DARPA). Administered in May-June 2000, the FRVT 2000 assessed the capabilities of facial recognition systems that were currently available for purchase on the US market. Our selection of this Visionics product does not indicate that it is the "best" facial recognition software package for all uses; it was the most appropriate package for the specific applications and requirements of this assessment. In this assessment, the system configuration was evaluated for effectiveness in identifying individuals by searching for facial images captured from video displays against those stored in a facial image database. An additional criterion was that the system be capable of operating discreetly. For this application, an operational facial recognition system would consist of one central computer hosting the master image database with multiple standalone systems configured with duplicates of the master operating in

  10. Critical Evaluation of Thermodynamic Properties for Halobenzoic Acids Through Consistency Analyses for Results from Experiment and Computational Chemistry

    NASA Astrophysics Data System (ADS)

    Chirico, Robert D.; Kazakov, Andrei; Bazyleva, Ala; Diky, Vladimir; Kroenlein, Kenneth; Emel′yanenko, Vladimir N.; Verevkin, Sergey P.

    2017-06-01

    Thermodynamic properties of the twelve monohalobenzoic acids are critically evaluated through the application of computational chemistry methods for the ideal-gas phase and thermodynamic consistency assessment of properties determined experimentally and reported in the literature, including enthalpies of combustion, enthalpies of sublimation, and enthalpies of fusion. The compounds of interest are the 2-, 3-, and 4-halo isomers of fluoro-, chloro-, bromo-, and iodobenzoic acids. Computations were validated by comparison with critically evaluated entropies and heat capacities in the ideal-gas state for benzoic acid, benzene, and some halobenzenes. Experimental enthalpies of formation for 2- and 3-bromobenzoic acids, measured by well-established research groups, are mutually inconsistent and further, are shown to be inconsistent with the computations and assessment in this work. Origins of the discrepancies are unknown, and recommended values for these compounds are based on computations and enthalpies of sublimation validated, in part, by a structure-property (i.e., group-additivity) analysis. Lesser, but significant, inconsistencies between experimental and computed results are demonstrated also for 3- and 4-iodobenzoic acids. The comparison of enthalpies of formation based on the experiment and computation for the ideal-gas state of 1- and 2-chloro-, bromo-, and iodonaphthalenes provides additional support for the findings for halobenzoic acids and also reveals some anomalous results in the experimental literature for chloronaphthalenes. Computations are discussed in detail to demonstrate the approach required to obtain optimal results with modern quantum chemical methods.
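    The consistency test underlying such an assessment is arithmetically simple: the ideal-gas enthalpy of formation derived from experiment (the crystal-phase value from combustion calorimetry plus the enthalpy of sublimation) should agree with the quantum-chemical value within the combined uncertainty. A sketch with invented numbers, not data for any real halobenzoic acid:

```python
def gas_phase_enthalpy(dHf_cr, dHsub):
    """Ideal-gas enthalpy of formation: crystal-phase value from
    combustion calorimetry plus the enthalpy of sublimation."""
    return dHf_cr + dHsub

def consistent(experimental, computed, tolerance):
    """Flag an inconsistency when experiment and computation disagree
    by more than the combined uncertainty."""
    return abs(experimental - computed) <= tolerance

# Illustrative values only (kJ/mol)
dHf_gas_exp = gas_phase_enthalpy(dHf_cr=-390.0, dHsub=95.0)   # -295.0
ok = consistent(dHf_gas_exp, computed=-297.5, tolerance=4.0)  # True
```

When this check fails for values reported by otherwise reliable groups, as for 2- and 3-bromobenzoic acids above, the computed value plus a validated sublimation enthalpy becomes the recommended basis.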

  11. Antibiotic prophylaxis and reflux: critical review and assessment

    PubMed Central

    Baquerizo, Bernarda Viteri

    2014-01-01

    The use of continuous antibiotic prophylaxis (CAP) was critical in the evolution of vesicoureteral reflux (VUR) from a condition in which surgery was the standard of treatment to its becoming a medically managed condition. The efficacy of antibiotic prophylaxis in the management of VUR has been challenged in recent years, and significant confusion exists as to its clinical value. This review summarizes the critical factors in the history, use, and investigation of antibiotic prophylaxis in VUR. This review provides suggestions for assessing the potential clinical utility of prophylaxis. PMID:25580258

  12. Assessing Terrorist Motivations for Attacking Critical "Chemical" Infrastructure

    SciTech Connect

    Ackerman, G; Bale, J; Moran, K

    2004-12-14

    Certain types of infrastructure--critical infrastructure (CI)--play vital roles in underpinning our economy, security, and way of life. One particular type of CI--that relating to chemicals--constitutes both an important element of our nation's infrastructure and a particularly attractive set of potential targets. This is primarily because of the large quantities of toxic industrial chemicals (TICs) it employs in various operations and because of the essential economic functions it serves. This study attempts to minimize some of the ambiguities that presently impede chemical infrastructure threat assessments by providing new insight into the key motivational factors that affect terrorist organizations' propensity to attack chemical facilities. Prepared as a companion piece to the Center for Nonproliferation Studies' August 2004 study--''Assessing Terrorist Motivations for Attacking Critical Infrastructure''--it investigates three overarching research questions: (1) why do terrorists choose to attack chemical-related infrastructure over other targets; (2) what specific factors influence their target selection decisions concerning chemical facilities; and (3) which, if any, types of groups are most inclined to attack chemical infrastructure targets? The study involved a multi-pronged research design, which made use of four discrete investigative techniques to answer the above questions as comprehensively as possible. These include: (1) a review of terrorism and threat assessment literature to glean expert consensus regarding terrorist interest in targeting chemical facilities; (2) the preparation of case studies to help identify internal group factors and contextual influences that have played a significant role in leading some terrorist groups to attack chemical facilities; (3) an examination of data from the Critical Infrastructure Terrorist Incident Catalog (CrITIC) to further illuminate the nature of terrorist attacks against chemical facilities to date; and (4

  13. Fool's Gold: A Critical Look at Computers in Childhood.

    ERIC Educational Resources Information Center

    Cordes, Colleen, Ed.; Miller, Edward, Ed.

    Noting that computers are reshaping children's lives in profound and unexpected ways, this report examines potential harms and promised benefits of these changes, focusing on early childhood and elementary education. Chapter 1 argues that popular attempts to hurry children intellectually are at odds with the natural pace of human development.…

  14. Quality assessment of clinical computed tomography

    NASA Astrophysics Data System (ADS)

    Berndt, Dorothea; Luckow, Marlen; Lambrecht, J. Thomas; Beckmann, Felix; Müller, Bert

    2008-08-01

    Three-dimensional images are vital for diagnosis in dentistry and cranio-maxillofacial surgery. Artifacts caused by highly absorbing components such as metallic implants, however, limit the value of the tomograms. The dominant artifacts observed are blowout and streaks. By investigating the artifacts generated by metallic implants in a pig jaw, data acquisition for patients in dentistry can be optimized in a quantitative manner. A freshly explanted pig jaw including related soft tissues served as a model system. Images were recorded varying the accelerating voltage and the beam current. Comparison with multi-slice and micro computed tomography (CT) helps to validate the approach with the dental CT system (3D-Accuitomo, Morita, Japan). The data are rigidly registered to comparatively quantify their quality. The micro CT data provide a reasonable standard for quantitative data assessment of clinical CT.

  15. Assessing nutritional status in chronically critically ill adult patients.

    PubMed

    Higgins, Patricia A; Daly, Barbara J; Lipson, Amy R; Guo, Su-Er

    2006-03-01

    Numerous methods are used to measure and assess nutritional status of chronically critically ill patients. To discuss the multiple methods used to assess nutritional status in chronically critically ill patients, describe the nutritional status of chronically critically ill patients, and assess the relationship between nutritional indicators and outcomes of mechanical ventilation. A descriptive, longitudinal design was used to collect weekly data on 360 adult patients who required more than 72 hours of mechanical ventilation and had a hospital stay of 7 days or more. Data on body mass index and biochemical markers of nutritional status were collected. Patients' nutritional intake compared with physicians' orders, dieticians' recommendations, and indirect calorimetry and physicians' orders compared with dieticians' recommendations were used to assess nutritional status. Relationships between nutritional indicators and variables of mechanical ventilation were determined. Inconsistencies among nurses' implementation, physicians' orders, and dieticians' recommendations resulted in wide variations in patients' calculated nutritional adequacy. Patients received a mean of 83% of the energy intake ordered by their physicians (SD 33%, range 0%-200%). Patients who required partial or total ventilator support upon discharge had a lower body mass index at admission than did patients with spontaneous respirations (Mann-Whitney U = 8441, P = .001). In this sample, the variability in weaning progression and outcomes most likely reflects illness severity and complexity rather than nutritional status or nutritional therapies. Further studies are needed to determine the best methods to define nutritional adequacy and to evaluate nutritional status.

  16. Factors confounding the assessment of reflection: a critical review

    PubMed Central

    2011-01-01

    Background Reflection on experience is an increasingly critical part of professional development and lifelong learning. There is, however, continuing uncertainty about how best to put principle into practice, particularly as regards assessment. This article explores those uncertainties in order to find practical ways of assessing reflection. Discussion We critically review four problems: 1. Inconsistent definitions of reflection; 2. Lack of standards to determine (in)adequate reflection; 3. Factors that complicate assessment; 4. Internal and external contextual factors affecting the assessment of reflection. Summary To address the problem of inconsistency, we identified processes that were common to a number of widely quoted theories and synthesised a model, which yielded six indicators that could be used in assessment instruments. We arrived at the conclusion that, until further progress has been made in defining standards, assessment must depend on developing and communicating local consensus between stakeholders (students, practitioners, teachers, supervisors, curriculum developers) about what is expected in exercises and formal tests. Major factors that complicate assessment are the subjective nature of reflection's content and the dependency on descriptions by persons being assessed about their reflection process, without any objective means of verification. To counter these validity threats, we suggest that assessment should focus on generic process skills rather than the subjective content of reflection and where possible to consider objective information about the triggering situation to verify described reflections. Finally, internal and external contextual factors such as motivation, instruction, character of assessment (formative or summative) and the ability of individual learning environments to stimulate reflection should be considered. PMID:22204704

  17. Exact computation of the critical exponents of the jamming transition

    NASA Astrophysics Data System (ADS)

    Zamponi, Francesco

    2015-03-01

    The jamming transition marks the emergence of rigidity in a system of amorphous and athermal grains. It is characterized by a divergent correlation length of the force-force correlation and non-trivial critical exponents that are independent of spatial dimension, suggesting that a mean field theory can correctly predict their values. I will discuss a mean field approach to the problem based on the exact solution of the hard sphere model in infinite dimension. An unexpected analogy with the Sherrington-Kirkpatrick spin glass model emerges in the solution: as in the SK model, the glassy states turn out to be marginally stable, and are described by a Parisi equation. Marginal stability has a deep impact on the critical properties of the jamming transition and allows one to obtain analytic predictions for the critical exponents. The predictions are consistent with a recently developed scaling theory of the jamming transition, and with numerical simulations. Finally, I will briefly discuss some possible extensions of this approach to other open issues in the theory of glasses.
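
    For context (the numerical values below are not quoted in this record; they are the values reported for the exact infinite-dimensional mean-field solution referred to in the abstract), marginal stability links the force-distribution exponent, with P(f) ~ f^θ, to the gap-distribution exponent, with g(h) ~ h^(−γ), via a scaling relation:

    ```latex
    \gamma = \frac{1}{2+\theta}, \qquad \theta \simeq 0.42311, \qquad \gamma \simeq 0.41269
    ```

    These dimension-independent exponents are the ones found consistent with the scaling theory and with numerical simulations mentioned in the abstract.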

  18. Critical thinking: assessing the risks to the future security of supply of critical metals

    NASA Astrophysics Data System (ADS)

    Gunn, Gus

    2015-04-01

    Increasing world population, the spread of prosperity across the globe and the demands of new technologies have led to a revival of concerns about the availability of raw materials needed by society. Despite scare stories about resource depletion, physical exhaustion of minerals is considered to be unlikely. However, we do need to know which materials might be of concern so that we can develop strategies to secure adequate supplies and to mitigate the effects of supply disruption. This requirement has led to renewed interest in criticality, a term that is generally used to refer to metals and minerals of high economic importance that have a relatively high likelihood of supply disruption. The European Union (EU) developed a quantitative methodology for the assessment of criticality which led to the definition of 14 raw materials as critical to the EU economy (EC, 2010). This has succeeded in raising awareness of potential supply issues and in helping to prioritise requirements for new policies and supporting research. The EU has recently assessed a larger number of candidate materials of which 20 are now identified as critical to the EU (EC, 2014). These include metals such as indium, mostly used in flat-screen displays, antimony for flame retardants and cobalt for rechargeable batteries, alloys and a host of other products. Although there is no consensus on the methodology for criticality assessments and broad analyses at this scale are inevitably imperfect, they can, nevertheless, provide early warning of supply problems. However, in order to develop more rigorous and dynamic assessments of future availability detailed analysis of the whole life-cycle of individual metals to identify specific problems and develop appropriate solutions is required. New policies, such as the Raw Materials Initiative (2008) and the European Innovation Partnership on Raw Materials (2013), have been developed by the European Commission (EC) and are aimed at securing sustainable

  19. Adaptive critic design for computer intrusion detection system

    NASA Astrophysics Data System (ADS)

    Novokhodko, Alexander; Wunsch, Donald C., II; Dagli, Cihan H.

    2001-03-01

This paper summarizes ongoing research. A neural network is used to detect computer system intrusions based on data from the system audit trail generated by the Solaris Basic Security Module. The data have been provided by Lincoln Labs, MIT. The system alerts the human operator when it encounters suspicious activity logged in the audit trail. To reduce the false alarm rate and accommodate the temporal uncertainty of the moment of attack, a reinforcement learning approach is chosen to train the network.
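
    The record gives no implementation detail, but the core idea of training a detector from a reinforcement signal rather than labeled examples can be sketched in a toy form. Everything below (event categories, the reward scheme, the tabular learner) is a hypothetical illustration, not the adaptive critic architecture or audit-trail features used in the study:

    ```python
    import random

    # States: coarse audit-event categories; actions: 0 = ignore, 1 = alert.
    EVENTS = ["login", "file_access", "priv_escalation", "port_scan"]
    MALICIOUS = {"priv_escalation", "port_scan"}  # ground truth for this toy run

    q = {(e, a): 0.0 for e in EVENTS for a in (0, 1)}
    alpha, epsilon = 0.1, 0.2
    random.seed(0)

    for step in range(5000):
        event = random.choice(EVENTS)
        # Epsilon-greedy action selection
        if random.random() < epsilon:
            action = random.choice((0, 1))
        else:
            action = max((0, 1), key=lambda a: q[(event, a)])
        # Reward shaping penalizes false alarms as well as missed attacks,
        # mirroring the paper's stated goal of a low false alarm rate.
        attack = event in MALICIOUS
        if action == 1:
            reward = 1.0 if attack else -1.0
        else:
            reward = -1.0 if attack else 0.0
        # One-step (bandit-style) value update toward the observed reward
        q[(event, action)] += alpha * (reward - q[(event, action)])

    # Greedy alarm policy learned from the reinforcement signal
    policy = {e: max((0, 1), key=lambda a: q[(e, a)]) for e in EVENTS}
    print(policy)
    ```

    After training, the greedy policy alerts on the malicious event categories and ignores benign ones, having never been shown explicit labels, only per-decision rewards.
    
    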

  20. Geospatial decision support framework for critical infrastructure interdependency assessment

    NASA Astrophysics Data System (ADS)

    Shih, Chung Yan

Critical infrastructures, such as telecommunications, energy, banking and finance, transportation, water systems and emergency services are the foundations of modern society. There is a heavy dependence on critical infrastructures at multiple levels within the supply chain of any good or service. Any disruption in the supply chain may cause profound cascading effects on other critical infrastructures. A 1997 report by the President's Commission on Critical Infrastructure Protection states that a serious interruption in freight rail service would bring the coal mining industry to a halt within approximately two weeks and the availability of electric power could be reduced in a matter of one to two months. Therefore, this research aimed to represent and assess the interdependencies between coal supply, transportation and energy production. A proposed geospatial decision support framework was established and applied to analyze interdependency-related disruption impact. By utilizing the data warehousing approach, geospatial and non-geospatial data were retrieved, integrated and analyzed based on the transportation model and geospatial disruption analysis developed in the research. The results showed that by utilizing this framework, disruption impacts can be estimated at various levels (e.g., power plant, county, state, etc.) for preventative or emergency response efforts. The information derived from the framework can be used for data mining analysis (e.g., assessing transportation mode usages; finding alternative coal suppliers, etc.).

  1. Assessing Computer Knowledge among College Students.

    ERIC Educational Resources Information Center

    Parrish, Allen; And Others

    This paper reports on a study involving the administration of two examinations that were designed to evaluate student knowledge in several areas of computing. The tests were given both to computer science majors and to those enrolled in computer science classes from other majors. They sought to discover whether computer science majors demonstrated…

  2. Nutritional risk assessment in critically ill cancer patients: systematic review

    PubMed Central

    Fruchtenicht, Ana Valéria Gonçalves; Poziomyck, Aline Kirjner; Kabke, Geórgia Brum; Loss, Sérgio Henrique; Antoniazzi, Jorge Luiz; Steemburgo, Thais; Moreira, Luis Fernando

    2015-01-01

    Objective To systematically review the main methods for nutritional risk assessment used in critically ill cancer patients and present the methods that better assess risks and predict relevant clinical outcomes in this group of patients, as well as to discuss the pros and cons of these methods according to the current literature. Methods The study consisted of a systematic review based on analysis of manuscripts retrieved from the PubMed, LILACS and SciELO databases by searching for the key words “nutritional risk assessment”, “critically ill” and “cancer”. Results Only 6 (17.7%) of 34 initially retrieved papers met the inclusion criteria and were selected for the review. The main outcomes of these studies were that resting energy expenditure was associated with undernourishment and overfeeding. The high Patient-Generated Subjective Global Assessment score was significantly associated with low food intake, weight loss and malnutrition. In terms of biochemical markers, higher levels of creatinine, albumin and urea were significantly associated with lower mortality. The worst survival was found for patients with worse Eastern Cooperative Oncologic Group - performance status, high Glasgow Prognostic Score, low albumin, high Patient-Generated Subjective Global Assessment score and high alkaline phosphatase levels. Geriatric Nutritional Risk Index values < 87 were significantly associated with mortality. A high Prognostic Inflammatory and Nutritional Index score was associated with abnormal nutritional status in critically ill cancer patients. Among the reviewed studies that examined weight and body mass index alone, no significant clinical outcome was found. Conclusion None of the methods reviewed helped to define risk among these patients. Therefore, assessment by a combination of weight loss and serum measurements, preferably in combination with other methods using scores such as Eastern Cooperative Oncologic Group - performance status, Glasgow Prognostic

  3. Industry-University SBIR NDT Projects — A Critical Assessment

    NASA Astrophysics Data System (ADS)

    Reinhart, Eugene R.

    2007-03-01

    The Small Business Innovative Research (SBIR) program, funded by various United States government agencies (DOD, DOE, NSF, etc.), provides funds for Research and Development (R&D) of nondestructive testing (NDT) techniques and equipment, thereby supplying valuable money for NDT development by small businesses and stimulating cooperative university programs. A review and critical assessment of the SBIR program as related to NDT is presented and should provide insight into reasons for or against pursuing this source of R&D funding.

  4. Spatial risk assessment for critical network infrastructure using sensitivity analysis

    NASA Astrophysics Data System (ADS)

    Möderl, Michael; Rauch, Wolfgang

    2011-12-01

The presented spatial risk assessment method allows for managing critical network infrastructure in urban areas under abnormal and future conditions caused, e.g., by terrorist attacks, infrastructure deterioration or climate change. For the spatial risk assessment, vulnerability maps for critical network infrastructure are merged with hazard maps for an interfering process. Vulnerability maps are generated using a spatial sensitivity analysis of network transport models to evaluate performance decrease under the investigated threat scenarios; parameters are varied according to the specific impact of a particular threat scenario. Hazard maps are generated with a geographical information system using raster data for the same threat scenario, derived from structured interviews and cluster analysis of past events. The application of the spatial risk assessment is exemplified by means of a case study for a water supply system, but the principal concept is applicable likewise to other critical network infrastructure. The aim of the approach is to help decision makers in choosing zones for preventive measures.
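
    The merging step described in the abstract (vulnerability map combined with hazard map to yield a risk map) can be sketched on a small raster. This is an illustrative simplification under the common overlay rule risk = vulnerability x hazard; the grids, threshold, and function names are invented here, not taken from the paper's GIS workflow:

    ```python
    # Cell values normalised to [0, 1] on a common raster grid.
    vulnerability = [
        [0.9, 0.4, 0.1],
        [0.7, 0.8, 0.2],
        [0.3, 0.5, 0.6],
    ]
    hazard = [
        [0.2, 0.9, 0.5],
        [0.1, 0.6, 0.8],
        [0.4, 0.3, 0.7],
    ]

    def risk_map(vuln, haz):
        """Cell-wise risk = vulnerability x hazard (a simple raster-overlay rule)."""
        return [[v * h for v, h in zip(vr, hr)] for vr, hr in zip(vuln, haz)]

    def hotspots(risk, threshold=0.4):
        """Return (row, col) cells whose risk exceeds the threshold."""
        return [(i, j) for i, row in enumerate(risk)
                for j, r in enumerate(row) if r > threshold]

    risk = risk_map(vulnerability, hazard)
    print(hotspots(risk))  # candidate zones for preventive measures
    ```

    A high-vulnerability cell with low hazard (or vice versa) does not become a hotspot; only the coincidence of both does, which is the point of merging the two map layers.
    
    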

  5. Critical evaluation of oxygen-uptake assessment in swimming.

    PubMed

    Sousa, Ana; Figueiredo, Pedro; Pendergast, David; Kjendlie, Per-Ludvik; Vilas-Boas, João P; Fernandes, Ricardo J

    2014-03-01

Swimming has become an important area of sport science research since the 1970s, with bioenergetic factors assuming a fundamental performance-influencing role. The purpose of this study was to conduct a critical evaluation of the literature concerning oxygen-uptake (VO2) assessment in swimming, by describing the equipment and methods used and emphasizing recent work conducted in ecological conditions. In swimming in particular, due to the technical constraints imposed by the aquatic environment, assessment of VO2max was not accomplished until the 1960s. Later, the development of automated portable measurement devices allowed VO2max to be assessed more easily, even in ecological swimming conditions, but few studies have been conducted in swimming-pool conditions with portable breath-by-breath telemetric systems. An inverse relationship exists between the velocity corresponding to VO2max and the time a swimmer can sustain it at this velocity. The energy cost of swimming varies according to its association with velocity variability. As, in the end, the supply of oxygen (whose limitation may be due to central factors, i.e., O2 delivery and transport to the working muscles, or peripheral factors, i.e., O2 diffusion and utilization in the muscles) is one of the critical factors that determine swimming performance, VO2 kinetics and its maximal values are critical in understanding swimmers' behavior in competition and in developing efficient training programs.

  6. Literary and Electronic Hypertext: Borges, Criticism, Literary Research, and the Computer.

    ERIC Educational Resources Information Center

    Davison, Ned J.

    1991-01-01

    Examines what "hypertext" means to literary criticism on the one hand (i.e., intertextuality) and computing on the other, to determine how the two concepts may serve each other in a mutually productive way. (GLR)

  8. Comments and Criticism: Comment on "Identification of Student Misconceptions in Genetics Problem Solving via Computer Program."

    ERIC Educational Resources Information Center

    Smith, Mike U.

    1991-01-01

    Criticizes an article by Browning and Lehman (1988) for (1) using "gene" instead of allele, (2) misusing the word "misconception," and (3) the possible influences of the computer environment on the results of the study. (PR)

  9. Computer-Based Assessments. Information Capsule. Volume 0918

    ERIC Educational Resources Information Center

    Blazer, Christie

    2010-01-01

    This Information Capsule reviews research conducted on computer-based assessments. Advantages and disadvantages associated with computer-based testing programs are summarized and research on the comparability of computer-based and paper-and-pencil assessments is reviewed. Overall, studies suggest that for most students, there are few if any…

  10. Breadth-Oriented Outcomes Assessment in Computer Science.

    ERIC Educational Resources Information Center

    Cordes, David; And Others

    Little work has been done regarding the overall assessment of quality of computer science graduates at the undergraduate level. This paper reports on a pilot study at the University of Alabama of a prototype computer science outcomes assessment designed to evaluate the breadth of knowledge of computer science seniors. The instrument evaluated two…

  11. Benchmarking Pain Assessment Rate in Critical Care Transport.

    PubMed

    Reichert, Ryan J; Gothard, M David; Schwartz, Hamilton P; Bigham, Michael T

The purpose of this study is to determine the rate of pain assessment in pediatric neonatal critical care transport (PNCCT). The GAMUT database was queried for an 18-month period, excluding programs with less than 10% pediatric or neonatal patient contacts or less than 3 months of any metric data reporting during the study period. We hypothesized that the rate of pain assessment during PNCCT is higher than prehospital rates, although lower than in-hospital rates. Sixty-two programs representing 104,445 patient contacts were analyzed. A total of 21,693 (20.8%) patients were reported to have a documented pain assessment. Subanalysis identified 17 of the 62 programs that consistently reported pain assessments. This group accounted for 24,599 patients and included 7,273 (29.6%) neonatal, 12,655 (51.5%) pediatric, and 4,664 (19.0%) adult patients. Among these programs, the benchmark rate of pain assessment was 90.0%. Our analysis shows a rate below emergency medical services rates and consistent with published hospital rates of pain assessment. Poor tracking of this metric among participating programs was noted, suggesting an opportunity to investigate the barriers to documentation and reporting of pain assessments in PNCCT and a potential quality improvement initiative.

  12. A CAD (Classroom Assessment Design) of a Computer Programming Course

    ERIC Educational Resources Information Center

    Hawi, Nazir S.

    2012-01-01

    This paper presents a CAD (classroom assessment design) of an entry-level undergraduate computer programming course "Computer Programming I". CAD has been the product of a long experience in teaching computer programming courses including teaching "Computer Programming I" 22 times. Each semester, CAD is evaluated and modified…

  13. Critical evaluation of soil contamination assessment methods for trace metals.

    PubMed

    Desaules, André

    2012-06-01

    Correctly distinguishing between natural and anthropogenic trace metal contents in soils is crucial for assessing soil contamination. A series of assessment methods is critically outlined. All methods rely on assumptions of reference values for natural content. According to the adopted reference values, which are based on various statistical and geochemical procedures, there is a considerable range and discrepancy in the assessed soil contamination results as shown by the five methods applied to three weakly contaminated sites. This is a serious indication of their high methodological specificity and bias. No method with off-site reference values could identify any soil contamination in the investigated trace metals (Pb, Cu, Zn, Cd, Ni), while the specific and sensitive on-site reference methods did so for some sites. Soil profile balances are considered to produce the most plausible site-specific results, provided the numerous assumptions are realistic and the required data reliable. This highlights the dilemma between model and data uncertainty. Data uncertainty, however, is a neglected issue in soil contamination assessment so far. And the model uncertainty depends much on the site-specific realistic assumptions of pristine natural trace metal contents. Hence, the appropriate assessment of soil contamination is a subtle optimization exercise of model versus data uncertainty and specification versus generalization. There is no general and accurate reference method and soil contamination assessment is still rather fuzzy, with negative implications for the reliability of subsequent risk assessments.

  14. Ultrasound to assess diaphragmatic function in the critically ill—a critical perspective

    PubMed Central

    Haaksma, Mark; Tuinman, Pieter Roel

    2017-01-01

Ultrasound of the diaphragm in critically ill patients has become a diagnostic technique of emerging interest among clinicians and scientists. Its advantages include that it is widely available and non-invasive, and that examination can be performed after relatively short training and at low cost. It is used to estimate muscle mass by measurement of muscle thickness and to diagnose weakness by assessment of diaphragm movement during unassisted breathing. Thickening of the muscle during inspiration has been used to quantify force generation. The enthusiasm that surrounds this topic is shared by many clinicians, and we agree that ultrasound is a valuable tool to screen for diaphragm dysfunction in intensive care unit (ICU) patients. However, in our opinion many more studies are required to validate ultrasound as a tool to quantify breathing effort. More sophisticated ultrasound techniques, such as speckle-tracking imaging, are promising for evaluating respiratory muscle function in patients, including the critically ill. PMID:28361079

  15. Clinical significance of computed tomography assessment for third molar surgery

    PubMed Central

    Nakamori, Kenji; Tomihara, Kei; Noguchi, Makoto

    2014-01-01

    Surgical extraction of the third molar is the most commonly performed surgical procedure in the clinical practice of oral surgery. Third molar surgery is warranted when there is inadequate space for eruption, malpositioning, or risk for cyst or odontogenic tumor formation. Preoperative assessment should include a detailed morphologic analysis of the third molar and its relationship to adjacent structures and surrounding tissues. Due to developments in medical engineering technology, computed tomography (CT) now plays a critical role in providing the clear images required for adequate assessment prior to third molar surgery. Removal of the maxillary third molar is associated with a risk for maxillary sinus perforation, whereas removal of the mandibular third molar can put patients at risk for a neurosensory deficit from damage to the lingual nerve or inferior alveolar nerve. Multiple factors, including demographic, anatomic, and treatment-related factors, influence the incidence of nerve injury during or following removal of the third molar. CT assessment of the third molar prior to surgery can identify some of these risk factors, such as the absence of cortication between the mandibular third molar and the inferior alveolar canal, prior to surgery to reduce the risk for nerve damage. This topic highlight presents an overview of the clinical significance of CT assessment in third molar surgery. PMID:25071882

  16. Computer Simulations to Support Science Instruction and Learning: A Critical Review of the Literature

    ERIC Educational Resources Information Center

    Smetana, Lara Kathleen; Bell, Randy L.

    2012-01-01

    Researchers have explored the effectiveness of computer simulations for supporting science teaching and learning during the past four decades. The purpose of this paper is to provide a comprehensive, critical review of the literature on the impact of computer simulations on science teaching and learning, with the goal of summarizing what is…

  17. 24 CFR 901.105 - Computing assessment score.

    Code of Federal Regulations, 2010 CFR

    2010-04-01

... 24 Housing and Urban Development 4 2010-04-01 false Computing assessment score. 901.105 Section 901.105 Housing and Urban Development Regulations Relating to Housing and Urban Development... DEVELOPMENT PUBLIC HOUSING MANAGEMENT ASSESSMENT PROGRAM § 901.105 Computing assessment score. (a)...

  18. A new approach in utilizing a computer data acquisition system for criticality safety control

    SciTech Connect

    Hopkins, H; Song, H; Warren, F

    1999-05-06

A new approach to utilizing a computer data acquisition system is proposed to address many issues associated with criticality safety control. This Criticality Safety Support System (CSSS) utilizes many features of computer and information-processing technology, such as digital pictures, barcodes, and voice data entry, to enhance criticality safety in an R&D environment. Through the on-line data retrieval, recording, and management offered by this technology, the CSSS would provide a framework for designing new solutions to old problems. This pilot program is the first step in developing this application for the years to come.

  19. Health adaptation policy for climate vulnerable groups: a 'critical computational linguistics' analysis.

    PubMed

    Seidel, Bastian M; Bell, Erica

    2014-11-28

Many countries are developing or reviewing national adaptation policy for climate change, but the extent to which these policies meet the health needs of vulnerable groups has not been assessed. This study examines the adequacy of such policies for nine known climate-vulnerable groups: people with mental health conditions, Aboriginal people, culturally and linguistically diverse groups, aged people, people with disabilities, rural communities, children, women, and socioeconomically disadvantaged people. The study analyses an exhaustive sample of national adaptation policy documents from Annex 1 ('developed') countries of the United Nations Framework Convention on Climate Change: 20 documents from 12 countries. A 'critical computational linguistics' method was used, involving novel software-driven quantitative mapping and traditional critical discourse analysis. The study finds that references to vulnerable groups are sparse or absent, and are poorly connected to language about practical strategies and socio-economic contexts, which are themselves scarce. The conclusions offer strategies for developing policy that is better informed by a 'social determinants of health' definition of climate vulnerability, consistent with best practice in the literature and global policy prescriptions.
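
    The quantitative-mapping half of such a method reduces, at its simplest, to counting how often each vulnerable group is mentioned in a policy text. The sketch below is a toy illustration only; the group term lists and the sample text are invented, not the study's actual lexicon or corpus:

    ```python
    import re
    from collections import Counter

    # Hypothetical term lists for a few of the nine vulnerable groups.
    GROUP_TERMS = {
        "children": ["child", "children"],
        "aged people": ["elderly", "older people", "aged"],
        "rural communities": ["rural"],
        "people with disabilities": ["disability", "disabilities"],
    }

    # Invented sample passage standing in for a policy document.
    policy_text = """Adaptation planning will prioritise heatwave warnings.
    Rural water supplies will be upgraded. Support for children and the
    elderly during extreme events is under review."""

    def mention_counts(text, group_terms):
        """Count whole-word mentions of each group's terms, case-insensitively."""
        text = text.lower()
        counts = Counter()
        for group, terms in group_terms.items():
            counts[group] = sum(
                len(re.findall(r"\b" + re.escape(t) + r"\b", text))
                for t in terms)
        return counts

    counts = mention_counts(policy_text, GROUP_TERMS)
    print(counts)
    ```

    Groups with zero counts across a corpus are exactly the "non-existent" references the study reports; a fuller analysis would also measure how closely such mentions co-occur with strategy language.
    
    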

  20. Ultrasonographic Assessment of Diaphragm Function in Critically Ill Subjects.

    PubMed

    Umbrello, Michele; Formenti, Paolo

    2016-04-01

The majority of patients admitted to the ICU require mechanical ventilation as a part of their process of care. However, mechanical ventilation itself or the underlying disease can lead to dysfunction of the diaphragm, a condition that may contribute to failure of weaning from mechanical ventilation. Moreover, extended time on the ventilator increases health-care costs and greatly increases patient morbidity and mortality. Nevertheless, symptoms and signs of muscle disease in a bedridden (or bed rest-only) ICU patient are often difficult to assess because of concomitant confounding factors. Conventional assessment of diaphragm function either lacks specific, noninvasive, time-saving, and easily performed bedside tools or requires patient cooperation. Recently, the use of ultrasound has raised great interest as a simple, noninvasive method of quantifying diaphragm contractile activity. In this review, we discuss the physiology and relevant pathophysiology of diaphragm function, and we summarize recent findings concerning the evaluation of its (dys)function in critically ill patients, with a special focus on the role of ultrasound. We describe how to assess diaphragm excursion and diaphragm thickening during breathing, the meaning of these measurements under spontaneous or mechanical ventilation, and the reference values in health and disease. The spread of ultrasonographic assessment of diaphragm function may result in timely identification of patients with diaphragm dysfunction and a potential improvement in the assessment of recovery from diaphragm weakness. Copyright © 2016 by Daedalus Enterprises.

  1. Computer utilization and clinical judgment in psychological assessment reports.

    PubMed

    Lichtenberger, Elizabeth O

    2006-01-01

    The process of assessment report writing is a complex one, involving both the statistical evaluation of data and clinical methods of data interpretation to appropriately answer referral questions. Today, a computer often analyzes data generated in a psychological assessment, at least in part. In this article, the author focuses on the interaction between the decision-making processes of human clinicians and the test interpretations that are computer-based. The benefits and problems with computers in assessment are highlighted and are presented alongside the research on the validity of automated assessment, as well as research comparing clinicians and computers in the decision-making process. The author concludes that clinical judgment and computer-based test interpretation each have weaknesses. However, by using certain strategies to reduce clinicians' susceptibility to errors in decision making and to ensure that only valid computer-based test interpretations are used, clinicians can optimize the accuracy of conclusions that they draw in their assessment report.

  2. Planning the Unplanned Experiment: Assessing the Efficacy of Standards for Safety Critical Software

    NASA Technical Reports Server (NTRS)

    Graydon, Patrick J.; Holloway, C. Michael

    2015-01-01

We need well-founded means of determining whether software is fit for use in safety-critical applications. While software in industries such as aviation has an excellent safety record, the fact that software flaws have contributed to deaths illustrates the need for justifiably high confidence in software. It is often argued that software is fit for safety-critical use because it conforms to a standard for software in safety-critical systems. But little is known about whether such standards `work.' Reliance upon a standard without knowing whether it works is an experiment; without collecting data to assess the standard, this experiment is unplanned. This paper reports on a workshop intended to explore how standards could practicably be assessed. Planning the Unplanned Experiment: Assessing the Efficacy of Standards for Safety Critical Software (AESSCS) was held on 13 May 2014 in conjunction with the European Dependable Computing Conference (EDCC). We summarize and elaborate on the workshop's discussion of the topic, including both the presented positions and the dialogue that ensued.

  3. Criticality assessment of the Defense Waste Processing Facility

    SciTech Connect

    Ha, B.C.; Williamson, T.G.; Clemmons, J.S.; Chandler, M.C.

    1996-08-01

Assessment of the nuclear criticality potential of the S-Area Defense Waste Processing Facility (DWPF) is required to ensure the safe processing of radioactive waste for final disposal. At the Savannah River Site (SRS), high-level radioactive wastes are stored as caustic slurries. During storage, the wastes separate into a supernate layer and a sludge layer. The radionuclides from the sludge and supernate will be immobilized into borosilicate glass for storage and eventual disposal. The DWPF will initially immobilize sludge only, with simulated non-radioactive Precipitate Hydrolysis Aqueous (PHA) product. This paper demonstrates that criticality poses only a negligible risk in the DWPF process because of the characteristics of the waste and the DWPF process. The waste contains low concentrations of fissile material and many elements that act as neutron poisons. Also, the DWPF process chemistry does not lead to separation and accumulation of fissile materials. Experiments showed that DWPF can process all the high-level radioactive wastes currently stored at SRS with negligible criticality risk under normal and abnormal/process-upset operation.

  4. Reverse engineering of metabolic networks, a critical assessment.

    PubMed

    Hendrickx, Diana M; Hendriks, Margriet M W B; Eilers, Paul H C; Smilde, Age K; Hoefsloot, Huub C J

    2011-02-01

    Inferring metabolic networks from metabolite concentration data is a central topic in systems biology. Mathematical techniques to extract information about the network from data have been proposed in the literature. This paper presents a critical assessment of the feasibility of reverse engineering of metabolic networks, illustrated with a selection of methods. Appropriate data are simulated to study the performance of four representative methods. An overview of sampling and measurement methods currently in use for generating time-resolved metabolomics data is given and contrasted with the needs of the discussed reverse engineering methods. The results of this assessment show that if full inference of a real-world metabolic network is the goal there is a large discrepancy between the requirements of reverse engineering of metabolic networks and contemporary measurement practice. Recommendations for improved time-resolved experimental designs are given.

  5. Computer Viruses: An Assessment of Student Perceptions.

    ERIC Educational Resources Information Center

    Jones, Mary C.; Arnett, Kirk P.

    1992-01-01

    A majority of 213 college business students surveyed had knowledge of computer viruses; one-fourth had been exposed to them. Many believed that computer professionals are responsible for prevention and cure. Educators should make students aware of multiple sources of infection, the breadth and extent of possible damage, and viral detection and…

  7. A critical assessment of vector control for dengue prevention.

    PubMed

    Achee, Nicole L; Gould, Fred; Perkins, T Alex; Reiner, Robert C; Morrison, Amy C; Ritchie, Scott A; Gubler, Duane J; Teyssou, Remy; Scott, Thomas W

    2015-05-01

    Recently, the Vaccines to Vaccinate (v2V) initiative was reconfigured into the Partnership for Dengue Control (PDC), a multi-sponsored and independent initiative. This redirection is consistent with the growing consensus among the dengue-prevention community that no single intervention will be sufficient to control dengue disease. The PDC's expectation is that when an effective dengue virus (DENV) vaccine is commercially available, the public health community will continue to rely on vector control, because the two strategies complement and enhance one another. Although the concept of integrated intervention for dengue prevention is gaining increasingly broad acceptance, no consensus has yet been reached on the details of how and in what combination these approaches can be most effectively implemented to manage disease. To fill that gap, the PDC proposed a three-step process: (1) critically assessing current vector control tools and those under development, (2) outlining a research agenda for determining, in a definitive way, which existing tools work best, and (3) determining how to combine the best vector control options, as systematically defined in this process, with DENV vaccines. To address the first step, the PDC convened a meeting of international experts in November 2013 in Washington, DC, to critically assess existing vector control interventions and tools under development. This report summarizes those deliberations.

  8. Risk assessment and critical control points from the production perspective.

    PubMed

    Serra, J A; Domenech, E; Escriche, I; Martorell, S

    1999-01-12

    The implementation of a risk analysis program such as risk assessment and critical control points (RACCP) is necessary to meet the food industry's current objective of total quality. The novelty of this technique, compared with the established hazard analysis and critical control points (HACCP) and its extension to incorporate elements of quantitative risk analysis (QRA), is that RACCP considers the risk of the consequences produced by deviations in production-process performance, both inside and outside the company, and also identifies their causative factors. The measures taken to prevent or mitigate the consequences of such deviations must be consistent with these data, but a cost-benefit assessment must not be neglected, so that the chosen measure is the most profitable for the company. An example developed in a mineral-water bottling plant showed that RACCP application is feasible and useful. In this example case, RACCP demonstrated that it could yield a profitable production process that keeps the quality and safety of the final product at a maximum while protecting both company and consumer.

  9. A Critical Assessment of Vector Control for Dengue Prevention

    PubMed Central

    Achee, Nicole L.; Gould, Fred; Perkins, T. Alex; Reiner, Robert C.; Morrison, Amy C.; Ritchie, Scott A.; Gubler, Duane J.; Teyssou, Remy; Scott, Thomas W.

    2015-01-01

    Recently, the Vaccines to Vaccinate (v2V) initiative was reconfigured into the Partnership for Dengue Control (PDC), a multi-sponsored and independent initiative. This redirection is consistent with the growing consensus among the dengue-prevention community that no single intervention will be sufficient to control dengue disease. The PDC's expectation is that when an effective dengue virus (DENV) vaccine is commercially available, the public health community will continue to rely on vector control, because the two strategies complement and enhance one another. Although the concept of integrated intervention for dengue prevention is gaining increasingly broad acceptance, no consensus has yet been reached on the details of how and in what combination these approaches can be most effectively implemented to manage disease. To fill that gap, the PDC proposed a three-step process: (1) critically assessing current vector control tools and those under development, (2) outlining a research agenda for determining, in a definitive way, which existing tools work best, and (3) determining how to combine the best vector control options, as systematically defined in this process, with DENV vaccines. To address the first step, the PDC convened a meeting of international experts in November 2013 in Washington, DC, to critically assess existing vector control interventions and tools under development. This report summarizes those deliberations. PMID:25951103

  10. Assessing monoclonal antibody product quality attribute criticality through clinical studies.

    PubMed

    Goetze, Andrew M; Schenauer, Matthew R; Flynn, Gregory C

    2010-01-01

    Recombinant therapeutic proteins, including antibodies, contain a variety of chemical and physical modifications. Great effort is expended during process and formulation development in controlling and minimizing this heterogeneity, yet some modifications may not affect safety or efficacy and, therefore, may not need to be controlled. Many of the chemical conversions also occur in vivo, and knowledge about these alterations can be applied to assess their potential impact on the characteristics and biological activity of therapeutic proteins. Other attributes may affect drug clearance and thereby alter drug efficacy. In this review article, we describe attribute studies conducted using clinical samples and how the information gleaned from them is applied to attribute criticality assessment. In general, how fast attributes change in vivo compared with the rate of mAb elimination is the key parameter in these evaluations. An attribute whose levels change more rapidly has greater potential to affect safety or efficacy and thereby reach the status of a Critical Quality Attribute (CQA) that should be controlled during production and storage, but the effect will depend on whether the compositional changes are due to chemical conversion or to differential clearance.
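
    The rate comparison at the heart of this assessment can be sketched numerically. The rate constants below are hypothetical illustrations (only the roughly three-week IgG elimination half-life is a typical literature value), not values taken from the review:

```python
import math

# Compare an attribute's hypothetical in-vivo conversion rate with mAb
# elimination, using first-order kinetics k = ln(2) / t_half.
t_half_elim = 21.0                  # days; typical IgG elimination half-life
k_elim = math.log(2) / t_half_elim  # elimination rate constant (1/day)

t_half_attr = 7.0                   # days; hypothetical attribute conversion
k_attr = math.log(2) / t_half_attr  # conversion rate constant (1/day)

# If the attribute converts much faster than the drug is eliminated, the
# circulating drug becomes dominated by the converted form, making the
# attribute a stronger CQA candidate.
ratio = k_attr / k_elim  # here 3.0: conversion outpaces elimination
```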

  11. Limited transthoracic echocardiography assessment in anaesthesia and critical care.

    PubMed

    Faris, John G; Veltman, Michael G; Royse, Colin F

    2009-09-01

    The use of echocardiography in anaesthesia and critical care started with transoesophageal echocardiography, whereas transthoracic echocardiography was largely the domain of the cardiologist. In recent times, the focus has shifted towards transthoracic echocardiography owing to the development of small and portable, yet high-fidelity, echocardiography machines. The cost has fallen, thereby increasing the availability of equipment. A parallel development has been the concept of limited transthoracic echocardiography that can be performed by practitioners with limited experience. The basis of these examinations is to provide the practising clinician with immediate information to help guide management, with a focus on haemodynamic evaluation and limited structural (valve) assessment to determine whether there is a valve disorder that may or may not cause haemodynamic instability. The limited examination is therefore goal directed. A number of named examinations exist, differing in scope and views. All require a limited knowledge base and are designed for the clinician to recognise patterns consistent with haemodynamic or anatomical abnormalities. They range from very limited two-dimensional assessments of ventricular function to more complex (yet presently limited) studies such as the HEART (haemodynamic echocardiography assessment in real time) scan, which is designed to provide the haemodynamic state as well as basic valvular and pericardial assessment. It is suitable for goal-directed examination in the operating theatre, emergency department or intensive care unit (ICU), and for preoperative screening.

  12. Pharmacist Computer Skills and Needs Assessment Survey

    PubMed Central

    Jewesson, Peter J

    2004-01-01

    Background To use technology effectively for the advancement of patient care, pharmacists must possess a variety of computer skills. We recently introduced a novel applied informatics program in this Canadian hospital clinical service unit to enhance the informatics skills of our members. Objective This study was conducted to gain a better understanding of the baseline computer skills and needs of our hospital pharmacists immediately prior to the implementation of an applied informatics program. Methods In May 2001, an 84-question written survey was distributed by mail to 106 practicing hospital pharmacists in our multi-site, 1500-bed, acute-adult-tertiary care Canadian teaching hospital in Vancouver, British Columbia. Results Fifty-eight surveys (55% of total) were returned within the two-week study period. The survey responses reflected the opinions of licensed BSc and PharmD hospital pharmacists with a broad range of pharmacy practice experience. Most respondents had home access to personal computers, and regularly used computers in the work environment for drug distribution, information management, and communication purposes. Few respondents reported experience with handheld computers. Software use experience varied according to application. Although patient-care information software and e-mail were commonly used, experience with spreadsheet, statistical, and presentation software was negligible. The respondents were familiar with Internet search engines, and these were reported to be the most common method of seeking clinical information online. Although many respondents rated themselves as being generally computer literate and not particularly anxious about using computers, the majority believed they required more training to reach their desired level of computer literacy. Lack of familiarity with computer-related terms was prevalent. Self-reported basic computer skill was typically at a moderate level, and varied depending on the task. Specifically

  13. Pharmacist computer skills and needs assessment survey.

    PubMed

    Balen, Robert M; Jewesson, Peter J

    2004-03-29

    To use technology effectively for the advancement of patient care, pharmacists must possess a variety of computer skills. We recently introduced a novel applied informatics program in this Canadian hospital clinical service unit to enhance the informatics skills of our members. This study was conducted to gain a better understanding of the baseline computer skills and needs of our hospital pharmacists immediately prior to the implementation of an applied informatics program. In May 2001, an 84-question written survey was distributed by mail to 106 practicing hospital pharmacists in our multi-site, 1500-bed, acute-adult-tertiary care Canadian teaching hospital in Vancouver, British Columbia. Fifty-eight surveys (55% of total) were returned within the two-week study period. The survey responses reflected the opinions of licensed BSc and PharmD hospital pharmacists with a broad range of pharmacy practice experience. Most respondents had home access to personal computers, and regularly used computers in the work environment for drug distribution, information management, and communication purposes. Few respondents reported experience with handheld computers. Software use experience varied according to application. Although patient-care information software and e-mail were commonly used, experience with spreadsheet, statistical, and presentation software was negligible. The respondents were familiar with Internet search engines, and these were reported to be the most common method of seeking clinical information online. Although many respondents rated themselves as being generally computer literate and not particularly anxious about using computers, the majority believed they required more training to reach their desired level of computer literacy. Lack of familiarity with computer-related terms was prevalent. Self-reported basic computer skill was typically at a moderate level, and varied depending on the task. Specifically, respondents rated their ability to manipulate

  14. Data on NAEP 2011 writing assessment prior computer use.

    PubMed

    Tate, Tamara P; Warschauer, Mark; Abedi, Jamal

    2016-09-01

    This data article contains information based on the 2011 National Assessment of Educational Progress in Writing Restricted-Use Data, available from the National Center for Education Statistics (NCES Pub. No. 2014476). https://nces.ed.gov/nationsreportcard/researchcenter/datatools.aspx. The data include the statistical relationships between survey reports of teachers and students regarding prior use of computers and other technology and writing achievement levels on the 2011 computer-based NAEP writing assessment. This data article accompanies "The Effects of Prior Computer Use on Computer-Based Writing: The 2011 NAEP Writing Assessment" [1].

  15. The Acceptance and Use of Computer Based Assessment

    ERIC Educational Resources Information Center

    Terzis, Vasileios; Economides, Anastasios A.

    2011-01-01

    The effective development of a computer based assessment (CBA) depends on students' acceptance. The purpose of this study is to build a model that demonstrates the constructs that affect students' behavioral intention to use a CBA. The proposed model, the Computer Based Assessment Acceptance Model (CBAAM), is based on previous models of technology…

  16. A Technology Assessment of Personal Computers. Vol. I: Summary.

    ERIC Educational Resources Information Center

    Nilles, Jack M.

    This volume summarizes the results of a 2-year technology assessment of personal computers. The purpose of this study was to explore possible future modes of growth of the personal computer and related industries, to assess the impacts and consequences of that growth, and to present some of the policy issues and options which may arise as a…

  18. Collected Wisdom: Assessment Tools for Computer Science Programs

    ERIC Educational Resources Information Center

    Sanders, Kathryn E.; McCartney, Robert

    2004-01-01

    In this paper, we investigate the question of what assessment tools are being used in practice by United States computing programs and what the faculty doing the assessment think of the tools they use. After presenting some background with regard to the design, implementation, and use of assessment, with particular attention to assessment tools,…

  20. High Performance Computing Facility Operational Assessment 2015: Oak Ridge Leadership Computing Facility

    SciTech Connect

    Barker, Ashley D.; Bernholdt, David E.; Bland, Arthur S.; Gary, Jeff D.; Hack, James J.; McNally, Stephen T.; Rogers, James H.; Smith, Brian E.; Straatsma, T. P.; Sukumar, Sreenivas Rangan; Thach, Kevin G.; Tichenor, Suzy; Vazhkudai, Sudharshan S.; Wells, Jack C.

    2016-03-01

    Oak Ridge National Laboratory’s (ORNL’s) Leadership Computing Facility (OLCF) continues to surpass its operational target goals: supporting users; delivering fast, reliable systems; creating innovative solutions for high-performance computing (HPC) needs; and managing risks, safety, and security aspects associated with operating one of the most powerful computers in the world. The results can be seen in the cutting-edge science delivered by users and the praise from the research community. Calendar year (CY) 2015 was filled with outstanding operational results and accomplishments: a very high rating from users on overall satisfaction that ties the highest-ever mark set in CY 2014; the greatest number of core-hours delivered to research projects; the largest percentage of capability usage since the OLCF began tracking the metric in 2009; and success in delivering on the allocation of 60, 30, and 10% of core hours offered for the INCITE (Innovative and Novel Computational Impact on Theory and Experiment), ALCC (Advanced Scientific Computing Research Leadership Computing Challenge), and Director’s Discretionary programs, respectively. These accomplishments, coupled with the extremely high utilization rate, represent the fulfillment of the promise of Titan: maximum use by maximum-size simulations. The impact of all of these successes and more is reflected in the accomplishments of OLCF users, with publications this year in notable journals Nature, Nature Materials, Nature Chemistry, Nature Physics, Nature Climate Change, ACS Nano, Journal of the American Chemical Society, and Physical Review Letters, as well as many others. The achievements included in the 2015 OLCF Operational Assessment Report reflect first-ever or largest simulations in their communities; for example Titan enabled engineers in Los Angeles and the surrounding region to design and begin building improved critical infrastructure by enabling the highest-resolution Cybershake map for Southern

  1. Risk assessment for physical and cyber attacks on critical infrastructures.

    SciTech Connect

    Smith, Bryan J.; Sholander, Peter E.; Phelan, James M.; Wyss, Gregory Dane; Varnado, G. Bruce; Depoy, Jennifer Mae

    2005-08-01

    Assessing the risk of malevolent attacks against large-scale critical infrastructures requires modifications to existing methodologies. Existing risk assessment methodologies consider physical security and cyber security separately. As such, they do not accurately model attacks that involve defeating both physical protection and cyber protection elements (e.g., hackers turning off alarm systems prior to forced entry). This paper presents a risk assessment methodology that accounts for both physical and cyber security. It also preserves the traditional security paradigm of detect, delay, and respond, while accounting for the possibility that a facility may be able to recover from or mitigate the results of a successful attack before serious consequences occur. The methodology provides a means for ranking those assets most at risk from malevolent attacks. Because the methodology is automated, the analyst can also play 'what if' with mitigation measures to gain a better understanding of how best to expend resources toward securing the facilities. It is simple enough to be applied to large infrastructure facilities without developing highly complicated models. Finally, it is applicable to facilities with extensive security as well as to those that are less well protected.
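
    The abstract does not give the methodology's equations; the sketch below uses a common formulation from the physical-security literature, R = P_A * (1 - P_E) * C (likelihood of attack, effectiveness of the combined physical/cyber protection system, consequence), with made-up asset values, purely to illustrate the asset-ranking step described above:

```python
# Hypothetical asset ranking with the common security-risk formulation
# R = P_A * (1 - P_E) * C: P_A is the likelihood of attack, P_E the
# effectiveness of the detect-delay-respond protection (physical and
# cyber combined), and C the consequence of a successful attack.
def risk(p_attack, p_effectiveness, consequence):
    return p_attack * (1.0 - p_effectiveness) * consequence

assets = {
    "control room":   risk(0.3, 0.90, 100.0),
    "alarm server":   risk(0.5, 0.60, 80.0),   # cyber path weakens P_E
    "perimeter gate": risk(0.7, 0.95, 20.0),
}

# Rank assets from most to least at risk
ranking = sorted(assets, key=assets.get, reverse=True)
# alarm server: 0.5*0.4*80 = 16.0; control room: 0.3*0.1*100 = 3.0;
# perimeter gate: 0.7*0.05*20 = 0.7
```

    Playing 'what if' with a mitigation measure then amounts to raising an asset's P_E and re-running the ranking.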

  2. A critically appraised topic review of computer-aided design/computer-aided machining of removable partial denture frameworks.

    PubMed

    Lang, Lisa A; Tulunoglu, Ibrahim

    2014-01-01

    A critically appraised topic (CAT) review is presented about the use of computer-aided design (CAD)/computer-aided machining (CAM) removable partial denture (RPD) frameworks. A systematic search of the literature supporting CAD/CAM RPD systems revealed no randomized clinical trials, hence the CAT review was performed. A PubMed search yielded 9 articles meeting the inclusion criteria. Each article was characterized by study design and level of evidence. No clinical outcomes research has been published on the use of CAD/CAM RPDs. Low levels of evidence were found in the available literature. Clinical research studies are needed to determine the efficacy of this treatment modality.

  3. A Technology Assessment of Personal Computers. Vol. II: Personal Computer Technology, Users, and Uses.

    ERIC Educational Resources Information Center

    Nilles, Jack M.

    This volume reports on the initial phase of a technology assessment of personal computers. First, technological developments that will influence the rate of diffusion of personal computer technology among the general populace are examined. Then the probable market for personal computers is estimated and analyzed on a functional basis, segregating…

  4. A Practical and Theoretical Approach to Assessing Computer Attitudes: The Computer Attitudes Measure (CAM).

    ERIC Educational Resources Information Center

    Kay, Robin H.

    1989-01-01

    Describes study conducted at the University of Toronto that assessed the attitudes of student teachers toward computers by using a multicomponent model, the Computer Attitude Measure (CAM). Cognitive, affective, and behavioral attitudes are examined, and correlations of computer literacy, experience, and internal locus of control are discussed.…

  5. Policy Issues in Computer Education. Assessing the Cognitive Consequences of Computer Environments for Learning (ACCCEL).

    ERIC Educational Resources Information Center

    Linn, Marcia

    This paper analyzes the capabilities of the computer learning environment identified by the Assessing the Cognitive Consequences of Computer Environments for Learning (ACCCEL) Project, augments the analysis with experimental work, and discusses how schools can implement policies which provide for the maximum potential of computers. The ACCCEL…

  6. [Graphic assessment of retinal findings by computer].

    PubMed

    Effert, R; Wilberts, T; Reim, M

    1989-01-01

    Despite the increasing amount of textual and numerical patient data being stored by computer, the graphic storage of ophthalmological findings has met with only limited success. However, a sketch is much more instructive than a description in words. This paper shows that, using a suitable computer with a graphics-oriented operating system and commercially available graphics and database programs, sketches of retinal detachment can easily be generated on a computer screen. In the graphics program, all of the necessary symbols are already available at startup; the user simply copies the symbols needed to "draw" the actual fundus findings. We use the system of Meyer-Schwickerath. Afterwards, the drawing on the monitor is transferred to the database program and stored.

  7. Critical factors in assessing risk from exposure to nasal carcinogens.

    PubMed

    Bogdanffy, M S; Mathison, B H; Kuykendall, J R; Harman, A E

    1997-10-31

    Anatomical, physiological, biochemical and molecular factors that contribute to chemical-induced nasal carcinogenesis are either largely divergent between test species and humans, or we know very little of them. These factors, let alone the uncertainty associated with our knowledge gap, present a risk assessor with the formidable task of making judgments about risks to human health from exposure to chemicals that have been identified in rodent studies to be nasal carcinogens. This paper summarizes some of the critical attributes of the hazard identification and dose-response aspects of risk assessments for nasal carcinogens that must be accounted for by risk assessors in order to make informed decisions. Data on two example compounds, dimethyl sulfate and hexamethylphosphoramide, are discussed to illustrate the diversity of information that can be used to develop informed hypotheses about mode of action and decisions on appropriate dosimeters for interspecies extrapolation. Default approaches to interspecies dosimetry extrapolation are described briefly and are followed by a discussion of a generalized physiologically based pharmacokinetic model that, unlike default approaches, is flexible and capable of incorporating many of the critical species-specific factors. Recent advancements in interspecies nasal dosimetry modeling are remarkable. However, it is concluded that without the development of research programs aimed at understanding carcinogenic susceptibility factors in human and rodent nasal tissues, development of plausible modes of action will lag behind the advancements made in dosimetry modeling.

  8. Assessing the Hydraulic Criticality of Deep Ocean Overflows

    NASA Astrophysics Data System (ADS)

    Pratt, L. J.; Helfrich, K. R.

    2004-12-01

    Two methods for assessing the hydraulic criticality of a modelled or observed deep overflow are discussed. The methods should be of use in determining the position of the control section, which is needed to establish the transport relation helpful for long-term monitoring from upstream. Both approaches are based on a multiple-streamtube idealization in which the observed flow at a particular section is divided into subsections (streamtubes). There are no restrictions on the bottom topography or the potential vorticity distribution. The first criterion involves evaluation of a generalized Jacobian condition based on the conservation laws for each streamtube; the second involves direct calculation of the long-wave phase speeds. We also comment on the significance of the local Froude number F of the flow and argue that F must pass through unity across a section of hydraulic control. These criteria are applied to some numerically modelled flows and are used in the companion presentation (Girton et al.) to evaluate the hydraulic criticality of the Faroe Bank Channel.
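
    The local Froude number criterion can be illustrated with a minimal single-streamtube sketch (the generalized Jacobian condition and the multi-streamtube phase-speed calculation described above are considerably more involved). Here F = u / sqrt(g'h) with reduced gravity g', and all values are illustrative rather than taken from the Faroe Bank Channel data:

```python
import math

def froude(u, gprime, h):
    """Local Froude number F = u / sqrt(g' h) for a dense overflow layer,
    with reduced gravity g' = g * (rho_layer - rho_ambient) / rho_0."""
    return u / math.sqrt(gprime * h)

# (velocity m/s, layer thickness m) at successive sections along a sill
sections = [(0.3, 400.0), (0.6, 250.0), (1.0, 120.0), (1.3, 80.0)]
gprime = 5e-4 * 9.81  # reduced gravity for a ~0.05% density anomaly

F = [froude(u, gprime, h) for u, h in sections]

# Flag the first section at which F passes through unity, a candidate
# control section in this idealization
control = next(i for i in range(1, len(F)) if F[i - 1] < 1.0 <= F[i])
```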

  9. Report on the 2011 Critical Assessment of Function Annotation (CAFA) meeting

    SciTech Connect

    Friedberg, Iddo

    2015-01-21

    The Critical Assessment of Function Annotation (CAFA) meeting was held July 14-15, 2011, at the Austria Conference Center in Vienna, Austria, with 73 registered delegates. We thank the DOE for its award, which helped us organize and support the scientific meeting AFP 2011 as a special interest group (SIG) meeting associated with the ISMB 2011 conference, held in Vienna, Austria, in July 2011; the AFP SIG took place July 15-16, 2011, immediately preceding the conference. The meeting consisted of two components. The first was a series of invited and contributed talks and discussion sessions dedicated to protein function research, with an emphasis on the theory and practice of the computational methods used in functional annotation. The second provided a large-scale assessment of computational methods through participation in the Critical Assessment of Functional Annotation (CAFA). The meeting was exciting and, based on feedback, quite successful. The schedule differed only slightly from the one proposed, owing to two cancellations: Dr. Olga Troyanskaya had canceled, so we invited Dr. David Jones instead; similarly, in place of Dr. Richard Roberts, Dr. Simon Kasif gave the closing keynote. The remaining invited speakers were Janet Thornton (EBI) and Amos Bairoch (University of Geneva).

  10. Ethics and health technology assessment: handmaiden and/or critic?

    PubMed

    Braunack-Mayer, Annette J

    2006-01-01

    This study examines the content and role of ethical analysis in health technology assessment (HTA) and horizon-scanning publications. It proposes that ethical analysis in HTA is of at least two different types: an ethics of HTA and an ethics in HTA. I examine the critical differences between these approaches through the examples of genetic screening for breast cancer and home blood glucose testing in diabetes. I then argue that, although both approaches subscribe to similar views concerning HTA and ethics, they use different theoretical and methodological traditions to interpret and explain them. I conclude by suggesting that we need the interpretive insights of both approaches, taken together, to explain why ethics has not yet been able to contribute fully to HTA and to demonstrate the scope and complexity of ethical work in this domain.

  11. Application of queueing models to multiprogrammed computer systems operating in a time-critical environment

    NASA Technical Reports Server (NTRS)

    Eckhardt, D. E., Jr.

    1979-01-01

    A model of a central processing unit (CPU) that services background applications in the presence of time-critical activity is presented. The CPU is viewed as an M/M/1 queueing system subject to periodic interrupts by a deterministic, time-critical process. The Laplace transform of the distribution of service times for the background applications is developed. The use of state-of-the-art queueing models for studying the background processing capability of time-critical computer systems is discussed, and the results of a model validation study that support this application of queueing models are presented.
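
    The background-processing capability described above follows, in the simplest case, from standard M/M/1 results. The sketch below approximates the periodic time-critical interrupts by deflating the CPU capacity available to background jobs; this is a simplification for illustration, not the paper's Laplace-transform derivation, and the parameter values are made up:

```python
# Standard M/M/1 metrics for background jobs on a CPU that loses a fixed
# fraction of its capacity to a periodic time-critical process.
def mm1_background(lam, mu, critical_fraction):
    """lam: background arrival rate (jobs/s); mu: service rate (jobs/s);
    critical_fraction: share of CPU consumed by the time-critical task."""
    mu_eff = mu * (1.0 - critical_fraction)  # capacity left for background work
    rho = lam / mu_eff                       # effective utilization
    if rho >= 1.0:
        raise ValueError("background queue is unstable (rho >= 1)")
    W = 1.0 / (mu_eff - lam)   # mean time in system (queueing + service)
    L = rho / (1.0 - rho)      # mean number of background jobs in system
    return rho, W, L

rho, W, L = mm1_background(lam=4.0, mu=10.0, critical_fraction=0.2)
# With 20% of the CPU held by the critical task, mu_eff = 8 jobs/s,
# so rho = 0.5, mean response time W = 0.25 s, and mean population L = 1.0
```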

  12. Empirically Assessing the Importance of Computer Skills

    ERIC Educational Resources Information Center

    Baker, William M.

    2013-01-01

    This research determines which computer skills are important for entry-level accountants, and whether some skills are more important than others. Students participated before and after internships in public accounting. Longitudinal analysis is also provided; responses from 2001 are compared to those from 2008-2009. Responses are also compared to…

  14. A Computer Assessment Tool for Concept Mapping

    ERIC Educational Resources Information Center

    Akkaya, Recai; Karakirik, Erol; Durmus, Soner

    2005-01-01

    Current educational theories emphasize assessment as a vital part of teaching-learning process. Alternative assessment techniques aim to expose and promote the process of the learning rather than the final outcome. Concept mapping is a technique for representing conceptual knowledge and relationships between concepts in a graphical form. Requiring…

  15. Critical assessment of the evidence for striped nanoparticles.

    PubMed

    Stirling, Julian; Lekkas, Ioannis; Sweetman, Adam; Djuranovic, Predrag; Guo, Quanmin; Pauw, Brian; Granwehr, Josef; Lévy, Raphaël; Moriarty, Philip

    2014-01-01

    There is now a significant body of literature which reports that stripes form in the ligand shell of suitably functionalised Au nanoparticles. This stripe morphology has been proposed to strongly affect the physicochemical and biochemical properties of the particles. We critique the published evidence for striped nanoparticles in detail, with a particular focus on the interpretation of scanning tunnelling microscopy (STM) data (as this is the only technique which ostensibly provides direct evidence for the presence of stripes). Through a combination of an exhaustive re-analysis of the original data, in addition to new experimental measurements of a simple control sample comprising entirely unfunctionalised particles, we show that all of the STM evidence for striped nanoparticles published to date can instead be explained by a combination of well-known instrumental artefacts, or by issues with data acquisition/analysis protocols. We also critically re-examine the evidence for the presence of ligand stripes which has been claimed to have been found from transmission electron microscopy, nuclear magnetic resonance spectroscopy, small angle neutron scattering experiments, and computer simulations. Although these data can indeed be interpreted in terms of stripe formation, we show that the reported results can alternatively be explained as arising from a combination of instrumental artefacts and inadequate data analysis techniques.

  16. Critical Assessment of the Evidence for Striped Nanoparticles

    PubMed Central

    Stirling, Julian; Lekkas, Ioannis; Sweetman, Adam; Djuranovic, Predrag; Guo, Quanmin; Pauw, Brian; Granwehr, Josef; Lévy, Raphaël; Moriarty, Philip

    2014-01-01

    There is now a significant body of literature which reports that stripes form in the ligand shell of suitably functionalised Au nanoparticles. This stripe morphology has been proposed to strongly affect the physicochemical and biochemical properties of the particles. We critique the published evidence for striped nanoparticles in detail, with a particular focus on the interpretation of scanning tunnelling microscopy (STM) data (as this is the only technique which ostensibly provides direct evidence for the presence of stripes). Through a combination of an exhaustive re-analysis of the original data, in addition to new experimental measurements of a simple control sample comprising entirely unfunctionalised particles, we show that all of the STM evidence for striped nanoparticles published to date can instead be explained by a combination of well-known instrumental artefacts, or by issues with data acquisition/analysis protocols. We also critically re-examine the evidence for the presence of ligand stripes which has been claimed to have been found from transmission electron microscopy, nuclear magnetic resonance spectroscopy, small angle neutron scattering experiments, and computer simulations. Although these data can indeed be interpreted in terms of stripe formation, we show that the reported results can alternatively be explained as arising from a combination of instrumental artefacts and inadequate data analysis techniques. PMID:25402426

  17. Computer assessment of atherosclerosis from angiographic images

    NASA Technical Reports Server (NTRS)

    Selzer, R. H.; Blankenhorn, D. H.; Brooks, S. H.; Crawford, D. W.; Cashin, W. L.

    1982-01-01

    A computer method for the detection and quantification of atherosclerosis from angiograms has been developed and used to measure lesion change in human clinical trials. The technique involves tracking the vessel edges and measuring individual lesions as well as the overall irregularity of the arterial image. Application of the technique to conventional arterial-injection femoral and coronary angiograms is outlined, and an experimental study to extend the technique to the analysis of intravenous angiograms of the carotid and coronary arteries is described.
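    The edge-based width measurement described in this abstract can be illustrated with a toy sketch. Everything below is invented for demonstration (synthetic intensity profiles, a max-gradient edge rule, and a simple percent-stenosis formula); the actual system processed real angiographic scan lines.

```python
# Toy sketch of the edge-tracking idea: on each scan line across the
# vessel, locate the edges at the steepest intensity gradients and take
# the lumen width as their separation; lesion severity is then the
# narrowing of the tightest point relative to a reference width.
def edges_by_gradient(profile):
    """Return indices of the steepest rising and falling gradients."""
    grad = [profile[i + 1] - profile[i] for i in range(len(profile) - 1)]
    rise = max(range(len(grad)), key=lambda i: grad[i])
    fall = min(range(len(grad)), key=lambda i: grad[i])
    return rise, fall

def percent_stenosis(widths, reference):
    """Narrowing of the minimum width relative to a reference width."""
    return 100.0 * (1.0 - min(widths) / reference)

# Synthetic scan lines: bright vessel (high values) on a dark background.
profiles = [
    [0, 0, 9, 9, 9, 9, 9, 0, 0],   # healthy segment, width 5
    [0, 0, 0, 9, 9, 9, 0, 0, 0],   # narrowed segment, width 3
]
widths = [abs(f - r) for r, f in (edges_by_gradient(p) for p in profiles)]
print(widths, percent_stenosis(widths, reference=max(widths)))
```

    A real implementation would of course smooth the profiles, track edges between adjacent scan lines, and calibrate pixel spacing to millimetres; the sketch only shows the measurement principle.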

  18. Enhancing the Resilience of Interdependent Critical Infrastructure Systems Using a Common Computational Framework

    NASA Astrophysics Data System (ADS)

    Little, J. C.; Filz, G. M.

    2016-12-01

    As modern societies become more complex, critical interdependent infrastructure systems become more likely to fail under stress unless they are designed and implemented to be resilient. Hurricane Katrina clearly demonstrated the catastrophic and as yet unpredictable consequences of such failures. Resilient infrastructure systems maintain the flow of goods and services in the face of a broad range of natural and man-made hazards. In this presentation, we illustrate a generic computational framework to facilitate high-level decision-making about how to invest scarce resources most effectively to enhance resilience in coastal protection, transportation, and the economy of a region. Coastal Louisiana, our study area, has experienced the catastrophic effects of several land-falling hurricanes in recent years. In this project, we implement and further refine three process models (a coastal protection model, a transportation model, and an economic model) for the coastal Louisiana region. We upscale essential mechanistic features of the three detailed process models to the systems level and integrate the three reduced-order systems models in a modular fashion. We also evaluate the proposed approach in annual workshops with input from stakeholders. Based on stakeholder inputs, we derive a suite of goals, targets, and indicators for evaluating resilience at the systems level, and assess and enhance resilience using several deterministic scenarios. The unifying framework will be able to accommodate the different spatial and temporal scales that are appropriate for each model. We combine our generic computational framework, which encompasses the entire system of systems, with the targets and indicators needed to systematically meet our chosen resilience goals. We will start with targets that focus on technical and economic systems, but future work will ensure that targets and indicators are extended to other dimensions of resilience including those in the environmental and

  19. Interaction and Critical Inquiry in Asynchronous Computer-Mediated Conferencing: A Research Agenda

    ERIC Educational Resources Information Center

    Hopkins, Joseph; Gibson, Will; Ros i. Sole, Cristina; Savvides, Nicola; Starkey, Hugh

    2008-01-01

    This paper reviews research on learner and tutor interaction in asynchronous computer-mediated (ACM) conferences used in distance learning. The authors note claims made for the potential of ACM conferences to promote higher-order critical inquiry and the social construction of knowledge, and argue that there is a general lack of evidence regarding…

  20. Two Configurations for Accessing Classroom Computers: Differential Impact on Students' Critical Reflections and Their Empowerment

    ERIC Educational Resources Information Center

    Solhaug, T.

    2009-01-01

    The context of this article is the new technological environment and the struggle to use meaningful teaching practices in Norwegian schools. Students' critical reflections in two different technological learning environments in six upper secondary schools are compared. Three of these schools offer Internet-connected computers in special computer…

  1. An Artistic Criticism of "Writing To Read," A Computer-Based Beginning Reading Program.

    ERIC Educational Resources Information Center

    Huenecke, Dorothy

    A study used aesthetic criteria and artistic criticism to find meaning in one part of the kindergarten and first grade curriculum, "Writing To Read," a computer-based program for beginning reading. Subjects, students in a kindergarten class, were observed several days a week from the day in April when they began the program until the day…

  2. The Effect of Computer Games on Students' Critical Thinking Disposition and Educational Achievement

    ERIC Educational Resources Information Center

    Seifi, Mohammad; Derikvandi, Zahra; Moosavipour, Saeed; Khodabandelou, Rouhollah

    2015-01-01

    The main aim of this research was to investigate the effect of computer games on students' critical thinking disposition and educational achievement. The research method was descriptive, and its type was causal-comparative. The sample included 270 female high school students in Andimeshk town selected by the multistage cluster method. Ricketts…

  3. Two Configurations for Accessing Classroom Computers: Differential Impact on Students' Critical Reflections and Their Empowerment

    ERIC Educational Resources Information Center

    Solhaug, T.

    2009-01-01

    The context of this article is the new technological environment and the struggle to use meaningful teaching practices in Norwegian schools. Students' critical reflections in two different technological learning environments in six upper secondary schools are compared. Three of these schools offer Internet-connected computers in special computer…

  4. Manipulating Critical Variables: A Framework for Improving the Impact of Computers in the School Environment.

    ERIC Educational Resources Information Center

    Collis, Betty

    Previous work assessing the effectiveness of computers in education has gone no further than acknowledging a network of interconnected variables (comprising a system) which contribute to computer impact and describing its component parts. An impact systems model developed by Glasman and Bibiaminov (1981) has been adapted to facilitate measurement…

  5. [Computer-assisted surgery: assessment and perspectives].

    PubMed

    Demongeot, J

    The hospital in the future will be faced with the major problem of managing and optimizing the use of images provided from numerous sources examining both anatomy (MRI, CT-scan...) and function (gamma-camera, PET-scan...). One of the first to benefit from such rationalization will be the surgeon. After studying the results of the physical examination, the laboratory reports and the medical imaging, the surgeon will decide on the best curative measures and the best surgical route before operating. He thus needs a computer to assist him in integrating the multi-modal information available for his patient, in particular the imaging, with automatic integration and visualisation in synoptic mode (perception step), showing the trajectory of possible access routes to the target organ, memorization of the chosen route (decision step), and the real operation either using a laser or a manual tool, or with robot assistance under human control (action step). This close cooperation between surgery and computers is called computer-assisted surgery. A few examples of current uses and future perspectives of this new field of surgery are presented.

  6. Does Computer-Aided Formative Assessment Improve Learning Outcomes?

    ERIC Educational Resources Information Center

    Hannah, John; James, Alex; Williams, Phillipa

    2014-01-01

    Two first-year engineering mathematics courses used computer-aided assessment (CAA) to provide students with opportunities for formative assessment via a series of weekly quizzes. Most students used the assessment until they achieved very high (>90%) quiz scores. Although there is a positive correlation between these quiz marks and the final…

  7. Experiences of Using Automated Assessment in Computer Science Courses

    ERIC Educational Resources Information Center

    English, John; English, Tammy

    2015-01-01

    In this paper we discuss the use of automated assessment in a variety of computer science courses that have been taught at Israel Academic College by the authors. The course assignments were assessed entirely automatically using Checkpoint, a web-based automated assessment framework. The assignments all used free-text questions (where the students…

  8. Experiences of Using Automated Assessment in Computer Science Courses

    ERIC Educational Resources Information Center

    English, John; English, Tammy

    2015-01-01

    In this paper we discuss the use of automated assessment in a variety of computer science courses that have been taught at Israel Academic College by the authors. The course assignments were assessed entirely automatically using Checkpoint, a web-based automated assessment framework. The assignments all used free-text questions (where the students…

  9. Critical Zone Experimental Design to Assess Soil Processes and Function

    NASA Astrophysics Data System (ADS)

    Banwart, Steve

    2010-05-01

    Through unsustainable land use practices, mining, deforestation, urbanisation and degradation by industrial pollution, soil losses are now hypothesized to be much faster (100 times or more) than soil formation - with the consequence that soil has become a finite resource. The crucial challenge for the international research community is to understand the rates of processes that dictate soil mass stocks and their function within Earth's Critical Zone (CZ). The CZ is the environment where soils are formed, degrade and provide their essential ecosystem services. Key among these ecosystem services are food and fibre production, filtering, buffering and transformation of water, nutrients and contaminants, storage of carbon and maintaining biological habitat and genetic diversity. We have initiated a new research project to address the priority research areas identified in the European Union Soil Thematic Strategy and to contribute to the development of a global network of Critical Zone Observatories (CZO) committed to soil research. Our hypothesis is that the combined physical-chemical-biological structure of soil can be assessed from first-principles and the resulting soil functions can be quantified in process models that couple the formation and loss of soil stocks with descriptions of biodiversity and nutrient dynamics. The objectives of this research are to 1. Describe from 1st principles how soil structure influences processes and functions of soils, 2. Establish 4 European Critical Zone Observatories to link with established CZOs, 3. Develop a CZ Integrated Model of soil processes and function, 4. Create a GIS-based modelling framework to assess soil threats and mitigation at EU scale, 5. Quantify impacts of changing land use, climate and biodiversity on soil function and its value and 6. Form with international partners a global network of CZOs for soil research and deliver a programme of public outreach and research transfer on soil sustainability. The

  10. Overview of Risk Mitigation for Safety-Critical Computer-Based Systems

    NASA Technical Reports Server (NTRS)

    Torres-Pomales, Wilfredo

    2015-01-01

    This report presents a high-level overview of a general strategy to mitigate the risks from threats to safety-critical computer-based systems. In this context, a safety threat is a process or phenomenon that can cause operational safety hazards in the form of computational system failures. This report is intended to provide insight into the safety-risk mitigation problem and the characteristics of potential solutions. The limitations of the general risk mitigation strategy are discussed and some options to overcome these limitations are provided. This work is part of an ongoing effort to enable well-founded assurance of safety-related properties of complex safety-critical computer-based aircraft systems by developing an effective capability to model and reason about the safety implications of system requirements and design.

  11. Comparative assessment of life cycle assessment methods used for personal computers.

    PubMed

    Yao, Marissa A; Higgs, Tim G; Cullen, Michael J; Stewart, Scott; Brady, Todd A

    2010-10-01

    This article begins with a summary of findings from commonly cited life cycle assessments (LCA) of Information and Communication Technology (ICT) products. While differing conclusions regarding environmental impact are expected across product segments (mobile phones, personal computers, servers, etc.), significant variation and conflicting conclusions are observed even within product segments such as the desktop Personal Computer (PC). This lack of consistent conclusions and accurate data limits the effectiveness of LCA to influence policy and product design decisions. From 1997 to 2010, the majority of published studies focusing on the PC concluded that the use phase contributes most to the life cycle energy demand of PC products, with a handful of studies suggesting that the manufacturing phase of the PC has the largest impact. The purpose of this article is to critically review these studies in order to analyze sources of uncertainty, including factors that extend beyond data quality to the models and assumptions used. These findings suggest existing methods to combine process-based LCA data with product price data and remaining value adjustments are not reliable in conducting life cycle assessments for PC products. Recommendations are provided to assist future LCA work.

  12. A Model-based Framework for Risk Assessment in Human-Computer Controlled Systems

    NASA Technical Reports Server (NTRS)

    Hatanaka, Iwao

    2000-01-01

    The rapid growth of computer technology and innovation has played a significant role in the rise of computer automation of human tasks in modern production systems across all industries. Although the rationale for automation has been to eliminate "human error" or to relieve humans from manual repetitive tasks, various computer-related hazards and accidents have emerged as a direct result of increased system complexity attributed to computer automation. The risk assessment techniques utilized for electromechanical systems are not suitable for today's software-intensive systems or complex human-computer controlled systems. This thesis will propose a new systemic model-based framework for analyzing risk in safety-critical systems where both computers and humans are controlling safety-critical functions. A new systems accident model will be developed based upon modern systems theory and human cognitive processes to better characterize system accidents, the role of human operators, and the influence of software in its direct control of significant system functions. Better risk assessments will then be achievable through the application of this new framework to complex human-computer controlled systems.

  13. Computer-Based Assessment of Problem Solving.

    ERIC Educational Resources Information Center

    Baker, E. L.; Mayer, R. E.

    1999-01-01

    Examines the components required to assess student problem solving in technology environments. Discusses the purposes of testing, provides an example demonstrating the difference between retention and transfer, defines and analyzes problem solving, and explores techniques and standards for measuring the quality of student understanding. Contains…

  14. Concept Map Assessment for Teaching Computer Programming

    ERIC Educational Resources Information Center

    Keppens, Jeroen; Hay, David

    2008-01-01

    A key challenge of effective teaching is assessing and monitoring the extent to which students have assimilated the material they were taught. Concept mapping is a methodology designed to model what students have learned. In effect, it seeks to produce graphical representations (called concept maps) of the concepts that are important to a given…

  15. Computational Toxicology in Cancer Risk Assessment

    EPA Science Inventory

    Risk assessment over the last half century has, for many individual cases, served us well, but has proceeded at an extremely slow pace and has left us with considerable uncertainty. There are certainly thousands of compounds and thousands of exposure scenarios that remain unteste...

  16. Assessing Knowledge Change in Computer Science

    ERIC Educational Resources Information Center

    Nash, Jane Gradwohl; Bravaco, Ralph J.; Simonson, Shai

    2006-01-01

    The purpose of this study was to assess structural knowledge change across a two-week workshop designed to provide high-school teachers with training in Java and Object Oriented Programming. Both before and after the workshop, teachers assigned relatedness ratings to pairs of key concepts regarding Java and Object Oriented Programming. Their…

  17. Assessing Knowledge Change in Computer Science

    ERIC Educational Resources Information Center

    Nash, Jane Gradwohl; Bravaco, Ralph J.; Simonson, Shai

    2006-01-01

    The purpose of this study was to assess structural knowledge change across a two-week workshop designed to provide high-school teachers with training in Java and Object Oriented Programming. Both before and after the workshop, teachers assigned relatedness ratings to pairs of key concepts regarding Java and Object Oriented Programming. Their…

  18. The Development of Computer-Based Piagetian Assessment Instruments.

    ERIC Educational Resources Information Center

    Barman, Charles R.

    1986-01-01

    Described are the development and validation of two computer-based Piagetian assessment instruments, designed to assist teachers in identifying cognitive reasoning patterns. Implications for teachers are presented. (Author/MT)

  19. International Computer and Information Literacy Study: Assessment Framework

    ERIC Educational Resources Information Center

    Fraillon, Julian; Schulz, Wolfram; Ainley, John

    2013-01-01

    The purpose of the International Computer and Information Literacy Study 2013 (ICILS 2013) is to investigate, in a range of countries, the ways in which young people are developing "computer and information literacy" (CIL) to support their capacity to participate in the digital age. To achieve this aim, the study will assess student…

  20. Need Assessment of Computer Science and Engineering Graduates

    ERIC Educational Resources Information Center

    Surakka, Sami; Malmi, Lauri

    2005-01-01

    This case study considered the syllabus of the first and second year studies in computer science. The aim of the study was to reveal which topics covered in the syllabi were really needed during the following years of study or in working life. The program that was assessed in the study was a Masters program in computer science and engineering at a…

  1. Geography Students Assess Their Learning Using Computer-Marked Tests.

    ERIC Educational Resources Information Center

    Hogg, Jim

    1997-01-01

    Reports on a pilot study designed to assess the potential of computer-marked tests for allowing students to monitor their learning. Students' answers to multiple choice tests were fed into a computer that provided a full analysis of their strengths and weaknesses. Students responded favorably to the feedback. (MJP)

  2. Assessing Existing Item Bank Depth for Computer Adaptive Testing.

    ERIC Educational Resources Information Center

    Bergstrom, Betty A.; Stahl, John A.

    This paper reports a method for assessing the adequacy of existing item banks for computer adaptive testing. The method takes into account content specifications, test length, and stopping rules, and can be used to determine if an existing item bank is adequate to administer a computer adaptive test efficiently across differing levels of examinee…
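    A minimal sketch of this kind of bank-depth check, under assumed conventions: the simulated item bank, the difficulty bands, and the `min_ratio` rule of thumb are all hypothetical stand-ins for the paper's content specifications, test length, and stopping rules.

```python
# Sketch: does each region of the difficulty scale hold enough items to
# sustain an adaptive test of fixed length? (All parameters invented.)
from collections import Counter
import random

random.seed(1)

# Simulated item bank: (difficulty, content_area). A real bank would use
# calibrated item parameters rather than random draws.
bank = [(random.gauss(0, 1), random.choice("ABC")) for _ in range(300)]

TEST_LENGTH = 30                      # items administered per examinee
BANDS = [(-3, -1), (-1, 1), (1, 3)]   # coarse difficulty bands

def band_depth(bank, bands):
    """Count items falling in each difficulty band."""
    counts = Counter()
    for difficulty, _area in bank:
        for lo, hi in bands:
            if lo <= difficulty < hi:
                counts[(lo, hi)] += 1
    return counts

def is_adequate(bank, bands, test_length, min_ratio=2.0):
    # Assumed rule of thumb: each band should hold at least
    # min_ratio * test_length items, so the adaptive algorithm is not
    # forced to over-expose or badly mistarget items.
    counts = band_depth(bank, bands)
    return all(counts[band] >= min_ratio * test_length for band in bands)

print(band_depth(bank, BANDS))
print(is_adequate(bank, BANDS, TEST_LENGTH))
```

    A normally distributed bank is thin in the tails, so a check like this typically flags the extreme-difficulty bands first, which is exactly the kind of inadequacy the method in the paper is designed to surface before a live adaptive administration.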

  3. Using Computer-Assisted Assessment Heuristics for Usability Evaluations

    ERIC Educational Resources Information Center

    Sim, Gavin; Read, Janet C.

    2016-01-01

    Teaching practices within educational institutions have evolved through the increased adoption of technology to deliver the curriculum and the use of computers for assessment purposes. For educational technologists, there is a vast array of commercial computer applications available for the delivery of objective tests, and in some instances,…

  4. Using Computer-Assisted Assessment Heuristics for Usability Evaluations

    ERIC Educational Resources Information Center

    Sim, Gavin; Read, Janet C.

    2016-01-01

    Teaching practices within educational institutions have evolved through the increased adoption of technology to deliver the curriculum and the use of computers for assessment purposes. For educational technologists, there is a vast array of commercial computer applications available for the delivery of objective tests, and in some instances,…

  5. Special Assessment Reports: A Critical Part of the Sustained National Climate Assessment Process

    NASA Astrophysics Data System (ADS)

    Lough, G. C.

    2015-12-01

    The U.S. Global Change Research program is conducting a sustained National Climate Assessment process that will ultimately facilitate continuous and transparent participation of scientists and stakeholders across regions and sectors, enabling new information and insights to be synthesized as they emerge. Focused scientific assessments, such as the Second State of the Carbon Cycle Report (SOCCR-2), are a critical component of the process, as they synthesize new science and provide decision makers with actionable information. These special assessments provide opportunities for scientific experts and decision makers to share knowledge about the climate-related issues, impacts, and potential response actions that are most important to a particular topic, region, or sector. The development process for USGCRP's sustained assessment products is also highly transparent in order to produce results that are credible, salient, and legitimate for both scientists and stakeholders. Ultimately, the rigorous and transparent process makes the results of scientific assessments of climate-related information extremely useful to decision makers.

  6. Marine proteomics: a critical assessment of an emerging technology.

    PubMed

    Slattery, Marc; Ankisetty, Sridevi; Corrales, Jone; Marsh-Hunkin, K Erica; Gochfeld, Deborah J; Willett, Kristine L; Rimoldi, John M

    2012-10-26

    The application of proteomics to marine sciences has increased in recent years because the proteome represents the interface between genotypic and phenotypic variability and, thus, corresponds to the broadest possible biomarker for eco-physiological responses and adaptations. Likewise, proteomics can provide important functional information regarding biosynthetic pathways, as well as insights into mechanism of action, of novel marine natural products. The goal of this review is to (1) explore the application of proteomics methodologies to marine systems, (2) assess the technical approaches that have been used, and (3) evaluate the pros and cons of this proteomic research, with the intent of providing a critical analysis of its future roles in marine sciences. To date, proteomics techniques have been utilized to investigate marine microbe, plant, invertebrate, and vertebrate physiology, developmental biology, seafood safety, susceptibility to disease, and responses to environmental change. However, marine proteomics studies often suffer from poor experimental design, sample processing/optimization difficulties, and data analysis/interpretation issues. Moreover, a major limitation is the lack of available annotated genomes and proteomes for most marine organisms, including several "model species". Even with these challenges in mind, there is no doubt that marine proteomics is a rapidly expanding and powerful integrative molecular research tool from which our knowledge of the marine environment, and the natural products from this resource, will be significantly expanded.

  7. Redefining second modernity for East Asia: a critical assessment.

    PubMed

    Han, Sang-Jin; Shim, Young-Hee

    2010-09-01

    The aim of this paper is to critically assess the extent to which the concept of second modernity and reflexive modernization proposed by Beck and Grande is relevant to East Asia. Concepts such as driving forces, human agency, objective-structural versus cultural-discursive dimensions, radicalizing versus deficiencies aspects of modernity, push versus pull factors are used to clarify the basic conditions of this historical transformation. Utilizing these conceptual schemes, this paper has advanced the following central claims: 1) Second modernity and reflexive modernization, as a global trend, affects East Asia as deeply as it does in the West, especially when we see this as a structurally conditioned historical transformation; 2) Global risks, as a driving force of second modernity, are more relevant in East Asia because, as a result of the side-effects of the rush-to development, East Asian countries face complex risks of far greater intensity than in the West; 3) The action-mediated pull factor of second-modern transformation in East Asia, expressed through the cultural-discursive articulation of collective desire and aspiration, differs significantly from the West. Consequently, the East Asian pathways to individualization display distinctive characteristics despite the common structural background where push factors operate; 4) East Asia also differs from the West in terms of the normative vision anchored in second modernity; 5) Nevertheless, concrete pathways to second modernity within East Asia differ from one country to another. © London School of Economics and Political Science 2010.

  8. Assessing Critical Thinking in Higher Education: The HEIghten™ Approach and Preliminary Validity Evidence

    ERIC Educational Resources Information Center

    Liu, Ou Lydia; Mao, Liyang; Frankel, Lois; Xu, Jun

    2016-01-01

    Critical thinking is a learning outcome highly valued by higher education institutions and the workforce. The Educational Testing Service (ETS) has designed a next generation assessment, the HEIghten™ critical thinking assessment, to measure students' critical thinking skills in analytical and synthetic dimensions. This paper introduces the…

  9. Assessing Critical Thinking in Higher Education: The HEIghten™ Approach and Preliminary Validity Evidence

    ERIC Educational Resources Information Center

    Liu, Ou Lydia; Mao, Liyang; Frankel, Lois; Xu, Jun

    2016-01-01

    Critical thinking is a learning outcome highly valued by higher education institutions and the workforce. The Educational Testing Service (ETS) has designed a next generation assessment, the HEIghten™ critical thinking assessment, to measure students' critical thinking skills in analytical and synthetic dimensions. This paper introduces the…

  10. The Effects of Using a Critical Thinking Scoring Rubric to Assess Undergraduate Students' Reading Skills

    ERIC Educational Resources Information Center

    Leist, Cathy W.; Woolwine, Mark A.; Bays, Cathy L.

    2012-01-01

    The purpose of this study was to investigate the use of a critical thinking rubric as an assessment of reading achievement for students enrolled in a reading intervention course. A reading prompt and scoring rubric, based on Richard Paul and Linda Elder's critical thinking framework, were created to assess critical reading in an intervention…

  11. A critical assessment of wind tunnel results for the NACA 0012 airfoil

    NASA Technical Reports Server (NTRS)

    Mccroskey, W. J.

    1987-01-01

    A large body of experimental results, obtained in more than 40 wind tunnels on a single, well-known two-dimensional configuration, has been critically examined and correlated. An assessment of some of the possible sources of error has been made for each facility, and data which are suspect have been identified. It was found that no single experiment provided a complete set of reliable data, although one investigation stands out as superior in many respects. However, from the aggregate of data the representative properties of the NACA 0012 airfoil can be identified with reasonable confidence over wide ranges of Mach number, Reynolds number, and angles of attack. This synthesized information can now be used to assess and validate existing and future wind tunnel results and to evaluate advanced Computational Fluid Dynamics codes.

  12. Review of Estelle and LOTOS with respect to critical computer applications

    NASA Technical Reports Server (NTRS)

    Bown, Rodney L.

    1991-01-01

    Man-rated NASA space vehicles seem to represent a set of ultimate critical computer applications. These applications require a high degree of security, integrity, and safety. A variety of formal and/or precise modeling techniques are becoming available for the designer of critical systems. The design phase of the software engineering life cycle includes the modification of non-development components. A review of the Estelle and LOTOS formal description languages is presented. Details of the languages and a set of references are provided. The languages were used to formally describe some of the Open System Interconnect (OSI) protocols.

  13. Is Model-Based Development a Favorable Approach for Complex and Safety-Critical Computer Systems on Commercial Aircraft?

    NASA Technical Reports Server (NTRS)

    Torres-Pomales, Wilfredo

    2014-01-01

    A system is safety-critical if its failure can endanger human life or cause significant damage to property or the environment. State-of-the-art computer systems on commercial aircraft are highly complex, software-intensive, functionally integrated, and network-centric systems of systems. Ensuring that such systems are safe and comply with existing safety regulations is costly and time-consuming as the level of rigor in the development process, especially the validation and verification activities, is determined by considerations of system complexity and safety criticality. A significant degree of care and deep insight into the operational principles of these systems is required to ensure adequate coverage of all design implications relevant to system safety. Model-based development methodologies, methods, tools, and techniques facilitate collaboration and enable the use of common design artifacts among groups dealing with different aspects of the development of a system. This paper examines the application of model-based development to complex and safety-critical aircraft computer systems. Benefits and detriments are identified and an overall assessment of the approach is given.

  14. Critical phenomena in communication/computation networks with various topologies and suboptimal to optimal resource allocation

    NASA Astrophysics Data System (ADS)

    Cogoni, Marco; Busonera, Giovanni; Anedda, Paolo; Zanetti, Gianluigi

    2015-01-01

    We generalize previous studies on critical phenomena in communication networks [1,2] by adding computational capabilities to the nodes. In our model, a set of tasks with random origin, destination, and computational structure is distributed over a computational network modeled as a graph. By varying the temperature of a Metropolis Monte Carlo optimization, we explore the global latency for optimal to suboptimal resource assignments at a given time instant. By computing the two-point correlation function for the local overload, we study the behavior of the correlation distance (for both links and nodes) while approaching the congested phase: a transition from a peaked to a spread g(r) is seen above a critical (Monte Carlo) temperature Tc. The average latency trend of the system is predicted by averaging over several network traffic realizations while maintaining spatially detailed information for each node: a sharp decrease in performance is found above Tc, independently of the workload. The globally optimized computational resource allocation and network routing define a baseline for a future comparison of the transition behavior with existing routing strategies [3,4] on different network topologies.
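    The Metropolis step described in the abstract can be sketched in a few lines. This is an illustrative toy model, not the authors' code: the cost function (sum of squared node loads, a simple congestion proxy) and all parameter names are assumptions for illustration. At very low temperature the walk accepts only non-worsening moves and approaches the optimal (even) assignment; at high temperature it samples suboptimal ones.

    ```python
    import math
    import random

    def metropolis_assign(num_tasks, num_nodes, temperature, steps, seed=0):
        """Metropolis Monte Carlo over task-to-node assignments.

        Cost is a congestion proxy: the sum of squared node loads,
        minimized by spreading tasks evenly across nodes.
        Returns (initial_cost, final_cost).
        """
        rng = random.Random(seed)
        assign = [rng.randrange(num_nodes) for _ in range(num_tasks)]
        loads = [0] * num_nodes
        for node in assign:
            loads[node] += 1
        cost = sum(l * l for l in loads)
        initial_cost = cost
        for _ in range(steps):
            task = rng.randrange(num_tasks)
            old, new = assign[task], rng.randrange(num_nodes)
            if old == new:
                continue
            # Moving one task changes only two node loads.
            delta = ((loads[new] + 1) ** 2 - loads[new] ** 2
                     + (loads[old] - 1) ** 2 - loads[old] ** 2)
            # Metropolis acceptance: always take improvements,
            # take worsening moves with probability exp(-delta/T).
            if delta <= 0 or rng.random() < math.exp(-delta / temperature):
                assign[task] = new
                loads[old] -= 1
                loads[new] += 1
                cost += delta
        return initial_cost, cost
    ```

    Sweeping `temperature` from low to high in such a sketch reproduces the qualitative picture of the study: a transition from near-optimal to suboptimal global assignments.
    
    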

  15. 78 FR 29375 - Protected Critical Infrastructure Information (PCII) Office Self-Assessment Questionnaire

    Federal Register 2010, 2011, 2012, 2013, 2014

    2013-05-20

    ... critical infrastructure information not customarily in the public domain and related to the security of... SECURITY Protected Critical Infrastructure Information (PCII) Office Self- Assessment Questionnaire AGENCY... Protection and Programs Directorate (NPPD), Office of Infrastructure Protection (IP), Infrastructure...

  16. Assessment of Computer Literacy of Nurses in Lesotho.

    PubMed

    Mugomeri, Eltony; Chatanga, Peter; Maibvise, Charles; Masitha, Matseliso

    2016-11-01

    Health systems worldwide are moving toward use of information technology to improve healthcare delivery. However, this requires basic computer skills. This study assessed the computer literacy of nurses in Lesotho using a cross-sectional quantitative approach. A structured questionnaire covering 32 standardized computer skills was distributed to 290 randomly selected nurses in Maseru District. Univariate and multivariate logistic regression analyses in Stata 13 were performed to identify factors associated with inadequate computer skills. Overall, 177 (61%) nurses scored below 16 of the 32 skills assessed. Finding hyperlinks on Web pages (63%), using advanced search parameters (60.2%), and downloading new software (60.1%) were the skills that challenged the largest proportions of nurses. Age, sex, year of obtaining the latest qualification, computer experience, and work experience were significantly (P < .05) associated with inadequate computer skills in univariate analysis. However, in multivariate analyses, sex (P = .001), year of obtaining the latest qualification (P = .011), and computer experience (P < .001) emerged as significant factors. The majority of nurses in Lesotho have inadequate computer skills, and this is significantly associated with many years having passed since the latest qualification, being female, and lack of exposure to computers. These factors should be considered when planning training curricula for nurses in Lesotho.
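    As a worked illustration of the kind of univariate association reported here, a binary factor (e.g., sex) versus inadequate computer skills can be summarized as an odds ratio with a Wald confidence interval. This is a generic sketch with made-up counts, not the study's data:

    ```python
    import math

    def odds_ratio_ci(a, b, c, d, z=1.96):
        """Odds ratio with a 95% Wald confidence interval from a 2x2 table.

        a, b = outcome present / absent in the exposed group
        c, d = outcome present / absent in the unexposed group
        """
        or_ = (a * d) / (b * c)
        # Standard error of log(OR) for the Wald interval.
        se = math.sqrt(1 / a + 1 / b + 1 / c + 1 / d)
        lo = math.exp(math.log(or_) - z * se)
        hi = math.exp(math.log(or_) + z * se)
        return or_, lo, hi
    ```

    A confidence interval excluding 1.0 corresponds to a significant association at the 5% level, mirroring the P < .05 criterion used in the study.
    
    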

  17. Perceptions of University Students regarding Computer Assisted Assessment

    ERIC Educational Resources Information Center

    Jamil, Mubashrah

    2012-01-01

    Computer assisted assessment (CAA) is a common technique of assessment in higher educational institutions in Western countries, but a relatively new concept for students and teachers in Pakistan. It was therefore interesting to investigate students' perceptions about CAA practices from different universities of Pakistan. Information was collected…

  18. eWorkbook: A Computer Aided Assessment System

    ERIC Educational Resources Information Center

    Costagliola, Gennaro; Ferrucci, Filomena; Fuccella, Vittorio; Oliveto, Rocco

    2007-01-01

    Computer aided assessment (CAA) tools are increasingly adopted in academic environments alongside other assessment methods. In this article, we present a CAA Web application, named eWorkbook, which can be used for evaluating a learner's knowledge by creating (the tutor) and taking (the learner) on-line tests based on multiple choice, multiple…

  19. Test Review: Computer-Based Reading Assessment Instrument (CRAI).

    ERIC Educational Resources Information Center

    Blanchard, Jay S.

    1987-01-01

    Evaluates the Computer-Based Reading Assessment Instrument (CRAI) as a test of reading proficiency. Notes strengths of the CRAI, including its use as a quick assessment of silent reading comprehension level, as well as problems with readability, content-specific word lists, and the lack of scoring features. (JC)

  20. Computer-Based Dynamic Assessment of Multidigit Multiplication.

    ERIC Educational Resources Information Center

    Gerber, Michael M.; And Others

    1994-01-01

    Design details, operation, and initial field test results are reported for DynaMath, a computer-based dynamic assessment system that provides individually tailored, instructionally useful assessment of students with disabilities. DynaMath organizes and outputs student performance data, graphically shows the "zone of proximal…

  1. Assessing Critical Thinking Performance of Postgraduate Students in Threaded Discussions

    ERIC Educational Resources Information Center

    Tan, Cheng Lee; Ng, Lee Luan

    2014-01-01

    Critical thinking has increasingly been seen as one of the important attributes where human capital is concerned and in line with this recognition, the tertiary educational institutions worldwide are putting more effort into designing courses that produce university leavers who are critical thinkers. This study aims to investigate the critical…

  2. Modelling Critical Thinking through Learning-Oriented Assessment

    ERIC Educational Resources Information Center

    Lombard, B. J. J.

    2008-01-01

    One of the cornerstones peculiar to the outcomes-based approach adopted by the South African education and training sector is the so-called "critical outcomes". Included in one of these outcomes is the ability to think critically. Although this outcome articulates well with the cognitive domain of holistic development, it also gives rise…

  3. What Do They Know? A Strategy for Assessing Critical Literacy

    ERIC Educational Resources Information Center

    Morrissette, Rhonda

    2007-01-01

    In this article, the author describes how difficult it is to know how critically literate her students are in the adult senior high school in which she is a teacher-librarian. She assumes that many would have gaps in their learning, including gaps in information and critical literacy skills, and that they were likely to approach all online…

  4. Developing Critical Thinking Skills: Assessing the Effectiveness of Workbook Exercises

    ERIC Educational Resources Information Center

    Wallace, Elise D.; Jefferson, Renee N.

    2015-01-01

    To address the challenge of developing critical thinking skills in college students, this empirical study examines the effectiveness of cognitive exercises in developing those skills. The study uses Critical Thinking: Building the Basics by Walter, Knudsvig, and Smith (2003). This workbook is specifically designed to exercise and develop critical…

  5. Criticism or praise? The impact of verbal versus text-only computer feedback on social presence, intrinsic motivation, and recall.

    PubMed

    Bracken, Cheryl Campanella; Jeffres, Leo W; Neuendorf, Kimberly A

    2004-06-01

    The Computers Are Social Actors (CASA) paradigm asserts that human computer users interact socially with computers, and has provided extensive evidence that this is the case. In this experiment (n = 134), participants received either praise or criticism from a computer. Independent variables were the direction of feedback (praise or criticism) and the voice channel (verbal or text-only). Dependent variables, measured via a computer-based questionnaire, were recall, perceived ability, intrinsic motivation, and perceptions of the computer as a social entity. Results demonstrate that participants reacted to the computers as predicted by interpersonal communication research, with participants who received text-only criticism reporting higher levels of intrinsic motivation, perceived ability, and recall. Additionally, the computer was seen as more intelligent. Implications for theory and application are discussed.

  6. Developing a Critical Lens among Preservice Teachers while Working within Mandated Performance-Based Assessment Systems

    ERIC Educational Resources Information Center

    Moss, Glenda

    2008-01-01

    This article addresses the dilemma of promoting critical pedagogy within portfolio assessment, which has been implemented in many teacher education programs to meet state and national mandates for performance-based assessment. It explores how one teacher educator works to move portfolio assessment to a level of critical self-reflection that…

  7. OECD/NEA expert group on uncertainty analysis for criticality safety assessment: Results of benchmark on sensitivity calculation (phase III)

    SciTech Connect

    Ivanova, T.; Laville, C.; Dyrda, J.; Mennerdahl, D.; Golovko, Y.; Raskach, K.; Tsiboulia, A.; Lee, G. S.; Woo, S. W.; Bidaud, A.; Sabouri, P.; Bledsoe, K.; Rearden, B.; Gulliford, J.; Michel-Sendis, F.

    2012-07-01

    The sensitivities of the k_eff eigenvalue to neutron cross sections have become commonly used in similarity studies and as part of the validation algorithm for criticality safety assessments. To test calculations of the sensitivity coefficients, a benchmark study (Phase III) has been established by the OECD-NEA/WPNCS/EG UACSA (Expert Group on Uncertainty Analysis for Criticality Safety Assessment). This paper presents some sensitivity results generated by the benchmark participants using various computational tools based upon different computational methods: SCALE/TSUNAMI-3D and -1D, MONK, APOLLO2-MORET 5, DRAGON-SUSD3D and MMKKENO. The study demonstrates the performance of the tools. It also illustrates how model simplifications impact the sensitivity results and demonstrates the importance of 'implicit' (self-shielding) sensitivities. This work has been a useful step towards verification of the existing and developed sensitivity analysis methods. (authors)
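    A first-order sensitivity coefficient of the kind benchmarked here, S = (σ/k)(∂k/∂σ), can be illustrated with a central finite difference on a toy one-group, infinite-medium model (k_inf = νΣ_f / Σ_a). This is a pedagogical sketch under stated assumptions, not the SCALE/TSUNAMI or MORET method, and it ignores the 'implicit' self-shielding terms the benchmark highlights:

    ```python
    def kinf(nu_sigma_f, sigma_a):
        """Infinite-medium multiplication factor: k_inf = nu*Sigma_f / Sigma_a."""
        return nu_sigma_f / sigma_a

    def sensitivity(f, params, name, rel_step=1e-6):
        """Relative sensitivity S = (x/k) * dk/dx via central difference."""
        x = params[name]
        up = dict(params)
        up[name] = x * (1 + rel_step)
        dn = dict(params)
        dn[name] = x * (1 - rel_step)
        k0 = f(**params)
        dkdx = (f(**up) - f(**dn)) / (2 * x * rel_step)
        return (x / k0) * dkdx
    ```

    For this toy model the answers are known analytically (S = +1 for the fission term, S = -1 for absorption), which makes it a convenient check on the differencing machinery.
    
    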

  8. Computer Simulations to Support Science Instruction and Learning: A critical review of the literature

    NASA Astrophysics Data System (ADS)

    Smetana, Lara Kathleen; Bell, Randy L.

    2012-06-01

    Researchers have explored the effectiveness of computer simulations for supporting science teaching and learning during the past four decades. The purpose of this paper is to provide a comprehensive, critical review of the literature on the impact of computer simulations on science teaching and learning, with the goal of summarizing what is currently known and providing guidance for future research. We report on the outcomes of 61 empirical studies dealing with the efficacy of, and implications for, computer simulations in science instruction. The overall findings suggest that simulations can be as effective, and in many ways more effective, than traditional (i.e. lecture-based, textbook-based and/or physical hands-on) instructional practices in promoting science content knowledge, developing process skills, and facilitating conceptual change. As with any other educational tool, the effectiveness of computer simulations is dependent upon the ways in which they are used. Thus, we outline specific research-based guidelines for best practice. Computer simulations are most effective when they (a) are used as supplements; (b) incorporate high-quality support structures; (c) encourage student reflection; and (d) promote cognitive dissonance. Used appropriately, computer simulations involve students in inquiry-based, authentic science explorations. Additionally, as educational technologies continue to evolve, advantages such as flexibility, safety, and efficiency deserve attention.

  9. Portfolios Plus: A Critical Guide to Alternative Assessment.

    ERIC Educational Resources Information Center

    Mabry, Linda

    This book explains some basic assumptions that underlie different assessment systems, some connections between education and assessment, and some assessment options that have gone unrecognized. The discussion serves as a guide to designing a custom assessment program individualized to fit the students, school, and community. Part 2 contains…

  10. Critical Assessment Issues in Work-Integrated Learning

    ERIC Educational Resources Information Center

    Ferns, Sonia; Zegwaard, Karsten E.

    2014-01-01

    Assessment has long been a contentious issue in work-integrated learning (WIL) and cooperative education. Despite assessment being central to the integrity and accountability of a university and long-standing theories around best practice in assessment, enacting quality assessment practices has proven to be more difficult. Authors in this special…

  12. Providing Formative Feedback From a Summative Computer-aided Assessment

    PubMed Central

    Sewell, Robert D. E.

    2007-01-01

    Objectives: To examine the effectiveness of providing formative feedback for summative computer-aided assessment. Design: Two groups of first-year undergraduate life science students in pharmacy and neuroscience who were studying an e-learning package in a common pharmacology module were presented with a computer-based summative assessment. A sheet with individualized feedback derived from each of the 5 results sections of the assessment was provided to each student. Students were asked via a questionnaire to evaluate the form and method of feedback. Assessment: The students were able to reflect on their performance and use the feedback provided to guide their future study or revision. There was no significant difference between the responses from pharmacy and neuroscience students. Students' responses on the questionnaire indicated a generally positive reaction to this form of feedback. Conclusions: Findings suggest that additional formative assessment conveyed by this style and method would be appreciated and valued by students. PMID:17533442

  13. Does Computer-Based Motor Skill Assessment Training Transfer to Live Assessing?

    ERIC Educational Resources Information Center

    Kelly, Luke E.; Taliaferro, Andrea; Krause, Jennifer

    2012-01-01

    Developing competency in motor skill assessment has been identified as a critical need in physical educator preparation. We conducted this study to evaluate (a) the effectiveness of a web-based instructional program--Motor Skill Assessment Program (MSAP)--for developing assessment competency, and specifically (b) whether competency developed by…

  15. Assessment of examinations in computer science doctoral education

    NASA Astrophysics Data System (ADS)

    Straub, Jeremy

    2014-01-01

    This article surveys the examination requirements for attaining degree candidate (candidacy) status in computer science doctoral programs at all of the computer science doctoral granting institutions in the United States. It presents a framework for categorizing program examination requirements, and categorizes these programs by the type or types of candidacy examinations that are required. The performance of computer science departments, estimated via two common surrogate metrics, is compared and contrasted across these categories of candidacy requirements, and the correlation between candidacy requirements and program/department performance is assessed.

  16. Void Fraction and Critical Power Assessment of CORETRAN-01/VIPRE-02

    SciTech Connect

    Aounallah, Yacine

    2004-02-15

    CORETRAN-01 is the Electric Power Research Institute core analysis computer program that couples the neutronic code ARROTTA to the thermal-hydraulic code VIPRE-02 to achieve an integrated three-dimensional representation of the core for both steady-state and transient applications. The thermal-hydraulic module VIPRE-02, the two-fluid version of the one-fluid code VIPRE-01, has been the object of relatively few assessment studies, and the work presented seeks to reduce this lacuna. The priority has been given to the assessment of the void fraction prediction due to the importance of the void feedback on the core power generation. The assessment data are experimental void fractions obtained from X- and gamma-ray attenuation techniques applied at assembly-averaged as well as subchannel level for both steady-state and transient conditions. These experiments are part of the NUPEC (Japan) program where full-scale boiling water reactor (BWR) assemblies of different types, including assemblies with part-length rods, and pressurized water reactor subassemblies were tested at nominal reactor operating conditions, as well as for a range of flow rates and pressures. Generally, the code performance ranged from adequate to good, except for configurations exhibiting a strong gradient in power-to-flow ratio. Critical power predictions have also been assessed and code limitations identified, based on measurements on full-scale BWR 8 x 8 and high-burnup assemblies operated over a range of thermal-hydraulic conditions.

  17. Validation of a computer based system for assessing dietary intake.

    PubMed Central

    Levine, J A; Madden, A M; Morgan, M Y

    1987-01-01

    Dietary intake was assessed in 50 patients in hospital by using a dietary history method and computer based system for data collection and standard food tables to calculate the composition of nutrients. The results were compared with those from a weighed assessment that was calculated by using both food tables and manufacturers' food analyses. The use of the food tables overestimated mean (SEM) individual nutrient intakes by between 2.5% (1.5%) and 15.5% (3.0%). The mean errors associated with the dietary history assessment varied from -23% (7.8%) for fat intake to +21.4% (8.5%) for carbohydrate intake. Overall, 30% of the assessments of total nutrient intakes that were calculated using this method were within -20% to +20% of actual values; 18% were within -10% to +10%. The mean errors associated with the computer based assessment varied from -1.0% (4.3%) for carbohydrate intake to +8.5% (3.4%) for protein intake. Overall, 56% of the assessments of total nutrient intakes were within -20% to +20% of actual intakes; 31% were within -10% to +10%. The computer based system provides an accurate, reproducible, convenient, and inexpensive method for assessing dietary intake. PMID:3115455
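    The error statistics used in this validation reduce to a simple computation: the per-nutrient percentage error of an assessment against the weighed ("actual") intake, and the fraction of assessments falling within a ±20% band. A minimal sketch with invented numbers, not the study's data:

    ```python
    def percent_errors(assessed, weighed):
        """Per-nutrient percentage error of an assessment against the
        weighed ('actual') intake: 100 * (assessed - actual) / actual."""
        return {k: 100.0 * (assessed[k] - weighed[k]) / weighed[k]
                for k in weighed}

    def within_band(errors, band=20.0):
        """Fraction of assessments whose error falls within +/- band percent."""
        vals = list(errors.values())
        return sum(abs(e) <= band for e in vals) / len(vals)
    ```

    Applying `within_band` with bands of 20% and 10% reproduces the style of summary quoted in the abstract (e.g., "56% of the assessments ... were within -20% to +20%").
    
    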

  18. A review of literature and computer models on exposure assessment.

    PubMed

    Butt, T E; Clark, M; Coulon, F; Oduyemi, K O K

    2009-12-14

    At the present time, risk analysis is an effective management tool used by environmental managers to protect the environment from inevitable anthropogenic activities. There are generic elements in environmental risk assessments, which are independent of the subject to which risk analysis is applied. Examples of these elements are: baseline study, hazard identification, hazards' concentration assessment and risk quantification. Another important example of such generic elements is exposure assessment, which is required in a risk analysis process for landfill leachate as it would in any other environmental risk issue. Furthermore, computer models are also being developed to assist risk analysis in different fields. However, in the review of current computer models and literature, particularly regarding landfills, the authors have found no evidence for the existence of a holistic exposure assessment procedure underpinned with a computational method for landfill leachate. This paper, with reference to the relevant literature and models reviewed, discusses the extent to which exposure assessment is absent in landfill risk assessment approaches. The study also indicates a number of factors and features that should be added to the exposure assessment system in order to render it more strategic, thereby enhancing the quantitative risk analysis.

  19. An Assessment of Post-Professional Athletic Training Students' Critical Thinking Skills and Dispositions

    ERIC Educational Resources Information Center

    Walter, Jessica Marie

    2013-01-01

    The need for outcome measures in critical thinking skills and dispositions for post-professional athletic training programs (PPATPs) is significant. It has been suggested that athletic trainers who are competent and disposed towards thinking critically will be successful in the profession. The purpose of this study is to assess critical thinking…

  1. Critical Thinking and Political Participation: Development and Assessment of a Causal Model.

    ERIC Educational Resources Information Center

    Guyton, Edith M.

    1988-01-01

    This study assessed a model of the relationship between critical thinking and political participation. Findings indicated that critical thinking has indirect positive effects on orientations toward political participation, that critical thinking positively affects personal control, political efficacy, and democratic attitude, and that personal…

  3. Comparison of two pain assessment tools in nonverbal critical care patients.

    PubMed

    Paulson-Conger, Melissa; Leske, Jane; Maidl, Carolyn; Hanson, Andrew; Dziadulewicz, Laurel

    2011-12-01

    It is recommended that a patient's self-report of pain be obtained as often as possible, as the "gold standard." Unfortunately, in critical care many factors can alter verbal communication with patients, making pain assessment more difficult. Scientific advances in understanding pain mechanisms, multidimensional methods of pain assessment, and analgesic pharmacology have improved pain management strategies. However, pain assessment for nonverbal patients in critical care continues to present a challenge for clinicians and researchers. The purpose of this study was to compare the Pain Assessment in Advanced Dementia (PAINAD) and the Critical-Care Pain Observation Tool (CPOT) scores for assessment in nonverbal critical care patients. A descriptive, comparative, prospective design was used. A convenience sample of 100 nonverbal adult critical care patients with varying medical diagnoses who required pain evaluation was assessed with the PAINAD and CPOT scales, applied successively for each observation. Data were collected over a 6-month period in all critical care areas. Internal consistency reliability was 0.80 for the PAINAD and 0.72 for the CPOT. Limits of agreement indicated no difference between PAINAD and CPOT scores for assessing pain in nonverbal patients in critical care. Further research in the area of pain assessment for nonverbal patients in critical care is needed. Copyright © 2011 American Society for Pain Management Nursing. Published by Elsevier Inc. All rights reserved.
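    The "limits of agreement" analysis referred to here is the Bland-Altman method: the mean difference between paired PAINAD and CPOT scores plus or minus 1.96 standard deviations of the differences. A minimal sketch with illustrative scores, not the study's data:

    ```python
    import statistics

    def limits_of_agreement(scores_a, scores_b):
        """Bland-Altman 95% limits of agreement between two scales:
        mean difference +/- 1.96 * SD of the paired differences."""
        diffs = [a - b for a, b in zip(scores_a, scores_b)]
        bias = statistics.mean(diffs)
        sd = statistics.stdev(diffs)  # sample standard deviation
        return bias - 1.96 * sd, bias + 1.96 * sd
    ```

    If the interval comfortably straddles zero and is narrow relative to the scale range, the two tools can be treated as interchangeable for the population studied.
    
    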

  4. Assessment and treatment of hyperglycemia in critically ill patients

    PubMed Central

    Viana, Marina Verçoza; Moraes, Rafael Barberena; Fabbrin, Amanda Rodrigues; Santos, Manoella Freitas; Gerchman, Fernando

    2014-01-01

    Hyperglycemia is a commonly encountered issue in critically ill patients in the intensive care setting. The presence of hyperglycemia is associated with increased morbidity and mortality, regardless of the reason for admission (e.g., acute myocardial infarction, status post-cardiovascular surgery, stroke, sepsis). However, the pathophysiology and, in particular, the treatment of hyperglycemia in the critically ill patient remain controversial. In clinical practice, several aspects must be taken into account in the management of these patients, including blood glucose targets, history of diabetes mellitus, the route of nutrition (enteral or parenteral), and available monitoring equipment, which substantially increases the workload of providers involved in the patients' care. This review describes the epidemiology, pathophysiology, management, and monitoring of hyperglycemia in the critically ill adult patient. PMID:24770692

  5. Computation of cross sections and dose conversion factors for criticality accident dosimetry.

    PubMed

    Devine, R T

    2004-01-01

    In the application of criticality accident dosemeters the cross sections and fluence-to-dose conversion factors have to be computed. The cross section and fluence-to-dose conversion factor for the thermal and epi-thermal contributions to neutron dose are well documented; for higher energy regions (>100 keV) these depend on the spectrum assumed. Fluence is determined using threshold detectors. The cross sections require the folding of an expected spectrum with the reaction cross sections. The fluence-to-dose conversion factors also require a similar computation. The true and effective thresholds are used to include the information on the expected spectrum. The spectra can either be taken from compendia or measured at the facility at which the exposures are to be expected. The cross sections can be taken from data computations or analytic representations and the fluence-to-dose conversion factors are determined by various standards making bodies. The problem remaining is the method of computation. The purpose of this paper is to compare two methods for computing these factors: analytic and Monte Carlo.
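    The "folding" of an expected spectrum with reaction cross sections described above amounts to a fluence-weighted group average. A minimal group-wise sketch (hypothetical three-group numbers, not taken from any compendium):

    ```python
    def spectrum_averaged_xs(fluence, xs):
        """Effective cross section from folding a group-wise neutron
        spectrum with group-wise reaction cross sections:
        sigma_eff = sum(phi_g * sigma_g) / sum(phi_g)."""
        num = sum(phi * sigma for phi, sigma in zip(fluence, xs))
        return num / sum(fluence)
    ```

    The same weighted-average structure applies to fluence-to-dose conversion factors, with the conversion coefficient for each group taking the place of the cross section.
    
    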

  6. Assessing Reliability: Critical Corrections for a Critical Examination of the Rorschach Comprehensive System.

    ERIC Educational Resources Information Center

    Meyer, Gregory J.

    1997-01-01

    In reply to criticism of the Rorschach Comprehensive System (CS) by J. Wood, M. Nezworski, and W. Stejskal (1996), this article presents a meta-analysis of published data indicating that the CS has excellent chance-corrected interrater reliability. It is noted that the erroneous assumptions of Wood et al. make their assertions about validity…

  7. Risk Assessment Methodology for Protecting Our Critical Physical Infrastructures

    SciTech Connect

    BIRINGER,BETTY E.; DANNEELS,JEFFREY J.

    2000-12-13

    Critical infrastructures are central to our national defense and our economic well-being, but many are taken for granted. Presidential Decision Directive (PDD) 63 highlights the importance of eight of our critical infrastructures and outlines a plan for action. Greatly enhanced physical security systems will be required to protect these national assets from new and emerging threats. Sandia National Laboratories has been the lead laboratory for the Department of Energy (DOE) in developing and deploying physical security systems for the past twenty-five years. Many of the tools, processes, and systems employed in the protection of high consequence facilities can be adapted to the civilian infrastructure.

  8. Using student writing assignments to assess critical thinking skills: a holistic approach.

    PubMed

    Niedringhaus, L K

    2001-04-01

    This work offers an example of one school's holistic approach to the evaluation of critical thinking by using student writing assignments. Faculty developed tools to assess achievement of critical thinking competencies, such as analysis, synthesis, insight, reflection, open mindedness, and depth, breadth, and appropriateness of clinical interventions. Faculty created a model for the development of program-specific critical thinking competencies, selected appropriate writing assignments that demonstrate critical thinking, and implemented a holistic assessment plan for data collection and analysis. Holistic assessment involves the identification of shared values and practices, and the use of concepts and language important to nursing.

  9. Assessing Program Impact with the Critical Incident Technique

    ERIC Educational Resources Information Center

    O'Neill, Barbara

    2013-01-01

    The critical incident technique (CIT) is a qualitative research method where subjects are encouraged to tell personal stories that provide descriptive data. Researchers who use the CIT employ a structured methodology to encourage respondents to share their experiences regarding a particular topic. Incidents are considered effective/successful when…

  10. Assess the Critical Period Hypothesis in Second Language Acquisition

    ERIC Educational Resources Information Center

    Du, Lihong

    2010-01-01

    The Critical Period Hypothesis aims to investigate the reason for significant difference between first language acquisition and second language acquisition. Over the past few decades, researchers carried out a series of studies to test the validity of the hypothesis. Although there were certain limitations in these studies, most of their results…

  11. Teaching in the Zone: Formative Assessments for Critical Thinking

    ERIC Educational Resources Information Center

    Maniotes, Leslie K.

    2010-01-01

    This article discusses how a school librarian can help students improve their critical thinking and strengthen their higher order thinking skills through the inquiry process. First, it will use a Guided Inquiry approach to examine how higher order thinking skills are taught within an inquiry paradigm. Next, it will consider how formative…

  12. Assessment of the adequacy of a criticality incident detection system

    SciTech Connect

    Cartwright, C.M.; Finnerty, M.D.

    1993-12-31

    The primary purpose of a criticality incident detection (CID) and alarm system is to minimize, by means of building evacuation, the radiation doses received by plant personnel. The adequacy of a CID system installed in a nuclear plant in the UK was investigated. Results are described.

  13. Conceptualising, Developing and Assessing Critical Thinking in Law

    ERIC Educational Resources Information Center

    James, Nickolas; Hughes, Clair; Cappa, Clare

    2010-01-01

    "Critical thinking" is commonly included in the lists of graduate attributes (GAs), which all Australian universities are now required to develop and implement. That efforts to do so have met with limited success is due to a range of factors including inconsistent or naive conceptualisations, the failure to explicitly develop or assess…

  16. Assessment of Prospective Teachers' Views Regarding the Concept of Criticism

    ERIC Educational Resources Information Center

    Karakus, Neslihan

    2015-01-01

    Critical thinking is one of the skills that exist in the Turkish course curriculum and is aimed to be acquired by students. The objective of the study is to determine prospective Turkish teachers' perspectives regarding the concept of criticism, which is both a mental exercise and plays an important role in the world of ideas. In order to assess…

  17. Transfer matrix computation of critical polynomials for two-dimensional Potts models

    DOE PAGES

    Jacobsen, Jesper Lykke; Scullard, Christian R.

    2013-02-04

    We showed, in our previous work, that critical manifolds of the q-state Potts model can be studied by means of a graph polynomial PB(q, v), henceforth referred to as the critical polynomial. This polynomial may be defined on any periodic two-dimensional lattice. It depends on a finite subgraph B, called the basis, and the manner in which B is tiled to construct the lattice. The real roots v = e^K - 1 of PB(q, v) either give the exact critical points for the lattice, or provide approximations that, in principle, can be made arbitrarily accurate by increasing the size of B in an appropriate way. In earlier work, PB(q, v) was defined by a contraction-deletion identity, similar to that satisfied by the Tutte polynomial. Here, we give a probabilistic definition of PB(q, v), which facilitates its computation, using the transfer matrix, on much larger B than was previously possible. We present results for the critical polynomial on the (4, 8^2), kagome, and (3, 12^2) lattices for bases of up to respectively 96, 162, and 243 edges, compared to the limit of 36 edges with contraction-deletion. We discuss in detail the role of the symmetries and the embedding of B. The critical temperatures v_c obtained for ferromagnetic (v > 0) Potts models are at least as precise as the best available results from Monte Carlo simulations or series expansions. For instance, with q = 3 we obtain v_c(4, 8^2) = 3.742 489 (4), v_c(kagome) = 1.876 459 7 (2), and v_c(3, 12^2) = 5.033 078 49 (4), the precision being comparable or superior to the best simulation results. More generally, we trace the critical manifolds in the real (q, v) plane and discuss the intricate structure of the phase diagram in the antiferromagnetic (v < 0) region.

  18. Transfer matrix computation of critical polynomials for two-dimensional Potts models

    NASA Astrophysics Data System (ADS)

    Lykke Jacobsen, Jesper; Scullard, Christian R.

    2013-02-01

    In our previous work [1] we have shown that critical manifolds of the q-state Potts model can be studied by means of a graph polynomial PB(q, v), henceforth referred to as the critical polynomial. This polynomial may be defined on any periodic two-dimensional lattice. It depends on a finite subgraph B, called the basis, and the manner in which B is tiled to construct the lattice. The real roots v = e^K - 1 of PB(q, v) either give the exact critical points for the lattice, or provide approximations that, in principle, can be made arbitrarily accurate by increasing the size of B in an appropriate way. In earlier work, PB(q, v) was defined by a contraction-deletion identity, similar to that satisfied by the Tutte polynomial. Here, we give a probabilistic definition of PB(q, v), which facilitates its computation, using the transfer matrix, on much larger B than was previously possible. We present results for the critical polynomial on the (4, 8^2), kagome, and (3, 12^2) lattices for bases of up to respectively 96, 162, and 243 edges, compared to the limit of 36 edges with contraction-deletion. We discuss in detail the role of the symmetries and the embedding of B. The critical temperatures v_c obtained for ferromagnetic (v > 0) Potts models are at least as precise as the best available results from Monte Carlo simulations or series expansions. For instance, with q = 3 we obtain v_c(4, 8^2) = 3.742 489 (4), v_c(kagome) = 1.876 459 7 (2), and v_c(3, 12^2) = 5.033 078 49 (4), the precision being comparable or superior to the best simulation results. More generally, we trace the critical manifolds in the real (q, v) plane and discuss the intricate structure of the phase diagram in the antiferromagnetic (v < 0) region.
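
As a toy illustration of the critical-polynomial idea (not the transfer-matrix method of these papers): for the square lattice, the smallest basis yields P(q, v) = v^2 - q, whose positive root is the exact ferromagnetic critical point v_c = sqrt(q) by self-duality, and the critical coupling follows from the change of variables v = e^K - 1. A minimal sketch:

```python
import math

def critical_v_square_lattice(q):
    """Positive root of the square-lattice critical polynomial
    P(q, v) = v**2 - q: the exact ferromagnetic critical point
    v_c = sqrt(q), a consequence of self-duality."""
    return math.sqrt(q)

def critical_coupling(q):
    """Critical coupling K_c obtained from v = exp(K) - 1."""
    return math.log(1.0 + critical_v_square_lattice(q))

print(critical_coupling(2))  # Ising case: ln(1 + sqrt(2)), about 0.8814
```

For q = 2 (Ising) this reproduces the familiar K_c = ln(1 + sqrt(2)); larger bases, as in the abstracts above, are needed for lattices without an exact solution.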

  19. Workplace Educators' Interpretations of Their Assessment Practices: A View through a Critical Practice Lens

    ERIC Educational Resources Information Center

    Trede, Franziska; Smith, Megan

    2014-01-01

    In this paper, we examine workplace educators' interpretations of their assessment practices. We draw on a critical practice lens to conceptualise assessment practice as a social, relational and situated practice that becomes critical through critique and emancipation. We conducted semi-structured interviews followed by roundtable discussions with…

  20. Full-Cycle Assessment of Critical Thinking in an Ethics and Science Course

    ERIC Educational Resources Information Center

    Blue, Jennifer; Taylor, Beverley; Yarrison-Rice, Jan

    2008-01-01

    Enhancing critical thinking skills for undergraduate students is important across the curriculum and between disciplines. We report on a method of improving critical thinking skills, which was studied through an Ethics and Science First-Year Seminar course. We used full cycle assessment over a three-year period to assess students' development and…

  2. The Halpern Critical Thinking Assessment and Real-World Outcomes: Cross-National Applications

    ERIC Educational Resources Information Center

    Butler, Heather A.; Dwyer, Christopher P.; Hogan, Michael J.; Franco, Amanda; Rivas, Silvia F.; Saiz, Carlos; Almeida, Leandro S.

    2012-01-01

    The Halpern Critical Thinking Assessment (HCTA) is a reliable measure of critical thinking that has been validated with numerous qualitatively different samples and measures of academic success (Halpern, 2010a). This paper presents several cross-national applications of the assessment, and recent work to expand the validation of the HCTA with…

  3. Using a Client Memo to Assess Critical Thinking of Finance Majors

    ERIC Educational Resources Information Center

    Carrithers, David; Bean, John C.

    2008-01-01

    This article describes a holistic, discourse-based method for assessing the critical thinking skills of undergraduate senior-level finance majors. Rejecting a psychometric assessment approach in which component features of critical thinking are disaggregated, this study is based on a holistic scoring of student memos. Students were asked to…

  5. Moving beyond Assessment to Improving Students' Critical Thinking Skills: A Model for Implementing Change

    ERIC Educational Resources Information Center

    Haynes, Ada; Lisic, Elizabeth; Goltz, Michele; Stein, Barry; Harris, Kevin

    2016-01-01

    This research examines how the use of the CAT (Critical thinking Assessment Test) and involvement in CAT-Apps (CAT Applications within the discipline) training can serve as an important part of a faculty development model that assists faculty in the assessment of students' critical thinking skills and in the development of these skills within…

  6. Computer-aided modeling of beam propagation effects in diffraction-critical spaceborne instruments

    NASA Astrophysics Data System (ADS)

    Caldwell, Martin E.; Gray, Peter F.; McNamara, Paul

    1996-08-01

    This talk concerns applications of a ray-trace model to the computation of the effect of diffraction on beam propagation. It reports the use of the technique in the design of apertures for space-borne instruments having critical diffraction properties. The modeling technique used is that of gaussian beam decomposition, a numerical beam propagation technique incorporated in a commercially available ray-trace program. The result is the powerful capability to model the optical field at any point, in systems of any geometry, with any amount of aberration. The technique is particularly useful for design problems where `non-imaging' effects are important, and examples of its use will be given. Although the computation requirements for such detailed analysis may seem daunting, the continuing increase in readily available computing power is now overcoming this drawback. The application here is to certain `diffraction-critical' situations, where the design of correctly sized apertures is needed for the control of unwanted diffraction effects. Three recent design studies are illustrated: (1) Millimeter wave imaging with off-axis reflectors. Analysis of the effects of aberration on coherent detection efficiency. (2) Long-distance beam propagation in space-borne laser interferometry. This involves the analysis of coherent detection efficiency in the presence of aberrated gaussian beams. (3) Design of a Lyot stop system for an infra-red radiometer which is to view the Earth's limb from space. Here the critical (and unwanted) diffraction is that from the bright Earth disc, lying just outside of the instrument field of view. The analysis technique is explained, and examples are given of diffracted energy patterns analyzed at progressive stages in the system. It is shown how these aid the design and analysis of the systems. The aim is to show the range of problems in which this method is useful, and to learn from others at the conference about other cases where such techniques…
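
For orientation, the free-space propagation law for a single fundamental gaussian beamlet, which underlies decomposition methods of this kind, is w(z) = w0*sqrt(1 + (z/zR)^2) with Rayleigh range zR = pi*w0^2/lambda. The numbers below are illustrative, not drawn from the study:

```python
import math

def spot_size(z, w0, wavelength):
    """Gaussian-beam radius after free-space propagation over distance z:
    w(z) = w0 * sqrt(1 + (z / zR)**2), Rayleigh range zR = pi * w0**2 / wavelength."""
    z_r = math.pi * w0 ** 2 / wavelength
    return w0 * math.sqrt(1.0 + (z / z_r) ** 2)

# Illustrative: 1 mm waist, 1064 nm laser, after 100 m of propagation.
w = spot_size(100.0, 1e-3, 1064e-9)
print(w)  # beam has spread to a few centimetres
```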

  7. Computer Applications for Alternative Assessment: An Instructional and Organization Dilemma.

    ERIC Educational Resources Information Center

    Mills, Ed; Brown, John A.

    1997-01-01

    Describes the possibilities and problems that computer-generated portfolios will soon present to instructors across America. Highlights include the history of portfolio assessment, logistical problems of handling portfolios in the traditional three-ring binder format, use of the zip drive for storage, and software/hardware compatibility problems.…

  8. How Effective Is Feedback in Computer-Aided Assessments?

    ERIC Educational Resources Information Center

    Gill, Mundeep; Greenhow, Martin

    2008-01-01

    Computer-Aided Assessments (CAAs) have been used increasingly at Brunel University for over 10 years to test students' mathematical abilities. Recently, we have focussed on providing very rich feedback to the students; given the work involved in designing and coding such feedback, it is important to study the impact of the interaction between…

  9. Computer Technology for Nursing Staff Learning Need Assessment.

    ERIC Educational Resources Information Center

    Forte, Paula S.

    1984-01-01

    The advantages of using a computer to analyze needs assessment data for continuing education are that (1) it allows the expression of organizational needs, (2) it enables all learners to declare their own needs, and (3) it provides rapid access to large amounts of information. (SK)

  10. Using Computer Simulations to Assess Hands-On Science Learning.

    ERIC Educational Resources Information Center

    Baxter, Gail P.

    1995-01-01

    Two methods of assessing student learning of a hands-on instructional unit are compared. One method involves manipulation of concrete materials, and the other method involves manipulation of icons on a computer to solve an electric circuits problem. Sixth-grade students in an inquiry-based science program completed both assignments. (LZ)

  11. The Use of Computers in Social Work Practice: An Assessment.

    ERIC Educational Resources Information Center

    Miller, Henry

    1986-01-01

    The potential use of computers in social work education and practice is discussed. Possibilities are emerging in regard to case management, diagnosis and assessment, and even treatment. The bottleneck is no longer expensive hardware but the development of usable and relevant software and courseware. (Author/MH)

  14. Assessing the use of computers in industrial occupational health departments.

    PubMed

    Owen, J P

    1995-04-01

    Computers are widely used in business and industry, and the benefits of computerizing occupational health (OH) departments have been advocated by several authors. The requirements for successful computerization of an OH department are reviewed. Having identified the theoretical benefits, the author assesses the real picture in industry by surveying 52 firms with over 1000 employees in a large urban area. Only 15 (29%) of the companies reported having any OH service, of which six used computers in the OH department, reflecting the business priorities of most of the companies. The types of software systems used and their main uses are examined, along with perceived benefits or disadvantages. With the decreasing costs of computers and increasingly 'user-friendly' software, there is a real cost benefit to be gained from using computers in OH departments, although the concept may have to be 'sold' to management.

  15. The use of computers for perioperative simulation in anesthesia, critical care, and pain medicine.

    PubMed

    Lambden, Simon; Martin, Bruce

    2011-09-01

    Simulation in perioperative anesthesia training is a field of considerable interest, with an urgent need for tools that reliably train and facilitate objective assessment of performance. This article reviews the available simulation technologies, their evolution, and the current evidence base for their use. The future directions for research in the field and potential applications of simulation technology in anesthesia, critical care, and pain medicine are discussed. Copyright © 2011 Elsevier Inc. All rights reserved.

  16. Computer-aided testing of pilot response to critical in-flight events

    NASA Technical Reports Server (NTRS)

    Giffin, W. C.; Rockwell, T. H.

    1984-01-01

    This research on pilot response to critical in-flight events employs a unique methodology including an interactive computer-aided scenario-testing system. Navigation displays, instrument-panel displays, and assorted textual material are presented on a touch-sensitive CRT screen. Problem diagnosis scenarios, destination-diversion scenarios and combined destination/diagnostic tests are available. A complete time history of all data inquiries and responses is maintained. Sample results of diagnosis scenarios obtained from testing 38 licensed pilots are presented and discussed.

  18. Critical issues in state-of-the-art brain-computer interface signal processing.

    PubMed

    Krusienski, Dean J; Grosse-Wentrup, Moritz; Galán, Ferran; Coyle, Damien; Miller, Kai J; Forney, Elliott; Anderson, Charles W

    2011-04-01

    This paper reviews several critical issues facing signal processing for brain-computer interfaces (BCIs) and suggests several recent approaches that should be further examined. The topics were selected based on discussions held during the 4th International BCI Meeting at a workshop organized to review and evaluate the current state of, and issues relevant to, feature extraction and translation of field potentials for BCIs. The topics presented in this paper include the relationship between electroencephalography and electrocorticography, novel features for performance prediction, time-embedded signal representations, phase information, signal non-stationarity, and unsupervised adaptation.

  19. Critical issues in state-of-the-art brain–computer interface signal processing

    PubMed Central

    Krusienski, Dean J.; Grosse-Wentrup, Moritz; Galán, Ferran; Coyle, Damien; Miller, Kai J.; Forney, Elliott; Anderson, Charles W.

    2012-01-01

    This paper reviews several critical issues facing signal processing for brain–computer interfaces (BCIs) and suggests several recent approaches that should be further examined. The topics were selected based on discussions held during the 4th International BCI Meeting at a workshop organized to review and evaluate the current state of, and issues relevant to, feature extraction and translation of field potentials for BCIs. The topics presented in this paper include the relationship between electroencephalography and electrocorticography, novel features for performance prediction, time-embedded signal representations, phase information, signal non-stationarity, and unsupervised adaptation. PMID:21436519
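
Of the topics listed, time-embedded signal representations are simple to make concrete: each sample is replaced by a short window of its recent past, so a static classifier sees temporal structure. A minimal sketch with illustrative parameters, not the authors' implementation:

```python
def time_embed(signal, order, delay=1):
    """Time-delay embedding: replace sample x[t] with the vector
    [x[t], x[t - delay], ..., x[t - (order - 1) * delay]].
    Returns one feature vector per time index with a full history."""
    span = (order - 1) * delay
    return [
        [signal[t - k * delay] for k in range(order)]
        for t in range(span, len(signal))
    ]

# Six samples of a toy channel, embedded with a 3-sample window:
x = [0.0, 1.0, 2.0, 3.0, 4.0, 5.0]
features = time_embed(x, order=3)
print(features[0])  # [2.0, 1.0, 0.0]
```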

  20. Assessing Critical Thinking: A College's Journey and Lessons Learned

    ERIC Educational Resources Information Center

    Peach, Brian E.; Mukherjee, Arup; Hornyak, Martin

    2007-01-01

    The business college at University of West Florida is currently in the throes of implementing an assessment initiative to develop student learning outcomes, design assessment devices to measure learning, analyze the measurement results to identify learning shortfalls, and establish feedback mechanisms to modify the curriculum to address the…

  2. Optimal recovery sequencing for critical infrastructure resilience assessment.

    SciTech Connect

    Vugrin, Eric D.; Brown, Nathanael J. K.; Turnquist, Mark Alan

    2010-09-01

    Critical infrastructure resilience has become a national priority for the U.S. Department of Homeland Security. System resilience has been studied for several decades in many different disciplines, but no standards or unifying methods exist for critical infrastructure resilience analysis. This report documents the results of a late-start Laboratory Directed Research and Development (LDRD) project that investigated the identification of optimal recovery strategies that maximize resilience. Toward this goal, we formulate a bi-level optimization problem for infrastructure network models. In the 'inner' problem, we solve for network flows, and we use the 'outer' problem to identify the optimal recovery modes and sequences. We draw from the literature of multi-mode project scheduling problems to create an effective solution strategy for the resilience optimization model. We demonstrate the application of this approach to a set of network models, including a national railroad model and a supply chain for Army munitions production.
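
The outer/inner split can be caricatured in a few lines: brute-force the outer problem (the repair order) and reduce the inner problem to a trivial capacity sum, scoring each sequence by the area under the restored-capacity curve. The component names and the resilience metric are illustrative, not the report's formulation:

```python
from itertools import permutations

# Hypothetical damaged components: name -> (repair_time, capacity_restored).
components = {
    "bridge": (4.0, 10.0),
    "rail_yard": (2.0, 6.0),
    "depot": (1.0, 2.0),
}

def resilience(order):
    """Score a repair sequence by the area under the restored-capacity
    curve: capacity already back online, integrated over the time each
    remaining repair takes (repairs run one at a time)."""
    online, area = 0.0, 0.0
    for name in order:
        repair_time, capacity = components[name]
        area += online * repair_time  # capacity held during this repair
        online += capacity
    return area

best = max(permutations(components), key=resilience)
print(best)  # the order maximizing area under the recovery curve
```

With these numbers the winner repairs components in decreasing capacity-per-repair-hour, which matches the intuition behind greedy recovery heuristics; the report's bi-level model handles the far harder case where the inner problem is a real network-flow computation.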

  3. Using the critical incident survey to assess hospital service quality.

    PubMed

    Longo, B; Connor, G; Barnhart, T

    1993-01-01

    This survey was designed to determine "standards of excellence" in hospital services as defined by (a) former patients, (b) physicians, (c) hospital employees, and (d) corporate insurance subscribers. One hundred forty-seven (147) patients, 188 employees, and 20 corporate subscribers were interviewed by telephone, and 52 physicians were interviewed in their offices. The interview consisted of a single question: "Can you think of a time when, as a patient/employee/employer/physician, you had a particularly satisfying or dissatisfying experience with a local hospital?" Reported incidents were reviewed, and 239 "critical incidents" were identified. These incidents were classified into 12 descriptive categories relating to the underlying factors in the incident reports. Six focus groups were later held with participants segregated by the population pool they represented. These groups were asked to develop definitions of "excellence" in hospital service quality and standards for service which would "exceed expectations." The focus groups created 122 standards of excellence, which were classified into 43 categories. Overall, the largest percentages of corporate, physician, and employee critical incidents were classified as "Administrative Policy" issues. Patients most often reported "Nurturing" incidents as critical to their perceptions of hospital service quality.

  4. Prediction of State Mandated Assessment Mathematics Scores from Computer Based Mathematics and Reading Preview Assessments

    ERIC Educational Resources Information Center

    Costa-Guerra, Boris

    2012-01-01

    The study sought to understand whether MAPs computer-based assessments of math and language skills, together with MAPs reading scores, can predict student scores on the NMSBA. A key question was whether or not the prediction can be improved by including student language skill scores. The study explored the effectiveness of computer-based preview assessments…

  5. Evaluation of tablet computers for visual function assessment.

    PubMed

    Bodduluri, Lakshmi; Boon, Mei Ying; Dain, Stephen J

    2017-04-01

    Recent advances in technology and the increased use of tablet computers for mobile health applications such as vision testing necessitate an understanding of the behavior of the displays of such devices, to facilitate the reproduction of existing or the development of new vision assessment tests. The purpose of this study was to investigate the physical characteristics of one model of tablet computer (iPad mini Retina display) with regard to display consistency across a set of devices (15) and their potential application as clinical vision assessment tools. Once the tablet computer was switched on, it required about 13 min to reach luminance stability, while chromaticity remained constant. The luminance output of the device remained stable until a battery level of 5%. Luminance varied from center to peripheral locations of the display and with viewing angle, whereas the chromaticity did not vary. A minimal (1%) variation in luminance was observed due to temperature, and once again chromaticity remained constant. Also, these devices showed good temporal stability of luminance and chromaticity. All 15 tablet computers showed gamma functions approximating the standard gamma (2.20) and showed similar color gamut sizes, except for the blue primary, which displayed minimal variations. The physical characteristics across the 15 devices were similar and are known, thereby facilitating the use of this model of tablet computer as visual stimulus displays.
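
The gamma estimate reported for such displays can in principle be reproduced by a log-log least-squares fit of normalized luminance against normalized drive level. The sample data below are synthetic (an exact gamma-2.2 display), not the paper's measurements:

```python
import math

def fit_gamma(levels, luminances, max_level=255.0):
    """Estimate display gamma from L/L_max ~ (v/v_max)**gamma via least
    squares on log(L/L_max) = gamma * log(v/v_max); the last measurement
    is assumed to be taken at the peak drive level."""
    l_max = luminances[-1]
    xs, ys = [], []
    for v, lum in zip(levels, luminances):
        if v > 0 and lum > 0:  # log undefined at zero
            xs.append(math.log(v / max_level))
            ys.append(math.log(lum / l_max))
    # Zero-intercept least-squares slope.
    return sum(x * y for x, y in zip(xs, ys)) / sum(x * x for x in xs)

# Synthetic measurements from an exact gamma-2.2 display, 400 cd/m^2 peak:
levels = [32, 64, 96, 128, 160, 192, 224, 255]
lums = [(v / 255.0) ** 2.2 * 400.0 for v in levels]
print(round(fit_gamma(levels, lums), 2))  # 2.2
```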

  6. Incorporating Colour Information for Computer-Aided Diagnosis of Melanoma from Dermoscopy Images: A Retrospective Survey and Critical Analysis

    PubMed Central

    Drew, Mark S.

    2016-01-01

    Cutaneous melanoma is the most life-threatening form of skin cancer. Although advanced melanoma is often considered as incurable, if detected and excised early, the prognosis is promising. Today, clinicians use computer vision in an increasing number of applications to aid early detection of melanoma through dermatological image analysis (dermoscopy images, in particular). Colour assessment is essential for the clinical diagnosis of skin cancers. Due to this diagnostic importance, many studies have either focused on or employed colour features as a constituent part of their skin lesion analysis systems. These studies range from using low-level colour features, such as simple statistical measures of colours occurring in the lesion, to availing themselves of high-level semantic features such as the presence of blue-white veil, globules, or colour variegation in the lesion. This paper provides a retrospective survey and critical analysis of contributions in this research direction. PMID:28096807

  7. A critical review on sustainability assessment of recycled water schemes.

    PubMed

    Chen, Zhuo; Ngo, Huu Hao; Guo, Wenshan

    2012-06-01

    Recycled water provides a viable opportunity to supplement water supplies as well as alleviate environmental loads. To further expand current schemes and explore new recycled water end uses, this study reviews several environmental assessment tools, including Life Cycle Assessment (LCA), Material Flow Analysis (MFA) and Environmental Risk Assessment (ERA) in terms of their types, characteristics and weaknesses in evaluating the sustainability of recycled water schemes. Due to the limitations in individual models, the integrated approaches are recommended in most cases, of which the outputs could be further combined with additional economic and social assessments in multi-criteria decision making framework. The study also proposes several management strategies in improving the environmental scores. The discussion and suggestions could help decision makers in making a sound judgement as well as recognising the challenges and tasks in the future.

  8. Critical assessment of Reynolds stress turbulence models using homogeneous flows

    NASA Technical Reports Server (NTRS)

    Shabbir, Aamir; Shih, Tsan-Hsing

    1992-01-01

    In modeling the rapid part of the pressure correlation term in the Reynolds stress transport equations, extensive use has been made of its exact properties, which were first suggested by Rotta. These, for example, have been employed in obtaining the widely used Launder, Reece and Rodi (LRR) model. Some recent proposals have dropped one of these properties to obtain new models. We demonstrate, by computing some simple homogeneous flows, that doing so does not lead to any significant improvements over the LRR model and is not the right direction for improving the performance of existing models. The reason for this, in our opinion, is that violation of one of the exact properties cannot bring any new physics into the model. We compute thirteen homogeneous flows using the LRR (with a recalibrated rapid term constant), IP and SSG models. The flows computed include the flow through axisymmetric contraction; axisymmetric expansion; distortion by plane strain; and homogeneous shear flows with and without rotation. Results show that the most general representation for a model linear in the anisotropic tensor performs either better than or as well as the other two models of the same level.

  9. Critical assessment of Reynolds stress turbulence models using homogeneous flows

    NASA Astrophysics Data System (ADS)

    Shabbir, Aamir; Shih, Tsan-Hsing

    1992-12-01

    In modeling the rapid part of the pressure correlation term in the Reynolds stress transport equations, extensive use has been made of its exact properties, which were first suggested by Rotta. These, for example, have been employed in obtaining the widely used Launder, Reece and Rodi (LRR) model. Some recent proposals have dropped one of these properties to obtain new models. We demonstrate, by computing some simple homogeneous flows, that doing so does not lead to any significant improvements over the LRR model and is not the right direction for improving the performance of existing models. The reason for this, in our opinion, is that violation of one of the exact properties cannot bring any new physics into the model. We compute thirteen homogeneous flows using the LRR (with a recalibrated rapid term constant), IP and SSG models. The flows computed include the flow through axisymmetric contraction; axisymmetric expansion; distortion by plane strain; and homogeneous shear flows with and without rotation. Results show that the most general representation for a model linear in the anisotropic tensor performs either better than or as well as the other two models of the same level.
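
For reference, the "anisotropic tensor" in these abstracts is, in standard notation, b_ij = R_ij/(2k) - delta_ij/3 with turbulent kinetic energy k = R_ii/2. A minimal sketch (the example stresses are illustrative) that also checks the trace-free property:

```python
def anisotropy(R):
    """Anisotropy tensor b_ij = R_ij / (2k) - delta_ij / 3 for a 3x3
    Reynolds stress tensor R, with turbulent kinetic energy k = R_ii / 2."""
    k = 0.5 * (R[0][0] + R[1][1] + R[2][2])
    return [
        [R[i][j] / (2.0 * k) - (1.0 / 3.0 if i == j else 0.0) for j in range(3)]
        for i in range(3)
    ]

# Axisymmetric example: streamwise stress twice the transverse stresses.
R = [[2.0, 0.0, 0.0], [0.0, 1.0, 0.0], [0.0, 0.0, 1.0]]
b = anisotropy(R)
print(b[0][0])  # 1/6: the streamwise component carries excess energy
```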

  10. Assessment of Teaching Methods and Critical Thinking in a Course for Science Majors

    NASA Astrophysics Data System (ADS)

    Speck, Angela; Ruzhitskaya, L.; Whittington, A. G.

    2014-01-01

    The ability to think critically is a key ingredient of the scientific mindset. Students who take science courses may or may not be predisposed to critical thinking - the ability to evaluate information analytically. Regardless of their starting point, students can significantly improve their critical thinking through learning and practicing their reasoning skills, making critical assessments, conducting and reflecting on observations and experiments, building their questioning and communication skills, and through the use of other techniques. While there are several teaching methods that may help to improve critical thinking, there are only a few assessment instruments that can help in evaluating the efficacy of these methods. Critical thinking skills and improvement in those skills are notoriously difficult to measure. Assessments that are based on multiple-choice questions demonstrate students' final decisions but not their thinking processes. In addition, during the course of their studies students may develop subject-based critical thinking while not being able to extend the skills to general critical thinking. As such, we wanted to design and conduct a study on the efficacy of several teaching methods in which we would learn how students improve their thinking processes within a science discipline as well as in everyday life situations. We conducted a study among 20 astronomy, physics and geology majors - both graduate and undergraduate students - enrolled in our Solar System Science course (mostly seniors and early graduate students) at the University of Missouri. We used the Ennis-Weir Critical Thinking Essay test to assess students' general critical thinking and, in addition, we implemented our own subject-based critical thinking assessment. Here, we present the results of this study and share our experience in designing a subject-based critical thinking assessment instrument.

  11. FORTRAN 4 computer program for calculating critical speeds of rotating shafts

    NASA Technical Reports Server (NTRS)

    Trivisonno, R. J.

    1973-01-01

    A FORTRAN 4 computer program, written for the IBM DCS 7094/7044 computer, that calculates the critical speeds of rotating shafts is described. The shaft may include bearings, couplings, extra masses (nonshaft mass), and disks for the gyroscopic effect. Shear deflection is also taken into account, and provision is made in the program for sections of the shaft that are tapered. The boundary conditions at the ends of the shaft can be fixed (deflection and slope equal to zero) or free (shear and moment equal to zero). The fixed end condition enables the program to calculate the natural frequencies of cantilever beams. Instead of using the lumped-parameter method, the program uses continuous integration of the differential equations of beam flexure across different shaft sections. The advantages of this method over the usual lumped-parameter method are less data preparation and better approximation of the distribution of the mass of the shaft. A main feature of the program is the nature of the output. The Calcomp plotter is used to produce a drawing of the shaft with superimposed deflection curves at the critical speeds, together with all pertinent information related to the shaft.
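
    The fixed-end boundary condition mentioned above reduces, for a uniform untapered shaft, to the classical Euler-Bernoulli cantilever problem, which has a closed-form answer one could use as a sanity check against the program's continuous-integration results. A minimal sketch (the material and geometry values are hypothetical, and this is not the report's FORTRAN code):

```python
import math

def cantilever_frequencies(E, I, rho, A, L, n_modes=3):
    """First natural frequencies (Hz) of a uniform Euler-Bernoulli
    cantilever beam (fixed-free boundary conditions)."""
    # (beta*L) values solve cos(bL)*cosh(bL) = -1 for the fixed-free case
    beta_L = [1.8751, 4.6941, 7.8548][:n_modes]
    return [(bL / L) ** 2 * math.sqrt(E * I / (rho * A)) / (2.0 * math.pi)
            for bL in beta_L]

# Hypothetical 1 m steel shaft section, 25 mm diameter
d = 0.025
A = math.pi * d ** 2 / 4        # cross-sectional area, m^2
I = math.pi * d ** 4 / 64       # second moment of area, m^4
freqs = cantilever_frequencies(E=200e9, I=I, rho=7850.0, A=A, L=1.0)
```

    A lumped-parameter model would approach these closed-form values only as the number of stations grows, which is consistent with the abstract's point that continuous integration gives a better approximation of the shaft's mass distribution with less data preparation.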

  12. Assessment of Critical Mass Laboratory safeguards and security upgrades

    SciTech Connect

    Merrill, B.J.; DeMyer, J.J.

    1985-05-31

    Pacific Northwest Laboratory (PNL) conducted an evaluation of the safeguards and security systems at the Critical Mass Laboratory (CML) in February 1985, to identify appropriate upgrading actions necessary to ensure that effective and efficient systems consistent with DOE-RL policies, procedures, and site priorities are in place. Since that evaluation, there have been changes in Patrol contingency philosophy, response tactics, and distribution of manpower. Because of these changes, and at the request of DOE-RL, PNL has re-evaluated the safeguards and security systems in place at CML.

  13. A Technology Assessment of Personal Computers. Vol. III: Personal Computer Impacts and Policy Issues.

    ERIC Educational Resources Information Center

    Nilles, Jack M.; And Others

    A technology assessment of personal computers was conducted to study both the socially desirable and undesirable impacts of this new technology in three main areas: education, employment, and international trade. Information gleaned from this study was then used to generate suggestions for public policy options which could influence these impacts.…

  14. Computer Literacy and the Construct Validity of a High-Stakes Computer-Based Writing Assessment

    ERIC Educational Resources Information Center

    Jin, Yan; Yan, Ming

    2017-01-01

    One major threat to validity in high-stakes testing is construct-irrelevant variance. In this study we explored whether the transition from a paper-and-pencil to a computer-based test mode in a high-stakes test in China, the College English Test, has brought about variance irrelevant to the construct being assessed in this test. Analyses of the…

  15. Implications of Monte Carlo Statistical Errors in Criticality Safety Assessments

    SciTech Connect

    Pevey, Ronald E.

    2005-09-15

    Most criticality safety calculations are performed using Monte Carlo techniques because of Monte Carlo's ability to handle complex three-dimensional geometries. For Monte Carlo calculations, the more histories sampled, the lower the standard deviation of the resulting estimates. The common intuition is, therefore, that the more histories, the better; as a result, analysts tend to run Monte Carlo analyses as long as possible (or at least to a minimum acceptable uncertainty). For Monte Carlo criticality safety analyses, however, the optimization situation is complicated by the fact that procedures usually require that an extra margin of safety be added because of the statistical uncertainty of the Monte Carlo calculations. This additional safety margin affects the impact of the choice of the calculational standard deviation, both on production and on safety. This paper shows that, under the assumptions of normally distributed benchmarking calculational errors and exact compliance with the upper subcritical limit (USL), the standard deviation that optimizes production is zero, but there is a non-zero value of the calculational standard deviation that minimizes the risk of inadvertently labeling a supercritical configuration as subcritical. Furthermore, this value is shown to be a simple function of the typical benchmarking step outcomes--the bias, the standard deviation of the bias, the upper subcritical limit, and the number of standard deviations added to calculated k-effectives before comparison to the USL.
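
    The comparison step described at the end of the abstract - adding a number of calculational standard deviations to k-effective before testing against the USL - can be sketched as follows (a minimal illustration with hypothetical numbers; the USL is assumed to already incorporate the benchmarking bias and its uncertainty):

```python
def is_acceptably_subcritical(k_calc, sigma_calc, usl, n_sigma=2.0):
    """Typical Monte Carlo criticality acceptance test: the calculated
    k-effective plus a statistical margin must fall below the upper
    subcritical limit (USL)."""
    return k_calc + n_sigma * sigma_calc <= usl

# A configuration with k = 0.92 +/- 0.002 against a USL of 0.95
ok = is_acceptably_subcritical(0.92, 0.002, 0.95)      # 0.924 <= 0.95
bad = is_acceptably_subcritical(0.948, 0.002, 0.95)    # 0.952 > 0.95
```

    The trade-off analyzed in the paper lives in `sigma_calc`: running more histories shrinks it (helping production by shrinking the added margin), but the risk of mislabeling a supercritical configuration is minimized at a non-zero value.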

  16. Oral Histories as Critical Qualitative Inquiry in Community Health Assessment.

    PubMed

    Hernandez, Sarah Gabriella; Genkova, Ana; Castañeda, Yvette; Alexander, Simone; Hebert-Beirne, Jennifer

    2017-10-01

    Qualitative methods such as focus groups and interviews are common methodologies employed in participatory approaches to community health assessment to develop effective community health improvement plans. Oral histories are a rarely used form of qualitative inquiry that can enhance community health assessment in multiple ways. Oral histories center residents' lived experiences, which often reveal more complex social and health phenomena than conventional qualitative inquiry. This article examines an oral history research component of the Little Village Community Health Assessment, a collaborative research effort to promote health equity in an urban, Mexican ethnic enclave. We collected 32 oral histories from residents to provide deeper, more grounded insight on community needs and assets. We initially used thematic data analysis. After analytic peer debriefings with the analysis team, we found the process inadvertently reductionist and instead opted for community listening events for participatory data analysis, knowledge translation, and dissemination of findings. Oral histories were most meaningful in their original audio form, adding to a holistic understanding of health by giving voice to complex problems while also naming and describing concepts that were culturally unique. Moreover, the oral histories collectively articulated a counternarrative that celebrated community cultural wealth and opposed the mainstream narrative of the community as deprived. We argue for the recognition and practice of oral histories as a more routine form of qualitative inquiry in community health assessment. In the pursuit of health equity and collaboratively working toward social justice, oral histories can push the boundaries of community health assessment research and practice.

  17. Validation of a scenario-based assessment of critical thinking using an externally validated tool.

    PubMed

    Buur, Jennifer L; Schmidt, Peggy; Smylie, Dean; Irizarry, Kris; Crocker, Carlos; Tyler, John; Barr, Margaret

    2012-01-01

    With medical education transitioning from knowledge-based curricula to competency-based curricula, critical thinking skills have emerged as a major competency. While there are validated external instruments for assessing critical thinking, many educators have created their own custom assessments of critical thinking. However, the face validity of these assessments has not been challenged. The purpose of this study was to compare results from a custom assessment of critical thinking with the results from a validated external instrument of critical thinking. Students from the College of Veterinary Medicine at Western University of Health Sciences were administered a custom assessment of critical thinking (ACT) examination and the externally validated instrument, California Critical Thinking Skills Test (CCTST), in the spring of 2011. Total scores and sub-scores from each exam were analyzed for significant correlations using Pearson correlation coefficients. Significant correlations between ACT Blooms 2 and deductive reasoning and total ACT score and deductive reasoning were demonstrated with correlation coefficients of 0.24 and 0.22, respectively. No other statistically significant correlations were found. The lack of significant correlation between the two examinations illustrates the need in medical education to externally validate internal custom assessments. Ultimately, the development and validation of custom assessments of non-knowledge-based competencies will produce higher quality medical professionals.

  18. Teaching Critical Thinking Skills with CAI: A Design by Two Researchers Shows Computers Can Make a Difference.

    ERIC Educational Resources Information Center

    Bass, George M., Jr.; Perkins, Harvey W.

    1984-01-01

    Describes a project which involved designing a nine-week course utilizing computer assisted instruction (CAI) to teach seventh graders critical thinking skills. Results indicate measurable gains were made in the critical thinking skills of verbal analogy and inductive/deductive reasoning, although no consistent gains were made in logical reasoning…

  20. A Critical Technical Review of Six Hazard Assessment Models

    DTIC Science & Technology

    1975-12-01

    (OCR fragment from the report's errata on the AMSHAH model documentation: the image sources are not included in equation (4.2) of AMSHAH, contrary to a statement in the documentation; brackets have been omitted in the text; the eighth line from the top of page 70 contains equation (4.6), which is not documented in the text of AMSHAH and is not used in the computer coding for the transmissivity; the fragment breaks off mid-sentence on the procedure for calculations of the water …)

  1. CART V: recent advancements in computer-aided camouflage assessment

    NASA Astrophysics Data System (ADS)

    Müller, Thomas; Müller, Markus

    2011-05-01

    In order to facilitate systematic, computer-aided improvements of camouflage and concealment assessment methods, the software system CART (Camouflage Assessment in Real-Time) was developed for the camouflage assessment of objects in multispectral image sequences (see contributions to SPIE 2007-2010 [1], [2], [3], [4]). It comprises a semi-automatic marking of target objects (ground truth generation), including their propagation over the image sequence, and evaluation via user-defined feature extractors, as well as methods to assess the object's movement conspicuity. In this fifth part of an annual series at the SPIE conference in Orlando, this paper presents the enhancements of the past year and addresses the camouflage assessment of static and moving objects in multispectral image data that can show noise or image artefacts. The presented methods explore the correlations between image processing and camouflage assessment. A novel algorithm is presented, based on template matching, to assess the structural inconspicuity of an object objectively and quantitatively. The results can easily be combined with an MTI (moving target indication) based movement conspicuity assessment function in order to explore the influence of object movement on the camouflage effect in different environments. As the results show, the presented methods provide a significant benefit in the field of camouflage assessment.
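
    The template-matching idea behind the structural-conspicuity measure can be illustrated with plain zero-mean normalized cross-correlation; this is a generic NumPy sketch, not the CART implementation, and the array names are hypothetical:

```python
import numpy as np

def normalized_cross_correlation(image, template):
    """Zero-mean normalized cross-correlation of a template against every
    valid offset in a single-band image; values near 1 mean the local
    image structure closely matches the template."""
    th, tw = template.shape
    t = template - template.mean()
    t_norm = np.sqrt((t ** 2).sum())
    out = np.zeros((image.shape[0] - th + 1, image.shape[1] - tw + 1))
    for y in range(out.shape[0]):
        for x in range(out.shape[1]):
            patch = image[y:y + th, x:x + tw]
            p = patch - patch.mean()
            denom = np.sqrt((p ** 2).sum()) * t_norm
            out[y, x] = (p * t).sum() / denom if denom > 0 else 0.0
    return out

# A template cut from the image correlates perfectly at its origin
rng = np.random.default_rng(0)
img = rng.random((20, 20))
tpl = img[5:10, 7:12].copy()
ncc = normalized_cross_correlation(img, tpl)
```

    A strong correlation peak at the object location would indicate that the object's structure stands out against the surrounding background patches, which is the intuition behind an objective, quantitative inconspicuity score.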

  2. A critical review of seven selected neighborhood sustainability assessment tools

    SciTech Connect

    Sharifi, Ayyoob Murayama, Akito

    2013-01-15

    Neighborhood sustainability assessment tools have become widespread since the turn of the 21st century and many communities, mainly in the developed world, are utilizing these tools to measure their success in approaching sustainable development goals. In this study, seven tools from Australia, Europe, Japan, and the United States are selected and analyzed with the aim of providing insights into the current situation; highlighting the strengths, weaknesses, successes, and failures; and making recommendations for future improvements. Using a content analysis, the issues of sustainability coverage, pre-requisites, local adaptability, scoring and weighting, participation, reporting, and applicability are discussed in this paper. The results of this study indicate that most of the tools are not doing well regarding the coverage of social, economic, and institutional aspects of sustainability; there are ambiguities and shortcomings in the weighting, scoring, and rating; in most cases, there is no mechanism for local adaptability and participation; and only those tools which are embedded within the broader planning framework are doing well with regard to applicability. Highlights: Seven widely used assessment tools were analyzed. There is a lack of balanced assessment of sustainability dimensions. Tools are not doing well regarding applicability. Refinements are needed to make the tools more effective. Assessment tools must be integrated into the planning process.

  3. Critical Factors in Assessment of Students with Visual Impairments.

    ERIC Educational Resources Information Center

    Loftin, Marnee

    1997-01-01

    Discusses the assessment component of individualized education programs for students with visual impairments. Important issues reviewed are the appropriate selection of a battery of tests; the knowledge base about a particular vision condition; the importance of supplementing testing information with meaningful observations and interviews; and…

  4. The Critical Role of Anchor Paper Selection in Writing Assessment

    ERIC Educational Resources Information Center

    Osborn Popp, Sharon E.; Ryan, Joseph M.; Thompson, Marilyn S.

    2009-01-01

    Scoring rubrics are routinely used to evaluate the quality of writing samples produced for writing performance assessments, with anchor papers chosen to represent score points defined in the rubric. Although the careful selection of anchor papers is associated with best practices for scoring, little research has been conducted on the role of…

  5. Critical Inquiry and Writing Centers: A Methodology of Assessment

    ERIC Educational Resources Information Center

    Bell, Diana Calhoun; Frost, Alanna

    2012-01-01

    By examining one writing center's role in student success, this project offers two examples of the way writing centers impact student engagement. This analysis models a methodology that writing and learning center directors can utilize in order to foster effective communication with stakeholders. By conducting data-driven assessment, directors can…

  6. Classroom Dynamic Assessment: A Critical Examination of Constructs and Practices

    ERIC Educational Resources Information Center

    Davin, Kristin J.

    2016-01-01

    This article explores the implementation of dynamic assessment (DA) in an elementary school foreign language classroom by considering its theoretical basis and its applicability to second language (L2) teaching, learning, and development. In existing applications of L2 classroom DA, errors serve as a window into learners' instructional needs and…

  8. Theories of Occupational Choice: A Critical Assessment of Selected Viewpoints.

    ERIC Educational Resources Information Center

    Hotchkiss, Lawrence; And Others

    Five theoretical perspectives related to occupational choice were assessed. These were (1) Super's career development perspective, (2) Holland's typology of occupational choice, (3) status-attainment research in the field of sociology, (4) economic theory of individual willingness to work in different occupations, and (5) a model of decision…

  9. Critical Issues in Assessing Teacher Compensation. Backgrounder. No. 2638

    ERIC Educational Resources Information Center

    Richwine, Jason; Biggs, Andrew G.

    2012-01-01

    A November 2011 Heritage Foundation report--"Assessing the Compensation of Public-School Teachers"--presented data on teacher salaries and benefits in order to inform debates about teacher compensation reform. The report concluded that public-school teacher compensation is far ahead of what comparable private-sector workers enjoy, and that…

  10. Assessment of Critical Business Skill Development by MBA Alumni

    ERIC Educational Resources Information Center

    Glynn, Joseph G.; Wood, Gregory R.

    2008-01-01

    Six years of survey data were analyzed to assess, among other things, the degree to which an AACSB accredited graduate business program successfully developed student skills in a variety of areas deemed important for career success. The study illustrates a methodology institutions can use to respond to increasing demands for program evaluation and…

  12. A critical assessment of the Arrhenius oven-aging methodology

    SciTech Connect

    Gillen, K.T.; Clough, R.L.

    1993-04-01

    The Arrhenius approach assumes a linear relation between the log of the time to a given material property change and inverse absolute temperature. For elastomers, ultimate tensile elongation results are often used to confirm Arrhenius behavior, even though the ultimate tensile strength is non-Arrhenius. This paper critically examines the Arrhenius approach. Elongation versus air-oven aging temperature for a nitrile rubber gives an activation energy E_a of 22 kcal/mol; however, this does not hold for the tensile strength, indicating non-Arrhenius degradation. Modulus profiling shows heterogeneity at the earliest times at 125 C, caused by diffusion-limited oxidation (DLO). Tensile strength depends on the force at break integrated over the cross section, and nitrile rubbers aged at different temperatures experience different degrees of degradation in the interior. Modulus at the surface, however, is not affected by DLO anomalies. Severe mechanical degradation will occur when the edge modulus increases by an order of magnitude. 7 figs, 3 refs.
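
    The assumption being examined - log of time to equivalent damage linear in 1/T - amounts to t = A exp(E_a / RT), so a result at one oven temperature shifts to another by a simple exponential ratio. A sketch using the elongation-based E_a of 22 kcal/mol quoted above (the aging times and temperatures are hypothetical):

```python
import math

R = 1.987e-3  # gas constant in kcal/(mol*K)

def arrhenius_shift(t_ref, T_ref_C, T_new_C, Ea=22.0):
    """Shift a time-to-equivalent-damage from one oven-aging temperature
    to another under the Arrhenius assumption t = A*exp(Ea/(R*T)),
    i.e. log(t) linear in 1/T.  Ea in kcal/mol."""
    T_ref = T_ref_C + 273.15
    T_new = T_new_C + 273.15
    return t_ref * math.exp((Ea / R) * (1.0 / T_new - 1.0 / T_ref))

# Aging time at 125 C equivalent to 1000 h of oven aging at 150 C
t_125 = arrhenius_shift(1000.0, 150.0, 125.0)
```

    The paper's point is precisely that such extrapolations can mislead when DLO makes the degradation heterogeneous, so the shifted time is only as trustworthy as the Arrhenius assumption itself.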

  13. Embodied cognition and mirror neurons: a critical assessment.

    PubMed

    Caramazza, Alfonso; Anzellotti, Stefano; Strnad, Lukas; Lingnau, Angelika

    2014-01-01

    According to embodied cognition theories, higher cognitive abilities depend on the reenactment of sensory and motor representations. In the first part of this review, we critically analyze the central claims of embodied theories and argue that the existing behavioral and neuroimaging data do not allow investigators to discriminate between embodied cognition and classical cognitive accounts, which assume that conceptual representations are amodal and symbolic. In the second part, we review the main claims and the core electrophysiological findings typically cited in support of the mirror neuron theory of action understanding, one of the most influential examples of embodied cognition theories. In the final part, we analyze the claim that mirror neurons subserve action understanding by mapping visual representations of observed actions onto motor representations, trying to clarify in what sense the representations carried by these neurons can be claimed to be motor.

  14. Computer Usage and the Validity of Self-Assessed Computer Competence among First-Year Business Students

    ERIC Educational Resources Information Center

    Ballantine, Joan A.; McCourt Larres, Patricia; Oyelere, Peter

    2007-01-01

    This study evaluates the reliability of self-assessment as a measure of computer competence. This evaluation is carried out in response to recent research which has employed self-reported ratings as the sole indicator of students' computer competence. To evaluate the reliability of self-assessed computer competence, the scores achieved by students…

  15. Assessment methodology for computer-based instructional simulations.

    PubMed

    Koenig, Alan; Iseli, Markus; Wainess, Richard; Lee, John J

    2013-10-01

    Computer-based instructional simulations are becoming more and more ubiquitous, particularly in military and medical domains. As the technology that drives these simulations grows ever more sophisticated, the underlying pedagogical models for how instruction, assessment, and feedback are implemented within these systems must evolve accordingly. In this article, we review some of the existing educational approaches to medical simulations, and present pedagogical methodologies that have been used in the design and development of games and simulations at the University of California, Los Angeles, Center for Research on Evaluation, Standards, and Student Testing. In particular, we present a methodology for how automated assessments of computer-based simulations can be implemented using ontologies and Bayesian networks, and discuss their advantages and design considerations for pedagogical use.
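
    The Bayesian-network idea can be reduced to its smallest case: one skill node observed through one in-simulation action. A hedged sketch with hypothetical conditional probabilities (the ontologies and networks described above are of course far larger):

```python
def posterior_mastery(prior, p_correct_given_mastery,
                      p_correct_given_no_mastery, observed_correct):
    """One-step Bayesian update of the probability that a learner has
    mastered a skill, given a single observed action (all probabilities
    hypothetical)."""
    if observed_correct:
        num = prior * p_correct_given_mastery
        den = num + (1.0 - prior) * p_correct_given_no_mastery
    else:
        num = prior * (1.0 - p_correct_given_mastery)
        den = num + (1.0 - prior) * (1.0 - p_correct_given_no_mastery)
    return num / den

# A correct action raises belief in mastery: 0.45 / (0.45 + 0.15) = 0.75
p = posterior_mastery(0.5, 0.9, 0.3, True)
```

    Chaining such updates over a sequence of observed actions is what lets a simulation maintain a running, probabilistic estimate of each competency for assessment and feedback.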

  16. Computer-based assessment of movement difficulties in Parkinson's disease.

    PubMed

    Cunningham, Laura M; Nugent, Chris D; Moore, George; Finlay, Dewar D; Craig, David

    2012-01-01

    The prevalence of Parkinson's disease (PD) is increasing due to an ageing population. It is an unpredictable disease which requires regular assessment and monitoring. Current techniques used to assess PD are subjective. Clinicians observe movements made by a patient and subsequently rate the level of severity of, for example tremor or slowness of movement. Within this work, we have developed and evaluated a prototype computer-based assessment tool capable of collecting information on the movement difficulties present in PD. Twenty participants took part in an assessment of the tool, 10 of whom were diagnosed with PD and 10 were without the disease. Following the usage of the tool, it was found that there was a significant difference (p = 0.038) in the speed of movement between the two groups. We envisage that this tool could have the potential to enable more objective clinical conclusions to be made.

  17. Transfer matrix computation of critical polynomials for two-dimensional Potts models

    SciTech Connect

    Jacobsen, Jesper Lykke; Scullard, Christian R.

    2013-02-04

    We showed, in our previous work, that critical manifolds of the q-state Potts model can be studied by means of a graph polynomial P_B(q, v), henceforth referred to as the critical polynomial. This polynomial may be defined on any periodic two-dimensional lattice. It depends on a finite subgraph B, called the basis, and the manner in which B is tiled to construct the lattice. The real roots v = e^K - 1 of P_B(q, v) either give the exact critical points for the lattice, or provide approximations that, in principle, can be made arbitrarily accurate by increasing the size of B in an appropriate way. In earlier work, P_B(q, v) was defined by a contraction-deletion identity, similar to that satisfied by the Tutte polynomial. Here, we give a probabilistic definition of P_B(q, v), which facilitates its computation, using the transfer matrix, on much larger B than was previously possible. We present results for the critical polynomial on the (4, 8^2), kagome, and (3, 12^2) lattices for bases of up to respectively 96, 162, and 243 edges, compared to the limit of 36 edges with contraction-deletion. We discuss in detail the role of the symmetries and the embedding of B. The critical temperatures v_c obtained for ferromagnetic (v > 0) Potts models are at least as precise as the best available results from Monte Carlo simulations or series expansions. For instance, with q = 3 we obtain v_c(4, 8^2) = 3.742 489(4), v_c(kagome) = 1.876 459 7(2), and v_c(3, 12^2) = 5.033 078 49(4), the precision being comparable or superior to the best simulation results. More generally, we trace the critical manifolds in the real (q, v) plane and discuss the intricate structure of the phase diagram in the antiferromagnetic (v < 0) region.
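
    Since the abstract defines the temperature variable through v = e^K - 1, the quoted critical roots v_c convert directly to critical couplings K_c = ln(1 + v_c). A small sketch of that conversion, using only values taken from the abstract:

```python
import math

def coupling_from_v(v):
    """Invert v = exp(K) - 1 to recover the critical coupling K_c
    from a critical-manifold root v_c."""
    return math.log(1.0 + v)

# Ferromagnetic q = 3 critical points quoted in the abstract
vc = {"(4,8^2)": 3.742489, "kagome": 1.8764597, "(3,12^2)": 5.03307849}
Kc = {lattice: coupling_from_v(v) for lattice, v in vc.items()}
```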

  18. Need Assessment of Computer Science and Engineering Graduates

    NASA Astrophysics Data System (ADS)

    Surakka, Sami; Malmi, Lauri

    2005-06-01

    This case study considered the syllabus of the first- and second-year studies in computer science. The aim of the study was to reveal which topics covered in the syllabi were really needed during the following years of study or in working life. The program assessed in the study was a Master's program in computer science and engineering at a university of technology in Finland. The necessity of different subjects for the advanced studies (years 3-5) and for working life was assessed using four content analyses: (a) the course catalog of the institution where this study was carried out, (b) employment reports that were attached to the applications for internship credits, (c) master's theses, and (d) job advertisements in a newspaper. The results of the study imply that the necessity of physics for the advanced studies and work was very low compared to the extent to which it was studied. On the other hand, the necessity for mathematics was moderate, and it had remained quite steady during the period 1989-2002. The most necessary computer science topic was programming. Telecommunications and networking were also needed often, whereas theoretical computer science was needed quite rarely.

  19. Assessing the Amazon Cloud Suitability for CLARREO's Computational Needs

    NASA Technical Reports Server (NTRS)

    Goldin, Daniel; Vakhnin, Andrei A.; Currey, Jon C.

    2015-01-01

    In this document we compare the performance of Amazon Web Services (AWS), also known as Amazon Cloud, with the CLARREO (Climate Absolute Radiance and Refractivity Observatory) cluster and assess its suitability for the computational needs of the CLARREO mission. A benchmark executable to process one month and one year of PARASOL (Polarization and Anisotropy of Reflectances for Atmospheric Sciences coupled with Observations from a Lidar) data was used. With the optimal AWS configuration, adequate data-processing times, comparable to the CLARREO cluster, were found. The assessment of alternatives to the CLARREO cluster continues and several options, such as a NASA-based cluster, are being considered.

  20. Assessment of nonequilibrium radiation computation methods for hypersonic flows

    NASA Technical Reports Server (NTRS)

    Sharma, Surendra

    1993-01-01

    The present understanding of shock-layer radiation in the low density regime, as appropriate to hypersonic vehicles, is surveyed. Based on the relative importance of electron excitation and radiation transport, the hypersonic flows are divided into three groups: weakly ionized, moderately ionized, and highly ionized flows. In the light of this division, the existing laboratory and flight data are scrutinized. Finally, an assessment of the nonequilibrium radiation computation methods for the three regimes in hypersonic flows is presented. The assessment is conducted by comparing experimental data against the values predicted by the physical model.

  1. A theoretical method for assessing disruptive computer viruses

    NASA Astrophysics Data System (ADS)

    Wu, Yingbo; Li, Pengdeng; Yang, Lu-Xing; Yang, Xiaofan; Tang, Yuan Yan

    2017-09-01

    To assess the prevalence of disruptive computer viruses in situations where every node in a network has its own virus-related attributes, a heterogeneous epidemic model is proposed. A criterion for the global stability of the virus-free equilibrium and a criterion for the existence of a unique viral equilibrium are given. Furthermore, extensive simulation experiments are conducted, and some interesting phenomena emerge from the experimental results. On this basis, some policies for suppressing disruptive viruses are recommended.
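
    The flavor of such node-level models can be conveyed with a generic heterogeneous SIS mean-field sketch (an illustrative stand-in, not the paper's exact model): each node i has its own infection rate beta_i and cure rate delta_i, and when cure dominates, the system settles into the virus-free equilibrium.

```python
def simulate_sis(adj, beta, delta, p0, steps=2000, dt=0.01):
    """Euler integration of node-level SIS mean-field dynamics:
    dp_i/dt = (1 - p_i) * beta_i * sum_j adj[i][j] * p_j - delta_i * p_i,
    where p_i is the probability that node i is infected."""
    p = list(p0)
    n = len(p)
    for _ in range(steps):
        new_p = []
        for i in range(n):
            pressure = sum(adj[i][j] * p[j] for j in range(n))
            dp = (1.0 - p[i]) * beta[i] * pressure - delta[i] * p[i]
            new_p.append(min(1.0, max(0.0, p[i] + dt * dp)))
        p = new_p
    return p

# On a triangle graph, cure rates well above infection rates drive the
# infection probabilities toward the virus-free equilibrium
adj = [[0, 1, 1], [1, 0, 1], [1, 1, 0]]
p_final = simulate_sis(adj, beta=[0.05] * 3, delta=[1.0] * 3, p0=[0.5] * 3)
```

    Raising the beta values (or lowering the deltas) past a threshold would instead push the trajectories toward a nonzero viral equilibrium, which is the dichotomy the stability criteria in the paper characterize.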

  2. Cogeneration computer model assessment: Advanced cogeneration research study

    NASA Technical Reports Server (NTRS)

    Rosenberg, L.

    1983-01-01

    Cogeneration computer simulation models were assessed to recommend the most desirable models or their components for use by the Southern California Edison Company (SCE) in evaluating potential cogeneration projects. Existing cogeneration modeling capabilities are described, preferred models are identified, and an approach to the development of a code which will best satisfy SCE requirements is recommended. Five models (CELCAP, COGEN 2, CPA, DEUS, and OASIS) are recommended for further consideration.

  3. Secure Multiparty Computation for Cooperative Cyber Risk Assessment

    DTIC Science & Technology

    2016-11-01

    Secure Multiparty Computation for Cooperative Cyber Risk Assessment Kyle Hogan, Noah Luther, Nabil Schear, Emily Shen, Sophia Yakoubov, Arkady...Malacaria. How to spend it: Optimal investment for cyber security. In Proceedings of the 1st International Workshop on Agents and CyberSecurity...common problem organizations face is determining which security updates to perform and patches to apply to minimize the risk of potential vulnerabilities

  4. Ecological risk assessment of acidification in the Northern Eurasia using critical load concept

    SciTech Connect

    Bashkin, V.; Golinets, O.

    1995-12-31

    This research presents the risk analysis of acid-forming compound inputs using critical load (CL) values of sulfur, nitrogen, and acidity, computed for terrestrial and freshwater ecosystems of Northern Eurasia. The CL values are used to set goals for future deposition rates of acidifying and eutrophying compounds so that the environment is protected. CL values for various ecosystems are determined using an EM GIS approach. The most influential sources, such as nitrogen, sulfur, and base cation uptake by vegetation, and surface and groundwater leaching from terrestrial to freshwater ecosystems, are described for the whole territory under study with regard to uncertainty analysis and the level of the corresponding risk assessment. This may be explained by many factors, of which the most important are: the estimation of plant uptake is carried out on the basis of data on the biogeochemical cycling of various elements, for which adequate quantitative characterization for all ecosystems under study is either absent or insufficient; reliable information on the quantitative assessment of the ratio between perennial plant biomass increase and dead matter is absent at the required level of spatial and temporal resolution; reliable data on surface and underground runoff in various ecosystems are rare; and the influence of hydrothermic factors on the above-mentioned processes has not been quantitatively determined at the required level of model resolution.

  5. Solutions for data integration in functional genomics: a critical assessment and case study.

    PubMed

    Smedley, Damian; Swertz, Morris A; Wolstencroft, Katy; Proctor, Glenn; Zouberakis, Michael; Bard, Jonathan; Hancock, John M; Schofield, Paul

    2008-11-01

    The torrent of data emerging from the application of new technologies to functional genomics and systems biology can no longer be contained within the traditional modes of data sharing and publication, with the consequence that data is being deposited in, distributed across and disseminated through an increasing number of databases. The resulting fragmentation poses serious problems for the model organism community, which increasingly relies on data mining and computational approaches that require gathering of data from a range of sources. In light of these problems, the European Commission has funded a coordination action, CASIMIR (coordination and sustainability of international mouse informatics resources), with a remit to assess the technical and social aspects of database interoperability that currently prevent the full realization of the potential of data integration in mouse functional genomics. In this article, we assess the current problems with interoperability, with particular reference to mouse functional genomics, and critically review the technologies that can be deployed to overcome them. We describe a typical use-case where an investigator wishes to gather data on variation, genomic context and metabolic pathway involvement for genes discovered in a genome-wide screen. We go on to develop an automated approach involving an in silico experimental workflow tool, Taverna, using web services, BioMart and MOLGENIS technologies for data retrieval. Finally, we focus on the current impediments to adopting such an approach in a wider context, and strategies to overcome them.

  6. What English Counts as Writing Assessment? An Australian Move to Mainstream Critical Literacy.

    ERIC Educational Resources Information Center

    Wyatt-Smith, Claire M.; Murphy, Judy

    2001-01-01

    Identifies and examines the range of approaches to writing assessment that are influential in Australian classrooms. Highlights the emergence of critical literacy as an assessment project in the state of Queensland, discussing what is involved when writing assessment moves away from personal voice and individual growth concerns to a socially…

  7. An Overview of a Programme of Research to Support the Assessment of Critical Thinking

    ERIC Educational Resources Information Center

    Black, Beth

    2012-01-01

    Cambridge Assessment has more than 20 years' experience in assessing Critical Thinking (CT) in a number of diverse tests and qualifications, unrivalled by any other body within the UK. In recent years, a number of research activities have been carried out in order to support these assessments, with a focus on the validity of measurement. This paper…

  8. A critical assessment of topologically associating domain prediction tools

    PubMed Central

    Dali, Rola

    2017-01-01

    Abstract Topologically associating domains (TADs) have been proposed to be the basic unit of chromosome folding and have been shown to play key roles in genome organization and gene regulation. Several different tools are available for TAD prediction, but their properties have never been thoroughly assessed. In this manuscript, we compare the output of seven different TAD prediction tools on two published Hi-C data sets. TAD predictions varied greatly between tools in number, size distribution and other biological properties. Assessed against a manual annotation of TADs, individual TAD boundary predictions were found to be quite reliable, but their assembly into complete TAD structures was much less so. In addition, many tools were sensitive to sequencing depth and resolution of the interaction frequency matrix. This manuscript provides users and designers of TAD prediction tools with information that will help guide the choice of tools and the interpretation of their predictions. PMID:28334773
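
    To illustrate one way boundary predictions from two tools can be compared (the paper's exact matching criterion may differ), here is a minimal sketch that scores agreement between two hypothetical sets of boundary calls within a bin tolerance:

```python
def boundary_agreement(pred_a, pred_b, tol=1):
    """Fraction of boundaries in pred_a matched by a boundary in pred_b
    within +/- tol bins (a simple tolerance-window comparison)."""
    matched = sum(any(abs(a - b) <= tol for b in pred_b) for a in pred_a)
    return matched / len(pred_a)

# Hypothetical boundary calls (Hi-C bin indices) from two tools:
tool_a = [10, 25, 40, 61]
tool_b = [9, 26, 44, 60]
print(boundary_agreement(tool_a, tool_b, tol=1))  # 0.75
```

    Note that the score is asymmetric (it is computed relative to pred_a), and that widening tol trades precision for recall, which is one reason reported agreement between TAD callers varies so much.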

  9. Critical Technology Assessment: Fine Grain, High Density Graphite

    DTIC Science & Technology

    2010-04-01

    Control Classification Number (ECCN) 1C107.a on the Commerce Control List (CCL). The parameters of 1C107.a stem from controls established by the Missile...Technology Control Regime (MTCR). In this assessment, BIS specifically examined: • The application of ECCN 1C107.a and related licensing...export licensing process for fine grain, high density graphite controlled by ECCN 1C107.a, especially to China, requires more license conditions and

  10. Critical Technology Assessment of Five Axis Simultaneous Control Machine Tools

    DTIC Science & Technology

    2009-07-01

    assessment, BIS specifically examined: • The application of Export Control Classification Numbers (ECCN) 2B001.b.2 and 2B001.c.2 controls and related...availability of certain five axis simultaneous control mills, mill/turns, and machining centers controlled by ECCN 2B001.b.2 (but not grinders controlled by...ECCN 2B001.c.2) exists to China and Taiwan, which both have an indigenous capability to produce five axis simultaneous control machine tools with

  11. Criticality Model

    SciTech Connect

    A. Alsaed

    2004-09-14

    The criticality computational method will be used for evaluating the criticality potential of configurations of fissionable materials (in-package and external to the waste package) within the repository at Yucca Mountain, Nevada, for all waste packages/waste forms. The criticality computational method is also applicable to preclosure configurations. The criticality computational method is a component of the methodology presented in ''Disposal Criticality Analysis Methodology Topical Report'' (YMP 2003). How the criticality computational method fits into the overall disposal criticality analysis methodology is illustrated in Figure 1 (YMP 2003, Figure 3). This calculation will not provide direct input to the total system performance assessment for license application. It is to be used as necessary to determine the criticality potential of configuration classes as determined by the configuration probability analysis of the configuration generator model (BSC 2003a).

  12. Prediction of Critical Heat Flux in Water-Cooled Plasma Facing Components Using Computational Fluid Dynamics

    SciTech Connect

    Youchison, Dennis

    2011-01-01

    Several commercial computational fluid dynamics (CFD) codes now have the capability to analyze Eulerian two-phase flow using the Rohsenow nucleate boiling model. Analysis of boiling due to one-sided heating in plasma facing components (pfcs) is now receiving attention during the design of water-cooled first wall panels for ITER that may encounter heat fluxes as high as 5 MW/m2. Empirical thermalhydraulic design correlations developed for long fission reactor channels are not reliable when applied to pfcs because fully developed flow conditions seldom exist. Star-CCM+ is one of the commercial CFD codes that can model two-phase flows. Like others, it implements the RPI model for nucleate boiling, but it also seamlessly transitions to a volume-of-fluid model for film boiling. By benchmarking the results of our 3d models against recent experiments on critical heat flux for both smooth rectangular channels and hypervapotrons, we determined the six unique input parameters that accurately characterize the boiling physics for ITER flow conditions under a wide range of absorbed heat flux. We can now exploit this capability to predict the onset of critical heat flux in these components. In addition, the results clearly illustrate the production and transport of vapor and its effect on heat transfer in pfcs from nucleate boiling through transition to film boiling. This article describes the boiling physics implemented in CCM+ and compares the computational results to the benchmark experiments carried out independently in the United States and Russia. Temperature distributions agreed to within 10 °C for a wide range of heat fluxes from 3 MW/m2 to 10 MW/m2 and flow velocities from 1 m/s to 10 m/s in these devices. Although the analysis is incapable of capturing the stochastic nature of critical heat flux (i.e., time and location may depend on a local materials defect or turbulence phenomenon), it is highly reliable in determining the heat flux where

  13. Prediction of critical heat flux in water-cooled plasma facing components using computational fluid dynamics.

    SciTech Connect

    Bullock, James H.; Youchison, Dennis Lee; Ulrickson, Michael Andrew

    2010-11-01

    Several commercial computational fluid dynamics (CFD) codes now have the capability to analyze Eulerian two-phase flow using the Rohsenow nucleate boiling model. Analysis of boiling due to one-sided heating in plasma facing components (pfcs) is now receiving attention during the design of water-cooled first wall panels for ITER that may encounter heat fluxes as high as 5 MW/m2. Empirical thermalhydraulic design correlations developed for long fission reactor channels are not reliable when applied to pfcs because fully developed flow conditions seldom exist. Star-CCM+ is one of the commercial CFD codes that can model two-phase flows. Like others, it implements the RPI model for nucleate boiling, but it also seamlessly transitions to a volume-of-fluid model for film boiling. By benchmarking the results of our 3d models against recent experiments on critical heat flux for both smooth rectangular channels and hypervapotrons, we determined the six unique input parameters that accurately characterize the boiling physics for ITER flow conditions under a wide range of absorbed heat flux. We can now exploit this capability to predict the onset of critical heat flux in these components. In addition, the results clearly illustrate the production and transport of vapor and its effect on heat transfer in pfcs from nucleate boiling through transition to film boiling. This article describes the boiling physics implemented in CCM+ and compares the computational results to the benchmark experiments carried out independently in the United States and Russia. Temperature distributions agreed to within 10 °C for a wide range of heat fluxes from 3 MW/m2 to 10 MW/m2 and flow velocities from 1 m/s to 10 m/s in these devices. Although the analysis is incapable of capturing the stochastic nature of critical heat flux (i.e., time and location may depend on a local materials defect or turbulence phenomenon), it is highly reliable in determining the heat flux where boiling instabilities begin.

  14. A critical analysis of computational protein design with sparse residue interaction graphs.

    PubMed

    Jain, Swati; Jou, Jonathan D; Georgiev, Ivelin S; Donald, Bruce R

    2017-03-01

    Protein design algorithms enumerate a combinatorial number of candidate structures to compute the Global Minimum Energy Conformation (GMEC). To efficiently find the GMEC, protein design algorithms must methodically reduce the conformational search space. By applying distance and energy cutoffs, the protein system to be designed can thus be represented using a sparse residue interaction graph, where the number of interacting residue pairs is less than all pairs of mutable residues, and the corresponding GMEC is called the sparse GMEC. However, ignoring some pairwise residue interactions can lead to a change in the energy, conformation, or sequence of the sparse GMEC vs. the original or the full GMEC. Despite the widespread use of sparse residue interaction graphs in protein design, the above mentioned effects of their use have not been previously analyzed. To analyze the costs and benefits of designing with sparse residue interaction graphs, we computed the GMECs for 136 different protein design problems both with and without distance and energy cutoffs, and compared their energies, conformations, and sequences. Our analysis shows that the differences between the GMECs depend critically on whether or not the design includes core, boundary, or surface residues. Moreover, neglecting long-range interactions can alter local interactions and introduce large sequence differences, both of which can result in significant structural and functional changes. Designs on proteins with experimentally measured thermostability show it is beneficial to compute both the full and the sparse GMEC accurately and efficiently. To this end, we show that a provable, ensemble-based algorithm can efficiently compute both GMECs by enumerating a small number of conformations, usually fewer than 1000. This provides a novel way to combine sparse residue interaction graphs with provable, ensemble-based algorithms to reap the benefits of sparse residue interaction graphs while avoiding their
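
    A toy example can make the sparse-graph effect concrete. The following sketch (not the authors' provable, ensemble-based algorithm) brute-forces the GMEC of a hypothetical three-position, two-rotamer design problem with and without an energy cutoff that drops one weak long-range edge; the energies are invented, and dropping the edge changes the GMEC:

```python
from itertools import product

# Toy design problem: 3 positions, 2 "rotamers" each.
# Pairwise energies pair_E[(i, j)][(ri, rj)]; singleton terms omitted for brevity.
pair_E = {
    (0, 1): {(0, 0): -1.0, (0, 1): 0.5, (1, 0): 0.4, (1, 1): -0.8},
    (1, 2): {(0, 0): -0.95, (0, 1): 0.3, (1, 0): 0.2, (1, 1): -1.1},
    (0, 2): {(0, 0): 0.15, (0, 1): -0.2, (1, 0): 0.1, (1, 1): 0.05},  # weak long-range edge
}

def gmec(pairs, cutoff=None):
    """Brute-force GMEC; edges whose terms are all below |cutoff| are dropped
    (i.e. the design uses a sparse residue interaction graph)."""
    if cutoff is not None:
        pairs = {e: t for e, t in pairs.items()
                 if max(abs(v) for v in t.values()) >= cutoff}
    return min(product([0, 1], repeat=3),
               key=lambda c: sum(t[(c[i], c[j])] for (i, j), t in pairs.items()))

print(gmec(pair_E))               # (1, 1, 1): the full GMEC
print(gmec(pair_E, cutoff=0.25))  # (0, 0, 0): the sparse GMEC differs
```

    Even though the dropped edge contributes at most 0.2 energy units, its removal flips every rotamer choice, which is exactly the kind of conformation and sequence change the paper quantifies on real design problems.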

  15. A critical analysis of computational protein design with sparse residue interaction graphs

    PubMed Central

    Georgiev, Ivelin S.

    2017-01-01

    Protein design algorithms enumerate a combinatorial number of candidate structures to compute the Global Minimum Energy Conformation (GMEC). To efficiently find the GMEC, protein design algorithms must methodically reduce the conformational search space. By applying distance and energy cutoffs, the protein system to be designed can thus be represented using a sparse residue interaction graph, where the number of interacting residue pairs is less than all pairs of mutable residues, and the corresponding GMEC is called the sparse GMEC. However, ignoring some pairwise residue interactions can lead to a change in the energy, conformation, or sequence of the sparse GMEC vs. the original or the full GMEC. Despite the widespread use of sparse residue interaction graphs in protein design, the above mentioned effects of their use have not been previously analyzed. To analyze the costs and benefits of designing with sparse residue interaction graphs, we computed the GMECs for 136 different protein design problems both with and without distance and energy cutoffs, and compared their energies, conformations, and sequences. Our analysis shows that the differences between the GMECs depend critically on whether or not the design includes core, boundary, or surface residues. Moreover, neglecting long-range interactions can alter local interactions and introduce large sequence differences, both of which can result in significant structural and functional changes. Designs on proteins with experimentally measured thermostability show it is beneficial to compute both the full and the sparse GMEC accurately and efficiently. To this end, we show that a provable, ensemble-based algorithm can efficiently compute both GMECs by enumerating a small number of conformations, usually fewer than 1000. This provides a novel way to combine sparse residue interaction graphs with provable, ensemble-based algorithms to reap the benefits of sparse residue interaction graphs while avoiding their

  16. Critical Thinking Assessment: Measuring a Moving Target. Report & Recommendations of the South Carolina Higher Education Assessment Network Critical Thinking Task Force.

    ERIC Educational Resources Information Center

    Cook, Patricia; Johnson, Reid; Moore, Phil; Myers, Phyllis; Pauly, Susan; Pendarvis, Faye; Prus, Joe; Ulmer-Sottong, Lovely

    This report is part of South Carolina's effort to move toward "100 percent performance funding" for the state's public colleges and universities and results from a task force's investigation of ways to assess critical thinking. The following eight major findings are reported: (1) policy makers must determine priorities; (2) critical…

  17. Nuclear criticality safety assessment of the proposed CFC replacement coolants

    SciTech Connect

    Jordan, W.C.; Dyer, H.R.

    1993-12-01

    The neutron multiplication characteristics of refrigerant-114 (R-114) and the proposed replacement coolants perfluorobutane (C4F10) and cycloperfluorobutane (C4F8) have been compared by evaluating the infinite-media multiplication factors of UF6/H/coolant systems and by replacement calculations considering a 10-MW freezer/sublimer. The results of these comparisons demonstrate that R-114 is a neutron absorber, due to its chlorine content, and that the alternative fluorocarbon coolants are neutron moderators. Estimates of critical spherical geometries considering mixtures of UF6/HF/C4F10 indicate that the fluorocarbon-moderated systems are large compared with water-moderated systems. The freezer/sublimer calculations indicate that the alternative coolants are more reactive than R-114, but that the reactivity remains significantly below the condition of water in the tubes, which was a limiting condition. Based on these results, the alternative coolants appear to be acceptable; however, several follow-up tasks have been recommended, and additional evaluation will be required on an individual-equipment basis.

  18. A conceptual framework for developing a critical thinking self-assessment scale.

    PubMed

    Nair, Girija G; Stamler, Lynnette Leeseberg

    2013-03-01

    Nurses must be talented critical thinkers to cope with the challenges related to the ever-changing health care system, population trends, and extended role expectations. Several countries now recognize critical thinking skills (CTS) as an expected outcome of nursing education programs. Critical thinking has been defined in multiple ways by philosophers, critical thinking experts, and educators. Nursing experts conceptualize critical thinking as a process involving cognitive and affective domains of reasoning. Nurse educators are often challenged with teaching and measuring CTS because of their latent nature and the lack of a uniform definition of the concept. In this review of the critical thinking literature, we examine various definitions, identify a set of constructs that define critical thinking, and suggest a conceptual framework on which to base a self-assessment scale for measuring CTS.

  19. Using Critical Thinking Drills to Teach and Assess Proficiency in Methodological and Statistical Thinking

    ERIC Educational Resources Information Center

    Cascio, Ted V.

    2017-01-01

    This study assesses the effectiveness of critical thinking drills (CTDs), a repetitious classroom activity designed to improve methodological and statistical thinking in relation to psychological claims embedded in popular press articles. In each of four separate CTDs, students critically analyzed a brief article reporting a recent psychological…

  20. Data connectivity: A critical tool for external quality assessment.

    PubMed

    Cheng, Ben; Cunningham, Brad; Boeras, Debrah I; Mafaune, Patron; Simbi, Raiva; Peeling, Rosanna W

    2016-01-01

    Point-of-care (POC) tests have been useful in increasing access to testing and treatment monitoring for HIV. Decentralising testing from laboratories to hundreds of sites around a country presents tremendous challenges in training and quality assurance. In order to address these concerns, companies are now either embedding connectivity in their new POC diagnostic instruments or providing some form of channel for electronic result exchange. These will allow automated transmission of key performance and operational metrics from devices in the field to a central database. Setting up connectivity between these POC devices and a central database at the Ministries of Health will allow automated data transmission, creating an opportunity for real-time information on diagnostic instrument performance as well as the competency of the operator through external quality assessment. A pilot programme in Zimbabwe shows that connectivity has significantly improved the turn-around time of external quality assessment result submissions and allowed corrective actions to be provided in a timely manner. Furthermore, by linking the data to existing supply chain management software, stock-outs can be minimised. As countries work toward achieving the 90-90-90 targets for HIV, such innovative technologies can automate disease surveillance, improve the quality of testing and strengthen the efficiency of health systems.

  1. A critical role for network structure in seizure onset: a computational modeling approach.

    PubMed

    Petkov, George; Goodfellow, Marc; Richardson, Mark P; Terry, John R

    2014-01-01

    Recent clinical work has implicated network structure as critically important in the initiation of seizures in people with idiopathic generalized epilepsies. In line with this idea, functional networks derived from the electroencephalogram (EEG) at rest have been shown to be significantly different in people with generalized epilepsy compared to controls. In particular, the mean node degree of networks from the epilepsy cohort was found to be statistically significantly higher than those of controls. However, the mechanisms by which these network differences can support recurrent transitions into seizures remain unclear. In this study, we use a computational model of the transition into seizure dynamics to explore the dynamic consequences of these differences in functional networks. We demonstrate that networks with higher mean node degree are more prone to generating seizure dynamics in the model and therefore suggest a mechanism by which increased mean node degree of brain networks can cause heightened ictogenicity.
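
    The network statistic at the centre of this study, mean node degree, is simple to compute. A minimal sketch with hypothetical adjacency lists (not the study's EEG-derived networks):

```python
def mean_degree(adj):
    """Mean node degree of an undirected graph given as an adjacency list."""
    return sum(len(nbrs) for nbrs in adj.values()) / len(adj)

# Hypothetical functional networks (undirected, symmetric adjacency lists):
control = {0: [1], 1: [0, 2], 2: [1], 3: []}
patient = {0: [1, 2, 3], 1: [0, 2], 2: [0, 1, 3], 3: [0, 2]}
print(mean_degree(control), mean_degree(patient))  # 1.0 2.5
```

    In the model described above, networks like the second one, with higher mean degree, are the ones more prone to transitioning into seizure dynamics.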

  2. Concepts and techniques: Active electronics and computers in safety-critical accelerator operation

    SciTech Connect

    Frankel, R.S.

    1995-12-31

    The Relativistic Heavy Ion Collider (RHIC), under construction at Brookhaven National Laboratory, requires an extensive Access Control System to protect personnel from radiation, oxygen deficiency and electrical hazards. In addition, the complicated nature of operating the Collider as part of a complex of other accelerators necessitates the use of active electronic measurement circuitry to ensure compliance with established Operational Safety Limits. Solutions were devised that permit the use of modern computer and interconnection technology for safety-critical applications while preserving and enhancing tried and proven protection methods. In addition, a set of guidelines regarding required performance for accelerator safety systems and a handbook of design criteria and rules were developed to assist future system designers and to provide a framework for internal review and regulation.

  3. Applying analytic hierarchy process to assess healthcare-oriented cloud computing service systems.

    PubMed

    Liao, Wen-Hwa; Qiu, Wan-Li

    2016-01-01

    Numerous differences exist between the healthcare industry and other industries. Difficulties in the business operation of the healthcare industry have continually increased because of the volatility and importance of health care, changes to and requirements of health insurance policies, and the statuses of healthcare providers, which are typically considered not-for-profit organizations. Moreover, because of the financial risks associated with constant changes in healthcare payment methods and constantly evolving information technology, healthcare organizations must continually adjust their business operation objectives; therefore, cloud computing presents both a challenge and an opportunity. As a response to aging populations and the prevalence of the Internet in fast-paced contemporary societies, cloud computing can be used to facilitate the task of balancing the quality and costs of health care. To evaluate cloud computing service systems for use in health care, providing decision makers with a comprehensive assessment method for prioritizing decision-making factors is highly beneficial. Hence, this study applied the analytic hierarchy process, compared items related to cloud computing and health care, executed a questionnaire survey, and then classified the critical factors influencing healthcare cloud computing service systems on the basis of statistical analyses of the questionnaire results. The results indicate that the primary factor affecting the design or implementation of optimal cloud computing healthcare service systems is cost effectiveness, with the secondary factors being practical considerations such as software design and system architecture.
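
    The core AHP computation this study relies on, deriving priority weights from a pairwise-comparison matrix via its principal eigenvector, can be sketched as follows. The comparison values below use Saaty's 1-9 scale but are invented for illustration, not taken from the survey:

```python
def ahp_weights(M, iters=100):
    """Priority weights from a pairwise-comparison matrix via power iteration
    (principal right eigenvector), plus Saaty's consistency index."""
    n = len(M)
    w = [1.0 / n] * n
    for _ in range(iters):
        v = [sum(M[i][j] * w[j] for j in range(n)) for i in range(n)]
        total = sum(v)
        w = [x / total for x in v]
    # lambda_max estimate and consistency index CI = (lambda_max - n) / (n - 1)
    lam = sum(sum(M[i][j] * w[j] for j in range(n)) / w[i] for i in range(n)) / n
    ci = (lam - n) / (n - 1)
    return w, ci

# Hypothetical 3-criterion matrix (cost effectiveness vs software design vs
# system architecture); M[i][j] says how much criterion i outweighs criterion j.
M = [[1.0,   3.0,   5.0],
     [1/3.0, 1.0,   2.0],
     [1/5.0, 1/2.0, 1.0]]
weights, ci = ahp_weights(M)
print([round(w, 3) for w in weights], round(ci, 3))
```

    With judgments like these the first criterion dominates the weight vector, mirroring the study's finding that cost effectiveness is the primary factor; a consistency index well below 0.1 indicates the pairwise judgments are acceptably coherent.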

  4. Assessing computer waste generation in Chile using material flow analysis.

    PubMed

    Steubing, Bernhard; Böni, Heinz; Schluep, Mathias; Silva, Uca; Ludwig, Christian

    2010-03-01

    The quantities of e-waste are expected to increase sharply in Chile. The purpose of this paper is to provide a quantitative data basis on generated e-waste quantities. A material flow analysis was carried out assessing the generation of e-waste from computer equipment (desktop and laptop PCs as well as CRT and LCD-monitors). Import and sales data were collected from the Chilean Customs database as well as from publications by the International Data Corporation. A survey was conducted to determine consumers' choices with respect to storage, re-use and disposal of computer equipment. The generation of e-waste was assessed in a baseline as well as upper and lower scenarios until 2020. The results for the baseline scenario show that about 10,000 and 20,000 tons of computer waste may be generated in the years 2010 and 2020, respectively. The cumulative e-waste generation will be four to five times higher in the upcoming decade (2010-2019) than during the current decade (2000-2009). By 2020, the shares of LCD-monitors and laptops will increase more rapidly replacing other e-waste including the CRT-monitors. The model also shows the principal flows of computer equipment from production and sale to recycling and disposal. The re-use of computer equipment plays an important role in Chile. An appropriate recycling scheme will have to be introduced to provide adequate solutions for the growing rate of e-waste generation. Copyright 2009 Elsevier Ltd. All rights reserved.
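
    The core of such a material flow analysis can be sketched as a simple stock-and-flow calculation: waste arising in a given year is past sales weighted by a lifespan distribution. The sales figures and lifespan probabilities below are invented for illustration, not the study's Chilean data:

```python
def waste_generated(sales, lifespan_pmf):
    """Obsolete-equipment outflow per year from annual sales and a discrete
    lifespan distribution (lifespan_pmf[k-1] = probability a unit is
    discarded k years after sale)."""
    years = len(sales)
    waste = [0.0] * years
    for t, sold in enumerate(sales):
        for k, p in enumerate(lifespan_pmf, start=1):
            if t + k < years:
                waste[t + k] += sold * p
    return waste

# Hypothetical sales (thousand units/year) and a 1-5 year lifespan distribution:
sales = [100, 120, 150, 180, 210, 250]
lifespan = [0.05, 0.15, 0.30, 0.30, 0.20]
print([round(w, 1) for w in waste_generated(sales, lifespan)])
# [0.0, 5.0, 21.0, 55.5, 97.5, 138.5]
```

    Because the outflow lags sales by the equipment lifetime, growing sales imply a delayed but steep rise in waste, which is the mechanism behind the projected four- to five-fold increase over the following decade.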

  5. Assessing computer skills in Tanzanian medical students: an elective experience

    PubMed Central

    Samuel, Miriam; Coombes, John C; Miranda, J Jaime; Melvin, Rob; Young, Eoin JW; Azarmina, Pejman

    2004-01-01

    Background One estimate suggests that by 2010 more than 30% of a physician's time will be spent using information technology tools. The aim of this study is to assess the information and communication technologies (ICT) skills of medical students in Tanzania. We also report a pilot intervention of peer mentoring training in ICT by medical students from the UK tutoring students in Tanzania. Methods Design: Cross sectional study and pilot intervention study. Participants: Fourth year medical students (n = 92) attending Muhimbili University College of Health Sciences, Dar es Salaam, Tanzania. Main outcome measures: Self-reported assessment of competence on ICT-related topics and ability to perform specific ICT tasks. Further information related to frequency of computer use (hours per week), years of computer use, reasons for use and access to computers. Skills at specific tasks were reassessed for 12 students following 4 to 6 hours of peer mentoring training. Results The highest levels of competence in generic ICT areas were for email, Internet and file management. For other skills such as word processing most respondents reported low levels of competence. The abilities to perform specific ICT skills were low – less than 60% of the participants were able to perform the core specific skills assessed. A period of approximately 5 hours of peer mentoring training produced an approximate doubling of competence scores for these skills. Conclusion Our study has found a low level of ability to use ICT facilities among medical students in a leading university in sub-Saharan Africa. A pilot scheme utilising UK elective students to tutor basic skills showed potential. Attention is required to develop interventions that can improve ICT skills, as well as computer access, in order to bridge the digital divide. PMID:15306029

  6. Blending Qualitative and Computational Linguistics Methods for Fidelity Assessment: Experience with the Familias Unidas Preventive Intervention.

    PubMed

    Gallo, Carlos; Pantin, Hilda; Villamar, Juan; Prado, Guillermo; Tapia, Maria; Ogihara, Mitsunori; Cruden, Gracelyn; Brown, C Hendricks

    2015-09-01

    Careful fidelity monitoring and feedback are critical to implementing effective interventions. A wide range of procedures exist to assess fidelity; most are derived from observational assessments (Schoenwald and Garland, Psycholog Assess 25:146-156, 2013). However, these fidelity measures are resource intensive for research teams in efficacy/effectiveness trials, and are often unattainable or unmanageable for the host organization to rate when the program is implemented on a large scale. We present a first step towards automated processing of linguistic patterns in fidelity monitoring of a behavioral intervention using an innovative mixed methods approach to fidelity assessment that uses rule-based, computational linguistics to overcome major resource burdens. Data come from an effectiveness trial of the Familias Unidas intervention, an evidence-based, family-centered preventive intervention found to be efficacious in reducing conduct problems, substance use and HIV sexual risk behaviors among Hispanic youth. This computational approach focuses on "joining," which measures the quality of the working alliance of the facilitator with the family. Quantitative assessments of reliability are provided. Kappa scores between a human rater and a machine rater for the new method for measuring joining reached 0.83. Early findings suggest that this approach can reduce the high cost of fidelity measurement and the time delay between fidelity assessment and feedback to facilitators; it also has the potential for improving the quality of intervention fidelity ratings.
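
    The inter-rater statistic reported here, Cohen's kappa between the human and machine rater, can be computed as follows; the joining-quality ratings below are hypothetical, not the trial's data:

```python
from collections import Counter

def cohens_kappa(r1, r2):
    """Cohen's kappa for two raters over the same items:
    (observed agreement - chance agreement) / (1 - chance agreement)."""
    n = len(r1)
    po = sum(a == b for a, b in zip(r1, r2)) / n
    c1, c2 = Counter(r1), Counter(r2)
    pe = sum(c1[k] * c2[k] for k in set(c1) | set(c2)) / (n * n)
    return (po - pe) / (1 - pe)

# Hypothetical joining-quality ratings (lo/med/hi) for 10 sessions:
human   = ["hi", "hi", "med", "lo", "hi", "med", "med", "lo", "hi", "med"]
machine = ["hi", "hi", "med", "lo", "hi", "med", "lo",  "lo", "hi", "med"]
print(round(cohens_kappa(human, machine), 2))  # 0.85
```

    Kappa corrects raw percent agreement for agreement expected by chance, which is why it is the standard reliability measure when comparing an automated rater against a human gold standard.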

  7. Complexity theory and geographies of health: a critical assessment.

    PubMed

    Gatrell, Anthony C

    2005-06-01

    The interest of social scientists in complexity theory has developed rapidly in recent years. Here, I consider briefly the primary characteristics of complexity theory, with particular emphasis given to relations and networks, non-linearity, emergence, and hybrids. I assess the 'added value' compared with other, existing perspectives that emphasise relationality and connectedness. I also consider the philosophical underpinnings of complexity theory and its reliance on metaphor. As a vehicle for moving away from reductionist accounts, complexity theory potentially has much to say to those interested in research on health inequalities, spatial diffusion, emerging and resurgent infections, and risk. These and other applications in health geography that have invoked complexity theory are examined in the paper. Finally, I consider some of the missing elements in complexity theory and argue that while it is refreshing to see a fruitful line of theoretical debate in health geography, we need good empirical work to illuminate it.

  8. A critical appraisal of vertebral fracture assessment in paediatrics.

    PubMed

    Kyriakou, Andreas; Shepherd, Sheila; Mason, Avril; Faisal Ahmed, S

    2015-12-01

There is a need to improve our understanding of the clinical utility of vertebral fracture assessment (VFA) in paediatrics and this requires a thorough evaluation of its readability, reproducibility, and accuracy for identifying VF. VFA was performed independently by two observers, in 165 children and adolescents with a median age of 13.4 years (range, 3.6, 18). In 20 of these subjects, VFA was compared to lateral vertebral morphometry assessment on lateral spine X-ray (LVM). 1528 (84%) of the vertebrae were adequately visualised by both observers for VFA. Interobserver agreement in vertebral readability was 94% (kappa, 0.73 [95% CI, 0.68, 0.73]). 93% of the non-readable vertebrae were located between T6 and T9. Interobserver agreement per-vertebra for the presence of VF was 99% (kappa, 0.85 [95% CI, 0.79, 0.91]). Interobserver agreement per-subject was 91% (kappa, 0.78 [95% CI, 0.66, 0.87]). Per-vertebra agreement between LVM and VFA was 95% (kappa 0.79 [95% CI, 0.62, 0.92]) and per-subject agreement was 95% (kappa, 0.88 [95% CI, 0.58, 1.0]). Accepting LVM as the gold standard, VFA had a positive predictive value (PPV) of 90% and a negative predictive value (NPV) of 95% in per-vertebra analysis and a PPV of 100% and NPV of 93% in per-subject analysis. VFA reaches an excellent level of agreement between observers and a high level of accuracy in identifying VF in a paediatric population. The readability of vertebrae at the mid-thoracic region is suboptimal, and caution should be exercised when interpreting findings at this level. Copyright © 2015 Elsevier Inc. All rights reserved.
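The PPV and NPV figures above follow directly from a 2x2 confusion table against the LVM gold standard. A minimal sketch, with hypothetical counts chosen only to reproduce the reported per-vertebra values:

```python
def predictive_values(tp, fp, tn, fn):
    """Positive and negative predictive values from a 2x2 confusion table
    (tp/fp/tn/fn = true/false positives and negatives vs the gold standard)."""
    ppv = tp / (tp + fp)
    npv = tn / (tn + fn)
    return ppv, npv

# illustrative counts only -- the study's raw table is not given in the abstract
ppv, npv = predictive_values(tp=9, fp=1, tn=19, fn=1)
```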

  9. Assessment of asthmatic inflammation using hybrid fluorescence molecular tomography-x-ray computed tomography

    NASA Astrophysics Data System (ADS)

    Ma, Xiaopeng; Prakash, Jaya; Ruscitti, Francesca; Glasl, Sarah; Stellari, Fabio Franco; Villetti, Gino; Ntziachristos, Vasilis

    2016-01-01

    Nuclear imaging plays a critical role in asthma research but is limited in its readings of biology due to the short-lived signals of radio-isotopes. We employed hybrid fluorescence molecular tomography (FMT) and x-ray computed tomography (XCT) for the assessment of asthmatic inflammation based on resolving cathepsin activity and matrix metalloproteinase activity in dust mite, ragweed, and Aspergillus species-challenged mice. The reconstructed multimodal fluorescence distribution showed good correspondence with ex vivo cryosection images and histological images, confirming FMT-XCT as an interesting alternative for asthma research.

  10. Does computer-aided formative assessment improve learning outcomes?

    NASA Astrophysics Data System (ADS)

    Hannah, John; James, Alex; Williams, Phillipa

    2014-02-01

    Two first-year engineering mathematics courses used computer-aided assessment (CAA) to provide students with opportunities for formative assessment via a series of weekly quizzes. Most students used the assessment until they achieved very high (>90%) quiz scores. Although there is a positive correlation between these quiz marks and the final exam marks, spending time on the CAA component of the course was negatively correlated with final exam performance. Students across the ability spectrum reduced their time commitment to CAA in their second semester, with weaker students achieving lower quiz totals, but with more able students' quiz marks hardly affected. Despite this lower quiz performance, the weaker students still improved their final exam marks in the second semester.
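The correlations the abstract describes (positive between quiz marks and exam marks, negative between CAA time and exam performance) are plain Pearson coefficients. A self-contained sketch, with invented data:

```python
from math import sqrt

def pearson_r(xs, ys):
    """Pearson product-moment correlation between two equal-length samples."""
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    cov = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
    sx = sqrt(sum((x - mx) ** 2 for x in xs))
    sy = sqrt(sum((y - my) ** 2 for y in ys))
    return cov / (sx * sy)

# hypothetical (quiz mark, exam mark) pairs: r should come out positive
quiz = [70, 85, 92, 98]
exam = [55, 60, 71, 80]
```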

  11. Alcohol assessment using wireless handheld computers: a pilot study.

    PubMed

    Bernhardt, Jay M; Usdan, Stuart; Mays, Darren; Arriola, Kimberly Jacob; Martin, Ryan J; Cremeens, Jennifer; McGill, Tia; Weitzel, Jessica Aungst

    2007-12-01

    The present study sought to test the feasibility of measuring quantity and frequency of self-reported alcohol consumption among college students using the Handheld Assisted Network Diary (HAND) by comparing results to a retrospective Timeline Followback (TLFB). A total of 40 undergraduate college students completed a HAND assessment during the two-week study period and completed a TLFB at follow-up. The HAND recorded similar levels of alcohol consumption compared to the TLFB. There were no significant differences in overall alcohol consumption, drinks per drinking day, or heavy drinking days between the two methods of assessment. Handheld computers may represent a useful tool for assessing daily alcohol use among college students.

  12. Development of Assessment Instrument of Critical Thinking in Physics at Senior High School

    NASA Astrophysics Data System (ADS)

    Sugiarti, T.; Kaniawati, I.; Aviyanti, L.

    2017-02-01

The result of a preliminary study shows that physics assessment in schools did not train students' critical thinking skills; the assessment instruments measured only low-level cognitive aspects, although critical thinking skill should be trained through assessment activities. This study aims to determine the characteristics and quality of a critical thinking skill instrument. It employs a descriptive-qualitative method with research and development as the research design. The participants are 35 students in the limited trial and 188 students in the wider trial, drawn from three high-performing public senior high schools in Ciamis. The data were collected through expert validation, tests and interviews. The results indicate that the assessment instrument of critical thinking skill is characterised by open-ended items. The instrument covers several indicators, namely analyzing arguments, deduction and induction, and presents information in the form of scenarios, text, graphics and tables. In addition, data processing with the Anates V4 program shows an instrument reliability of 0.67 (high interpretation) and a validity of 0.47 (adequate interpretation). Thus, the assessment instrument of critical thinking skill, in the form of open-ended essay items, meets the test quality criteria and can be used to assess critical thinking skill.

  13. Control System Applicable Use Assessment of the Secure Computing Corporation - Secure Firewall (Sidewinder)

    SciTech Connect

    Hadley, Mark D.; Clements, Samuel L.

    2009-01-01

    Battelle’s National Security & Defense objective is, “applying unmatched expertise and unique facilities to deliver homeland security solutions. From detection and protection against weapons of mass destruction to emergency preparedness/response and protection of critical infrastructure, we are working with industry and government to integrate policy, operational, technological, and logistical parameters that will secure a safe future”. In an ongoing effort to meet this mission, engagements with industry that are intended to improve operational and technical attributes of commercial solutions that are related to national security initiatives are necessary. This necessity will ensure that capabilities for protecting critical infrastructure assets are considered by commercial entities in their development, design, and deployment lifecycles thus addressing the alignment of identified deficiencies and improvements needed to support national cyber security initiatives. The Secure Firewall (Sidewinder) appliance by Secure Computing was assessed for applicable use in critical infrastructure control system environments, such as electric power, nuclear and other facilities containing critical systems that require augmented protection from cyber threat. The testing was performed in the Pacific Northwest National Laboratory’s (PNNL) Electric Infrastructure Operations Center (EIOC). The Secure Firewall was tested in a network configuration that emulates a typical control center network and then evaluated. A number of observations and recommendations are included in this report relating to features currently included in the Secure Firewall that support critical infrastructure security needs.

  14. A computer program for conducting incinerator risk assessments.

    PubMed

    Walter, M A

    1999-02-01

In 1994, the United States Environmental Protection Agency (USEPA) developed a screening methodology for conducting indirect exposure risk assessments for combustion facilities. The United States Army Center for Health Promotion and Preventive Medicine currently utilizes this methodology in conjunction with other USEPA guidance documents to perform human health risk assessments (HHRAs). The HHRAs require the development of complex human health models using spreadsheet software packages which estimate various media concentrations of contaminants in the environment. Since the quality assurance/quality control procedures associated with verifying the model's results are extremely time consuming, a computer program was developed using Microsoft Excel to minimize the amount of time needed. This discussion describes the 6 steps taken in developing this computer program, which are: (1) understanding the problem; (2) establishing the structure of each table in the spreadsheets; (3) developing an algorithm to solve the problem; (4) writing code; (5) running the program; and (6) testing the results. The automated process of having the computer predict health risk and hazards for each potentially exposed individual saves a tremendous amount of time because each calculated value is placed in the correct spreadsheet cell location. In addition to reducing the time needed to develop human health spreadsheets, this program also reduces the potential for human error.

  15. Computational Fluid Dynamics Framework for Turbine Biological Performance Assessment

    SciTech Connect

    Richmond, Marshall C.; Serkowski, John A.; Carlson, Thomas J.; Ebner, Laurie L.; Sick, Mirjam; Cada, G. F.

    2011-05-04

    In this paper, a method for turbine biological performance assessment is introduced to bridge the gap between field and laboratory studies on fish injury and turbine design. Using this method, a suite of biological performance indicators is computed based on simulated data from a computational fluid dynamics (CFD) model of a proposed turbine design. Each performance indicator is a measure of the probability of exposure to a certain dose of an injury mechanism. If the relationship between the dose of an injury mechanism and frequency of injury (dose-response) is known from laboratory or field studies, the likelihood of fish injury for a turbine design can be computed from the performance indicator. By comparing the values of the indicators from various turbine designs, the engineer can identify the more-promising designs. Discussion here is focused on Kaplan-type turbines, although the method could be extended to other designs. Following the description of the general methodology, we will present sample risk assessment calculations based on CFD data from a model of the John Day Dam on the Columbia River in the USA.
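The core calculation described above — combining a CFD-derived exposure distribution with a laboratory dose-response relationship to estimate injury likelihood — can be sketched as a simple expectation over dose bins. The bin names and probabilities below are hypothetical, not values from the paper:

```python
def injury_likelihood(exposure_probs, dose_response):
    """Expected injury probability: sum over dose bins of
    P(exposure to dose) * P(injury | dose)."""
    return sum(p * dose_response[dose] for dose, p in exposure_probs.items())

# hypothetical bins for one injury mechanism (e.g. shear stress exposure)
exposure = {"low": 0.7, "medium": 0.2, "high": 0.1}    # from a CFD indicator
response = {"low": 0.01, "medium": 0.10, "high": 0.50}  # from lab dose-response
```

Comparing this expected value across candidate turbine designs is what lets the engineer rank the more-promising ones.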

  16. Computational Pollutant Environment Assessment from Propulsion-System Testing

    NASA Technical Reports Server (NTRS)

    Wang, Ten-See; McConnaughey, Paul; Chen, Yen-Sen; Warsi, Saif

    1996-01-01

    An asymptotic plume growth method based on a time-accurate three-dimensional computational fluid dynamics formulation has been developed to assess the exhaust-plume pollutant environment from a simulated RD-170 engine hot-fire test on the F1 Test Stand at Marshall Space Flight Center. Researchers have long known that rocket-engine hot firing has the potential for forming thermal nitric oxides, as well as producing carbon monoxide when hydrocarbon fuels are used. Because of the complex physics involved, most attempts to predict the pollutant emissions from ground-based engine testing have used simplified methods, which may grossly underpredict and/or overpredict the pollutant formations in a test environment. The objective of this work has been to develop a computational fluid dynamics-based methodology that replicates the underlying test-stand flow physics to accurately and efficiently assess pollutant emissions from ground-based rocket-engine testing. A nominal RD-170 engine hot-fire test was computed, and pertinent test-stand flow physics was captured. The predicted total emission rates compared reasonably well with those of the existing hydrocarbon engine hot-firing test data.

  17. Use of writing portfolios for interdisciplinary assessment of critical thinking outcomes of nursing students.

    PubMed

    Sorrell, J M; Brown, H N; Silva, M C; Kohlenberg, E M

    1997-01-01

This article discusses an interdisciplinary research project in which faculty from nursing and English collaborated in the assessment of students' critical thinking skills as reflected in writing portfolios. Faculty reviewed students' writing portfolios and then corresponded by email from two different universities about evidence of critical thinking in the portfolios. Findings suggest that writing portfolios can provide important evidence of critical thinking outcomes. To do this, however, faculty need to design writing assignments to foster critical thinking skills, helping students to think not only about learning to write, but also about using writing to learn.

  18. Evaluation of the Food and Agriculture Sector Criticality Assessment Tool (FASCAT) and the Collected Data.

    PubMed

    Huff, Andrew G; Hodges, James S; Kennedy, Shaun P; Kircher, Amy

    2015-08-01

To protect and secure food resources for the United States, it is crucial to have a method to compare food systems' criticality. In 2007, the U.S. government funded development of the Food and Agriculture Sector Criticality Assessment Tool (FASCAT) to determine which food and agriculture systems were most critical to the nation. FASCAT was developed in a collaborative process involving government officials and food industry subject matter experts (SMEs). After development, data were collected using FASCAT to quantify threats, vulnerabilities, consequences, and the impacts on the United States from failure of evaluated food and agriculture systems. To examine FASCAT's utility, linear regression models were used to determine: (1) which groups of questions posed in FASCAT were better predictors of cumulative criticality scores; (2) whether the items included in FASCAT's criticality method or the smaller subset of FASCAT items included in DHS's risk analysis method predicted similar criticality scores. Akaike's information criterion was used to determine which regression models best described criticality, and a mixed linear model was used to shrink estimates of criticality for individual food and agriculture systems. The results indicated that: (1) some of the questions used in FASCAT strongly predicted food or agriculture system criticality; (2) the FASCAT criticality formula was a stronger predictor of criticality compared to the DHS risk formula; (3) the cumulative criticality formula predicted criticality more strongly than the weighted criticality formula; and (4) the mixed linear regression model did not change the rank-order of food and agriculture system criticality to a large degree. © 2015 Society for Risk Analysis.
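Comparing regression models by Akaike's information criterion, as done above, reduces to fitting each candidate model and ranking by AIC (lower is better). A minimal single-predictor least-squares sketch, with invented data; for Gaussian errors, AIC up to a constant is n·ln(RSS/n) + 2k:

```python
from math import log

def fit_line(x, y):
    """Ordinary least squares for y = a + b*x (closed form)."""
    n = len(x)
    mx, my = sum(x) / n, sum(y) / n
    b = sum((xi - mx) * (yi - my) for xi, yi in zip(x, y)) / \
        sum((xi - mx) ** 2 for xi in x)
    return my - b * mx, b   # intercept a, slope b

def aic(x, y, k=2):
    """AIC (up to an additive constant) for a least-squares line fit:
    n * ln(RSS / n) + 2k, with k = number of fitted parameters."""
    a, b = fit_line(x, y)
    n = len(x)
    rss = sum((yi - (a + b * xi)) ** 2 for xi, yi in zip(x, y))
    return n * log(rss / n) + 2 * k

# hypothetical criticality scores and two candidate predictor variables
scores = [2.1, 3.9, 6.05, 8.0]
predictor_good = [1, 2, 3, 4]   # nearly linear with the scores
predictor_poor = [4, 1, 3, 2]   # weakly related
```

The model built on `predictor_good` should yield the lower AIC, mirroring how FASCAT question groups were ranked.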

  19. Critical assessment of regulatory standards and tests for fish vaccines.

    PubMed

    Midtlyng, P J

    2005-01-01

Within the European Economic Area (EU member states plus Norway, Iceland and Liechtenstein), veterinary vaccines must comply with EU regulations and relevant monographs of the European Pharmacopoeia. Since 1996, three European monographs specific for fish have been adopted; concerning oil-adjuvanted, injectable vaccine for salmonids against furunculosis, as well as bacterins against classical vibriosis (Listonella anguillarum) or cold-water vibriosis (Vibrio salmonicida) in salmonids. The regulatory requirements laid down in these monographs include the use of seronegative fish for in vivo safety testing; conduct of vaccination trials in which experimental challenge is administered by injection, and minimum relative protection to be achieved at a given level of control mortality. Several aspects of these requirements are being questioned. This concerns the relevance of injection challenge methods as opposed to waterborne challenge; the validity of relative protection assessed at 60% control mortality (RPS60) as compared to protection calculated at the endpoint of mortality (RPSendpoint), and poor test power due to low numbers of fish per treatment group. There is a strong need for future efforts to refine the methods for documentation and testing of fish vaccines, and to assure their suitability for the intended purpose.
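Relative percent survival (RPS), the protection measure debated above, is conventionally computed from cumulative mortalities in the vaccinated and control groups; RPS60 simply evaluates it at the point where control mortality reaches 60%. A sketch of the standard formula:

```python
def rps(mortality_vaccinated, mortality_control):
    """Relative percent survival:
    RPS = (1 - mortality_vaccinated / mortality_control) * 100.
    Mortalities may be proportions or percentages, as long as both match."""
    return (1 - mortality_vaccinated / mortality_control) * 100

# e.g. 15% cumulative mortality in vaccinates when controls reach 60%
protection_at_60 = rps(15, 60)   # evaluated per the RPS60 convention
```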

  20. Advanced criticality assessment method for sewer pipeline assets.

    PubMed

    Syachrani, S; Jeong, H D; Chung, C S

    2013-01-01

    For effective management of water and wastewater infrastructure, the United States Environmental Protection Agency (US-EPA) has long emphasized the significant role of risk in prioritizing and optimizing asset management decisions. High risk assets are defined as assets with a high probability of failure (e.g. soon to fail, old, poor condition) and high consequences of failure (e.g. environmental impact, high expense, safety concerns, social disruption). In practice, the consequences of failure are often estimated by experts through a Delphi method. However, the estimation of the probability of failure has been challenging as it requires the thorough analysis of the historical condition assessment data, repair and replacement records, and other factors influencing the deterioration of the asset. The most common predictor in estimating the probability of failure is calendar age. However, a simple reliance on calendar age as a basis for estimating the asset's deterioration pattern completely ignores the different aging characteristics influenced by various operational and environmental conditions. This paper introduces a new approach of using 'real age' in estimating the probability of failure. Unlike the traditional calendar age method, the real age represents the adjusted age based on the unique operational and environmental conditions of the asset. Depending on the individual deterioration pattern, the real age could be higher or lower than its calendar age. Using the concept of real age, the probability of failure of an asset can be more accurately estimated.
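The 'real age' concept above adjusts calendar age for the asset's operational and environmental conditions. The abstract does not give the adjustment formula, so the multiplicative sketch below is purely illustrative (factors above 1 accelerate aging, below 1 slow it):

```python
def real_age(calendar_age, condition_factors):
    """Illustrative 'real age': calendar age scaled by condition factors
    reflecting the asset's operational/environmental conditions.
    The paper derives its adjustments from historical deterioration data;
    these multiplicative factors are an assumption for demonstration."""
    age = calendar_age
    for factor in condition_factors.values():
        age *= factor
    return age

# hypothetical sewer pipe: corrosive soil and high flow accelerate aging
adjusted = real_age(20, {"soil_corrosivity": 1.3, "flow_regime": 1.1})
```

A probability-of-failure curve would then be evaluated at the adjusted age rather than the calendar age.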

  1. Assessment of low-grade hepatic encephalopathy: a critical analysis.

    PubMed

    Kircheis, Gerald; Fleig, Wolfgang E; Görtelmeyer, Roman; Grafe, Susanne; Häussinger, Dieter

    2007-11-01

    The value of paper-pencil tests and West-Haven-criteria for assessment of low-grade hepatic encephalopathy under conditions of a randomized, double-blind, placebo-controlled, clinical trial was evaluated in a cohort of 217 cirrhotics. Patients were graded at least twice clinically for severity of hepatic encephalopathy and tested concomitantly with a recommended psychometric test battery. Re-evaluation of the study documentation showed that at study entry 33% and during the study even 50% of the patients were wrongly allocated to minimal or overt hepatic encephalopathy. Despite the participating physicians' training, 31% of the number-connection-tests-A, 20% of the number-connection-tests-B and 28% of the line-tracing-test were in retrospect considered invalid by an independent psychologist. Neither the Portosystemic-Encephalopathy-Syndrome (PSE) test nor the Psychometric-Hepatic-Encephalopathy-Sum (PHES)-score reliably picked up clinical improvement in the individual patient. Although these test scores could statistically differentiate between patients with minimal and overt hepatic encephalopathy, the clinical classification of individual patients into one of the groups will have a high rate of error. The PHES-Score was less balanced than the score derived from the PSE-Syndrome-Test. Inaccuracies in conducting paper-pencil tests together with the subjectivity and incorrectness of clinical HE-grading question the usefulness of West-Haven-criteria and paper-pencil tests including related scores for quantification of low-grade HE at least in multicenter approaches.

  2. Concepts in critical thinking applied to caries risk assessment in dental education.

    PubMed

    Guzman-Armstrong, Sandra; Warren, John J; Cunningham-Ford, Marsha A; von Bergmann, HsingChi; Johnsen, David C

    2014-06-01

    Much progress has been made in the science of caries risk assessment and ways to analyze caries risk, yet dental education has seen little movement toward the development of frameworks to guide learning and assess critical thinking in caries risk assessment. In the absence of previous proactive implementation of a learning framework that takes the knowledge of caries risk and critically applies it to the patient with the succinctness demanded in the clinical setting, the purpose of this study was to develop a model learning framework that combines the science of caries risk assessment with principles of critical thinking from the education literature. This article also describes the implementation of that model at one dental school and presents some preliminary assessment data.

  3. Categorization of drugs implicated in causing liver injury: Critical assessment based on published case reports.

    PubMed

    Björnsson, Einar S; Hoofnagle, Jay H

    2016-02-01

An important element in assessing causality in drug-induced liver injury is whether the implicated agent is known to cause hepatotoxicity. We classified drugs into categories based on the number of published reports of convincingly documented, clinically apparent, idiosyncratic liver injury. Drugs described in the website LiverTox (http://livertox.nih.gov) were classified into five categories based on the number of published cases (category A, ≥50; category B, 12-49; category C, 4-11; category D, 1-3; category E, none). Case reports in categories C and D were individually reanalyzed using the Roussel Uclaf Causality Assessment Method. Drugs with fatal cases or with rechallenge were noted. Among 671 individual drugs or closely related agents, 353 (53%) were considered convincingly linked to liver injury in published case reports; 48 (13%) were assigned to category A, 76 (22%) were assigned to category B, 96 (27%) were assigned to category C, and 126 (36%) were assigned to category D. Another 7 (2%) were direct hepatotoxins but only in high doses and placed in a separate category (T). The remaining 318 (47%) drugs had no convincing case report of hepatotoxicity in the literature (category E). All except one in category A have been available since 1999, 98% had at least one fatal case and 89% a positive rechallenge. In category B, 54% had a fatal case and 41% a rechallenge. Drugs in categories C and D less frequently had instances of fatal (23% and 7%) or rechallenge cases (26% and 11%). Documentation of hepatotoxicity in the medical literature is variable, and many published instances do not stand up to critical review. A standardized system for categorizing drugs for hepatotoxicity potential will help develop objective and reliable, computer-based instruments for assessing causality in drug-induced liver injury. © 2015 by the American Association for the Study of Liver Diseases.
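The category thresholds stated in the abstract translate directly into a lookup, which is the kind of rule a computer-based causality instrument could build on. A sketch using exactly those published cut-offs:

```python
def hepatotoxicity_category(case_count, high_dose_direct_toxin=False):
    """Assign a LiverTox-style category from the number of published,
    convincingly documented case reports (thresholds as in the abstract)."""
    if high_dose_direct_toxin:
        return "T"            # direct hepatotoxin, but only at high doses
    if case_count >= 50:
        return "A"
    if case_count >= 12:
        return "B"
    if case_count >= 4:
        return "C"
    if case_count >= 1:
        return "D"
    return "E"                # no convincing published case report
```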

  4. Performance Assessment of OVERFLOW on Distributed Computing Environment

    NASA Technical Reports Server (NTRS)

    Djomehri, M. Jahed; Rizk, Yehia M.

    2000-01-01

The aerodynamic computer code, OVERFLOW, with a multi-zone overset grid feature, has been parallelized to enhance its performance on distributed and shared memory paradigms. Practical application benchmarks have been set to assess the efficiency of the code's parallelism on high-performance architectures. The code's performance has also been experimented with in the context of the distributed computing paradigm on distant computer resources using the Information Power Grid (IPG) toolkit, Globus. Two parallel versions of the code, namely OVERFLOW-MPI and -MLP, have been developed around the natural coarse-grained parallelism inherent in a multi-zonal domain decomposition paradigm. The algorithm invokes a strategy that forms a number of groups, each consisting of a zone, a cluster of zones and/or a partition of a large zone. Each group can be thought of as a process with one or more threads assigned to it; all groups run in parallel. The -MPI version of the code uses explicit message-passing based on the standard MPI library for sending and receiving interzonal boundary data across processors. The -MLP version employs no message-passing paradigm; the boundary data is transferred through the shared memory. The -MPI code is suited for both distributed and shared memory architectures, while the -MLP code can only be used on shared memory platforms. The IPG applications are implemented by the -MPI code using the Globus toolkit. While a computational task is distributed across multiple computer resources, the parallelism can be explored on each resource alone. Performance studies were carried out on practical aerodynamic problems with complex geometries, consisting of 2.5 to 33 million grid points and a large number of zonal blocks. The computations were executed primarily on SGI Origin 2000 multiprocessors and on the Cray T3E. OVERFLOW's IPG applications are carried out on NASA homogeneous metacomputing machines located at three sites, Ames, Langley and Glenn.
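The grouping strategy described — assigning zones, clusters of zones, or zone partitions to parallel processes — is a load-balancing problem. The abstract does not specify the algorithm, so the greedy bin-balancing sketch below is an assumption, shown only to illustrate the idea of equalizing grid points per process:

```python
def group_zones(zone_sizes, n_groups):
    """Greedy load balancing: place each zone (largest first) into the
    currently lightest group, approximating equal grid points per process.
    Illustrative only -- not OVERFLOW's actual grouping algorithm."""
    groups = [[] for _ in range(n_groups)]
    loads = [0] * n_groups
    for name, size in sorted(zone_sizes.items(), key=lambda kv: -kv[1]):
        i = loads.index(min(loads))   # lightest group so far
        groups[i].append(name)
        loads[i] += size
    return groups, loads

# hypothetical zone sizes in grid points (millions), split over 2 processes
groups, loads = group_zones({"z1": 8, "z2": 5, "z3": 4, "z4": 3}, 2)
```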

  6. Self-assessment, reflection on practice and critical thinking in nursing students.

    PubMed

    Siles-González, José; Solano-Ruiz, Carmen

    2016-10-01

In accordance with the principles of the European Higher Education Area, the aim of this study was to contribute to the implementation of self-assessment through the application of reflection on learning and critical thinking. The theoretical framework employed was Habermas's critical theory and emancipatory interest as a preliminary step to generate educational transformations. The methodological contribution is the design of a student self-assessment document that promotes reflection on action and critical thinking. The development of assessment through peer evaluation and other intermediate solutions until achieving self-assessment entails a shift in the educational and scientific paradigm, but also involves the implementation in practice of democratic and ethical principles, values and premises in society. Self-assessment is a novel concept for students, and obliges them to reinterpret their role. Due to the diversity of students' principles, values, motivations, interests and aspirations, this reinterpretation of their role can have a positive outcome, stimulating an active and critical attitude towards group work and self-assessment; or, on the contrary, can generate a stance characterised by disinterest, passivity and lack of critical thinking. The forms of assessment adopted in a given educational system reflect ways of thinking related to ideologies, values, ethical principles and educational paradigms: in order to render implementation of effective self-assessment feasible, it is necessary to undertake structural and regulatory reforms. Students have little experience of reflection on practice or critical thinking. Massification and cultural and structural factors determine the form of assessment. In this context, it would seem advisable to move towards self-assessment gradually and cautiously. Copyright © 2016 Elsevier Ltd. All rights reserved.

  7. Improving Educational Assessment: A Computer-Adaptive Multiple Choice Assessment Using NRET as the Scoring Method

    ERIC Educational Resources Information Center

    Sie Hoe, Lau; Ngee Kiong, Lau; Kian Sam, Hong; Bin Usop, Hasbee

    2009-01-01

    Assessment is central to any educational process. Number Right (NR) scoring method is a conventional scoring method for multiple choice items, where students need to pick one option as the correct answer. One point is awarded for the correct response and zero for any other responses. However, it has been heavily criticized for guessing and failure…
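The conventional Number Right (NR) baseline the abstract describes — one point per correct option, zero otherwise — is trivial to state in code. The sketch below covers only NR, not the NRET scoring variant, whose rules the abstract does not detail:

```python
def number_right_score(responses, answer_key):
    """Number Right scoring for multiple choice: one point for each
    response matching the key, zero for any other response."""
    return sum(r == k for r, k in zip(responses, answer_key))

# a student picks one option per item; hypothetical 3-item quiz
score = number_right_score(["A", "C", "B"], ["A", "B", "B"])
```

Because any response earns either 1 or 0, blind guessing has positive expected value under NR, which is the criticism the abstract raises.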

  8. Can Dental Cone Beam Computed Tomography Assess Bone Mineral Density?

    PubMed Central

    2014-01-01

Mineral density distribution of bone tissue is altered by active bone modeling and remodeling due to bone complications including bone disease and implantation surgery. Clinical cone beam computed tomography (CBCT) has been examined to determine whether it can assess oral bone mineral density (BMD) in patients. It has been indicated that CBCT has disadvantages of higher noise and lower contrast than conventional medical computed tomography (CT) systems. On the other hand, it has advantages of a relatively lower cost and radiation dose but higher spatial resolution. However, the reliability of CBCT-based mineral density measurement has not yet been fully validated. Thus, the objectives of this review are to discuss 1) why assessment of BMD distribution is important and 2) whether the clinical CBCT can be used as a potential tool to measure the BMD. Brief descriptions of image artefacts associated with assessment of gray value, which has been used to account for mineral density, in CBCT images are provided. Techniques to correct local and conversion errors in obtaining the gray values in CBCT images are also introduced. This review can be used as a quick reference for users who may encounter these errors during analysis of CBCT images. PMID:25006568
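Converting CBCT gray values to mineral density is commonly done by scanning a calibration phantom of known densities and fitting a mapping. The linear least-squares sketch below is a simplified assumption (phantom values invented); real protocols must first correct the local and conversion errors in the gray values that the review discusses:

```python
def gray_to_bmd_calibration(gray_values, known_bmd):
    """Fit a least-squares line mapping gray values to known phantom BMD,
    returning a function gray_value -> estimated BMD (g/cm^3).
    Simplified sketch: assumes artefact-corrected, linear gray values."""
    n = len(gray_values)
    mx, my = sum(gray_values) / n, sum(known_bmd) / n
    b = sum((x - mx) * (y - my) for x, y in zip(gray_values, known_bmd)) / \
        sum((x - mx) ** 2 for x in gray_values)
    a = my - b * mx
    return lambda g: a + b * g

# hypothetical phantom inserts: gray values vs known densities
estimate = gray_to_bmd_calibration([100, 200, 300], [0.5, 1.0, 1.5])
```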

  9. Computational techniques for the assessment of fracture repair.

    PubMed

    Anderson, Donald D; Thomas, Thaddeus P; Campos Marin, Ana; Elkins, Jacob M; Lack, William D; Lacroix, Damien

    2014-06-01

    The combination of high-resolution three-dimensional medical imaging, increased computing power, and modern computational methods provide unprecedented capabilities for assessing the repair and healing of fractured bone. Fracture healing is a natural process that restores the mechanical integrity of bone and is greatly influenced by the prevailing mechanical environment. Mechanobiological theories have been proposed to provide greater insight into the relationships between mechanics (stress and strain) and biology. Computational approaches for modelling these relationships have evolved from simple tools to analyze fracture healing at a single point in time to current models that capture complex biological events such as angiogenesis, stochasticity in cellular activities, and cell-phenotype specific activities. The predictive capacity of these models has been established using corroborating physical experiments. For clinical application, mechanobiological models accounting for patient-to-patient variability hold the potential to predict fracture healing and thereby help clinicians to customize treatment. Advanced imaging tools permit patient-specific geometries to be used in such models. Refining the models to study the strain fields within a fracture gap and adapting the models for case-specific simulation may provide more accurate examination of the relationship between strain and fracture healing in actual patients. Medical imaging systems have significantly advanced the capability for less invasive visualization of injured musculoskeletal tissues, but all too often the consideration of these rich datasets has stopped at the level of subjective observation. Computational image analysis methods have not yet been applied to study fracture healing, but two comparable challenges which have been addressed in this general area are the evaluation of fracture severity and of fracture-associated soft tissue injury. 
CT-based methodologies developed to assess and quantify…

  10. Incorporating active-learning techniques and competency assessment into a critical care elective course.

    PubMed

    Malcom, Daniel R; Hibbs, Jennifer L

    2012-09-10

    To design, implement, and measure the effectiveness of a critical care elective course for second-year students in a 3-year accelerated doctor of pharmacy (PharmD) program. A critical care elective course was developed that used active-learning techniques, including cooperative learning and group presentations, to deliver content on critical care topics. Group presentations had to include a disease state overview, practice guidelines, and clinical recommendations, and were evaluated by course faculty members and peers. Students' mean scores on a 20-question critical-care competency assessment administered before and after the course improved by 11% (p < 0.05). Course evaluations and comments were positive. A critical care elective course resulted in significantly improved competency in critical care and was well-received by students.

  11. Strategic Computing. New-Generation Computing Technology: A Strategic Plan for Its Development and Application to Critical Problems in Defense

    DTIC Science & Technology

    1983-10-28

    Computing. By seizing an opportunity to leverage recent advances in artificial intelligence, computer science, and microelectronics, the Agency plans...occurred in many separated areas of artificial intelligence, computer science, and microelectronics. Advances in "expert system" technology now...and expert knowledge o Advances in Artificial Intelligence: Mechanization of speech recognition, vision, and natural language understanding. o

  12. Blending Qualitative and Computational Linguistics Methods for Fidelity Assessment: Experience with the Familias Unidas Preventive Intervention

    PubMed Central

    Gallo, Carlos; Pantin, Hilda; Villamar, Juan; Prado, Guillermo; Tapia, Maria; Ogihara, Mitsunori; Cruden, Gracelyn; Brown, C Hendricks

    2014-01-01

    Careful fidelity monitoring and feedback are critical to implementing effective interventions. A wide range of procedures exist to assess fidelity; most are derived from observational assessments (Schoenwald et al, 2013). However, these fidelity measures are resource intensive for research teams in efficacy/effectiveness trials, and are often unattainable or unmanageable for the host organization to rate when the program is implemented on a large scale. We present a first step towards automated processing of linguistic patterns in fidelity monitoring of a behavioral intervention using an innovative mixed methods approach to fidelity assessment that uses rule-based, computational linguistics to overcome major resource burdens. Data come from an effectiveness trial of the Familias Unidas intervention, an evidence-based, family-centered preventive intervention found to be efficacious in reducing conduct problems, substance use and HIV sexual risk behaviors among Hispanic youth. This computational approach focuses on “joining,” which measures the quality of the working alliance of the facilitator with the family. Quantitative assessments of reliability are provided. Kappa scores between a human rater and a machine rater for the new method for measuring joining reached .83. Early findings suggest that this approach can reduce the high cost of fidelity measurement and the time delay between fidelity assessment and feedback to facilitators; it also has the potential for improving the quality of intervention fidelity ratings. PMID:24500022
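
The record's reported agreement between a human rater and a machine rater (kappa = .83) is a Cohen's kappa statistic. Below is a minimal sketch of that computation; the ratings are invented for illustration and are not the study's data:

```python
from collections import Counter

def cohens_kappa(rater_a, rater_b):
    """Cohen's kappa: observed agreement corrected for chance agreement."""
    assert len(rater_a) == len(rater_b) and rater_a
    n = len(rater_a)
    observed = sum(a == b for a, b in zip(rater_a, rater_b)) / n
    counts_a, counts_b = Counter(rater_a), Counter(rater_b)
    # Chance agreement: probability both raters pick the same category at random,
    # given each rater's marginal category frequencies.
    chance = sum(counts_a[c] * counts_b.get(c, 0) for c in counts_a) / n ** 2
    return (observed - chance) / (1 - chance)

# Invented "joining" quality ratings from a human rater and a machine rater.
human   = ["high", "high", "low", "med", "high", "low", "med", "high"]
machine = ["high", "high", "low", "med", "med",  "low", "med", "high"]
print(round(cohens_kappa(human, machine), 2))  # -> 0.81
```

Kappa discounts the agreement the two raters would reach by chance alone, which is why it is preferred over raw percent agreement when rating categories are unevenly distributed.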

  13. Assessing stapes piston position using computed tomography: a cadaveric study.

    PubMed

    Hahn, Yoav; Diaz, Rodney; Hartman, Jonathan; Bobinski, Matthew; Brodie, Hilary

    2009-02-01

    Temporal bone computed tomographic (CT) scanning in the postoperative stapedotomy patient is inaccurate in assessing stapes piston position within the vestibule. Poststapedotomy patients that have persistent vertigo may undergo CT scanning to assess the position of the stapes piston within the vestibule to rule out overly deep insertion. Vertigo is a recognized complication of the deep piston, and CT evaluation is often recommended. The accuracy of CT scan in this setting is unestablished. Stapedotomy was performed on 12 cadaver ears, and stainless steel McGee pistons were placed. The cadaver heads were then scanned using a fine-cut temporal bone protocol. Temporal bone dissection was performed with microscopic measurement of the piston depth in the vestibule. These values were compared with depth of intravestibular penetration measured on CT scan by 4 independent measurements. The intravestibular penetration as assessed by computed tomography was consistently greater than the value found on cadaveric anatomic dissection. The radiographic bias was greater when piston location within the vestibule was shallower. The axial CT scan measurement was 0.53 mm greater, on average, than the anatomic measurement. On average, the coronal CT measurement was 0.68 mm greater than the anatomic measurement. The degree of overestimation of penetration, however, was highly inconsistent. Standard temporal bone CT scan is neither an accurate nor precise examination of stapes piston depth within the vestibule. We found that CT measurement consistently overstated intravestibular piston depth. Computed tomography is not a useful study in the evaluation of piston depth for poststapedectomy vertigo and is of limited value in this setting.
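
The record's conclusion turns on the distinction between accuracy (a systematic bias, here roughly half a millimetre of overestimation) and precision (how consistent that bias is). A minimal sketch of how both are computed from paired measurements; the numbers below are invented, not the study's data:

```python
import statistics

# Invented paired depth measurements (mm): axial CT vs. anatomic dissection,
# illustrating a consistent overestimate (bias) whose size varies (imprecision).
ct_axial   = [1.9, 1.2, 2.4, 1.1, 1.8, 2.6, 1.5, 2.0]
dissection = [1.2, 0.9, 1.8, 0.3, 1.4, 2.2, 0.8, 1.6]

diffs = [c - d for c, d in zip(ct_axial, dissection)]
bias = statistics.mean(diffs)     # systematic overestimation (accuracy)
spread = statistics.stdev(diffs)  # inconsistency of the overestimate (precision)
print(round(bias, 2), round(spread, 2))  # -> 0.54 0.18
```

A large bias with a large spread, as the study found, means no single correction factor can be subtracted from the CT reading to recover the true depth.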

  14. Electronic Quality of Life Assessment Using Computer-Adaptive Testing.

    PubMed

    Gibbons, Chris; Bower, Peter; Lovell, Karina; Valderas, Jose; Skevington, Suzanne

    2016-09-30

    Quality of life (QoL) questionnaires are desirable for clinical practice but can be time-consuming to administer and interpret, making their widespread adoption difficult. Our aim was to assess the performance of the World Health Organization Quality of Life (WHOQOL)-100 questionnaire as four item banks to facilitate adaptive testing using simulated computer adaptive tests (CATs) for physical, psychological, social, and environmental QoL. We used data from the UK WHOQOL-100 questionnaire (N=320) to calibrate item banks using item response theory, which included psychometric assessments of differential item functioning, local dependency, unidimensionality, and reliability. We simulated CATs to assess the number of items administered before prespecified levels of reliability were met. The item banks (40 items) all displayed good model fit (P>.01) and were unidimensional (fewer than 5% of t tests significant), reliable (Person Separation Index>.70), and free from differential item functioning (no significant analysis of variance interaction) or local dependency (residual correlations < +.20). When matched for reliability, the item banks were between 45% and 75% shorter than paper-based WHOQOL measures. Across the four domains, a high standard of reliability (alpha>.90) could be gained with a median of 9 items. Using CAT, simulated assessments were as reliable as paper-based forms of the WHOQOL with a fraction of the number of items. These properties suggest that these item banks are suitable for computerized adaptive assessment. These item banks have the potential for international development using existing alternative language versions of the WHOQOL items.
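
The simulated computer adaptive testing described in this record follows a standard loop: administer the unused item that is most informative at the current trait estimate, re-estimate the trait, and stop once the standard error falls below a target. A toy sketch under a 2-parameter-logistic model with an invented item bank (not the WHOQOL item parameters):

```python
import math, random

random.seed(1)

# Invented 2-parameter-logistic item bank: (discrimination a, difficulty b).
bank = [(random.uniform(0.8, 2.0), random.uniform(-2, 2)) for _ in range(40)]

def p_correct(theta, a, b):
    return 1.0 / (1.0 + math.exp(-a * (theta - b)))

def information(theta, a, b):
    p = p_correct(theta, a, b)
    return a * a * p * (1 - p)

def simulate_cat(true_theta, se_target=0.4):
    theta, administered, responses = 0.0, [], []
    unused = list(range(len(bank)))
    while unused:
        # Adaptive step: pick the unused item most informative at the current estimate.
        item = max(unused, key=lambda i: information(theta, *bank[i]))
        unused.remove(item)
        administered.append(item)
        responses.append(random.random() < p_correct(true_theta, *bank[item]))
        # Re-estimate theta by grid-search maximum likelihood.
        grid = [g / 10 for g in range(-40, 41)]
        def loglik(t):
            return sum(math.log(p_correct(t, *bank[i]) if r else 1 - p_correct(t, *bank[i]))
                       for i, r in zip(administered, responses))
        theta = max(grid, key=loglik)
        # Stop once the standard error (1 / sqrt(test information)) is small enough.
        info = sum(information(theta, *bank[i]) for i in administered)
        if info > 0 and 1 / math.sqrt(info) < se_target:
            break
    return theta, len(administered)

est, n_items = simulate_cat(true_theta=0.5)
print(round(est, 2), n_items)
```

Because each item is chosen for maximal information, the stopping rule is usually met with far fewer items than a fixed-length form, which is the efficiency gain the study reports.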

  15. Computational geometry assessment for morphometric analysis of the mandible.

    PubMed

    Raith, Stefan; Varga, Viktoria; Steiner, Timm; Hölzle, Frank; Fischer, Horst

    2017-01-01

    This paper presents a fully automated algorithm for geometry assessment of the mandible. Anatomical landmarks could be reliably detected and distances were statistically evaluated with principal component analysis. For the first time, the method makes it possible to generate a mean mandible shape with statistically valid geometrical variations based on a large set of 497 CT-scans of human mandibles. The data may be used in bioengineering for designing novel oral implants, for planning of computer-guided surgery, and for the improvement of biomechanical models, as it is shown that commercially available mandible replicas differ significantly from the mean of the investigated population.
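
The statistical machinery in this record (stack landmark coordinates per specimen, take the mean shape, run principal component analysis on the variation) can be sketched with synthetic data. The specimen counts and modes below are invented, not the 497-scan dataset:

```python
import numpy as np

rng = np.random.default_rng(0)

# Invented landmark data: 50 mandibles, 8 landmarks in 3-D, flattened to 24-vectors.
n_shapes, n_coords = 50, 24
base = rng.normal(size=n_coords)            # a "true" mean shape
modes = rng.normal(size=(3, n_coords))      # three hypothetical modes of variation
weights = rng.normal(size=(n_shapes, 3))
shapes = base + weights @ modes + 0.05 * rng.normal(size=(n_shapes, n_coords))

mean_shape = shapes.mean(axis=0)            # the statistical mean mandible
centered = shapes - mean_shape
# Principal component analysis via SVD of the centered data matrix.
_, s, vt = np.linalg.svd(centered, full_matrices=False)
explained = s ** 2 / np.sum(s ** 2)
print(round(float(explained[:3].sum()), 3))  # top components dominate
```

The leading components recover the seeded modes of variation; in the study, the analogous components describe statistically valid geometric variation about the mean mandible.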

  16. Assessment of a human computer interface prototyping environment

    NASA Technical Reports Server (NTRS)

    Moore, Loretta A.

    1993-01-01

    A Human Computer Interface (HCI) prototyping environment with embedded evaluation capability has been successfully assessed; it will be valuable in developing and refining HCI standards and in evaluating program/project interface development, especially Space Station Freedom on-board displays for payload operations. The HCI prototyping environment is designed to include four components: (1) a HCI format development tool, (2) a test and evaluation simulator development tool, (3) a dynamic, interactive interface between the HCI prototype and simulator, and (4) an embedded evaluation capability to evaluate the adequacy of an HCI based on a user's performance.

  17. RESRAD-CHEM: A computer code for chemical risk assessment

    SciTech Connect

    Cheng, J.J.; Yu, C.; Hartmann, H.M.; Jones, L.G.; Biwer, B.M.; Dovel, E.S.

    1993-10-01

    RESRAD-CHEM is a computer code developed at Argonne National Laboratory for the U.S. Department of Energy to evaluate chemically contaminated sites. The code is designed to predict human health risks from multipathway exposure to hazardous chemicals and to derive cleanup criteria for chemically contaminated soils. The method used in RESRAD-CHEM is based on the pathway analysis method in the RESRAD code and follows the U.S. Environmental Protection Agency's (EPA's) guidance on chemical risk assessment. RESRAD-CHEM can be used to evaluate a chemically contaminated site and, in conjunction with the use of the RESRAD code, a mixed waste site.

  18. Computer database takes confusion out of multi-property assessments

    SciTech Connect

    Kinworthy, M.L.

    1996-03-01

    Managing environmental site assessments in multi-property transactions poses a special challenge. Multi-site ESAs require a tremendous amount of coordination, data collection and interpretation; often, these tasks must be completed according to accelerated timeframes to meet client deadlines. The tasks can be particularly challenging when several hundred sites are included in the transaction. In such cases, a computer database can be an effective, powerful tool for tracking and managing property data, and generating customized reports for large, multi-site ESAs.

  19. Evidence Based Clinical Assessment of Child and Adolescent Social Phobia: A Critical Review of Rating Scales

    ERIC Educational Resources Information Center

    Tulbure, Bogdan T.; Szentagotai, Aurora; Dobrean, Anca; David, Daniel

    2012-01-01

    Investigating the empirical support of various assessment instruments, the evidence based assessment approach expands the scientific basis of psychotherapy. Starting from Hunsley and Mash's evaluative framework, we critically reviewed the rating scales designed to measure social anxiety or phobia in youth. Thirteen of the most researched social…

  20. CRITICAL ANALYSIS OF THE MATHEMATICAL RELATIONSHIPS AND COMPREHENSIVENESS OF LIFE CYCLE IMPACT ASSESSMENT APPROACHES

    EPA Science Inventory

    The impact assessment phase of Life Cycle Assessment (LCA) has received much criticism due to lack of consistency. ISO 14042 requires selection of impact categories that “reflect a comprehensive set of environmental issues” related to the system being studied, especi...

  1. Assessing Critical Thinking in Middle and High Schools: Meeting the Common Core

    ERIC Educational Resources Information Center

    Stobaugh, Rebecca

    2013-01-01

    This practical, very effective resource helps middle and high school teachers and curriculum leaders develop the skills to design instructional tasks and assessments that engage students in higher-level critical thinking, as recommended by the Common Core State Standards. Real examples of formative and summative assessments from a variety of…

  2. What Can You Learn in Three Minutes? Critical Reflection on an Assessment Task that Embeds Technology

    ERIC Educational Resources Information Center

    Brown, Natalie Ruth

    2009-01-01

    Purpose: The purpose of this paper is to critically examine an assessment task, undertaken by pre-service science teachers, that integrates the use of technology (in this case digital video-recorders and video-editing software) whilst scaffolding skill development. The embedding of technology into the assessment task is purposeful, aiming to…

  3. A Critical Analysis of the Design and Implementation of Formative Assessment

    ERIC Educational Resources Information Center

    Green, Rhiannon

    2017-01-01

    The objectives of this article are to critically analyse the impact of formative and summative assessment in an informal secondary school environment. The informality reflects the work of The Anne Frank Trust UK and their practices in evaluating student progress through a two-week workshop programme. The preference for formative assessment is…

  5. Problem-Based Learning in Geography: Towards a Critical Assessment of Its Purposes, Benefits and Risks

    ERIC Educational Resources Information Center

    Pawson, Eric; Fournier, Eric; Haigh, Martin; Muniz, Osvaldo; Trafford, Julie; Vajoczki, Susan

    2006-01-01

    This paper makes a critical assessment of problem-based learning (PBL) in geography. It assesses what PBL is, in terms of the range of definitions in use and in light of its origins in specific disciplines such as medicine. It considers experiences of PBL from the standpoint of students, instructors and managers (e.g. deans), and asks how well…

  6. Using Art to Assess Reading Comprehension and Critical Thinking in Adolescents

    ERIC Educational Resources Information Center

    Holdren, Tara Shoemaker

    2012-01-01

    In the current testing environment, high school reading teachers may often rely on a multiple-choice assessment as the best practice. This study suggests that a visual arts assessment of reading comprehension can rigorously measure critical thinking. This action research study follows 21 high school juniors through the selection, creation, and…

  7. An Australian Proposal for Doing Critical Literary Assessment: The Case of Writing.

    ERIC Educational Resources Information Center

    Wyatt-Smith, Claire; Murphy, Judy

    2002-01-01

    Presents current thinking and practices in Queensland, Australia, about how to do critical assessment in the English classroom. Proposes and discusses a framework that brings together interest in text analysis and social practices. Applies the framework showing how it can be used to generate writing tasks and assessment criteria that are…

  10. Assessment of spare reliability for multi-state computer networks within tolerable packet unreliability

    NASA Astrophysics Data System (ADS)

    Lin, Yi-Kuei; Huang, Cheng-Fu

    2015-04-01

    From a quality of service viewpoint, the transmission packet unreliability and transmission time are both critical performance indicators in a computer system when assessing the Internet quality for supervisors and customers. A computer system is usually modelled as a network topology where each branch denotes a transmission medium and each vertex represents a station of servers. Almost every branch has multiple capacities/states due to failure, partial failure, maintenance, etc. This type of network is known as a multi-state computer network (MSCN). This paper proposes an efficient algorithm that computes the system reliability, i.e., the probability that a specified amount of data can be sent through k (k ≥ 2) disjoint minimal paths within both the tolerable packet unreliability and time threshold. Furthermore, two routing schemes are established in advance to indicate the main and spare minimal paths to increase the system reliability (referred to as spare reliability). Thus, the spare reliability can be readily computed according to the routing scheme.
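
A minimal sketch of the spare-reliability idea for a multi-state network: a main minimal path backed by a disjoint spare path, each branch having several capacity states with known probabilities. The capacities and probabilities are invented, and the sketch ignores the paper's packet-unreliability and time-threshold constraints:

```python
from itertools import product

# Invented multi-state branches: {capacity: probability} for each edge.
# Path 1 and path 2 are disjoint minimal paths, each a chain of branches.
path1 = [{0: 0.05, 1: 0.15, 2: 0.80}, {0: 0.10, 2: 0.90}]
path2 = [{0: 0.05, 1: 0.25, 2: 0.70}, {0: 0.05, 1: 0.15, 2: 0.80}]

def prob_path_carries(path, demand):
    """P(min branch capacity along the path >= demand), branches independent."""
    total = 0.0
    for states in product(*[list(b.items()) for b in path]):
        caps = [c for c, _ in states]
        p = 1.0
        for _, pr in states:
            p *= pr
        if min(caps) >= demand:
            total += p
    return total

demand = 2
p1 = prob_path_carries(path1, demand)
p2 = prob_path_carries(path2, demand)
# Spare-reliability idea: path 1 is the main route; path 2 is the spare,
# used only when the main route cannot carry the demand.
spare_reliability = p1 + (1 - p1) * p2
print(round(spare_reliability, 4))  # -> 0.8768
```

Because the two minimal paths are disjoint, their branch states are independent, so the spare only needs to carry the demand in the cases where the main path fails to.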

  11. Assessing executive function using a computer game: computational modeling of cognitive processes.

    PubMed

    Hagler, Stuart; Jimison, Holly Brugge; Pavel, Misha

    2014-07-01

    Early and reliable detection of cognitive decline is one of the most important challenges of current healthcare. In this project, we developed an approach whereby a frequently played computer game can be used to assess a variety of cognitive processes and estimate the results of the pen-and-paper trail making test (TMT)--known to measure executive function, as well as visual pattern recognition, speed of processing, working memory, and set-switching ability. We developed a computational model of the TMT based on a decomposition of the test into several independent processes, each characterized by a set of parameters that can be estimated from play of a computer game designed to resemble the TMT. An empirical evaluation of the model suggests that it is possible to use the game data to estimate the parameters of the underlying cognitive processes and using the values of the parameters to estimate the TMT performance. Cognitive measures and trends in these measures can be used to identify individuals for further assessment, to provide a mechanism for improving the early detection of neurological problems, and to provide feedback and monitoring for cognitive interventions in the home.

  12. Advancing risk assessment of engineered nanomaterials: application of computational approaches.

    PubMed

    Gajewicz, Agnieszka; Rasulev, Bakhtiyor; Dinadayalane, Tandabany C; Urbaszek, Piotr; Puzyn, Tomasz; Leszczynska, Danuta; Leszczynski, Jerzy

    2012-12-01

    Nanotechnology, which develops novel materials at sizes of 100 nm or less, has become one of the most promising areas of human endeavor. Because of their intrinsic properties, nanoparticles are commonly employed in electronics, photovoltaics, catalysis, environmental and space engineering, the cosmetic industry and - finally - in medicine and pharmacy. In that sense, nanotechnology creates great opportunities for the progress of modern medicine. However, recent studies have shown evident toxicity of some nanoparticles to living organisms (toxicity), and their potentially negative impact on environmental ecosystems (ecotoxicity). Lack of available data and low adequacy of experimental protocols prevent comprehensive risk assessment. The purpose of this review is to present the current state of knowledge related to the risks of engineered nanoparticles and to assess the potential of efficient expansion and development of new approaches, which are offered by the application of theoretical and computational methods applicable for the evaluation of nanomaterials. Copyright © 2012 Elsevier B.V. All rights reserved.

  13. Analytical formulae for computing the critical current of an Nb3Sn strand under bending

    NASA Astrophysics Data System (ADS)

    Ciazynski, D.; Torre, A.

    2010-12-01

    Works on bending strain in Nb3Sn wires were initiated in support of the 'react-and-wind' technique used to manufacture superconducting coils. More recently, the bending strains of Nb3Sn strands in cable-in-conduit conductors (CICC) under high Lorentz forces have been thought to be partly responsible for the degradation of the conductor performance in terms of critical current and n index, particularly for the international thermonuclear experimental reactor (ITER) conductors. This has led to a new wave of experiments and modelling on this subject. The computation of the current transport capability in an Nb3Sn wire under uniform bending used to be carried out through the so-called Ekin's models, and more recently through numerical simulations with electric networks. The flaws of Ekin's models are that they consider only two extreme cases or limits, namely the so-called long twist pitch (LTP) or short twist pitch (STP) cases, and that these models only allow computation of a value for the critical current without reference to the n index of the superconducting filaments (i.e. this index is implicitly assumed to be infinite). Although the numerical models allow a fine description of the wire under operation and can take into account the filament's n index, they need a refined meshing to be accurate enough, their results may be sensitive to boundary conditions (i.e. current injection in the wire), and general intrinsic parameters cannot be easily identified. In this paper, we propose to go beyond Ekin's models by developing, from a homogeneous model and Maxwell's equations, an analytical model to establish the general equation governing the evolution of the electric field inside an Nb3Sn strand under uniform bending (with possible longitudinal strain). Within the usual strand fabrication limits, this equation allows the definition of a single parameter to discriminate the STP and LTP cases. It is also shown that whereas Ekin's LTP model corresponds…

  14. Critical thinking evaluation in reflective writing: Development and testing of Carter Assessment of Critical Thinking in Midwifery (Reflection).

    PubMed

    Carter, Amanda G; Creedy, Debra K; Sidebotham, Mary

    2017-11-01

    Develop and test a tool designed for use by academics to evaluate pre-registration midwifery students' critical thinking skills in reflective writing. A descriptive cohort design was used, with a random sample (n = 100) of archived student reflective writings based on a clinical event or experience during 2014 and 2015. A staged model for tool development was used to develop a fifteen-item scale involving item generation; mapping of draft items to critical thinking concepts and expert review to test content validity; inter-rater reliability testing; pilot testing of the tool on 100 reflective writings; and psychometric testing. Item scores were analysed for mean, range and standard deviation. Internal reliability, content and construct validity were assessed. Expert review of the tool revealed a high content validity index score of 0.98. Using two independent raters to establish inter-rater reliability, good absolute agreement of 72% was achieved with a Kappa coefficient K = 0.43 (p<0.0001). Construct validity via exploratory factor analysis revealed three factors: analyses context, reasoned inquiry, and self-evaluation. The mean total score for the tool was 50.48 (SD = 12.86). Total and subscale scores correlated significantly. The scale achieved good internal reliability with a Cronbach's alpha coefficient of .93. This study established the reliability and validity of the CACTiM (reflection) for use by academics to evaluate midwifery students' critical thinking in reflective writing. Validation with large diverse samples is warranted. Reflective practice is a key learning and teaching strategy in undergraduate Bachelor of Midwifery programmes and essential for safe, competent practice. There is the potential to enhance critical thinking development by assessing reflective writing with the CACTiM (reflection) tool to provide formative and summative feedback to students and inform teaching strategies. Crown Copyright © 2017. Published by Elsevier Ltd. All rights reserved.
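
The internal reliability quoted for the CACTiM (reflection) scale (Cronbach's alpha of .93) follows the standard formula alpha = k/(k-1) * (1 - sum of item variances / variance of total scores). A minimal sketch with invented ratings, not the study's data:

```python
def cronbach_alpha(scores):
    """Cronbach's alpha from a students-by-items matrix of scores."""
    k = len(scores[0])                      # number of items
    def var(xs):                            # sample variance (n - 1 denominator)
        m = sum(xs) / len(xs)
        return sum((x - m) ** 2 for x in xs) / (len(xs) - 1)
    item_vars = [var([row[j] for row in scores]) for j in range(k)]
    total_var = var([sum(row) for row in scores])
    return k / (k - 1) * (1 - sum(item_vars) / total_var)

# Invented ratings: 6 reflective writings scored on 4 items (1-5 scale).
ratings = [
    [4, 5, 4, 4],
    [2, 2, 3, 2],
    [5, 5, 5, 4],
    [3, 3, 2, 3],
    [4, 4, 4, 5],
    [1, 2, 1, 2],
]
print(round(cronbach_alpha(ratings), 2))  # -> 0.96
```

High alpha arises when item scores rise and fall together across writings, so the item-variance sum is small relative to the variance of the totals.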

  15. Critical Thinking Theory to Practice: Using the Expert's Thought Process as Guide for Learning and Assessment.

    PubMed

    Marshall, Teresa A; Marchini, Leonardo; Cowen, Howard; Hartshorn, Jennifer E; Holloway, Julie A; Straub-Morarend, Cheryl L; Gratton, David; Solow, Catherine M; Colangelo, Nicholas; Johnsen, David C

    2017-08-01

    Critical thinking skills are essential for the successful dentist, yet few explicit skillsets in critical thinking have been developed and published in peer-reviewed literature. The aims of this article are to 1) offer an assessable critical thinking teaching model with the expert's thought process as the outcome, learning guide, and assessment instrument and 2) offer three critical thinking skillsets following this model: for geriatric risk assessment, technology decision making, and situation analysis/reflections. For the objective component, the student demonstrates delivery of each step in the thought process. For the subjective component, the student is judged to have grasped the principles as applied to the patient or case. This article describes the framework and the results of pilot tests in which students in one year at this school used the model in the three areas, earning scores of 90% or above on the assessments. The model was thus judged to be successful for students to demonstrate critical thinking skillsets in the course settings. Students consistently delivered each step of the thought process and were nearly as consistent in grasping the principles behind each step. As more critical thinking skillsets are implemented, a reinforcing network develops.

  16. Guiding dental student learning and assessing performance in critical thinking with analysis of emerging strategies.

    PubMed

    Johnsen, David C; Lipp, Mitchell J; Finkelstein, Michael W; Cunningham-Ford, Marsha A

    2012-12-01

    Patient-centered care involves an inseparable set of knowledge, abilities, and professional traits on the part of the health care provider. For practical reasons, health professions education is segmented into disciplines or domains like knowledge, technical skills, and critical thinking, and the culture of dental education is weighted toward knowledge and technical skills. Critical thinking, however, has become a growing presence in dental curricula. To guide student learning and assess performance in critical thinking, guidelines have been developed over the past several decades in the educational literature. Prominent among these guidelines are the following: engage the student in multiple situations/exercises reflecting critical thinking; for each exercise, emulate the intended activity for validity; gain agreement of faculty members across disciplines and curriculum years on the learning construct, application, and performance assessment protocol for reliability; and use the same instrument to guide learning and assess performance. The purposes of this article are 1) to offer a set of concepts from the education literature potentially helpful to guide program design or corroborate existing programs in dental education; 2) to offer an implementation model consolidating these concepts as a guide for program design and execution; 3) to cite specific examples of exercises and programs in critical thinking in the dental education literature analyzed against these concepts; and 4) to discuss opportunities and challenges in guiding student learning and assessing performance in critical thinking for dentistry.

  17. Assessment of Zero Power Critical Experiments and Needs for a Fission Surface Power System

    SciTech Connect

    Jim R Parry; John Darrell bess; Brad T. Rearden; Gary A. Harms

    2009-06-01

    The National Aeronautics and Space Administration (NASA) is providing funding to the Department of Energy (DOE) to assess, develop, and test nuclear technologies that could provide surface power to a lunar outpost. Sufficient testing of this fission surface power (FSP) system will need to be completed to enable a decision by NASA for flight development. The near-term goal for the FSP work is to conduct the minimum amount of testing needed to validate the system performance within an acceptable risk. This report attempts to assess the current modeling capabilities and quantify any bias associated with the modeling methods for designing the nuclear reactor. The baseline FSP system is a sodium-potassium (NaK) cooled, fast spectrum reactor with 93% 235U enriched HEU-O2 fuel, SS316 cladding, and beryllium reflectors with B4C control drums. The FSP is to produce approximately 40 kWe net power with a lifetime of at least 8 years at full power. A flight-ready FSP is to be ready for launch and deployment by 2020. Existing benchmarks from the International Criticality Safety Benchmark Evaluation Program (ICSBEP) were reviewed and modeled in MCNP. An average bias of less than 0.6% was determined using the ENDF/B-VII cross-section libraries except in the case of subcritical experiments, which exhibited an average bias of approximately 1.5%. The bias increases with increasing reflector worth of the beryllium. The uncertainties and sensitivities in cross section data for the FSP model and ZPPR-20 configurations were assessed using TSUNAMI-3D. The cross-section covariance uncertainty in the FSP model was calculated as 2.09%, which was dominated by the uncertainty in the 235U(n,?) reactions. Global integral indices were generated in TSUNAMI-IP using pre-release SCALE 6 cross-section covariance data. The ZPPR-20 benchmark models exhibit strong similarity with the FSP model. 
A penalty assessment was performed to determine the degree to which the FSP model could not be characterized
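As a simple illustration of how a modeling bias of this kind is typically quantified, the sketch below averages the relative deviation between calculated and benchmark k-eff over a set of benchmark cases. The function name, the values, and the sign convention are hypothetical, not taken from the report:

```python
# Hypothetical illustration: quantifying average modeling bias from
# criticality benchmark comparisons (values are invented, not the report's).

def average_bias_pct(results):
    """Mean relative deviation of calculated k-eff from benchmark k-eff, in %."""
    devs = [(k_calc - k_bench) / k_bench for k_calc, k_bench in results]
    return 100.0 * sum(devs) / len(devs)

# (calculated k-eff, benchmark k-eff) pairs -- illustrative only
benchmarks = [(0.9952, 1.0000), (1.0021, 1.0000), (0.9978, 1.0000)]
print(round(average_bias_pct(benchmarks), 3))  # → -0.163
```

A real assessment would also propagate benchmark experimental uncertainties and weight cases by their similarity to the application model, as the TSUNAMI indices in the abstract are designed to measure.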

  18. Assessment of the Relationships between the Computer Attitudes and Computer Literacy Levels of Prospective Educators.

    ERIC Educational Resources Information Center

    Hignite, Michael A.; Echternacht, Lonnie J.

    1992-01-01

    Data gathered from 83 prospective business educators on computer anxiety, attitudes toward computers in education, attitudes toward computers as a teaching tool, and computer liking were correlated with data on subjects' computer systems and applications literacy. A relationship between computer attitude and computer system literacy was found, and…

  19. Computational technique for stepwise quantitative assessment of equation correctness

    NASA Astrophysics Data System (ADS)

    Othman, Nuru'l Izzah; Bakar, Zainab Abu

    2017-04-01

Many of the computer-aided mathematics assessment systems that are available today possess the capability to implement stepwise correctness checking of a working scheme for solving equations. The computational technique for assessing the correctness of each response in the scheme mainly involves checking the mathematical equivalence and providing qualitative feedback. This paper presents a technique, known as the Stepwise Correctness Checking and Scoring (SCCS) technique, that checks the correctness of each equation in terms of structural equivalence and provides quantitative feedback. The technique, which is based on the Multiset framework, adapts certain techniques from textual information retrieval involving tokenization, document modelling and similarity evaluation. The performance of the SCCS technique was tested using worked solutions to linear algebraic equations in one variable. 350 working schemes comprising 1385 responses were collected using a marking engine prototype, which was developed based on the technique. The results show that both the automated analytical scores and the automated overall scores generated by the marking engine exhibit high percent agreement, high correlation and high degree of agreement with manual scores with small average absolute and mixed errors.
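The stepwise checking idea above can be sketched with a toy token-multiset comparison. Everything below is an illustrative assumption (the tokenizer, the Sørensen-Dice similarity, and the function names), not the authors' actual SCCS formulation:

```python
# A minimal sketch of multiset-based step scoring in the spirit of the SCCS
# technique described above: tokenize each equation, then compare the token
# multisets with a similarity coefficient.
import re
from collections import Counter

def tokenize(equation):
    """Split an equation string into number, variable, and operator tokens."""
    return Counter(re.findall(r"\d+|[a-zA-Z]+|[-+*/=()]", equation))

def step_score(student_step, model_step):
    """Multiset (Sørensen-Dice) similarity between two equation steps, in [0, 1]."""
    a, b = tokenize(student_step), tokenize(model_step)
    overlap = sum((a & b).values())  # Counter & Counter keeps minimum counts
    total = sum(a.values()) + sum(b.values())
    return 2 * overlap / total if total else 0.0

print(step_score("2x + 4 = 10", "2x + 4 = 10"))  # → 1.0
```

A full implementation would normalize algebraically equivalent forms before comparison; this sketch only scores surface token overlap, which is the "structural equivalence" flavor of checking rather than mathematical equivalence.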

  20. An assessment technique for computer-socket manufacturing

    PubMed Central

    Sanders, Joan; Severance, Michael

    2015-01-01

    An assessment strategy is presented for testing the quality of carving and forming of individual computer-aided manufacturing facilities. The strategy is potentially useful to facilities making sockets and companies marketing manufacturing equipment. To execute the strategy, an evaluator fabricates a collection of test models and sockets using the manufacturing suite under evaluation, and then measures their shapes using scanning equipment. Overall socket quality is assessed by comparing socket shapes with electronic file shapes. Then model shapes are compared with electronic file shapes to characterize carving performance. Socket shapes are compared with model shapes to characterize forming performance. The mean radial error (MRE), which is the average difference in radii between the two shapes being compared, provides insight into sizing quality. Inter-quartile range (IQR), the range of radial error for the best matched half of the points on the surfaces being compared, provides insight into shape quality. By determining MRE and IQR for carving and forming separately, the source(s) of socket shape error may be pinpointed. The developed strategy may provide a useful tool to the prosthetics community and industry to help identify problems and limitations in computer-aided manufacturing and insight into appropriate modifications to overcome them. PMID:21938663
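To make the two metrics concrete, here is a minimal sketch of MRE and IQR as defined above, computed over corresponding surface points. The radii, the units, and the point pairing are invented for illustration:

```python
# Illustrative computation of the two metrics defined above: mean radial
# error (MRE) and the spread of radial error over the best-matched half of
# points (IQR) between two shapes sampled at corresponding surface points.

def mean_radial_error(radii_a, radii_b):
    """Average difference in radii between corresponding surface points."""
    diffs = [a - b for a, b in zip(radii_a, radii_b)]
    return sum(diffs) / len(diffs)

def radial_error_iqr(radii_a, radii_b):
    """Range of radial error over the best-matched half of the points."""
    errors = sorted(abs(a - b) for a, b in zip(radii_a, radii_b))
    best_half = errors[: len(errors) // 2]  # points with smallest error
    return best_half[-1] - best_half[0] if best_half else 0.0

socket = [50.1, 50.3, 49.8, 50.0, 50.2, 49.9]  # mm, hypothetical scan radii
model  = [50.0, 50.0, 50.0, 50.0, 50.0, 50.0]
print(round(mean_radial_error(socket, model), 3))  # → 0.05
```

Because MRE is signed while the IQR here is taken over absolute errors, MRE near zero with a large IQR indicates a shape distortion rather than a uniform sizing offset, which is the diagnostic distinction the abstract draws.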

  1. Development and Evaluation of the Diagnostic Power for a Computer-Based Two-Tier Assessment

    ERIC Educational Resources Information Center

    Lin, Jing-Wen

    2016-01-01

    This study adopted a quasi-experimental design with follow-up interview to develop a computer-based two-tier assessment (CBA) regarding the science topic of electric circuits and to evaluate the diagnostic power of the assessment. Three assessment formats (i.e., paper-and-pencil, static computer-based, and dynamic computer-based tests) using…

  3. Sedimentation equilibria in polydisperse ferrofluids: critical comparisons between experiment, theory, and computer simulation.

    PubMed

    Elfimova, Ekaterina A; Ivanov, Alexey O; Lakhtina, Ekaterina V; Pshenichnikov, Alexander F; Camp, Philip J

    2016-05-14

    The sedimentation equilibrium of dipolar particles in a ferrofluid is studied using experiment, theory, and computer simulation. A theory of the particle-concentration profile in a dipolar hard-sphere fluid is developed, based on the local-density approximation and accurate expressions from a recently introduced logarithmic free energy approach. The theory is tested critically against Monte Carlo simulation results for monodisperse and bidisperse dipolar hard-sphere fluids in homogeneous gravitational fields. In the monodisperse case, the theory is very accurate over broad ranges of gravitational field strength, volume fraction, and dipolar coupling constant. In the bidisperse case, with realistic dipolar coupling constants and compositions, the theory is excellent at low volume fraction, but is slightly inaccurate at high volume fraction in that it does not capture a maximum in the small-particle concentration profile seen in simulations. Possible reasons for this are put forward. Experimental measurements of the magnetic-susceptibility profile in a real ferrofluid are then analysed using the theory. The concentration profile is linked to the susceptibility profile using the second-order modified mean-field theory. It is shown that the experimental results are not consistent with the sample being monodisperse. By introducing polydispersity in the simplest possible way, namely by assuming the system is a binary mixture, almost perfect agreement between theory and experiment is achieved.

  4. Understanding climate-induced migration through computational modeling: A critical overview with guidance for future efforts

    DOE PAGES

    Till, Charlotte; Haverkamp, Jamie; White, Devin; ...

    2016-11-22

    Climate change has the potential to displace large populations in many parts of the developed and developing world. Understanding why, how, and when environmental migrants decide to move is critical to successful strategic planning within organizations tasked with helping the affected groups, and mitigating their systemic impacts. One way to support planning is through the employment of computational modeling techniques. Models can provide a window into possible futures, allowing planners and decision makers to test different scenarios in order to understand what might happen. While modeling is a powerful tool, it presents both opportunities and challenges. This paper builds a foundation for the broader community of model consumers and developers by: providing an overview of pertinent climate-induced migration research, describing some different types of models and how to select the most relevant one(s), highlighting three perspectives on obtaining data to use in said model(s), and the consequences associated with each. It concludes with two case studies based on recent research that illustrate what can happen when ambitious modeling efforts are undertaken without sufficient planning, oversight, and interdisciplinary collaboration. Lastly, we hope that the broader community can learn from our experiences and apply this knowledge to their own modeling research efforts.

  5. Understanding climate-induced migration through computational modeling: A critical overview with guidance for future efforts

    SciTech Connect

    Till, Charlotte; Haverkamp, Jamie; White, Devin; Bhaduri, Budhendra

    2016-11-22

    Climate change has the potential to displace large populations in many parts of the developed and developing world. Understanding why, how, and when environmental migrants decide to move is critical to successful strategic planning within organizations tasked with helping the affected groups, and mitigating their systemic impacts. One way to support planning is through the employment of computational modeling techniques. Models can provide a window into possible futures, allowing planners and decision makers to test different scenarios in order to understand what might happen. While modeling is a powerful tool, it presents both opportunities and challenges. This paper builds a foundation for the broader community of model consumers and developers by: providing an overview of pertinent climate-induced migration research, describing some different types of models and how to select the most relevant one(s), highlighting three perspectives on obtaining data to use in said model(s), and the consequences associated with each. It concludes with two case studies based on recent research that illustrate what can happen when ambitious modeling efforts are undertaken without sufficient planning, oversight, and interdisciplinary collaboration. Lastly, we hope that the broader community can learn from our experiences and apply this knowledge to their own modeling research efforts.

  6. Crosswords to computers: a critical review of popular approaches to cognitive enhancement.

    PubMed

    Jak, Amy J; Seelye, Adriana M; Jurick, Sarah M

    2013-03-01

    Cognitive enhancement strategies have gained recent popularity and have the potential to benefit clinical and non-clinical populations. As technology advances and the number of cognitively healthy adults seeking methods of improving or preserving cognitive functioning grows, the role of electronic (e.g., computer and video game based) cognitive training becomes more relevant and warrants greater scientific scrutiny. This paper serves as a critical review of empirical evaluations of publicly available electronic cognitive training programs. Many studies have found that electronic training approaches result in significant improvements in trained cognitive tasks. Fewer studies have demonstrated improvements in untrained tasks within the trained cognitive domain, non-trained cognitive domains, or on measures of everyday function. Successful cognitive training programs will elicit effects that generalize to untrained, practical tasks for extended periods of time. Unfortunately, many studies of electronic cognitive training programs are hindered by methodological limitations such as lack of an adequate control group, long-term follow-up and ecologically valid outcome measures. Despite these limitations, evidence suggests that computerized cognitive training has the potential to positively impact one's sense of social connectivity and self-efficacy.

  7. Experimental evidence validating the computational inference of functional associations from gene fusion events: a critical survey.

    PubMed

    Promponas, Vasilis J; Ouzounis, Christos A; Iliopoulos, Ioannis

    2014-05-01

    More than a decade ago, a number of methods were proposed for the inference of protein interactions, using whole-genome information from gene clusters, gene fusions and phylogenetic profiles. This structural and evolutionary view of entire genomes has provided a valuable approach for the functional characterization of proteins, especially those without sequence similarity to proteins of known function. Furthermore, this view has raised the real possibility of detecting functional associations of genes and their corresponding proteins for any entire genome sequence. Yet, despite these exciting developments, there have been relatively few cases of real use of these methods outside the computational biology field, as reflected by citation analysis. These methods have the potential to be used in high-throughput experimental settings in functional genomics and proteomics to validate results with very high accuracy and good coverage. In this critical survey, we provide a comprehensive overview of the 30 most prominent examples of single pairwise protein interaction cases in small-scale studies, where protein interactions have either been detected by gene fusion or yielded additional, corroborating evidence from biochemical observations. Our conclusion is that with the derivation of a validated gold-standard corpus and better data integration with big experiments, gene fusion detection can truly become a valuable tool for large-scale experimental biology.

  8. Nuclear criticality safety assessment of the Consolidated Edison Uranium-Solidification Program Facility

    SciTech Connect

    Thomas, J.T.

    1984-01-01

    A nuclear criticality assessment of the Consolidated Edison Uranium-Solidification Program facility confirms that all operations involved in the process may be conducted with an acceptable margin of subcriticality. Normal operation presents no concern since subcriticality is maintained by design. Several recommendations are presented to prevent, or mitigate the consequences of, any abnormal events that might occur in the various portions of the process. These measures would also serve to reduce to a minimum the administrative controls required to prevent criticality.

  9. Nutritional status: assessing and understanding its value in the critical care setting.

    PubMed

    Rodriguez, Les

    2004-12-01

    The nutritional assessment is a key determinant in establishing risk for malnutrition and is also valuable in predicting outcomes in the critical care setting. Studies have demonstrated that nurses who are aware of the impact of nutrition and have operational aptitude can influence patient outcomes through early intervention. This intervention can result in shortened recovery time and decreased lengths of stay. Knowledge of nutrition's effect in the acute and critically ill patient is integral for nursing to predict and promote outcomes successfully in the critical care setting.

  10. Nuclear criticality safety assessment of the low level radioactive waste disposal facility trenches

    SciTech Connect

    Kahook, S.D.

    1994-04-01

    Results of the analyses performed to evaluate the possibility of nuclear criticality in the Low Level Radioactive Waste Disposal Facility (LLRWDF) trenches are documented in this report. The studies presented in this document are limited to assessment of the possibility of criticality due to existing conditions in the LLRWDF. This document does not propose or set limits for enriched uranium (EU) burial in the LLRWDF, and it is neither a nuclear criticality safety evaluation nor an analysis. The calculations presented in the report are Level 2 calculations as defined by the E7 Procedure 2.31, Engineering Calculations.

  11. Deconstructing the Assessment of Anomaly-based Intrusion Detectors for Critical Applications

    SciTech Connect

    Viswanathan, Arun; Tan, Kymie; Neuman, Clifford

    2013-10-01

    Anomaly detection is a key strategy for cyber intrusion detection because it is conceptually capable of detecting novel attacks. This makes it an appealing defensive technique for environments such as the nation's critical infrastructures, which are currently facing increased cyber adversarial activity. When considering deployment within the purview of such critical infrastructures, it is imperative that the technology be well understood and reliable, with its performance benchmarked on the results of principled assessments. This paper works towards such an imperative by analyzing the current state of anomaly detector assessments with a view toward mission-critical deployments. We compile a framework of key evaluation constructs that identify how and where current assessment methods may fall short in providing sufficient insight into detector performance characteristics. Within the context of three case studies from the literature, we show how error factors that influence the performance of detectors interact with different phases of a canonical evaluation strategy to compromise the integrity of the final results.

  12. Electronic Quality of Life Assessment Using Computer-Adaptive Testing

    PubMed Central

    2016-01-01

    Background Quality of life (QoL) questionnaires are desirable for clinical practice but can be time-consuming to administer and interpret, making their widespread adoption difficult. Objective Our aim was to assess the performance of the World Health Organization Quality of Life (WHOQOL)-100 questionnaire as four item banks to facilitate adaptive testing using simulated computer adaptive tests (CATs) for physical, psychological, social, and environmental QoL. Methods We used data from the UK WHOQOL-100 questionnaire (N=320) to calibrate item banks using item response theory, which included psychometric assessments of differential item functioning, local dependency, unidimensionality, and reliability. We simulated CATs to assess the number of items administered before prespecified levels of reliability were met. Results The item banks (40 items) all displayed good model fit (P>.01) and were unidimensional (fewer than 5% of t tests significant), reliable (Person Separation Index>.70), and free from differential item functioning (no significant analysis of variance interaction) or local dependency (residual correlations < +.20). When matched for reliability, the item banks were between 45% and 75% shorter than paper-based WHOQOL measures. Across the four domains, a high standard of reliability (alpha>.90) could be gained with a median of 9 items. Conclusions Using CAT, simulated assessments were as reliable as paper-based forms of the WHOQOL with a fraction of the number of items. These properties suggest that these item banks are suitable for computerized adaptive assessment. These item banks have the potential for international development using existing alternative language versions of the WHOQOL items. PMID:27694100
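The simulated-CAT procedure described above can be sketched as a simple loop: administer the most informative remaining item, update the ability estimate, and stop once the standard error reaches a target (on a standardized scale, reliability is roughly 1 - SE^2). The Rasch model, the item bank, the one-step update rule, and the stopping target below are all illustrative assumptions, not the authors' method:

```python
# Toy computer-adaptive-test simulation: pick the most informative item at the
# current ability estimate, update the estimate, stop at a target standard error.
import math

def prob(theta, b):
    """Rasch model probability of a correct (or endorsed) response."""
    return 1.0 / (1.0 + math.exp(-(theta - b)))

def simulate_cat(item_bank, responder, target_se=0.55):
    theta = 0.0
    administered = []
    remaining = list(item_bank)
    while remaining:
        # Fisher information for a Rasch item is p(1-p) at the current theta
        item = max(remaining, key=lambda b: prob(theta, b) * (1 - prob(theta, b)))
        remaining.remove(item)
        administered.append(item)
        # crude one-step ability update from the response residual
        theta += (responder(item) - prob(theta, item)) / len(administered)
        info = sum(prob(theta, b) * (1 - prob(theta, b)) for b in administered)
        se = 1.0 / math.sqrt(info)
        if se <= target_se:
            break
    return theta, len(administered)

# a 40-item bank of difficulties and a deterministic responder, both invented
bank = [-2.0, -1.5, -1.0, -0.5, 0.0, 0.5, 1.0, 1.5, 2.0, 2.5] * 4
theta_hat, n_items = simulate_cat(bank, responder=lambda b: 1 if b < 0.3 else 0)
print(n_items)
```

Real CAT engines use maximum-likelihood or Bayesian ability estimation and polytomous item models (which carry more information per item, hence the short tests the abstract reports); this dichotomous sketch only demonstrates the select-administer-stop loop.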

  13. Quantitative Computed Tomography and Image Analysis for Advanced Muscle Assessment

    PubMed Central

    Edmunds, Kyle Joseph; Gíslason, Magnus K.; Arnadottir, Iris D.; Marcante, Andrea; Piccione, Francesco; Gargiulo, Paolo

    2016-01-01

    Medical imaging is of particular interest in the field of translational myology, as extant literature describes the utilization of a wide variety of techniques to non-invasively recapitulate and quantify various internal and external tissue morphologies. In the clinical context, medical imaging remains a vital tool for diagnostics and investigative assessment. This review outlines the results from several investigations on the use of computed tomography (CT) and image analysis techniques to assess muscle conditions and degenerative processes due to aging or pathological conditions. Herein, we detail the acquisition of spiral CT images and the use of advanced image analysis tools to characterize muscles in 2D and 3D. Results from these studies recapitulate changes in tissue composition within muscles, as visualized by the association of tissue types to specified Hounsfield Unit (HU) values for fat, loose connective tissue or atrophic muscle, and normal muscle, including fascia and tendon. We show how results from these analyses can be presented as both average HU values and compositions with respect to total muscle volumes, demonstrating the reliability of these tools to monitor, assess and characterize muscle degeneration. PMID:27478562
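A minimal sketch of the HU-based tissue classification described above: each voxel is binned by its Hounsfield Unit value, and composition is reported as a percentage of classified voxels. The HU cut-offs, labels, and voxel values are invented for illustration, not the study's:

```python
# Toy tissue-composition calculation from Hounsfield Unit (HU) values.
# (label, lower HU bound inclusive, upper HU bound exclusive) -- hypothetical ranges
TISSUE_RANGES = [
    ("fat", -200, -10),
    ("loose connective / atrophic muscle", -10, 30),
    ("normal muscle", 30, 150),
]

def tissue_composition(hu_values):
    """Percent of classified voxels falling in each HU-defined tissue class."""
    counts = {label: 0 for label, _, _ in TISSUE_RANGES}
    classified = 0
    for hu in hu_values:
        for label, lo, hi in TISSUE_RANGES:
            if lo <= hu < hi:
                counts[label] += 1
                classified += 1
                break
    return {label: 100.0 * n / classified for label, n in counts.items()}

voxels = [-120, -50, 5, 20, 40, 55, 70, 90]  # hypothetical HU samples
comp = tissue_composition(voxels)
print(comp["normal muscle"])  # → 50.0
```

In practice the HU values come from the CT volume after applying the scanner's rescale slope and intercept, and the composition is reported per muscle volume segmented from the scan.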

  14. Preliminary performance assessment of computer automated facial approximations using computed tomography scans of living individuals.

    PubMed

    Parks, Connie L; Richard, Adam H; Monson, Keith L

    2013-12-10

    ReFace (Reality Enhancement Facial Approximation by Computational Estimation) is a computer-automated facial approximation application jointly developed by the Federal Bureau of Investigation and GE Global Research. The application derives a statistically based approximation of a face from an unidentified skull using a dataset of ~400 human head computed tomography (CT) scans of living adult American individuals from four ancestry groups: African, Asian, European and Hispanic (self-identified). To date only one unpublished subjective recognition study has been conducted using ReFace approximations. It indicated that approximations produced by ReFace were recognized above chance rates (10%). This preliminary study assesses: (i) the recognizability of five ReFace approximations; (ii) the recognizability of CT-derived skin surface replicas of the same individuals whose skulls were used to create the ReFace approximations; and (iii) the relationship between recognition performance and resemblance ratings of target individuals. All five skin surface replicas were recognized at rates significantly above chance (22-50%). Four of five ReFace approximations were recognized above chance (5-18%), although with statistical significance only at the higher rate. Such results suggest reconsideration of the usefulness of the type of output format utilized in this study, particularly in regard to facial approximations employed as a means of identifying unknown individuals.

  15. Collaborative mobile sensing and computing for civil infrastructure condition assessment: framework and applications

    NASA Astrophysics Data System (ADS)

    Chen, Jianfei; Chen, ZhiQiang

    2012-04-01

    Multi-function sensing and imaging devices, GPS, communication, and computing devices are being used ubiquitously in the field by engineers in civil engineering and emergency response practice. Field engineers, however, still struggle to balance ever-increasing data collection demands against the capacity for real-time data processing and knowledge sharing. In addition, field engineers usually work collaboratively across a geospatially large area, a condition the existing sensing and computing modalities used in the field are not designed to accommodate. In this paper, we present a solution framework of collaborative mobile sensing and computing (CMSC) for civil infrastructure condition assessment, with Android-based mobile devices as the basic nodes in the framework and the potential to add other auxiliary imaging and sensing devices into the network. Difficulties in mixed C++ and Java programming that are critical to realizing the framework are discussed. With a few prototypes illustrated in this paper, we envisage that the proposed CMSC framework will enable seamless integration of sensing, imaging, real-time processing, and knowledge discovery in future engineer-centered field reconnaissance and civil infrastructure condition assessment.

  16. English Computer Critical Thinking Reading and Writing Interactive Multi-Media Programs for Comparison/Contrast and Analysis.

    ERIC Educational Resources Information Center

    Barkley, Christine

    Two computer programs were developed to enhance community college students' critical thinking skills in the areas of "Comparison and Contrast" and "Analysis." Instructors have several options in using the programs. With access to an LCD panel and an overhead projector, instructors can use the programs in the classroom, manipulating the computer…

  17. How Day of Posting Affects Level of Critical Discourse in Asynchronous Discussions and Computer-Supported Collaborative Argumentation

    ERIC Educational Resources Information Center

    Jeong, Allan; Frazier, Sue

    2008-01-01

    In asynchronous threaded discussions, messages posted near the end of the week provide less time for students to critically examine and respond to ideas presented in the messages than messages posted early in the week. This study examined how the day in which messages are posted (early, midweek and weekend) in computer-supported collaborative…

  18. The Development of a Critical Care Resident Research Curriculum: A Needs Assessment.

    PubMed

    Jain, Sangeeta; Menon, Kusum; Piquette, Dominique; Gottesman, Ronald; Hutchison, James; Gilfoyle, Elaine; Canadian Critical Care Trials Group

    2016-01-01

    Background. Conducting research is an expected part of many clinicians' professional profiles, yet many do not have advanced research degrees. Research training during residency varies amongst institutions, and the research education needs of trainees are not well understood. Objective. To understand the needs of critical care trainees regarding research education. Methods. Canadian critical care trainees, new critical care faculty, program directors, and research coordinators were surveyed regarding research training, research expectations, and support within their programs. Results. Critical care trainees and junior faculty members highlighted many gaps in research knowledge and skills. In contrast, critical care program directors felt that trainees were prepared to undertake research careers. Major differences in opinion amongst program directors and other respondent groups exist regarding preparation for designing a study, navigating research ethics board applications, and managing a research budget. Conclusion. We demonstrated that Canadian critical care trainees and junior faculty reported gaps in knowledge in all areas of research. There was disagreement amongst trainees, junior faculty, research coordinators, and program directors regarding learning needs. Results from this needs assessment will be used to help redesign the education program of the Canadian Critical Care Trials Group to complement local research training offered for critical care trainees.

  19. The Development of a Critical Care Resident Research Curriculum: A Needs Assessment

    PubMed Central

    Jain, Sangeeta; Hutchison, James; Canadian Critical Care Trials Group

    2016-01-01

    Background. Conducting research is an expected part of many clinicians' professional profiles, yet many do not have advanced research degrees. Research training during residency varies amongst institutions, and the research education needs of trainees are not well understood. Objective. To understand the needs of critical care trainees regarding research education. Methods. Canadian critical care trainees, new critical care faculty, program directors, and research coordinators were surveyed regarding research training, research expectations, and support within their programs. Results. Critical care trainees and junior faculty members highlighted many gaps in research knowledge and skills. In contrast, critical care program directors felt that trainees were prepared to undertake research careers. Major differences in opinion amongst program directors and other respondent groups exist regarding preparation for designing a study, navigating research ethics board applications, and managing a research budget. Conclusion. We demonstrated that Canadian critical care trainees and junior faculty reported gaps in knowledge in all areas of research. There was disagreement amongst trainees, junior faculty, research coordinators, and program directors regarding learning needs. Results from this needs assessment will be used to help redesign the education program of the Canadian Critical Care Trials Group to complement local research training offered for critical care trainees. PMID:27610029

  20. The development and testing of a qualitative instrument designed to assess critical thinking

    NASA Astrophysics Data System (ADS)

    Clauson, Cynthia Louisa

    This study examined a qualitative approach to assess critical thinking. An instrument was developed that incorporates an assessment process based on Dewey's (1933) concepts of self-reflection and critical thinking as problem solving. The study was designed to pilot test the critical thinking assessment process with writing samples collected from a heterogeneous group of students. The pilot test included two phases. Phase 1 was designed to determine the validity and inter-rater reliability of the instrument using two experts in critical thinking, problem solving, and literacy development. Validity of the instrument was addressed by requesting both experts to respond to ten questions in an interview. The inter-rater reliability was assessed by analyzing the consistency of the two experts' scorings of the 20 writing samples with each other, as well as with my scoring of the same 20 writing samples. Statistical analyses included the Spearman Rho and the Kuder-Richardson (Formula 20). Phase 2 was designed to determine the validity and reliability of the critical thinking assessment process with seven science teachers. Validity was addressed by requesting the teachers to respond to ten questions in a survey and interview. Inter-rater reliability was addressed by comparing the seven teachers' scoring of five writing samples with my scoring of the same five writing samples. Again, the Spearman Rho and the Kuder-Richardson (Formula 20) were used to determine the inter-rater reliability. The validity results suggest that the instrument is helpful as a guide for instruction and provides a systematic method to teach and assess critical thinking while problem solving with students in the classroom. The reliability results show the critical thinking assessment instrument to possess fairly high reliability when used by the experts, but weak reliability when used by classroom teachers.
A major conclusion was drawn that teachers, as well as students, would need to receive instruction
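The two statistics used in both phases, Spearman's rho for rank agreement between raters and Kuder-Richardson Formula 20 for internal consistency, can be computed as follows. The rater scores and item matrix are invented examples, and this simple rho assumes untied ranks:

```python
# Worked sketches of the two reliability statistics named above,
# Spearman's rho and Kuder-Richardson Formula 20, on invented data.

def spearman_rho(x, y):
    """Spearman rank correlation (no tied values assumed in this version)."""
    n = len(x)
    rank = lambda v: {val: i + 1 for i, val in enumerate(sorted(v))}
    rx, ry = rank(x), rank(y)
    d2 = sum((rx[a] - ry[b]) ** 2 for a, b in zip(x, y))
    return 1 - 6 * d2 / (n * (n ** 2 - 1))

def kr20(item_scores):
    """KR-20 for a dichotomous (0/1) matrix: rows = people, columns = items."""
    k = len(item_scores[0])
    n = len(item_scores)
    totals = [sum(row) for row in item_scores]
    mean_t = sum(totals) / n
    var_t = sum((t - mean_t) ** 2 for t in totals) / n  # population variance
    pq = 0.0
    for j in range(k):
        p = sum(row[j] for row in item_scores) / n  # proportion answering item j correctly
        pq += p * (1 - p)
    return (k / (k - 1)) * (1 - pq / var_t)

# two raters' scores of five writing samples (invented)
print(spearman_rho([3, 1, 4, 2, 5], [2, 1, 4, 3, 5]))  # → 0.9
```

KR-20 applies only to dichotomously scored items; for polytomous rubric scores like those in this study, Cronbach's alpha is the usual generalization.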

  1. [Vascular assessment in stroke codes: role of computed tomography angiography].

    PubMed

    Mendigaña Ramos, M; Cabada Giadas, T

    2015-01-01

    Advances in imaging studies for acute ischemic stroke are largely due to the development of new efficacious treatments carried out in the acute phase. Together with computed tomography (CT) perfusion studies, CT angiography facilitates the selection of patients who are likely to benefit from appropriate early treatment. CT angiography plays an important role in the workup for acute ischemic stroke because it makes it possible to confirm vascular occlusion, assess the collateral circulation, and obtain an arterial map that is very useful for planning endovascular treatment. In this review about CT angiography, we discuss the main technical characteristics, emphasizing the usefulness of the technique in making the right diagnosis and improving treatment strategies. Copyright © 2012 SERAM. Published by Elsevier España, S.L.U. All rights reserved.

  2. Assessment of computer customized brackets and positioning jigs.

    PubMed

    Dewhurst, Robert

    2012-01-01

    An in-practice assessment of Ormco's CAD/CAM Insignia orthodontic system is reviewed. The investigation included an in-vitro and in-vivo analysis of the accuracy of bracket placement, and the accuracy of slot and torque manufacturing specifications was measured. The possible role of these systems in general dental practice is discussed. Examination of the Insignia system has led us to believe that the way orthodontic cases are planned, treated, and delivered to the patient is changing with the increased use of computer technology. The ability to customize brackets and wires for the individual tooth in the individual patient should lead to better and easier finishing.

  3. Japanese technology assessment: Computer science, opto- and microelectronics, mechatronics, biotechnology

    SciTech Connect

    Brandin, D.; Wieder, H.; Spicer, W.; Nevins, J.; Oxender, D.

    1986-01-01

    The series studies Japanese research and development in four high-technology areas - computer science, opto and microelectronics, mechatronics (a term created by the Japanese to describe the union of mechanical and electronic engineering to produce the next generation of machines, robots, and the like), and biotechnology. The evaluations were conducted by panels of U.S. scientists - chosen from academia, government, and industry - actively involved in research in areas of expertise. The studies were prepared for the purpose of aiding the U.S. response to Japan's technological challenge. The main focus of the assessments is on the current status and long-term direction and emphasis of Japanese research and development. Other aspects covered include evolution of the state of the art; identification of Japanese researchers, R and D organizations, and resources; and comparative U.S. efforts. The general time frame of the studies corresponds to future industrial applications and potential commercial impacts spanning approximately the next two decades.

  4. Color calculations for and perceptual assessment of computer graphic images

    SciTech Connect

    Meyer, G.W.

    1986-01-01

Realistic image synthesis involves the modelling of an environment in accordance with the laws of physics and the production of a final simulation that is perceptually acceptable. To be considered a scientific endeavor, synthetic image generation should also include the final step of experimental verification. This thesis concentrates on the color calculations that are inherent in the production of the final simulation and on the perceptual assessment of the computer graphic images that result. The fundamental spectral sensitivity functions that are active in the human visual system are introduced and are used to address color-blindness issues in computer graphics. A digitally controlled color television monitor is employed to successfully implement both the Farnsworth-Munsell 100 Hue test and a new color vision test that yields more accurate diagnoses. Images that simulate color-blind vision are synthesized and are used to evaluate color scales for data display. Gaussian quadrature is used with a set of opponent fundamentals to select the wavelengths at which to perform synthetic image generation.
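
    The Gaussian-quadrature step mentioned above can be illustrated with a minimal sketch. This is a two-point Gauss-Legendre rule applied to a toy spectral polynomial over the visible range; the thesis selects nodes against opponent fundamentals, which this flat-weight example does not attempt to reproduce.

```python
import math

def gauss_legendre_2pt(f, a, b):
    """Two-point Gauss-Legendre quadrature of f over [a, b]; exact for polynomials
    up to degree 3. Illustrates sampling a spectrum at a few well-chosen wavelengths."""
    mid, half = (a + b) / 2, (b - a) / 2
    x = 1 / math.sqrt(3)  # standard 2-point nodes are at +/- 1/sqrt(3) on [-1, 1]
    return half * (f(mid - half * x) + f(mid + half * x))

# Toy "spectral" function over the visible range 380-780 nm
f = lambda nm: 1e-6 * (nm - 380) * (780 - nm)
approx = gauss_legendre_2pt(f, 380, 780)
exact = 1e-6 * (780 - 380) ** 3 / 6  # analytic: integral of (x-a)(b-x) is (b-a)^3/6
print(approx, exact)
```

    Because the integrand is quadratic, the two-point rule reproduces the analytic value; real spectral data would need more nodes.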

  5. Systematic risk assessment methodology for critical infrastructure elements - Oil and Gas subsectors

    NASA Astrophysics Data System (ADS)

    Gheorghiu, A.-D.; Ozunu, A.

    2012-04-01

The concern for the protection of critical infrastructure has been growing rapidly in Europe in the last few years. The level of knowledge and preparedness in this field is beginning to develop in a lawfully organized manner, for the identification and designation of critical infrastructure elements of national and European interest. Facilities for oil and gas production, refining, treatment, storage, and transmission by pipelines are considered European critical infrastructure sectors, as per Annex I of Council Directive 2008/114/EC of 8 December 2008 on the identification and designation of European critical infrastructures and the assessment of the need to improve their protection. Besides identifying European and national critical infrastructure elements, member states also need to perform a risk analysis for these infrastructure items, as stated in Annex II of the above-mentioned Directive. In the field of risk assessment there is a series of acknowledged and successfully used methods worldwide, but not all hazard identification and assessment methods and techniques are suitable for a given site, situation, or type of hazard. As Theoharidou et al. noted (Theoharidou, M., P. Kotzanikolaou, and D. Gritzalis 2009. Risk-Based Criticality Analysis. In Critical Infrastructure Protection III. Proceedings. Third Annual IFIP WG 11.10 International Conference on Critical Infrastructure Protection. Hanover, New Hampshire, USA, March 23-25, 2009: revised selected papers, edited by C. Palmer and S. Shenoi, 35-49. Berlin: Springer.), despite the wealth of knowledge already created, there is a need for simple, feasible, and standardized criticality analyses. The proposed systematic risk assessment methodology includes three basic steps: a preliminary analysis, in which hazards (including possible natural hazards) are identified for each installation/section within a given site; a criterial analysis; and a detailed analysis step.

  6. Assessing Walking Activity in Older Adults: Development and Validation of a Novel Computer-Animated Assessment Tool.

    PubMed

    Marsh, Anthony P; Janssen, James A; Ip, Edward H; Barnard, Ryan T; Ambrosius, Walter T; Brubaker, Peter R; Burdette, Jonathan H; Sheedy, Jessica L; Rejeski, W Jack

    2015-12-01

    Assessing volume of physical activity (PA) in older adults is critical to understanding the role that PA has on health outcomes and the effectiveness of treatment interventions to increase PA. The purpose of this study was to investigate the psychometric properties of a novel computer-animated self-report questionnaire designed to assess walking activity of older adults: the Mobility Assessment Tool for Walking--the MAT-W. We recruited 249 older adults (66.9±4.7 years, 71% female, 32% black) with cardiovascular disease and/or metabolic syndrome as part of the Cooperative Lifestyle Intervention Program-II study. Participants completed the MAT-W at baseline and after 6 months of a walking and weight loss (n = 78) or weight loss only (n = 69) intervention. Test-retest reliability was assessed in 31 participants. Walking speed at usual and fast pace was measured using a GAITRite mat, and 7-day accelerometry data were collected at baseline and 6 months. The mCHAMPS5, a modified version of a widely used self-report PA questionnaire, was completed at baseline. The test-retest reliability of MAT-W was excellent (intraclass correlation coefficient > .85). The MAT-W was correlated with mCHAMPS5 (Spearman r = .66, p < .001) and moderate/vigorous levels of PA as assessed by accelerometry (Spearman r = .65, p < .001) and was responsive to an intervention-induced change in PA at 6 months when comparing the Cooperative Lifestyle Intervention Program-II walking and weight loss group with the weight loss only group (p < .001). The MAT-W is a brief, reliable, and valid tool to assess PA and has promise for the assessment of walking behavior in older adults under free-living conditions. © The Author 2015. Published by Oxford University Press on behalf of The Gerontological Society of America. All rights reserved. For permissions, please e-mail: journals.permissions@oup.com.
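
    For reference, a Spearman rank correlation like the r = .65-.66 values reported above can be computed, for data without ties, with the classic shortcut formula. The data below are hypothetical, not the study's; real analyses (with ties) need the full rank-based Pearson computation.

```python
def spearman_r(x, y):
    """Spearman rank correlation via the no-ties shortcut:
    r = 1 - 6 * sum(d^2) / (n * (n^2 - 1)), where d is the per-item rank difference."""
    def ranks(v):
        order = sorted(range(len(v)), key=lambda i: v[i])
        r = [0] * len(v)
        for rank, i in enumerate(order, start=1):
            r[i] = rank
        return r
    rx, ry = ranks(x), ranks(y)
    n = len(x)
    d2 = sum((a - b) ** 2 for a, b in zip(rx, ry))
    return 1 - 6 * d2 / (n * (n * n - 1))

# Perfectly monotone hypothetical scores give r = 1.0
print(spearman_r([1, 2, 3, 4], [10, 20, 30, 40]))  # 1.0
```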

  7. Cyst-based measurements for assessing lymphangioleiomyomatosis in computed tomography

    SciTech Connect

    Lo, P. Brown, M. S.; Kim, H.; Kim, H.; Goldin, J. G.; Argula, R.; Strange, C.

    2015-05-15

Purpose: To investigate the efficacy of a new family of measurements made on individual pulmonary cysts extracted from computed tomography (CT) for assessing the severity of lymphangioleiomyomatosis (LAM). Methods: CT images were analyzed using thresholding to identify a cystic region of interest from chest CT of LAM patients. Individual cysts were then extracted from the cystic region by the watershed algorithm, which separates individual cysts based on subtle edges within the cystic regions. A family of measurements was then computed to quantify the amount, distribution, and boundary appearance of the cysts. Sequential floating feature selection was used to select a small subset of features for quantifying the severity of LAM. Adjusted R² from multiple linear regression and R² from linear regression against measurements from spirometry were used to compare the performance of our proposed measurements with the density-based CT measurements currently used in the literature, namely, the relative area measure and the D measure. Results: Volumetric CT data, performed at total lung capacity and residual volume, from a total of 49 subjects enrolled in the MILES trial were used in our study. Our proposed measures had adjusted R² ranging from 0.42 to 0.59 when regressed against the spirometry measures, with p < 0.05. For the previously used density-based CT measurements, the best R² was 0.46 (for only one instance), with the majority being lower than 0.3 or having p > 0.05. Conclusions: The proposed family of CT-based cyst measurements correlates better with spirometric measures than previously used density-based CT measurements. It shows potential as a sensitive tool for quantitatively assessing the severity of LAM.
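
    A minimal sketch of the first stages of such a pipeline (thresholding plus per-cyst measurement) is given below. As a simplifying assumption, 4-connected flood fill stands in for the watershed separation, and a tiny synthetic slice stands in for real CT data; the threshold value is illustrative only.

```python
def cyst_areas(ct, threshold=-950):
    """Threshold a CT slice (values in Hounsfield units) to a cystic mask, then split
    the mask into individual cysts by 4-connected flood fill and return each cyst's
    area in pixels. Flood fill is a simplified stand-in for watershed separation."""
    rows, cols = len(ct), len(ct[0])
    seen = [[False] * cols for _ in range(rows)]
    areas = []
    for r in range(rows):
        for c in range(cols):
            if ct[r][c] < threshold and not seen[r][c]:
                stack, area = [(r, c)], 0  # flood-fill one connected cyst
                seen[r][c] = True
                while stack:
                    y, x = stack.pop()
                    area += 1
                    for dy, dx in ((1, 0), (-1, 0), (0, 1), (0, -1)):
                        ny, nx = y + dy, x + dx
                        if 0 <= ny < rows and 0 <= nx < cols \
                                and not seen[ny][nx] and ct[ny][nx] < threshold:
                            seen[ny][nx] = True
                            stack.append((ny, nx))
                areas.append(area)
    return areas

# Synthetic slice: two low-attenuation "cysts" on denser lung background
ct = [[-800] * 6 for _ in range(6)]
for y in (1, 2):
    for x in (1, 2):
        ct[y][x] = -980
for y in (4, 5):
    for x in (4, 5):
        ct[y][x] = -990
print(cyst_areas(ct))  # [4, 4]
```

    Per-cyst areas are the starting point for the amount and distribution measures the abstract describes; boundary-appearance features would need the cyst contours as well.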

  8. Approaches for the computationally efficient assessment of the plug-in HEV impact on the grid

    NASA Astrophysics Data System (ADS)

    Lee, Tae-Kyung; Filipi, Zoran S.

    2012-11-01

    Realistic duty cycles are critical for design and assessment of hybrid propulsion systems, in particular, plug-in hybrid electric vehicles. The analysis of the PHEV impact requires a large amount of data about daily missions for ensuring realism in predicted temporal loads on the grid. This paper presents two approaches for the reduction of the computational effort while assessing the large scale PHEV impact on the grid, namely 1) "response surface modelling" approach; and 2) "daily driving schedule modelling" approach. The response surface modelling approach replaces the time-consuming vehicle simulations by response surfaces constructed off-line with the consideration of the real-world driving. The daily driving modelling approach establishes a correlation between departure and arrival times, and it predicts representative driving patterns with a significantly reduced number of simulation cases. In both cases, representative synthetic driving cycles are used to capture the naturalistic driving characteristics for a given trip length. The proposed approaches enable construction of 24-hour missions, assessments of charging requirements at the time of plugging-in, and temporal distributions of the load on the grid with high computational efficiency.
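
    The response-surface idea above (replace expensive simulations with a cheap fitted surrogate) can be sketched in one dimension. The trip-length/energy-use samples are hypothetical; the paper's surfaces are built from full vehicle simulations over many driving cycles.

```python
def fit_quadratic(points):
    """Fit an exact quadratic through three (x, y) samples via Lagrange interpolation,
    standing in for a response surface built off-line from vehicle simulations.
    Returns coefficients (a, b, c) of a*x^2 + b*x + c."""
    (x0, y0), (x1, y1), (x2, y2) = points
    d0 = (x0 - x1) * (x0 - x2)
    d1 = (x1 - x0) * (x1 - x2)
    d2 = (x2 - x0) * (x2 - x1)
    # Lagrange basis polynomials expanded into monomial coefficients
    a = y0 / d0 + y1 / d1 + y2 / d2
    b = -(y0 * (x1 + x2) / d0 + y1 * (x0 + x2) / d1 + y2 * (x0 + x1) / d2)
    c = y0 * x1 * x2 / d0 + y1 * x0 * x2 / d1 + y2 * x0 * x1 / d2
    return a, b, c

# Hypothetical samples: trip length (miles) -> grid energy drawn at plug-in (kWh)
a, b, c = fit_quadratic([(10, 3.0), (20, 5.5), (40, 9.0)])
surface = lambda miles: a * miles**2 + b * miles + c
print(round(surface(20), 2))  # 5.5 (reproduces the sample exactly)
```

    Once fitted, evaluating the surface costs a few arithmetic operations per trip instead of a full simulation, which is the source of the computational savings.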

  9. Health advantages and disadvantages of weight-reducing diets: a computer analysis and critical review.

    PubMed

    Anderson, J W; Konz, E C; Jenkins, D J

    2000-10-01

Some weight-loss diets are nutritionally sound and consistent with recommendations for healthy eating, while others are "fad" diets encouraging irrational and, sometimes, unsafe practices. The purpose of the study was to compare several weight-loss diets and assess their potential long-term effects. Eight popular weight-loss diets were selected (Atkins, Protein Power, Sugar Busters, Zone, ADA Exchange, High-Fiber Fitness, Pritikin and Ornish) for non-clinical analysis by computer to predict their relative benefits and potential for harm. A summary description, menu plan and recommended snacks were developed for each diet. The nutrient composition of each diet was determined using computer software, and a Food Pyramid Score was calculated to compare diets. The Mensink, Hegsted and other formulae were applied to estimate coronary heart disease (CHD) risk factors. Higher-fat diets are higher in saturated fats and cholesterol than current dietary guidelines recommend, and their long-term use would increase serum cholesterol levels and risk for CHD. Diets restricted in sugar intake would lower serum cholesterol levels and long-term risk for CHD; however, higher-carbohydrate, higher-fiber, lower-fat diets would have the greatest effect in decreasing serum cholesterol concentrations and risk of CHD. While high-fat diets may promote short-term weight loss, the potential hazards of worsening the risk of progression of atherosclerosis override the short-term benefits. Individuals derive the greatest health benefits from diets low in saturated fat and high in carbohydrate and fiber: these increase sensitivity to insulin and lower risk for CHD.

  10. Computational fluid dynamics framework for aerodynamic model assessment

    NASA Astrophysics Data System (ADS)

    Vallespin, D.; Badcock, K. J.; Da Ronch, A.; White, M. D.; Perfect, P.; Ghoreyshi, M.

    2012-07-01

    This paper reviews the work carried out at the University of Liverpool to assess the use of CFD methods for aircraft flight dynamics applications. Three test cases are discussed in the paper, namely, the Standard Dynamic Model, the Ranger 2000 jet trainer and the Stability and Control Unmanned Combat Air Vehicle. For each of these, a tabular aerodynamic model based on CFD predictions is generated along with validation against wind tunnel experiments and flight test measurements. The main purpose of the paper is to assess the validity of the tables of aerodynamic data for the force and moment prediction of realistic aircraft manoeuvres. This is done by generating a manoeuvre based on the tables of aerodynamic data, and then replaying the motion through a time-accurate computational fluid dynamics calculation. The resulting forces and moments from these simulations were compared with predictions from the tables. As the latter are based on a set of steady-state predictions, the comparisons showed perfect agreement for slow manoeuvres. As manoeuvres became more aggressive some disagreement was seen, particularly during periods of large rates of change in attitudes. Finally, the Ranger 2000 model was used on a flight simulator.
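
    The table-replay comparison described above rests on interpolating steady-state CFD results. A one-dimensional sketch of that lookup is below; the lift-coefficient values are hypothetical, and the paper's tables span several state variables rather than angle of attack alone.

```python
from bisect import bisect_left

def interp_table(alphas, coeffs, alpha):
    """Linear interpolation in a 1-D aerodynamic table (angle of attack -> coefficient),
    clamping outside the table range. This is the kind of lookup a tabular model
    performs at each time step while replaying a manoeuvre."""
    i = bisect_left(alphas, alpha)
    if i == 0:
        return coeffs[0]
    if i == len(alphas):
        return coeffs[-1]
    t = (alpha - alphas[i - 1]) / (alphas[i] - alphas[i - 1])
    return coeffs[i - 1] + t * (coeffs[i] - coeffs[i - 1])

# Hypothetical lift-coefficient table from steady-state CFD runs
alphas = [0.0, 5.0, 10.0]  # angle of attack, degrees
cl = [0.0, 0.55, 1.05]
print(interp_table(alphas, cl, 2.5))
```

    Because the table encodes only steady-state data, fast-changing attitudes fall outside what this lookup can represent, which matches the disagreement the paper reports for aggressive manoeuvres.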

  11. Computational Performance Assessment of k-mer Counting Algorithms.

    PubMed

    Pérez, Nelson; Gutierrez, Miguel; Vera, Nelson

    2016-04-01

This article assesses several tools for k-mer counting, with the purpose of creating a reference framework that helps bioinformatics researchers identify the computational requirements, parallelization, advantages, disadvantages, and bottlenecks of each of the algorithms proposed in the tools. The k-mer counters evaluated in this article were BFCounter, DSK, Jellyfish, KAnalyze, KHMer, KMC2, MSPKmerCounter, Tallymer, and Turtle. The measured parameters were: RAM occupied, processing time, parallelization, and read and write disk access. A dataset consisting of 36,504,800 reads corresponding to the 14th human chromosome was used. The assessment was performed for two k-mer lengths: 31 and 55. The results were as follows: pure Bloom filter-based tools and disk-partitioning techniques showed lower RAM use. The tools that took the least execution time were the ones that used disk-partitioning techniques. The tools that achieved the greatest parallelization were those that used disk partitioning, lock-free hash tables, or multiple hash tables.
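
    The task these tools optimize can be stated with a deliberately naive hash-table sketch. The benchmarked counters use Bloom filters, disk partitioning, or lock-free hash tables precisely because this simple approach does not scale to tens of millions of reads.

```python
from collections import Counter

def count_kmers(reads, k):
    """Count every length-k substring across a set of reads using a plain hash table.
    Memory grows with the number of distinct k-mers, the bottleneck the article measures."""
    counts = Counter()
    for read in reads:
        for i in range(len(read) - k + 1):
            counts[read[i:i + k]] += 1
    return counts

# Tiny illustrative input, not the article's 36.5-million-read dataset
counts = count_kmers(["ACGTACGT", "CGTACG"], 3)
print(counts["ACG"], counts["CGT"])  # 3 3
```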

  12. TRECII: a computer program for transportation risk assessment

    SciTech Connect

    Franklin, A.L.

    1980-05-01

    A risk-based fault tree analysis method has been developed at the Pacific Northwest Laboratory (PNL) for analysis of nuclear fuel cycle operations. This methodology was developed for the Department of Energy (DOE) as a risk analysis tool for evaluating high level waste management systems. A computer package consisting of three programs was written at that time to assist in the performance of risk assessment: ACORN (draws fault trees), MFAULT (analyzes fault trees), and RAFT (calculates risk). This methodology evaluates release consequences and estimates the frequency of occurrence of these consequences. This document describes an additional risk calculating code which can be used in conjunction with two of the three codes for transportation risk assessment. TRECII modifies the definition of risk used in RAFT (prob. x release) to accommodate release consequences in terms of fatalities. Throughout this report risk shall be defined as probability times consequences (fatalities are one possible health effect consequence). This methodology has been applied to a variety of energy material transportation systems. Typically the material shipped has been radioactive, although some adaptation to fossil fuels has occurred. The approach is normally applied to truck or train transport systems with some adaptation to pipelines and aircraft. TRECII is designed to be used primarily in conjunction with MFAULT; however, with a moderate amount of effort by the user, it can be implemented independent of the risk analysis package developed at PNL. Code description and user instructions necessary for the implementation of the TRECII program are provided.
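
    The report's definition of risk (probability times consequences) reduces to a short sketch. The scenario values below are hypothetical illustrations, not numbers from TRECII.

```python
def transport_risk(scenarios):
    """Total risk summed over release scenarios, where each scenario contributes
    (annual probability) x (consequence, here expected fatalities), mirroring the
    probability-times-consequences definition used throughout the report."""
    return sum(prob * fatalities for prob, fatalities in scenarios)

# Hypothetical scenarios: (annual probability, expected fatalities)
scenarios = [(1e-4, 2.0), (1e-6, 50.0)]
print(transport_risk(scenarios))  # 0.00025 expected fatalities per year
```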

  13. Computer Science: A Historical Perspective and a Current Assessment

    NASA Astrophysics Data System (ADS)

    Wirth, Niklaus

    We begin with a brief review of the early years of Computer Science. This period was dominated by large, remote computers and the struggle to master the complex problems of programming. The remedy was found in programming languages providing suitable abstractions and programming models. Outstanding was the language Algol 60, designed by an international committee, and intended as a publication language for algorithms. The early period ends with the advent of the microcomputer in the mid 1970s, bringing computing into homes and schools. The outstanding computer was the Alto, the first personal computer with substantial computing power. It changed the world of computing.

  14. Application of the Sequential Organ Failure Assessment Score to predict outcome in critically ill dogs: preliminary results.

    PubMed

    Ripanti, D; Dino, G; Piovano, G; Farca, A

    2012-08-01

In human medicine the Sequential Organ Failure Assessment (SOFA) score is one of the most commonly used organ dysfunction scoring systems for assessing critically ill patients and predicting outcome in Intensive Care Units (ICUs). It is composed of scores from six organ systems (respiratory, cardiovascular, hepatic, coagulation, renal, and neurological) graded according to the degree of dysfunction. The aim of the current study was to describe the applicability of the SOFA score in assessing the outcome of critically ill dogs. A total of 45 dogs admitted to the ICU were enrolled. Among these, 40 dogs completed the study: 50% survived and left the veterinary clinic. The SOFA score was computed for each dog every 24 hours for the first 3 days of ICU stay, starting on the day of admission. A statistically significant correlation between the SOFA score and death or survival was found. Most of the dogs showing an increase in the SOFA score over the first 3 days of hospitalization died, whereas the dogs with a decrease in the score survived. These results suggest that the SOFA score system could be considered a useful indicator of prognosis in dogs hospitalized in ICUs.
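
    As a rough illustration, the SOFA total is the sum of six organ-system subscores, each graded 0-4. The subscores below are hypothetical, not data from the study; assigning each subscore requires the clinical grading criteria, which this sketch does not encode.

```python
def sofa_total(subscores):
    """Total SOFA score: the sum of six organ-system subscores, each graded 0-4.
    Order of systems: respiratory, cardiovascular, hepatic, coagulation, renal,
    neurological."""
    assert len(subscores) == 6 and all(0 <= s <= 4 for s in subscores)
    return sum(subscores)

# Hypothetical dog on admission vs. day 3: a rising total accompanied death
# in most of the study's dogs, a falling total accompanied survival
day1 = sofa_total([2, 1, 0, 1, 2, 0])
day3 = sofa_total([3, 2, 1, 1, 3, 1])
print(day1, day3, day3 > day1)  # 6 11 True
```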

  15. Infrastructure Suitability Assessment Modeling for Cloud Computing Solutions

    DTIC Science & Technology

    2011-09-01

Increased implementations of the cloud computing paradigm are dissolving the need to co-locate user and computing power by providing desired services through technologies such as the widespread availability of fast computer networks and inexpensive computing power provided by small-form-factor servers…

  16. Quality Is the Key: Critical Issues in Teaching, Learning and Assessment in Vocational Education and Training

    ERIC Educational Resources Information Center

    Mitchell, John; Chappell, Clive; Bateman, Andrea; Roy, Susan

    2006-01-01

    The main finding from research conducted in 2005 into the critical issues in teaching, learning and assessment in vocational education and training (VET) was that "quality is the major issue." While the research identified many issues--such as the need for providers to be increasingly flexible and responsive in meeting the multiple…

  17. Critical Thinking and Formative Assessments: Increasing the Rigor in Your Classroom

    ERIC Educational Resources Information Center

    Moore, Betsy; Stanley, Todd

    2010-01-01

    Develop your students' critical thinking skills and prepare them to perform competitively in the classroom, on state tests, and beyond. In this book, Moore and Stanley show you how to effectively instruct your students to think on higher levels, and how to assess their progress. As states move toward common achievement standards, teachers have…

  18. Developing Institutional Standards for Critical Thinking Using the Collegiate Learning Assessment. Research Brief

    ERIC Educational Resources Information Center

    Hardison, Chaitra M.; Vilamovska, Anna-Marie

    2009-01-01

    The Collegiate Learning Assessment (CLA) measures students' critical thinking skills, but some institutions remain uncertain how to interpret the results. RAND researchers designed a method that institutions can use to develop their own standards. It consists of a three-step process and a system of checks to validate the results. This method will…

  20. The Promotion of Critical Thinking Skills in School-Based Assessment (SBA)

    ERIC Educational Resources Information Center

    Kamarulzaman, Wirawani; Kamarulzaman, Wirawahida

    2016-01-01

The new curriculum introduced for Malaysian primary students, the Primary School Standard Curriculum (Kurikulum Standard Sekolah Rendah, KSSR), together with school-based assessment (SBA), is a step taken by the Malaysian government to encourage thinking skills in students, specifically critical thinking skills. The study explores teachers'…

  1. Connecting Assessment and Instruction to Help Students Become More Critical Producers of Multimedia

    ERIC Educational Resources Information Center

    Ostenson, Jonathan William

    2012-01-01

    Classroom teachers have been encouraged to incorporate more multimedia production in the classroom as a means of helping students develop critical media literacy skills. However, they have not always been well trained in how to evaluate the work students create; many teachers struggle to know which criteria to use in assessing student work. This…

  2. Development of Critical Thinking Self-Assessment System Using Wearable Device

    ERIC Educational Resources Information Center

    Gotoh, Yasushi

    2015-01-01

    In this research the author defines critical thinking as skills and dispositions which enable one to solve problems logically and to attempt to reflect autonomously by means of meta-cognitive activities on one's own problem-solving processes. The author focuses on providing meta-cognitive knowledge to help with self-assessment. To develop…

  3. A Study on Critical Thinking Assessment System of College English Writing

    ERIC Educational Resources Information Center

    Dong, Tian; Yue, Lu

    2015-01-01

    This research attempts to discuss the validity of introducing the evaluation of students' critical thinking skills (CTS) into the assessment system of college English writing through an empirical study. In this paper, 30 College English Test Band 4 (CET-4) writing samples were collected and analyzed. Students' CTS and the final scores of collected…

  4. Critical Thinking and Political Participation: The Development and Assessment of a Causal Model.

    ERIC Educational Resources Information Center

    Guyton, Edith M.

    An assessment of a four-stage conceptual model reveals that critical thinking has indirect positive effects on political participation through its direct effects on personal control, political efficacy, and democratic attitudes. The model establishes causal relationships among selected personality variables (self-esteem, personal control, and…

  5. Approaches Used by Faculty to Assess Critical Thinking--Implications for General Education

    ERIC Educational Resources Information Center

    Nicholas, Mark; Raider-Roth, Miriam

    2011-01-01

This investigation focused on a group of 17 faculty drawn from disciplines in the humanities, social sciences, and natural sciences. Using in-depth interviews, focus group discussions, and qualitative coding strategies, this study examined how faculty conceptualized the term critical thinking (CT) and how they assessed for it in general education…

  8. Life cycle assessment study of a Chinese desktop personal computer.

    PubMed

    Duan, Huabo; Eugster, Martin; Hischier, Roland; Streicher-Porte, Martin; Li, Jinhui

    2009-02-15

Associated with the tremendous prosperity of the world electronic information and telecommunication industry, there is an increasing awareness of the environmental impacts related to the accelerating mass production, electricity use, and waste management of electronic and electric products (e-products). China's importance as both a consumer and supplier of e-products has grown at an unprecedented pace in the recent decade. Hence, this paper describes the application of life cycle assessment (LCA) to investigate the environmental performance of Chinese e-products from a global level. A desktop personal computer system was selected for a detailed and modular LCA following the ISO 14040 series. The LCA was constructed with SimaPro software version 7.0 and expressed with the Eco-indicator'99 life cycle impact assessment method. For a sensitivity analysis of the overall LCA results, the so-called CML method was used to estimate the influence of the choice of assessment method on the result. Life cycle inventory information was compiled from the ecoinvent 1.3 databases, combined with literature and field investigations of the present Chinese situation. The LCA study shows that the manufacturing and the use of such devices are of the highest environmental importance. In the manufacturing of such devices, the integrated circuits (ICs) and the liquid crystal display (LCD) are the parts contributing most to the impact. As no other aspects are taken into account during the use phase, its impact is due to the way the electricity is produced. The final process steps, i.e. the end-of-life phase, lead to a clear environmental benefit if a formal and modern, up-to-date technical system is assumed, as in this study.

  9. Physiologic Assessment of Coronary Artery Disease by Cardiac Computed Tomography

    PubMed Central

    Kochar, Minisha

    2013-01-01

Coronary artery disease (CAD) remains the leading cause of death and morbidity worldwide. To date, diagnostic evaluation of patients with suspected CAD has relied upon the use of physiologic non-invasive testing by stress electrocardiography, echocardiography, myocardial perfusion imaging (MPI) and magnetic resonance imaging. Indeed, the importance of physiologic evaluation of CAD has been highlighted by large-scale randomized trials that demonstrate the propitious benefit of an integrated anatomic-physiologic evaluation method by performing lesion-specific ischemia assessment by fractional flow reserve (FFR), widely considered the "gold" standard for ischemia assessment, at the time of invasive angiography. Coronary CT angiography (CCTA) has emerged as an attractive non-invasive test for anatomic illustration of the coronary arteries and atherosclerotic plaque. In a series of prospective multicenter trials, CCTA has been proven to have high diagnostic performance for stenosis detection as compared to invasive angiography. Nevertheless, CCTA evaluation of obstructive stenoses is prone to overestimation of severity and, further, detection of stenoses by CCTA does not reliably determine the hemodynamic significance of the visualized lesions. Recently, a series of technological innovations have advanced the possibility of CCTA to enable physiologic evaluation of CAD, thereby creating the potential of this test to provide an integrated anatomic-physiologic assessment of CAD. These advances include rest-stress MPI by CCTA as well as the use of computational fluid dynamics to non-invasively calculate FFR from a typically acquired CCTA. The purpose of this review is to summarize the most recent data addressing these 2 physiologic methods of CAD evaluation by CCTA. PMID:23964289

  10. The Computing Alliance of Hispanic-Serving Institutions: Supporting Hispanics at Critical Transition Points

    ERIC Educational Resources Information Center

    Gates, Ann Quiroz; Hug, Sarah; Thiry, Heather; Alo, Richard; Beheshti, Mohsen; Fernandez, John; Rodriguez, Nestor; Adjouadi, Malek

    2011-01-01

    Hispanics have the highest growth rates among all groups in the U.S., yet they remain considerably underrepresented in computing careers and in the numbers who obtain advanced degrees. Hispanics constituted about 7% of undergraduate computer science and computer engineering graduates and 1% of doctoral graduates in 2007-2008. The small number of…

  11. Understanding the Critics of Educational Technology: Gender Inequities and Computers 1983-1993.

    ERIC Educational Resources Information Center

    Mangione, Melissa

    Although many view computers purely as technological tools to be utilized in the classroom and workplace, attention has been drawn to the social differences computers perpetuate, including those of race, class, and gender. This paper focuses on gender and computing by examining recent analyses in regards to content, form, and usage concerns. The…

  12. Reshaping Computer Literacy Teaching in Higher Education: Identification of Critical Success Factors

    ERIC Educational Resources Information Center

    Taylor, Estelle; Goede, Roelien; Steyn, Tjaart

    2011-01-01

    Purpose: Acquiring computer skills is more important today than ever before, especially in a developing country. Teaching of computer skills, however, has to adapt to new technology. This paper aims to model factors influencing the success of the learning of computer literacy by means of an e-learning environment. The research question for this…

  15. An assessment of criticality safety at the Department of Energy Rocky Flats Plant, Golden, Colorado, July--September 1989

    SciTech Connect

    Mattson, Roger J.

    1989-09-01

    This is a report on the 1989 independent Criticality Safety Assessment of the Rocky Flats Plant, primarily in response to public concerns that nuclear criticality accidents involving plutonium may have occurred at this nuclear weapon component fabrication and processing plant. The report evaluates environmental issues, fissile material storage practices, ventilation system problem areas, and criticality safety practices. While no evidence of a criticality accident was found, several recommendations are made for criticality safety improvements. 9 tabs.

  16. Combination of inquiry learning model and computer simulation to improve mastery concept and the correlation with critical thinking skills (CTS)

    NASA Astrophysics Data System (ADS)

    Nugraha, Muhamad Gina; Kaniawati, Ida; Rusdiana, Dadi; Kirana, Kartika Hajar

    2016-02-01

    Among the purposes of physics learning at high school are to master physics concepts, cultivate a scientific (including critical) attitude, and develop inductive and deductive reasoning skills. According to Ennis et al., inductive and deductive reasoning skills are part of critical thinking. Based on preliminary studies, both competencies are poorly achieved, as seen from low student learning outcomes and from learning processes that are not conducive to cultivating critical thinking (teacher-centered learning). One learning model predicted to increase concept mastery and train CTS is the inquiry learning model aided by computer simulations. In this model, students were given the opportunity to be actively involved in experiments and were also given good explanations through the computer simulations. From research with a randomized control group pretest-posttest design, we found that the inquiry learning model aided by computer simulations improved students' concept mastery significantly more than the conventional (teacher-centered) method. With the inquiry learning model aided by computer simulations, 20% of students showed high CTS, 63.3% medium, and 16.7% low. CTS contributed greatly to students' concept mastery, with a correlation coefficient of 0.697, and contributed moderately to the enhancement of concept mastery, with a correlation coefficient of 0.603.
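
The correlation coefficients reported in this record (0.697 and 0.603) are presumably Pearson product-moment coefficients. A self-contained sketch of how such a coefficient is computed, using hypothetical CTS and concept-mastery scores (not the study's data):

```python
import math

def pearson_r(xs, ys):
    # Pearson product-moment correlation: covariance / (std_x * std_y)
    n = len(xs)
    mx = sum(xs) / n
    my = sum(ys) / n
    cov = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
    sx = math.sqrt(sum((x - mx) ** 2 for x in xs))
    sy = math.sqrt(sum((y - my) ** 2 for y in ys))
    return cov / (sx * sy)

# hypothetical paired scores for five students
cts_scores = [55, 60, 70, 80, 85]
mastery_scores = [50, 58, 72, 78, 90]
r = pearson_r(cts_scores, mastery_scores)
```

A coefficient near 0.7, as reported, indicates a strong positive linear association between critical thinking skills and concept mastery.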

  17. Benchmark Problems Used to Assess Computational Aeroacoustics Codes

    NASA Technical Reports Server (NTRS)

    Dahl, Milo D.; Envia, Edmane

    2005-01-01

    The field of computational aeroacoustics (CAA) encompasses numerical techniques for calculating all aspects of sound generation and propagation in air directly from fundamental governing equations. Aeroacoustic problems typically involve flow-generated noise, with and without the presence of a solid surface, and the propagation of the sound to a receiver far away from the noise source. It is a challenge to obtain accurate numerical solutions to these problems. The NASA Glenn Research Center has been at the forefront of developing and promoting CAA techniques and methodologies for computing the noise generated by aircraft propulsion systems. To assess the technological advancement of CAA, Glenn, in cooperation with the Ohio Aerospace Institute and the AeroAcoustics Research Consortium, organized and hosted the Fourth CAA Workshop on Benchmark Problems. Participants from industry and academia from both the United States and abroad joined to present and discuss solutions to benchmark problems. These demonstrated technical progress ranging from meeting the basic challenges of accurate CAA calculations to solving CAA problems of increasing complexity and difficulty. The results are documented in the proceedings of the workshop. Problems were solved in five categories. In three of the five categories, exact solutions were available for comparison with CAA results. A fourth category of problems representing sound generation from either a single airfoil or a blade row interacting with a gust (i.e., problems relevant to fan noise) had approximate analytical or completely numerical solutions. The fifth category of problems involved sound generation in a viscous flow. In this case, the CAA results were compared with experimental data.
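
For the benchmark categories with exact solutions, CAA codes are typically scored by an error norm between the numerical and exact solutions on the computational grid. A sketch of a discrete L2 error metric (the workshop's actual scoring procedure is not given in this record; the grid, exact solution, and perturbation below are illustrative):

```python
import math

def l2_error(numerical, exact, dx):
    # discrete L2 norm of the pointwise error, a common benchmark-comparison metric
    return math.sqrt(dx * sum((n - e) ** 2 for n, e in zip(numerical, exact)))

# hypothetical 1D grid with an exact sinusoidal acoustic field
dx = 0.01
xs = [i * dx for i in range(101)]
exact = [math.sin(2 * math.pi * x) for x in xs]
numerical = [e + 1e-3 for e in exact]  # pretend solver output with a uniform 1e-3 error
err = l2_error(numerical, exact, dx)
```

Halving the grid spacing and recomputing this norm is how a scheme's formal order of accuracy is verified against a benchmark's exact solution.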

  18. Assessment of metabolic bone diseases by quantitative computed tomography

    NASA Technical Reports Server (NTRS)

    Richardson, M. L.; Genant, H. K.; Cann, C. E.; Ettinger, B.; Gordan, G. S.; Kolb, F. O.; Reiser, U. J.

    1985-01-01

    Advances in the radiologic sciences have permitted the development of numerous noninvasive techniques for measuring the mineral content of bone, with varying degrees of precision, accuracy, and sensitivity. The techniques of standard radiography, radiogrammetry, photodensitometry, Compton scattering, neutron activation analysis, single and dual photon absorptiometry, and quantitative computed tomography (QCT) are described and reviewed in depth. Results from previous cross-sectional and longitudinal QCT investigations are given. The authors then describe a current investigation in which they studied 269 subjects, including 173 normal women, 34 patients with hyperparathyroidism, 24 patients with steroid-induced osteoporosis, and 38 men with idiopathic osteoporosis. Spinal quantitative computed tomography, radiogrammetry, and single photon absorptiometry were performed, and a spinal fracture index was calculated on all patients. The authors found a disproportionate loss of spinal trabecular mineral compared to appendicular mineral in the men with idiopathic osteoporosis and the patients with steroid-induced osteoporosis. They observed roughly equivalent mineral loss in both the appendicular and axial regions in the hyperparathyroid patients. The appendicular cortical measurements correlated moderately well with each other but less well with spinal trabecular QCT. The spinal fracture index correlated well with QCT and less well with the appendicular measurements. Knowledge of appendicular cortical mineral status is important in its own right but is not a valid predictor of axial trabecular mineral status, which may be disproportionately decreased in certain diseases. Quantitative CT provides a reliable means of assessing the latter region of the skeleton, correlates well with the spinal fracture index (a semiquantitative measurement of end-organ failure), and offers the clinician a sensitive means of following the effects of therapy.
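
Spinal QCT conventionally converts measured CT numbers (Hounsfield units) in a vertebral region of interest to mineral-equivalent density via a linear calibration against a phantom of known densities scanned with the patient. A sketch with hypothetical phantom values (the study's actual calibration procedure and numbers are not given in this record):

```python
def fit_line(xs, ys):
    # least-squares line y = a*x + b, fitted to the phantom calibration points
    n = len(xs)
    mx = sum(xs) / n
    my = sum(ys) / n
    a = sum((x - mx) * (y - my) for x, y in zip(xs, ys)) / sum((x - mx) ** 2 for x in xs)
    b = my - a * mx
    return a, b

# hypothetical phantom inserts: known mineral density (mg/cm^3) vs measured HU
phantom_density = [0.0, 50.0, 100.0, 200.0]
phantom_hu = [2.0, 42.0, 82.0, 162.0]
a, b = fit_line(phantom_hu, phantom_density)

# convert a hypothetical vertebral ROI reading of 100 HU to mineral density
vertebral_bmd = a * 100.0 + b
```

Because the phantom travels with each patient, this per-scan calibration compensates for scanner drift between examinations, which is what makes QCT suitable for longitudinal follow-up of therapy.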

  20. Intelligent computer based reliability assessment of multichip modules

    NASA Astrophysics Data System (ADS)

    Grosse, Ian R.; Katragadda, Prasanna; Bhattacharya, Sandeepan; Kulkarni, Sarang

    1994-04-01

    To deliver reliable multichip modules (MCMs) in the face of rapidly changing technology, computer-based tools are needed for predicting the thermal and mechanical behavior of various MCM package designs and selecting the most promising design in terms of performance, robustness, and reliability. The design tool must be able to address new design technologies, manufacturing processes, novel materials, application criteria, and thermal environmental conditions. Reliability is one of the most important factors for determining design quality and hence must be a central consideration in the design of multichip module packages. Clearly, design engineers need computer-based simulation tools for rapid and efficient electrical, thermal, and mechanical modeling and optimization of advanced devices. For three-dimensional thermal and mechanical simulation of advanced devices, the finite element method (FEM) is increasingly becoming the numerical method of choice. FEM is a versatile and sophisticated numerical technique for solving the partial differential equations that describe the physical behavior of complex designs. AUTOTHERM(TM) is an MCM design tool developed by Mentor Graphics for Motorola, Inc. This tool performs thermal analysis of MCM packages using finite element analysis techniques. The tool uses an object-oriented representation of components and a simplified specification of boundary conditions for the thermal analysis, so that the user need not be an expert in finite element techniques. Different package types can be assessed and environmental conditions can be modeled. It also includes a detailed reliability module which allows the user to choose a desired failure mechanism (model).
All the current tools perform thermal and/or stress analysis but do not address the issues of robustness and optimality of MCM designs; moreover, the reliability prediction techniques are based on closed-form analytical models and can often fail to predict the cycles of failure (N
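
As a minimal illustration of the finite element method described in this record (a sketch only, not AUTOTHERM's implementation), the following solves 1D steady-state heat conduction, -k u'' = q on (0, L) with u(0) = u(L) = 0, using linear elements and a tridiagonal (Thomas) solve:

```python
def solve_heat_1d(n_elems, length, k, q):
    """Interior nodal temperatures for -k u'' = q with zero end temperatures."""
    n = n_elems - 1                    # number of interior nodes
    h = length / n_elems
    # linear-element stiffness matrix is tridiagonal: (k/h) * [-1, 2, -1];
    # the load vector for a uniform source q is q*h at each interior node
    diag = [2.0 * k / h] * n
    off = [-k / h] * (n - 1)
    rhs = [q * h] * n
    # forward elimination (Thomas algorithm)
    for i in range(1, n):
        m = off[i - 1] / diag[i - 1]
        diag[i] -= m * off[i - 1]
        rhs[i] -= m * rhs[i - 1]
    # back substitution
    u = [0.0] * n
    u[-1] = rhs[-1] / diag[-1]
    for i in range(n - 2, -1, -1):
        u[i] = (rhs[i] - off[i] * u[i + 1]) / diag[i]
    return u

# 10 elements on a unit domain with unit conductivity and unit source
temps = solve_heat_1d(10, 1.0, 1.0, 1.0)
```

For this problem the exact solution is u(x) = q x (L - x) / (2k), and linear elements reproduce it exactly at the nodes, so the midpoint value is q L^2 / (8k). Real MCM thermal analysis extends the same assemble-and-solve pattern to 3D geometry, material interfaces, and convective boundary conditions.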