Science.gov

Sample records for computer criticality assessments

  1. Making Student Thinking Visible through a Concept Map in Computer-Based Assessment of Critical Thinking

    ERIC Educational Resources Information Center

    Rosen, Yigal; Tager, Maryam

    2014-01-01

    Major educational initiatives in the world place great emphasis on fostering rich computer-based assessment environments that make student thinking and reasoning visible. Using thinking tools engages students in a variety of critical and complex thinking skills, such as evaluating, analyzing, and decision making. The aim of this study was to explore…

  2. Content Analysis in Computer-Mediated Communication: Analyzing Models for Assessing Critical Thinking through the Lens of Social Constructivism

    ERIC Educational Resources Information Center

    Buraphadeja, Vasa; Dawson, Kara

    2008-01-01

    This article reviews content analysis studies aimed at assessing critical thinking in computer-mediated communication. It also discusses theories and content analysis models that encourage critical thinking skills in asynchronous learning environments and reviews theories and factors that may foster critical thinking skills and new knowledge…

  3. Critical assessment of nucleic acid electrostatics via experimental and computational investigation of an unfolded state ensemble

    PubMed Central

    Bai, Yu; Chu, Vincent B.; Lipfert, Jan; Pande, Vijay S.; Herschlag, Daniel; Doniach, Sebastian

    2010-01-01

    Electrostatic forces, acting between helices and modulated by the presence of the ion atmosphere, are key determinants in the energetic balance that governs RNA folding. Previous studies have employed Poisson-Boltzmann (PB) theory to compute the energetic contribution of these forces in RNA folding. However, the complex interaction of these electrostatic forces with RNA features such as tertiary contact formation, specific ion-binding, and complex interhelical junctions present in prior studies precluded a rigorous evaluation of PB theory, especially in physiologically important Mg2+ solutions. To critically assess PB theory, we developed a model system that isolates these electrostatic forces. The model system, composed of two DNA duplexes tethered by a polyethylene glycol junction, is an analog for the unfolded state of canonical helix-junction-helix motifs found in virtually all structured RNAs. This model system lacks the complicating features that have precluded a critical assessment of PB in prior studies, ensuring that interhelical electrostatic forces dominate the behavior of the system. The system’s simplicity allows PB predictions to be directly compared with small angle x-ray scattering experiments over a range of monovalent and divalent ion concentrations. These comparisons indicate that PB is a reasonable description of the underlying electrostatic energies for monovalent ions, but large deviations are observed for divalent ions. The validation of PB for monovalent solutions allows analysis of the change in the conformational ensemble of this simple motif as salt concentration is changed. Addition of ions allows the motif to sample more compact microstates, increasing its conformational entropy. The increase of conformational entropy presents an additional barrier to folding by stabilizing the unfolded state. Neglecting this effect will adversely impact the accuracy of folding analyses and models. PMID:18722445
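    For a rough sense of scale in the electrostatics discussed above, the following sketch computes the Debye screening length of a 1:1 salt from the standard formula. This is textbook background rather than the paper's PB machinery, and the concentrations are arbitrary.

      import math

      def debye_length_nm(c_molar, T=298.15, eps_r=78.5):
          """Debye length (nm) for a 1:1 electrolyte at molar concentration c."""
          e = 1.602176634e-19        # elementary charge, C
          kB = 1.380649e-23          # Boltzmann constant, J/K
          NA = 6.02214076e23         # Avogadro's number, 1/mol
          eps0 = 8.8541878128e-12    # vacuum permittivity, F/m
          n = c_molar * 1000 * NA    # ions of each sign per m^3
          kappa_sq = 2 * n * e**2 / (eps0 * eps_r * kB * T)
          return 1e9 / math.sqrt(kappa_sq)

      for c in (0.01, 0.1, 1.0):
          print(f"{c:5.2f} M 1:1 salt: Debye length = {debye_length_nm(c):.2f} nm")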

  4. Computer-Based Assessment in Safety-Critical Industries: The Case of Shipping

    ERIC Educational Resources Information Center

    Gekara, Victor Oyaro; Bloor, Michael; Sampson, Helen

    2011-01-01

    Vocational education and training (VET) concerns the cultivation and development of specific skills and competencies, in addition to broad underpinning knowledge relating to paid employment. VET assessment is, therefore, designed to determine the extent to which a trainee has effectively acquired the knowledge, skills, and competencies required by…

  5. AVLIS Criticality risk assessment

    SciTech Connect

    Brereton, S.J., LLNL

    1998-04-29

    Product and tails streams (uranium enriched in U-235 and uranium depleted in U-235) are cooled and accumulated in solid metallic form in canisters. The collected product and tails material is weighed and transferred into certified, criticality-safe shipping containers (DOT specification 6M with 2R containment vessel). These will be temporarily stored and then shipped offsite, either for use by a fuel fabricator or for disposal. Tails material will be packaged for disposal. A criticality risk assessment was performed for AVLIS IPD runs. In this analysis, the likelihood of occurrence of a criticality was examined. For the AVLIS process, a number of areas have been specifically examined to assess whether or not the frequency of occurrence of a criticality is credible (frequency of occurrence > 10^-6/yr). In this paper, we discuss only two of those areas: the separator and canister operations.
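    The credibility screen quoted above (frequency of occurrence > 10^-6/yr) is typically applied to accident sequences whose frequency is an initiating-event rate multiplied by conditional failure probabilities. A minimal sketch of that arithmetic, with an entirely hypothetical sequence and invented numbers:

      CREDIBILITY_THRESHOLD = 1e-6   # events per year

      def sequence_frequency(initiator_per_yr, *conditional_probs):
          """Initiating-event frequency times conditional failure probabilities."""
          freq = initiator_per_yr
          for p in conditional_probs:
              freq *= p
          return freq

      # Hypothetical sequence: a mass-control error must occur AND escape
      # independent verification AND coincide with degraded geometry control.
      f = sequence_frequency(1e-2,   # mass-control error, per year
                             1e-2,   # verification fails to catch it
                             1e-3)   # geometry/moderation control also lost
      print(f"sequence frequency = {f:.1e}/yr, credible: {f > CREDIBILITY_THRESHOLD}")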

  6. Carahunge - A Critical Assessment

    NASA Astrophysics Data System (ADS)

    González-García, A. César

    Carahunge is a megalithic monument in southern Armenia that has often been acclaimed as the oldest observatory. The monument, composed of dozens of standing stones, includes some perforated stones. The directions of the holes have been measured, and their orientations related to the sun, moon, and stars to obtain a date for the construction of such devices. After a critical review of the methods and conclusions, these claims are shown to be untenable.

  7. Critical care procedure logging using handheld computers

    PubMed Central

    Carlos Martinez-Motta, J; Walker, Robin; Stewart, Thomas E; Granton, John; Abrahamson, Simon; Lapinsky, Stephen E

    2004-01-01

    Introduction We conducted this study to evaluate the feasibility of implementing an internet-linked handheld computer procedure logging system in a critical care training program. Methods Subspecialty trainees in the Interdepartmental Division of Critical Care at the University of Toronto received and were trained in the use of Palm handheld computers loaded with a customized program for logging critical care procedures. The procedures were entered into the handheld device using checkboxes and drop-down lists, and data were uploaded to a central database via the internet. To evaluate the feasibility of this system, we tracked the utilization of this data collection system. Benefits and disadvantages were assessed through surveys. Results All 11 trainees successfully uploaded data to the central database, but only six (55%) continued to upload data on a regular basis. The most common reason cited for not using the system pertained to initial technical problems with data uploading. From 1 July 2002 to 30 June 2003, a total of 914 procedures were logged. Significant variability was noted in the number of procedures logged by individual trainees (range 13–242). The database generated by regular users provided potentially useful information to the training program director regarding the scope and location of procedural training among the different rotations and hospitals. Conclusion A handheld computer procedure logging system can be effectively used in a critical care training program. However, user acceptance was not uniform, and continued training and support are required to increase user acceptance. Such a procedure database may provide valuable information that may be used to optimize trainees' educational experience and to document clinical training experience for licensing and accreditation. PMID:15469577

  8. Characterizing the state of the art in the computational assignment of gene function: lessons from the first critical assessment of functional annotation (CAFA)

    PubMed Central

    2013-01-01

    The assignment of gene function remains a difficult but important task in computational biology. The establishment of the first Critical Assessment of Functional Annotation (CAFA) was aimed at increasing progress in the field. We present an independent analysis of the results of CAFA, aimed at identifying challenges in assessment and at understanding trends in prediction performance. We found that well-accepted methods based on sequence similarity (i.e., BLAST) have a dominant effect. Many of the most informative predictions turned out to be either recovering existing knowledge about sequence similarity or were "post-dictions" already documented in the literature. These results indicate that deep challenges remain in even defining the task of function assignment, with a particular difficulty posed by the problem of defining function in a way that is not dependent on either flawed gold standards or the input data itself. In particular, we suggest that using the Gene Ontology (or other similar systematizations of function) as a gold standard is unlikely to be the way forward. PMID:23630983
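    Assessments of this kind ultimately reduce to comparing predicted and annotated term sets per protein. A toy sketch of precision/recall scoring with fabricated GO terms (CAFA's actual metrics are more elaborate):

      def precision_recall(predicted: set, annotated: set):
          """Term-overlap precision and recall for one protein."""
          tp = len(predicted & annotated)
          precision = tp / len(predicted) if predicted else 0.0
          recall = tp / len(annotated) if annotated else 0.0
          return precision, recall

      predicted = {"GO:0003677", "GO:0005634", "GO:0006355"}   # hypothetical
      annotated = {"GO:0003677", "GO:0006355", "GO:0043565"}   # hypothetical
      p, r = precision_recall(predicted, annotated)
      f1 = 2 * p * r / (p + r) if (p + r) else 0.0
      print(f"precision={p:.2f} recall={r:.2f} F1={f1:.2f}")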

  9. Computer Resources Handbook for Flight Critical Systems.

    DTIC Science & Technology

    1985-01-01

    in avionic systems are suspected of being due to software. In a study of software reliability for digital flight controls conducted by SoHaR for the...aircraft and flight crew -- the use of computers in flight critical applications. Special reliability and fault tolerance (RAFT) techniques are being used...tolerance in flight critical systems. Conventional reliability techniques and analysis and reliability improvement techniques at the system level are

  10. NASA Critical Facilities Maintenance Assessment

    NASA Technical Reports Server (NTRS)

    Oberhettinger, David J.

    2006-01-01

    Critical Facilities Maintenance Assessment (CFMA) was first implemented by NASA following the March 2000 overtest of the High Energy Solar Spectroscopic Imager (HESSI) spacecraft. A sine burst dynamic test using a 40-year-old shaker failed: mechanical binding/slippage of the slip table imparted 10 times the planned force to the test article, causing major structural damage to HESSI. The mechanical "health" of the shaker had not been assessed and tracked to assure the test equipment was in good working order. Similar incidents have occurred at NASA facilities due to inadequate maintenance (e.g., rainwater from a leaky roof contaminated an assembly facility that housed a spacecraft). The HESSI incident alerted NASA to the urgent need to identify inadequacies in ground facility readiness and maintenance practices. The consequences of failures of ground facilities that service these NASA systems are severe due to the high unit value of NASA products.

  11. Software Safety Risk in Legacy Safety-Critical Computer Systems

    NASA Technical Reports Server (NTRS)

    Hill, Janice; Baggs, Rhoda

    2007-01-01

    Safety-critical computer systems must be engineered to meet system and software safety requirements. For legacy safety-critical computer systems, software safety requirements may not have been formally specified during development. When process-oriented software safety requirements are levied on a legacy system after the fact, where software development artifacts don't exist or are incomplete, the question becomes 'how can this be done?' The risks associated with only meeting certain software safety requirements in a legacy safety-critical computer system must be addressed should such systems be selected as candidates for reuse. This paper proposes a method for formally ascertaining a software safety risk assessment that provides measurements of software safety for legacy systems, which may or may not have the suite of software engineering documentation now normally required. It relies upon the NASA Software Safety Standard, risk assessment methods based upon the Taxonomy-Based Questionnaire, and the application of reverse engineering CASE tools to produce original design documents for legacy systems.

  12. Mission critical cloud computing in a week

    NASA Astrophysics Data System (ADS)

    George, B.; Shams, K.; Knight, D.; Kinney, J.

    NASA's vision is to “reach for new heights and reveal the unknown so that what we do and learn will benefit all humankind.” While our missions provide large volumes of unique and invaluable data to the scientific community, they also serve to inspire and educate the next generation of engineers and scientists. One critical aspect of “benefiting all humankind” is to make our missions as visible and accessible as possible to facilitate the transfer of scientific knowledge to the public. The recent successful landing of the Curiosity rover on Mars exemplified this vision: we shared the landing event via live video streaming and web experiences with millions of people around the world. The video stream on Curiosity's website was delivered by a highly scalable stack of computing resources in the cloud to cache and distribute the video stream to our viewers. While this work was done in the context of public outreach, it has extensive implications for the development of mission critical, highly available, and elastic applications in the cloud for a diverse set of use cases across NASA.

  13. Assessment of Critical-Analytic Thinking

    ERIC Educational Resources Information Center

    Brown, Nathaniel J.; Afflerbach, Peter P.; Croninger, Robert G.

    2014-01-01

    National policy and standards documents, including the National Assessment of Educational Progress frameworks, the "Common Core State Standards" and the "Next Generation Science Standards," assert the need to assess critical-analytic thinking (CAT) across subject areas. However, assessment of CAT poses several challenges for…

  14. Assessing Postgraduate Students' Critical Thinking Ability

    ERIC Educational Resources Information Center

    Javed, Muhammad; Nawaz, Muhammad Atif; Qurat-Ul-Ain, Ansa

    2015-01-01

    This paper assesses the critical thinking ability of postgraduate students. The target population was male and female university-level students in Pakistan. A small sample of 45 male and 45 female students was selected randomly from The Islamia University of Bahawalpur, Pakistan. Cornell Critical Thinking Test Series, The…

  15. Equivalent damage: A critical assessment

    NASA Technical Reports Server (NTRS)

    Laflen, J. R.; Cook, T. S.

    1982-01-01

    Concepts in equivalent damage were evaluated to determine their applicability to the life prediction of hot path components of aircraft gas turbine engines. Equivalent damage was defined as being those effects which influence the crack initiation life-time beyond the damage that is measured in uniaxial, fully-reversed sinusoidal and isothermal experiments at low homologous temperatures. Three areas of equivalent damage were examined: mean stress, cumulative damage, and multiaxiality. For each area, a literature survey was conducted to aid in selecting the most appropriate theories. Where possible, data correlations were also used in the evaluation process. A set of criteria was developed for ranking the theories in each equivalent damage regime. These criteria considered aspects of engine utilization as well as the theoretical basis and correlative ability of each theory. In addition, consideration was given to the complex nature of the loading cycle at fatigue critical locations of hot path components; this loading includes non-proportional multiaxial stressing, combined temperature and strain fluctuations, and general creep-fatigue interactions. Through applications of selected equivalent damage theories to some suitable data sets it was found that there is insufficient data to allow specific recommendations of preferred theories for general applications. A series of experiments and areas of further investigations were identified.

  16. Assessment of critical thinking: a Delphi study.

    PubMed

    Paul, Sheila A

    2014-11-01

    Nurse educators are responsible for preparing nurses who critically analyze patient information and provide meaningful interventions in today's complex health care system. By using the Delphi research method, this study utilized the specialized and experiential knowledge of Certified Nurse Educators. This original Delphi research study asked Certified Nurse Educators how to assess the critical-thinking ability of nursing students in the clinical setting. The results showed that nurse educators need time, during the clinical experience, to accurately assess each individual nursing student. This study demonstrated the need for extended student clinical time and a variety of clinical learning assessment tools.

  17. Recent Use of Covariance Data for Criticality Safety Assessment

    SciTech Connect

    Rearden, Bradley T; Mueller, Don

    2008-01-01

    The TSUNAMI codes of the Oak Ridge National Laboratory SCALE code system were applied to a burnup credit application to demonstrate the use of sensitivity and uncertainty analysis with recent cross section covariance data for criticality safety code and data validation. The use of sensitivity and uncertainty analysis provides for the assessment of a defensible computational bias, bias uncertainty, and gap analysis for a complex system that otherwise could be assessed only through the use of expert judgment and conservative assumptions.
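    The computational bias and uncertainty mentioned here rest on the "sandwich rule", var(k) = S C S^T, which combines sensitivity coefficients with cross-section covariance data. A minimal sketch with invented numbers, not SCALE/TSUNAMI output:

      import numpy as np

      S = np.array([0.31, -0.12, 0.05])        # dk/k per dσ/σ, hypothetical
      C = np.array([[4.0e-4, 1.0e-5, 0.0],     # relative covariance matrix,
                    [1.0e-5, 9.0e-4, 0.0],     # hypothetical values
                    [0.0,    0.0,    2.5e-3]])

      var_k = S @ C @ S.T                      # relative variance of k-eff
      print(f"k-eff uncertainty from nuclear data: {np.sqrt(var_k) * 100:.3f}% dk/k")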

  18. Reliability of assessment of critical thinking.

    PubMed

    Allen, George D; Rubenfeld, M Gaie; Scheffer, Barbara K

    2004-01-01

    Although clinical critical thinking skills and behaviors are among the most highly sought characteristics of BSN graduates, they remain among the most difficult to teach and assess. Three reasons for this difficulty have been (1) lack of agreement among nurse educators as to the definition of critical thinking, (2) low correlation between clinical critical thinking and existing standardized tests of critical thinking, and (3) poor reliability in scoring other evidence of critical thinking, such as essays. This article first describes a procedure for teaching critical thinking that is based on a consensus definition of 17 dimensions of critical thinking in clinical nursing practice. This procedure is easily taught to nurse educators and can be flexibly and inexpensively incorporated into any undergraduate nursing curriculum. We then show that students' understanding and use of these dimensions can be assessed with high reliability (coefficient alpha between 0.7 and 0.8) and with great time efficiency for both teachers and students. By using this procedure iteratively across semesters, students can develop portfolios demonstrating attainment of competence in clinical critical thinking, and educators can obtain important summary evaluations of the degree to which their graduates have succeeded in this important area of their education.
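    The reliability statistic quoted above (coefficient alpha between 0.7 and 0.8) is Cronbach's alpha. A minimal sketch of its computation on a fabricated ratings matrix:

      import numpy as np

      def cronbach_alpha(scores):
          """scores: (n_students, n_items) array of item ratings."""
          scores = np.asarray(scores, dtype=float)
          k = scores.shape[1]
          item_vars = scores.var(axis=0, ddof=1).sum()
          total_var = scores.sum(axis=1).var(ddof=1)
          return k / (k - 1) * (1 - item_vars / total_var)

      ratings = np.array([[3, 4, 3, 5],     # invented portfolio ratings
                          [2, 2, 3, 2],
                          [4, 5, 4, 4],
                          [3, 3, 2, 3],
                          [5, 4, 5, 5]])
      print(f"alpha = {cronbach_alpha(ratings):.2f}")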

  19. To assess the reparative ability of differentiated mesenchymal stem cells in a rat critical size bone repair defect model using high frequency co-registered photoacoustic/ultrasound imaging and micro computed tomography

    NASA Astrophysics Data System (ADS)

    Zafar, Haroon; Gaynard, Sean; O'Flatharta, Cathal; Doroshenkova, Tatiana; Devine, Declan; Sharif, Faisal; Barry, Frank; Hayes, Jessica; Murphy, Mary; Leahy, Martin J.

    2016-03-01

    Stem cell based treatments hold great potential and promise to address many unmet clinical needs. Non-invasive imaging techniques to monitor transplanted stem cells qualitatively and quantitatively are therefore crucial. The objective of this study was to create a critical size bone defect in the rat femur and then assess the ability of differentiated mesenchymal stem cells (MSCs) to repair the defect using high frequency co-registered photoacoustic (PA)/ultrasound (US) imaging and micro computed tomography (μCT) over an 8 week period. Combined PA and US imaging was performed using a 256-element, 21 MHz linear-array transducer combined with a multichannel collecting system. In vivo 3D PA and US images of the defect bone in the rat femur were acquired 4 and 8 weeks after surgery. Co-registered 3D structural images, such as microvasculature, and functional images, such as total haemoglobin concentration (HbT) and haemoglobin oxygen saturation (sO2), were obtained using PA and US imaging. Bone formation was assessed 4 and 8 weeks after surgery by μCT. High frequency linear-array based co-registered PA/US imaging was found promising in terms of non-invasiveness, sensitivity, adaptability, and high spatial and temporal resolution at sufficient depths for the assessment of the reparative ability of MSCs in a rat critical size bone repair defect model.

  20. Risk-Assessment Computer Program

    NASA Technical Reports Server (NTRS)

    Dias, William C.; Mittman, David S.

    1993-01-01

    RISK D/C is a prototype computer program assisting in attempts to do program risk modeling for Space Exploration Initiative (SEI) architectures proposed in the Synthesis Group Report. Risk assessment is performed with respect to risk events, probabilities, and severities of potential results. It enables ranking, with respect to effectiveness, of risk-mitigation strategies proposed for an exploration program architecture, and allows for the fact that risk assessment in the early phases of planning is subjective. Although specific to SEI in its present form, it can also be used as a software framework for development of risk-assessment programs for other specific uses. Developed for the Macintosh(TM) series of computers; requires HyperCard(TM) 2.0 or later, as well as 2 MB of random-access memory and System 6.0.8 or later.
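    The ranking idea described, scoring risk events and comparing mitigation strategies by the expected risk they remove, can be illustrated with probability-times-severity arithmetic. All events, numbers, and strategies below are invented, not RISK D/C's model:

      events = {                      # event: (probability, severity 1-10)
          "launch vehicle failure": (0.02, 9),
          "life support fault":     (0.05, 7),
          "schedule slip":          (0.30, 3),
      }

      def expected_risk(evts):
          return sum(p * s for p, s in evts.values())

      baseline = expected_risk(events)

      mitigations = {                 # strategy: (event affected, new probability)
          "redundant launch provider":  ("launch vehicle failure", 0.01),
          "extra life-support testing": ("life support fault", 0.02),
      }

      for name, (event, new_p) in mitigations.items():
          modified = dict(events)
          modified[event] = (new_p, events[event][1])
          saved = baseline - expected_risk(modified)
          print(f"{name}: removes {saved:.2f} expected-risk units")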

  1. Mission Critical Computer Resources Management Guide

    DTIC Science & Technology

    1988-09-01

    Excerpts from the table of contents: Chapter 6, Software Test and Evaluation: 6.1 Test Planning; 6.1.1 System Support Computer Resources. Chapter 8, Planning for Computer Software: 8.1 Introduction; 8.2 Plans and Documentation; 8.2.1 Program Management Plan (PMP); 8.2.2 Test and Evaluation Master Plan (TEMP); 8.2.3 …

  2. Critical eigenvalue in LMFBRs: a physics assessment

    SciTech Connect

    McKnight, R.D.; Collins, P.J.; Olsen, D.N.

    1984-01-01

    This paper summarizes recent work to put the analysis of past critical eigenvalue measurements from the US critical experiments program on a consistent basis. The integral data base includes 53 configurations built in 11 ZPPR assemblies which simulate mixed oxide LMFBRs. Both conventional and heterogeneous designs representing 350, 700, and 900 MWe sizes and with and without simulated control rods and/or control rod positions have been studied. The review of the integral data base includes quantitative assessment of experimental uncertainties in the measured excess reactivity. Analyses have been done with design level and higher-order methods using ENDF/B-IV data. Comparisons of these analyses with the experiments are used to generate recommended bias factors for criticality predictions. Recommended methods for analysis of LMFBR fast critical assemblies and LMFBR design calculations are presented. Unresolved issues and areas which require additional experimental or analytical study are identified.

  3. A Synthesis and Survey of Critical Success Factors for Computer Technology Projects

    ERIC Educational Resources Information Center

    Baker, Ross A.

    2012-01-01

    The author investigated the existence of critical success factors for computer technology projects. Current research literature and a survey of experienced project managers indicate that there are 23 critical success factors (CSFs) that correlate with project success. The survey gathered an assessment of project success and the degree to which…

  4. Radiation exposure and risk assessment for critical female body organs

    NASA Technical Reports Server (NTRS)

    Atwell, William; Weyland, Mark D.; Hardy, Alva C.

    1991-01-01

    Space radiation exposure limits for astronauts are based on recommendations of the National Council on Radiation Protection and Measurements. These limits now include the age at exposure and sex of the astronaut. A recently-developed computerized anatomical female (CAF) model is discussed in detail. Computer-generated, cross-sectional data are presented to illustrate the completeness of the CAF model. By applying ray-tracing techniques, shield distribution functions have been computed to calculate absorbed dose and dose equivalent values for a variety of critical body organs (e.g., breasts, lungs, thyroid gland, etc.) and mission scenarios. Specific risk assessments, i.e., cancer induction and mortality, are reviewed.

  5. DOE/EM Criticality Safety Needs Assessment

    SciTech Connect

    Westfall, Robert Michael; Hopper, Calvin Mitchell

    2011-02-01

    The issue of nuclear criticality safety (NCS) in Department of Energy Environmental Management (DOE/EM) fissionable material operations presents challenges because of the large quantities of material present in the facilities and equipment that are committed to storage and/or material conditioning and dispositioning processes. Given the uncertainty associated with the material and conditions for many DOE/EM fissionable material operations, ensuring safety while maintaining operational efficiency requires the application of the most-effective criticality safety practices. In turn, more-efficient implementation of these practices can be achieved if the best NCS technologies are utilized. In 2002, DOE/EM-1 commissioned a survey of criticality safety technical needs at the major EM sites. These needs were documented in the report Analysis of Nuclear Criticality Safety Technology Supporting the Environmental Management Program, issued May 2002. Subsequent to this study, EM safety management personnel made a commitment to applying the best and latest criticality safety technology, as described by the DOE Nuclear Criticality Safety Program (NCSP). Over the past 7 years, this commitment has enabled the transfer of several new technologies to EM operations. In 2008, it was decided to broaden the basis of the EM NCS needs assessment to include not only current needs for technologies but also NCS operational areas with potential for improvements in controls, analysis, and regulations. A series of NCS workshops has been conducted over the past years, and needs have been identified and addressed by EM staff and contractor personnel. These workshops were organized and conducted by the EM Criticality Safety Program Manager with administrative and technical support by staff at Oak Ridge National Laboratory (ORNL). This report records the progress made in identifying the needs, determining the approaches for addressing these needs, and assimilating new NCS technologies into EM

  6. Critical Reflection on Cultural Difference in the Computer Conference

    ERIC Educational Resources Information Center

    Ziegahn, Linda

    2005-01-01

    Adult educators have a strong interest in designing courses that stimulate learning toward critical, more inclusive cultural perspectives. Critical reflection is a key component of both intercultural learning and a growing medium of instruction, the asynchronous computer conference (CC). This study combined qualitative methodology with a framework…

  7. Microdosing: a critical assessment of human data.

    PubMed

    Rowland, Malcolm

    2012-11-01

    Ultrasensitive analytical methodologies have now made possible the ability to characterize the pharmacokinetics (PK) of compounds following administration to humans of a minute, subpharmacologic dose, a microdose. This has the potential to provide pre-IND information to help in early candidate selection, but only if such information is reasonably predictive of PK at pharmacologic doses. The published clinical data in this area are critically assessed and perspectives drawn. The place of microdosing, alone and coupled with other innovative methodologies, both pre-IND and during clinical development, is considered as a way forward to improve the efficiency and informativeness of drug development.

  8. Nutritional Assessment in Critically Ill Patients

    PubMed Central

    Hejazi, Najmeh; Mazloom, Zohreh; Zand, Farid; Rezaianzadeh, Abbas; Amini, Afshin

    2016-01-01

    Background: Malnutrition is an important factor in the survival of critically ill patients. The purpose of the present study was to assess the nutritional status of patients in the intensive care unit (ICU) on the days of admission and discharge via a detailed nutritional assessment. Methods: In total, 125 patients were followed up from admission to discharge at 8 ICUs in Shiraz, Iran. The patients’ nutritional status was assessed using subjective global assessment (SGA), anthropometric measurements, biochemical indices, and body composition indicators. Diet prescription and intake were also evaluated. Results: Malnutrition prevalence significantly increased on the day of discharge (58.62%) compared to the day of admission (28.8%) according to SGA (P<0.001). The patients’ weight, mid-upper-arm circumference, mid-arm muscle circumference, triceps skinfold thickness, and calf circumference decreased significantly as well (P<0.001). Lean mass weight and body cell mass also decreased significantly (P<0.001). A negative significant correlation was observed between malnutrition on discharge day and anthropometric measurements. Biochemical indices showed no notable changes except for magnesium, which decreased significantly (P=0.013). Positive and significant correlations were observed between the number of days without enteral feeding, days delayed from ICU admission to the commencement of enteral feeding, and the length of ICU stay and malnutrition on discharge day. Energy and protein intakes were significantly less than the prescribed diet (26.26% and 26.48%, respectively). Conclusion: Malnutrition on discharge day increased in the patients in the ICU according to SGA. Anthropometric measurements were better predictors of the nutritional outcome of our critically ill patients than were biochemical tests. PMID:27217600

  9. Assessing Terrorist Motivations for Attacking Critical Infrastructure

    SciTech Connect

    Ackerman, G; Abhayaratne, P; Bale, J; Bhattacharjee, A; Blair, C; Hansell, L; Jayne, A; Kosal, M; Lucas, S; Moran, K; Seroki, L; Vadlamudi, S

    2006-12-04

    Certain types of infrastructure--critical infrastructure (CI)--play vital roles in underpinning our economy, security and way of life. These complex and often interconnected systems have become so ubiquitous and essential to day-to-day life that they are easily taken for granted. Often it is only when the important services provided by such infrastructure are interrupted--when we lose easy access to electricity, health care, telecommunications, transportation or water, for example--that we are conscious of our great dependence on these networks and of the vulnerabilities that stem from such dependence. Unfortunately, it must be assumed that many terrorists are all too aware that CI facilities pose high-value targets that, if successfully attacked, have the potential to dramatically disrupt the normal rhythm of society, cause public fear and intimidation, and generate significant publicity. Indeed, revelations emerging at the time of this writing about Al Qaida's efforts to prepare for possible attacks on major financial facilities in New York, New Jersey, and the District of Columbia remind us just how real and immediate such threats to CI may be. Simply being aware that our nation's critical infrastructure presents terrorists with a plethora of targets, however, does little to mitigate the dangers of CI attacks. In order to prevent and preempt such terrorist acts, better understanding of the threats and vulnerabilities relating to critical infrastructure is required. The Center for Nonproliferation Studies (CNS) presents this document as both a contribution to the understanding of such threats and an initial effort at ''operationalizing'' its findings for use by analysts who work on issues of critical infrastructure protection. Specifically, this study focuses on a subsidiary aspect of CI threat assessment that has thus far remained largely unaddressed by contemporary terrorism research: the motivations and related factors that determine whether a terrorist

  10. Critical Emergency Medicine Procedural Skills: A Comparative Study of Methods for Teaching and Assessment.

    ERIC Educational Resources Information Center

    Chapman, Dane M.; And Others

    Three critical procedural skills in emergency medicine were evaluated using three assessment modalities--written, computer, and animal model. The effects of computer practice and previous procedure experience on skill competence were also examined in an experimental sequential assessment design. Subjects were six medical students, six residents,…

  11. CRITICAL ISSUES IN HIGH END COMPUTING - FINAL REPORT

    SciTech Connect

    Corones, James

    2013-09-23

    High-end computing (HEC) has been a driver for advances in science and engineering for the past four decades. Increasingly, HEC has become a significant element in the national security, economic vitality, and competitiveness of the United States. Advances in HEC provide results that cut across traditional disciplinary and organizational boundaries. This program provides opportunities to share information about HEC systems and computational techniques across multiple disciplines and organizations through conferences and exhibitions of HEC advances held in Washington, DC, so that mission agency staff, scientists, and industry can come together with White House, Congressional, and Legislative staff in an environment conducive to the sharing of technical information, accomplishments, goals, and plans. A common thread across this series of conferences is the understanding of computational science and applied mathematics techniques across a diverse set of application areas of interest to the Nation. The specific objectives of this program are: Program Objective 1, to provide opportunities to share information about advances in high-end computing systems and computational techniques between mission critical agencies, agency laboratories, academics, and industry; Program Objective 2, to gather pertinent data and address specific topics of wide interest to mission critical agencies; Program Objective 3, to promote a continuing discussion of critical issues in high-end computing; and Program Objective 4, to provide a venue where a multidisciplinary scientific audience can discuss the difficulties applying computational science techniques to specific problems and can specify future research that, if successful, will eliminate these problems.

  12. HSE's safety assessment principles for criticality safety.

    PubMed

    Simister, D N; Finnerty, M D; Warburton, S J; Thomas, E A; Macphail, M R

    2008-06-01

    The Health and Safety Executive (HSE) published its revised Safety Assessment Principles for Nuclear Facilities (SAPs) in December 2006. The SAPs are primarily intended for use by HSE's inspectors when judging the adequacy of safety cases for nuclear facilities. The revised SAPs relate to all aspects of safety in nuclear facilities including the technical discipline of criticality safety. The purpose of this paper is to set out for the benefit of a wider audience some of the thinking behind the final published words and to provide an insight into the development of UK regulatory guidance. The paper notes that it is HSE's intention that the Safety Assessment Principles should be viewed as a reflection of good practice in the context of interpreting primary legislation such as the requirements under site licence conditions for arrangements for producing an adequate safety case and for producing a suitable and sufficient risk assessment under the Ionising Radiations Regulations 1999 (SI1999/3232 www.opsi.gov.uk/si/si1999/uksi_19993232_en.pdf).

  13. An Assessment of Student Computer Ergonomic Knowledge.

    ERIC Educational Resources Information Center

    Alexander, Melody W.

    1997-01-01

    Business students (n=254) were assessed on their knowledge of computers, health and safety, radiation, workstations, and ergonomic techniques. Overall knowledge was low in all categories. In particular, they had not learned computer-use techniques. (SK)

  14. Critical infrastructure systems of systems assessment methodology.

    SciTech Connect

    Sholander, Peter E.; Darby, John L.; Phelan, James M.; Smith, Bryan; Wyss, Gregory Dane; Walter, Andrew; Varnado, G. Bruce; Depoy, Jennifer Mae

    2006-10-01

    Assessing the risk of malevolent attacks against large-scale critical infrastructures requires modifications to existing methodologies that separately consider physical security and cyber security. This research has developed a risk assessment methodology that explicitly accounts for both physical and cyber security, while preserving the traditional security paradigm of detect, delay, and respond. This methodology also accounts for the condition that a facility may be able to recover from or mitigate the impact of a successful attack before serious consequences occur. The methodology uses evidence-based techniques (which are a generalization of probability theory) to evaluate the security posture of the cyber protection systems. Cyber threats are compared against cyber security posture using a category-based approach nested within a path-based analysis to determine the most vulnerable cyber attack path. The methodology summarizes the impact of a blended cyber/physical adversary attack in a conditional risk estimate where the consequence term is scaled by a ''willingness to pay'' avoidance approach.
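    The "most vulnerable cyber attack path" step can be viewed as a shortest-path problem: maximizing the product of per-segment detection-evasion probabilities is equivalent to minimizing a sum of -log weights. A sketch on a hypothetical topology, not the methodology's actual model:

      import heapq
      import math

      # edges: (from, to, probability the adversary evades detection)
      edges = [("outside", "fence", 0.9), ("fence", "control room", 0.5),
               ("outside", "vpn", 0.7), ("vpn", "scada", 0.6),
               ("scada", "control room", 0.8)]

      graph = {}
      for a, b, p_evade in edges:
          graph.setdefault(a, []).append((b, -math.log(p_evade)))

      def most_vulnerable_path(start, goal):
          """Dijkstra on -log(p) finds the maximum-evasion-probability path."""
          best = {start: 0.0}
          pq = [(0.0, start, [start])]
          while pq:
              cost, node, path = heapq.heappop(pq)
              if node == goal:
                  return math.exp(-cost), path
              for nxt, w in graph.get(node, []):
                  if cost + w < best.get(nxt, float("inf")):
                      best[nxt] = cost + w
                      heapq.heappush(pq, (cost + w, nxt, path + [nxt]))
          return 0.0, []

      p, path = most_vulnerable_path("outside", "control room")
      print(f"worst path {' -> '.join(path)} evades detection with p = {p:.2f}")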

  15. Computer Interview Problem Assessment of Psychiatric Patients

    PubMed Central

    Angle, Hugh V.; Ellinwood, Everett H.; Carroll, Judith

    1978-01-01

    Behavioral Assessment information, a more general form of Problem-Oriented Record data, appears to have many useful clinical qualities and was selected to be the information content for a computer interview system. This interview system was designed to assess problematic behaviors of psychiatric patients. The computer interview covered 29 life problem areas and took patients from four to eight hours to complete. In two reliability studies, the computer interview was compared to human interviews. A greater number of general and specific patient problems were identified in the computer interview than in the human interviews. The attitudes of computer patients and clinicians receiving the computer reports were surveyed.

  16. Radiation exposure and risk assessment for critical female body organs

    SciTech Connect

    Atwell, W.; Weyland, M.D.; Hardy, A.C. (NASA, Johnson Space Center, Houston, TX)

    1991-07-01

    Space radiation exposure limits for astronauts are based on recommendations of the National Council on Radiation Protection and Measurements. These limits now include the age at exposure and sex of the astronaut. A recently-developed computerized anatomical female (CAF) model is discussed in detail. Computer-generated, cross-sectional data are presented to illustrate the completeness of the CAF model. By applying ray-tracing techniques, shield distribution functions have been computed to calculate absorbed dose and dose equivalent values for a variety of critical body organs (e.g., breasts, lungs, thyroid gland, etc.) and mission scenarios. Specific risk assessments, i.e., cancer induction and mortality, are reviewed. 13 refs.

  17. Predictive Dynamic Security Assessment through Advanced Computing

    SciTech Connect

    Huang, Zhenyu; Diao, Ruisheng; Jin, Shuangshuang; Chen, Yousu

    2014-11-30

    Traditional dynamic security assessment is limited by several factors and thus falls short in providing real-time information to be predictive for power system operation. These factors include the steady-state assumption of current operating points, static transfer limits, and low computational speed. This paper addresses these factors and frames predictive dynamic security assessment. The primary objective of predictive dynamic security assessment is to enhance the functionality and computational process of dynamic security assessment through the use of high-speed phasor measurements and the application of advanced computing technologies for faster-than-real-time simulation. This paper presents algorithms, computing platforms, and simulation frameworks that constitute the predictive dynamic security assessment capability. Examples of phasor application and fast computation for dynamic security assessment are included to demonstrate the feasibility and speed enhancement for real-time applications.
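    At the core of faster-than-real-time transient simulation is numerical integration of machine dynamics such as the swing equation. A single-machine sketch initialized from a notional phasor-measured operating point, with illustrative parameters rather than anything from the paper:

      import math

      H, D, f0 = 5.0, 1.0, 60.0          # inertia constant (s), damping, Hz
      Pm, Pmax = 0.8, 2.0                # mechanical power, max electric power (pu)
      delta = math.asin(Pm / Pmax)       # initial rotor angle from measurement
      omega = 0.0                        # speed deviation (pu)

      dt, t_clear = 0.001, 0.1           # 1 ms step; fault cleared at 100 ms
      for step in range(2000):           # simulate 2 s
          t = step * dt
          pe = 0.0 if t < t_clear else Pmax * math.sin(delta)  # fault: Pe = 0
          domega = (Pm - pe - D * omega) / (2 * H)
          delta += omega * 2 * math.pi * f0 * dt
          omega += domega * dt

      print(f"final angle {delta:.3f} rad, speed deviation {omega:.5f} pu")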

  18. Adapting the Critical Thinking Assessment Test for Palestinian Universities

    ERIC Educational Resources Information Center

    Basha, Sami; Drane, Denise; Light, Gregory

    2016-01-01

    Critical thinking is a key learning outcome for Palestinian students. However, there are no validated critical thinking tests in Arabic. Suitability of the US developed Critical Thinking Assessment Test (CAT) for use in Palestine was assessed. The test was piloted with university students in English (n = 30) and 4 questions were piloted in Arabic…

  19. Research in computer access assessment and intervention.

    PubMed

    Simpson, Richard; Koester, Heidi Horstmann; Lopresti, Edmund

    2010-02-01

    Computer access technology (CAT) allows people who have trouble using a standard computer keyboard, mouse, or monitor to access a computer. CAT is critical for enhancing the educational and vocational opportunities of people with disabilities. Choosing the most appropriate CAT is a collaborative decision-making process involving the consumer, clinician(s), and third party payers. The challenges involved and potential technological solutions are discussed.

  20. Cryptographic Key Management and Critical Risk Assessment

    SciTech Connect

    Abercrombie, Robert K

    2014-05-01

    The Department of Energy Office of Electricity Delivery and Energy Reliability (DOE-OE) CyberSecurity for Energy Delivery Systems (CSEDS) industry-led program (DE-FOA-0000359), entitled "Innovation for Increasing CyberSecurity for Energy Delivery Systems (12CSEDS)," awarded a contract to Sypris Electronics LLC to develop a Cryptographic Key Management System for the smart grid (Scalable Key Management Solutions for Critical Infrastructure Protection). As a result of that award, Oak Ridge National Laboratory (ORNL) and Sypris Electronics, LLC entered into a CRADA (NFE-11-03562). ORNL provided its Cyber Security Econometrics System (CSES) as a tool to be modified and used as a metric to address risks and vulnerabilities in the management of cryptographic keys within the Advanced Metering Infrastructure (AMI) domain of the electric sector. ORNL concentrated its analysis on the AMI domain, for which the National Electric Sector Cyber security Organization Resource (NESCOR) Working Group 1 (WG1) has documented 29 failure scenarios. The computational infrastructure of this metric involves system stakeholders, security requirements, system components and security threats. To compute this metric, we estimated the stakes that each stakeholder associates with each security requirement, as well as stochastic matrices that represent the probability of a threat to cause a component failure and the probability of a component failure to cause a security requirement violation. We applied this model to estimate the security of the AMI, by leveraging the recently established National Institute of Standards and Technology Interagency Report (NISTIR) 7628 guidelines for smart grid security and the International Electrotechnical Commission (IEC) 62351, Part 9, to identify the life cycle for cryptographic key management, resulting in a vector that assigned to each stakeholder an estimate of their average loss in terms of dollars per day of system
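    The structure described, stakes per stakeholder chained through stochastic matrices from threats to component failures to requirement violations, reduces to matrix products. A sketch with invented matrices and dollar figures, not the CSES/NESCOR data:

      import numpy as np

      stakes = np.array([[900.0, 300.0],     # stakeholder x requirement:
                         [200.0, 800.0]])    # $/day lost if requirement violated

      dep = np.array([[0.6, 0.1],            # P(requirement violated | component
                      [0.2, 0.7]])           # fails), requirements x components

      impact = np.array([[0.3, 0.05],        # P(component fails | threat),
                         [0.1, 0.40]])       # components x threats

      threat_rate = np.array([0.02, 0.01])   # P(threat materializes per day)

      violation_prob = dep @ impact @ threat_rate   # per requirement, per day
      loss = stakes @ violation_prob                # expected $/day per stakeholder
      for i, cost in enumerate(loss):
          print(f"stakeholder {i}: expected loss ${cost:.2f}/day")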

  1. Computer simulation of hypothetical criticality accidents in aqueous fissile solutions

    SciTech Connect

    Hetrick, D.L.

    1991-01-01

    The purpose of this paper is to describe recent developments in computer simulation of hypothetical criticality accidents in aqueous fissile solutions of uranium and plutonium such as might be encountered in fuel fabrication and reprocessing operations. Models for reactivity shutdown mechanisms and equations of state have been combined to permit estimates of fission yield, inertial pressure, and kinetic energy for a wide range of pulse sizes and time scales. Improvements to previously published models are reported along with some recent applications. Information obtained from pulsed solution assemblies (KEWB, CRAC, SILENE, and SHEBA) and from past criticality accidents was used in the development of computer models. Applications include slow events lasting many hours (hypothetical undetected laboratory accidents) and large-yield millisecond pulses in which evolution of radiolytic gas may be important (severe accidents and pulsed reactors).
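    Such simulations are built around the point-kinetics equations with reactivity feedback providing shutdown. A one-delayed-group sketch with a crude temperature feedback, using illustrative parameters rather than the paper's solution models:

      beta, lam, Lambda = 0.0075, 0.08, 1e-4   # delayed fraction, decay const (1/s),
                                               # neutron generation time (s)
      alpha_T = -1e-4                          # reactivity per degree, assumed
      rho0 = 0.009                             # inserted reactivity (super-prompt)

      n = 1.0                                  # relative power
      c = beta * n / (lam * Lambda)            # equilibrium precursor level
      T = 0.0                                  # temperature rise, arbitrary units
      dt, energy = 1e-5, 0.0

      for _ in range(50000):                   # 0.5 s of transient
          rho = rho0 + alpha_T * T             # feedback quenches the pulse
          dn = ((rho - beta) / Lambda) * n + lam * c
          dc = (beta / Lambda) * n - lam * c
          n += dn * dt
          c += dc * dt
          T += n * dt                          # crude energy-to-temperature step
          energy += n * dt

      print(f"pulse energy (arbitrary units): {energy:.1f}, final power: {n:.1f}")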

  2. Computational Methods for Sensitivity and Uncertainty Analysis in Criticality Safety

    SciTech Connect

    Broadhead, B.L.; Childs, R.L.; Rearden, B.T.

    1999-09-20

    Interest in the sensitivity methods that were developed and widely used in the 1970s (the FORSS methodology at ORNL among others) has increased recently as a result of potential use in the area of criticality safety data validation procedures to define computational bias, uncertainties and area(s) of applicability. Functional forms of the resulting sensitivity coefficients can be used as formal parameters in the determination of applicability of benchmark experiments to their corresponding industrial application areas. In order for these techniques to be generally useful to the criticality safety practitioner, the procedures governing their use had to be updated and simplified. This paper will describe the resulting sensitivity analysis tools that have been generated for potential use by the criticality safety community.

  3. WSRC approach to validation of criticality safety computer codes

    SciTech Connect

    Finch, D.R.; Mincey, J.F.

    1991-12-31

    Recent hardware and operating system changes at Westinghouse Savannah River Site (WSRC) have necessitated review of the validation for JOSHUA criticality safety computer codes. As part of the planning for this effort, a policy for validation of JOSHUA and other criticality safety codes has been developed. This policy will be illustrated with the steps being taken at WSRC. The objective in validating a specific computational method is to reliably correlate its calculated neutron multiplication factor (k_eff) with known values over a well-defined set of neutronic conditions. Said another way, such correlations should be (1) repeatable, (2) demonstrated with defined confidence, and (3) valid over an identified range of neutronic conditions (area of applicability). The general approach to validation of computational methods at WSRC must encompass a large number of diverse types of fissile material processes in different operations. Special problems are presented in validating computational methods when very few experiments are available (such as for enriched uranium systems with principal second isotope 236U). To cover all process conditions at WSRC, a broad validation approach has been used. Broad validation is based upon calculation of many experiments to span all possible ranges of reflection, nuclide concentrations, moderation ratios, etc. Narrow validation, in comparison, relies on calculations of a few experiments very near anticipated worst-case process conditions. The methods and problems of broad validation are discussed.

  4. WSRC approach to validation of criticality safety computer codes

    SciTech Connect

    Finch, D.R.; Mincey, J.F.

    1991-01-01

    Recent hardware and operating system changes at Westinghouse Savannah River Site (WSRC) have necessitated review of the validation for JOSHUA criticality safety computer codes. As part of the planning for this effort, a policy for validation of JOSHUA and other criticality safety codes has been developed. This policy will be illustrated with the steps being taken at WSRC. The objective in validating a specific computational method is to reliably correlate its calculated neutron multiplication factor (k_eff) with known values over a well-defined set of neutronic conditions. Said another way, such correlations should be (1) repeatable, (2) demonstrated with defined confidence, and (3) valid over an identified range of neutronic conditions (area of applicability). The general approach to validation of computational methods at WSRC must encompass a large number of diverse types of fissile material processes in different operations. Special problems are presented in validating computational methods when very few experiments are available (such as for enriched uranium systems with principal second isotope 236U). To cover all process conditions at WSRC, a broad validation approach has been used. Broad validation is based upon calculation of many experiments to span all possible ranges of reflection, nuclide concentrations, moderation ratios, etc. Narrow validation, in comparison, relies on calculations of a few experiments very near anticipated worst-case process conditions. The methods and problems of broad validation are discussed.
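    The correlation step described in these two records can be illustrated by computing a bias and scatter from benchmark k_eff calculations. The values and crude two-sigma margin below are invented; actual practice uses formal tolerance-limit methods:

      import statistics

      # calculated k_eff for benchmark experiments that are critical (k = 1)
      calculated_keff = [0.9978, 1.0012, 0.9991, 0.9965, 1.0004, 0.9987]

      bias = statistics.mean(calculated_keff) - 1.0
      sigma = statistics.stdev(calculated_keff)
      limit = 1.0 + bias - 2 * sigma - 0.02    # illustrative margin stack
      print(f"bias = {bias:+.4f}, sigma = {sigma:.4f}, working limit ~ {limit:.4f}")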

  5. Inequalities, Assessment and Computer Algebra

    ERIC Educational Resources Information Center

    Sangwin, Christopher J.

    2015-01-01

    The goal of this paper is to examine single variable real inequalities that arise as tutorial problems and to examine the extent to which current computer algebra systems (CAS) can (1) automatically solve such problems and (2) determine whether students' own answers to such problems are correct. We review how inequalities arise in contemporary…

  6. Assessment tool for nursing student computer competencies.

    PubMed

    Elder, Betty L; Koehn, Mary L

    2009-01-01

    Computer skills have been established as important for nursing students and for graduate nurses. No current research was found on the best method to evaluate the skills of incoming nursing students. The purpose of this descriptive, correlational study was to compare student ratings of their computer competency with their performance of those skills on a computer-graded assessment. A convenience sample of 87 nursing students was used. There was a low but significant correlation between the scores on the survey and the assessment. The results suggest that students rate their skills higher than their actual performance of computer skills. Implications for educators are presented, and the value of using a computer-graded assessment is discussed.

  7. Computing Critical Properties with Yang-Yang Anomalies

    NASA Astrophysics Data System (ADS)

    Orkoulas, Gerassimos; Cerdeirina, Claudio; Fisher, Michael

    2017-01-01

    Computation of the thermodynamics of fluids in the critical region is a challenging task owing to divergence of the correlation length and lack of the particle-hole symmetries found in Ising or lattice-gas models. In addition, analysis of experiments and simulations reveals a Yang-Yang (YY) anomaly, which entails sharing of the specific heat singularity between the pressure and the chemical potential. The size of the YY anomaly is measured by the YY ratio R_μ = C_μ/C_V of the amplitude of C_μ = −T d²μ/dT² to that of the total specific heat C_V. A "complete scaling" theory, in which the pressure mixes into the scaling fields, accounts for the YY anomaly. In Phys. Rev. Lett. 116, 040601 (2016), compressible cell gas (CCG) models, which exhibit YY and singular diameter anomalies, were advanced for near-critical fluids. In such models, the individual cell volumes are allowed to fluctuate. The thermodynamics of CCGs can be computed through mapping onto the Ising model via the seldom-used great grand canonical ensemble. The computations indicate that local free-volume fluctuations are the origin of the YY effects. Furthermore, local energy-volume coupling (to model water) is another crucial factor underlying the phenomena.
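    The Yang-Yang measure C_μ = −T d²μ/dT² can be estimated from chemical-potential data by finite differences. A sketch with a made-up smooth μ(T) standing in for simulation output:

      import numpy as np

      T = np.linspace(1.0, 2.0, 2001)
      mu = -0.5 * T * np.log(T) - 0.01 * (T - 1.5)**2   # hypothetical mu(T)

      d2mu = np.gradient(np.gradient(mu, T), T)   # second derivative, numeric
      C_mu = -T * d2mu

      i = len(T) // 2
      print(f"C_mu at T = {T[i]:.2f}: {C_mu[i]:.4f}")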

  8. CRITICAL ASSESSMENT OF AUTOMATED FLOW CYTOMETRY DATA ANALYSIS TECHNIQUES

    PubMed Central

    Aghaeepour, Nima; Finak, Greg; Hoos, Holger; Mosmann, Tim R.; Gottardo, Raphael; Brinkman, Ryan; Scheuermann, Richard H.

    2013-01-01

    Traditional methods for flow cytometry (FCM) data processing rely on subjective manual gating. Recently, several groups have developed computational methods for identifying cell populations in multidimensional FCM data. The Flow Cytometry: Critical Assessment of Population Identification Methods (FlowCAP) challenges were established to compare the performance of these methods on two tasks – mammalian cell population identification to determine if automated algorithms can reproduce expert manual gating, and sample classification to determine if analysis pipelines can identify characteristics that correlate with external variables (e.g., clinical outcome). This analysis presents the results of the first of these challenges. Several methods performed well compared to manual gating or external variables using statistical performance measures, suggesting that automated methods have reached a sufficient level of maturity and accuracy for reliable use in FCM data analysis. PMID:23396282

  9. Critical assessment of automated flow cytometry data analysis techniques.

    PubMed

    Aghaeepour, Nima; Finak, Greg; Hoos, Holger; Mosmann, Tim R; Brinkman, Ryan; Gottardo, Raphael; Scheuermann, Richard H

    2013-03-01

    Traditional methods for flow cytometry (FCM) data processing rely on subjective manual gating. Recently, several groups have developed computational methods for identifying cell populations in multidimensional FCM data. The Flow Cytometry: Critical Assessment of Population Identification Methods (FlowCAP) challenges were established to compare the performance of these methods on two tasks: (i) mammalian cell population identification, to determine whether automated algorithms can reproduce expert manual gating and (ii) sample classification, to determine whether analysis pipelines can identify characteristics that correlate with external variables (such as clinical outcome). This analysis presents the results of the first FlowCAP challenges. Several methods performed well as compared to manual gating or external variables using statistical performance measures, which suggests that automated methods have reached a sufficient level of maturity and accuracy for reliable use in FCM data analysis.
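    Agreement between automated and manual gating is commonly summarized with per-population F-measures. A toy sketch with fabricated cell labels (FlowCAP's scoring is more involved):

      manual    = ["T", "T", "B", "B", "NK", "T", "B", "NK"]   # expert gates
      automated = ["T", "T", "B", "NK", "NK", "T", "B", "B"]   # algorithm output

      def f_measure(truth, pred, population):
          """Harmonic mean of precision and recall for one population."""
          tp = sum(t == population == p for t, p in zip(truth, pred))
          fp = sum(p == population != t for t, p in zip(truth, pred))
          fn = sum(t == population != p for t, p in zip(truth, pred))
          prec = tp / (tp + fp) if tp + fp else 0.0
          rec = tp / (tp + fn) if tp + fn else 0.0
          return 2 * prec * rec / (prec + rec) if prec + rec else 0.0

      for pop in ("T", "B", "NK"):
          print(f"{pop}: F = {f_measure(manual, automated, pop):.2f}")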

  10. Computer Applications in Assessment and Counseling.

    ERIC Educational Resources Information Center

    Veldman, Donald J.; Menaker, Shirley L.

    Public school counselors and psychologists can expect valuable assistance from computer-based assessment and counseling techniques within a few years, as programs now under development become generally available for the typical computers now used by schools for grade-reporting and class-scheduling. Although routine information-giving and gathering…

  11. A Novel Instrument for Assessing Students' Critical Thinking Abilities

    ERIC Educational Resources Information Center

    White, Brian; Stains, Marilyne; Escriu-Sune, Marta; Medaglia, Eden; Rostamnjad, Leila; Chinn, Clark; Sevian, Hannah

    2011-01-01

    Science literacy involves knowledge of both science content and science process skills. In this study, we describe the Assessment of Critical Thinking Ability survey and its preliminary application to assess the critical thinking skills of undergraduate students, graduate students, and postdoctoral fellows. This survey is based on a complex and…

  12. Self-organized criticality in a computer network model

    PubMed

    Yuan; Ren; Shan

    2000-02-01

    We study the collective behavior of computer network nodes by using a cellular automaton model. The results show that when the load of network is constant, the throughputs and buffer contents of nodes are power-law distributed in both space and time. Also the feature of 1/f noise appears in the power spectrum of the change of the number of nodes that bear a fixed part of the system load. It can be seen as yet another example of self-organized criticality. Power-law decay in the distribution of buffer contents implies that heavy network congestion occurs with small probability. The temporal power-law distribution for throughput might be a reasonable explanation for the observed self-similarity in computer network traffic.
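    A much-simplified packet-forwarding automaton conveys the flavor of such models; this toy makes no claim to reproduce the paper's power-law statistics:

      import random
      from collections import Counter

      random.seed(1)
      N, steps, inject, p_deliver = 100, 5000, 0.3, 0.5
      buffers = [0] * N

      for _ in range(steps):
          for i in range(N):
              if random.random() < inject:          # constant external load
                  buffers[i] += 1
          for i in range(N):
              if buffers[i] > 0:                    # forward one packet per step
                  buffers[i] -= 1
                  if random.random() > p_deliver:   # not delivered: hop onward
                      buffers[random.randrange(N)] += 1

      for size, count in sorted(Counter(buffers).items()):
          print(f"buffer = {size:3d}  nodes = {count}")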

  13. CriTi-CAL: A computer program for Critical Coiled Tubing Calculations

    SciTech Connect

    He, X.

    1995-12-31

    A computer software package for simulating coiled tubing operations has been developed at Rogaland Research. The software is named CriTi-CAL, for Critical Coiled Tubing Calculations. It is a PC program running under Microsoft Windows. CriTi-CAL is designed for predicting force, stress, torque, lockup, circulation pressure losses, and along-hole-depth corrections for coiled tubing workover and drilling operations. CriTi-CAL features a user-friendly interface, integrated work string and survey editors, flexible input units and output formats, on-line documentation, and extensive error trapping. CriTi-CAL was developed using a combination of Visual Basic and C. Such an approach is an effective way to quickly develop high-quality, small- to medium-size software for the oil industry. The software is based on the results of intensive experimental and theoretical studies on buckling and post-buckling of coiled tubing at Rogaland Research. The software has been validated by full-scale test results and field data.

  14. The Collegiate Learning Assessment: A Critical Perspective

    ERIC Educational Resources Information Center

    Shermis, Mark D.

    2008-01-01

    This article describes the Collegiate Learning Assessment (CLA), a postsecondary assessment tool designed to evaluate the "value-added" component of institutional contributions to student learning outcomes. Developed by the Council for Aid to Education (CAE), the instrument ostensibly focuses on the contributions of general education coursework…

  15. Perspectives on sedation assessment in critical care.

    PubMed

    Olson, Daiwai M; Thoyre, Suzanne M; Auyong, David B

    2007-01-01

    Multiple studies have been undertaken to show that neurofunction monitors can correlate with objective sedation assessments. Showing a correlation between these two patient assessment tools may not be the correct approach for validating neurofunction monitors. Two different methods of assessing two different modes of the patient's response to sedation should not be expected to correlate precisely unless the desire is to replace one method with the other. We provide a brief summary of several sedation scales, physiologic measures and neurofunction monitoring tools, and the correlation literature for bispectral index monitoring, the Ramsay Scale, and the Sedation Agitation Scale. Neurofunction monitors provide near-continuous information about a different domain of the sedation response than intermittent observational assessments. Further research should focus on contributions from this technology to the improvement of patient outcomes when neurofunction monitoring is used as a complement to, not a replacement for, observational methods of sedation assessment.

  16. Computer-based consultation in "care" of the critically ill patient.

    PubMed

    Siegel, J H; Fichthorn, J; Monteferrante, J; Moody, E; Box, N; Nolan, C; Ardrey, R

    1976-09-01

    Despite far-reaching progress in all areas of surgery, methods of medical data analysis and communication have not kept pace with the increased rate of data acquisition. The needs to organize and communicate these data and to provide a medium for continuing education are great in critical-care areas, where the amount and diversity of data collected are enormous and the number of surgical team members involved in patient care has grown proportionately. The computer-based Clinical Assessment, Research, and Education System (CARE) is a time-shared computer system, now available on a national basis, designed to provide a management and education aid for the treatment of critically ill surgical patients. An initial clinical assessment and operative note are entered by the surgeon, from which an estimate of the initial fluid, blood, and electrolyte deficits is calculated. Daily doctors' progress notes, shift nurses' summaries of vital signs, clinical information, intake and output data, and drug administration, biochemical, cardiovascular, blood gas, and respiratory information are entered for each shift. From these, a metabolic balance is calculated; fluid, electrolyte, and caloric requirements are determined; cardiorespiratory parameters are computed; and various therapeutic suggestions and cautions are given to alert the physician to problems that may be arising. The surgeon-user is assisted in making the best critical-care decisions through computer-directed, interactive prompting that focuses on the most important clinical conditions, correlations, and metabolic considerations and relates the important problem to the relevant literature.

  17. Bad Actors Criticality Assessment for Pipeline system

    NASA Astrophysics Data System (ADS)

    Nasir, Meseret; Chong, Kit wee; Osman, Sabtuni; Siaw Khur, Wee

    2015-04-01

    Failure of a pipeline system can bring huge economic loss. In order to mitigate such catastrophic loss, it is necessary to evaluate and rank the impact of each bad actor of the pipeline system. In this study, bad actors are the root causes or any potential factors leading to system downtime. Fault Tree Analysis (FTA) is used to analyze the probability of occurrence of each bad actor. Birnbaum's importance and criticality measure (BICM) is also employed to rank the impact of each bad actor on pipeline system failure. The results demonstrate that internal corrosion, external corrosion, and construction damage are critical and contribute heavily to pipeline system failure, with 48.0%, 12.4%, and 6.0%, respectively. Thus, a minor improvement in internal corrosion, external corrosion, or construction damage would bring significant changes in pipeline system performance and reliability. These results could also be used to develop an efficient maintenance strategy by identifying the critical bad actors.
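
    A toy illustration of the ranking step described above: for an OR-gate fault tree (system downtime occurs if any bad actor occurs), Birnbaum's importance for actor i is the partial derivative of the top-event probability with respect to q_i, and the criticality importance weights it by the actor's own probability. The failure probabilities below are hypothetical placeholders, not the study's data:

        from math import prod

        # hypothetical bad-actor occurrence probabilities (illustrative only)
        bad_actors = {
            "internal corrosion":  0.10,
            "external corrosion":  0.04,
            "construction damage": 0.02,
            "equipment failure":   0.01,
        }

        # top event: pipeline downtime = OR of all bad actors
        def system_failure_prob(q):
            return 1.0 - prod(1.0 - qi for qi in q.values())

        Q = system_failure_prob(bad_actors)

        for name, qi in bad_actors.items():
            # Birnbaum importance: P(fail | i occurs) - P(fail | i absent)
            birnbaum = prod(1.0 - qj for n, qj in bad_actors.items() if n != name)
            # criticality importance: Birnbaum weighted by the actor's probability
            criticality = birnbaum * qi / Q
            print(f"{name:20s} Birnbaum={birnbaum:.3f} criticality={criticality:.3f}")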

  18. Criticality of Water: Aligning Water and Mineral Resources Assessment.

    PubMed

    Sonderegger, Thomas; Pfister, Stephan; Hellweg, Stefanie

    2015-10-20

    The concept of criticality has been used to assess whether a resource may become a limiting factor to economic activities. It has been primarily applied to nonrenewable resources, in particular to metals. However, renewable resources such as water may also be overused and become a limiting factor. In this paper, we therefore developed a water criticality method that allows for a new, user-oriented assessment of water availability and accessibility. Comparability of criticality across resources is desirable, which is why the presented adaptation of the criticality approach to water is based on a metal criticality method, whose basic structure is maintained. With respect to the necessary adaptations to the water context, a transparent water criticality framework is proposed that may pave the way for future integrated criticality assessment of metals, water, and other resources. Water criticality scores were calculated for 159 countries subdivided into 512 geographic units for the year 2000. Results allow for a detailed analysis of criticality profiles, revealing locally specific characteristics of water criticality. This is useful for the screening of sites and their related water criticality, for indication of water related problems and possible mitigation options and water policies, and for future water scenario analysis.

  19. RHIC CRITICAL POINT SEARCH: ASSESSING STAR'S CAPABILITIES.

    SciTech Connect

    SORENSEN,P.

    2006-07-03

    In this report we discuss the capabilities and limitations of the STAR detector to search for signatures of the QCD critical point in a low energy scan at RHIC. We find that a RHIC low energy scan will cover a broad region of interest in the nuclear matter phase diagram and that the STAR detector--a detector designed to measure the quantities that will be of interest in this search--will provide new observables and improve on previous measurements in this energy range.

  20. Assessing Vulnerabilities, Risks, and Consequences of Damage to Critical Infrastructure

    SciTech Connect

    Suski, N; Wuest, C

    2011-02-04

    The Pre-Assessment Phase brings together infrastructure owners and operators to identify critical assets and help the team create a structured information request. During this phase, we gain information about the critical assets from those who are most familiar with operations and interdependencies, making the time we spend on the ground conducting the assessment much more productive and enabling the team to make actionable recommendations. The Assessment Phase analyzes 10 areas: threat environment, cyber architecture, cyber penetration, physical security, physical penetration, operations security, policies and procedures, interdependencies, consequence analysis, and risk characterization. Each of these individual tasks uses direct and indirect data collection, site inspections, and structured and facilitated workshops to gather data. Because of the importance of understanding the cyber threat, LLNL has built both fixed and mobile cyber penetration, wireless penetration, and supporting tools that can be tailored to fit customer needs. The Post-Assessment Phase brings vulnerability and risk assessments to the customer in a format that facilitates implementation of mitigation options. Often the assessment findings and recommendations are briefed and discussed with several levels of management and, if appropriate, across jurisdictional boundaries. The end result is enhanced awareness and informed protective measures. Over the last 15 years, we have continued to refine our methodology and capture lessons learned and best practices. The resulting risk and decision framework thus takes into consideration real-world constraints, including regulatory, operational, and economic realities. In addition to 'on the ground' assessments focused on mitigating vulnerabilities, we have integrated our computational and atmospheric dispersion capability with easy-to-use geo-referenced visualization tools to support emergency planning and response operations. LLNL is home to the National Atmospheric Release Advisory Center (NARAC).

  1. A Critical Evaluation of Cognitive Style Assessment.

    ERIC Educational Resources Information Center

    Richter, Ricka

    This document reviews theories of cognitive style and methods of cognitive style assessment as they relate to the context of South Africa, where sociopolitical changes call for reassessment of theoretical assumptions in education and training. The report consists of six chapters. After a brief introductory chapter, the second chapter gives an…

  2. Fuzzy architecture assessment for critical infrastructure resilience

    SciTech Connect

    Muller, George

    2012-12-01

    This paper presents an approach for selecting among alternative architectures in a connected infrastructure system to increase the resilience of the overall infrastructure system. The paper begins with a description of resilience and critical infrastructure, then summarizes existing approaches to resilience, and presents a fuzzy-rule-based method of selecting among alternative infrastructure architectures. The methodology includes the considerations that are most important when deciding on an approach to resilience. The paper concludes with a proposed approach that builds on existing resilience architecting methods by integrating key system aspects using fuzzy memberships and fuzzy rule sets. This novel approach aids the systems architect in considering resilience during the evaluation of architectures for adoption into the final system architecture.
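
    A minimal sketch of fuzzy-rule scoring in the spirit described above (membership shapes, rule weights, and the two candidate architectures are invented for illustration; the paper's actual rule base is not reproduced here):

        def tri(x, a, b, c):
            """Triangular membership function peaking at b."""
            if x <= a or x >= c:
                return 0.0
            return (x - a) / (b - a) if x < b else (c - x) / (c - b)

        def resilience_score(redundancy, recovery_hours):
            high_red = tri(redundancy, 0.3, 1.0, 1.7)        # "redundancy is high"
            fast_rec = tri(recovery_hours, -12.0, 0.0, 24.0)  # "recovery is fast"
            slow_rec = tri(recovery_hours, 12.0, 48.0, 96.0)  # "recovery is slow"
            # rules: IF high redundancy AND fast recovery THEN resilient;
            #        IF slow recovery THEN fragile (pulls the score down)
            resilient = min(high_red, fast_rec)
            fragile = slow_rec
            # defuzzify with a simple weighted average of rule activations
            total = resilient + fragile
            return 0.5 if total == 0 else (1.0 * resilient + 0.1 * fragile) / total

        architectures = {"ring + backup feed": (0.8, 6.0),
                         "radial, no backup": (0.2, 48.0)}
        for name, (red, rec) in architectures.items():
            print(f"{name:20s} resilience ~ {resilience_score(red, rec):.2f}")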

  3. Software Safety Risk in Legacy Safety-Critical Computer Systems

    NASA Technical Reports Server (NTRS)

    Hill, Janice L.; Baggs, Rhoda

    2007-01-01

    Safety standards contain both technical and process-oriented safety requirements. Technical requirements address "must work" and "must not work" functions in the system; process-oriented requirements cover software engineering and safety management processes. Some standards address the system perspective, while others cover only the software within the system. NASA-STD-8719.13B, the Software Safety Standard, is the current standard of interest; NASA programs and projects derive their own sets of safety requirements from it. A safety case is a documented demonstration that a system complies with the specified safety requirements: evidence is gathered on the integrity of the system and put forward as an argued case [Gardener (ed.)]. Problems occur when trying to meet safety standards, and thus to make retrospective safety cases, for legacy safety-critical computer systems.

  4. [Pain management for cancer patients with critical pathway on computer].

    PubMed

    Hori, Natsuki; Konishi, Toshiro

    2005-02-01

    For relief of cancer pain, we developed a critical pathway (CP) as an effective strategy for medical staff treating cancer patients. The CP was built in Microsoft Excel and used on personal computers. "Good sleeping" was set as the first goal and "no pain in the resting position" as the second. To achieve this, physicians and nurses evaluate medical efficacy and complications, including nausea/vomiting, constipation, somnolence, and hallucination, every day, using controlled-release oxycodone in addition to NSAIDs, together with prochlorperazine, a stool softener, and a peristaltic stimulant for adverse effects. These outcomes lead to the next day's medication change, calculated with a Visual Basic function according to opioid titration theory. In twelve patients this CP was acceptable, and all of them achieved the second goal within a week without severe adverse effects except constipation.

  5. Assessing the physical loading of wearable computers.

    PubMed

    Knight, James F; Baber, Chris

    2007-03-01

    Wearable computers enable workers to interact with computer equipment in situations where previously they could not. Attaching a computer to the body, however, has an unknown physical effect. This paper reports a methodology for addressing this by assessing postural effects and the effect of added weight. Using the example of arm-mounted computers (AMCs), the paper shows that adopting a posture to interact with an AMC generates fatiguing levels of stress, and that a load of 0.54 kg results in an increased level of stress and an increased rate of fatigue. The paper shows that, owing to the poor postures adopted when wearing and interacting with computers and the weight of the device attached to the body, one possible outcome of prolonged exposure is the development of musculoskeletal disorders.

  6. Assessment of critical-fluid extractions in the process industries

    NASA Technical Reports Server (NTRS)

    1982-01-01

    The potential of critical-fluid extraction as a separation process for improving the productive use of energy in the process industries is assessed. Critical-fluid extraction involves the use of fluids, normally gaseous at ambient conditions, as extraction solvents at temperatures and pressures around the critical point. Equilibrium and kinetic properties in this regime are very favorable for solvent applications and generally allow major reductions in the energy requirements for separating and purifying the chemical components of a mixture.

  7. NUMERICAL COMPUTATIONS OF CO-EXISTING SUPER-CRITICAL AND SUB-CRITICAL FLOWS BASED UPON CRD SCHEMES

    NASA Astrophysics Data System (ADS)

    Horie, Katsuya; Okamura, Seiji; Kobayashi, Yusuke; Hyodo, Makoto; Hida, Yoshihisa; Nishimoto, Naoshi; Mori, Akio

    Stream flows in steep-gradient beds form complicated flow configurations in which super-critical and sub-critical flows co-exist. Computing such flows numerically is the key to successful river management. This study applied CRD schemes to 1D and 2D stream flow computations and proposed genuine ways to eliminate expansion shock waves. Through the various cases of stream flow computation conducted, the CRD schemes showed that i) conservation of discharge and accuracy to four significant figures are ensured, ii) artificial viscosity is not explicitly needed for computational stabilization, and thus iii) 1D and 2D computations based upon CRD schemes are applicable to evaluating complicated stream flows for river management.

  8. Assessment of Critical Thinking Ability in Medical Students

    ERIC Educational Resources Information Center

    Macpherson, Karen; Owen, Cathy

    2010-01-01

    In this study conducted with 80 first-year students in a graduate medical course at the Australian National University, Canberra, students' critical thinking skills were assessed using the Watson-Glaser Critical Thinking Appraisal (Forms A and B) in a test-retest design. Results suggested that overall subjects retained consistent patterns of…

  9. Cyber Security: Critical Infrastructure Controls Assessment Framework

    DTIC Science & Technology

    2011-05-01

    Presentation slides surveying the cyber security standards and control frameworks that apply to critical infrastructure, including NERC-CIP, NIST, ISA-99, ISO, IEEE, PCI, OASIS, OWASP, SOX, and the SANS Consensus Audit Guidelines (CAG), and proposing a framework for assessing critical infrastructure protection (CIP) security controls.

  10. A COMPUTER-ASSIST MATERIAL TRACKING SYSTEM AS A CRITICALITY SAFETY AID TO OPERATORS

    SciTech Connect

    Claybourn, R V; Huang, S T

    2007-03-30

    In today's compliance-driven environment, fissionable material handlers are inundated with work control rules and procedures in carrying out nuclear operations. Historically, human error has been one of the key contributors to criticality accidents. Since moving and handling fissionable materials are key components of their job functions, any means provided to assist operators in facilitating fissionable material moves will help improve operational efficiency and enhance criticality safety implementation. From the criticality safety perspective, operational issues have been encountered in Lawrence Livermore National Laboratory (LLNL) plutonium operations. Those issues included a lack of adequate historical record keeping for the fissionable material stored in containers, a need for a better way of accommodating operations in a research and development setting, and better means of helping material handlers carry out various criticality safety controls. Through the years, effective measures were implemented, including a better work control process, standardized criticality control conditions (SCCC), and relocation of criticality safety engineers to the plutonium facility. Another important measure was to develop a computer data acquisition system for criticality safety assessment, which is the subject of this paper. The purpose of the Criticality Special Support System (CSSS) is to integrate many of the proven operational support protocols into a software system to assist operators in assessing compliance with procedures during the handling and movement of fissionable materials. Many nuclear facilities utilize mass cards or a computer program to track fissionable material mass data in operations. Additional item-specific data, such as the presence of moderators or close-fitting reflectors, could help fissionable material handlers assess compliance with SCCCs. Computer-assisted checking of a workstation material inventory against the
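
    A schematic of what such computer-assisted checking might look like: a workstation inventory is compared against a fissionable mass limit before a move is accepted. The limit, item records, and field names are hypothetical, and real SCCC controls involve far more than a mass check:

        # hypothetical fissionable mass limit for one workstation
        WORKSTATION_MASS_LIMIT_G = 450.0

        inventory = [
            {"item": "PU-0137", "fissionable_g": 120.0, "moderator": False},
            {"item": "PU-0242", "fissionable_g": 210.0, "moderator": True},
        ]

        def check_move(inventory, incoming, limit_g=WORKSTATION_MASS_LIMIT_G):
            """Return (allowed, messages) for moving `incoming` to this station."""
            total = sum(i["fissionable_g"] for i in inventory)
            total += incoming["fissionable_g"]
            messages = []
            if total > limit_g:
                messages.append(
                    f"BLOCK: total {total:.0f} g exceeds {limit_g:.0f} g limit")
            if incoming["moderator"]:
                messages.append(
                    "CAUTION: item contains moderator; verify spacing controls")
            return (total <= limit_g, messages)

        ok, notes = check_move(
            inventory,
            {"item": "PU-0311", "fissionable_g": 150.0, "moderator": False})
        print("move allowed" if ok else "move blocked", notes)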

  11. Assessing Moderator Variables: Two Computer Simulation Studies.

    ERIC Educational Resources Information Center

    Mason, Craig A.; And Others

    1996-01-01

    A strategy is proposed for conceptualizing moderating relationships based on their type (strictly correlational and classically correlational) and form, whether continuous, noncontinuous, logistic, or quantum. Results of computer simulations comparing three statistical approaches for assessing moderator variables are presented, and advantages of…

  12. Computer Competence: The First National Assessment.

    ERIC Educational Resources Information Center

    Martinez, Michael E.; Mead, Nancy A.

    This report contains the results of a national survey conducted by the National Assessment of Educational Progress (NAEP) during the 1985-86 school year. The report, which attempts to capture the interacting forces influencing computer competence among students, is presented in six chapters: (1) Overview (major findings, significance of this…

  13. Establishing the Critical Elements That Determine Authentic Assessment

    ERIC Educational Resources Information Center

    Ashford-Rowe, Kevin; Herrington, Janice; Brown, Christine

    2014-01-01

    This study sought to determine the critical elements of an authentic learning activity, design them into an applicable framework and then use this framework to guide the design, development and application of work-relevant assessment. Its purpose was to formulate an effective model of task design and assessment. The first phase of the study…

  14. Guidelines for a Scientific Approach to Critical Thinking Assessment

    ERIC Educational Resources Information Center

    Bensley, D. Alan; Murtagh, Michael P.

    2012-01-01

    Assessment of student learning outcomes can be a powerful tool for improvement of instruction when a scientific approach is taken; unfortunately, many educators do not take full advantage of this approach. This article examines benefits of taking a scientific approach to critical thinking assessment and proposes guidelines for planning,…

  15. Criticism and Assessment Applied to New Media Art

    ERIC Educational Resources Information Center

    Ursyn, Anna

    2015-01-01

    This text examines educational criticism and assessment with an emphasis on the new media arts. The article shares with readers a versatile set of criteria, abridged to four points, based on research on assessment conducted with students, faculty, and non-art-related professionals, thus providing a preliminary tool for use in the classroom environment.…

  16. Integrating Critical Thinking into the Assessment of College Writing

    ERIC Educational Resources Information Center

    McLaughlin, Frost; Moore, Miriam

    2012-01-01

    When writing teachers at any level get together to assess student essays, they often disagree in their evaluations of the writing at hand. This is no surprise as writing is a complex process, and in evaluating it, teachers go through a complex sequence of thoughts before emerging with an overall assessment. Critical thinking, or the complexity of…

  17. Critical Assessment of Correction Methods for Fisheye Lens Distortion

    NASA Astrophysics Data System (ADS)

    Liu, Y.; Tian, C.; Huang, Y.

    2016-06-01

    A fisheye lens is widely used to create wide panoramic or hemispherical images. It is an ultra-wide-angle lens that produces strong visual distortion. Modeling and estimating the distortion of a fisheye lens are crucial steps for fisheye lens calibration and image rectification in computer vision and close-range photography. There are two kinds of distortion: radial and tangential. Radial distortion is large in fisheye imaging and critical for subsequent image processing. Although many researchers have developed calibration algorithms for the radial distortion of fisheye lenses, quantitative evaluation of correction performance has remained a challenge. This is the first paper that intuitively and objectively evaluates the performance of five different calibration algorithms. Up-to-date research on fisheye lens calibration is comprehensively reviewed to identify the research need. To differentiate their performance in terms of precision and ease of use, the five methods are then tested on a diverse set of actual checkerboard images taken at Wuhan University, China, under varying lighting conditions, shadows, and shooting angles. The rational function model, which was generally used for wide-angle lens correction, outperforms the other methods. However, the one-parameter division model is easy to use in practice without compromising precision too much, because it depends on the linear structure in the image and requires no preceding calibration; it is a tradeoff between correction precision and ease of use. By critically assessing the strengths and limitations of the existing algorithms, the paper provides valuable insight and guidance for future practice and algorithm development, which are important for fisheye lens calibration and promising for the optimal design of lens correction models suitable for the millions of portable imaging devices.
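
    For reference, the one-parameter division model mentioned above maps a distorted point back toward its undistorted position via p_u = c + (p_d - c) / (1 + k * r_d^2), where c is the distortion centre and r_d the distorted radius. A minimal sketch (the value of k and the centre are illustrative, not calibrated):

        def undistort_point(xd, yd, cx, cy, k):
            """One-parameter division model: recover the undistorted point."""
            dx, dy = xd - cx, yd - cy
            r2 = dx * dx + dy * dy          # squared distorted radius
            s = 1.0 + k * r2                # division factor
            return cx + dx / s, cy + dy / s

        # example: strong barrel distortion typical of a fisheye image
        # (negative k pushes points outward, i.e. re-expands the image)
        cx, cy, k = 640.0, 480.0, -1.5e-7
        print(undistort_point(1000.0, 800.0, cx, cy, k))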

  18. Mobile sources critical review: 1998 NARSTO assessment

    NASA Astrophysics Data System (ADS)

    Sawyer, R. F.; Harley, R. A.; Cadle, S. H.; Norbeck, J. M.; Slott, R.; Bravo, H. A.

    Mobile sources of air pollutants encompass a range of vehicle, engine, and fuel combinations. They emit both of the photochemical ozone precursors, hydrocarbons and oxides of nitrogen. The most important sources of hydrocarbons and oxides of nitrogen are light- and heavy-duty on-road vehicles and heavy-duty off-road vehicles, utilizing spark and compression ignition engines burning gasoline and diesel, respectively. Fuel consumption data provide a convenient starting point for assessing current and future emissions. Modern light-duty gasoline vehicles have very low emissions when new. The in-use fleet, due largely to emissions from a small "high emitter" fraction, has significantly larger emissions. Hydrocarbon and carbon monoxide emissions are higher than reported in current inventories. Other gasoline-powered mobile sources (motorcycles, recreational vehicles, lawn, garden, and utility equipment, and light aircraft) have high emissions per quantity of fuel consumed, but their contribution to total emissions is small. Additional uncertainties exist in the spatial and temporal distribution of emissions. Heavy-duty diesel vehicles are becoming the dominant mobile source of oxides of nitrogen. Oxides of nitrogen emissions may be greater than reported in current inventories, but the evidence for this is mixed. On a fuel-consumed basis, oxides of nitrogen emissions are much greater from diesel mobile sources than from gasoline mobile sources. This is largely the result of stringent control of gasoline vehicle emissions and lesser (heavy-duty trucks) or no control (construction equipment, locomotives, ships) of heavy-duty mobile sources. The use of alternative fuels, natural gas, propane, alcohols, and oxygenates in motor vehicles is increasing but remains small. Vehicles utilizing these fuels can be, but are not necessarily, cleaner than their gasoline or diesel counterparts. Historical vehicle kilometers traveled growth rates of about 2% annually in both the United States

  19. Research on computer aided testing of pilot response to critical in-flight events

    NASA Technical Reports Server (NTRS)

    Giffin, W. C.; Rockwell, T. H.; Smith, P. J.

    1984-01-01

    Experiments on pilot decision making are described. The development of models of pilot decision making in critical in-flight events (CIFE) is emphasized. Progress is reported on the development of: (1) a frame-system representation describing how pilots use their knowledge in a fault diagnosis task; (2) assessments of script norms, distance measures, and Markov models developed from computer-aided testing (CAT) data; and (3) performance ranking of subject data. It is demonstrated that interactive computer-aided testing, whether by touch CRTs or personal computers, is a useful research and training device for measuring pilot information management in diagnosing system failures in simulated flight situations. Performance is dictated by knowledge of aircraft subsystems, initial pilot structuring of the failure symptoms, and efficient testing of plausible causal hypotheses.

  20. Computed Tomography: Image and Dose Assessment

    SciTech Connect

    Valencia-Ortega, F.; Ruiz-Trejo, C.; Rodriguez-Villafuerte, M.; Buenfil, A. E.; Mora-Hernandez, L. A.

    2006-09-08

    In this work an experimental evaluation of image quality and dose imparted during computed tomography studies in a public hospital in Mexico City is presented. The measurements required the design and construction of two phantoms at the Institute of Physics, UNAM, according to the recommendations of the American Association of Physicists in Medicine (AAPM). Image assessment was performed in terms of spatial resolution and image contrast. Dose measurements were carried out using LiF:Mg,Ti (TLD-100) dosemeters and a pencil-shaped ionisation chamber. The results for a computed tomography head study in single- and multiple-detector modes are presented.

  1. Critical thinking traits of top-tier experts and implications for computer science education

    NASA Astrophysics Data System (ADS)

    Bushey, Dean E.

    The findings of this study suggest a need to examine how critical-thinking abilities are learned in the undergraduate computer science curriculum and the need to foster these abilities in order to produce the high-level, critical-thinking professionals necessary to fill the growing need for these experts. Because current measures of academic performance do not adequately depict students' cognitive abilities, assessment of these skills must be incorporated into existing curricula.

  2. An Exploration of Three-Dimensional Integrated Assessment for Computational Thinking

    ERIC Educational Resources Information Center

    Zhong, Baichang; Wang, Qiyun; Chen, Jie; Li, Yi

    2016-01-01

    Computational thinking (CT) is a fundamental skill for students, and assessment is a critical factor in education. However, there is a lack of effective approaches to CT assessment. Therefore, we designed the Three-Dimensional Integrated Assessment (TDIA) framework in this article. The TDIA has two aims: one was to integrate three dimensions…

  3. Accessible high performance computing solutions for near real-time image processing for time critical applications

    NASA Astrophysics Data System (ADS)

    Bielski, Conrad; Lemoine, Guido; Syryczynski, Jacek

    2009-09-01

    High Performance Computing (HPC) hardware solutions such as grid computing and General-Purpose computing on a Graphics Processing Unit (GPGPU) are now accessible to users with general computing needs. Grid computing infrastructures in the form of computing clusters or blades are becoming commonplace, and GPGPU solutions that leverage the processing power of the video card are quickly being integrated into personal workstations. Our interest in these HPC technologies stems from the need to produce near real-time maps from a combination of pre- and post-event satellite imagery in support of post-disaster management. Faster processing provides a twofold gain in this situation: (1) critical information can be provided faster, and (2) more elaborate automated processing can be performed before the critical information is provided. In our particular case, we test the use of the PANTEX index, which is based on the analysis of image textural measures extracted using anisotropic, rotation-invariant GLCM statistics. The use of this index, applied in a moving window, has been shown to successfully identify built-up areas in remotely sensed imagery. Built-up index image masks are an important input to the structuring of damage assessment interpretation because they help optimise the workload. The performance of computing the PANTEX workflow is compared on two different HPC hardware architectures: (1) a blade server with 4 blades, each having dual quad-core CPUs, and (2) a CUDA-enabled GPU workstation. The reference platform is a dual-CPU, quad-core workstation, and the total computing time of the PANTEX workflow is measured. Furthermore, as part of a qualitative evaluation, the differences in setting up and configuring the various hardware solutions and the related software coding effort are presented.
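
    A rough, single-threaded sketch of a PANTEX-style computation: the minimum of GLCM contrast over four directions in a moving window, using scikit-image (assumed version >= 0.19 for the graycomatrix/graycoprops names; the actual PANTEX procedure uses specific displacement vectors and thresholds not reproduced here):

        import numpy as np
        from skimage.feature import graycomatrix, graycoprops

        def pantex_like(image_u8, win=9):
            angles = [0, np.pi / 4, np.pi / 2, 3 * np.pi / 4]
            half = win // 2
            out = np.zeros(image_u8.shape, dtype=np.float32)
            for r in range(half, image_u8.shape[0] - half):
                for c in range(half, image_u8.shape[1] - half):
                    window = image_u8[r - half:r + half + 1,
                                      c - half:c + half + 1]
                    glcm = graycomatrix(window, distances=[1], angles=angles,
                                        levels=256, symmetric=True, normed=True)
                    # rotation invariance: keep the minimum contrast over angles
                    out[r, c] = graycoprops(glcm, "contrast").min()
            return out

        # built-up areas tend to show high values of this index
        img = (np.random.rand(64, 64) * 255).astype(np.uint8)  # stand-in imagery
        print(pantex_like(img).max())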

  4. Computer Software Training and HRD: What Are the Critical Issues?

    ERIC Educational Resources Information Center

    Altemeyer, Brad

    2005-01-01

    The paper explores critical issues for HRD practice from a Parsonian framework across the HRD legs of organizational development, adult learning, and training and development. Insights into the critical issues emerge from this approach, which identifies successful transfer of training as critical for organizational, group, and individual success.…

  5. Effects of Computer-Aided Personalized System of Instruction in Developing Knowledge and Critical Thinking in Blended Learning Courses

    ERIC Educational Resources Information Center

    Svenningsen, Louis; Pear, Joseph J.

    2011-01-01

    Two experiments were conducted to assess an online version of Keller's personalized system of instruction, called computer-aided personalized system of instruction (CAPSI), as part of a blended learning design with regard to course knowledge and critical thinking development. In Experiment 1, two lecture sections of an introduction to University…

  6. Critical issues using brain-computer interfaces for augmentative and alternative communication.

    PubMed

    Hill, Katya; Kovacs, Thomas; Shin, Sangeun

    2015-03-01

    Brain-computer interfaces (BCIs) may potentially be of significant practical value to patients in advanced stages of amyotrophic lateral sclerosis and locked-in syndrome for whom conventional augmentative and alternative communication (AAC) systems, which require some measure of consistent voluntary muscle control, are not satisfactory options. However, BCIs have primarily been used for communication in laboratory research settings. This article discusses 4 critical issues that should be addressed as BCIs are translated out of laboratory settings to become fully functional BCI/AAC systems that may be implemented clinically. These issues include (1) identification of primary, secondary, and tertiary system features; (2) integrating BCI/AAC systems in the World Health Organization's International Classification of Functioning, Disability and Health framework; (3) implementing language-based assessment and intervention; and (4) performance measurement. A clinical demonstration project is presented as an example of research beginning to address these critical issues.

  7. Antiracist Education in Theory and Practice: A Critical Assessment

    ERIC Educational Resources Information Center

    Niemonen, Jack

    2007-01-01

    "Antiracist Education in Theory and Practice: A Critical Assessment" As a set of pedagogical, curricular, and organizational strategies, antiracist education claims to be the most progressive way today to understand race relations. Constructed from whiteness studies and the critique of colorblindness, its foundational core is located in…

  8. What Is a Good School? Critical Thoughts about Curriculum Assessments

    ERIC Educational Resources Information Center

    Zierer, Klaus

    2013-01-01

    Within the educational field, measurements such as the Programme for International Student Assessment (PISA), the Trends in International Mathematics and Science Study (TIMSS), and the Progress in International Reading Literacy Study (PIRLS) suggest we are living in a time of competition. This article takes a critical view of the modern drive to…

  9. Implementation and Critical Assessment of the Flipped Classroom Experience

    ERIC Educational Resources Information Center

    Scheg, Abigail G., Ed.

    2015-01-01

    In the past decade, traditional classroom teaching models have been transformed in order to better promote active learning and learner engagement. "Implementation and Critical Assessment of the Flipped Classroom Experience" seeks to capture the momentum of non-traditional teaching methods and provide a necessary resource for individuals…

  10. VOXMAT: Hybrid Computational Phantom for Dose Assessment

    SciTech Connect

    Akkurt, Hatice; Eckerman, Keith F

    2007-01-01

    The Oak Ridge National Laboratory (ORNL) computational phantoms have been the standard for assessing the radiation dose due to internal and external exposure over the past three decades. In these phantoms, the body surface and each organ are approximated by mathematical equations; hence, some of the organs are not necessarily realistic in shape. Over the past two decades, these phantoms have been revised and updated: some of the missing internal organs have been added and the locations of existing organs have been revised (e.g., the thyroid). In the original phantom, only three elemental compositions were used to describe all body tissues. Recently, the compositions of the organs have been updated based on ICRP-89 standards. During the past decade, phantoms based on CT scans were developed for use in dose assessment. Although their shapes are realistic, some computational challenges are noted, including increased computational times and increased memory requirements. For good spatial resolution, more than several million voxels are used to represent the human body. Moreover, when CT scans are obtained, the subject is in a supine position with the arms at the side. In some occupational exposure cases, it is necessary to evaluate the dose with the arms and legs in different positions. It would be very difficult and inefficient to reposition the voxels defining the arms and legs to simulate these exposure geometries. In this paper, a new approach to computational phantom development is presented. This approach utilizes the combination of a mathematical phantom and a voxelized phantom to represent the anatomy.

  11. Assessing knowledge change in computer science

    NASA Astrophysics Data System (ADS)

    Gradwohl Nash, Jane; Bravaco, Ralph J.; Simonson, Shai

    2006-03-01

    The purpose of this study was to assess structural knowledge change across a two-week workshop designed to provide high-school teachers with training in Java and Object Oriented Programming. Both before and after the workshop, teachers assigned relatedness ratings to pairs of key concepts regarding Java and Object Oriented Programming. Their ratings were submitted to the Pathfinder network-scaling algorithm, which uses distance estimates to generate an individual's knowledge structure representation composed of nodes that are connected by links. Results showed that significant change in teachers' knowledge structure occurred during the workshop, both in terms of individual teacher networks and their averaged networks. Moreover, these changes were significantly related to performance in the workshop. The results of this study suggest several implications for teaching and assessment in computer science.
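
    For context, a compact sketch of one common Pathfinder variant, PFNET(r = infinity, q = n - 1), in which a link between two concepts survives only if no indirect path offers a smaller maximum edge weight (the distance matrix below is an invented stand-in for relatedness ratings converted to distances):

        import numpy as np

        def pfnet(dist):
            n = dist.shape[0]
            minimax = dist.copy().astype(float)
            # Floyd-Warshall with (min, max) in place of (min, +)
            for k in range(n):
                for i in range(n):
                    for j in range(n):
                        via_k = max(minimax[i, k], minimax[k, j])
                        if via_k < minimax[i, j]:
                            minimax[i, j] = via_k
            # keep a link only if its direct distance equals the minimax distance
            return (dist <= minimax) & np.isfinite(dist) & ~np.eye(n, dtype=bool)

        d = np.array([[0, 1, 4],
                      [1, 0, 2],
                      [4, 2, 0]], dtype=float)
        print(pfnet(d))  # the 0-2 link (weight 4) is pruned: path 0-1-2 has max 2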

  12. None but Ourselves Can Free Our Minds: Critical Computational Literacy as a Pedagogy of Resistance

    ERIC Educational Resources Information Center

    Lee, Clifford H.; Soep, Elisabeth

    2016-01-01

    Critical computational literacy (CCL) is a new pedagogical and conceptual framework that combines the strengths of critical literacy and computational thinking. Through CCL, young people conceptualize, create, and disseminate digital projects that break silences, expose important truths, and challenge unjust systems, all the while building skills…

  13. Assessing Terrorist Motivations for Attacking Critical "Chemical" Infrastructure

    SciTech Connect

    Ackerman, G; Bale, J; Moran, K

    2004-12-14

    Certain types of infrastructure--critical infrastructure (CI)--play vital roles in underpinning our economy, security, and way of life. One particular type of CI--that relating to chemicals--constitutes both an important element of our nation's infrastructure and a particularly attractive set of potential targets. This is primarily because of the large quantities of toxic industrial chemicals (TICs) it employs in various operations and because of the essential economic functions it serves. This study attempts to minimize some of the ambiguities that presently impede chemical infrastructure threat assessments by providing new insight into the key motivational factors that affect terrorist organizations' propensity to attack chemical facilities. Prepared as a companion piece to the Center for Nonproliferation Studies' August 2004 study--"Assessing Terrorist Motivations for Attacking Critical Infrastructure"--it investigates three overarching research questions: (1) why do terrorists choose to attack chemical-related infrastructure over other targets; (2) what specific factors influence their target selection decisions concerning chemical facilities; and (3) which, if any, types of groups are most inclined to attack chemical infrastructure targets? The study involved a multi-pronged research design, which made use of four discrete investigative techniques to answer the above questions as comprehensively as possible. These include: (1) a review of terrorism and threat assessment literature to glean expert consensus regarding terrorist interest in targeting chemical facilities; (2) the preparation of case studies to help identify internal group factors and contextual influences that have played a significant role in leading some terrorist groups to attack chemical facilities; (3) an examination of data from the Critical Infrastructure Terrorist Incident Catalog (CrITIC) to further illuminate the nature of terrorist attacks against chemical facilities to date; and (4

  14. Antibiotic prophylaxis and reflux: critical review and assessment

    PubMed Central

    Baquerizo, Bernarda Viteri

    2014-01-01

    The use of continuous antibiotic prophylaxis (CAP) was critical in the evolution of vesicoureteral reflux (VUR) from a condition for which surgery was the standard treatment to one that is managed medically. The efficacy of antibiotic prophylaxis in the management of VUR has been challenged in recent years, and significant confusion exists as to its clinical value. This review summarizes the critical factors in the history, use, and investigation of antibiotic prophylaxis in VUR, and provides suggestions for assessing the potential clinical utility of prophylaxis. PMID:25580258

  15. Computer Assessment of Mild Cognitive Impairment

    PubMed Central

    Saxton, Judith; Morrow, Lisa; Eschman, Amy; Archer, Gretchen; Luther, James; Zuccolotto, Anthony

    2009-01-01

    Many older individuals experience cognitive decline with aging. The causes of cognitive dysfunction range from the devastating effects of Alzheimer’s disease (AD) to treatable causes of dysfunction and the normal mild forgetfulness described by many older individuals. Even mild cognitive dysfunction can impact medication adherence, impair decision making, and affect the ability to drive or work. However, primary care physicians do not routinely screen for cognitive difficulties and many older patients do not report cognitive problems. Identifying cognitive impairment at an office visit would permit earlier referral for diagnostic work-up and treatment. The Computer Assessment of Mild Cognitive Impairment (CAMCI) is a self-administered, user-friendly computer test that scores automatically and can be completed independently in a quiet space, such as a doctor’s examination room. The goal of this study was to compare the sensitivity and specificity of the CAMCI and the Mini Mental State Examination (MMSE) to identify mild cognitive impairment (MCI) in 524 nondemented individuals > 60 years old who completed a comprehensive neuropsychological and clinical assessment together with the CAMCI and MMSE. We hypothesized that the CAMCI would exhibit good sensitivity and specificity and would be superior compared with the MMSE in these measures. The results indicated that the MMSE was relatively insensitive to MCI. In contrast, the CAMCI was highly sensitive (86%) and specific (94%) for the identification of MCI in a population of community-dwelling nondemented elderly individuals. PMID:19332976
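
    The arithmetic behind sensitivity and specificity, with hypothetical confusion counts chosen only so the rates match those reported above (the paper reports the rates, not these raw counts):

        tp, fn = 86, 14   # MCI cases flagged / missed by the screen (hypothetical)
        tn, fp = 94, 6    # intact cases cleared / wrongly flagged (hypothetical)

        sensitivity = tp / (tp + fn)  # P(test positive | MCI)    -> 0.86
        specificity = tn / (tn + fp)  # P(test negative | no MCI) -> 0.94
        print(f"sensitivity={sensitivity:.2f}, specificity={specificity:.2f}")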

  16. Computational Tools to Assess Turbine Biological Performance

    SciTech Connect

    Richmond, Marshall C.; Serkowski, John A.; Rakowski, Cynthia L.; Strickler, Brad; Weisbeck, Molly; Dotson, Curtis L.

    2014-07-24

    Public Utility District No. 2 of Grant County (GCPUD) operates the Priest Rapids Dam (PRD), a hydroelectric facility on the Columbia River in Washington State. The dam contains 10 Kaplan-type turbine units that are now more than 50 years old. Plans are underway to refit these aging turbines with new runners. The Columbia River at PRD is a migratory pathway for several species of juvenile and adult salmonids, so passage of fish through the dam is a major consideration when upgrading the turbines. In this paper, a method for turbine biological performance assessment (BioPA) is demonstrated. Using this method, a suite of biological performance indicators is computed based on simulated data from a CFD model of a proposed turbine design. Each performance indicator is a measure of the probability of exposure to a certain dose of an injury mechanism. Using known relationships between the dose of an injury mechanism and frequency of injury (dose–response) from laboratory or field studies, the likelihood of fish injury for a turbine design can be computed from the performance indicator. By comparing the values of the indicators from proposed designs, the engineer can identify the more-promising alternatives. We present an application of the BioPA method for baseline risk assessment calculations for the existing Kaplan turbines at PRD that will be used as the minimum biological performance that a proposed new design must achieve.
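
    A schematic of the BioPA-style calculation described above: the simulated distribution of an injury-mechanism dose is combined with a dose-response curve to yield an expected injury probability. The log-normal dose samples and logistic curve below are invented placeholders, not PRD data:

        import numpy as np

        rng = np.random.default_rng(0)
        # stand-in for simulated exposure doses along CFD streamlines
        doses = rng.lognormal(mean=2.0, sigma=0.8, size=10_000)

        def injury_response(dose, d50=40.0, slope=0.15):
            """Hypothetical logistic dose-response: injury frequency vs. dose."""
            return 1.0 / (1.0 + np.exp(-slope * (dose - d50)))

        # performance indicator: expected injury probability over exposures
        indicator = injury_response(doses).mean()
        print(f"estimated injury probability: {indicator:.4f}")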

  17. Laptop Computer - Based Facial Recognition System Assessment

    SciTech Connect

    R. A. Cain; G. B. Singleton

    2001-03-01

    The objective of this project was to assess the performance of the leading commercial off-the-shelf (COTS) facial recognition software package when used as a laptop application. We performed the assessment to determine the system's usefulness for enrolling facial images in a database from remote locations and conducting real-time searches against a database of previously enrolled images. The assessment involved creating a database of 40 images and conducting 2 series of tests to determine the product's ability to recognize and match subject faces under varying conditions. This report describes the test results and includes a description of the factors affecting the results. After an extensive market survey, we selected Visionics' FaceIt® software package for evaluation, along with a review of the Facial Recognition Vendor Test 2000 (FRVT 2000). This test was co-sponsored by the US Department of Defense (DOD) Counterdrug Technology Development Program Office, the National Institute of Justice, and the Defense Advanced Research Projects Agency (DARPA). Administered in May-June 2000, the FRVT 2000 assessed the capabilities of facial recognition systems that were currently available for purchase on the US market. Our selection of this Visionics product does not indicate that it is the "best" facial recognition software package for all uses. It was the most appropriate package based on the specific applications and requirements for this specific application. In this assessment, the system configuration was evaluated for effectiveness in identifying individuals by searching for facial images captured from video displays against those stored in a facial image database. An additional criterion was that the system be capable of operating discretely. For this application, an operational facial recognition system would consist of one central computer hosting the master image database with multiple standalone systems configured with duplicates of the master operating in

  18. Critical thinking: assessing the risks to the future security of supply of critical metals

    NASA Astrophysics Data System (ADS)

    Gunn, Gus

    2015-04-01

    Increasing world population, the spread of prosperity across the globe, and the demands of new technologies have led to a revival of concerns about the availability of raw materials needed by society. Despite scare stories about resource depletion, physical exhaustion of minerals is considered to be unlikely. However, we do need to know which materials might be of concern so that we can develop strategies to secure adequate supplies and to mitigate the effects of supply disruption. This requirement has led to renewed interest in criticality, a term that is generally used to refer to metals and minerals of high economic importance that have a relatively high likelihood of supply disruption. The European Union (EU) developed a quantitative methodology for the assessment of criticality which led to the definition of 14 raw materials as critical to the EU economy (EC, 2010). This has succeeded in raising awareness of potential supply issues and in helping to prioritise requirements for new policies and supporting research. The EU has recently assessed a larger number of candidate materials, of which 20 are now identified as critical to the EU (EC, 2014). These include metals such as indium, mostly used in flat-screen displays, antimony for flame retardants, and cobalt for rechargeable batteries, alloys, and a host of other products. Although there is no consensus on the methodology for criticality assessments and broad analyses at this scale are inevitably imperfect, they can, nevertheless, provide early warning of supply problems. However, in order to develop more rigorous and dynamic assessments of future availability, detailed analysis of the whole life-cycle of individual metals is required to identify specific problems and develop appropriate solutions. New policies, such as the Raw Materials Initiative (2008) and the European Innovation Partnership on Raw Materials (2013), have been developed by the European Commission (EC) and are aimed at securing sustainable

  19. Geospatial decision support framework for critical infrastructure interdependency assessment

    NASA Astrophysics Data System (ADS)

    Shih, Chung Yan

    Critical infrastructures, such as telecommunications, energy, banking and finance, transportation, water systems, and emergency services, are the foundations of modern society. There is a heavy dependence on critical infrastructures at multiple levels within the supply chain of any good or service. Any disruption in the supply chain may cause profound cascading effects on other critical infrastructures. A 1997 report by the President's Commission on Critical Infrastructure Protection states that a serious interruption in freight rail service would bring the coal mining industry to a halt within approximately two weeks and that the availability of electric power could be reduced within one to two months. Therefore, this research aimed at representing and assessing the interdependencies between coal supply, transportation, and energy production. A geospatial decision support framework was proposed and applied to analyze interdependency-related disruption impacts. By utilizing a data warehousing approach, geospatial and non-geospatial data were retrieved, integrated, and analyzed based on the transportation model and geospatial disruption analysis developed in the research. The results showed that, by utilizing this framework, disruption impacts can be estimated at various levels (e.g., power plant, county, state) for preventative or emergency response efforts. The information derived from the framework can be used for data mining analysis (e.g., assessing transportation mode usage or finding alternative coal suppliers).

  20. Benchmarking Pain Assessment Rate in Critical Care Transport.

    PubMed

    Reichert, Ryan J; Gothard, M David; Schwartz, Hamilton P; Bigham, Michael T

    The purpose of this study is to determine the rate of pain assessment in pediatric and neonatal critical care transport (PNCCT). The GAMUT database was interrogated for an 18-month period, excluding programs with less than 10% pediatric or neonatal patient contacts and less than 3 months of any metric data reporting during the study period. We hypothesized that the pain assessment rate during PNCCT is superior to prehospital rates, although inferior to in-hospital rates. Sixty-two programs representing 104,445 patient contacts were analyzed. A total of 21,693 (20.8%) patients had a documented pain assessment. Subanalysis identified 17 of the 62 programs that consistently reported pain assessments. This group accounted for 24,599 patients and included 7,273 (29.6%) neonatal, 12,655 (51.5%) pediatric, and 4,664 (19.0%) adult patients. Among these programs, the benchmark rate of pain assessment was 90.0%. Our analysis shows a rate below that of emergency medical services and consistent with published hospital rates of pain assessment. Poor tracking of this metric among participating programs was noted, suggesting an opportunity to investigate the barriers to documentation and reporting of pain assessments in PNCCT and a potential quality improvement initiative.

  1. Spatial risk assessment for critical network infrastructure using sensitivity analysis

    NASA Astrophysics Data System (ADS)

    Möderl, Michael; Rauch, Wolfgang

    2011-12-01

    The presented spatial risk assessment method allows for managing critical network infrastructure in urban areas under abnormal and future conditions caused, e.g., by terrorist attacks, infrastructure deterioration, or climate change. For the spatial risk assessment, vulnerability maps for critical network infrastructure are merged with hazard maps for an interfering process. Vulnerability maps are generated using a spatial sensitivity analysis of network transport models to evaluate the performance decrease under the investigated threat scenarios. Thereby, parameters are varied according to the specific impact of a particular threat scenario. Hazard maps are generated with a geographical information system using raster data for the same threat scenario derived from structured interviews and cluster analysis of past events. The application of the spatial risk assessment is exemplified by means of a case study for a water supply system, but the principal concept is applicable likewise to other critical network infrastructure. The aim of the approach is to help decision makers in choosing zones for preventive measures.
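
    A minimal sketch of the map-merging step: cell-wise risk as the product of a vulnerability raster (from the sensitivity analysis) and a hazard raster (from interviews and past events). Both small arrays are placeholders standing in for geo-referenced raster data:

        import numpy as np

        vulnerability = np.array([[0.2, 0.9, 0.4],
                                  [0.1, 0.7, 0.8],
                                  [0.0, 0.3, 0.6]])  # performance decrease, 0..1
        hazard        = np.array([[0.5, 0.5, 0.1],
                                  [0.9, 0.2, 0.1],
                                  [0.9, 0.2, 0.3]])  # scenario likelihood, 0..1

        risk = vulnerability * hazard
        hotspots = np.argwhere(risk >= 0.4)  # candidate zones for preventive measures
        print(risk.round(2), hotspots, sep="\n")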

  2. Critical evaluation of oxygen-uptake assessment in swimming.

    PubMed

    Sousa, Ana; Figueiredo, Pedro; Pendergast, David; Kjendlie, Per-Ludvik; Vilas-Boas, João P; Fernandes, Ricardo J

    2014-03-01

    Swimming has become an important area of sport science research since the 1970s, with bioenergetic factors assuming a fundamental performance-influencing role. The purpose of this study was to conduct a critical evaluation of the literature concerning oxygen-uptake (VO2) assessment in swimming, by describing the equipment and methods used and emphasizing recent work conducted in ecological conditions. In swimming in particular, due to the technical constraints inherent to the water environment, assessment of VO2max was not accomplished until the 1960s. Later, the development of automated portable measurement devices allowed VO2max to be assessed more easily, even in ecological swimming conditions, but few studies have been conducted in swimming-pool conditions with portable breath-by-breath telemetric systems. An inverse relationship exists between the velocity corresponding to VO2max and the time a swimmer can sustain this velocity. The energy cost of swimming varies according to its association with velocity variability. As, in the end, the supply of oxygen (whose limitation may be due to central factors, such as O2 delivery and transport to the working muscles, or peripheral factors, such as O2 diffusion and utilization in the muscles) is one of the critical factors that determine swimming performance, VO2 kinetics and its maximal values are critical for understanding swimmers' behavior in competition and for developing efficient training programs.

  3. Critical fault patterns determination in fault-tolerant computer systems

    NASA Technical Reports Server (NTRS)

    Mccluskey, E. J.; Losq, J.

    1978-01-01

    The method proposed tries to enumerate all the critical fault-patterns (successive occurrences of failures) without analyzing every single possible fault. The conditions for the system to be operating in a given mode can be expressed in terms of the static states. Thus, one can find all the system states that correspond to a given critical mode of operation. The next step consists in analyzing the fault-detection mechanisms, the diagnosis algorithm and the process of switch control. From them, one can find all the possible system configurations that can result from a failure occurrence. Thus, one can list all the characteristics, with respect to detection, diagnosis, and switch control, that failures must have to constitute critical fault-patterns. Such an enumeration of the critical fault-patterns can be directly used to evaluate the overall system tolerance to failures. Present research is focused on how to efficiently make use of these system-level characteristics to enumerate all the failures that verify these characteristics.

  4. Computer-Supported Development of Critical Reasoning Skills

    ERIC Educational Resources Information Center

    Spurrett, David

    2005-01-01

    Thinking skills are important and education is expected to develop them. Empirical results suggest that formal education makes a modest and largely indirect difference. This paper will describe the early stages of an ongoing curriculum initiative in the teaching of critical reasoning skills in the philosophy curriculum on the Howard College Campus…

  5. Quality assessment of clinical computed tomography

    NASA Astrophysics Data System (ADS)

    Berndt, Dorothea; Luckow, Marlen; Lambrecht, J. Thomas; Beckmann, Felix; Müller, Bert

    2008-08-01

    Three-dimensional images are vital for diagnosis in dentistry and cranio-maxillofacial surgery. Artifacts caused by highly absorbing components such as metallic implants, however, limit the value of the tomograms. The dominant artifacts observed are blowout and streaks. By investigating the artifacts generated by metallic implants in a pig jaw, the data acquisition for dental patients can be optimized in a quantitative manner. A freshly explanted pig jaw, including the related soft tissues, served as a model system. Images were recorded varying the accelerating voltage and the beam current. The comparison with multi-slice and micro computed tomography (CT) helps to validate the approach with the dental CT system (3D-Accuitomo, Morita, Japan). The data are rigidly registered to comparatively quantify their quality. The micro CT data provide a reasonable standard for quantitative data assessment of clinical CT.

  6. Critical evaluation of soil contamination assessment methods for trace metals.

    PubMed

    Desaules, André

    2012-06-01

    Correctly distinguishing between natural and anthropogenic trace metal contents in soils is crucial for assessing soil contamination. A series of assessment methods is critically outlined. All methods rely on assumed reference values for the natural content. Depending on the adopted reference values, which are based on various statistical and geochemical procedures, there is a considerable range and discrepancy in the assessed soil contamination results, as shown by the five methods applied to three weakly contaminated sites. This is a serious indication of their high methodological specificity and bias. No method with off-site reference values could identify any soil contamination for the investigated trace metals (Pb, Cu, Zn, Cd, Ni), while the specific and sensitive on-site reference methods did so for some sites. Soil profile balances are considered to produce the most plausible site-specific results, provided the numerous assumptions are realistic and the required data reliable. This highlights the dilemma between model and data uncertainty. Data uncertainty, however, has so far been a neglected issue in soil contamination assessment, while model uncertainty depends largely on realistic site-specific assumptions about pristine natural trace metal contents. Hence, the appropriate assessment of soil contamination is a subtle optimization exercise of model versus data uncertainty and specification versus generalization. There is no general and accurate reference method, and soil contamination assessment is still rather fuzzy, with negative implications for the reliability of subsequent risk assessments.
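    One of the off-site reference-value methods contrasted above can be sketched as an enrichment-factor computation, in which trace-metal contents are normalized to a conservative reference element and compared with background ratios; the background values, sample values, and factor-of-2 screening threshold below are invented for illustration:

        # Enrichment-factor screening against off-site background ratios.
        # All concentrations (mg/kg) and the threshold are assumed values.
        BACKGROUND = {"Pb": 17.0, "Cu": 25.0, "Zn": 70.0, "Al": 71000.0}

        def enrichment_factor(sample: dict, metal: str, ref: str = "Al") -> float:
            return (sample[metal] / sample[ref]) / (BACKGROUND[metal] / BACKGROUND[ref])

        sample = {"Pb": 35.0, "Cu": 30.0, "Zn": 95.0, "Al": 65000.0}
        for metal in ("Pb", "Cu", "Zn"):
            ef = enrichment_factor(sample, metal)
            verdict = "possible anthropogenic input" if ef > 2.0 else "within natural range"
            print(f"{metal}: EF = {ef:.2f} ({verdict})")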

  7. Fool's Gold: A Critical Look at Computers in Childhood.

    ERIC Educational Resources Information Center

    Cordes, Colleen, Ed.; Miller, Edward, Ed.

    Noting that computers are reshaping children's lives in profound and unexpected ways, this report examines potential harms and promised benefits of these changes, focusing on early childhood and elementary education. Chapter 1 argues that popular attempts to hurry children intellectually are at odds with the natural pace of human development.…

  8. Exact computation of the critical exponents of the jamming transition

    NASA Astrophysics Data System (ADS)

    Zamponi, Francesco

    2015-03-01

    The jamming transition marks the emergence of rigidity in a system of amorphous and athermal grains. It is characterized by a divergent correlation length of the force-force correlation and non-trivial critical exponents that are independent of spatial dimension, suggesting that a mean field theory can correctly predict their values. I will discuss a mean field approach to the problem based on the exact solution of the hard sphere model in infinite dimension. An unexpected analogy with the Sherrington-Kirkpatrick spin glass model emerges in the solution: as in the SK model, the glassy states turn out to be marginally stable, and are described by a Parisi equation. Marginal stability has a deep impact on the critical properties of the jamming transition and allows one to obtain analytic predictions for the critical exponents. The predictions are consistent with a recently developed scaling theory of the jamming transition, and with numerical simulations. Finally, I will briefly discuss some possible extensions of this approach to other open issues in the theory of glasses.

  9. Breadth-Oriented Outcomes Assessment in Computer Science.

    ERIC Educational Resources Information Center

    Cordes, David; And Others

    Little work has been done regarding the overall assessment of quality of computer science graduates at the undergraduate level. This paper reports on a pilot study at the University of Alabama of a prototype computer science outcomes assessment designed to evaluate the breadth of knowledge of computer science seniors. The instrument evaluated two…

  10. Assessing Computer Knowledge among College Students.

    ERIC Educational Resources Information Center

    Parrish, Allen; And Others

    This paper reports on a study involving the administration of two examinations that were designed to evaluate student knowledge in several areas of computing. The tests were given both to computer science majors and to those enrolled in computer science classes from other majors. They sought to discover whether computer science majors demonstrated…

  11. Ultrasound to assess diaphragmatic function in the critically ill—a critical perspective

    PubMed Central

    Haaksma, Mark; Tuinman, Pieter Roel

    2017-01-01

    Ultrasound of the diaphragm in critically ill patients has become a diagnostic technique of growing interest among clinicians and scientists. Its advantages include wide availability and non-invasiveness, and the examination can be performed at low cost after relatively short training. It is used to estimate muscle mass by measuring muscle thickness and to diagnose weakness by assessing diaphragm movement during unassisted breathing. Thickening of the muscle during inspiration has been used to quantify force generation. The enthusiasm that surrounds this topic is shared by many clinicians, and we agree that ultrasound is a valuable tool to screen for diaphragm dysfunction in intensive care unit (ICU) patients. However, in our opinion many more studies are required to validate ultrasound as a tool to quantify breathing effort. More sophisticated ultrasound techniques, such as speckle-tracking imaging, are promising approaches to evaluate respiratory muscle function in patients, including the critically ill. PMID:28361079

  12. A CAD (Classroom Assessment Design) of a Computer Programming Course

    ERIC Educational Resources Information Center

    Hawi, Nazir S.

    2012-01-01

    This paper presents a CAD (classroom assessment design) of an entry-level undergraduate computer programming course "Computer Programming I". CAD has been the product of a long experience in teaching computer programming courses including teaching "Computer Programming I" 22 times. Each semester, CAD is evaluated and modified…

  13. Clinical significance of computed tomography assessment for third molar surgery

    PubMed Central

    Nakamori, Kenji; Tomihara, Kei; Noguchi, Makoto

    2014-01-01

    Surgical extraction of the third molar is the most commonly performed surgical procedure in the clinical practice of oral surgery. Third molar surgery is warranted when there is inadequate space for eruption, malpositioning, or risk for cyst or odontogenic tumor formation. Preoperative assessment should include a detailed morphologic analysis of the third molar and its relationship to adjacent structures and surrounding tissues. Due to developments in medical engineering technology, computed tomography (CT) now plays a critical role in providing the clear images required for adequate assessment prior to third molar surgery. Removal of the maxillary third molar is associated with a risk for maxillary sinus perforation, whereas removal of the mandibular third molar can put patients at risk for a neurosensory deficit from damage to the lingual nerve or inferior alveolar nerve. Multiple factors, including demographic, anatomic, and treatment-related factors, influence the incidence of nerve injury during or following removal of the third molar. CT assessment of the third molar prior to surgery can identify some of these risk factors, such as the absence of cortication between the mandibular third molar and the inferior alveolar canal, prior to surgery to reduce the risk for nerve damage. This topic highlight presents an overview of the clinical significance of CT assessment in third molar surgery. PMID:25071882

  14. Literary and Electronic Hypertext: Borges, Criticism, Literary Research, and the Computer.

    ERIC Educational Resources Information Center

    Davison, Ned J.

    1991-01-01

    Examines what "hypertext" means to literary criticism on the one hand (i.e., intertextuality) and computing on the other, to determine how the two concepts may serve each other in a mutually productive way. (GLR)

  15. 24 CFR 901.105 - Computing assessment score.

    Code of Federal Regulations, 2010 CFR

    2010-04-01

    24 CFR Part 901 (Housing and Urban Development, Public Housing Management Assessment Program), § 901.105 Computing assessment score. (a)...

  16. Reverse engineering of metabolic networks, a critical assessment.

    PubMed

    Hendrickx, Diana M; Hendriks, Margriet M W B; Eilers, Paul H C; Smilde, Age K; Hoefsloot, Huub C J

    2011-02-01

    Inferring metabolic networks from metabolite concentration data is a central topic in systems biology. Mathematical techniques to extract information about the network from data have been proposed in the literature. This paper presents a critical assessment of the feasibility of reverse engineering of metabolic networks, illustrated with a selection of methods. Appropriate data are simulated to study the performance of four representative methods. An overview of sampling and measurement methods currently in use for generating time-resolved metabolomics data is given and contrasted with the needs of the discussed reverse engineering methods. The results of this assessment show that if full inference of a real-world metabolic network is the goal, there is a large discrepancy between the requirements of reverse engineering of metabolic networks and contemporary measurement practice. Recommendations for improved time-resolved experimental designs are given.

  17. Planning the Unplanned Experiment: Assessing the Efficacy of Standards for Safety Critical Software

    NASA Technical Reports Server (NTRS)

    Graydon, Patrick J.; Holloway, C. Michael

    2015-01-01

    We need well-founded means of determining whether software is fit for use in safety-critical applications. While software in industries such as aviation has an excellent safety record, the fact that software flaws have contributed to deaths illustrates the need for justifiably high confidence in software. It is often argued that software is fit for safety-critical use because it conforms to a standard for software in safety-critical systems. But little is known about whether such standards 'work.' Reliance upon a standard without knowing whether it works is an experiment; without collecting data to assess the standard, this experiment is unplanned. This paper reports on a workshop intended to explore how standards could practicably be assessed. Planning the Unplanned Experiment: Assessing the Efficacy of Standards for Safety Critical Software (AESSCS) was held on 13 May 2014 in conjunction with the European Dependable Computing Conference (EDCC). We summarize and elaborate on the workshop's discussion of the topic, including both the presented positions and the dialogue that ensued.

  18. Criticality assessment of the Defense Waste Processing Facility

    SciTech Connect

    Ha, B.C.; Williamson, T.G.; Clemmons, J.S.; Chandler, M.C.

    1996-08-01

    Assessment of the nuclear criticality potential of the S-Area Defense Waste Processing Facility (DWPF) is required to ensure the safe processing of radioactive waste for final disposal. At the Savannah River Site (SRS), high-level radioactive wastes are stored as caustic slurries. During storage, the wastes separate into a supernate layer and a sludge layer. The radionuclides from the sludge and supernate will be immobilized into borosilicate glass for storage and eventual disposal. The DWPF will initially immobilize sludge only, with simulated non-radioactive Precipitate Hydrolysis Aqueous (PHA) product. This paper demonstrates that criticality poses only a negligible risk in the DWPF process because of the characteristics of the waste and of the DWPF process. The waste contains low concentrations of fissile material and many elements that act as neutron poisons. Also, the DWPF process chemistry does not lead to separation or accumulation of fissile materials. Experiments showed that the DWPF can process all the high-level radioactive wastes currently stored at SRS with negligible criticality risk under normal and abnormal/process-upset operation.

  19. 78 FR 29375 - Protected Critical Infrastructure Information (PCII) Office Self-Assessment Questionnaire

    Federal Register 2010, 2011, 2012, 2013, 2014

    2013-05-20

    ... SECURITY Protected Critical Infrastructure Information (PCII) Office Self- Assessment Questionnaire AGENCY... Information Collection Division (IICD), Protected Critical Infrastructure Information (PCII) Program will...: The PCII Program was created by Congress under the Critical Infrastructure Information Act of...

  20. 77 FR 68795 - Protected Critical Infrastructure Information (PCII) Office Self-Assessment Questionnaire

    Federal Register 2010, 2011, 2012, 2013, 2014

    2012-11-16

    ... SECURITY Protected Critical Infrastructure Information (PCII) Office Self- Assessment Questionnaire AGENCY... Information Collection Division (IICD), Protected Critical Infrastructure Information (PCII) Program will... PCII Program under the Critical Infrastructure Information Act of 2002 for DHS to encourage...

  1. A critical assessment of vector control for dengue prevention.

    PubMed

    Achee, Nicole L; Gould, Fred; Perkins, T Alex; Reiner, Robert C; Morrison, Amy C; Ritchie, Scott A; Gubler, Duane J; Teyssou, Remy; Scott, Thomas W

    2015-05-01

    Recently, the Vaccines to Vaccinate (v2V) initiative was reconfigured into the Partnership for Dengue Control (PDC), a multi-sponsored and independent initiative. This redirection is consistent with the growing consensus among the dengue-prevention community that no single intervention will be sufficient to control dengue disease. The PDC's expectation is that when an effective dengue virus (DENV) vaccine is commercially available, the public health community will continue to rely on vector control because the two strategies complement and enhance one another. Although the concept of integrated intervention for dengue prevention is gaining increasingly broader acceptance, to date, no consensus has been reached regarding the details of how and what combination of approaches can be most effectively implemented to manage disease. To fill that gap, the PDC proposed a three-step process: (1) critically assessing current vector control tools and those under development, (2) outlining a research agenda for determining, in a definitive way, which existing tools work best, and (3) determining how to combine the best vector control options, as systematically defined in this process, with DENV vaccines. To address the first step, the PDC convened a meeting of international experts during November 2013 in Washington, DC, to critically assess existing vector control interventions and tools under development. This report summarizes those deliberations.

  2. A Critical Assessment of Vector Control for Dengue Prevention

    PubMed Central

    Achee, Nicole L.; Gould, Fred; Perkins, T. Alex; Reiner, Robert C.; Morrison, Amy C.; Ritchie, Scott A.; Gubler, Duane J.; Teyssou, Remy; Scott, Thomas W.

    2015-01-01

    Recently, the Vaccines to Vaccinate (v2V) initiative was reconfigured into the Partnership for Dengue Control (PDC), a multi-sponsored and independent initiative. This redirection is consistent with the growing consensus among the dengue-prevention community that no single intervention will be sufficient to control dengue disease. The PDC's expectation is that when an effective dengue virus (DENV) vaccine is commercially available, the public health community will continue to rely on vector control because the two strategies complement and enhance one another. Although the concept of integrated intervention for dengue prevention is gaining increasingly broader acceptance, to date, no consensus has been reached regarding the details of how and what combination of approaches can be most effectively implemented to manage disease. To fill that gap, the PDC proposed a three-step process: (1) critically assessing current vector control tools and those under development, (2) outlining a research agenda for determining, in a definitive way, which existing tools work best, and (3) determining how to combine the best vector control options, as systematically defined in this process, with DENV vaccines. To address the first step, the PDC convened a meeting of international experts during November 2013 in Washington, DC, to critically assess existing vector control interventions and tools under development. This report summarizes those deliberations. PMID:25951103

  3. Assessing monoclonal antibody product quality attribute criticality through clinical studies.

    PubMed

    Goetze, Andrew M; Schenauer, Matthew R; Flynn, Gregory C

    2010-01-01

    Recombinant therapeutic proteins, including antibodies, contain a variety of chemical and physical modifications. Great effort is expended during process and formulation development on controlling and minimizing this heterogeneity, even though it may not affect safety or efficacy and, therefore, may not need to be controlled. Many of the chemical conversions also occur in vivo, and knowledge about these alterations can be applied to assessing their potential impact on the characteristics and biological activity of therapeutic proteins. Other attributes may affect drug clearance and thereby alter drug efficacy. In this review article, we describe attribute studies conducted using clinical samples and how the information gleaned from them is applied to attribute criticality assessment. In general, how fast attributes change in vivo compared with the rate of mAb elimination is the key parameter used in these evaluations. An attribute whose levels change more rapidly may have greater potential to affect safety or efficacy and thereby reach the status of a Critical Quality Attribute (CQA) that should be controlled during production and storage, but the effect will depend on whether the compositional changes are due to chemical conversion or to differential clearance.
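    The rate comparison described above can be given a simple quantitative form: treating attribute conversion and mAb elimination as competing first-order processes, the fraction of material that converts before clearance is k_conv / (k_conv + k_elim). The sketch below uses that relation; the half-lives and the 50% screening cut-off are illustrative assumptions, not values from the review:

        import math

        # Competing first-order processes: of the circulating mAb, a fraction
        # k_conv / (k_conv + k_elim) undergoes the attribute conversion before
        # being eliminated. All rate constants below are invented.
        def fraction_converted(k_conv: float, k_elim: float) -> float:
            return k_conv / (k_conv + k_elim)

        k_elim = math.log(2) / (21 * 24)  # assumed mAb half-life ~21 days, per hour
        for half_life_h, label in ((48, "fast-converting attribute"),
                                   (2000, "slow-converting attribute")):
            k_conv = math.log(2) / half_life_h
            f = fraction_converted(k_conv, k_elim)
            print(f"{label}: {100 * f:.0f}% converts in vivo -> "
                  f"{'candidate CQA' if f > 0.5 else 'lower criticality'}")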

  4. Limited transthoracic echocardiography assessment in anaesthesia and critical care.

    PubMed

    Faris, John G; Veltman, Michael G; Royse, Colin F

    2009-09-01

    The use of echocardiography in anaesthesia and critical care started with transoesophageal echocardiography, whereas transthoracic echocardiography was largely the domain of the cardiologist. In recent times, there has been a change in focus towards transthoracic echocardiography owing to the development of small and portable, yet high-fidelity, echocardiography machines. The cost has reduced, thereby increasing the availability of equipment. A parallel development has been the concept of limited transthoracic echocardiography that can be performed by practitioners with limited experience. The basis of these examinations is to provide the practising clinician with immediate information to help guide management with a focus on haemodynamic evaluation, and limited structural (valve) assessment to categorise whether there is a valve disorder that may or may not cause haemodynamic instability. The limited examination is therefore goal directed. A number of named examinations exist which differ in their scope and views. All of these require a limited knowledge base, and are designed for the clinician to recognise patterns consistent with haemodynamic or anatomical abnormalities. They range from very limited two-dimensional assessments of ventricular function to more complex (yet presently limited) studies such as HEART (haemodynamic echocardiography assessment in real time) scan, which is designed to provide haemodynamic state, as well as basic valvular and pericardial assessment. It is suitable for goal-directed examination in the operating theatre, emergency department or intensive care unit (ICU) and for preoperative screening.

  5. Collected Wisdom: Assessment Tools for Computer Science Programs

    ERIC Educational Resources Information Center

    Sanders, Kathryn E.; McCartney, Robert

    2004-01-01

    In this paper, we investigate the question of what assessment tools are being used in practice by United States computing programs and what the faculty doing the assessment think of the tools they use. After presenting some background with regard to the design, implementation, and use of assessment, with particular attention to assessment tools,…

  6. Risk assessment for physical and cyber attacks on critical infrastructures.

    SciTech Connect

    Smith, Bryan J.; Sholander, Peter E.; Phelan, James M.; Wyss, Gregory Dane; Varnado, G. Bruce; Depoy, Jennifer Mae

    2005-08-01

    Assessing the risk of malevolent attacks against large-scale critical infrastructures requires modifications to existing methodologies. Existing risk assessment methodologies consider physical security and cyber security separately. As such, they do not accurately model attacks that involve defeating both physical protection and cyber protection elements (e.g., hackers turning off alarm systems prior to forced entry). This paper presents a risk assessment methodology that accounts for both physical and cyber security. It also preserves the traditional security paradigm of detect, delay, and respond, while accounting for the possibility that a facility may be able to recover from or mitigate the results of a successful attack before serious consequences occur. The methodology provides a means for ranking those assets most at risk from malevolent attacks. Because the methodology is automated, the analyst can also play 'what if' with mitigation measures to gain a better understanding of how best to expend resources towards securing the facilities. It is simple enough to be applied to large infrastructure facilities without developing highly complicated models. Finally, it is applicable to facilities with extensive security as well as those that are less well protected.
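    A toy version of such a ranking (an illustration of the blended physical/cyber idea, not the published methodology) might score each asset by the probability that an attacker defeats both protection layers, weighted by consequence; all names and numbers below are invented:

        # Blended physical/cyber risk ranking over a set of hypothetical assets.
        assets = [
            # (name, P(defeat physical), P(defeat cyber), consequence in $M)
            ("substation",     0.30, 0.20, 50.0),
            ("control center", 0.10, 0.40, 200.0),
            ("pump station",   0.50, 0.05, 20.0),
        ]

        def risk(p_phys: float, p_cyber: float, consequence: float) -> float:
            # A blended attack must defeat both layers (e.g. disable alarms, then enter).
            return p_phys * p_cyber * consequence

        for name, pp, pc, c in sorted(assets, key=lambda a: risk(*a[1:]), reverse=True):
            print(f"{name}: expected loss = {risk(pp, pc, c):.1f} $M")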

  7. Computer Viruses: An Assessment of Student Perceptions.

    ERIC Educational Resources Information Center

    Jones, Mary C.; Arnett, Kirk P.

    1992-01-01

    A majority of 213 college business students surveyed had knowledge of computer viruses; one-fourth had been exposed to them. Many believed that computer professionals are responsible for prevention and cure. Educators should make students aware of multiple sources of infection, the breadth and extent of possible damage, and viral detection and…

  8. The Acceptance and Use of Computer Based Assessment

    ERIC Educational Resources Information Center

    Terzis, Vasileios; Economides, Anastasios A.

    2011-01-01

    The effective development of a computer based assessment (CBA) depends on students' acceptance. The purpose of this study is to build a model that demonstrates the constructs that affect students' behavioral intention to use a CBA. The proposed model, Computer Based Assessment Acceptance Model (CBAAM) is based on previous models of technology…

  9. Data on NAEP 2011 writing assessment prior computer use.

    PubMed

    Tate, Tamara P; Warschauer, Mark; Abedi, Jamal

    2016-09-01

    This data article contains information based on the 2011 National Assessment of Educational Progress in Writing Restricted-Use Data, available from the National Center for Education Statistics (NCES Pub. No. 2014476). https://nces.ed.gov/nationsreportcard/researchcenter/datatools.aspx. The data include the statistical relationships between survey reports of teachers and students regarding prior use of computers and other technology and writing achievement levels on the 2011 computer-based NAEP writing assessment. This data article accompanies "The Effects of Prior Computer Use on Computer-Based Writing: The 2011 NAEP Writing Assessment" [1].

  10. Report on the 2011 Critical Assessment of Function Annotation (CAFA) meeting

    SciTech Connect

    Friedberg, Iddo

    2015-01-21

    The Critical Assessment of Function Annotation meeting was held July 14-15, 2011 at the Austria Conference Center in Vienna, Austria, with 73 registered delegates. We thank the DOE for this award; it helped us organize and support the AFP 2011 meeting as a special interest group (SIG) meeting associated with the ISMB 2011 conference, held in Vienna, Austria, in July 2011. The AFP SIG was held on July 15-16, 2011 (immediately preceding the conference). The meeting consisted of two components, the first being a series of talks (invited and contributed) and discussion sessions dedicated to protein function research, with an emphasis on the theory and practice of computational methods utilized in functional annotation. The second component provided a large-scale assessment of computational methods through participation in the Critical Assessment of Functional Annotation (CAFA). The meeting was exciting and, based on feedback, quite successful. The schedule differed only slightly from the one proposed, due to two cancellations: Dr. Olga Troyanskaya canceled and we invited Dr. David Jones instead; similarly, instead of Dr. Richard Roberts, Dr. Simon Kasif gave the closing keynote. The remaining invited speakers were Janet Thornton (EBI) and Amos Bairoch (University of Geneva).

  11. Critical factors in assessing risk from exposure to nasal carcinogens.

    PubMed

    Bogdanffy, M S; Mathison, B H; Kuykendall, J R; Harman, A E

    1997-10-31

    Anatomical, physiological, biochemical and molecular factors that contribute to chemical-induced nasal carcinogenesis are either largely divergent between test species and humans, or we know very little of them. These factors, let alone the uncertainty associated with our knowledge gap, present a risk assessor with the formidable task of making judgments about risks to human health from exposure to chemicals that have been identified in rodent studies to be nasal carcinogens. This paper summarizes some of the critical attributes of the hazard identification and dose-response aspects of risk assessments for nasal carcinogens that must be accounted for by risk assessors in order to make informed decisions. Data on two example compounds, dimethyl sulfate and hexamethylphosphoramide, are discussed to illustrate the diversity of information that can be used to develop informed hypotheses about mode of action and decisions on appropriate dosimeters for interspecies extrapolation. Default approaches to interspecies dosimetry extrapolation are described briefly and are followed by a discussion of a generalized physiologically based pharmacokinetic model that, unlike default approaches, is flexible and capable of incorporating many of the critical species-specific factors. Recent advancements in interspecies nasal dosimetry modeling are remarkable. However, it is concluded that without the development of research programs aimed at understanding carcinogenic susceptibility factors in human and rodent nasal tissues, development of plausible modes of action will lag behind the advancements made in dosimetry modeling.

  12. High Performance Computing Facility Operational Assessment 2015: Oak Ridge Leadership Computing Facility

    SciTech Connect

    Barker, Ashley D.; Bernholdt, David E.; Bland, Arthur S.; Gary, Jeff D.; Hack, James J.; McNally, Stephen T.; Rogers, James H.; Smith, Brian E.; Straatsma, T. P.; Sukumar, Sreenivas Rangan; Thach, Kevin G.; Tichenor, Suzy; Vazhkudai, Sudharshan S.; Wells, Jack C.

    2016-03-01

    Oak Ridge National Laboratory’s (ORNL’s) Leadership Computing Facility (OLCF) continues to surpass its operational target goals: supporting users; delivering fast, reliable systems; creating innovative solutions for high-performance computing (HPC) needs; and managing risks, safety, and security aspects associated with operating one of the most powerful computers in the world. The results can be seen in the cutting-edge science delivered by users and the praise from the research community. Calendar year (CY) 2015 was filled with outstanding operational results and accomplishments: a very high rating from users on overall satisfaction that ties the highest-ever mark set in CY 2014; the greatest number of core-hours delivered to research projects; the largest percentage of capability usage since the OLCF began tracking the metric in 2009; and success in delivering on the allocation of 60, 30, and 10% of core hours offered for the INCITE (Innovative and Novel Computational Impact on Theory and Experiment), ALCC (Advanced Scientific Computing Research Leadership Computing Challenge), and Director’s Discretionary programs, respectively. These accomplishments, coupled with the extremely high utilization rate, represent the fulfillment of the promise of Titan: maximum use by maximum-size simulations. The impact of all of these successes and more is reflected in the accomplishments of OLCF users, with publications this year in notable journals Nature, Nature Materials, Nature Chemistry, Nature Physics, Nature Climate Change, ACS Nano, Journal of the American Chemical Society, and Physical Review Letters, as well as many others. The achievements included in the 2015 OLCF Operational Assessment Report reflect first-ever or largest simulations in their communities; for example, Titan enabled the highest-resolution Cybershake map for Southern California, helping engineers in Los Angeles and the surrounding region design and begin building improved critical infrastructure.

  13. A Practical and Theoretical Approach to Assessing Computer Attitudes: The Computer Attitudes Measure (CAM).

    ERIC Educational Resources Information Center

    Kay, Robin H.

    1989-01-01

    Describes study conducted at the University of Toronto that assessed the attitudes of student teachers toward computers by using a multicomponent model, the Computer Attitude Measure (CAM). Cognitive, affective, and behavioral attitudes are examined, and correlations of computer literacy, experience, and internal locus of control are discussed.…

  14. Critical Assessment of the Evidence for Striped Nanoparticles

    PubMed Central

    Stirling, Julian; Lekkas, Ioannis; Sweetman, Adam; Djuranovic, Predrag; Guo, Quanmin; Pauw, Brian; Granwehr, Josef; Lévy, Raphaël; Moriarty, Philip

    2014-01-01

    There is now a significant body of literature which reports that stripes form in the ligand shell of suitably functionalised Au nanoparticles. This stripe morphology has been proposed to strongly affect the physicochemical and biochemical properties of the particles. We critique the published evidence for striped nanoparticles in detail, with a particular focus on the interpretation of scanning tunnelling microscopy (STM) data (as this is the only technique which ostensibly provides direct evidence for the presence of stripes). Through a combination of an exhaustive re-analysis of the original data, in addition to new experimental measurements of a simple control sample comprising entirely unfunctionalised particles, we show that all of the STM evidence for striped nanoparticles published to date can instead be explained by a combination of well-known instrumental artefacts, or by issues with data acquisition/analysis protocols. We also critically re-examine the evidence for the presence of ligand stripes which has been claimed to have been found from transmission electron microscopy, nuclear magnetic resonance spectroscopy, small angle neutron scattering experiments, and computer simulations. Although these data can indeed be interpreted in terms of stripe formation, we show that the reported results can alternatively be explained as arising from a combination of instrumental artefacts and inadequate data analysis techniques. PMID:25402426

  15. Critical assessment of the evidence for striped nanoparticles.

    PubMed

    Stirling, Julian; Lekkas, Ioannis; Sweetman, Adam; Djuranovic, Predrag; Guo, Quanmin; Pauw, Brian; Granwehr, Josef; Lévy, Raphaël; Moriarty, Philip

    2014-01-01

    There is now a significant body of literature which reports that stripes form in the ligand shell of suitably functionalised Au nanoparticles. This stripe morphology has been proposed to strongly affect the physicochemical and biochemical properties of the particles. We critique the published evidence for striped nanoparticles in detail, with a particular focus on the interpretation of scanning tunnelling microscopy (STM) data (as this is the only technique which ostensibly provides direct evidence for the presence of stripes). Through a combination of an exhaustive re-analysis of the original data, in addition to new experimental measurements of a simple control sample comprising entirely unfunctionalised particles, we show that all of the STM evidence for striped nanoparticles published to date can instead be explained by a combination of well-known instrumental artefacts, or by issues with data acquisition/analysis protocols. We also critically re-examine the evidence for the presence of ligand stripes which has been claimed to have been found from transmission electron microscopy, nuclear magnetic resonance spectroscopy, small angle neutron scattering experiments, and computer simulations. Although these data can indeed be interpreted in terms of stripe formation, we show that the reported results can alternatively be explained as arising from a combination of instrumental artefacts and inadequate data analysis techniques.

  16. Critical Zone Experimental Design to Assess Soil Processes and Function

    NASA Astrophysics Data System (ADS)

    Banwart, Steve

    2010-05-01

    Through unsustainable land use practices, mining, deforestation, urbanisation and degradation by industrial pollution, soil losses are now hypothesized to be much faster (100 times or more) than soil formation, with the consequence that soil has become a finite resource. The crucial challenge for the international research community is to understand the rates of processes that dictate soil mass stocks and their function within Earth's Critical Zone (CZ). The CZ is the environment where soils are formed, degrade and provide their essential ecosystem services. Key among these ecosystem services are food and fibre production, filtering, buffering and transformation of water, nutrients and contaminants, storage of carbon and maintaining biological habitat and genetic diversity. We have initiated a new research project to address the priority research areas identified in the European Union Soil Thematic Strategy and to contribute to the development of a global network of Critical Zone Observatories (CZO) committed to soil research. Our hypothesis is that the combined physical-chemical-biological structure of soil can be assessed from first principles and the resulting soil functions can be quantified in process models that couple the formation and loss of soil stocks with descriptions of biodiversity and nutrient dynamics. The objectives of this research are to 1. Describe from first principles how soil structure influences processes and functions of soils, 2. Establish 4 European Critical Zone Observatories to link with established CZOs, 3. Develop a CZ Integrated Model of soil processes and function, 4. Create a GIS-based modelling framework to assess soil threats and mitigation at EU scale, 5. Quantify impacts of changing land use, climate and biodiversity on soil function and its value, and 6. Form with international partners a global network of CZOs for soil research and deliver a programme of public outreach and research transfer on soil sustainability.

  17. Experiences of Using Automated Assessment in Computer Science Courses

    ERIC Educational Resources Information Center

    English, John; English, Tammy

    2015-01-01

    In this paper we discuss the use of automated assessment in a variety of computer science courses that have been taught at Israel Academic College by the authors. The course assignments were assessed entirely automatically using Checkpoint, a web-based automated assessment framework. The assignments all used free-text questions (where the students…

  18. Does Computer-Aided Formative Assessment Improve Learning Outcomes?

    ERIC Educational Resources Information Center

    Hannah, John; James, Alex; Williams, Phillipa

    2014-01-01

    Two first-year engineering mathematics courses used computer-aided assessment (CAA) to provide students with opportunities for formative assessment via a series of weekly quizzes. Most students used the assessment until they achieved very high (>90%) quiz scores. Although there is a positive correlation between these quiz marks and the final…

  19. Assessment of Computer Aids in Shipyards

    DTIC Science & Technology

    1993-04-01

    read NIAM diagrams with very little training. It really takes more to write good NIAM. Just like anyone can listen to music and appreciate it. Almost any...technology five years from now. And just like the example of the antigravity machine, five years from now the computer business will look so different

  20. Empirically Assessing the Importance of Computer Skills

    ERIC Educational Resources Information Center

    Baker, William M.

    2013-01-01

    This research determines which computer skills are important for entry-level accountants, and whether some skills are more important than others. Students participated before and after internships in public accounting. Longitudinal analysis is also provided; responses from 2001 are compared to those from 2008-2009. Responses are also compared to…

  1. Application of queueing models to multiprogrammed computer systems operating in a time-critical environment

    NASA Technical Reports Server (NTRS)

    Eckhardt, D. E., Jr.

    1979-01-01

    A model of a central processor (CPU) that services background applications in the presence of time-critical activity is presented. The CPU is viewed as an M/M/1 queueing system subject to periodic interrupts by a deterministic, time-critical process. The Laplace transform of the distribution of service times for the background applications is developed. The use of state-of-the-art queueing models for studying the background processing capability of time-critical computer systems is discussed, and the results of a model validation study that supports this application of queueing models are presented.
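    A small time-stepped simulation conveys the flavor of the model above; the paper derives analytic results via Laplace transforms, so this sketch is only an illustration, and all rates and window lengths are assumptions. Background jobs arrive Poisson and are served FIFO, but the CPU is preempted for a fixed slice of every period by the time-critical task:

        import random

        # Toy simulation: M/M/1-style background workload on a CPU preempted by
        # a deterministic time-critical task for BUSY s out of every PERIOD s.
        random.seed(1)
        DT, HORIZON = 0.001, 2000.0
        LAM, MU = 0.5, 1.0       # background arrival and service rates (per s)
        PERIOD, BUSY = 1.0, 0.2  # interrupt window at the start of each period

        queue, delays = [], []
        t, remaining = 0.0, 0.0  # remaining service time of the job in progress
        while t < HORIZON:
            if random.random() < LAM * DT:           # Bernoulli approx. of Poisson
                queue.append(t)
            if (t % PERIOD) >= BUSY:                 # CPU free outside the window
                if remaining <= 0 and queue:
                    delays.append(t - queue.pop(0))  # waiting time until service
                    remaining = random.expovariate(MU)
                remaining -= DT
            t += DT

        print(f"mean background waiting time ~ {sum(delays) / len(delays):.2f} s "
              f"(offered utilization with interrupts: {LAM / MU / (1 - BUSY / PERIOD):.2f})")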

  2. Assessing Critical Thinking in Higher Education: The HEIghten™ Approach and Preliminary Validity Evidence

    ERIC Educational Resources Information Center

    Liu, Ou Lydia; Mao, Liyang; Frankel, Lois; Xu, Jun

    2016-01-01

    Critical thinking is a learning outcome highly valued by higher education institutions and the workforce. The Educational Testing Service (ETS) has designed a next generation assessment, the HEIghten™ critical thinking assessment, to measure students' critical thinking skills in analytical and synthetic dimensions. This paper introduces the…

  3. The Effects of Using a Critical Thinking Scoring Rubric to Assess Undergraduate Students' Reading Skills

    ERIC Educational Resources Information Center

    Leist, Cathy W.; Woolwine, Mark A.; Bays, Cathy L.

    2012-01-01

    The purpose of this study was to investigate the use of a critical thinking rubric as an assessment of reading achievement for students enrolled in a reading intervention course. A reading prompt and scoring rubric, based on Richard Paul and Linda Elder's critical thinking framework, were created to assess critical reading in an intervention…

  4. Manipulating Critical Variables: A Framework for Improving the Impact of Computers in the School Environment.

    ERIC Educational Resources Information Center

    Collis, Betty

    Previous work assessing the effectiveness of computers in education has gone no further than acknowledging a network of interconnected variables (comprising a system) which contribute to computer impact and describing its component parts. An impact systems model developed by Glasman and Bibiaminov (1981) has been adapted to facilitate measurement…

  5. Computer assessment of atherosclerosis from angiographic images

    NASA Technical Reports Server (NTRS)

    Selzer, R. H.; Blankenhorn, D. H.; Brooks, S. H.; Crawford, D. W.; Cashin, W. L.

    1982-01-01

    A computer method for detection and quantification of atherosclerosis from angiograms has been developed and used to measure lesion change in human clinical trials. The technique involves tracking the vessel edges and measuring individual lesions as well as the overall irregularity of the arterial image. Application of the technique to conventional arterial-injection femoral and coronary angiograms is outlined, and an experimental study to extend the technique to the analysis of intravenous angiograms of the carotid and coronary arteries is described.

  6. [Computer-assisted surgery: assessment and perspectives].

    PubMed

    Demongeot, J

    The hospital of the future will be faced with the major problem of managing and optimizing the use of images provided by numerous sources examining both anatomy (MRI, CT scan...) and function (gamma camera, PET scan...). One of the first to benefit from such rationalization will be the surgeon. After studying the results of the physical examination, the laboratory reports, and the medical imaging, the surgeon will decide on the best curative measures and the best surgical route before operating. He thus needs a computer to assist him in integrating the multi-modal information available for his patient: in particular the imaging, with automatic integration and visualisation in synoptic mode (perception step); display of the trajectories of possible access routes to the target organ and memorization of the chosen route (decision step); and the actual operation, performed either with a laser or a manual tool, or with robot assistance under human control (action step). This close cooperation between surgery and computers is called computer-assisted surgery. A few examples of current uses and future perspectives of this new field of surgery are presented.

  7. Redefining second modernity for East Asia: a critical assessment.

    PubMed

    Han, Sang-Jin; Shim, Young-Hee

    2010-09-01

    The aim of this paper is to critically assess the extent to which the concept of second modernity and reflexive modernization proposed by Beck and Grande is relevant to East Asia. Concepts such as driving forces, human agency, objective-structural versus cultural-discursive dimensions, radicalizing versus deficient aspects of modernity, and push versus pull factors are used to clarify the basic conditions of this historical transformation. Utilizing these conceptual schemes, this paper advances the following central claims: 1) Second modernity and reflexive modernization, as a global trend, affects East Asia as deeply as it does the West, especially when we see this as a structurally conditioned historical transformation; 2) Global risks, as a driving force of second modernity, are more relevant in East Asia because, as a result of the side-effects of the rush to development, East Asian countries face complex risks of far greater intensity than in the West; 3) The action-mediated pull factor of second-modern transformation in East Asia, expressed through the cultural-discursive articulation of collective desire and aspiration, differs significantly from the West. Consequently, the East Asian pathways to individualization display distinctive characteristics despite the common structural background where push factors operate; 4) East Asia also differs from the West in terms of the normative vision anchored in second modernity; 5) Nevertheless, concrete pathways to second modernity within East Asia differ from one country to another.

  8. Marine proteomics: a critical assessment of an emerging technology.

    PubMed

    Slattery, Marc; Ankisetty, Sridevi; Corrales, Jone; Marsh-Hunkin, K Erica; Gochfeld, Deborah J; Willett, Kristine L; Rimoldi, John M

    2012-10-26

    The application of proteomics to marine sciences has increased in recent years because the proteome represents the interface between genotypic and phenotypic variability and, thus, corresponds to the broadest possible biomarker for eco-physiological responses and adaptations. Likewise, proteomics can provide important functional information regarding biosynthetic pathways, as well as insights into mechanism of action, of novel marine natural products. The goal of this review is to (1) explore the application of proteomics methodologies to marine systems, (2) assess the technical approaches that have been used, and (3) evaluate the pros and cons of this proteomic research, with the intent of providing a critical analysis of its future roles in marine sciences. To date, proteomics techniques have been utilized to investigate marine microbe, plant, invertebrate, and vertebrate physiology, developmental biology, seafood safety, susceptibility to disease, and responses to environmental change. However, marine proteomics studies often suffer from poor experimental design, sample processing/optimization difficulties, and data analysis/interpretation issues. Moreover, a major limitation is the lack of available annotated genomes and proteomes for most marine organisms, including several "model species". Even with these challenges in mind, there is no doubt that marine proteomics is a rapidly expanding and powerful integrative molecular research tool from which our knowledge of the marine environment, and the natural products from this resource, will be significantly expanded.

  9. A critical assessment of wind tunnel results for the NACA 0012 airfoil

    NASA Technical Reports Server (NTRS)

    Mccroskey, W. J.

    1987-01-01

    A large body of experimental results, obtained in more than 40 wind tunnels on a single, well-known two-dimensional configuration, has been critically examined and correlated. An assessment of some of the possible sources of error has been made for each facility, and data which are suspect have been identified. It was found that no single experiment provided a complete set of reliable data, although one investigation stands out as superior in many respects. However, from the aggregate of data the representative properties of the NACA 0012 airfoil can be identified with reasonable confidence over wide ranges of Mach number, Reynolds number, and angles of attack. This synthesized information can now be used to assess and validate existing and future wind tunnel results and to evaluate advanced Computational Fluid Dynamics codes.

  10. Comparative assessment of life cycle assessment methods used for personal computers.

    PubMed

    Yao, Marissa A; Higgs, Tim G; Cullen, Michael J; Stewart, Scott; Brady, Todd A

    2010-10-01

    This article begins with a summary of findings from commonly cited life cycle assessments (LCA) of Information and Communication Technology (ICT) products. While differing conclusions regarding environmental impact are expected across product segments (mobile phones, personal computers, servers, etc.), significant variation and conflicting conclusions are observed even within product segments such as the desktop personal computer (PC). This lack of consistent conclusions and accurate data limits the effectiveness of LCA in influencing policy and product design decisions. From 1997 to 2010, the majority of published studies focused on the PC concluded that the use phase contributes most to the life cycle energy demand of PC products, with a handful of studies suggesting that the manufacturing phase has the largest impact. The purpose of this article is to critically review these studies in order to analyze sources of uncertainty, including factors that extend beyond data quality to the models and assumptions used. The findings suggest that existing methods combining process-based LCA data with product price data and remaining-value adjustments are not reliable for conducting life cycle assessments of PC products. Recommendations are provided to assist future LCA work.
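    To see why such conclusions hinge on assumptions, consider a back-of-envelope comparison of use-phase and manufacturing energy; every figure below is invented for illustration, and the article's point is precisely that published values for these quantities vary widely enough to flip the conclusion:

        # Back-of-envelope life-cycle energy split for a desktop PC.
        E_MANUF_MJ = 3000.0  # assumed embodied (manufacturing) energy
        power_w, hours_per_year, years = 120.0, 2000.0, 4.0  # assumed use profile

        e_use_mj = power_w * hours_per_year * years * 3600.0 / 1e6  # J -> MJ
        share_use = e_use_mj / (e_use_mj + E_MANUF_MJ)
        print(f"use phase: {e_use_mj:.0f} MJ ({100 * share_use:.0f}% of life-cycle energy)")
        print(f"manufacturing: {E_MANUF_MJ:.0f} MJ ({100 * (1 - share_use):.0f}%)")

    With these numbers the use phase accounts for roughly 3456 MJ, about 54% of the total; halving the service life or the annual usage hours would hand the majority share to manufacturing, which is exactly the sensitivity the review highlights.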

  11. Two Configurations for Accessing Classroom Computers: Differential Impact on Students' Critical Reflections and Their Empowerment

    ERIC Educational Resources Information Center

    Solhaug, T.

    2009-01-01

    The context of this article is the new technological environment and the struggle to use meaningful teaching practices in Norwegian schools. Students' critical reflections in two different technological learning environments in six upper secondary schools are compared. Three of these schools offer Internet-connected computers in special computer…

  12. Interaction and Critical Inquiry in Asynchronous Computer-Mediated Conferencing: A Research Agenda

    ERIC Educational Resources Information Center

    Hopkins, Joseph; Gibson, Will; Ros i. Sole, Cristina; Savvides, Nicola; Starkey, Hugh

    2008-01-01

    This paper reviews research on learner and tutor interaction in asynchronous computer-mediated (ACM) conferences used in distance learning. The authors note claims made for the potential of ACM conferences to promote higher-order critical inquiry and the social construction of knowledge, and argue that there is a general lack of evidence regarding…

  13. The Development of Computer-Based Piagetian Assessment Instruments.

    ERIC Educational Resources Information Center

    Barman, Charles R.

    1986-01-01

    Described are the development and validation of two computer-based Piagetian assessment instruments, designed to assist teachers in identifying cognitive reasoning patterns. Implications for teachers are presented. (Author/MT)

  14. A Model-based Framework for Risk Assessment in Human-Computer Controlled Systems

    NASA Technical Reports Server (NTRS)

    Hatanaka, Iwao

    2000-01-01

    The rapid growth of computer technology and innovation has played a significant role in the rise of computer automation of human tasks in modern production systems across all industries. Although the rationale for automation has been to eliminate "human error" or to relieve humans from manual repetitive tasks, various computer-related hazards and accidents have emerged as a direct result of increased system complexity attributed to computer automation. The risk assessment techniques utilized for electromechanical systems are not suitable for today's software-intensive systems or complex human-computer controlled systems. This thesis will propose a new systemic model-based framework for analyzing risk in safety-critical systems where both computers and humans are controlling safety-critical functions. A new systems accident model will be developed based upon modern systems theory and human cognitive processes to better characterize system accidents, the role of human operators, and the influence of software in its direct control of significant system functions. Better risk assessments will then be achievable through the application of this new framework to complex human-computer controlled systems.

  15. Computer-Based Assessment of Problem Solving.

    ERIC Educational Resources Information Center

    Baker, E. L.; Mayer, R. E.

    1999-01-01

    Examines the components required to assess student problem solving in technology environments. Discusses the purposes of testing, provides an example demonstrating the difference between retention and transfer, defines and analyzes problem solving, and explores techniques and standards for measuring the quality of student understanding. Contains…

  16. Assessing Knowledge Change in Computer Science

    ERIC Educational Resources Information Center

    Nash, Jane Gradwohl; Bravaco, Ralph J.; Simonson, Shai

    2006-01-01

    The purpose of this study was to assess structural knowledge change across a two-week workshop designed to provide high-school teachers with training in Java and Object Oriented Programming. Both before and after the workshop, teachers assigned relatedness ratings to pairs of key concepts regarding Java and Object Oriented Programming. Their…

  17. Assessing Existing Item Bank Depth for Computer Adaptive Testing.

    ERIC Educational Resources Information Center

    Bergstrom, Betty A.; Stahl, John A.

    This paper reports a method for assessing the adequacy of existing item banks for computer adaptive testing. The method takes into account content specifications, test length, and stopping rules, and can be used to determine if an existing item bank is adequate to administer a computer adaptive test efficiently across differing levels of examinee…
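    As an illustration of how such a depth check might be operationalized (a sketch under assumed specifics, not the paper's method), one can simulate adaptive administrations against a Rasch-calibrated bank and count the points at which no well-targeted item remains; the bank size, test length, and 0.5-logit tolerance are assumptions:

        import random

        # Simulated bank-depth check: for each ability level, repeatedly pick
        # the unused item closest in difficulty and flag picks outside tolerance.
        random.seed(7)
        bank = [random.gauss(0.0, 1.0) for _ in range(200)]  # difficulties (logits)

        def poorly_targeted(theta: float, test_len: int = 30, tol: float = 0.5) -> int:
            available, gaps = list(bank), 0
            for _ in range(test_len):
                item = min(available, key=lambda b: abs(b - theta))
                if abs(item - theta) > tol:
                    gaps += 1  # no well-targeted item left in the bank
                available.remove(item)
            return gaps

        for theta in (-2.0, 0.0, 2.0):
            print(f"theta = {theta:+.1f}: {poorly_targeted(theta)} of 30 items poorly targeted")

    In this toy run the bank comfortably covers average examinees but thins out at extreme ability levels, which is the kind of shortfall a depth assessment of this sort is meant to expose before a bank is deployed adaptively.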

  18. Using Computer-Assisted Assessment Heuristics for Usability Evaluations

    ERIC Educational Resources Information Center

    Sim, Gavin; Read, Janet C.

    2016-01-01

    Teaching practices within educational institutions have evolved through the increased adoption of technology to deliver the curriculum and the use of computers for assessment purposes. For educational technologists, there is a vast array of commercial computer applications available for the delivery of objective tests, and in some instances,…

  19. International Computer and Information Literacy Study: Assessment Framework

    ERIC Educational Resources Information Center

    Fraillon, Julian; Schulz, Wolfram; Ainley, John

    2013-01-01

    The purpose of the International Computer and Information Literacy Study 2013 (ICILS 2013) is to investigate, in a range of countries, the ways in which young people are developing "computer and information literacy" (CIL) to support their capacity to participate in the digital age. To achieve this aim, the study will assess student…

  20. Geography Students Assess Their Learning Using Computer-Marked Tests.

    ERIC Educational Resources Information Center

    Hogg, Jim

    1997-01-01

    Reports on a pilot study designed to assess the potential of computer-marked tests for allowing students to monitor their learning. Students' answers to multiple choice tests were fed into a computer that provided a full analysis of their strengths and weaknesses. Students responded favorably to the feedback. (MJP)

  1. Overview of Risk Mitigation for Safety-Critical Computer-Based Systems

    NASA Technical Reports Server (NTRS)

    Torres-Pomales, Wilfredo

    2015-01-01

    This report presents a high-level overview of a general strategy to mitigate the risks from threats to safety-critical computer-based systems. In this context, a safety threat is a process or phenomenon that can cause operational safety hazards in the form of computational system failures. This report is intended to provide insight into the safety-risk mitigation problem and the characteristics of potential solutions. The limitations of the general risk mitigation strategy are discussed and some options to overcome these limitations are provided. This work is part of an ongoing effort to enable well-founded assurance of safety-related properties of complex safety-critical computer-based aircraft systems by developing an effective capability to model and reason about the safety implications of system requirements and design.

  2. Developing a Critical Lens among Preservice Teachers while Working within Mandated Performance-Based Assessment Systems

    ERIC Educational Resources Information Center

    Moss, Glenda

    2008-01-01

    This article addresses the dilemma of promoting critical pedagogy within portfolio assessment, which has been implemented in many teacher education programs to meet state and national mandates for performance-based assessment. It explores how one teacher educator works to move portfolio assessment to a level of critical self-reflection that…

  3. Assessing Critical Thinking Performance of Postgraduate Students in Threaded Discussions

    ERIC Educational Resources Information Center

    Tan, Cheng Lee; Ng, Lee Luan

    2014-01-01

    Critical thinking has increasingly been seen as one of the important attributes where human capital is concerned and in line with this recognition, the tertiary educational institutions worldwide are putting more effort into designing courses that produce university leavers who are critical thinkers. This study aims to investigate the critical…

  4. Modelling Critical Thinking through Learning-Oriented Assessment

    ERIC Educational Resources Information Center

    Lombard, B. J. J.

    2008-01-01

    One of the cornerstones peculiar to the outcomes-based approach adopted by the South African education and training sector is the so-called "critical outcomes". Included in one of these outcomes is the ability to think critically. Although this outcome articulates well with the cognitive domain of holistic development, it also gives rise…

  5. What Do They Know? A Strategy for Assessing Critical Literacy

    ERIC Educational Resources Information Center

    Morrissette, Rhonda

    2007-01-01

    In this article, the author describes how difficult it is to know how critically literate her students are in the adult senior high school in which she is a teacher-librarian. She assumes that many would have gaps in their learning, including gaps in information and critical literacy skills, and that they were likely to approach all online…

  6. Developing Critical Thinking Skills: Assessing the Effectiveness of Workbook Exercises

    ERIC Educational Resources Information Center

    Wallace, Elise D.; Jefferson, Renee N.

    2015-01-01

    To address the challenge of developing critical thinking skills in college students, this empirical study examines the effectiveness of cognitive exercises in developing those skills. The study uses Critical Thinking: Building the Basics by Walter, Knudsvig, and Smith (2003). This workbook is specifically designed to exercise and develop critical…

  7. Perceptions of University Students regarding Computer Assisted Assessment

    ERIC Educational Resources Information Center

    Jamil, Mubashrah

    2012-01-01

    Computer assisted assessment (CAA) is a common technique of assessment in higher educational institutions in Western countries, but a relatively new concept for students and teachers in Pakistan. It was therefore interesting to investigate students' perceptions about CAA practices from different universities of Pakistan. Information was collected…

  8. Test Review: Computer-Based Reading Assessment Instrument (CRAI).

    ERIC Educational Resources Information Center

    Blanchard, Jay S.

    1987-01-01

    Evaluates the Computer-Based Reading Assessment Instrument (CRAI) as a test of reading proficiency. Notes the strengths of the CRAI, including its use as a quick assessment of silent reading comprehension level, as well as problems with readability, content-specific word lists, and the lack of scoring features. (JC)

  9. Portfolios Plus: A Critical Guide to Alternative Assessment.

    ERIC Educational Resources Information Center

    Mabry, Linda

    This book explains some basic assumptions that underlie different assessment systems, some connections between education and assessment, and some assessment options that have gone unrecognized. The discussion serves as a guide to designing a custom assessment program individualized to fit the students, school, and community. Part 2 contains…

  10. Critical Assessment Issues in Work-Integrated Learning

    ERIC Educational Resources Information Center

    Ferns, Sonia; Zegwaard, Karsten E.

    2014-01-01

    Assessment has long been a contentious issue in work-integrated learning (WIL) and cooperative education. Despite assessment being central to the integrity and accountability of a university and long-standing theories around best practice in assessment, enacting quality assessment practices has proven to be more difficult. Authors in this special…

  11. Is Model-Based Development a Favorable Approach for Complex and Safety-Critical Computer Systems on Commercial Aircraft?

    NASA Technical Reports Server (NTRS)

    Torres-Pomales, Wilfredo

    2014-01-01

    A system is safety-critical if its failure can endanger human life or cause significant damage to property or the environment. State-of-the-art computer systems on commercial aircraft are highly complex, software-intensive, functionally integrated, and network-centric systems of systems. Ensuring that such systems are safe and comply with existing safety regulations is costly and time-consuming as the level of rigor in the development process, especially the validation and verification activities, is determined by considerations of system complexity and safety criticality. A significant degree of care and deep insight into the operational principles of these systems is required to ensure adequate coverage of all design implications relevant to system safety. Model-based development methodologies, methods, tools, and techniques facilitate collaboration and enable the use of common design artifacts among groups dealing with different aspects of the development of a system. This paper examines the application of model-based development to complex and safety-critical aircraft computer systems. Benefits and detriments are identified and an overall assessment of the approach is given.

  12. Assessment of Computer Literacy of Nurses in Lesotho.

    PubMed

    Mugomeri, Eltony; Chatanga, Peter; Maibvise, Charles; Masitha, Matseliso

    2016-11-01

    Health systems worldwide are moving toward the use of information technology to improve healthcare delivery. However, this requires basic computer skills. This study assessed the computer literacy of nurses in Lesotho using a cross-sectional quantitative approach. A structured questionnaire covering 32 standardized computer skills was distributed to 290 randomly selected nurses in Maseru District. Univariate and multivariate logistic regression analyses in Stata 13 were performed to identify factors associated with inadequate computer skills. Overall, 177 (61%) nurses scored below 16 of the 32 skills assessed. Finding hyperlinks on Web pages (63%), using advanced search parameters (60.2%), and downloading new software (60.1%) were the tasks that challenged the highest proportions of nurses. Age, sex, year of obtaining the latest qualification, computer experience, and work experience were significantly (P < .05) associated with inadequate computer skills in univariate analysis. In multivariate analyses, however, sex (P = .001), year of obtaining the latest qualification (P = .011), and computer experience (P < .001) emerged as significant factors. The majority of nurses in Lesotho have inadequate computer skills, and this is significantly associated with many years having passed since the latest qualification, being female, and lack of exposure to computers. These factors should be considered when planning training curricula for nurses in Lesotho.
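
    As a concrete illustration of the univariate/multivariate logistic regression reported above, the sketch below fits inadequate-skills status against sex, years since the latest qualification, and computer experience using Python/statsmodels. The data frame, variable names, and values are invented for illustration; they are not the study's data, and the original analysis was performed in Stata 13.

        # Hypothetical miniature of the study's multivariate logistic regression.
        import numpy as np
        import pandas as pd
        import statsmodels.formula.api as smf

        df = pd.DataFrame({
            "inadequate": [1, 0, 1, 0, 1, 0, 1, 0, 1, 0, 1, 0, 1, 0, 1, 0],
            "female": [1, 0, 1, 1, 0, 0, 1, 1, 0, 0, 1, 1, 0, 0, 1, 1],
            "years_since_qual": [12, 3, 4, 9, 15, 2, 11, 7, 8, 6, 14, 3, 5, 10, 13, 2],
            "computer_experience": [0, 1, 1, 0, 0, 1, 0, 1, 1, 0, 0, 1, 0, 1, 1, 0],
        })

        model = smf.logit(
            "inadequate ~ female + years_since_qual + computer_experience", data=df
        ).fit(disp=False)
        print(model.summary())       # coefficients with P-values, as in the study
        print(np.exp(model.params))  # odds ratios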

  13. OECD/NEA expert group on uncertainty analysis for criticality safety assessment: Results of benchmark on sensitivity calculation (phase III)

    SciTech Connect

    Ivanova, T.; Laville, C.; Dyrda, J.; Mennerdahl, D.; Golovko, Y.; Raskach, K.; Tsiboulia, A.; Lee, G. S.; Woo, S. W.; Bidaud, A.; Sabouri, P.; Bledsoe, K.; Rearden, B.; Gulliford, J.; Michel-Sendis, F.

    2012-07-01

    The sensitivities of the k_eff eigenvalue to neutron cross sections have become commonly used in similarity studies and as part of the validation algorithm for criticality safety assessments. To test calculations of the sensitivity coefficients, a benchmark study (Phase III) has been established by the OECD-NEA/WPNCS/EG UACSA (Expert Group on Uncertainty Analysis for Criticality Safety Assessment). This paper presents some sensitivity results generated by the benchmark participants using various computational tools based upon different computational methods: SCALE/TSUNAMI-3D and -1D, MONK, APOLLO2-MORET 5, DRAGON-SUSD3D and MMKKENO. The study demonstrates the performance of the tools. It also illustrates how model simplifications impact the sensitivity results and demonstrates the importance of 'implicit' (self-shielding) sensitivities. This work has been a useful step towards verification of existing and newly developed sensitivity analysis methods. (authors)
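
    To make the benchmarked quantity concrete: a relative sensitivity coefficient is S = (sigma/k) * dk/dsigma, which can be estimated by brute force from two perturbed k_eff calculations. In the sketch below, run_keff is a toy analytic stand-in for a criticality solver; the benchmark participants computed such coefficients with perturbation/adjoint machinery in the codes listed above, not by this naive direct approach.

        # Central-difference estimate of a k_eff sensitivity coefficient.
        def run_keff(sigma_f, sigma_a=0.012, nu=2.43):
            """Toy infinite-medium model: k = nu * sigma_f / sigma_a."""
            return nu * sigma_f / sigma_a

        def sensitivity(sigma, rel_step=0.01):
            k0 = run_keff(sigma)
            k_plus = run_keff(sigma * (1 + rel_step))
            k_minus = run_keff(sigma * (1 - rel_step))
            dk_dsigma = (k_plus - k_minus) / (2 * sigma * rel_step)
            return (sigma / k0) * dk_dsigma

        print(sensitivity(0.0041))  # ~1.0, since k is linear in sigma_f here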

  14. Review of Estelle and LOTOS with respect to critical computer applications

    NASA Technical Reports Server (NTRS)

    Bown, Rodney L.

    1991-01-01

    Man rated NASA space vehicles seem to represent a set of ultimate critical computer applications. These applications require a high degree of security, integrity, and safety. A variety of formal and/or precise modeling techniques are becoming available for the designer of critical systems. The design phase of the software engineering life cycle includes the modification of non-development components. A review of the Estelle and LOTOS formal description languages is presented. Details of the languages and a set of references are provided. The languages were used to formally describe some of the Open System Interconnect (OSI) protocols.

  15. Critical phenomena in communication/computation networks with various topologies and suboptimal to optimal resource allocation

    NASA Astrophysics Data System (ADS)

    Cogoni, Marco; Busonera, Giovanni; Anedda, Paolo; Zanetti, Gianluigi

    2015-01-01

    We generalize previous studies on critical phenomena in communication networks [1,2] by adding computational capabilities to the nodes. In our model, a set of tasks with random origin, destination and computational structure is distributed on a computational network, modeled as a graph. By varying the temperature of a Metropolis Monte Carlo, we explore the global latency for an optimal to suboptimal resource assignment at a given time instant. By computing the two-point correlation function for the local overload, we study the behavior of the correlation distance (both for links and nodes) while approaching the congested phase: a transition from a peaked to a spread g(r) is seen above a critical (Monte Carlo) temperature Tc. The average latency trend of the system is predicted by averaging over several network traffic realizations while maintaining spatially detailed information for each node: a sharp decrease in performance is found above Tc independently of the workload. The globally optimized computational resource allocation and network routing define a baseline for a future comparison of the transition behavior with existing routing strategies [3,4] for different network topologies.
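
    A schematic of the Metropolis Monte Carlo exploration described above, reduced to its core: tasks are randomly reassigned to nodes, and moves that increase a global latency function are accepted only with probability exp(-delta/T). The latency model, network sizes, and temperature schedule below are invented for illustration and are not the paper's model.

        # Minimal Metropolis Monte Carlo over task-to-node assignments.
        import math, random

        random.seed(1)
        n_nodes, n_tasks = 20, 60
        capacity = 5.0

        def global_latency(assign):
            # Latency grows sharply as a node's load approaches its capacity.
            total = 0.0
            for node in range(n_nodes):
                load = float(assign.count(node))
                total += load / max(capacity - load, 0.1)
            return total

        assign = [random.randrange(n_nodes) for _ in range(n_tasks)]
        for T in [2.0, 1.0, 0.5, 0.1]:          # cool toward the optimal assignment
            for _ in range(2000):
                t = random.randrange(n_tasks)
                old = assign[t]
                before = global_latency(assign)
                assign[t] = random.randrange(n_nodes)
                delta = global_latency(assign) - before
                if delta > 0 and random.random() >= math.exp(-delta / T):
                    assign[t] = old             # reject the uphill move
            print(f"T={T}: global latency = {global_latency(assign):.2f}")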

  16. Void Fraction and Critical Power Assessment of CORETRAN-01/VIPRE-02

    SciTech Connect

    Aounallah, Yacine

    2004-02-15

    CORETRAN-01 is the Electric Power Research Institute core analysis computer program that couples the neutronic code ARROTTA to the thermal-hydraulic code VIPRE-02 to achieve an integrated three-dimensional representation of the core for both steady-state and transient applications. The thermal-hydraulic module VIPRE-02, the two-fluid version of the one-fluid code VIPRE-01, has been the object of relatively few assessment studies, and the present work seeks to fill this gap. Priority has been given to the assessment of the void fraction prediction due to the importance of the void feedback on the core power generation. The assessment data are experimental void fractions obtained from X- and gamma-ray attenuation techniques applied at the assembly-averaged as well as the subchannel level for both steady-state and transient conditions. These experiments are part of the NUPEC (Japan) program, in which full-scale boiling water reactor (BWR) assemblies of different types, including assemblies with part-length rods, and pressurized water reactor subassemblies were tested at nominal reactor operating conditions as well as over a range of flow rates and pressures. Generally, the code performance ranged from adequate to good, except for configurations exhibiting a strong gradient in the power-to-flow ratio. Critical power predictions have also been assessed and code limitations identified, based on measurements on full-scale BWR 8 x 8 and high-burnup assemblies operated over a range of thermal-hydraulic conditions.

  17. Does Computer-Based Motor Skill Assessment Training Transfer to Live Assessing?

    ERIC Educational Resources Information Center

    Kelly, Luke E.; Taliaferro, Andrea; Krause, Jennifer

    2012-01-01

    Developing competency in motor skill assessment has been identified as a critical need in physical educator preparation. We conducted this study to evaluate (a) the effectiveness of a web-based instructional program--Motor Skill Assessment Program (MSAP)--for developing assessment competency, and specifically (b) whether competency developed by…

  18. An Assessment of Post-Professional Athletic Training Students' Critical Thinking Skills and Dispositions

    ERIC Educational Resources Information Center

    Walter, Jessica Marie

    2013-01-01

    The need for outcome measures in critical thinking skills and dispositions for post-professional athletic training programs (PPATPs) is significant. It has been suggested that athletic trainers who are competent and disposed towards thinking critically will be successful in the profession. The purpose of this study is to assess critical thinking…

  19. Critical Thinking and Political Participation: Development and Assessment of a Causal Model.

    ERIC Educational Resources Information Center

    Guyton, Edith M.

    1988-01-01

    This study assessed a model of the relationship between critical thinking and political participation. Findings indicated that critical thinking has indirect positive effects on orientations toward political participation, that critical thinking positively affects personal control, political efficacy, and democratic attitude, and that personal…

  20. Criticism or praise? The impact of verbal versus text-only computer feedback on social presence, intrinsic motivation, and recall.

    PubMed

    Bracken, Cheryl Campanella; Jeffres, Leo W; Neuendorf, Kimberly A

    2004-06-01

    The Computers Are Social Actors (CASA) paradigm asserts that human computer users interact socially with computers, and has provided extensive evidence that this is the case. In this experiment (n = 134), participants received either praise or criticism from a computer. Independent variables were the direction of feedback (praise or criticism) and the voice channel (verbal or text-only). Dependent variables, measured via a computer-based questionnaire, were recall, perceived ability, intrinsic motivation, and perceptions of the computer as a social entity. Results demonstrate that participants reacted to computers as predicted by interpersonal communication research, with participants who received text-only criticism reporting higher levels of intrinsic motivation, perceived ability, and recall. Additionally, the computer was seen as more intelligent. Implications for theory and application are discussed.

  1. Using student writing assignments to assess critical thinking skills: a holistic approach.

    PubMed

    Niedringhaus, L K

    2001-04-01

    This work offers an example of one school's holistic approach to the evaluation of critical thinking by using student writing assignments. Faculty developed tools to assess achievement of critical thinking competencies, such as analysis, synthesis, insight, reflection, open mindedness, and depth, breadth, and appropriateness of clinical interventions. Faculty created a model for the development of program-specific critical thinking competencies, selected appropriate writing assignments that demonstrate critical thinking, and implemented a holistic assessment plan for data collection and analysis. Holistic assessment involves the identification of shared values and practices, and the use of concepts and language important to nursing.

  2. Workplace Educators' Interpretations of Their Assessment Practices: A View through a Critical Practice Lens

    ERIC Educational Resources Information Center

    Trede, Franziska; Smith, Megan

    2014-01-01

    In this paper, we examine workplace educators' interpretations of their assessment practices. We draw on a critical practice lens to conceptualise assessment practice as a social, relational and situated practice that becomes critical through critique and emancipation. We conducted semi-structured interviews followed by roundtable discussions with…

  3. The Halpern Critical Thinking Assessment and Real-World Outcomes: Cross-National Applications

    ERIC Educational Resources Information Center

    Butler, Heather A.; Dwyer, Christopher P.; Hogan, Michael J.; Franco, Amanda; Rivas, Silvia F.; Saiz, Carlos; Almeida, Leandro S.

    2012-01-01

    The Halpern Critical Thinking Assessment (HCTA) is a reliable measure of critical thinking that has been validated with numerous qualitatively different samples and measures of academic success (Halpern, 2010a). This paper presents several cross-national applications of the assessment, and recent work to expand the validation of the HCTA with…

  4. Using a Client Memo to Assess Critical Thinking of Finance Majors

    ERIC Educational Resources Information Center

    Carrithers, David; Bean, John C.

    2008-01-01

    This article describes a holistic, discourse-based method for assessing the critical thinking skills of undergraduate senior-level finance majors. Rejecting a psychometric assessment approach in which component features of critical thinking are disaggregated, this study is based on a holistic scoring of student memos. Students were asked to…

  5. Moving beyond Assessment to Improving Students' Critical Thinking Skills: A Model for Implementing Change

    ERIC Educational Resources Information Center

    Haynes, Ada; Lisic, Elizabeth; Goltz, Michele; Stein, Barry; Harris, Kevin

    2016-01-01

    This research examines how the use of the CAT (Critical thinking Assessment Test) and involvement in CAT-Apps (CAT Applications within the discipline) training can serve as an important part of a faculty development model that assists faculty in the assessment of students' critical thinking skills and in the development of these skills within…

  6. Risk Assessment Methodology for Protecting Our Critical Physical Infrastructures

    SciTech Connect

    BIRINGER,BETTY E.; DANNEELS,JEFFREY J.

    2000-12-13

    Critical infrastructures are central to our national defense and our economic well-being, but many are taken for granted. Presidential Decision Directive (PDD) 63 highlights the importance of eight of our critical infrastructures and outlines a plan for action. Greatly enhanced physical security systems will be required to protect these national assets from new and emerging threats. Sandia National Laboratories has been the lead laboratory for the Department of Energy (DOE) in developing and deploying physical security systems for the past twenty-five years. Many of the tools, processes, and systems employed in the protection of high consequence facilities can be adapted to the civilian infrastructure.

  7. Validation of a computer based system for assessing dietary intake.

    PubMed Central

    Levine, J A; Madden, A M; Morgan, M Y

    1987-01-01

    Dietary intake was assessed in 50 hospital patients by using a dietary history method and a computer-based system for data collection, with standard food tables to calculate the composition of nutrients. The results were compared with those from a weighed assessment that was calculated by using both food tables and manufacturers' food analyses. The use of the food tables overestimated mean (SEM) individual nutrient intakes by between 2.5% (1.5%) and 15.5% (3.0%). The mean errors associated with the dietary history assessment varied from -23% (7.8%) for fat intake to +21.4% (8.5%) for carbohydrate intake. Overall, 30% of the assessments of total nutrient intakes that were calculated using this method were within -20% to +20% of actual values; 18% were within -10% to +10%. The mean errors associated with the computer-based assessment varied from -1.0% (4.3%) for carbohydrate intake to +8.5% (3.4%) for protein intake. Overall, 56% of the assessments of total nutrient intakes were within -20% to +20% of actual intakes; 31% were within -10% to +10%. The computer-based system provides an accurate, reproducible, convenient, and inexpensive method for assessing dietary intake. PMID:3115455
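
    The error statistics quoted above are straightforward to reproduce for any paired data set. The sketch below computes the mean signed percentage error (with its SEM) and the shares of assessments falling within +/-20% and +/-10% of the weighed values; the numbers are invented for illustration and are not the study's data.

        # Percentage-error summary for paired estimated vs. weighed intakes.
        import statistics

        weighed = [210, 180, 250, 160, 300, 220, 190, 270]     # actual intake (g)
        estimated = [195, 200, 240, 175, 260, 245, 180, 300]   # computed estimate

        errors = [100 * (e - w) / w for e, w in zip(estimated, weighed)]
        mean = statistics.mean(errors)
        sem = statistics.stdev(errors) / len(errors) ** 0.5
        within20 = sum(abs(x) <= 20 for x in errors) / len(errors)
        within10 = sum(abs(x) <= 10 for x in errors) / len(errors)
        print(f"mean error {mean:+.1f}% (SEM {sem:.1f}%), "
              f"{within20:.0%} within +/-20%, {within10:.0%} within +/-10%")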

  8. A review of literature and computer models on exposure assessment.

    PubMed

    Butt, T E; Clark, M; Coulon, F; Oduyemi, K O K

    2009-12-14

    At the present time, risk analysis is an effective management tool used by environmental managers to protect the environment from inevitable anthropogenic activities. There are generic elements in environmental risk assessments, which are independent of the subject to which risk analysis is applied. Examples of these elements are: baseline study, hazard identification, hazards' concentration assessment and risk quantification. Another important example of such generic elements is exposure assessment, which is required in a risk analysis process for landfill leachate as it would in any other environmental risk issue. Furthermore, computer models are also being developed to assist risk analysis in different fields. However, in the review of current computer models and literature, particularly regarding landfills, the authors have found no evidence for the existence of a holistic exposure assessment procedure underpinned with a computational method for landfill leachate. This paper, with reference to the relevant literature and models reviewed, discusses the extent to which exposure assessment is absent in landfill risk assessment approaches. The study also indicates a number of factors and features that should be added to the exposure assessment system in order to render it more strategic, thereby enhancing the quantitative risk analysis.

  9. Assessment of examinations in computer science doctoral education

    NASA Astrophysics Data System (ADS)

    Straub, Jeremy

    2014-01-01

    This article surveys the examination requirements for attaining degree candidate (candidacy) status in computer science doctoral programs at all of the computer science doctoral granting institutions in the United States. It presents a framework for categorizing program examination requirements, and categorizes these programs by the type or types of candidacy examinations that are required. The performance of computer science departments, estimated via two common surrogate metrics, is compared and contrasted across these different categories of candidacy requirements, and the correlation between candidacy requirements and program/department performance is assessed.

  10. Computer Simulations to Support Science Instruction and Learning: A critical review of the literature

    NASA Astrophysics Data System (ADS)

    Smetana, Lara Kathleen; Bell, Randy L.

    2012-06-01

    Researchers have explored the effectiveness of computer simulations for supporting science teaching and learning during the past four decades. The purpose of this paper is to provide a comprehensive, critical review of the literature on the impact of computer simulations on science teaching and learning, with the goal of summarizing what is currently known and providing guidance for future research. We report on the outcomes of 61 empirical studies dealing with the efficacy of, and implications for, computer simulations in science instruction. The overall findings suggest that simulations can be as effective, and in many ways more effective, than traditional (i.e. lecture-based, textbook-based and/or physical hands-on) instructional practices in promoting science content knowledge, developing process skills, and facilitating conceptual change. As with any other educational tool, the effectiveness of computer simulations is dependent upon the ways in which they are used. Thus, we outline specific research-based guidelines for best practice. Computer simulations are most effective when they (a) are used as supplements; (b) incorporate high-quality support structures; (c) encourage student reflection; and (d) promote cognitive dissonance. Used appropriately, computer simulations involve students in inquiry-based, authentic science explorations. Additionally, as educational technologies continue to evolve, advantages such as flexibility, safety, and efficiency deserve attention.

  11. Assessing Program Impact with the Critical Incident Technique

    ERIC Educational Resources Information Center

    O'Neill, Barbara

    2013-01-01

    The critical incident technique (CIT) is a qualitative research method where subjects are encouraged to tell personal stories that provide descriptive data. Researchers who use the CIT employ a structured methodology to encourage respondents to share their experiences regarding a particular topic. Incidents are considered effective/successful when…

  12. Assess the Critical Period Hypothesis in Second Language Acquisition

    ERIC Educational Resources Information Center

    Du, Lihong

    2010-01-01

    The Critical Period Hypothesis aims to investigate the reason for significant difference between first language acquisition and second language acquisition. Over the past few decades, researchers carried out a series of studies to test the validity of the hypothesis. Although there were certain limitations in these studies, most of their results…

  13. Assessment of the adequacy of a criticality incident detection system

    SciTech Connect

    Cartwright, C.M.; Finnerty, M.D.

    1993-12-31

    The primary purpose of a criticality incident detection (CID) and alarm system is to minimize, by means of building evacuation, the radiation doses received by plant personnel. The adequacy of a CID system installed in a nuclear plant in the UK was investigated. Results are described.

  14. Conceptualising, Developing and Assessing Critical Thinking in Law

    ERIC Educational Resources Information Center

    James, Nickolas; Hughes, Clair; Cappa, Clare

    2010-01-01

    "Critical thinking" is commonly included in the lists of graduate attributes (GAs), which all Australian universities are now required to develop and implement. That efforts to do so have met with limited success is due to a range of factors including inconsistent or naive conceptualisations, the failure to explicitly develop or assess…

  15. Assessment of Prospective Teachers' Views Regarding the Concept of Criticism

    ERIC Educational Resources Information Center

    Karakus, Neslihan

    2015-01-01

    Critical thinking is one of the skills included in the Turkish course curriculum and one that students are expected to acquire. The objective of the study is to determine prospective Turkish teachers' perspectives regarding the concept of criticism, which is both a mental exercise and plays an important role in the world of ideas. In order to assess…

  16. Assessing Critical Thinking: A College's Journey and Lessons Learned

    ERIC Educational Resources Information Center

    Peach, Brian E.; Mukherjee, Arup; Hornyak, Martin

    2007-01-01

    The business college at University of West Florida is currently in the throes of implementing an assessment initiative to develop student learning outcomes, design assessment devices to measure learning, analyze the measurement results to identify learning shortfalls, and establish feedback mechanisms to modify the curriculum to address the…

  17. Evaluation of the Food and Agriculture Sector Criticality Assessment Tool (FASCAT) and the Collected Data.

    PubMed

    Huff, Andrew G; Hodges, James S; Kennedy, Shaun P; Kircher, Amy

    2015-08-01

    To protect and secure food resources for the United States, it is crucial to have a method to compare food systems' criticality. In 2007, the U.S. government funded development of the Food and Agriculture Sector Criticality Assessment Tool (FASCAT) to determine which food and agriculture systems were most critical to the nation. FASCAT was developed in a collaborative process involving government officials and food industry subject matter experts (SMEs). After development, data were collected using FASCAT to quantify threats, vulnerabilities, consequences, and the impacts on the United States from failure of evaluated food and agriculture systems. To examine FASCAT's utility, linear regression models were used to determine: (1) which groups of questions posed in FASCAT were better predictors of cumulative criticality scores; (2) whether the items included in FASCAT's criticality method or the smaller subset of FASCAT items included in DHS's risk analysis method predicted similar criticality scores. Akaike's information criterion was used to determine which regression models best described criticality, and a mixed linear model was used to shrink estimates of criticality for individual food and agriculture systems. The results indicated that: (1) some of the questions used in FASCAT strongly predicted food or agriculture system criticality; (2) the FASCAT criticality formula was a stronger predictor of criticality than the DHS risk formula; (3) the cumulative criticality formula predicted criticality more strongly than the weighted criticality formula; and (4) the mixed linear regression model did not change the rank-order of food and agriculture system criticality to a large degree.
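
    The model-comparison step described above (linear regression plus Akaike's information criterion) looks roughly like the following in Python/statsmodels; the predictors, the synthetic outcome, and the two competing formulas are hypothetical stand-ins for the FASCAT question groups, not the actual FASCAT items.

        # Compare two linear models of a criticality score by AIC (lower is better).
        import numpy as np
        import pandas as pd
        import statsmodels.formula.api as smf

        rng = np.random.default_rng(0)
        n = 120
        df = pd.DataFrame({
            "threat": rng.uniform(0, 10, n),
            "vulnerability": rng.uniform(0, 10, n),
            "consequence": rng.uniform(0, 10, n),
        })
        df["criticality"] = 2 * df["consequence"] + df["threat"] + rng.normal(0, 2, n)

        full = smf.ols("criticality ~ threat + vulnerability + consequence", df).fit()
        reduced = smf.ols("criticality ~ consequence", df).fit()
        print(f"full model AIC: {full.aic:.1f}, reduced model AIC: {reduced.aic:.1f}")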

  18. Optimal recovery sequencing for critical infrastructure resilience assessment.

    SciTech Connect

    Vugrin, Eric D.; Brown, Nathanael J. K.; Turnquist, Mark Alan

    2010-09-01

    Critical infrastructure resilience has become a national priority for the U.S. Department of Homeland Security. System resilience has been studied for several decades in many different disciplines, but no standards or unifying methods exist for critical infrastructure resilience analysis. This report documents the results of a late-start Laboratory Directed Research and Development (LDRD) project that investigated the identification of optimal recovery strategies that maximize resilience. To this end, we formulate a bi-level optimization problem for infrastructure network models. In the 'inner' problem, we solve for network flows, and we use the 'outer' problem to identify the optimal recovery modes and sequences. We draw from the literature on multi-mode project scheduling problems to create an effective solution strategy for the resilience optimization model. We demonstrate the application of this approach to a set of network models, including a national railroad model and a supply chain for Army munitions production.
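
    The bi-level structure can be miniaturized as follows: the outer level enumerates recovery sequences for damaged links, the inner level solves a network-flow problem for each partially restored network, and resilience is taken as performance summed over recovery periods. The five-edge network, capacities, and damage set below are invented; the report's actual models use multi-mode scheduling formulations rather than brute-force enumeration.

        # Outer level: enumerate recovery orders; inner level: a max-flow solve.
        import itertools
        import networkx as nx

        G = nx.DiGraph()
        G.add_edge("s", "a", capacity=10); G.add_edge("s", "b", capacity=8)
        G.add_edge("a", "t", capacity=7);  G.add_edge("b", "t", capacity=9)
        G.add_edge("a", "b", capacity=4)
        damaged = [("s", "a"), ("b", "t")]   # links currently out of service

        def performance(restored):
            H = G.copy()
            for u, v in damaged:
                if (u, v) not in restored:
                    H[u][v]["capacity"] = 0
            return nx.maximum_flow_value(H, "s", "t")

        def resilience(order):
            # Performance summed over periods, restoring one link per period.
            return sum(performance(set(order[:k + 1])) for k in range(len(order)))

        best = max(itertools.permutations(damaged), key=resilience)
        print("optimal recovery sequence:", best)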

  19. A critical review on sustainability assessment of recycled water schemes.

    PubMed

    Chen, Zhuo; Ngo, Huu Hao; Guo, Wenshan

    2012-06-01

    Recycled water provides a viable opportunity to supplement water supplies as well as alleviate environmental loads. To further expand current schemes and explore new recycled water end uses, this study reviews several environmental assessment tools, including Life Cycle Assessment (LCA), Material Flow Analysis (MFA) and Environmental Risk Assessment (ERA) in terms of their types, characteristics and weaknesses in evaluating the sustainability of recycled water schemes. Due to the limitations in individual models, the integrated approaches are recommended in most cases, of which the outputs could be further combined with additional economic and social assessments in multi-criteria decision making framework. The study also proposes several management strategies in improving the environmental scores. The discussion and suggestions could help decision makers in making a sound judgement as well as recognising the challenges and tasks in the future.

  20. Computer Applications for Alternative Assessment: An Instructional and Organization Dilemma.

    ERIC Educational Resources Information Center

    Mills, Ed; Brown, John A.

    1997-01-01

    Describes the possibilities and problems that computer-generated portfolios will soon present to instructors across America. Highlights include the history of portfolio assessment, logistical problems of handling portfolios in the traditional three-ring binder format, use of the zip drive for storage, and software/hardware compatibility problems.…

  1. The Use of Computers in Social Work Practice: An Assessment.

    ERIC Educational Resources Information Center

    Miller, Henry

    1986-01-01

    The potential use of computers in social work education and practice is discussed. Possibilities are emerging in regard to case management, diagnosis and assessment, and even treatment. The bottleneck is no longer expensive hardware but the development of usable and relevant software and courseware. (Author/MH)

  2. Computer Technology for Nursing Staff Learning Need Assessment.

    ERIC Educational Resources Information Center

    Forte, Paula S.

    1984-01-01

    The advantages of using a computer to analyze needs assessment data for continuing education are (1) it allows the expression of organizational needs, (2) all learners are able to declare their own needs, and (3) it provides rapid access to large amounts of information. (SK)

  3. How Effective Is Feedback in Computer-Aided Assessments?

    ERIC Educational Resources Information Center

    Gill, Mundeep; Greenhow, Martin

    2008-01-01

    Computer-Aided Assessments (CAAs) have been used increasingly at Brunel University for over 10 years to test students' mathematical abilities. Recently, we have focussed on providing very rich feedback to the students; given the work involved in designing and coding such feedback, it is important to study the impact of the interaction between…

  4. Computation of cross sections and dose conversion factors for criticality accident dosimetry.

    PubMed

    Devine, R T

    2004-01-01

    In the application of criticality accident dosemeters, the cross sections and fluence-to-dose conversion factors have to be computed. The cross section and fluence-to-dose conversion factor for the thermal and epi-thermal contributions to neutron dose are well documented; for higher energy regions (>100 keV) they depend on the spectrum assumed. Fluence is determined using threshold detectors. The cross sections require the folding of an expected spectrum with the reaction cross sections. The fluence-to-dose conversion factors require a similar computation. The true and effective thresholds are used to incorporate information about the expected spectrum. The spectra can either be taken from compendia or measured at the facility at which the exposures are expected. The cross sections can be taken from data compilations or analytic representations, and the fluence-to-dose conversion factors are defined by various standards-making bodies. The remaining question is the method of computation. The purpose of this paper is to compare two methods for computing these factors: analytic and Monte Carlo.
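
    The folding operation itself is a short quadrature once a spectrum is assumed. The sketch below computes a spectrum-averaged (effective) cross section and a spectrum-averaged fluence-to-dose conversion factor on a logarithmic energy grid; the 1/E spectrum, step-threshold cross section, and toy conversion function are illustrative assumptions, not evaluated data.

        # Fold an assumed spectrum with a cross section and a dose-conversion factor.
        import numpy as np

        E = np.logspace(-2, 1, 500)              # energy grid, MeV
        phi = 1.0 / E                            # assumed spectrum shape
        sigma = np.where(E > 0.1, 0.6, 0.0)      # step threshold reaction, barns
        h = 40.0 * E / (E + 1.0)                 # toy dose factor, pSv*cm^2

        sigma_eff = np.trapz(phi * sigma, E) / np.trapz(phi, E)
        dose_factor = np.trapz(phi * h, E) / np.trapz(phi, E)
        print(f"effective cross section: {sigma_eff:.3f} b")
        print(f"spectrum-averaged dose factor: {dose_factor:.1f} pSv*cm^2")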

  5. Assessment of Teaching Methods and Critical Thinking in a Course for Science Majors

    NASA Astrophysics Data System (ADS)

    Speck, Angela; Ruzhitskaya, L.; Whittington, A. G.

    2014-01-01

    The ability to think critically is a key ingredient of the scientific mindset. Students who take science courses may or may not be predisposed to critical thinking - the ability to evaluate information analytically. Regardless of their starting point, students can significantly improve their critical thinking through learning and practicing their reasoning skills, critical assessments, conducting and reflecting on observations and experiments, building their questioning and communication skills, and other techniques. While there are several teaching methods that may help to improve critical thinking, there are only a few assessment instruments that can help in evaluating the efficacy of these methods. Critical thinking skills, and improvement in those skills, are notoriously difficult to measure. Assessments that are based on multiple-choice questions demonstrate students' final decisions but not their thinking processes. In addition, during the course of their studies students may develop subject-based critical thinking while not being able to extend the skills to general critical thinking. As such, we wanted to design and conduct a study on the efficacy of several teaching methods in which we would learn how students improve their thinking processes within a science discipline as well as in everyday life situations. We conducted a study among 20 astronomy, physics and geology majors - both graduate and undergraduate students, mostly seniors and early graduate students - enrolled in our Solar System Science course at the University of Missouri. We used the Ennis-Weir Critical Thinking Essay test to assess students' general critical thinking and, in addition, we implemented our own subject-based critical thinking assessment. Here, we present the results of this study and share our experience in designing a subject-based critical thinking assessment instrument.

  6. The use of computers for perioperative simulation in anesthesia, critical care, and pain medicine.

    PubMed

    Lambden, Simon; Martin, Bruce

    2011-09-01

    Simulation in perioperative anesthesia training is a field of considerable interest, with an urgent need for tools that reliably train and facilitate objective assessment of performance. This article reviews the available simulation technologies, their evolution, and the current evidence base for their use. The future directions for research in the field and potential applications of simulation technology in anesthesia, critical care, and pain medicine are discussed.

  7. Validation of a scenario-based assessment of critical thinking using an externally validated tool.

    PubMed

    Buur, Jennifer L; Schmidt, Peggy; Smylie, Dean; Irizarry, Kris; Crocker, Carlos; Tyler, John; Barr, Margaret

    2012-01-01

    With medical education transitioning from knowledge-based curricula to competency-based curricula, critical thinking skills have emerged as a major competency. While there are validated external instruments for assessing critical thinking, many educators have created their own custom assessments of critical thinking. However, the face validity of these assessments has not been challenged. The purpose of this study was to compare results from a custom assessment of critical thinking with the results from a validated external instrument of critical thinking. Students from the College of Veterinary Medicine at Western University of Health Sciences were administered a custom assessment of critical thinking (ACT) and an externally validated instrument, the California Critical Thinking Skills Test (CCTST), in the spring of 2011. Total scores and sub-scores from each exam were analyzed for significant correlations using Pearson correlation coefficients. Significant correlations were demonstrated between ACT Blooms 2 and deductive reasoning, and between total ACT score and deductive reasoning, with correlation coefficients of 0.24 and 0.22, respectively. No other statistically significant correlations were found. The lack of significant correlation between the two examinations illustrates the need in medical education to externally validate internal custom assessments. Ultimately, the development and validation of custom assessments of non-knowledge-based competencies will produce higher quality medical professionals.
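
    The statistical test used here is a plain Pearson correlation; in Python it is a two-liner, shown below with invented scores standing in for the study's ACT and CCTST data.

        # Pearson correlation between two sets of test scores (illustrative data).
        from scipy.stats import pearsonr

        act_total = [62, 71, 58, 80, 67, 74, 55, 69, 77, 60]
        cctst_deductive = [14, 16, 12, 19, 15, 17, 11, 13, 18, 14]
        r, p = pearsonr(act_total, cctst_deductive)
        print(f"r = {r:.2f}, p = {p:.3f}")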

  8. Transfer matrix computation of critical polynomials for two-dimensional Potts models

    DOE PAGES

    Jacobsen, Jesper Lykke; Scullard, Christian R.

    2013-02-04

    In our previous work, we showed that critical manifolds of the q-state Potts model can be studied by means of a graph polynomial P_B(q, v), henceforth referred to as the critical polynomial. This polynomial may be defined on any periodic two-dimensional lattice. It depends on a finite subgraph B, called the basis, and the manner in which B is tiled to construct the lattice. The real roots v = e^K - 1 of P_B(q, v) either give the exact critical points for the lattice, or provide approximations that, in principle, can be made arbitrarily accurate by increasing the size of B in an appropriate way. In earlier work, P_B(q, v) was defined by a contraction-deletion identity, similar to that satisfied by the Tutte polynomial. Here, we give a probabilistic definition of P_B(q, v), which facilitates its computation, using the transfer matrix, on much larger B than was previously possible. We present results for the critical polynomial on the (4, 8^2), kagome, and (3, 12^2) lattices for bases of up to respectively 96, 162, and 243 edges, compared to the limit of 36 edges with contraction-deletion. We discuss in detail the role of the symmetries and the embedding of B. The critical temperatures v_c obtained for ferromagnetic (v > 0) Potts models are at least as precise as the best available results from Monte Carlo simulations or series expansions. For instance, with q = 3 we obtain v_c(4, 8^2) = 3.742 489 (4), v_c(kagome) = 1.876 459 7 (2), and v_c(3, 12^2) = 5.033 078 49 (4), the precision being comparable or superior to the best simulation results. More generally, we trace the critical manifolds in the real (q, v) plane and discuss the intricate structure of the phase diagram in the antiferromagnetic (v < 0) region.
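
    To make the root-finding step concrete: for the square lattice the smallest basis already gives the exact critical polynomial P(q, v) = v^2 - q (the self-dual point), so the ferromagnetic critical point follows from its positive real root. The sketch below solves this simple case; for the lattices studied in the paper the polynomial has far more terms and is generated by the transfer-matrix construction rather than written down by hand.

        # Critical point of the square-lattice Potts model from P(q, v) = v^2 - q.
        import numpy as np

        q = 3.0
        coeffs = [1.0, 0.0, -q]                   # v**2 - q
        roots = np.roots(coeffs)
        v_c = max(r.real for r in roots if abs(r.imag) < 1e-12 and r.real > 0)
        K_c = np.log(1.0 + v_c)                   # since v = exp(K) - 1
        print(f"q = {q}: v_c = {v_c:.6f}, K_c = {K_c:.6f}")   # v_c ~ 1.732051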

  9. Critical assessment of Reynolds stress turbulence models using homogeneous flows

    NASA Technical Reports Server (NTRS)

    Shabbir, Aamir; Shih, Tsan-Hsing

    1992-01-01

    In modeling the rapid part of the pressure correlation term in the Reynolds stress transport equations, extensive use has been made of its exact properties, which were first suggested by Rotta. These, for example, have been employed in obtaining the widely used Launder, Reece and Rodi (LRR) model. Some recent proposals have dropped one of these properties to obtain new models. We demonstrate, by computing some simple homogeneous flows, that doing so does not lead to any significant improvements over the LRR model and is not the right direction for improving the performance of existing models. The reason for this, in our opinion, is that violation of one of the exact properties cannot bring any new physics into the model. We compute thirteen homogeneous flows using the LRR (with a recalibrated rapid term constant), IP and SSG models. The flows computed include the flow through axisymmetric contraction; axisymmetric expansion; distortion by plane strain; and homogeneous shear flows with and without rotation. Results show that the most general representation for a model linear in the anisotropy tensor performs as well as or better than the other two models of the same level.
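
    For orientation, the object these closures model is the anisotropy of the Reynolds stress tensor. A minimal sketch of computing the anisotropy tensor b_ij = <u_i u_j>/(2k) - delta_ij/3 from a made-up, homogeneous-shear-like stress tensor is given below; the stress values are illustrative only.

        # Anisotropy tensor from a Reynolds stress tensor (illustrative values).
        import numpy as np

        R = np.array([[6.2, 2.1, 0.0],     # Reynolds stresses <u_i u_j>
                      [2.1, 3.1, 0.0],
                      [0.0, 0.0, 3.7]])
        k = 0.5 * np.trace(R)              # turbulent kinetic energy
        b = R / (2 * k) - np.eye(3) / 3.0
        print("k =", k)
        print("anisotropy tensor b:\n", np.round(b, 3))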

  10. Computer-aided modeling of beam propagation effects in diffraction-critical spaceborne instruments

    NASA Astrophysics Data System (ADS)

    Caldwell, Martin E.; Gray, Peter F.; McNamara, Paul

    1996-08-01

    This talk concerns applications of a ray-trace model to the computation of the effect of diffraction on beam propagation. It reports the use of the technique in the design of apertures for space-borne instruments having critical diffraction properties. The modeling technique used is that of gaussian beam decomposition, a numerical beam propagation technique incorporated in a commercially available ray-trace program. The result is the powerful capability to model the optical field at any point, in systems of any geometry, with any amount of aberration. The technique is particularly useful for design problems where 'non-imaging' effects are important, and examples of its use will be given. Although the computation requirements for such detailed analysis may seem daunting, the continuing increase in readily available computing power is now overcoming this drawback. The application here is to certain 'diffraction-critical' situations, where the design of correctly sized apertures is needed for the control of unwanted diffraction effects. Three recent design studies are illustrated: (1) millimeter-wave imaging with off-axis reflectors, analyzing the effects of aberration on coherent detection efficiency; (2) long-distance beam propagation in space-borne laser interferometry, involving the analysis of coherent detection efficiency in the presence of aberrated gaussian beams; (3) design of a Lyot stop system for an infra-red radiometer which is to view the Earth's limb from space, where the critical (and unwanted) diffraction is that from the bright Earth disc, lying just outside the instrument's field of view. The analysis technique is explained, and examples are given of diffracted energy patterns analyzed at progressive stages in each system, showing how these aid the design and analysis of the systems. The aim is to show the range of problems for which this method is useful, and to learn from others at the conference about further cases where such techniques apply.
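
    The primitive underlying gaussian beam decomposition is the propagation of a single beamlet's complex parameter q through the system's ABCD matrices. The sketch below propagates one gaussian through free space and a thin lens and recovers the beam radius; the wavelength, waist, and geometry are arbitrary illustrative values, and production codes sum many such beamlets to represent an aberrated field.

        # Single-gaussian ABCD propagation: q' = (A*q + B) / (C*q + D).
        import numpy as np

        lam = 1.064e-6                       # wavelength, m
        w0 = 1.0e-3                          # input waist radius, m
        q = 1j * np.pi * w0**2 / lam         # complex beam parameter at the waist

        def apply_abcd(q, A, B, C, D):
            return (A * q + B) / (C * q + D)

        q = apply_abcd(q, 1, 2.0, 0, 1)          # 2 m of free space
        q = apply_abcd(q, 1, 0, -1 / 0.5, 1)     # thin lens, f = 0.5 m
        q = apply_abcd(q, 1, 0.6, 0, 1)          # 0.6 m more free space

        w = np.sqrt(-lam / (np.pi * (1 / q).imag))   # beam radius from Im(1/q)
        print(f"beam radius after the system: {w * 1e3:.3f} mm")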

  11. Implications of Monte Carlo Statistical Errors in Criticality Safety Assessments

    SciTech Connect

    Pevey, Ronald E.

    2005-09-15

    Most criticality safety calculations are performed using Monte Carlo techniques because of Monte Carlo's ability to handle complex three-dimensional geometries. For Monte Carlo calculations, the more histories sampled, the lower the standard deviation of the resulting estimates. The common intuition is, therefore, that the more histories, the better; as a result, analysts tend to run Monte Carlo analyses as long as possible (or at least to a minimum acceptable uncertainty). For Monte Carlo criticality safety analyses, however, the optimization situation is complicated by the fact that procedures usually require that an extra margin of safety be added because of the statistical uncertainty of the Monte Carlo calculations. This additional safety margin affects the impact of the choice of the calculational standard deviation, both on production and on safety. This paper shows that, under the assumptions of normally distributed benchmarking calculational errors and exact compliance with the upper subcritical limit (USL), the standard deviation that optimizes production is zero, but there is a non-zero value of the calculational standard deviation that minimizes the risk of inadvertently labeling a supercritical configuration as subcritical. Furthermore, this value is shown to be a simple function of the typical benchmarking step outcomes--the bias, the standard deviation of the bias, the upper subcritical limit, and the number of standard deviations added to calculated k-effectives before comparison to the USL.
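
    Under the paper's normality assumptions, the acceptance rule and the mislabeling probability can be written down directly: a configuration is accepted when k_calc + n*sigma_calc <= USL, and for a truly critical system (k = 1) the acceptance probability is a normal tail integral. The sketch below implements that computation with illustrative numbers; it demonstrates the decision rule only and does not reproduce the paper's optimization of the calculational standard deviation.

        # Probability of accepting a truly critical (k = 1) system as subcritical.
        import math

        def p_accept_supercritical(sigma_calc, usl=0.97, n_sigma=2.0,
                                   bias=0.005, sigma_bias=0.003):
            # k_calc ~ Normal(1 - bias, sigma_total); accept if
            # k_calc + n_sigma * sigma_calc <= usl.
            sigma_total = math.hypot(sigma_calc, sigma_bias)
            z = (usl - n_sigma * sigma_calc - (1.0 - bias)) / sigma_total
            return 0.5 * (1.0 + math.erf(z / math.sqrt(2.0)))

        for s in [0.0005, 0.001, 0.002, 0.005, 0.01]:
            print(f"sigma = {s:.4f}: P(accept) = {p_accept_supercritical(s):.2e}")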

  12. Computer-aided testing of pilot response to critical in-flight events

    NASA Technical Reports Server (NTRS)

    Giffin, W. C.; Rockwell, T. H.

    1984-01-01

    This research on pilot response to critical in-flight events employs a unique methodology including an interactive computer-aided scenario-testing system. Navigation displays, instrument-panel displays, and assorted textual material are presented on a touch-sensitive CRT screen. Problem diagnosis scenarios, destination-diversion scenarios and combined destination/diagnostic tests are available. A complete time history of all data inquiries and responses is maintained. Sample results of diagnosis scenarios obtained from testing 38 licensed pilots are presented and discussed.

  13. Critical Inquiry and Writing Centers: A Methodology of Assessment

    ERIC Educational Resources Information Center

    Bell, Diana Calhoun; Frost, Alanna

    2012-01-01

    By examining one writing center's role in student success, this project offers two examples of the way writing centers impact student engagement. This analysis models a methodology that writing and learning center directors can utilize in order to foster effective communication with stakeholders. By conducting data-driven assessment, directors can…

  14. Critical Issues in Assessing Teacher Compensation. Backgrounder. No. 2638

    ERIC Educational Resources Information Center

    Richwine, Jason; Biggs, Andrew G.

    2012-01-01

    A November 2011 Heritage Foundation report--"Assessing the Compensation of Public-School Teachers"--presented data on teacher salaries and benefits in order to inform debates about teacher compensation reform. The report concluded that public-school teacher compensation is far ahead of what comparable private-sector workers enjoy, and that…

  15. Classroom Dynamic Assessment: A Critical Examination of Constructs and Practices

    ERIC Educational Resources Information Center

    Davin, Kristin J.

    2016-01-01

    This article explores the implementation of dynamic assessment (DA) in an elementary school foreign language classroom by considering its theoretical basis and its applicability to second language (L2) teaching, learning, and development. In existing applications of L2 classroom DA, errors serve as a window into learners' instructional needs and…

  16. Critical Factors in Assessment of Students with Visual Impairments.

    ERIC Educational Resources Information Center

    Loftin, Marnee

    1997-01-01

    Discusses the assessment component of individualized education programs for students with visual impairments. Important issues reviewed are the appropriate selection of a battery of tests; the knowledge base about a particular vision condition; the importance of supplementing testing information with meaningful observations and interviews; and…

  17. Theories of Occupational Choice: A Critical Assessment of Selected Viewpoints.

    ERIC Educational Resources Information Center

    Hotchkiss, Lawrence; And Others

    Five theoretical perspectives related to occupational choice were assessed. These were (1) Super's career development perspective, (2) Holland's typology of occupational choice, (3) status-attainment research in the field of sociology, (4) economic theory of individual willingness to work in different occupations, and (5) a model of decision…

  18. A critical review of seven selected neighborhood sustainability assessment tools

    SciTech Connect

    Sharifi, Ayyoob; Murayama, Akito

    2013-01-15

    Neighborhood sustainability assessment tools have become widespread since the turn of the 21st century, and many communities, mainly in the developed world, are utilizing these tools to measure their success in approaching sustainable development goals. In this study, seven tools from Australia, Europe, Japan, and the United States are selected and analyzed with the aim of providing insights into the current situation; highlighting the strengths, weaknesses, successes, and failures; and making recommendations for future improvements. Using a content analysis, the issues of sustainability coverage, pre-requisites, local adaptability, scoring and weighting, participation, reporting, and applicability are discussed in this paper. The results of this study indicate that most of the tools are not doing well regarding the coverage of social, economic, and institutional aspects of sustainability; there are ambiguities and shortcomings in the weighting, scoring, and rating; in most cases, there is no mechanism for local adaptability and participation; and only those tools which are embedded within the broader planning framework are doing well with regard to applicability. Highlights: Seven widely used assessment tools were analyzed; there is a lack of balanced assessment of sustainability dimensions; the tools are not doing well regarding applicability; refinements are needed to make the tools more effective; and assessment tools must be integrated into the planning process.

  19. Assessment of Critical Business Skill Development by MBA Alumni

    ERIC Educational Resources Information Center

    Glynn, Joseph G.; Wood, Gregory R.

    2008-01-01

    Six years of survey data were analyzed to assess, among other things, the degree to which an AACSB accredited graduate business program successfully developed student skills in a variety of areas deemed important for career success. The study illustrates a methodology institutions can use to respond to increasing demands for program evaluation and…

  1. Assessing computer waste generation in Chile using material flow analysis.

    PubMed

    Steubing, Bernhard; Böni, Heinz; Schluep, Mathias; Silva, Uca; Ludwig, Christian

    2010-03-01

    The quantities of e-waste are expected to increase sharply in Chile. The purpose of this paper is to provide a quantitative data basis on generated e-waste quantities. A material flow analysis was carried out assessing the generation of e-waste from computer equipment (desktop and laptop PCs as well as CRT and LCD monitors). Import and sales data were collected from the Chilean Customs database as well as from publications by the International Data Corporation. A survey was conducted to determine consumers' choices with respect to storage, re-use and disposal of computer equipment. The generation of e-waste was assessed in a baseline scenario as well as upper and lower scenarios until 2020. The results for the baseline scenario show that about 10,000 and 20,000 tons of computer waste may be generated in the years 2010 and 2020, respectively. The cumulative e-waste generation will be four to five times higher in the upcoming decade (2010-2019) than during the current decade (2000-2009). By 2020, the shares of LCD monitors and laptops will increase rapidly, replacing other e-waste, including CRT monitors. The model also shows the principal flows of computer equipment from production and sale to recycling and disposal. The re-use of computer equipment plays an important role in Chile. An appropriate recycling scheme will have to be introduced to provide adequate solutions for the growing rate of e-waste generation.
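
    The computational core of such a material flow analysis is a convolution of past sales with a lifespan distribution: waste arising in year t is the sum over cohorts sold k years earlier, weighted by the probability of discard after k years. The sketch below uses a discretized Weibull-like lifetime and invented sales figures; the paper's model additionally tracks storage and re-use stages.

        # Waste in year t = past sales convolved with a lifespan distribution.
        import numpy as np

        years = np.arange(2000, 2008)
        sales = np.array([50, 60, 75, 90, 110, 130, 150, 170], dtype=float)  # kt/yr

        k = np.arange(1, 11)                     # discard k years after sale
        shape, scale = 2.0, 6.0                  # Weibull-like lifetime parameters
        cdf = 1 - np.exp(-(k / scale) ** shape)
        pmf = np.diff(np.concatenate(([0.0], cdf)))

        # Index simplification: waste[t] lags the sales cohort by one year.
        waste = np.convolve(sales, pmf)[: len(sales)]
        for y, w in zip(years, waste):
            print(f"{y}: {w:6.1f} kt of e-waste")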

  2. Incorporating Colour Information for Computer-Aided Diagnosis of Melanoma from Dermoscopy Images: A Retrospective Survey and Critical Analysis

    PubMed Central

    Drew, Mark S.

    2016-01-01

    Cutaneous melanoma is the most life-threatening form of skin cancer. Although advanced melanoma is often considered incurable, if detected and excised early, the prognosis is promising. Today, clinicians use computer vision in an increasing number of applications to aid early detection of melanoma through dermatological image analysis (dermoscopy images, in particular). Colour assessment is essential for the clinical diagnosis of skin cancers. Due to this diagnostic importance, many studies have either focused on or employed colour features as a constituent part of their skin lesion analysis systems. These studies range from using low-level colour features, such as simple statistical measures of colours occurring in the lesion, to availing themselves of high-level semantic features such as the presence of blue-white veil, globules, or colour variegation in the lesion. This paper provides a retrospective survey and critical analysis of contributions in this research direction. PMID:28096807

  3. CART V: recent advancements in computer-aided camouflage assessment

    NASA Astrophysics Data System (ADS)

    Müller, Thomas; Müller, Markus

    2011-05-01

    In order to facilitate systematic, computer-aided improvements of camouflage and concealment assessment methods, the software system CART (Camouflage Assessment in Real-Time) was developed for the camouflage assessment of objects in multispectral image sequences (see contributions to SPIE 2007-2010 [1], [2], [3], [4]). It comprises semi-automatic marking of target objects (ground truth generation), including their propagation over the image sequence, evaluation via user-defined feature extractors, and methods to assess the object's movement conspicuity. In this fifth part of an annual series at the SPIE conference in Orlando, this paper presents the enhancements made over the past year and addresses the camouflage assessment of static and moving objects in multispectral image data that can show noise or image artefacts. The presented methods explore the correlations between image processing and camouflage assessment. A novel algorithm based on template matching is presented to assess the structural inconspicuity of an object objectively and quantitatively. The results can easily be combined with an MTI (moving target indication) based movement conspicuity assessment function in order to explore the influence of object movement on the camouflage effect in different environments. As the results show, the presented methods provide a significant benefit in the field of camouflage assessment.
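
    The abstract does not detail the new algorithm, but its core operation, template matching, reduces to normalized cross-correlation: a high correlation between an object patch and its surrounding background indicates a structurally inconspicuous object. A minimal pure-NumPy sketch with synthetic data:

```python
import numpy as np

def normalized_cross_correlation(image, template):
    """Slide `template` over `image` and return the NCC score map."""
    th, tw = template.shape
    t = template - template.mean()
    t_norm = np.sqrt((t ** 2).sum())
    out_h, out_w = image.shape[0] - th + 1, image.shape[1] - tw + 1
    scores = np.zeros((out_h, out_w))
    for y in range(out_h):
        for x in range(out_w):
            w = image[y:y + th, x:x + tw]
            wz = w - w.mean()
            denom = np.sqrt((wz ** 2).sum()) * t_norm
            scores[y, x] = (wz * t).sum() / denom if denom > 0 else 0.0
    return scores

# Synthetic scene: an object patch nearly identical to its background.
# A high correlation peak then signals structural inconspicuity (the object
# blends in); a low peak signals a conspicuous structure.
rng = np.random.default_rng(0)
background = rng.normal(0.5, 0.1, (100, 100))
object_patch = background[40:56, 40:56] + rng.normal(0, 0.02, (16, 16))
print("peak NCC:", normalized_cross_correlation(background, object_patch).max())
```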

  4. Embodied cognition and mirror neurons: a critical assessment.

    PubMed

    Caramazza, Alfonso; Anzellotti, Stefano; Strnad, Lukas; Lingnau, Angelika

    2014-01-01

    According to embodied cognition theories, higher cognitive abilities depend on the reenactment of sensory and motor representations. In the first part of this review, we critically analyze the central claims of embodied theories and argue that the existing behavioral and neuroimaging data do not allow investigators to discriminate between embodied cognition and classical cognitive accounts, which assume that conceptual representations are amodal and symbolic. In the second part, we review the main claims and the core electrophysiological findings typically cited in support of the mirror neuron theory of action understanding, one of the most influential examples of embodied cognition theories. In the final part, we analyze the claim that mirror neurons subserve action understanding by mapping visual representations of observed actions on motor representations, trying to clarify in what sense the representations carried by these neurons can be claimed motor.

  5. Ecological risk assessment of acidification in the Northern Eurasia using critical load concept

    SciTech Connect

    Bashkin, V.; Golinets, O.

    1995-12-31

    This research presents a risk analysis of acid-forming compound inputs using critical load (CL) values of sulfur, nitrogen, and acidity, computed for terrestrial and freshwater ecosystems of Northern Eurasia. The CL values are used to set goals for future deposition rates of acidifying and eutrophying compounds so that the environment is protected. CL values for various ecosystems are determined using an EM GIS approach. The most influential sources, such as nitrogen, sulfur and base cation uptake by vegetation, and surface and groundwater leaching from terrestrial to freshwater ecosystems, are described for the whole territory under study with regard to uncertainty analysis and the level of the corresponding risk assessment. The uncertainty may be explained by many factors, of which the most important are: the estimation of plant uptake is carried out on the basis of data on the biogeochemical cycling of various elements, for which adequate quantitative characterization for all ecosystems under study is either absent or insufficient; reliable information on the quantitative assessment of the ratio between perennial plant biomass increase and dead matter is absent at the required level of spatial and temporal resolution; reliable data on surface and underground runoff in various ecosystems are rare; and the influence of hydrothermic factors on the above-mentioned processes has not been quantitatively determined at the required level of model resolution.
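
    At its core, screening with critical loads is an exceedance calculation: deposition minus the critical load, ecosystem by ecosystem. The sketch below shows that step with invented numbers; in the study itself the CL values come from biogeochemical models on an EM GIS.

```python
# Hypothetical critical loads (CL) and deposition rates, in eq/ha/yr.
ecosystems = {
    "boreal forest": {"CL_acidity": 400, "deposition": 550},
    "steppe":        {"CL_acidity": 900, "deposition": 300},
    "tundra":        {"CL_acidity": 200, "deposition": 250},
}

for name, d in ecosystems.items():
    exceedance = d["deposition"] - d["CL_acidity"]
    status = "AT RISK" if exceedance > 0 else "protected"
    print(f"{name:15s} exceedance = {exceedance:+5d} eq/ha/yr -> {status}")
```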

  6. Solutions for data integration in functional genomics: a critical assessment and case study.

    PubMed

    Smedley, Damian; Swertz, Morris A; Wolstencroft, Katy; Proctor, Glenn; Zouberakis, Michael; Bard, Jonathan; Hancock, John M; Schofield, Paul

    2008-11-01

    The torrent of data emerging from the application of new technologies to functional genomics and systems biology can no longer be contained within the traditional modes of data sharing and publication, with the consequence that data is being deposited in, distributed across and disseminated through an increasing number of databases. The resulting fragmentation poses serious problems for the model organism community, which increasingly relies on data mining and computational approaches that require gathering of data from a range of sources. In the light of these problems, the European Commission has funded a coordination action, CASIMIR (coordination and sustainability of international mouse informatics resources), with a remit to assess the technical and social aspects of database interoperability that currently prevent the full realization of the potential of data integration in mouse functional genomics. In this article, we assess the current problems with interoperability, with particular reference to mouse functional genomics, and critically review the technologies that can be deployed to overcome them. We describe a typical use-case where an investigator wishes to gather data on variation, genomic context and metabolic pathway involvement for genes discovered in a genome-wide screen. We go on to develop an automated approach involving an in silico experimental workflow tool, Taverna, using web services, BioMart and MOLGENIS technologies for data retrieval. Finally, we focus on the current impediments to adopting such an approach in a wider context, and strategies to overcome them.
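
    For flavour, the sketch below retrieves gene annotation programmatically from a BioMart endpoint, the kind of step such a workflow automates. The endpoint URL, XML query format, and dataset/filter/attribute names are assumptions based on the public Ensembl BioMart service, not the exact queries of the CASIMIR use-case.

```python
import requests

# Hypothetical query: annotation for the mouse gene Trp53 from Ensembl BioMart.
QUERY = """<?xml version="1.0" encoding="UTF-8"?>
<!DOCTYPE Query>
<Query virtualSchemaName="default" formatter="TSV" header="0" uniqueRows="1">
  <Dataset name="mmusculus_gene_ensembl" interface="default">
    <Filter name="external_gene_name" value="Trp53"/>
    <Attribute name="ensembl_gene_id"/>
    <Attribute name="external_gene_name"/>
    <Attribute name="description"/>
  </Dataset>
</Query>"""

resp = requests.get(
    "https://www.ensembl.org/biomart/martservice",
    params={"query": QUERY},
    timeout=60,
)
print(resp.text)  # tab-separated rows, one per matching gene
```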

  7. Critical Technology Assessment: Fine Grain, High Density Graphite

    DTIC Science & Technology

    2010-04-01

    Export Control Classification Number (ECCN) 1C107.a on the Commerce Control List (CCL). The parameters of 1C107.a stem from controls established by the Missile Technology Control Regime (MTCR). In this assessment, BIS specifically examined: • The application of ECCN 1C107.a and related licensing … export licensing process for fine grain, high density graphite controlled by ECCN 1C107.a, especially to China, requires more license conditions and …

  8. Critical Technology Assessment of Five Axis Simultaneous Control Machine Tools

    DTIC Science & Technology

    2009-07-01

    In this assessment, BIS specifically examined: • The application of Export Control Classification Numbers (ECCN) 2B001.b.2 and 2B001.c.2 controls and related … availability of certain five axis simultaneous control mills, mill/turns, and machining centers controlled by ECCN 2B001.b.2 (but not grinders controlled by ECCN 2B001.c.2) exists to China and Taiwan, which both have an indigenous capability to produce five axis simultaneous control machine tools with …

  9. Critical Thinking Assessment: Measuring a Moving Target. Report & Recommendations of the South Carolina Higher Education Assessment Network Critical Thinking Task Force.

    ERIC Educational Resources Information Center

    Cook, Patricia; Johnson, Reid; Moore, Phil; Myers, Phyllis; Pauly, Susan; Pendarvis, Faye; Prus, Joe; Ulmer-Sottong, Lovely

    This report is part of South Carolina's effort to move toward "100 percent performance funding" for the state's public colleges and universities and results from a task force's investigation of ways to assess critical thinking. The following eight major findings are reported: (1) policy makers must determine priorities; (2) critical…

  10. FORTRAN 4 computer program for calculating critical speeds of rotating shafts

    NASA Technical Reports Server (NTRS)

    Trivisonno, R. J.

    1973-01-01

    A FORTRAN 4 computer program, written for the IBM DCS 7094/7044 computer, that calculates the critical speeds of rotating shafts is described. The shaft may include bearings, couplings, extra masses (nonshaft mass), and disks for the gyroscopic effect. Shear deflection is also taken into account, and provision is made in the program for sections of the shaft that are tapered. The boundary conditions at the ends of the shaft can be fixed (deflection and slope equal to zero) or free (shear and moment equal to zero). The fixed end condition enables the program to calculate the natural frequencies of cantilever beams. Instead of using the lumped-parameter method, the program uses continuous integration of the differential equations of beam flexure across different shaft sections. The advantages of this method over the usual lumped-parameter method are less data preparation and better approximation of the distribution of the mass of the shaft. A main feature of the program is the nature of the output. The Calcomp plotter is used to produce a drawing of the shaft with superimposed deflection curves at the critical speeds, together with all pertinent information related to the shaft.
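
    For the special case of a uniform shaft on simple (pinned-pinned) supports, the critical speeds follow in closed form from Euler-Bernoulli beam theory, a useful sanity check for any such program. A short sketch with illustrative material and geometry values:

```python
import numpy as np

E = 200e9     # Young's modulus of steel, Pa
rho = 7850.0  # density, kg/m^3
L = 1.5       # shaft length, m
d = 0.05      # shaft diameter, m

A = np.pi * d**2 / 4   # cross-sectional area, m^2
I = np.pi * d**4 / 64  # second moment of area, m^4

# Pinned-pinned Euler-Bernoulli beam: omega_n = (n*pi/L)^2 * sqrt(E*I/(rho*A))
for n in (1, 2, 3):
    omega = (n * np.pi / L) ** 2 * np.sqrt(E * I / (rho * A))  # rad/s
    print(f"mode {n}: critical speed = {omega * 60 / (2 * np.pi):,.0f} rpm")
```

    The NASA program itself handles stepped and tapered sections, bearings, and gyroscopic effects by continuous integration of the beam-flexure equations; the closed-form case above is only the simplest limit.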

  11. Computer-based assessment of movement difficulties in Parkinson's disease.

    PubMed

    Cunningham, Laura M; Nugent, Chris D; Moore, George; Finlay, Dewar D; Craig, David

    2012-01-01

    The prevalence of Parkinson's disease (PD) is increasing due to an ageing population. It is an unpredictable disease which requires regular assessment and monitoring. Current techniques used to assess PD are subjective. Clinicians observe movements made by a patient and subsequently rate the level of severity of, for example tremor or slowness of movement. Within this work, we have developed and evaluated a prototype computer-based assessment tool capable of collecting information on the movement difficulties present in PD. Twenty participants took part in an assessment of the tool, 10 of whom were diagnosed with PD and 10 were without the disease. Following the usage of the tool, it was found that there was a significant difference (p = 0.038) in the speed of movement between the two groups. We envisage that this tool could have the potential to enable more objective clinical conclusions to be made.

  12. A conceptual framework for developing a critical thinking self-assessment scale.

    PubMed

    Nair, Girija G; Stamler, Lynnette Leeseberg

    2013-03-01

    Nurses must be talented critical thinkers to cope with the challenges related to the ever-changing health care system, population trends, and extended role expectations. Several countries now recognize critical thinking skills (CTS) as an expected outcome of nursing education programs. Critical thinking has been defined in multiple ways by philosophers, critical thinking experts, and educators. Nursing experts conceptualize critical thinking as a process involving cognitive and affective domains of reasoning. Nurse educators are often challenged with teaching and measuring CTS because of their latent nature and the lack of a uniform definition of the concept. In this review of the critical thinking literature, we examine various definitions, identify a set of constructs that define critical thinking, and suggest a conceptual framework on which to base a self-assessment scale for measuring CTS.

  13. Nuclear criticality safety assessment of the proposed CFC replacement coolants

    SciTech Connect

    Jordan, W.C.; Dyer, H.R.

    1993-12-01

    The neutron multiplication characteristics of refrigerant-114 (R-114) and the proposed replacement coolants perfluorobutane (C4F10) and cycloperfluorobutane (C4F8) have been compared by evaluating the infinite media multiplication factors of UF6/H/coolant systems and by replacement calculations considering a 10-MW freezer/sublimer. The results of these comparisons demonstrate that R-114 is a neutron absorber, due to its chlorine content, and that the alternative fluorocarbon coolants are neutron moderators. Estimates of critical spherical geometries considering mixtures of UF6/HF/C4F10 indicate that the fluorocarbon-moderated systems are large compared with water-moderated systems. The freezer/sublimer calculations indicate that the alternative coolants are more reactive than R-114, but that the reactivity remains significantly below the condition of water in the tubes, which was a limiting condition. Based on these results, the alternative coolants appear to be acceptable; however, several follow-up tasks have been recommended, and additional evaluation will be required on an individual equipment basis.

  14. Computer Usage and the Validity of Self-Assessed Computer Competence among First-Year Business Students

    ERIC Educational Resources Information Center

    Ballantine, Joan A.; McCourt Larres, Patricia; Oyelere, Peter

    2007-01-01

    This study evaluates the reliability of self-assessment as a measure of computer competence. This evaluation is carried out in response to recent research which has employed self-reported ratings as the sole indicator of students' computer competence. To evaluate the reliability of self-assessed computer competence, the scores achieved by students…

  15. Assessment of nonequilibrium radiation computation methods for hypersonic flows

    NASA Technical Reports Server (NTRS)

    Sharma, Surendra

    1993-01-01

    The present understanding of shock-layer radiation in the low density regime, as appropriate to hypersonic vehicles, is surveyed. Based on the relative importance of electron excitation and radiation transport, the hypersonic flows are divided into three groups: weakly ionized, moderately ionized, and highly ionized flows. In the light of this division, the existing laboratory and flight data are scrutinized. Finally, an assessment of the nonequilibrium radiation computation methods for the three regimes in hypersonic flows is presented. The assessment is conducted by comparing experimental data against the values predicted by the physical model.

  16. Assessing the Amazon Cloud Suitability for CLARREO's Computational Needs

    NASA Technical Reports Server (NTRS)

    Goldin, Daniel; Vakhnin, Andrei A.; Currey, Jon C.

    2015-01-01

    In this document we compare the performance of the Amazon Web Services (AWS), also known as Amazon Cloud, with the CLARREO (Climate Absolute Radiance and Refractivity Observatory) cluster and assess its suitability for the computational needs of the CLARREO mission. A benchmark executable to process one month and one year of PARASOL (Polarization and Anisotropy of Reflectances for Atmospheric Sciences coupled with Observations from a Lidar) data was used. With the optimal AWS configuration, adequate data-processing times, comparable to the CLARREO cluster, were found. The assessment of alternatives to the CLARREO cluster continues and several options, such as a NASA-based cluster, are being considered.

  17. Assessment methodology for computer-based instructional simulations.

    PubMed

    Koenig, Alan; Iseli, Markus; Wainess, Richard; Lee, John J

    2013-10-01

    Computer-based instructional simulations are becoming more and more ubiquitous, particularly in military and medical domains. As the technology that drives these simulations grows ever more sophisticated, the underlying pedagogical models for how instruction, assessment, and feedback are implemented within these systems must evolve accordingly. In this article, we review some of the existing educational approaches to medical simulations, and present pedagogical methodologies that have been used in the design and development of games and simulations at the University of California, Los Angeles, Center for Research on Evaluation, Standards, and Student Testing. In particular, we present a methodology for how automated assessments of computer-based simulations can be implemented using ontologies and Bayesian networks, and discuss their advantages and design considerations for pedagogical use.
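
    A minimal sketch of the Bayesian-network idea using the pgmpy library: a latent skill variable explains two observable in-simulation actions, and observed evidence updates the belief about the trainee. The structure and probabilities here are invented for illustration; they are not the CRESST ontologies or networks.

```python
from pgmpy.models import BayesianNetwork
from pgmpy.factors.discrete import TabularCPD
from pgmpy.inference import VariableElimination

# Latent skill (0 = novice, 1 = proficient) drives two observable actions.
model = BayesianNetwork([("Skill", "Action1"), ("Skill", "Action2")])

cpd_skill = TabularCPD("Skill", 2, [[0.5], [0.5]])  # uniform prior
cpd_a1 = TabularCPD("Action1", 2,
                    [[0.8, 0.3],   # P(Action1 = fail    | Skill)
                     [0.2, 0.7]],  # P(Action1 = succeed | Skill)
                    evidence=["Skill"], evidence_card=[2])
cpd_a2 = TabularCPD("Action2", 2,
                    [[0.7, 0.2],
                     [0.3, 0.8]],
                    evidence=["Skill"], evidence_card=[2])
model.add_cpds(cpd_skill, cpd_a1, cpd_a2)

infer = VariableElimination(model)
posterior = infer.query(["Skill"], evidence={"Action1": 1, "Action2": 1})
print(posterior)  # updated belief about skill after observing two successes
```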

  18. Cogeneration computer model assessment: Advanced cogeneration research study

    NASA Technical Reports Server (NTRS)

    Rosenberg, L.

    1983-01-01

    Cogeneration computer simulation models were assessed to recommend the most desirable models or their components for use by the Southern California Edison Company (SCE) in evaluating potential cogeneration projects. Existing cogeneration modeling capabilities are described, preferred models are identified, and an approach to the development of a code that will best satisfy SCE requirements is recommended. Five models (CELCAP, COGEN 2, CPA, DEUS, and OASIS) are recommended for further consideration.

  19. Criticality Model

    SciTech Connect

    A. Alsaed

    2004-09-14

    The criticality computational method will be used for evaluating the criticality potential of configurations of fissionable materials (in-package and external to the waste package) within the repository at Yucca Mountain, Nevada, for all waste packages/waste forms. The criticality computational method is also applicable to preclosure configurations. The criticality computational method is a component of the methodology presented in ''Disposal Criticality Analysis Methodology Topical Report'' (YMP 2003). How the criticality computational method fits in the overall disposal criticality analysis methodology is illustrated in Figure 1 (YMP 2003, Figure 3). This calculation will not provide direct input to the total system performance assessment for license application. It is to be used as necessary to determine the criticality potential of configuration classes as determined by the configuration probability analysis of the configuration generator model (BSC 2003a).

  20. Teaching Critical Thinking Skills with CAI: A Design by Two Researchers Shows Computers Can Make a Difference.

    ERIC Educational Resources Information Center

    Bass, George M., Jr.; Perkins, Harvey W.

    1984-01-01

    Describes a project which involved designing a nine-week course utilizing computer assisted instruction (CAI) to teach seventh graders critical thinking skills. Results indicate measurable gains were made in the critical thinking skills of verbal analogy and inductive/deductive reasoning, although no consistent gains were made in logical reasoning…

  1. Need Assessment of Computer Science and Engineering Graduates

    NASA Astrophysics Data System (ADS)

    Surakka, Sami; Malmi, Lauri

    2005-06-01

    This case study considered the syllabus of the first and second year studies in computer science. The aim of the study was to reveal which topics covered in the syllabi were really needed during the following years of study or in working life. The program that was assessed in the study was a Master's program in computer science and engineering at a university of technology in Finland. The necessity of different subjects for the advanced studies (years 3-5) and for working life was assessed using four content analyses: (a) the course catalog of the institution where this study was carried out, (b) employment reports that were attached to the applications for internship credits, (c) master's theses, and (d) job advertisements in a newspaper. The results of the study imply that the necessity of physics for the advanced study and work was very low compared to the extent to which it was studied. On the other hand, the necessity for mathematics was moderate, and it had remained quite steady during the period 1989-2002. The most necessary computer science topic was programming. Telecommunications and networking were also needed often, whereas theoretical computer science was needed quite rarely.

  2. Complexity theory and geographies of health: a critical assessment.

    PubMed

    Gatrell, Anthony C

    2005-06-01

    The interest of social scientists in complexity theory has developed rapidly in recent years. Here, I consider briefly the primary characteristics of complexity theory, with particular emphasis given to relations and networks, non-linearity, emergence, and hybrids. I assess the 'added value' compared with other, existing perspectives that emphasise relationality and connectedness. I also consider the philosophical underpinnings of complexity theory and its reliance on metaphor. As a vehicle for moving away from reductionist accounts, complexity theory potentially has much to say to those interested in research on health inequalities, spatial diffusion, emerging and resurgent infections, and risk. These and other applications in health geography that have invoked complexity theory are examined in the paper. Finally, I consider some of the missing elements in complexity theory and argue that while it is refreshing to see a fruitful line of theoretical debate in health geography, we need good empirical work to illuminate it.

  3. Concepts in critical thinking applied to caries risk assessment in dental education.

    PubMed

    Guzman-Armstrong, Sandra; Warren, John J; Cunningham-Ford, Marsha A; von Bergmann, HsingChi; Johnsen, David C

    2014-06-01

    Much progress has been made in the science of caries risk assessment and ways to analyze caries risk, yet dental education has seen little movement toward the development of frameworks to guide learning and assess critical thinking in caries risk assessment. Because no learning framework had previously been implemented that takes the knowledge of caries risk and critically applies it to the patient with the succinctness demanded in the clinical setting, the purpose of this study was to develop a model learning framework that combines the science of caries risk assessment with principles of critical thinking from the education literature. This article also describes the implementation of that model at one dental school and presents some preliminary assessment data.

  4. Blending Qualitative and Computational Linguistics Methods for Fidelity Assessment: Experience with the Familias Unidas Preventive Intervention.

    PubMed

    Gallo, Carlos; Pantin, Hilda; Villamar, Juan; Prado, Guillermo; Tapia, Maria; Ogihara, Mitsunori; Cruden, Gracelyn; Brown, C Hendricks

    2015-09-01

    Careful fidelity monitoring and feedback are critical to implementing effective interventions. A wide range of procedures exist to assess fidelity; most are derived from observational assessments (Schoenwald and Garland, Psychol Assess 25:146-156, 2013). However, these fidelity measures are resource intensive for research teams in efficacy/effectiveness trials, and are often unattainable or unmanageable for the host organization to rate when the program is implemented on a large scale. We present a first step towards automated processing of linguistic patterns in fidelity monitoring of a behavioral intervention using an innovative mixed methods approach to fidelity assessment that uses rule-based, computational linguistics to overcome major resource burdens. Data come from an effectiveness trial of the Familias Unidas intervention, an evidence-based, family-centered preventive intervention found to be efficacious in reducing conduct problems, substance use and HIV sexual risk behaviors among Hispanic youth. This computational approach focuses on "joining," which measures the quality of the working alliance of the facilitator with the family. Quantitative assessments of reliability are provided. Kappa scores between a human rater and a machine rater for the new method for measuring joining reached 0.83. Early findings suggest that this approach can reduce the high cost of fidelity measurement and the time delay between fidelity assessment and feedback to facilitators; it also has the potential for improving the quality of intervention fidelity ratings.
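
    The agreement statistic reported above is Cohen's kappa, which corrects raw rater agreement for chance. A minimal sketch with hypothetical session-level "joining" ratings:

```python
from sklearn.metrics import cohen_kappa_score

# Hypothetical ratings of "joining" quality (0 = low, 1 = adequate, 2 = high).
human_rater   = [2, 1, 2, 0, 1, 2, 2, 1, 0, 2, 1, 1]
machine_rater = [2, 1, 2, 0, 1, 2, 1, 1, 0, 2, 1, 2]

print(f"kappa = {cohen_kappa_score(human_rater, machine_rater):.2f}")
```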

  5. Use of writing portfolios for interdisciplinary assessment of critical thinking outcomes of nursing students.

    PubMed

    Sorrell, J M; Brown, H N; Silva, M C; Kohlenberg, E M

    1997-01-01

    This article discusses an interdisciplinary research project in which faculty from nursing and English collaborated in the assessment of students' critical thinking skills as reflected in writing portfolios. Faculty reviewed students' writing portfolios and then corresponded by email between two different universities about evidence of critical thinking in the portfolios. Findings suggest that writing portfolios can provide important evidence of critical thinking outcomes. To do this, however, faculty need to design writing assignments to foster critical thinking skills, helping students to think not only about learning to write, but also about using writing to learn.

  6. Transfer matrix computation of critical polynomials for two-dimensional Potts models

    SciTech Connect

    Jacobsen, Jesper Lykke; Scullard, Christian R.

    2013-02-04

    We showed, in our previous work, that critical manifolds of the q-state Potts model can be studied by means of a graph polynomial P_B(q, v), henceforth referred to as the critical polynomial. This polynomial may be defined on any periodic two-dimensional lattice. It depends on a finite subgraph B, called the basis, and the manner in which B is tiled to construct the lattice. The real roots v = e^K - 1 of P_B(q, v) either give the exact critical points for the lattice, or provide approximations that, in principle, can be made arbitrarily accurate by increasing the size of B in an appropriate way. In earlier work, P_B(q, v) was defined by a contraction-deletion identity, similar to that satisfied by the Tutte polynomial. Here, we give a probabilistic definition of P_B(q, v), which facilitates its computation, using the transfer matrix, on much larger B than was previously possible. We present results for the critical polynomial on the (4, 8^2), kagome, and (3, 12^2) lattices for bases of up to, respectively, 96, 162, and 243 edges, compared to the limit of 36 edges with contraction-deletion. We discuss in detail the role of the symmetries and the embedding of B. The critical temperatures v_c obtained for ferromagnetic (v > 0) Potts models are at least as precise as the best available results from Monte Carlo simulations or series expansions. For instance, with q = 3 we obtain v_c(4, 8^2) = 3.742489(4), v_c(kagome) = 1.8764597(2), and v_c(3, 12^2) = 5.03307849(4), the precision being comparable or superior to the best simulation results. More generally, we trace the critical manifolds in the real (q, v) plane and discuss the intricate structure of the phase diagram in the antiferromagnetic (v < 0) region.
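
    The flavour of the method is easiest to see in the simplest case: on the square lattice the critical polynomial for the elementary basis is P_B(q, v) = v^2 - q, and its positive root recovers the exactly known ferromagnetic critical point v_c = sqrt(q). A short sketch:

```python
import numpy as np

# For the square lattice the critical polynomial on the elementary basis is
# P_B(q, v) = v^2 - q, so the ferromagnetic critical point is v_c = sqrt(q),
# i.e. e^K - 1 = sqrt(q), the exactly known result.
for q in (2, 3, 4):
    roots = np.roots([1.0, 0.0, -q])      # coefficients of v^2 - q
    v_c = max(r.real for r in roots)      # ferromagnetic (v > 0) root
    K_c = np.log(1.0 + v_c)
    print(f"q = {q}: v_c = {v_c:.6f}, K_c = {K_c:.6f}")
```

    For q = 2 (the Ising case) this gives K_c = ln(1 + sqrt(2)), the Onsager value; larger bases and other lattices require the transfer-matrix computation described in the paper.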

  7. Advanced criticality assessment method for sewer pipeline assets.

    PubMed

    Syachrani, S; Jeong, H D; Chung, C S

    2013-01-01

    For effective management of water and wastewater infrastructure, the United States Environmental Protection Agency (US-EPA) has long emphasized the significant role of risk in prioritizing and optimizing asset management decisions. High risk assets are defined as assets with a high probability of failure (e.g. soon to fail, old, poor condition) and high consequences of failure (e.g. environmental impact, high expense, safety concerns, social disruption). In practice, the consequences of failure are often estimated by experts through a Delphi method. However, the estimation of the probability of failure has been challenging as it requires the thorough analysis of the historical condition assessment data, repair and replacement records, and other factors influencing the deterioration of the asset. The most common predictor in estimating the probability of failure is calendar age. However, a simple reliance on calendar age as a basis for estimating the asset's deterioration pattern completely ignores the different aging characteristics influenced by various operational and environmental conditions. This paper introduces a new approach of using 'real age' in estimating the probability of failure. Unlike the traditional calendar age method, the real age represents the adjusted age based on the unique operational and environmental conditions of the asset. Depending on the individual deterioration pattern, the real age could be higher or lower than its calendar age. Using the concept of real age, the probability of failure of an asset can be more accurately estimated.
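
    A minimal sketch of the real-age idea: calendar age is rescaled by operational and environmental stress factors, and the adjusted age is fed into a deterioration curve. The multiplicative form, the factor values, and the Weibull parameters are illustrative assumptions, not the paper's calibrated model.

```python
import math

def real_age(calendar_age, condition_factors):
    """Adjust calendar age by multiplicative stress factors (hypothetical form).

    Factors > 1 accelerate aging (e.g. corrosive soil); < 1 slow it down.
    """
    adj = 1.0
    for f in condition_factors.values():
        adj *= f
    return calendar_age * adj

def prob_failure(age, shape=3.0, scale=80.0):
    """Weibull CDF as an illustrative probability-of-failure curve."""
    return 1.0 - math.exp(-((age / scale) ** shape))

pipe = {"soil_corrosivity": 1.3, "traffic_load": 1.1, "good_bedding": 0.9}
cal_age = 45
r_age = real_age(cal_age, pipe)
print(f"calendar age {cal_age} -> real age {r_age:.0f}")
print(f"PoF (calendar): {prob_failure(cal_age):.2%}  PoF (real): {prob_failure(r_age):.2%}")
```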

  8. Prediction of critical heat flux in water-cooled plasma facing components using computational fluid dynamics.

    SciTech Connect

    Bullock, James H.; Youchison, Dennis Lee; Ulrickson, Michael Andrew

    2010-11-01

    Several commercial computational fluid dynamics (CFD) codes now have the capability to analyze Eulerian two-phase flow using the Rohsenow nucleate boiling model. Analysis of boiling due to one-sided heating in plasma facing components (pfcs) is now receiving attention during the design of water-cooled first wall panels for ITER that may encounter heat fluxes as high as 5 MW/m2. Empirical thermal-hydraulic design correlations developed for long fission reactor channels are not reliable when applied to pfcs because fully developed flow conditions seldom exist. Star-CCM+ is one of the commercial CFD codes that can model two-phase flows. Like others, it implements the RPI model for nucleate boiling, but it also seamlessly transitions to a volume-of-fluid model for film boiling. By benchmarking the results of our 3-D models against recent experiments on critical heat flux for both smooth rectangular channels and hypervapotrons, we determined the six unique input parameters that accurately characterize the boiling physics for ITER flow conditions under a wide range of absorbed heat flux. We can now exploit this capability to predict the onset of critical heat flux in these components. In addition, the results clearly illustrate the production and transport of vapor and its effect on heat transfer in pfcs from nucleate boiling through transition to film boiling. This article describes the boiling physics implemented in CCM+ and compares the computational results to the benchmark experiments carried out independently in the United States and Russia. Temperature distributions agreed to within 10 °C for a wide range of heat fluxes from 3 MW/m2 to 10 MW/m2 and flow velocities from 1 m/s to 10 m/s in these devices. Although the analysis is incapable of capturing the stochastic nature of critical heat flux (i.e., time and location may depend on a local materials defect or turbulence phenomenon), it is highly reliable in determining the heat flux where boiling instabilities begin.
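
    The Rohsenow nucleate-boiling correlation named above can be evaluated directly. The sketch below does so for saturated water at atmospheric pressure using standard textbook property values; it illustrates the correlation itself, not its CFD implementation in Star-CCM+.

```python
# Rohsenow nucleate-boiling correlation for saturated water at 1 atm.
# Property values from standard tables; C_sf and n are the usual
# surface/fluid constants for water on polished metal.
g      = 9.81        # gravitational acceleration, m/s^2
mu_l   = 2.79e-4     # liquid viscosity, Pa*s
h_fg   = 2.257e6     # latent heat of vaporization, J/kg
rho_l  = 957.9       # liquid density, kg/m^3
rho_v  = 0.596       # vapour density, kg/m^3
sigma  = 0.0589      # surface tension, N/m
c_pl   = 4217.0      # liquid specific heat, J/(kg*K)
Pr_l   = 1.76        # liquid Prandtl number
C_sf, n = 0.013, 1.0

def rohsenow_heat_flux(delta_T_excess):
    """Nucleate-boiling heat flux (W/m^2) for a wall superheat in K."""
    buoyancy = (g * (rho_l - rho_v) / sigma) ** 0.5
    driving = c_pl * delta_T_excess / (C_sf * h_fg * Pr_l ** n)
    return mu_l * h_fg * buoyancy * driving ** 3

for dT in (5, 10, 15, 20):
    print(f"dT_e = {dT:2d} K -> q'' = {rohsenow_heat_flux(dT)/1e6:6.2f} MW/m^2")
```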

  9. Assessing computer skills in Tanzanian medical students: an elective experience

    PubMed Central

    Samuel, Miriam; Coombes, John C; Miranda, J Jaime; Melvin, Rob; Young, Eoin JW; Azarmina, Pejman

    2004-01-01

    Background One estimate suggests that by 2010 more than 30% of a physician's time will be spent using information technology tools. The aim of this study is to assess the information and communication technologies (ICT) skills of medical students in Tanzania. We also report a pilot intervention of peer mentoring training in ICT by medical students from the UK tutoring students in Tanzania. Methods Design: Cross sectional study and pilot intervention study. Participants: Fourth year medical students (n = 92) attending Muhimbili University College of Health Sciences, Dar es Salaam, Tanzania. Main outcome measures: Self-reported assessment of competence on ICT-related topics and ability to perform specific ICT tasks. Further information related to frequency of computer use (hours per week), years of computer use, reasons for use and access to computers. Skills at specific tasks were reassessed for 12 students following 4 to 6 hours of peer mentoring training. Results The highest levels of competence in generic ICT areas were for email, Internet and file management. For other skills such as word processing most respondents reported low levels of competence. The abilities to perform specific ICT skills were low – less than 60% of the participants were able to perform the core specific skills assessed. A period of approximately 5 hours of peer mentoring training produced an approximate doubling of competence scores for these skills. Conclusion Our study has found a low level of ability to use ICT facilities among medical students in a leading university in sub-Saharan Africa. A pilot scheme utilising UK elective students to tutor basic skills showed potential. Attention is required to develop interventions that can improve ICT skills, as well as computer access, in order to bridge the digital divide. PMID:15306029

  10. Applying analytic hierarchy process to assess healthcare-oriented cloud computing service systems.

    PubMed

    Liao, Wen-Hwa; Qiu, Wan-Li

    2016-01-01

    Numerous differences exist between the healthcare industry and other industries. Difficulties in the business operation of the healthcare industry have continually increased because of the volatility and importance of health care, changes to and requirements of health insurance policies, and the statuses of healthcare providers, which are typically considered not-for-profit organizations. Moreover, because of the financial risks associated with constant changes in healthcare payment methods and constantly evolving information technology, healthcare organizations must continually adjust their business operation objectives; therefore, cloud computing presents both a challenge and an opportunity. As a response to aging populations and the prevalence of the Internet in fast-paced contemporary societies, cloud computing can be used to facilitate the task of balancing the quality and costs of health care. To evaluate cloud computing service systems for use in health care, providing decision makers with a comprehensive assessment method for prioritizing decision-making factors is highly beneficial. Hence, this study applied the analytic hierarchy process, compared items related to cloud computing and health care, executed a questionnaire survey, and then classified the critical factors influencing healthcare cloud computing service systems on the basis of statistical analyses of the questionnaire results. The results indicate that the primary factor affecting the design or implementation of optimal cloud computing healthcare service systems is cost effectiveness, with the secondary factors being practical considerations such as software design and system architecture.
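
    The computational core of AHP is extracting a priority vector from a pairwise-comparison matrix via its principal eigenvector, then checking consistency. A minimal sketch with a hypothetical three-criterion matrix (cost effectiveness, software design, system architecture):

```python
import numpy as np

# Hypothetical pairwise comparisons on Saaty's 1-9 scale.
A = np.array([[1.0, 3.0, 5.0],
              [1/3, 1.0, 2.0],
              [1/5, 1/2, 1.0]])

eigvals, eigvecs = np.linalg.eig(A)
k = np.argmax(eigvals.real)              # principal eigenvalue
weights = np.abs(eigvecs[:, k].real)
weights /= weights.sum()                 # normalized priority vector

n = A.shape[0]
CI = (eigvals[k].real - n) / (n - 1)     # consistency index
RI = {3: 0.58, 4: 0.90, 5: 1.12}[n]      # Saaty's random index
print("priorities:", np.round(weights, 3), " CR =", round(CI / RI, 3))
```

    A consistency ratio (CR) below about 0.1 is conventionally taken to mean the judgments are acceptably consistent.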

  11. Does computer-aided formative assessment improve learning outcomes?

    NASA Astrophysics Data System (ADS)

    Hannah, John; James, Alex; Williams, Phillipa

    2014-02-01

    Two first-year engineering mathematics courses used computer-aided assessment (CAA) to provide students with opportunities for formative assessment via a series of weekly quizzes. Most students used the assessment until they achieved very high (>90%) quiz scores. Although there is a positive correlation between these quiz marks and the final exam marks, spending time on the CAA component of the course was negatively correlated with final exam performance. Students across the ability spectrum reduced their time commitment to CAA in their second semester, with weaker students achieving lower quiz totals, but with more able students' quiz marks hardly affected. Despite this lower quiz performance, the weaker students still improved their final exam marks in the second semester.

  12. Assessment of asthmatic inflammation using hybrid fluorescence molecular tomography-x-ray computed tomography

    NASA Astrophysics Data System (ADS)

    Ma, Xiaopeng; Prakash, Jaya; Ruscitti, Francesca; Glasl, Sarah; Stellari, Fabio Franco; Villetti, Gino; Ntziachristos, Vasilis

    2016-01-01

    Nuclear imaging plays a critical role in asthma research but is limited in its readings of biology due to the short-lived signals of radio-isotopes. We employed hybrid fluorescence molecular tomography (FMT) and x-ray computed tomography (XCT) for the assessment of asthmatic inflammation based on resolving cathepsin activity and matrix metalloproteinase activity in dust mite, ragweed, and Aspergillus species-challenged mice. The reconstructed multimodal fluorescence distribution showed good correspondence with ex vivo cryosection images and histological images, confirming FMT-XCT as an interesting alternative for asthma research.

  13. What Can You Learn in Three Minutes? Critical Reflection on an Assessment Task that Embeds Technology

    ERIC Educational Resources Information Center

    Brown, Natalie Ruth

    2009-01-01

    Purpose: The purpose of this paper is to critically examine an assessment task, undertaken by pre-service science teachers, that integrates the use of technology (in this case digital video-recorders and video-editing software) whilst scaffolding skill development. The embedding of technology into the assessment task is purposeful, aiming to…

  14. Evidence Based Clinical Assessment of Child and Adolescent Social Phobia: A Critical Review of Rating Scales

    ERIC Educational Resources Information Center

    Tulbure, Bogdan T.; Szentagotai, Aurora; Dobrean, Anca; David, Daniel

    2012-01-01

    Investigating the empirical support of various assessment instruments, the evidence based assessment approach expands the scientific basis of psychotherapy. Starting from Hunsley and Mash's evaluative framework, we critically reviewed the rating scales designed to measure social anxiety or phobia in youth. Thirteen of the most researched social…

  15. CRITICAL ANALYSIS OF THE MATHEMATICAL RELATIONSHIPS AND COMPREHENSIVENESS OF LIFE CYCLE IMPACT ASSESSMENT APPROACHES

    EPA Science Inventory

    The impact assessment phase of Life Cycle Assessment (LCA) has received much criticism due to lack of consistency. ISO 14042 requires selection of impact categories that “reflect a comprehensive set of environmental issues” related to the system being studied, especi...

  16. Problem-Based Learning in Geography: Towards a Critical Assessment of Its Purposes, Benefits and Risks

    ERIC Educational Resources Information Center

    Pawson, Eric; Fournier, Eric; Haigh, Martin; Muniz, Osvaldo; Trafford, Julie; Vajoczki, Susan

    2006-01-01

    This paper makes a critical assessment of problem-based learning (PBL) in geography. It assesses what PBL is, in terms of the range of definitions in use and in light of its origins in specific disciplines such as medicine. It considers experiences of PBL from the standpoint of students, instructors and managers (e.g. deans), and asks how well…

  17. Control System Applicable Use Assessment of the Secure Computing Corporation - Secure Firewall (Sidewinder)

    SciTech Connect

    Hadley, Mark D.; Clements, Samuel L.

    2009-01-01

    Battelle’s National Security & Defense objective is, “applying unmatched expertise and unique facilities to deliver homeland security solutions. From detection and protection against weapons of mass destruction to emergency preparedness/response and protection of critical infrastructure, we are working with industry and government to integrate policy, operational, technological, and logistical parameters that will secure a safe future”. In an ongoing effort to meet this mission, engagements with industry that are intended to improve operational and technical attributes of commercial solutions that are related to national security initiatives are necessary. This necessity will ensure that capabilities for protecting critical infrastructure assets are considered by commercial entities in their development, design, and deployment lifecycles thus addressing the alignment of identified deficiencies and improvements needed to support national cyber security initiatives. The Secure Firewall (Sidewinder) appliance by Secure Computing was assessed for applicable use in critical infrastructure control system environments, such as electric power, nuclear and other facilities containing critical systems that require augmented protection from cyber threat. The testing was performed in the Pacific Northwest National Laboratory’s (PNNL) Electric Infrastructure Operations Center (EIOC). The Secure Firewall was tested in a network configuration that emulates a typical control center network and then evaluated. A number of observations and recommendations are included in this report relating to features currently included in the Secure Firewall that support critical infrastructure security needs.

  18. Concepts and techniques: Active electronics and computers in safety-critical accelerator operation

    SciTech Connect

    Frankel, R.S.

    1995-12-31

    The Relativistic Heavy Ion Collider (RHIC) under construction at Brookhaven National Laboratory, requires an extensive Access Control System to protect personnel from Radiation, Oxygen Deficiency and Electrical hazards. In addition, the complicated nature of operation of the Collider as part of a complex of other Accelerators necessitates the use of active electronic measurement circuitry to ensure compliance with established Operational Safety Limits. Solutions were devised which permit the use of modern computer and interconnections technology for Safety-Critical applications, while preserving and enhancing, tried and proven protection methods. In addition a set of Guidelines, regarding required performance for Accelerator Safety Systems and a Handbook of design criteria and rules were developed to assist future system designers and to provide a framework for internal review and regulation.

  19. Improving Educational Assessment: A Computer-Adaptive Multiple Choice Assessment Using NRET as the Scoring Method

    ERIC Educational Resources Information Center

    Sie Hoe, Lau; Ngee Kiong, Lau; Kian Sam, Hong; Bin Usop, Hasbee

    2009-01-01

    Assessment is central to any educational process. Number Right (NR) scoring method is a conventional scoring method for multiple choice items, where students need to pick one option as the correct answer. One point is awarded for the correct response and zero for any other responses. However, it has been heavily criticized for guessing and failure…

  20. A computer program for conducting incinerator risk assessments.

    PubMed

    Walter, M A

    1999-02-01

    In 1994, the United States Environmental Protection Agency (USEPA) developed a screening methodology for conducting indirect exposure risk assessments for combustion facilities. The United States Army Center for Health Promotion and Preventive Medicine currently utilizes this methodology in conjunction with other USEPA guidance documents to perform human health risk assessments (HHRAs). The HHRAs require the development of complex human health models using spreadsheet software packages which estimate various media concentrations of contaminants in the environment. Since the quality assurance/quality control procedures associated with verifying the model's results are extremely time consuming, a computer program was developed using Microsoft Excel to minimize the amount of time needed. This discussion describes the 6 steps taken in developing this computer program, which are: (1) understanding the problem; (2) establishing the structure of each table in the spreadsheets; (3) developing an algorithm to solve the problem; (4) writing code; (5) running the program; and (6) testing the results. The automated process of having the computer predict health risk and hazards for each potentially exposed individual saves a tremendous amount of time because each calculated value is placed in the correct spreadsheet cell location. In addition to the time needed to develop human health spreadsheets, this program also minimizes the potential for reducing human error.
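
    The spreadsheet cells such a program populates are instances of the standard USEPA intake and risk equations. A minimal sketch of one soil-ingestion pathway, with generic, illustrative parameter values:

```python
# Standard USEPA chronic daily intake (CDI) equations; all values illustrative.
C  = 0.004    # contaminant concentration in soil, mg/kg
IR = 100.0    # soil ingestion rate, mg/day
EF = 350.0    # exposure frequency, days/year
ED = 30.0     # exposure duration, years
BW = 70.0     # body weight, kg
AT_nc = ED * 365.0     # averaging time, noncancer, days
AT_c  = 70.0 * 365.0   # averaging time, cancer (lifetime), days
RfD = 3e-4    # reference dose, mg/(kg*day), illustrative
SF  = 1.5     # cancer slope factor, (mg/(kg*day))^-1, illustrative

kg_per_mg = 1e-6  # converts mg soil ingested to kg soil
CDI_nc = C * IR * kg_per_mg * EF * ED / (BW * AT_nc)
CDI_c  = C * IR * kg_per_mg * EF * ED / (BW * AT_c)

print(f"hazard quotient = {CDI_nc / RfD:.3g}")
print(f"cancer risk     = {CDI_c * SF:.3g}")
```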

  1. Computational Pollutant Environment Assessment from Propulsion-System Testing

    NASA Technical Reports Server (NTRS)

    Wang, Ten-See; McConnaughey, Paul; Chen, Yen-Sen; Warsi, Saif

    1996-01-01

    An asymptotic plume growth method based on a time-accurate three-dimensional computational fluid dynamics formulation has been developed to assess the exhaust-plume pollutant environment from a simulated RD-170 engine hot-fire test on the F1 Test Stand at Marshall Space Flight Center. Researchers have long known that rocket-engine hot firing has the potential for forming thermal nitric oxides, as well as producing carbon monoxide when hydrocarbon fuels are used. Because of the complex physics involved, most attempts to predict the pollutant emissions from ground-based engine testing have used simplified methods, which may grossly underpredict and/or overpredict the pollutant formations in a test environment. The objective of this work has been to develop a computational fluid dynamics-based methodology that replicates the underlying test-stand flow physics to accurately and efficiently assess pollutant emissions from ground-based rocket-engine testing. A nominal RD-170 engine hot-fire test was computed, and pertinent test-stand flow physics was captured. The predicted total emission rates compared reasonably well with those of the existing hydrocarbon engine hot-firing test data.

  2. Guiding dental student learning and assessing performance in critical thinking with analysis of emerging strategies.

    PubMed

    Johnsen, David C; Lipp, Mitchell J; Finkelstein, Michael W; Cunningham-Ford, Marsha A

    2012-12-01

    Patient-centered care involves an inseparable set of knowledge, abilities, and professional traits on the part of the health care provider. For practical reasons, health professions education is segmented into disciplines or domains like knowledge, technical skills, and critical thinking, and the culture of dental education is weighted toward knowledge and technical skills. Critical thinking, however, has become a growing presence in dental curricula. To guide student learning and assess performance in critical thinking, guidelines have been developed over the past several decades in the educational literature. Prominent among these guidelines are the following: engage the student in multiple situations/exercises reflecting critical thinking; for each exercise, emulate the intended activity for validity; gain agreement of faculty members across disciplines and curriculum years on the learning construct, application, and performance assessment protocol for reliability; and use the same instrument to guide learning and assess performance. The purposes of this article are 1) to offer a set of concepts from the education literature potentially helpful to guide program design or corroborate existing programs in dental education; 2) to offer an implementation model consolidating these concepts as a guide for program design and execution; 3) to cite specific examples of exercises and programs in critical thinking in the dental education literature analyzed against these concepts; and 4) to discuss opportunities and challenges in guiding student learning and assessing performance in critical thinking for dentistry.

  3. Can Dental Cone Beam Computed Tomography Assess Bone Mineral Density?

    PubMed Central

    2014-01-01

    Mineral density distribution of bone tissue is altered by active bone modeling and remodeling due to bone complications, including bone disease and implantation surgery. Clinical cone beam computed tomography (CBCT) has been examined to determine whether it can assess oral bone mineral density (BMD) in patients. It has been indicated that CBCT has the disadvantages of higher noise and lower contrast than conventional medical computed tomography (CT) systems. On the other hand, it has the advantages of a relatively lower cost and radiation dose but higher spatial resolution. However, the reliability of CBCT-based mineral density measurement has not yet been fully validated. Thus, the objectives of this review are to discuss 1) why assessment of BMD distribution is important and 2) whether clinical CBCT can be used as a potential tool to measure BMD. Brief descriptions of image artefacts associated with assessment of gray values, which have been used to account for mineral density, in CBCT images are provided. Techniques to correct local and conversion errors in obtaining the gray values in CBCT images are also introduced. This review can be used as a quick reference for users who may encounter these errors during analysis of CBCT images. PMID:25006568
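
    One common correction approach is to calibrate gray values against phantom inserts of known mineral density scanned in the same field of view, then convert via a linear fit. A minimal sketch with invented calibration data:

```python
import numpy as np

# Hypothetical calibration: phantom inserts of known density and the CBCT
# gray values measured for them in the same scan.
known_bmd   = np.array([0.0, 0.2, 0.4, 0.8, 1.2])          # g/cm^3
gray_values = np.array([310., 505., 702., 1080., 1465.])   # measured grays

slope, intercept = np.polyfit(gray_values, known_bmd, 1)

def gray_to_bmd(gv):
    """Convert a CBCT gray value to an estimated BMD (g/cm^3)."""
    return slope * gv + intercept

print(f"BMD at gray value 900 ~= {gray_to_bmd(900.0):.2f} g/cm^3")
```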

  4. Assessment of Zero Power Critical Experiments and Needs for a Fission Surface Power System

    SciTech Connect

    Jim R. Parry; John Darrell Bess; Brad T. Rearden; Gary A. Harms

    2009-06-01

    The National Aeronautics and Space Administration (NASA) is providing funding to the Department of Energy (DOE) to assess, develop, and test nuclear technologies that could provide surface power to a lunar outpost. Sufficient testing of this fission surface power (FSP) system will need to be completed to enable a decision by NASA for flight development. The near-term goal for the FSP work is to conduct the minimum amount of testing needed to validate the system performance within an acceptable risk. This report attempts to assess the current modeling capabilities and quantify any bias associated with the modeling methods for designing the nuclear reactor. The baseline FSP system is a sodium-potassium (NaK) cooled, fast spectrum reactor with 93% 235U enriched HEU-O2 fuel, SS316 cladding, and beryllium reflectors with B4C control drums. The FSP is to produce approximately 40 kWe net power with a lifetime of at least 8 years at full power. A flight-ready FSP is to be ready for launch and deployment by 2020. Existing benchmarks from the International Criticality Safety Benchmark Evaluation Program (ICSBEP) were reviewed and modeled in MCNP. An average bias of less than 0.6% was determined using the ENDF/B-VII cross-section libraries, except in the case of subcritical experiments, which exhibited an average bias of approximately 1.5%. The bias increases with increasing reflector worth of the beryllium. The uncertainties and sensitivities in cross-section data for the FSP model and ZPPR-20 configurations were assessed using TSUNAMI-3D. The cross-section covariance uncertainty in the FSP model was calculated as 2.09%, dominated by the uncertainty in the 235U(n,γ) reactions. Global integral indices were generated in TSUNAMI-IP using pre-release SCALE 6 cross-section covariance data. The ZPPR-20 benchmark models exhibit strong similarity with the FSP model. A penalty assessment was performed to determine the degree to which the FSP model could not be characterized…

  5. Performance Assessment of OVERFLOW on Distributed Computing Environment

    NASA Technical Reports Server (NTRS)

    Djomehri, M. Jahed; Rizk, Yehia M.

    2000-01-01

    The aerodynamic computer code, OVERFLOW, with a multi-zone overset grid feature, has been parallelized to enhance its performance on distributed and shared memory paradigms. Practical application benchmarks have been set to assess the efficiency of the code's parallelism on high-performance architectures. The code's performance has also been examined in the context of the distributed computing paradigm, on distant computer resources, using the Information Power Grid (IPG) toolkit, Globus. Two parallel versions of the code, namely OVERFLOW-MPI and -MLP, have been developed around the natural coarse-grained parallelism inherent in a multi-zonal domain decomposition paradigm. The algorithm invokes a strategy that forms a number of groups, each consisting of a zone, a cluster of zones, and/or a partition of a large zone. Each group can be thought of as a process with one or more threads assigned to it; all groups run in parallel. The -MPI version of the code uses explicit message-passing based on the standard MPI library for sending and receiving interzonal boundary data across processors. The -MLP version employs no message-passing paradigm; the boundary data is transferred through the shared memory. The -MPI code is suited for both distributed and shared memory architectures, while the -MLP code can only be used on shared memory platforms. The IPG applications are implemented by the -MPI code using the Globus toolkit. While a computational task is distributed across multiple computer resources, the parallelism can be explored on each resource alone. Performance studies were carried out with practical aerodynamic problems with complex geometries, consisting of 2.5 million up to 33 million grid points and a large number of zonal blocks. The computations were executed primarily on SGI Origin 2000 multiprocessors and on the Cray T3E. OVERFLOW's IPG applications are carried out on NASA homogeneous metacomputing machines located at three sites: Ames, Langley, and Glenn. Plans…
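
    The interzonal boundary exchange at the heart of the -MPI version reduces to a halo swap between neighbouring ranks. A minimal mpi4py sketch of that pattern on a 1-D chain of zones (not OVERFLOW's actual data layout):

```python
# Minimal halo-exchange sketch: each rank owns one zone and swaps boundary
# data with its neighbour. Run with e.g.:  mpiexec -n 2 python halo_exchange.py
from mpi4py import MPI
import numpy as np

comm = MPI.COMM_WORLD
rank, size = comm.Get_rank(), comm.Get_size()

zone = np.full(10, float(rank))   # this rank's zone data
send_bdry = zone[-1:].copy()      # boundary layer to export
recv_bdry = np.zeros(1)           # buffer for the neighbour's boundary

left  = rank - 1 if rank > 0 else MPI.PROC_NULL
right = rank + 1 if rank < size - 1 else MPI.PROC_NULL

# Combined send/receive avoids deadlock at the chain ends (PROC_NULL is a no-op).
comm.Sendrecv(send_bdry, dest=right, recvbuf=recv_bdry, source=left)
print(f"rank {rank} received boundary value {recv_bdry[0]}")
```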

  6. Computer database takes confusion out of multi-property assessments

    SciTech Connect

    Kinworthy, M.L.

    1996-03-01

    Managing environmental site assessments in multi-property transactions poses a special challenge. Multi-site ESAs require a tremendous amount of coordination, data collection and interpretation; often, these tasks must be completed according to accelerated timeframes to meet client deadlines. The tasks can be particularly challenging when several hundred sites are included in the transaction. In such cases, a computer database can be an effective, powerful tool for tracking and managing property data, and generating customized reports for large, multi-site ESAs.

  7. RESRAD-CHEM: A computer code for chemical risk assessment

    SciTech Connect

    Cheng, J.J.; Yu, C.; Hartmann, H.M.; Jones, L.G.; Biwer, B.M.; Dovel, E.S.

    1993-10-01

    RESRAD-CHEM is a computer code developed at Argonne National Laboratory for the U.S. Department of Energy to evaluate chemically contaminated sites. The code is designed to predict human health risks from multipathway exposure to hazardous chemicals and to derive cleanup criteria for chemically contaminated soils. The method used in RESRAD-CHEM is based on the pathway analysis method in the RESRAD code and follows the U.S. Environmental Protection Agency's (EPA's) guidance on chemical risk assessment. RESRAD-CHEM can be used to evaluate a chemically contaminated site and, in conjunction with the use of the RESRAD code, a mixed waste site.

  8. Assessment of a human computer interface prototyping environment

    NASA Technical Reports Server (NTRS)

    Moore, Loretta A.

    1993-01-01

    A Human Computer Interface (HCI) prototyping environment with embedded evaluation capability has been successfully assessed; it will be valuable in developing and refining HCI standards and evaluating program/project interface development, especially Space Station Freedom on-board displays for payload operations. The HCI prototyping environment is designed to include four components: (1) an HCI format development tool, (2) a test and evaluation simulator development tool, (3) a dynamic, interactive interface between the HCI prototype and simulator, and (4) an embedded evaluation capability to evaluate the adequacy of an HCI based on a user's performance.

  10. [Vascular assessment in stroke codes: role of computed tomography angiography].

    PubMed

    Mendigaña Ramos, M; Cabada Giadas, T

    2015-01-01

    Advances in imaging studies for acute ischemic stroke are largely due to the development of new efficacious treatments carried out in the acute phase. Together with computed tomography (CT) perfusion studies, CT angiography facilitates the selection of patients who are likely to benefit from appropriate early treatment. CT angiography plays an important role in the workup for acute ischemic stroke because it makes it possible to confirm vascular occlusion, assess the collateral circulation, and obtain an arterial map that is very useful for planning endovascular treatment. In this review about CT angiography, we discuss the main technical characteristics, emphasizing the usefulness of the technique in making the right diagnosis and improving treatment strategies.

  11. Computational geometry assessment for morphometric analysis of the mandible.

    PubMed

    Raith, Stefan; Varga, Viktoria; Steiner, Timm; Hölzle, Frank; Fischer, Horst

    2017-01-01

    This paper presents a fully automated algorithm for geometry assessment of the mandible. Anatomical landmarks could be reliably detected, and distances were statistically evaluated with principal component analysis. The method makes it possible for the first time to generate a mean mandible shape, with statistically valid geometrical variations, based on a large set of 497 CT scans of human mandibles. The data may be used in bioengineering for designing novel oral implants, for planning computer-guided surgery, and for improving biomechanical models, as it is shown that commercially available mandible replicas differ significantly from the mean of the investigated population.

  12. Nuclear criticality safety assessment of the Consolidated Edison Uranium-Solidification Program Facility

    SciTech Connect

    Thomas, J.T.

    1984-01-01

    A nuclear criticality assessment of the Consolidated Edison Uranium-Solidification Program facility confirms that all operations involved in the process may be conducted with an acceptable margin of subcriticality. Normal operation presents no concern since subcriticality is maintained by design. Several recommendations are presented to prevent, or mitigate the consequences of, any abnormal events that might occur in the various portions of the process. These measures would also serve to reduce to a minimum the administrative controls required to prevent criticality.

  13. Assessment of spare reliability for multi-state computer networks within tolerable packet unreliability

    NASA Astrophysics Data System (ADS)

    Lin, Yi-Kuei; Huang, Cheng-Fu

    2015-04-01

    From a quality-of-service viewpoint, transmission packet unreliability and transmission time are both critical performance indicators in a computer system when assessing Internet quality for supervisors and customers. A computer system is usually modelled as a network topology where each branch denotes a transmission medium and each vertex represents a station of servers. Almost every branch has multiple capacities/states due to failure, partial failure, maintenance, etc. This type of network is known as a multi-state computer network (MSCN). This paper proposes an efficient algorithm that computes the system reliability, i.e., the probability that a specified amount of data can be sent through k (k ≥ 2) disjoint minimal paths within both the tolerable packet unreliability and the time threshold. Furthermore, two routing schemes are established in advance to indicate the main and spare minimal paths and thereby increase the system reliability (referred to as spare reliability). Thus, the spare reliability can be readily computed according to the routing scheme.
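
    As a toy illustration of the reliability computation described above (ignoring the time threshold and per-packet unreliability, which the authors' algorithm also handles), one can enumerate the branch capacity states of a small MSCN and accumulate the probability that either the main or the spare routing scheme meets the demand. The network, state probabilities, and demand below are invented for the sketch:

      from itertools import product

      # Toy MSCN: branch index -> {capacity: probability} (invented numbers).
      branches = [{0: 0.05, 2: 0.25, 4: 0.70},
                  {0: 0.10, 3: 0.90},
                  {0: 0.05, 2: 0.95}]
      main  = [[0], [1]]   # two disjoint minimal paths (lists of branch indices)
      spare = [[0], [2]]   # spare routing scheme
      demand = 5           # units of data to send

      def throughput(paths, caps):
          # Each path carries the minimum capacity along its branches;
          # the k disjoint paths carry data simultaneously.
          return sum(min(caps[i] for i in p) for p in paths)

      rel = 0.0
      for caps in product(*(sorted(b) for b in branches)):
          p = 1.0
          for dist, c in zip(branches, caps):
              p *= dist[c]
          if throughput(main, caps) >= demand or throughput(spare, caps) >= demand:
              rel += p
      print(f"spare reliability ~= {rel:.4f}")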

  14. Deconstructing the Assessment of Anomaly-based Intrusion Detectors for Critical Applications

    SciTech Connect

    Viswanathan, Arun; Tan, Kymie; Neuman, Clifford

    2013-10-01

    Anomaly detection is a key strategy for cyber intrusion detection because it is conceptually capable of detecting novel attacks. This makes it an appealing defensive technique for environments such as the nation's critical infrastructures, which are currently facing increased cyber adversarial activity. When considering deployment within the purview of such critical infrastructures, it is imperative that the technology be well understood and reliable, with its performance benchmarked by principled assessments. This paper works towards that imperative by analyzing the current state of anomaly detector assessments with a view toward mission-critical deployments. We compile a framework of key evaluation constructs that identify how and where current assessment methods may fall short in providing sufficient insight into detector performance characteristics. Within the context of three case studies from the literature, we show how error factors that influence the performance of detectors interact with different phases of a canonical evaluation strategy to compromise the integrity of the final results.

  15. Assessing executive function using a computer game: computational modeling of cognitive processes.

    PubMed

    Hagler, Stuart; Jimison, Holly Brugge; Pavel, Misha

    2014-07-01

    Early and reliable detection of cognitive decline is one of the most important challenges of current healthcare. In this project, we developed an approach whereby a frequently played computer game can be used to assess a variety of cognitive processes and estimate the results of the pen-and-paper trail making test (TMT)--known to measure executive function, as well as visual pattern recognition, speed of processing, working memory, and set-switching ability. We developed a computational model of the TMT based on a decomposition of the test into several independent processes, each characterized by a set of parameters that can be estimated from play of a computer game designed to resemble the TMT. An empirical evaluation of the model suggests that it is possible to use the game data to estimate the parameters of the underlying cognitive processes and to use those parameter values to estimate TMT performance. Cognitive measures and trends in these measures can be used to identify individuals for further assessment, to provide a mechanism for improving the early detection of neurological problems, and to provide feedback and monitoring for cognitive interventions in the home.
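
    A toy sketch in the spirit of the decomposition described above: total test time modeled as a sum of independent per-target processes, each governed by a parameter that game play would be used to estimate. The component structure, distributions, and parameter values here are illustrative assumptions, not the authors' model:

      import numpy as np

      def simulated_tmt_time(n_targets, t_search, t_move, rng):
          """Total time = sum of per-target visual-search and movement times."""
          search = rng.exponential(t_search, n_targets)        # scan for next target
          move = rng.normal(t_move, 0.1 * t_move, n_targets)   # pen travel to target
          return float(search.sum() + move.sum())

      rng = np.random.default_rng(42)
      times = [simulated_tmt_time(25, 1.1, 0.6, rng) for _ in range(200)]
      print(f"predicted completion time: {np.mean(times):.1f} +/- {np.std(times):.1f} s")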

  16. Strategic Computing. New-Generation Computing Technology: A Strategic Plan for Its Development and Application to Critical Problems in Defense

    DTIC Science & Technology

    1983-10-28

    Computing. By seizing an opportunity to leverage recent advances in artificial intelligence, computer science, and microelectronics, the Agency plans… occurred in many separated areas of artificial intelligence, computer science, and microelectronics. Advances in "expert system" technology now… and expert knowledge. Advances in Artificial Intelligence: mechanization of speech recognition, vision, and natural language understanding…

  17. Development and Evaluation of the Diagnostic Power for a Computer-Based Two-Tier Assessment

    ERIC Educational Resources Information Center

    Lin, Jing-Wen

    2016-01-01

    This study adopted a quasi-experimental design with follow-up interview to develop a computer-based two-tier assessment (CBA) regarding the science topic of electric circuits and to evaluate the diagnostic power of the assessment. Three assessment formats (i.e., paper-and-pencil, static computer-based, and dynamic computer-based tests) using…

  18. An assessment technique for computer-socket manufacturing

    PubMed Central

    Sanders, Joan; Severance, Michael

    2015-01-01

    An assessment strategy is presented for testing the quality of carving and forming achieved by individual computer-aided manufacturing facilities. The strategy is potentially useful to facilities making sockets and to companies marketing manufacturing equipment. To execute the strategy, an evaluator fabricates a collection of test models and sockets using the manufacturing suite under evaluation and then measures their shapes using scanning equipment. Overall socket quality is assessed by comparing socket shapes with electronic file shapes. Model shapes are then compared with electronic file shapes to characterize carving performance, and socket shapes are compared with model shapes to characterize forming performance. The mean radial error (MRE), which is the average difference in radii between the two shapes being compared, provides insight into sizing quality. The inter-quartile range (IQR), the range of radial error for the best-matched half of the points on the surfaces being compared, provides insight into shape quality. By determining MRE and IQR for carving and forming separately, the source(s) of socket shape error may be pinpointed. The developed strategy may provide a useful tool to the prosthetics community and industry, helping to identify problems and limitations in computer-aided manufacturing and offering insight into appropriate modifications to overcome them. PMID:21938663
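
    A minimal sketch of the two summary measures as defined above, assuming point-by-point radial coordinates of two already-aligned surfaces are available (the scanning and alignment steps are not shown); the "best-matched half" reading of the IQR is one interpretation of the stated definition:

      import numpy as np

      def mean_radial_error(r_test, r_ref):
          """MRE: average difference in radii between the two shapes (sizing)."""
          return float(np.mean(np.asarray(r_test) - np.asarray(r_ref)))

      def iqr_best_half(r_test, r_ref):
          """IQR as described above: the range of radial error over the
          best-matched half of the compared points (shape quality)."""
          err = np.asarray(r_test) - np.asarray(r_ref)
          best = np.sort(err[np.argsort(np.abs(err))[: err.size // 2]])
          return float(best[-1] - best[0])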

  19. The development and testing of a qualitative instrument designed to assess critical thinking

    NASA Astrophysics Data System (ADS)

    Clauson, Cynthia Louisa

    This study examined a qualitative approach to assess critical thinking. An instrument was developed that incorporates an assessment process based on Dewey's (1933) concepts of self-reflection and critical thinking as problem solving. The study was designed to pilot test the critical thinking assessment process with writing samples collected from a heterogeneous group of students. The pilot test included two phases. Phase 1 was designed to determine the validity and inter-rater reliability of the instrument using two experts in critical thinking, problem solving, and literacy development. Validity of the instrument was addressed by requesting both experts to respond to ten questions in an interview. The inter-rater reliability was assessed by analyzing the consistency of the two experts' scorings of the 20 writing samples to each other, as well as to my scoring of the same 20 writing samples. Statistical analyses included the Spearman Rho and the Kuder-Richardson (Formula 20). Phase 2 was designed to determine the validity and reliability of the critical thinking assessment process with seven science teachers. Validity was addressed by requesting the teachers to respond to ten questions in a survey and interview. Inter-rater reliability was addressed by comparing the seven teachers' scoring of five writing samples with my scoring of the same five writing samples. Again, the Spearman Rho and the Kuder-Richardson (Formula 20) were used to determine the inter-rater reliability. The validity results suggest that the instrument is helpful as a guide for instruction and provides a systematic method to teach and assess critical thinking while problem solving with students in the classroom. The reliability results show the critical thinking assessment instrument to possess fairly high reliability when used by the experts, but weak reliability when used by classroom teachers. A major conclusion was drawn that teachers, as well as students, would need to receive instruction

  20. The Development of a Critical Care Resident Research Curriculum: A Needs Assessment.

    PubMed

    Jain, Sangeeta; Menon, Kusum; Piquette, Dominique; Gottesman, Ronald; Hutchison, James; Gilfoyle, Elaine; Canadian Critical Care Trials Group

    2016-01-01

    Background. Conducting research is an expectation of many clinicians' professional profiles, yet many do not have advanced research degrees. Research training during residency varies amongst institutions, and the research education needs of trainees are not well understood. Objective. To understand the needs of critical care trainees regarding research education. Methods. Canadian critical care trainees, new critical care faculty, program directors, and research coordinators were surveyed regarding research training, research expectations, and support within their programs. Results. Critical care trainees and junior faculty members highlighted many gaps in research knowledge and skills. In contrast, critical care program directors felt that trainees were prepared to undertake research careers. Major differences in opinion exist amongst program directors and other respondent groups regarding preparation for designing a study, navigating research ethics board applications, and managing a research budget. Conclusion. We demonstrated that Canadian critical care trainees and junior faculty reported gaps in knowledge in all areas of research. There was disagreement amongst trainees, junior faculty, research coordinators, and program directors regarding learning needs. Results from this needs assessment will be used to help redesign the education program of the Canadian Critical Care Trials Group to complement local research training offered for critical care trainees.

  1. The Development of a Critical Care Resident Research Curriculum: A Needs Assessment

    PubMed Central

    Jain, Sangeeta; Hutchison, James; Canadian Critical Care Trials Group

    2016-01-01

    Background. Conducting research is an expectation of many clinicians' professional profiles, yet many do not have advanced research degrees. Research training during residency varies amongst institutions, and the research education needs of trainees are not well understood. Objective. To understand the needs of critical care trainees regarding research education. Methods. Canadian critical care trainees, new critical care faculty, program directors, and research coordinators were surveyed regarding research training, research expectations, and support within their programs. Results. Critical care trainees and junior faculty members highlighted many gaps in research knowledge and skills. In contrast, critical care program directors felt that trainees were prepared to undertake research careers. Major differences in opinion exist amongst program directors and other respondent groups regarding preparation for designing a study, navigating research ethics board applications, and managing a research budget. Conclusion. We demonstrated that Canadian critical care trainees and junior faculty reported gaps in knowledge in all areas of research. There was disagreement amongst trainees, junior faculty, research coordinators, and program directors regarding learning needs. Results from this needs assessment will be used to help redesign the education program of the Canadian Critical Care Trials Group to complement local research training offered for critical care trainees. PMID:27610029

  2. Systematic risk assessment methodology for critical infrastructure elements - Oil and Gas subsectors

    NASA Astrophysics Data System (ADS)

    Gheorghiu, A.-D.; Ozunu, A.

    2012-04-01

    The concern for the protection of critical infrastructure has been rapidly growing in the last few years in Europe. The level of knowledge and preparedness in this field is beginning to develop in a lawfully organized manner, for the identification and designation of critical infrastructure elements of national and European interest. Oil and gas production, refining, treatment, storage and transmission by pipelines facilities, are considered European critical infrastructure sectors, as per Annex I of the Council Directive 2008/114/EC of 8 December 2008 on the identification and designation of European critical infrastructures and the assessment of the need to improve their protection. Besides identifying European and national critical infrastructure elements, member states also need to perform a risk analysis for these infrastructure items, as stated in Annex II of the above mentioned Directive. In the field of risk assessment, there are a series of acknowledged and successfully used methods in the world, but not all hazard identification and assessment methods and techniques are suitable for a given site, situation, or type of hazard. As Theoharidou, M. et al. noted (Theoharidou, M., P. Kotzanikolaou, and D. Gritzalis 2009. Risk-Based Criticality Analysis. In Critical Infrastructure Protection III. Proceedings. Third Annual IFIP WG 11.10 International Conference on Critical Infrastructure Protection. Hanover, New Hampshire, USA, March 23-25, 2009: revised selected papers, edited by C. Palmer and S. Shenoi, 35-49. Berlin: Springer.), despite the wealth of knowledge already created, there is a need for simple, feasible, and standardized criticality analyses. The proposed systematic risk assessment methodology includes three basic steps: the first step (preliminary analysis) includes the identification of hazards (including possible natural hazards) for each installation/section within a given site, followed by a criterial analysis and then a detailed analysis step

  3. Electronic Quality of Life Assessment Using Computer-Adaptive Testing

    PubMed Central

    2016-01-01

    Background Quality of life (QoL) questionnaires are desirable for clinical practice but can be time-consuming to administer and interpret, making their widespread adoption difficult. Objective Our aim was to assess the performance of the World Health Organization Quality of Life (WHOQOL)-100 questionnaire as four item banks to facilitate adaptive testing using simulated computer adaptive tests (CATs) for physical, psychological, social, and environmental QoL. Methods We used data from the UK WHOQOL-100 questionnaire (N=320) to calibrate item banks using item response theory, which included psychometric assessments of differential item functioning, local dependency, unidimensionality, and reliability. We simulated CATs to assess the number of items administered before prespecified levels of reliability was met. Results The item banks (40 items) all displayed good model fit (P>.01) and were unidimensional (fewer than 5% of t tests significant), reliable (Person Separation Index>.70), and free from differential item functioning (no significant analysis of variance interaction) or local dependency (residual correlations < +.20). When matched for reliability, the item banks were between 45% and 75% shorter than paper-based WHOQOL measures. Across the four domains, a high standard of reliability (alpha>.90) could be gained with a median of 9 items. Conclusions Using CAT, simulated assessments were as reliable as paper-based forms of the WHOQOL with a fraction of the number of items. These properties suggest that these item banks are suitable for computerized adaptive assessment. These item banks have the potential for international development using existing alternative language versions of the WHOQOL items. PMID:27694100
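
    A schematic simulated-CAT loop under a Rasch (one-parameter) IRT model, an assumption made for illustration (the study used its own item calibrations, and the WHOQOL items are polytomous rather than dichotomous). Items are chosen by Fisher information at the current ability estimate until the standard error reaches a prespecified reliability level:

      import numpy as np

      rng = np.random.default_rng(0)
      bank = rng.normal(0.0, 1.0, 40)      # item difficulties (hypothetical)
      theta_true, theta = 0.5, 0.0         # simulee's true and estimated ability
      asked, responses = [], []

      def p_correct(th, b):
          return 1.0 / (1.0 + np.exp(-(th - b)))

      while True:
          info = p_correct(theta, bank) * (1.0 - p_correct(theta, bank))
          if asked:
              info[asked] = -1.0           # never re-administer an item
          j = int(np.argmax(info))
          asked.append(j)
          responses.append(rng.random() < p_correct(theta_true, bank[j]))
          ps = p_correct(theta, bank[asked])
          grad = np.sum(np.array(responses) - ps)
          hess = -np.sum(ps * (1.0 - ps))
          theta = float(np.clip(theta - grad / hess, -4.0, 4.0))
          se = 1.0 / np.sqrt(-hess)
          # SE < 0.55 corresponds to reliability of about .70 (1 - SE^2).
          if se < 0.55 or len(asked) == bank.size:
              break
      print(f"administered {len(asked)} items, theta {theta:.2f}, SE {se:.2f}")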

  4. Quantitative Computed Tomography and Image Analysis for Advanced Muscle Assessment

    PubMed Central

    Edmunds, Kyle Joseph; Gíslason, Magnus K.; Arnadottir, Iris D.; Marcante, Andrea; Piccione, Francesco; Gargiulo, Paolo

    2016-01-01

    Medical imaging is of particular interest in the field of translational myology, as extant literature describes the utilization of a wide variety of techniques to non-invasively recapitulate and quantify various internal and external tissue morphologies. In the clinical context, medical imaging remains a vital tool for diagnostics and investigative assessment. This review outlines the results from several investigations on the use of computed tomography (CT) and image analysis techniques to assess muscle conditions and degenerative processes due to aging or pathological conditions. Herein, we detail the acquisition of spiral CT images and the use of advanced image analysis tools to characterize muscles in 2D and 3D. Results from these studies recapitulate changes in tissue composition within muscles, as visualized by the association of tissue types to specified Hounsfield Unit (HU) values for fat, loose connective tissue or atrophic muscle, and normal muscle, including fascia and tendon. We show how results from these analyses can be presented as both average HU values and compositions with respect to total muscle volumes, demonstrating the reliability of these tools to monitor, assess and characterize muscle degeneration. PMID:27478562

  5. Crosswords to computers: a critical review of popular approaches to cognitive enhancement.

    PubMed

    Jak, Amy J; Seelye, Adriana M; Jurick, Sarah M

    2013-03-01

    Cognitive enhancement strategies have gained recent popularity and have the potential to benefit clinical and non-clinical populations. As technology advances and the number of cognitively healthy adults seeking methods of improving or preserving cognitive functioning grows, the role of electronic (e.g., computer and video game based) cognitive training becomes more relevant and warrants greater scientific scrutiny. This paper serves as a critical review of empirical evaluations of publicly available electronic cognitive training programs. Many studies have found that electronic training approaches result in significant improvements in trained cognitive tasks. Fewer studies have demonstrated improvements in untrained tasks within the trained cognitive domain, non-trained cognitive domains, or on measures of everyday function. Successful cognitive training programs will elicit effects that generalize to untrained, practical tasks for extended periods of time. Unfortunately, many studies of electronic cognitive training programs are hindered by methodological limitations such as lack of an adequate control group, long-term follow-up and ecologically valid outcome measures. Despite these limitations, evidence suggests that computerized cognitive training has the potential to positively impact one's sense of social connectivity and self-efficacy.

  6. Understanding climate-induced migration through computational modeling: A critical overview with guidance for future efforts

    DOE PAGES

    Till, Charlotte; Haverkamp, Jamie; White, Devin; ...

    2016-11-22

    Climate change has the potential to displace large populations in many parts of the developed and developing world. Understanding why, how, and when environmental migrants decide to move is critical to successful strategic planning within organizations tasked with helping the affected groups, and mitigating their systemic impacts. One way to support planning is through the employment of computational modeling techniques. Models can provide a window into possible futures, allowing planners and decision makers to test different scenarios in order to understand what might happen. While modeling is a powerful tool, it presents both opportunities and challenges. This paper builds a foundation for the broader community of model consumers and developers by: providing an overview of pertinent climate-induced migration research, describing some different types of models and how to select the most relevant one(s), highlighting three perspectives on obtaining data to use in said model(s), and the consequences associated with each. It concludes with two case studies based on recent research that illustrate what can happen when ambitious modeling efforts are undertaken without sufficient planning, oversight, and interdisciplinary collaboration. Lastly, we hope that the broader community can learn from our experiences and apply this knowledge to their own modeling research efforts.

  7. Understanding climate-induced migration through computational modeling: A critical overview with guidance for future efforts

    SciTech Connect

    Till, Charlotte; Haverkamp, Jamie; White, Devin; Bhaduri, Budhendra

    2016-11-22

    Climate change has the potential to displace large populations in many parts of the developed and developing world. Understanding why, how, and when environmental migrants decide to move is critical to successful strategic planning within organizations tasked with helping the affected groups, and mitigating their systemic impacts. One way to support planning is through the employment of computational modeling techniques. Models can provide a window into possible futures, allowing planners and decision makers to test different scenarios in order to understand what might happen. While modeling is a powerful tool, it presents both opportunities and challenges. This paper builds a foundation for the broader community of model consumers and developers by: providing an overview of pertinent climate-induced migration research, describing some different types of models and how to select the most relevant one(s), highlighting three perspectives on obtaining data to use in said model(s), and the consequences associated with each. It concludes with two case studies based on recent research that illustrate what can happen when ambitious modeling efforts are undertaken without sufficient planning, oversight, and interdisciplinary collaboration. Lastly, we hope that the broader community can learn from our experiences and apply this knowledge to their own modeling research efforts.

  8. Sedimentation equilibria in polydisperse ferrofluids: critical comparisons between experiment, theory, and computer simulation.

    PubMed

    Elfimova, Ekaterina A; Ivanov, Alexey O; Lakhtina, Ekaterina V; Pshenichnikov, Alexander F; Camp, Philip J

    2016-05-14

    The sedimentation equilibrium of dipolar particles in a ferrofluid is studied using experiment, theory, and computer simulation. A theory of the particle-concentration profile in a dipolar hard-sphere fluid is developed, based on the local-density approximation and accurate expressions from a recently introduced logarithmic free energy approach. The theory is tested critically against Monte Carlo simulation results for monodisperse and bidisperse dipolar hard-sphere fluids in homogeneous gravitational fields. In the monodisperse case, the theory is very accurate over broad ranges of gravitational field strength, volume fraction, and dipolar coupling constant. In the bidisperse case, with realistic dipolar coupling constants and compositions, the theory is excellent at low volume fraction, but is slightly inaccurate at high volume fraction in that it does not capture a maximum in the small-particle concentration profile seen in simulations. Possible reasons for this are put forward. Experimental measurements of the magnetic-susceptibility profile in a real ferrofluid are then analysed using the theory. The concentration profile is linked to the susceptibility profile using the second-order modified mean-field theory. It is shown that the experimental results are not consistent with the sample being monodisperse. By introducing polydispersity in the simplest possible way, namely by assuming the system is a binary mixture, almost perfect agreement between theory and experiment is achieved.
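
    For orientation (not taken from the paper): in the dilute, non-interacting limit, a local-density treatment of the concentration profile reduces to the familiar barometric law, with the gravitational length set by the buoyant particle mass:

      \phi(z) = \phi_0 \, e^{-z/\ell_g},
      \qquad
      \ell_g = \frac{k_B T}{\Delta m \, g}

    where \phi(z) is the particle volume fraction at height z and \Delta m is the buoyant mass; the dipolar and steric interactions studied in the paper enter as corrections to this ideal profile.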

  9. Application of the Sequential Organ Failure Assessment Score to predict outcome in critically ill dogs: preliminary results.

    PubMed

    Ripanti, D; Dino, G; Piovano, G; Farca, A

    2012-08-01

    In human medicine, the Sequential Organ Failure Assessment (SOFA) score is one of the most commonly used organ dysfunction scoring systems for assessing critically ill patients and predicting outcome in Intensive Care Units (ICUs). It is composed of scores from six organ systems (respiratory, cardiovascular, hepatic, coagulation, renal, and neurological), graded according to the degree of dysfunction. The aim of the current study was to describe the applicability of the SOFA score in assessing the outcome of critically ill dogs. A total of 45 dogs admitted to the ICU were enrolled. Among these, 40 dogs completed the study: 50% survived and left the veterinary clinic. The SOFA score was computed for each dog every 24 hours for the first 3 days of ICU stay, starting on the day of admission. A statistically significant correlation between the SOFA score and death or survival was found. Most of the dogs showing an increase of the SOFA score in the first 3 days of hospitalization died, whereas the dogs with a decrease of the score survived. These results suggest that the SOFA score system could be considered a useful indicator of prognosis in dogs hospitalized in ICUs.
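
    A minimal sketch of how a total SOFA score is assembled from the six organ-system subscores described above; the species-specific grading cut-offs used to assign each 0-4 subscore are not reproduced here, and the example values are invented:

      SYSTEMS = ("respiratory", "cardiovascular", "hepatic",
                 "coagulation", "renal", "neurological")

      def sofa_total(subscores):
          """Sum the six 0-4 organ subscores (total range 0-24)."""
          assert set(subscores) == set(SYSTEMS), "need all six organ systems"
          assert all(0 <= v <= 4 for v in subscores.values()), "subscores are 0-4"
          return sum(subscores.values())

      # Day-to-day trend, which the study linked to outcome (values invented).
      day1 = sofa_total(dict(zip(SYSTEMS, (2, 1, 0, 1, 2, 0))))   # total 6
      day3 = sofa_total(dict(zip(SYSTEMS, (3, 2, 1, 1, 3, 1))))   # total 11
      worsening = day3 > day1   # a rising score was associated with death
      print(day1, day3, worsening)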

  10. Critical Thinking and Formative Assessments: Increasing the Rigor in Your Classroom

    ERIC Educational Resources Information Center

    Moore, Betsy; Stanley, Todd

    2010-01-01

    Develop your students' critical thinking skills and prepare them to perform competitively in the classroom, on state tests, and beyond. In this book, Moore and Stanley show you how to effectively instruct your students to think on higher levels, and how to assess their progress. As states move toward common achievement standards, teachers have…

  11. Developing Institutional Standards for Critical Thinking Using the Collegiate Learning Assessment. Research Brief

    ERIC Educational Resources Information Center

    Hardison, Chaitra M.; Vilamovska, Anna-Marie

    2009-01-01

    The Collegiate Learning Assessment (CLA) measures students' critical thinking skills, but some institutions remain uncertain how to interpret the results. RAND researchers designed a method that institutions can use to develop their own standards. It consists of a three-step process and a system of checks to validate the results. This method will…

  12. Development of Critical Thinking Self-Assessment System Using Wearable Device

    ERIC Educational Resources Information Center

    Gotoh, Yasushi

    2015-01-01

    In this research the author defines critical thinking as skills and dispositions which enable one to solve problems logically and to attempt to reflect autonomously by means of meta-cognitive activities on one's own problem-solving processes. The author focuses on providing meta-cognitive knowledge to help with self-assessment. To develop…

  13. Connecting Assessment and Instruction to Help Students Become More Critical Producers of Multimedia

    ERIC Educational Resources Information Center

    Ostenson, Jonathan William

    2012-01-01

    Classroom teachers have been encouraged to incorporate more multimedia production in the classroom as a means of helping students develop critical media literacy skills. However, they have not always been well trained in how to evaluate the work students create; many teachers struggle to know which criteria to use in assessing student work. This…

  14. A Study on Critical Thinking Assessment System of College English Writing

    ERIC Educational Resources Information Center

    Dong, Tian; Yue, Lu

    2015-01-01

    This research attempts to discuss the validity of introducing the evaluation of students' critical thinking skills (CTS) into the assessment system of college English writing through an empirical study. In this paper, 30 College English Test Band 4 (CET-4) writing samples were collected and analyzed. Students' CTS and the final scores of collected…

  15. Critical Thinking and Political Participation: The Development and Assessment of a Causal Model.

    ERIC Educational Resources Information Center

    Guyton, Edith M.

    An assessment of a four-stage conceptual model reveals that critical thinking has indirect positive effects on political participation through its direct effects on personal control, political efficacy, and democratic attitudes. The model establishes causal relationships among selected personality variables (self-esteem, personal control, and…

  16. Quality Is the Key: Critical Issues in Teaching, Learning and Assessment in Vocational Education and Training

    ERIC Educational Resources Information Center

    Mitchell, John; Chappell, Clive; Bateman, Andrea; Roy, Susan

    2006-01-01

    The main finding from research conducted in 2005 into the critical issues in teaching, learning and assessment in vocational education and training (VET) was that "quality is the major issue." While the research identified many issues--such as the need for providers to be increasingly flexible and responsive in meeting the multiple…

  17. Preliminary performance assessment of computer automated facial approximations using computed tomography scans of living individuals.

    PubMed

    Parks, Connie L; Richard, Adam H; Monson, Keith L

    2013-12-10

    ReFace (Reality Enhancement Facial Approximation by Computational Estimation) is a computer-automated facial approximation application jointly developed by the Federal Bureau of Investigation and GE Global Research. The application derives a statistically based approximation of a face from an unidentified skull using a dataset of ~400 human head computed tomography (CT) scans of living adult American individuals from four ancestry groups: African, Asian, European, and Hispanic (self-identified). To date only one unpublished subjective recognition study has been conducted using ReFace approximations. It indicated that approximations produced by ReFace were recognized above chance rates (10%). This preliminary study assesses: (i) the recognizability of five ReFace approximations; (ii) the recognizability of CT-derived skin surface replicas of the same individuals whose skulls were used to create the ReFace approximations; and (iii) the relationship between recognition performance and resemblance ratings of target individuals. All five skin surface replicas were recognized at rates significantly above chance (22-50%). Four of five ReFace approximations were recognized above chance (5-18%), although with statistical significance only at the higher rate. Such results suggest reconsideration of the usefulness of the type of output format utilized in this study, particularly in regard to facial approximations employed as a means of identifying unknown individuals.

  18. Japanese technology assessment: Computer science, opto- and microelectronics, mechatronics, biotechnology

    SciTech Connect

    Brandin, D.; Wieder, H.; Spicer, W.; Nevins, J.; Oxender, D.

    1986-01-01

    The series studies Japanese research and development in four high-technology areas: computer science, opto- and microelectronics, mechatronics (a term created by the Japanese to describe the union of mechanical and electronic engineering to produce the next generation of machines, robots, and the like), and biotechnology. The evaluations were conducted by panels of U.S. scientists, chosen from academia, government, and industry, who are actively involved in research in their areas of expertise. The studies were prepared for the purpose of aiding the U.S. response to Japan's technological challenge. The main focus of the assessments is on the current status and long-term direction and emphasis of Japanese research and development. Other aspects covered include evolution of the state of the art; identification of Japanese researchers, R and D organizations, and resources; and comparative U.S. efforts. The general time frame of the studies corresponds to future industrial applications and potential commercial impacts spanning approximately the next two decades.

  19. Approaches for the computationally efficient assessment of the plug-in HEV impact on the grid

    NASA Astrophysics Data System (ADS)

    Lee, Tae-Kyung; Filipi, Zoran S.

    2012-11-01

    Realistic duty cycles are critical for the design and assessment of hybrid propulsion systems, in particular plug-in hybrid electric vehicles (PHEVs). Analysis of the PHEV impact requires a large amount of data about daily missions to ensure realism in the predicted temporal loads on the grid. This paper presents two approaches for reducing the computational effort of assessing the large-scale PHEV impact on the grid, namely 1) a "response surface modelling" approach and 2) a "daily driving schedule modelling" approach. The response surface modelling approach replaces the time-consuming vehicle simulations with response surfaces constructed off-line with consideration of real-world driving. The daily driving schedule modelling approach establishes a correlation between departure and arrival times and predicts representative driving patterns with a significantly reduced number of simulation cases. In both cases, representative synthetic driving cycles are used to capture naturalistic driving characteristics for a given trip length. The proposed approaches enable the construction of 24-hour missions, assessment of charging requirements at the time of plugging in, and computation of temporal distributions of the load on the grid with high computational efficiency.
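
    A sketch of the response-surface idea under stated assumptions: a handful of expensive vehicle simulations are fitted offline (here, a quadratic in trip length, with invented numbers standing in for simulation output), after which the charging demand of thousands of daily missions can be evaluated cheaply:

      import numpy as np

      # Offline: a few expensive simulations -> (trip length [mi], energy used [kWh]).
      trip_mi  = np.array([5.0, 10.0, 20.0, 30.0, 40.0, 60.0])
      kwh_used = np.array([1.2, 2.5, 5.3, 8.0, 10.9, 16.2])
      coef = np.polyfit(trip_mi, kwh_used, 2)        # quadratic response surface

      # Online: evaluate the surface for a whole fleet's daily trip lengths.
      fleet_trips = np.random.default_rng(1).uniform(3.0, 55.0, 10_000)
      charge_kwh = np.clip(np.polyval(coef, fleet_trips), 0.0, None)
      print(f"aggregate plug-in energy demand: {charge_kwh.sum():.0f} kWh")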

  20. Cyst-based measurements for assessing lymphangioleiomyomatosis in computed tomography

    SciTech Connect

    Lo, P.; Brown, M. S.; Kim, H.; Kim, H.; Goldin, J. G.; Argula, R.; Strange, C.

    2015-05-15

    Purpose: To investigate the efficacy of a new family of measurements made on individual pulmonary cysts extracted from computed tomography (CT) for assessing the severity of lymphangioleiomyomatosis (LAM). Methods: CT images were analyzed using thresholding to identify a cystic region of interest from chest CT of LAM patients. Individual cysts were then extracted from the cystic region by the watershed algorithm, which separates individual cysts based on subtle edges within the cystic regions. A family of measurements was then computed to quantify the amount, distribution, and boundary appearance of the cysts. Sequential floating feature selection was used to select a small subset of features for quantification of the severity of LAM. Adjusted R² from multiple linear regression and R² from linear regression against measurements from spirometry were used to compare the performance of our proposed measurements with the density-based CT measurements currently used in the literature, namely, the relative area measure and the D measure. Results: Volumetric CT data, performed at total lung capacity and residual volume, from a total of 49 subjects enrolled in the MILES trial were used in our study. Our proposed measures had adjusted R² ranging from 0.42 to 0.59 when regressed against the spirometry measures, with p < 0.05. For the previously used density-based CT measurements, the best R² was 0.46 (for only one instance), with the majority being lower than 0.3 or having p > 0.05. Conclusions: The proposed family of CT-based cyst measurements correlates better with spirometric measures than previously used density-based CT measurements. They show potential as a sensitive tool for quantitatively assessing the severity of LAM.
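
    A schematic version of the segmentation pipeline described above (attenuation threshold, then watershed on the distance transform to split touching cysts), using scikit-image and SciPy; the synthetic slice and the threshold value are placeholders, not the study's parameters:

      import numpy as np
      from scipy import ndimage as ndi
      from skimage.segmentation import watershed
      from skimage.feature import peak_local_max

      ct = np.random.default_rng(0).normal(-700, 150, (128, 128))  # stand-in slice
      cyst_mask = ct < -950           # hypothetical low-attenuation threshold (HU)

      # Watershed on the distance transform splits cysts joined by subtle edges.
      dist = ndi.distance_transform_edt(cyst_mask)
      peaks = peak_local_max(dist, min_distance=3, labels=ndi.label(cyst_mask)[0])
      markers = np.zeros(dist.shape, dtype=int)
      markers[tuple(peaks.T)] = np.arange(1, len(peaks) + 1)
      cysts = watershed(-dist, markers, mask=cyst_mask)

      # Per-cyst measurements (here just areas) feed the feature-selection step.
      areas = np.bincount(cysts.ravel())[1:]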

  1. Color calculations for and perceptual assessment of computer graphic images

    SciTech Connect

    Meyer, G.W.

    1986-01-01

    Realistic image synthesis involves the modelling of an environment in accordance with the laws of physics and the production of a final simulation that is perceptually acceptable. To be considered a scientific endeavor, synthetic image generation should also include the final step of experimental verification. This thesis concentrates on the color calculations that are inherent in the production of the final simulation and on the perceptual assessment of the computer graphic images that result. The fundamental spectral sensitivity functions that are active in the human visual system are introduced and are used to address color-blindness issues in computer graphics. A digitally controlled color television monitor is employed to successfully implement both the Farnsworth-Munsell 100-hue test and a new color vision test that yields more accurate diagnoses. Images that simulate color-blind vision are synthesized and are used to evaluate color scales for data display. Gaussian quadrature is used with a set of opponent fundamentals to select the wavelengths at which to perform synthetic image generation.

  2. An assessment of criticality safety at the Department of Energy Rocky Flats Plant, Golden, Colorado, July--September 1989

    SciTech Connect

    Mattson, Roger J.

    1989-09-01

    This is a report on the 1989 independent Criticality Safety Assessment of the Rocky Flats Plant, primarily in response to public concerns that nuclear criticality accidents involving plutonium may have occurred at this nuclear weapon component fabrication and processing plant. The report evaluates environmental issues, fissile material storage practices, ventilation system problem areas, and criticality safety practices. While no evidence of a criticality accident was found, several recommendations are made for criticality safety improvements.

  3. English Computer Critical Thinking Reading and Writing Interactive Multi-Media Programs for Comparison/Contrast and Analysis.

    ERIC Educational Resources Information Center

    Barkley, Christine

    Two computer programs were developed to enhance community college students' critical thinking skills in the areas of "Comparison and Contrast" and "Analysis." Instructors have several options in using the programs. With access to an LCD panel and an overhead projector, instructors can use the programs in the classroom, manipulating the computer…

  4. How Day of Posting Affects Level of Critical Discourse in Asynchronous Discussions and Computer-Supported Collaborative Argumentation

    ERIC Educational Resources Information Center

    Jeong, Allan; Frazier, Sue

    2008-01-01

    In asynchronous threaded discussions, messages posted near the end of the week provide less time for students to critically examine and respond to ideas presented in the messages than messages posted early in the week. This study examined how the day in which messages are posted (early, midweek and weekend) in computer-supported collaborative…

  5. New Dental Accreditation Standard on Critical Thinking: A Call for Learning Models, Outcomes, Assessments.

    PubMed

    Johnsen, David C; Williams, John N; Baughman, Pauletta Gay; Roesch, Darren M; Feldman, Cecile A

    2015-10-01

    This opinion article applauds the recent introduction of a new dental accreditation standard addressing critical thinking and problem-solving, but expresses a need for additional means for dental schools to demonstrate they are meeting the new standard because articulated outcomes, learning models, and assessments of competence are still being developed. Validated, research-based learning models are needed to define reference points against which schools can design and assess the education they provide to their students. This article presents one possible learning model for this purpose and calls for national experts from within and outside dental education to develop models that will help schools define outcomes and assess performance in educating their students to become practitioners who are effective critical thinkers and problem-solvers.

  6. TRECII: a computer program for transportation risk assessment

    SciTech Connect

    Franklin, A.L.

    1980-05-01

    A risk-based fault tree analysis method has been developed at the Pacific Northwest Laboratory (PNL) for analysis of nuclear fuel cycle operations. This methodology was developed for the Department of Energy (DOE) as a risk analysis tool for evaluating high level waste management systems. A computer package consisting of three programs was written at that time to assist in the performance of risk assessment: ACORN (draws fault trees), MFAULT (analyzes fault trees), and RAFT (calculates risk). This methodology evaluates release consequences and estimates the frequency of occurrence of these consequences. This document describes an additional risk calculating code which can be used in conjunction with two of the three codes for transportation risk assessment. TRECII modifies the definition of risk used in RAFT (prob. x release) to accommodate release consequences in terms of fatalities. Throughout this report risk shall be defined as probability times consequences (fatalities are one possible health effect consequence). This methodology has been applied to a variety of energy material transportation systems. Typically the material shipped has been radioactive, although some adaptation to fossil fuels has occurred. The approach is normally applied to truck or train transport systems with some adaptation to pipelines and aircraft. TRECII is designed to be used primarily in conjunction with MFAULT; however, with a moderate amount of effort by the user, it can be implemented independently of the risk analysis package developed at PNL. Code description and user instructions necessary for the implementation of the TRECII program are provided.
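
    An arithmetic illustration of the risk definition adopted above (risk = probability times consequences, with fatalities as the consequence measure); the release categories and numbers below are invented for the example:

      # Each release category: (annual frequency, expected fatalities per release).
      releases = [
          (1e-3, 0.2),   # frequent, minor transport release
          (1e-5, 12.0),  # rare, severe accident release
      ]
      # Risk = sum over categories of probability x consequences.
      risk = sum(freq * fatalities for freq, fatalities in releases)
      print(f"total risk ~= {risk:.2e} fatalities/yr")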

  7. [The importance of assessing the "quality of life" in surgical interventions for critical lower limb ischaemia].

    PubMed

    Papp, László

    2004-02-01

    'Patency' and 'limb salvage' are not automatically valid parameters when the functional outcome of treatment for critical limb ischaemia is assessed. In a small number of patients the functional result is not favourable despite the anatomical patency and limb salvage. The considerable investment of human/financial resources in the treatment of these patients is retrospectively questionable in such cases. Quality of Life questionnaires give valuable information on the functional outcome of any means of treatment for critical ischaemia. The problem with the generic tools in one particular sub-group of patients is the reliability and validity of the tests. The first disease-specific test in critical limb ischaemia is the King's College Vascular Quality of Life (VascuQoL) Questionnaire. Its use is recommended in patients with critical lower limb ischaemia. It is very useful for scientific reporting and is able to show retrospectively that particular group of patients in whom the technical success of the treatment did not result in improvement in quality of life. In general practice the use of the questionnaire can decrease the factor of subjectivity in the assessment of the current status of a patient with newly diagnosed or previously treated critical ischaemia.

  8. Life cycle assessment study of a Chinese desktop personal computer.

    PubMed

    Duan, Huabo; Eugster, Martin; Hischier, Roland; Streicher-Porte, Martin; Li, Jinhui

    2009-02-15

    Associated with the tremendous prosperity of the world electronic information and telecommunication industry, there is an increasing awareness of the environmental impacts related to the accelerating mass production, electricity use, and waste management of electronic and electric products (e-products). China's importance as both a consumer and a supplier of e-products has grown at an unprecedented pace in the recent decade. Hence, this paper describes the application of life cycle assessment (LCA) to investigate the environmental performance of Chinese e-products from a global perspective. A desktop personal computer system was selected for a detailed and modular LCA that follows the ISO 14040 series. The LCA was constructed with SimaPro software version 7.0 and expressed with the Eco-indicator'99 life cycle impact assessment method. For a sensitivity analysis of the overall LCA results, the so-called CML method was used to estimate the influence of the choice of assessment method on the result. Life cycle inventory information was compiled from the ecoinvent 1.3 databases, combined with literature and field investigations of the present Chinese situation. The LCA study shows that the manufacturing and the use of such devices are of the highest environmental importance. In the manufacturing of such devices, the integrated circuits (ICs) and the liquid crystal display (LCD) are the parts contributing most to the impact. Since no other aspects are taken into account during the use phase, its impact depends on how the electricity is produced. The final process steps, i.e., the end-of-life phase, lead to a clear environmental benefit if a formal and modern, up-to-date technical system is assumed, as in this study.

  9. Physiologic Assessment of Coronary Artery Disease by Cardiac Computed Tomography

    PubMed Central

    Kochar, Minisha

    2013-01-01

    Coronary artery disease (CAD) remains the leading cause of death and morbidity worldwide. To date, diagnostic evaluation of patients with suspected CAD has relied upon the use of physiologic non-invasive testing by stress electrocardiography, echocardiography, myocardial perfusion imaging (MPI) and magnetic resonance imaging. Indeed, the importance of physiologic evaluation of CAD has been highlighted by large-scale randomized trials that demonstrate the propitious benefit of an integrated anatomic-physiologic evaluation method by performing lesion-specific ischemia assessment by fractional flow reserve (FFR), widely considered the "gold" standard for ischemia assessment, at the time of invasive angiography. Coronary CT angiography (CCTA) has emerged as an attractive non-invasive test for anatomic illustration of the coronary arteries and atherosclerotic plaque. In a series of prospective multicenter trials, CCTA has been proven as having high diagnostic performance for stenosis detection as compared to invasive angiography. Nevertheless, CCTA evaluation of obstructive stenoses is prone to overestimation of severity and further, detection of stenoses by CCTA does not reliably determine the hemodynamic significance of the visualized lesions. Recently, a series of technological innovations have advanced the possibility of CCTA to enable physiologic evaluation of CAD, thereby creating the potential of this test to provide an integrated anatomic-physiologic assessment of CAD. These advances include rest-stress MPI by CCTA as well as the use of computational fluid dynamics to non-invasively calculate FFR from a typically acquired CCTA. The purpose of this review is to summarize the most recent data addressing these 2 physiologic methods of CAD evaluation by CCTA. PMID:23964289

  10. A national critical loads framework for atmospheric deposition effects assessment: II. Defining assessment end points, indicators, and functional subregions

    NASA Astrophysics Data System (ADS)

    Hunsaker, Carolyn; Graham, Robin; Turner, Robert S.; Ringold, Paul L.; Holdren, George R.; Strickland, Timothy C.

    1993-05-01

    The United States Environmental Protection Agency, with support from the US Department of Energy and the National Oceanographic and Atmospheric Administration, has been evaluating the feasibility of an effects-based (critical loads) approach to atmospheric pollutant regulation and abatement. The rationale used to develop three of the six steps in a flexible assessment framework (Strickland and others, 1992) is presented along with a discussion of a variety of implementation approaches and their ramifications. The rationale proposes that it is necessary to provide an explicit statement of the condition of the resource that is considered valuable (assessment end point) because: (1) individual ecosystem components may be more or less sensitive to deposition, (2) it is necessary to select indicators of ecosystem condition that can be objectively measured and that reflect changes in the quality of the assessment end point, and (3) acceptable status (i.e., value of indicator and quality of assessment end point at critical load) must be defined. The rationale also stresses the importance of defining the assessment regions and subregions to improve the analysis and understanding of the indicator response to deposition. Subregional definition can be based on a variety of criteria, including informed judgment or quantitative procedures. It also depends on the geographic scale at which exposure and effects models are accurate and on data availability, resolution, and quality.

  11. Computer Science: A Historical Perspective and a Current Assessment

    NASA Astrophysics Data System (ADS)

    Wirth, Niklaus

    We begin with a brief review of the early years of Computer Science. This period was dominated by large, remote computers and the struggle to master the complex problems of programming. The remedy was found in programming languages providing suitable abstractions and programming models. Outstanding was the language Algol 60, designed by an international committee, and intended as a publication language for algorithms. The early period ends with the advent of the microcomputer in the mid 1970s, bringing computing into homes and schools. The outstanding computer was the Alto, the first personal computer with substantial computing power. It changed the world of computing.

  12. Assessing Critical Thinking Outcomes of Dental Hygiene Students Utilizing Virtual Patient Simulation: A Mixed Methods Study.

    PubMed

    Allaire, Joanna L

    2015-09-01

    Dental hygiene educators must determine which educational practices best promote critical thinking, a quality necessary to translate knowledge into sound clinical decision making. The aim of this small pilot study was to determine whether virtual patient simulation had an effect on the critical thinking of dental hygiene students. A pretest-posttest design using the Health Science Reasoning Test was used to evaluate the critical thinking skills of senior dental hygiene students at The University of Texas School of Dentistry at Houston Dental Hygiene Program before and after their experience with computer-based patient simulation cases. Additional survey questions sought to identify the students' perceptions of whether the experience had helped develop their critical thinking skills and improved their ability to provide competent patient care. A convenience sample of 31 senior dental hygiene students completed both the pretest and posttest (81.5% of total students in that class); 30 senior dental hygiene students completed the survey on perceptions of the simulation (78.9% response rate). Although the results did not show a significant increase in mean scores, the students reported feeling that the use of virtual patients was an effective teaching method to promote critical thinking, problem-solving, and confidence in the clinical realm. The results of this pilot study may have implications to support the use of virtual patient simulations in dental hygiene education. Future research could include a larger controlled study to validate findings from this study.

  13. Infrastructure Suitability Assessment Modeling for Cloud Computing Solutions

    DTIC Science & Technology

    2011-09-01

    …increased implementations of the cloud computing paradigm, dissolving the need to co-locate user and computing power by providing desired services through the… technologies, such as the widespread availability of fast computer networks, inexpensive computing power provided by small-form factor servers and…

  14. Planning the Unplanned Experiment: Towards Assessing the Efficacy of Standards for Safety-Critical Software

    NASA Technical Reports Server (NTRS)

    Graydon, Patrick J.; Holloway, C. M.

    2015-01-01

    Safe use of software in safety-critical applications requires well-founded means of determining whether software is fit for such use. While software in industries such as aviation has a good safety record, little is known about whether standards for software in safety-critical applications 'work' (or even what that means). It is often (implicitly) argued that software is fit for safety-critical use because it conforms to an appropriate standard. Without knowing whether a standard works, such reliance is an experiment; without carefully collecting assessment data, that experiment is unplanned. To help plan the experiment, we organized a workshop to develop practical ideas for assessing software safety standards. In this paper, we relate and elaborate on the workshop discussion, which revealed subtle but important study design considerations and practical barriers to collecting appropriate historical data and recruiting appropriate experimental subjects. We discuss assessing standards as written and as applied, several candidate definitions for what it means for a standard to 'work,' and key assessment strategies and study techniques, along with the pros and cons of each. Finally, we conclude with thoughts about the kinds of research that will be required and how academia, industry, and regulators might collaborate to overcome the noted barriers.

  15. Assessment of metabolic bone diseases by quantitative computed tomography

    NASA Technical Reports Server (NTRS)

    Richardson, M. L.; Genant, H. K.; Cann, C. E.; Ettinger, B.; Gordan, G. S.; Kolb, F. O.; Reiser, U. J.

    1985-01-01

    Advances in the radiologic sciences have permitted the development of numerous noninvasive techniques for measuring the mineral content of bone, with varying degrees of precision, accuracy, and sensitivity. The techniques of standard radiography, radiogrammetry, photodensitometry, Compton scattering, neutron activation analysis, single and dual photon absorptiometry, and quantitative computed tomography (QCT) are described and reviewed in depth. Results from previous cross-sectional and longitudinal QCT investigations are given. The authors then describe a current investigation in which they studied 269 subjects, including 173 normal women, 34 patients with hyperparathyroidism, 24 patients with steroid-induced osteoporosis, and 38 men with idiopathic osteoporosis. Spinal quantitative computed tomography, radiogrammetry, and single photon absorptiometry were performed, and a spinal fracture index was calculated for all patients. The authors found a disproportionate loss of spinal trabecular mineral compared to appendicular mineral in the men with idiopathic osteoporosis and the patients with steroid-induced osteoporosis. They observed roughly equivalent mineral loss in both the appendicular and axial regions in the hyperparathyroid patients. The appendicular cortical measurements correlated moderately well with each other but less well with spinal trabecular QCT. The spinal fracture index correlated well with QCT and less well with the appendicular measurements. Knowledge of appendicular cortical mineral status is important in its own right but is not a valid predictor of axial trabecular mineral status, which may be disproportionately decreased in certain diseases. Quantitative CT provides a reliable means of assessing the latter region of the skeleton, correlates well with the spinal fracture index (a semiquantitative measurement of end-organ failure), and offers the clinician a sensitive means of following the effects of therapy.

  16. Benchmark Problems Used to Assess Computational Aeroacoustics Codes

    NASA Technical Reports Server (NTRS)

    Dahl, Milo D.; Envia, Edmane

    2005-01-01

    The field of computational aeroacoustics (CAA) encompasses numerical techniques for calculating all aspects of sound generation and propagation in air directly from fundamental governing equations. Aeroacoustic problems typically involve flow-generated noise, with and without the presence of a solid surface, and the propagation of the sound to a receiver far away from the noise source. It is a challenge to obtain accurate numerical solutions to these problems. The NASA Glenn Research Center has been at the forefront in developing and promoting the development of CAA techniques and methodologies for computing the noise generated by aircraft propulsion systems. To assess the technological advancement of CAA, Glenn, in cooperation with the Ohio Aerospace Institute and the AeroAcoustics Research Consortium, organized and hosted the Fourth CAA Workshop on Benchmark Problems. Participants from industry and academia from both the United States and abroad joined to present and discuss solutions to benchmark problems. These demonstrated technical progress ranging from the basic challenges to accurate CAA calculations to the solution of CAA problems of increasing complexity and difficulty. The results are documented in the proceedings of the workshop. Problems were solved in five categories. In three of the five categories, exact solutions were available for comparison with CAA results. A fourth category of problems representing sound generation from either a single airfoil or a blade row interacting with a gust (i.e., problems relevant to fan noise) had approximate analytical or completely numerical solutions. The fifth category of problems involved sound generation in a viscous flow. In this case, the CAA results were compared with experimental data.

  17. Reshaping Computer Literacy Teaching in Higher Education: Identification of Critical Success Factors

    ERIC Educational Resources Information Center

    Taylor, Estelle; Goede, Roelien; Steyn, Tjaart

    2011-01-01

    Purpose: Acquiring computer skills is more important today than ever before, especially in a developing country. Teaching of computer skills, however, has to adapt to new technology. This paper aims to model factors influencing the success of the learning of computer literacy by means of an e-learning environment. The research question for this…

  18. Understanding the Critics of Educational Technology: Gender Inequities and Computers 1983-1993.

    ERIC Educational Resources Information Center

    Mangione, Melissa

    Although many view computers purely as technological tools to be utilized in the classroom and workplace, attention has been drawn to the social differences computers perpetuate, including those of race, class, and gender. This paper focuses on gender and computing by examining recent analyses in regards to content, form, and usage concerns. The…

  19. The Computing Alliance of Hispanic-Serving Institutions: Supporting Hispanics at Critical Transition Points

    ERIC Educational Resources Information Center

    Gates, Ann Quiroz; Hug, Sarah; Thiry, Heather; Alo, Richard; Beheshti, Mohsen; Fernandez, John; Rodriguez, Nestor; Adjouadi, Malek

    2011-01-01

    Hispanics have the highest growth rates among all groups in the U.S., yet they remain considerably underrepresented in computing careers and in the numbers who obtain advanced degrees. Hispanics constituted about 7% of undergraduate computer science and computer engineering graduates and 1% of doctoral graduates in 2007-2008. The small number of…

  20. Intelligent computer based reliability assessment of multichip modules

    NASA Astrophysics Data System (ADS)

    Grosse, Ian R.; Katragadda, Prasanna; Bhattacharya, Sandeepan; Kulkarni, Sarang

    1994-04-01

    To deliver reliable multichip modules (MCMs) in the face of rapidly changing technology, computer-based tools are needed for predicting the thermal and mechanical behavior of various MCM package designs and for selecting the most promising design in terms of performance, robustness, and reliability. The design tool must be able to address new design technologies, manufacturing processes, novel materials, application criteria, and thermal environmental conditions. Reliability is one of the most important factors for determining design quality and hence must be a central consideration in the design of multichip module packages. Clearly, design engineers need computer-based simulation tools for rapid and efficient electrical, thermal, and mechanical modeling and optimization of advanced devices. For three-dimensional thermal and mechanical simulation of advanced devices, the finite element method (FEM) is increasingly becoming the numerical method of choice. FEM is a versatile and sophisticated numerical technique for solving the partial differential equations that describe the physical behavior of complex designs. AUTOTHERM(TM) is an MCM design tool developed by Mentor Graphics for Motorola, Inc. This tool performs thermal analysis of MCM packages using finite element analysis techniques. The tool uses an object-oriented representation of components and a simplified specification of boundary conditions for the thermal analysis, so that the user need not be an expert in finite element techniques. Different package types can be assessed and environmental conditions can be modeled. It also includes a detailed reliability module which allows the user to choose a desired failure mechanism (model). All the current tools perform thermal and/or stress analysis but do not address the issues of robustness and optimality of MCM designs, and the reliability prediction techniques are based on closed-form analytical models and can often fail to predict the cycles to failure (N
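
    The abstract above describes FEM-based thermal simulation of MCM packages. As a rough illustration of the kind of calculation such tools automate, here is a toy one-dimensional steady-state conduction solve (finite differences rather than FEM; the geometry, conductivity, heat load, and boundary temperatures are assumed for the example and are not AUTOTHERM's model):

        import numpy as np

        # Toy 1-D steady conduction: k * T'' + q = 0 across a package layer.
        n, L, k = 21, 0.01, 150.0   # nodes, thickness (m), conductivity W/(m K)
        q = 2e7                     # volumetric heat from the die, W/m^3 (assumed)
        dx = L / (n - 1)

        A = np.zeros((n, n))
        b = np.full(n, -q * dx**2 / k)          # discretized source term
        for i in range(1, n - 1):
            A[i, i - 1], A[i, i], A[i, i + 1] = 1.0, -2.0, 1.0
        A[0, 0] = A[-1, -1] = 1.0               # fixed-temperature boundaries
        b[0] = b[-1] = 85.0                     # case temperature, deg C (assumed)

        T = np.linalg.solve(A, b)
        print(f"peak temperature: {T.max():.1f} C")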

  1. Improving Student Performance through Computer-Based Assessment: Insights from Recent Research.

    ERIC Educational Resources Information Center

    Ricketts, C.; Wilks, S. J.

    2002-01-01

    Compared student performance on computer-based assessment to machine-graded multiple choice tests. Found that performance improved dramatically on the computer-based assessment when students were not required to scroll through the question paper. Concluded that students may be disadvantaged by the introduction of online assessment unless care is…

  2. Combination of inquiry learning model and computer simulation to improve mastery concept and the correlation with critical thinking skills (CTS)

    NASA Astrophysics Data System (ADS)

    Nugraha, Muhamad Gina; Kaniawati, Ida; Rusdiana, Dadi; Kirana, Kartika Hajar

    2016-02-01

    Among the purposes of physics learning at high school are to master physics concepts, to cultivate a scientific attitude (including a critical attitude), and to develop inductive and deductive reasoning skills. According to Ennis et al., inductive and deductive reasoning skills are part of critical thinking. Preliminary studies showed that both competencies are poorly achieved: student learning outcomes are low, and the learning processes are not conducive to cultivating critical thinking (teacher-centered learning). One learning model predicted to increase mastery of concepts and train CTS is the inquiry learning model aided by computer simulations. In this model, students are given the opportunity to be actively involved in experiments and also receive good explanations through the computer simulations. From research with a randomized control group pretest-posttest design, we found that the inquiry learning model aided by computer simulations can significantly improve students' mastery of concepts compared with the conventional (teacher-centered) method. With the inquiry learning model aided by computer simulations, 20% of students had high CTS, 63.3% medium, and 16.7% low. CTS contributed greatly to students' mastery of concepts, with a correlation coefficient of 0.697, and contributed fairly to the enhancement of mastery of concepts, with a correlation coefficient of 0.603.

  3. Development of computer-based analytical tool for assessing physical protection system

    NASA Astrophysics Data System (ADS)

    Mardhi, Alim; Pengvanich, Phongphaeth

    2016-01-01

    Assessment of physical protection system effectiveness is the priority for ensuring optimum protection against unlawful acts targeting a nuclear facility, such as unauthorized removal of nuclear materials and sabotage of the facility itself. Since an assessment based on real exercise scenarios is costly and time-consuming, a computer-based analytical tool offers a practical way to evaluate likely threat scenarios. Several tools are available for immediate use, such as EASI and SAPE; however, for our research purposes it is more suitable to have a tool that can be customized and enhanced further. In this work, we have developed a computer-based analytical tool that uses a network methodology to model adversary paths. The inputs are the multiple security elements used to evaluate the effectiveness of the system's detection, delay, and response. The tool can identify the most critical path and quantify the probability of effectiveness of the system as a performance measure.
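
    For illustration, a minimal sketch of an EASI-style adversary-path calculation of the kind such a tool performs is shown below. The path elements, detection probabilities, delays, and response time are invented for the example and are not values from the paper:

        # Hypothetical adversary path elements: (name, P(detection), delay in s).
        PATHS = {
            "fence-door-vault": [("fence", 0.5, 10), ("door", 0.8, 60), ("vault", 0.9, 300)],
            "roof-hatch-vault": [("roof", 0.3, 30), ("hatch", 0.6, 90), ("vault", 0.9, 300)],
        }
        RESPONSE_TIME = 240  # seconds for the response force to arrive (assumed)

        def prob_interruption(path, response_time):
            """EASI-style estimate: the adversary is interrupted if the first
            detection occurs while enough delay remains for the response force."""
            p_undetected, p_interrupt = 1.0, 0.0
            for i, (_, p_detect, _) in enumerate(path):
                remaining_delay = sum(delay for _, _, delay in path[i:])
                if remaining_delay > response_time:   # response can beat the adversary
                    p_interrupt += p_undetected * p_detect
                p_undetected *= 1.0 - p_detect
            return p_interrupt

        scores = {name: prob_interruption(p, RESPONSE_TIME) for name, p in PATHS.items()}
        print(scores, "most critical path:", min(scores, key=scores.get))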

  4. Quantitative assessment of computational models for retinotopic map formation

    PubMed Central

    Sterratt, David C; Cutts, Catherine S; Willshaw, David J; Eglen, Stephen J

    2014-01-01

    Molecular and activity-based cues acting together are thought to guide retinal axons to their terminal sites in the vertebrate optic tectum or superior colliculus (SC) to form an ordered map of connections. The details of the mechanisms involved, and the degree to which they might interact, are still not well understood. We have developed a framework within which existing computational models can be assessed in an unbiased and quantitative manner against a set of experimental data curated from the mouse retinocollicular system. Our framework facilitates comparison between models, testing new models against known phenotypes and simulating new phenotypes in existing models. We have used this framework to assess four representative models that combine Eph/ephrin gradients and/or activity-based mechanisms and competition. Two of the models were updated from their original form to fit into our framework. The models were tested against five different phenotypes: wild type, Isl2-EphA3 ki/ki, Isl2-EphA3 ki/+, ephrin-A2,A3,A5 triple knock-out (TKO), and Math5 -/- (Atoh7). Two models successfully reproduced the extent of the Math5 -/- anteromedial projection, but only one of those could account for the collapse point in Isl2-EphA3 ki/+. The models needed a weak anteroposterior gradient in the SC to reproduce the residual order in the ephrin-A2,A3,A5 TKO phenotype, suggesting either an incomplete knock-out or the presence of another guidance molecule. Our article demonstrates the importance of testing retinotopic models against as full a range of phenotypes as possible, and we have made available the MATLAB software we wrote to facilitate this process. © 2014 Wiley Periodicals, Inc. Develop Neurobiol 75: 641-666, 2015 PMID:25367067

  5. Focused assessment with sonography for trauma (FAST) versus multidetector computed tomography in hemodynamically stable emergency patients.

    PubMed

    Fornell Pérez, R

    2017-02-10

    This critically appraised topic (CAT) study aims to evaluate the quality and extent of the scientific evidence that supports the use of focused assessment with sonography for trauma (FAST) versus multidetector computed tomography (MDCT) in hemodynamically stable trauma patients in the emergency room. An efficient search of the literature yielded several recent articles with a high level of evidence. The CAT study concludes that FAST is an acceptable initial imaging test in hemodynamically stable patients, although its performance is limited in certain circumstances. The decision whether to use MDCT should be determined by evaluating the patient's degree of instability and the distance to the MDCT scanner. Nevertheless, few articles address the question of the distance to MDCT scanners in emergency departments.

  6. Criticality safety evaluations - a "stalking horse" for integrated safety assessment

    SciTech Connect

    Williams, R.A.

    1995-12-31

    The Columbia Fuel Fabrication Facility of the Westinghouse Commercial Nuclear Fuel Division manufactures low-enriched uranium fuel and associated components for use in commercial pressurized water power reactors. To support development of a comprehensive integrated safety assessment (ISA) for the facility, as well as to address increasing U.S. Nuclear Regulatory Commission (NRC) expectations regarding such a facility's criticality safety assessments, a project is under way to complete criticality safety evaluations (CSEs) of all plant systems used in processing nuclear materials. Each CSE is made up of seven sections, prepared by a multidisciplinary team of process engineers, systems engineers, safety engineers, maintenance representatives, and operators. This paper provides a cursory outline of the type of information presented in a CSE.

  7. Scientific and social significance of assessing individual differences: "sinking shafts at a few critical points".

    PubMed

    Lubinski, D

    2000-01-01

    This chapter reviews empirical findings on the importance of assessing individual differences in human behavior. Traditional dimensions of human abilities, personality, and vocational interests play critical roles in structuring a variety of important behaviors and outcomes (e.g. achieved socioeconomic status, educational choices, work performance, delinquency, health risk behaviors, and income). In the review of their importance, the construct of general intelligence is featured, but attributes that routinely add incremental validity to cognitive assessments are also discussed. Recent experimental and methodological advances for better understanding how these dimensions may contribute to other psychological frameworks are reviewed, as are ways for determining their scientific significance within domains where they are not routinely assessed. Finally, some noteworthy models are outlined that highlight the importance of assessing relatively distinct classes of individual-differences attributes simultaneously. For understanding fully complex human phenomena such as crime, eminence, and educational-vocational development, such a multifaceted approach is likely to be the most productive.

  8. Development of an Automated Security Risk Assessment Methodology Tool for Critical Infrastructures.

    SciTech Connect

    Jaeger, Calvin Dell; Roehrig, Nathaniel S.; Torres, Teresa M.

    2008-12-01

    This document presents the security automated Risk Assessment Methodology (RAM) prototype tool developed by Sandia National Laboratories (SNL). This work leverages SNL's capabilities and skills in security risk analysis and the development of vulnerability assessment/risk assessment methodologies to develop an automated prototype security RAM tool for critical infrastructures (RAM-CITM). The prototype automated RAM tool provides a user-friendly, systematic, and comprehensive risk-based tool to assist CI sector and security professionals in assessing and managing security risk from malevolent threats. The current tool is structured on the basic RAM framework developed by SNL. It is envisioned that this prototype tool will be adapted to meet the requirements of different CI sectors and thereby provide additional capabilities.

  9. Critical assessment of the impurity diffusivities in solid and liquid silicon

    NASA Astrophysics Data System (ADS)

    Tang, Kai; Øvrelid, Eivind J.; Tranell, Gabriella; Tangstad, Merete

    2009-11-01

    The diffusion of impurities in solid and liquid silicon is critically reviewed and assessed in this paper. The activation energies and pre-exponential factors in the Arrhenius equation have been evaluated using least-squares analysis and semi-empirical correlations. Impurity diffusion coefficients for Ag, Al, As, Au, B, Bi, C, Co, Cr, Cu, Fe, Ga, In, Li, Mn, N, Ni, O, P, S, Sb, Te, Ti, and Zn in both solid and liquid silicon have been obtained. The assessed impurity diffusivities can be coupled with the assessed thermochemical properties for the simulation of diffusion phenomena in the production of solar-grade silicon feedstock. The assessed diffusivities have been applied to simulate impurity diffusion profiles and the denuded zone in intrinsic gettering annealing.
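
    As a sketch of the assessment procedure described (least-squares evaluation of Arrhenius parameters), the following fits ln D against 1/T to recover an activation energy and pre-exponential factor. The data points are synthetic placeholders, not the paper's assessed values:

        import numpy as np

        # Synthetic diffusivity data (m^2/s) at several temperatures (K).
        T = np.array([1100.0, 1200.0, 1300.0, 1400.0, 1500.0])
        D = np.array([2.1e-17, 4.0e-16, 4.8e-15, 3.9e-14, 2.3e-13])

        k_B = 8.617333262e-5  # Boltzmann constant, eV/K

        # Arrhenius form: D = D0 * exp(-Ea / (k_B * T)),
        # so ln D = ln D0 - (Ea / k_B) * (1 / T), a straight line in 1/T.
        slope, intercept = np.polyfit(1.0 / T, np.log(D), 1)
        Ea = -slope * k_B        # activation energy, eV
        D0 = np.exp(intercept)   # pre-exponential factor, m^2/s
        print(f"Ea = {Ea:.2f} eV, D0 = {D0:.3e} m^2/s")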

  10. Assessment team report on flight-critical systems research at NASA Langley Research Center

    NASA Technical Reports Server (NTRS)

    Siewiorek, Daniel P. (Compiler); Dunham, Janet R. (Compiler)

    1989-01-01

    The quality, coverage, and distribution of effort of the flight-critical systems research program at NASA Langley Research Center was assessed. Within the scope of the Assessment Team's review, the research program was found to be very sound. All tasks under the current research program were at least partially addressing the industry needs. General recommendations made were to expand the program resources to provide additional coverage of high priority industry needs, including operations and maintenance, and to focus the program on an actual hardware and software system that is under development.

  11. An Integrated Soft Computing Approach to Hughes Syndrome Risk Assessment.

    PubMed

    Vilhena, João; Rosário Martins, M; Vicente, Henrique; Grañeda, José M; Caldeira, Filomena; Gusmão, Rodrigo; Neves, João; Neves, José

    2017-03-01

    The antiphospholipid syndrome (APS) is an acquired autoimmune disorder induced by high levels of antiphospholipid antibodies, whose clinical manifestations are arterial and venous thrombosis as well as pregnancy-related complications and morbidity. This autoimmune hypercoagulable state, usually known as Hughes syndrome, has severe consequences for patients, being one of the main causes of thrombotic disorders and death. It is therefore important to be preventive, that is, to be aware of how probable it is for a patient to have this syndrome. Despite the updated antiphospholipid syndrome classification, the diagnosis remains difficult to establish. Additional research on clinically relevant antibodies and standardization of their quantification are required in order to improve antiphospholipid syndrome risk assessment. Thus, this work focuses on the development of a diagnosis decision support system in terms of a formal agenda built on a Logic Programming approach to knowledge representation and reasoning, complemented with a computational framework based on Artificial Neural Networks. The proposed model improves the diagnosis, properly classifying the patients who actually present this pathology (sensitivity higher than 85%) as well as classifying the absence of APS (specificity close to 95%).
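
    The paper's hybrid Logic Programming/neural model is not reproduced here; as a loose sketch of the neural component only, the following trains a small classifier on synthetic antibody-panel features and reports sensitivity and specificity. All feature names and numbers are placeholders:

        import numpy as np
        from sklearn.neural_network import MLPClassifier
        from sklearn.model_selection import train_test_split

        rng = np.random.default_rng(0)

        # Synthetic stand-in data: 3 hypothetical antibody titers per patient.
        n = 400
        X = rng.normal(size=(n, 3))
        y = (X[:, 0] + 0.8 * X[:, 1] + rng.normal(scale=0.5, size=n) > 0.7).astype(int)

        X_tr, X_te, y_tr, y_te = train_test_split(X, y, random_state=0)
        clf = MLPClassifier(hidden_layer_sizes=(8,), max_iter=2000, random_state=0)
        clf.fit(X_tr, y_tr)

        pred = clf.predict(X_te)
        tp = ((pred == 1) & (y_te == 1)).sum(); fn = ((pred == 0) & (y_te == 1)).sum()
        tn = ((pred == 0) & (y_te == 0)).sum(); fp = ((pred == 1) & (y_te == 0)).sum()
        print("sensitivity:", tp / (tp + fn), "specificity:", tn / (tn + fp))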

  12. Self-motion perception: assessment by computer-generated animations

    NASA Technical Reports Server (NTRS)

    Parker, D. E.; Harm, D. L.; Sandoz, G. R.; Skinner, N. C.

    1998-01-01

    The goal of this research is more precise description of adaptation to sensory rearrangements, including microgravity, by development of improved procedures for assessing spatial orientation perception. Thirty-six subjects reported perceived self-motion following exposure to complex inertial-visual motion. Twelve subjects were assigned to each of 3 perceptual reporting procedures: (a) animation movie selection, (b) written report selection and (c) verbal report generation. The question addressed was: do reports produced by these procedures differ with respect to complexity and reliability? Following repeated (within-day and across-day) exposures to 4 different "motion profiles," subjects either (a) selected movies presented on a laptop computer, or (b) selected written descriptions from a booklet, or (c) generated self-motion verbal descriptions that corresponded most closely with their motion experience. One "complexity" and 2 reliability "scores" were calculated. Contrary to expectations, reliability and complexity scores were essentially equivalent for the animation movie selection and written report selection procedures. Verbal report generation subjects exhibited less complexity than did subjects in the other conditions and their reports were often ambiguous. The results suggest that, when selecting from carefully written descriptions and following appropriate training, people may be better able to describe their self-motion experience with words than is usually believed.

  13. Multi-intelligence critical rating assessment of fusion techniques (MiCRAFT)

    NASA Astrophysics Data System (ADS)

    Blasch, Erik

    2015-06-01

    Assessment of multi-intelligence fusion techniques includes credibility of algorithm performance, quality of results against mission needs, and usability in a work-domain context. Situation awareness (SAW) brings together low-level information fusion (tracking and identification), high-level information fusion (threat and scenario-based assessment), and information fusion level 5 user refinement (physical, cognitive, and information tasks). To measure SAW, we discuss the SAGAT (Situational Awareness Global Assessment Technique) technique for a multi-intelligence fusion (MIF) system assessment that focuses on the advantages of MIF over single intelligence sources. Building on the NASA TLX (Task Load Index), SAGAT probes, SART (Situational Awareness Rating Technique) questionnaires, and CDM (Critical Decision Method) decision points, we highlight these tools for use in a Multi-Intelligence Critical Rating Assessment of Fusion Techniques (MiCRAFT). The focus is to measure user refinement of a situation over the information fusion quality of service (QoS) metrics: timeliness, accuracy, confidence, workload (cost), and attention (throughput). A key component of any user analysis includes correlation, association, and summarization of data, so we also seek measures of product quality and QuEST of information. Building a notion of product quality from multi-intelligence tools is typically subjective, and it needs to be aligned with objective machine metrics.

  14. Examining the Critical Thinking Dispositions and the Problem Solving Skills of Computer Engineering Students

    ERIC Educational Resources Information Center

    Özyurt, Özcan

    2015-01-01

    Problem solving is an indispensable part of engineering. Improving critical thinking dispositions for solving engineering problems is one of the objectives of engineering education. In this sense, knowing critical thinking and problem solving skills of engineering students is of importance for engineering education. This study aims to determine…

  15. Computation Modeling and Assessment of Nanocoatings for Ultra Supercritical Boilers

    SciTech Connect

    J. Shingledecker; D. Gandy; N. Cheruvu; R. Wei; K. Chan

    2011-06-21

    Forced outages and boiler unavailability at coal-fired fossil plants are most often caused by fire-side corrosion of boiler waterwalls and tubing. Reliable coatings are required for ultrasupercritical (USC) application to mitigate corrosion, since these boilers will operate at much higher temperatures and pressures than supercritical (565 C at 24 MPa) boilers. Computational modeling efforts have been undertaken to design and assess potential Fe-Cr-Ni-Al systems to produce stable nanocrystalline coatings that form a protective, continuous scale of either Al2O3 or Cr2O3. The computational modeling results identified a new series of Fe-25Cr-40Ni, with or without 10 wt.% Al, nanocrystalline coatings that maintain long-term stability by forming a diffusion-barrier layer at the coating/substrate interface. The computational modeling predictions of microstructure, formation of a continuous Al2O3 scale, inward Al diffusion, grain growth, and sintering behavior were validated with experimental results. Advanced coatings, such as MCrAl (where M is Fe, Ni, or Co) nanocrystalline coatings, have been processed using different magnetron sputtering deposition techniques. Several coating trials were performed; among the processing methods evaluated, the DC pulsed magnetron sputtering technique produced the best quality coating with a minimum number of shallow defects, and the results of multiple deposition trials showed that the process is repeatable. The cyclic oxidation test results revealed that the nanocrystalline coatings offer better oxidation resistance, in terms of weight loss, localized oxidation, and formation of mixed oxides in the Al2O3 scale, than widely used MCrAlY coatings. However, the ultra-fine grain structure in these coatings, consistent with the computational model predictions, resulted in accelerated Al

  16. Two-layer critical dimensions and overlay process window characterization and improvement in full-chip computational lithography

    NASA Astrophysics Data System (ADS)

    Sturtevant, John L.; Liubich, Vlad; Gupta, Rachit

    2016-04-01

    Edge placement error (EPE) was a term initially introduced to describe the difference between the predicted pattern contour edge and the design target for a single design layer. Strictly speaking, this quantity is not directly measurable in the fab. What is of vital importance are the relative edge placement errors between different design layers and, in the era of multipatterning, between the constituent mask sublayers for a single design layer. The critical dimensions (CD) and overlay between two layers can be measured in the fab, and there has always been a strong emphasis on control of overlay between design layers. The progress in this realm has been remarkable, accelerated in part at least by the proliferation of multipatterning, which reduces the available overlay budget by introducing a coupling of overlay and CD errors for the target layer. Computational lithography makes possible the full-chip assessment of two-layer edge-to-edge distances and two-layer contact overlap area. We investigate examples of via-metal model-based analysis of CD and overlay errors, covering both single patterning and double patterning. For single patterning, we show the advantage of contour-to-contour simulation over contour-to-target simulation, and how the addition of aberrations to the optical models can provide a more realistic CD-overlay process window (PW) for edge placement errors. For double patterning, the interaction of four-layer CD and overlay errors is very complex, but we illustrate that not only can full-chip verification identify potential two-layer hotspots, the optical proximity correction engine can also act to mitigate such hotspots and enlarge the joint CD-overlay PW.
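
    To make the CD-overlay coupling concrete, a toy geometry calculation is sketched below: the overlap area between a printed via and a metal line shrinks with both CD bias and overlay shift. The dimensions and function are assumed for illustration and are not the paper's verification flow:

        # Square via of drawn size `cd` printed with bias `dcd`, shifted by
        # overlay (dx, dy) relative to a metal line of width `w`. The metal
        # line is assumed infinitely long in x and centered at y = 0.
        def via_metal_overlap(cd, dcd, w, dx, dy):
            size = cd + dcd                      # printed via edge length
            half = size / 2.0
            y_lo = max(-half + dy, -w / 2.0)
            y_hi = min(half + dy, w / 2.0)
            return size * max(0.0, y_hi - y_lo)  # x extent * clipped y extent

        nominal = via_metal_overlap(50, 0, 60, 0, 0)     # nm, assumed numbers
        stressed = via_metal_overlap(50, -5, 60, 0, 12)  # CD loss plus overlay
        print(f"overlap area: {nominal:.0f} -> {stressed:.0f} nm^2")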

  17. Critical Thinking Traits of Top-Tier Experts and Implications for Computer Science Education

    DTIC Science & Technology

    2007-08-01

    A documented shortage of technical leadership and top-tier performers in computer science jeopardizes the technological edge, security, and economic ... are reevaluating the traditional academic standards they have used to predict success for their top-tier performers in computer science. Previous ... research in the computer science field has focused either on the programming skills of its experts or has attempted to predict the academic success of

  18. High Performance Computing Facility Operational Assessment, FY 2011 Oak Ridge Leadership Computing Facility

    SciTech Connect

    Baker, Ann E; Bland, Arthur S Buddy; Hack, James J; Barker, Ashley D; Boudwin, Kathlyn J.; Kendall, Ricky A; Messer, Bronson; Rogers, James H; Shipman, Galen M; Wells, Jack C; White, Julia C

    2011-08-01

    Oak Ridge National Laboratory's Leadership Computing Facility (OLCF) continues to deliver the most powerful resources in the U.S. for open science. At 2.33 petaflops peak performance, the Cray XT Jaguar delivered more than 1.5 billion core hours in calendar year (CY) 2010 to researchers around the world for computational simulations relevant to national and energy security; advancing the frontiers of knowledge in physical sciences and areas of biological, medical, environmental, and computer sciences; and providing world-class research facilities for the nation's science enterprise. Scientific achievements by OLCF users range from collaboration with university experimentalists to produce a working supercapacitor that uses atom-thick sheets of carbon materials to finely determining the resolution requirements for simulations of coal gasifiers and their components, thus laying the foundation for development of commercial-scale gasifiers. OLCF users are pushing the boundaries with software applications sustaining more than one petaflop of performance in the quest to illuminate the fundamental nature of electronic devices. Other teams of researchers are working to resolve predictive capabilities of climate models, to refine and validate genome sequencing, and to explore the most fundamental materials in nature - quarks and gluons - and their unique properties. Details of these scientific endeavors - not possible without access to leadership-class computing resources - are detailed in Section 4 of this report and in the INCITE in Review. Effective operations of the OLCF play a key role in the scientific missions and accomplishments of its users. This Operational Assessment Report (OAR) will delineate the policies, procedures, and innovations implemented by the OLCF to continue delivering a petaflop-scale resource for cutting-edge research. The 2010 operational assessment of the OLCF yielded recommendations that have been addressed (Reference Section 1) and where

  19. Combining destination diversion decisions and critical in-flight event diagnosis in computer aided testing of pilots

    NASA Technical Reports Server (NTRS)

    Rockwell, T. H.; Giffin, W. C.; Romer, D. J.

    1984-01-01

    Rockwell and Giffin (1982) and Giffin and Rockwell (1983) have discussed the use of computer aided testing (CAT) in the study of pilot response to critical in-flight events. The present investigation represents an extension of these earlier studies. In testing pilot responses to critical in-flight events, use is made of a Plato-touch CRT system operating on a menu-based format. In a typical diagnostic problem, the pilot is presented with symptoms within a flight scenario. In one problem, the pilot has four minutes to obtain the information needed to make a diagnosis of the problem. In the reported research, an attempt has been made to combine both diagnosis and diversion scenarios into a single computer-aided test. Tests with nine subjects were conducted. The obtained results and their significance are discussed.

  20. User Instructions for the Systems Assessment Capability, Rev. 1, Computer Codes Volume 3: Utility Codes

    SciTech Connect

    Eslinger, Paul W.; Aaberg, Rosanne L.; Lopresti, Charles A.; Miley, Terri B.; Nichols, William E.; Strenge, Dennis L.

    2004-09-14

    This document contains detailed user instructions for a suite of utility codes developed for Rev. 1 of the Systems Assessment Capability; these utility codes support the many functions performed by the Rev. 1 suite of Systems Assessment Capability computer codes.

  1. Using quality assessment tools to critically appraise ageing research: a guide for clinicians.

    PubMed

    Harrison, Jennifer Kirsty; Reid, James; Quinn, Terry J; Shenkin, Susan Deborah

    2016-12-07

    Evidence-based medicine tells us that we should not accept published research at face value. Even research from established teams published in the highest impact journals can have methodological flaws, biases, and limited generalisability. The critical appraisal of research studies can seem daunting, but tools are available to make the process easier for the non-specialist. Understanding the language and process of quality assessment is essential when considering or conducting research, and is also valuable for all clinicians who use published research to inform their clinical practice. We present a review written specifically for the practising geriatrician. This considers how quality is defined in relation to the methodological conduct and reporting of research. Having established why quality assessment is important, we present and critique tools which are available to standardise quality assessment. We consider five study designs: RCTs, non-randomised studies, observational studies, systematic reviews and diagnostic test accuracy studies. Quality assessment for each of these study designs is illustrated with an example of published cognitive research. The practical applications of the tools are highlighted, with guidance on their strengths and limitations. We signpost educational resources and offer specific advice for use of these tools. We hope that all geriatricians become comfortable with critical appraisal of published research and that use of the tools described in this review - along with awareness of their strengths and limitations - becomes a part of teaching, journal clubs and practice.

  2. Critical validity assessment of theoretical models: charge-exchange at intermediate and high energies

    NASA Astrophysics Data System (ADS)

    Belkić, Dževad

    1999-06-01

    Exact, comprehensive computations are carried out by means of four leading second-order approximations, yielding differential cross sections dQ/dΩ for the basic charge exchange process H+ + H(1s) → H(1s) + H+ at intermediate and high energies. The obtained extensive set of results is thoroughly tested against all the existing experimental data with the purpose of critically assessing the validity of the boundary-corrected second Born (CB2), continuum distorted wave (CDW), impulse approximation (IA), and reformulated impulse approximation (RIA) methods. The conclusion which emerges from this comparative study clearly indicates that the RIA agrees most favorably with the measurements available over a large energy range, 25 keV-5 MeV. Such a finding reaffirms the few-particle quantum scattering theory, which imposes several strict conditions on adequate second-order methods. These requirements, satisfied by the RIA, are: (i) normalisation of all the scattering wave functions, (ii) correct boundary conditions in both entrance and exit channels, (iii) introduction of a mathematically justified two-center continuum state for the sum of an attractive and a repulsive Coulomb potential with the same interaction strength, (iv) inclusion of the multiple scattering effects neglected in the IA, and (v) a proper description of the Thomas double scattering in good agreement with the experiments and without any unobserved peak splittings. Nevertheless, the performed comparative analysis of the above four approximations indicates that none of the methods is free from some basic shortcomings. Despite its success, the RIA remains essentially a high-energy model like the other three methods under study. More importantly, their perturbative character leaves virtually no room for further systematic improvements, since the neglected higher-order terms are prohibitively tedious for practical purposes and have never been computed exactly. To bridge this gap, we presently introduce the variational Padé

  3. Documentation of the Ecological Risk Assessment Computer Model ECORSK.5

    SciTech Connect

    Anthony F. Gallegos; Gilbert J. Gonzales

    1999-06-01

    The FORTRAN77 ecological risk computer model--ECORSK.5--has been used to estimate the potential toxicity of surficial deposits of radioactive and non-radioactive contaminants to several threatened and endangered (T and E) species at the Los Alamos National Laboratory (LANL). These analyses to date include preliminary toxicity estimates for the Mexican spotted owl, the American peregrine falcon, the bald eagle, and the southwestern willow flycatcher. This work has been performed as required for the Record of Decision for the construction of the Dual Axis Radiographic Hydrodynamic Test (DARHT) Facility at LANL as part of the Environmental Impact Statement. The model is dependent on the use of the geographic information system and associated software--ARC/INFO--and has been used in conjunction with LANL's Facility for Information Management and Display (FIMAD) contaminant database. The integration of FIMAD data and ARC/INFO using ECORSK.5 allows the generation of spatial information from a gridded area of potential exposure called an Ecological Exposure Unit. ECORSK.5 was used to simulate exposures using a modified Environmental Protection Agency Quotient Method. The model can handle a large number of contaminants within the home range of T and E species. This integration results in the production of hazard indices which, when compared to risk evaluation criteria, estimate the potential for impact from consumption of contaminants in food and ingestion of soil. The assessment is considered a Tier-2 type of analysis. This report summarizes and documents the ECORSK.5 code, the mathematical models used in the development of ECORSK.5, and the input and other requirements for its operation. Other auxiliary FORTRAN 77 codes used for processing and graphing output from ECORSK.5 are also discussed. The reader may refer to reports cited in the introduction to obtain greater detail on past applications of ECORSK.5 and assumptions used in deriving model parameters.
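
    As a miniature illustration of the modified EPA quotient method that ECORSK.5 implements at grid scale, the sketch below sums contaminant-specific hazard quotients into a hazard index for one cell of an ecological exposure unit. The contaminant names, toxicity reference values, and doses are invented placeholders, not values from the FIMAD database:

        # EPA-style quotient method in miniature: HQ = dose / TRV per
        # contaminant, summed to a hazard index (HI) for one grid cell.
        TRV = {"Pb": 1.6, "Hg": 0.03}   # toxicity reference values, mg/kg-bw/day

        def hazard_index(daily_dose):
            """daily_dose maps contaminant -> ingested dose in mg/kg-bw/day
            (food consumption plus incidental soil ingestion)."""
            return sum(dose / TRV[c] for c, dose in daily_dose.items())

        cell_dose = {"Pb": 0.4, "Hg": 0.012}
        hi = hazard_index(cell_dose)
        print(f"HI = {hi:.2f} ->",
              "potential impact" if hi > 1.0 else "below risk criterion")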

  4. Actuarial risk assessment models: a review of critical issues related to violence and sex-offender recidivism assessments.

    PubMed

    Sreenivasan, S; Kirkish, P; Garrick, T; Weinberger, L E; Phenix, A

    2000-01-01

    Risk assessment in the area of identification of violence has been dichotomized by several prominent researchers as the "clinical approach" versus the "actuarial method". The proponents of the actuarial approach argue for actuarially derived decisions to replace existing clinical practice. The actuarial method requires no clinical input, just a translation of the relevant material from the records to calculate the risk score. A risk appraisal approach based upon a sole actuarial method raises several questions: those of public safety, peer-accepted standards of practice, liability issues, and concordance with evidence-based medicine practice. We conclude that the sole actuarial approach fails to satisfy these critical issues.

  5. Quantum wavepacket ab initio molecular dynamics: an approach for computing dynamically averaged vibrational spectra including critical nuclear quantum effects.

    PubMed

    Sumner, Isaiah; Iyengar, Srinivasan S

    2007-10-18

    We have introduced a computational methodology to study vibrational spectroscopy in clusters inclusive of critical nuclear quantum effects. This approach is based on the recently developed quantum wavepacket ab initio molecular dynamics method, which combines quantum wavepacket dynamics with ab initio molecular dynamics. The computational efficiency of the dynamical procedure is drastically improved (by several orders of magnitude) through the utilization of wavelet-based techniques combined with the previously introduced time-dependent deterministic sampling procedure to achieve stable, picosecond-length, quantum-classical dynamics of electrons and nuclei in clusters. The dynamical information is employed to construct a novel cumulative flux/velocity correlation function, where the wavepacket flux from the quantized particle is combined with classical nuclear velocities to obtain the vibrational density of states. The approach is demonstrated by computing the vibrational density of states of [Cl-H-Cl]-, inclusive of critical quantum nuclear effects, and our results are in good agreement with experiment. A general hierarchical procedure is also provided, based on electronic structure harmonic frequencies, classical ab initio molecular dynamics, computation of nuclear quantum-mechanical eigenstates, and quantum wavepacket ab initio dynamics, to understand vibrational spectroscopy in hydrogen-bonded clusters that display large degrees of anharmonicity.
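
    For orientation, the classical velocity-autocorrelation route to a vibrational density of states is sketched below; the paper's cumulative flux/velocity correlation function generalizes this by combining the quantized particle's wavepacket flux with classical nuclear velocities, so the expression here is only the familiar limiting form:

        I(\omega) \propto \int_{-\infty}^{\infty} dt \, e^{-i\omega t}
            \sum_{j} m_j \left\langle \mathbf{v}_j(0) \cdot \mathbf{v}_j(t) \right\rangle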

  6. A Simple Widespread Computer Help Improves Nutrition Support Orders and Decreases Infection Complications in Critically Ill Patients

    PubMed Central

    Conseil, Mathieu; Carr, Julie; Molinari, Nicolas; Coisel, Yannaël; Cissé, Moussa; Belafia, Fouad; Delay, Jean-Marc; Jung, Boris; Jaber, Samir; Chanques, Gérald

    2013-01-01

    Aims To assess the impact of a simple computer-based decision-support system (computer help) on the quality of nutrition support orders and patient outcomes in the intensive care unit (ICU). Methods This quality-improvement study was carried out in a 16-bed medical-surgical ICU in a French university hospital. All consecutive patients who stayed in the ICU more than 10 days with non-oral feeding for more than 5 days were retrospectively included during two 12-month periods. Prescriptions of nutrition support were collected and compared to French national guidelines as a quality-improvement process. A computer help was constructed using a simple Excel sheet (Microsoft) to guide physicians' prescriptions according to the guidelines. This computer help was displayed on computers previously used for medical orders. Physicians were informed, but no systematic protocol was implemented. Patients included during the first (control group) and second period (computer help group) were compared for achievement of nutrition goals and ICU outcomes. Results The control and computer help groups included 71 and 95 patients, respectively. Patients' characteristics were not significantly different between groups. In the computer help group, prescriptions achieved 80% of nutrition goals significantly more often for calorie (45% vs. 79%, p<0.001) and nitrogen intake (3% vs. 37%, p<0.001). The incidence of nosocomial infections decreased significantly between the two groups (59% vs. 41%, p = 0.03). Mortality did not differ significantly between the control (21%) and computer help groups (15%, p = 0.30). Conclusions Use of a widespread, inexpensive computer help is associated with significant improvements in nutrition support orders and decreased nosocomial infections in ICU patients. This computer help is provided as an electronic supplement. PMID:23737948
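
    The study's actual Excel sheet lives in the paper's supplement; a minimal sketch of the kind of arithmetic such a decision aid performs is given below, assuming common rule-of-thumb targets (25 kcal/kg/day and 0.2 g nitrogen/kg/day) rather than the study's actual protocol:

        # Checks a nutrition order against assumed per-kg daily targets and
        # flags whether it reaches the 80%-of-goal threshold used above.
        def nutrition_check(weight_kg, kcal_ordered, nitrogen_g_ordered):
            kcal_goal = 25.0 * weight_kg       # assumed target, kcal/kg/day
            nitrogen_goal = 0.2 * weight_kg    # assumed target, g N/kg/day
            return {
                "kcal_pct": 100.0 * kcal_ordered / kcal_goal,
                "nitrogen_pct": 100.0 * nitrogen_g_ordered / nitrogen_goal,
                "meets_80pct": kcal_ordered >= 0.8 * kcal_goal
                               and nitrogen_g_ordered >= 0.8 * nitrogen_goal,
            }

        print(nutrition_check(weight_kg=70, kcal_ordered=1500, nitrogen_g_ordered=12))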

  7. Assessment of Computer Self-Efficacy: Instrument Development and Validation.

    ERIC Educational Resources Information Center

    Murphy, Christine A.; And Others

    A 32-item Computer Self-Efficacy Scale (CSE) was developed to measure perceptions of capability regarding specific computer-related knowledge and skills. Bandura's theory of self-efficacy (1986) and Schunk's model of classroom learning (1985) guided the development of the CSE. Each of the skill-related items is preceded by the phrase "I feel…

  8. Assessing Social Presence in Asynchronous Text-based Computer Conferencing.

    ERIC Educational Resources Information Center

    Rourke, Liam; Anderson, Terry; Garrison, D. Randy; Archer, Walter

    1999-01-01

    Discusses computer conferencing in higher education, presents a community of inquiry model that includes benefits of computer conferencing, and discusses social presence, defined as the ability of learners to project themselves socially and affectively into a community of inquiry. Topics include teacher immediacy, coding, and content analysis of…

  9. Assessment of Competencies for Computer Information Systems Curricula.

    ERIC Educational Resources Information Center

    Womble, Myra N.

    1993-01-01

    In a survey of 80 managerial and 130 entry-level computer professionals, most entry workers believed they possessed competencies identified in Association for Computing Machinery (ACM) curricula; most managers did not agree. Most managers rated 28% of ACM competencies moderately to not important; 63% were so rated by entry workers. (SK)

  10. Assessment of Examinations in Computer Science Doctoral Education

    ERIC Educational Resources Information Center

    Straub, Jeremy

    2014-01-01

    This article surveys the examination requirements for attaining degree candidate (candidacy) status in computer science doctoral programs at all of the computer science doctoral granting institutions in the United States. It presents a framework for program examination requirement categorization, and categorizes these programs by the type or types…

  11. Effective Instruction and Assessment Methods That Lead to Gains in Critical Thinking as Measured by the Critical Thinking Assessment Test (CAT)

    ERIC Educational Resources Information Center

    Leming, Katie P.

    2016-01-01

    Previous qualitative research on educational practices designed to improve critical thinking has relied on anecdotal or student self-reports of gains in critical thinking. Unfortunately, student self-report data have been found to be unreliable proxies for measuring critical thinking gains. Therefore, in the current interpretivist study, five…

  12. How students measure up: An assessment instrument for introductory computer science

    NASA Astrophysics Data System (ADS)

    Decker, Adrienne

    This dissertation presents an assessment instrument specifically designed for programming-first introductory sequences in computer science as given in Computing Curricula 2001: Computer Science Volume. The first-year computer science course has been the focus of many recent innovations and many recent debates in the computer science curriculum. There is significant disagreement as to effective methodology in the first year of computing, and there has been no shortage of ideas as to what predicts student success in the first year of the computing curriculum. However, most investigations into predictors of success lack an appropriately validated assessment instrument to support or refute their findings. This is presumably due to the fact that there are very few validated assessment instruments available for assessing student performance in the first year of computing instruction. The instrument presented here is not designed to test particular language constructs, but rather the underlying principles of the first year of computing instruction. It has been administered to students at the end of their first year of an introductory computer science curriculum. Data needed for analysis of the instrument for reliability and validity were collected and analyzed. Use of this instrument enables validated assessment of student progress at the end of the first year, and also enables the study of further innovations in the curriculum for first-year computer science courses.
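
    Validation of such an instrument typically includes an internal-consistency reliability estimate. A minimal Cronbach's alpha computation on synthetic item data is sketched below; it is a standard analysis, not necessarily the dissertation's exact procedure:

        import numpy as np

        def cronbach_alpha(scores):
            """scores: 2-D array, rows = examinees, columns = items (0/1 or scaled)."""
            scores = np.asarray(scores, dtype=float)
            k = scores.shape[1]
            item_vars = scores.var(axis=0, ddof=1).sum()
            total_var = scores.sum(axis=1).var(ddof=1)
            return (k / (k - 1.0)) * (1.0 - item_vars / total_var)

        rng = np.random.default_rng(1)
        ability = rng.normal(size=200)                    # latent examinee ability
        items = (ability[:, None] + rng.normal(size=(200, 10)) > 0).astype(int)
        print(f"alpha = {cronbach_alpha(items):.2f}")     # synthetic item responses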

  13. New Guidelines for Assessment of Malnutrition in Adults: Obese Critically Ill Patients.

    PubMed

    Mauldin, Kasuen; O'Leary-Kelley, Colleen

    2015-08-01

    Recently released recommendations for detection and documentation of malnutrition in adults in clinical practice define 3 types of malnutrition: starvation related, acute disease or injury related, and chronic disease related. The first 2 are more easily recognized, but the third may be more often unnoticed, particularly in obese patients. Critical care patients tend to be at high risk for malnutrition and thus require a thorough nutritional assessment. Compared with patients of earlier times, intensive care unit patients today tend to be older, have more complex medical and comorbid conditions, and often are obese. Missed or delayed detection of malnutrition in these patients may contribute to increases in hospital morbidity and longer hospital stays. Critical care nurses are in a prime position to screen patients at risk for malnutrition and to work with members of the interprofessional team in implementing nutritional intervention plans.

  14. Implementing and assessing computational modeling in introductory mechanics

    NASA Astrophysics Data System (ADS)

    Caballero, Marcos D.; Kohlmyer, Matthew A.; Schatz, Michael F.

    2012-12-01

    Students taking introductory physics are rarely exposed to computational modeling. In a one-semester large lecture introductory calculus-based mechanics course at Georgia Tech, students learned to solve physics problems using the VPython programming environment. During the term, 1357 students in this course solved a suite of 14 computational modeling homework questions delivered using an online commercial course management system. Their proficiency with computational modeling was evaluated with a proctored assignment involving a novel central force problem. The majority of students (60.4%) successfully completed the evaluation. Analysis of erroneous student-submitted programs indicated that a small set of student errors explained why most programs failed. We discuss the design and implementation of the computational modeling homework and evaluation, the results from the evaluation, and the implications for computational instruction in introductory science, technology, engineering, and mathematics (STEM) courses.
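
    For flavor, a minimal central-force integrator of the kind students wrote in VPython is sketched below, in plain Python so it runs anywhere; the constants and loop structure are assumed for illustration and are not the course's actual evaluation problem:

        # Euler-Cromer update for motion under Newtonian gravity (SI units).
        G, M, m = 6.67e-11, 6.0e24, 1000.0   # gravitational constant, masses
        r = [7.0e6, 0.0]                     # position (m)
        v = [0.0, 8000.0]                    # velocity (m/s)
        dt = 1.0                             # time step (s)

        for _ in range(6000):
            dist = (r[0]**2 + r[1]**2) ** 0.5
            # Central force: F = -G M m r_vec / |r|^3
            F = [-G * M * m * r[i] / dist**3 for i in range(2)]
            v = [v[i] + F[i] / m * dt for i in range(2)]   # update velocity first
            r = [r[i] + v[i] * dt for i in range(2)]       # then position
        print("final position:", r)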

  15. A critical evaluation of the predictions of the NASA-Lockheed multielement airfoil computer program

    NASA Technical Reports Server (NTRS)

    Brune, G. W.; Manke, J. W.

    1978-01-01

    Theoretical predictions of several versions of the multielement airfoil computer program are evaluated. The computed results are compared with experimental high lift data of general aviation airfoils with a single trailing edge flap, and of airfoils with a leading edge flap and double slotted trailing edge flaps. Theoretical and experimental data include lift, pitching moment, profile drag and surface pressure distributions, boundary layer integral parameters, skin friction coefficients, and velocity profiles.

  16. Pedagogical Utilization and Assessment of the Statistic Online Computational Resource in Introductory Probability and Statistics Courses

    PubMed Central

    Dinov, Ivo D.; Sanchez, Juana; Christou, Nicolas

    2009-01-01

    Technology-based instruction represents a recent pedagogical paradigm that is rooted in the realization that new generations are much more comfortable with, and excited about, new technologies. The rapid technological advancement over the past decade has fueled an enormous demand for the integration of modern networking, informational and computational tools with classical pedagogical instruments. Consequently, teaching with technology typically involves utilizing a variety of IT and multimedia resources for online learning, course management, electronic course materials, and novel tools of communication, engagement, experimentation, critical thinking and assessment. The NSF-funded Statistics Online Computational Resource (SOCR) provides a number of interactive tools for enhancing instruction in various undergraduate and graduate courses in probability and statistics. These resources include online instructional materials, statistical calculators, interactive graphical user interfaces, computational and simulation applets, and tools for data analysis and visualization. The tools provided as part of SOCR include conceptual simulations and statistical computing interfaces, which are designed to bridge between the introductory and the more advanced computational and applied probability and statistics courses. In this manuscript, we describe our designs for utilizing SOCR technology in instruction in a recent study. In addition, we present the results of the effectiveness of using SOCR tools at two different course intensity levels on three outcome measures: exam scores, student satisfaction and choice of technology to complete assignments. Learning styles assessment was completed at baseline. We have used three very different designs for three different undergraduate classes. Each course included a treatment group, using the SOCR resources, and a control group, using classical instruction techniques. Our findings include marginal effects of the SOCR treatment per individual
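
    The study's statistical analysis is not reproduced here; as a sketch of the basic treatment/control comparison on an outcome measure such as exam scores, one might run a two-sample test on synthetic data:

        import numpy as np
        from scipy import stats

        rng = np.random.default_rng(2)
        # Synthetic exam scores for a SOCR-style treatment/control comparison;
        # the real study data are not reproduced here.
        control = rng.normal(72, 10, size=60)
        treatment = rng.normal(76, 10, size=60)

        t, p = stats.ttest_ind(treatment, control, equal_var=False)  # Welch's t-test
        print(f"t = {t:.2f}, p = {p:.3f}")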

  17. Implementing Computer Algebra Enabled Questions for the Assessment and Learning of Mathematics

    ERIC Educational Resources Information Center

    Sangwin, Christopher J.; Naismith, Laura

    2008-01-01

    We present principles for the design of an online system to support computer algebra enabled questions for use within the teaching and learning of mathematics in higher education. The introduction of a computer algebra system (CAS) into a computer aided assessment (CAA) system affords sophisticated response processing of student provided answers.…

  18. Developing Computer Model-Based Assessment of Chemical Reasoning: A Feasibility Study

    ERIC Educational Resources Information Center

    Liu, Xiufeng; Waight, Noemi; Gregorius, Roberto; Smith, Erica; Park, Mihwa

    2012-01-01

    This paper reports a feasibility study on developing computer model-based assessments of chemical reasoning at the high school level. Computer models are flash and NetLogo environments to make simultaneously available three domains in chemistry: macroscopic, submicroscopic, and symbolic. Students interact with computer models to answer assessment…

  19. Use of Bioelectrical Impedance Analysis for the Assessment of Nutritional Status in Critically Ill Patients

    PubMed Central

    Lee, Yoojin; Kwon, Oran; Shin, Cheung Soo

    2015-01-01

    Malnutrition is common in the critically ill patients and known to cause a variety of negative clinical outcomes. However, various conventional methods for nutrition assessment have several limitations. We hypothesized that body composition data, as measured using bioelectrical impedance analysis (BIA), may have a significant role in evaluating nutritional status and predicting clinical outcomes in critically ill patients. We gathered clinical, biochemical, and BIA data from 66 critically ill patients admitted to an intensive care unit. Patients were divided into three nutritional status groups according to their serum albumin level and total lymphocyte counts. The BIA results, conventional indicators of nutrition status, and clinical outcomes were compared and analyzed retrospectively. Results showed that the BIA indices including phase angle (PhA), extracellular water (ECW), and ECW/total body water (TBW) were significantly associated with the severity of nutritional status. Particularly, PhA, an indicator of the health of the cell membrane, was higher in the well-nourished patient group, whereas the edema index (ECW/TBW) was higher in the severely malnourished patient group. PhA was positively associated with albumin and ECW/TBW was negatively associated with serum albumin, hemoglobin, and duration of mechanical ventilation. In non-survivors, PhA was significantly lower and both ECW/TBW and %TBW/fat free mass were higher than in survivors. In conclusion, several BIA indexes including PhA and ECW/TBW may be useful for nutritional assessment and represent significant prognostic factors in the care of critically ill patients. PMID:25713790
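
    Two of the BIA indices discussed have simple standard definitions: the phase angle is the arctangent of reactance over resistance, and the edema index is the ratio of extracellular water to total body water. A small sketch with illustrative values follows; the device-specific measurement frequencies and regression equations used in the study are not reproduced:

        import math

        def phase_angle_deg(resistance_ohm, reactance_ohm):
            """Standard BIA phase angle: arctan(Xc / R), expressed in degrees."""
            return math.degrees(math.atan(reactance_ohm / resistance_ohm))

        def edema_index(ecw_l, icw_l):
            """ECW/TBW ratio from extra- and intracellular water volumes (liters)."""
            return ecw_l / (ecw_l + icw_l)

        # Illustrative values only
        print(f"PhA = {phase_angle_deg(500, 55):.1f} deg")
        print(f"ECW/TBW = {edema_index(15.0, 23.0):.3f}")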

  20. Computer-Assisted Assessment in Higher Education. Staff and Educational Development Series.

    ERIC Educational Resources Information Center

    Brown, Sally, Ed.; Race, Phil, Ed.; Bull, Joanna, Ed.

    This book profiles how computer-assisted assessment can help both staff and students by drawing on the experience and expertise of practitioners, in the United Kingdom and internationally, who are already using computer-assisted assessment. The publication is organized into three main sections--"Pragmatics and Practicalities of CAA," "Using CAA for…

  1. Effects of Computer versus Paper Administration of an Adult Functional Writing Assessment

    ERIC Educational Resources Information Center

    Chen, Jing; White, Sheida; McCloskey, Michael; Soroui, Jaleh; Chun, Young

    2011-01-01

    This study investigated the comparability of paper and computer versions of a functional writing assessment administered to adults 16 and older. Three writing tasks were administered in both paper and computer modes to volunteers in the field test of an assessment of adult literacy in 2008. One set of analyses examined mode effects on scoring by…

  2. Critical elements for human health risk assessment of less than lifetime exposures.

    PubMed

    Geraets, Liesbeth; Nijkamp, Monique M; Ter Burg, Wouter

    2016-11-01

    Less than lifetime exposure confronts risk assessors with the question of how to interpret the risks for human health when a chronic health-based limit is exceeded. Intermittent, fluctuating and peak exposures do not match the basis of the chronic limit values, possibly leading to conservative outcomes. This paper presents guidance on how to deal with human risk assessment of less than lifetime exposure. Important steps to be considered are: characterization of the human exposure situation; evaluation of whether the human less than lifetime exposure scenario corresponds to a non-chronic internal exposure (toxicokinetic and toxicodynamic considerations); and, finally, re-evaluation of the risk assessment. Critical elements for these steps are the mode of action, Haber's rule, and toxicokinetics (ADME), amongst others. Previous work for the endpoints non-genotoxic carcinogenicity and developmental toxicity is included in the guidance. The guidance provides a way to consider the critical elements, without setting default factors to correct for the less than lifetime exposure in risk assessment.
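
    Haber's rule, named as a critical element above, states that toxic effect tracks the product of concentration and time (C × t = k), often generalized to C^n × t = k. A minimal sketch of how a chronic limit might be rescaled for a shorter exposure under that rule; this illustrates the rule itself, not the guidance's prescribed procedure, and all values are hypothetical:

      def adjusted_limit(chronic_limit: float, chronic_hours: float,
                         actual_hours: float, n: float = 1.0) -> float:
          """Generalized Haber's rule C**n * t = k: rescale a concentration
          limit for a shorter-than-chronic exposure duration."""
          k = chronic_limit ** n * chronic_hours
          return (k / actual_hours) ** (1.0 / n)

      # Hypothetical example: 8-hour chronic limit of 10 mg/m3, 2-hour exposure.
      print(adjusted_limit(10.0, 8.0, 2.0))       # 40.0 with n = 1
      print(adjusted_limit(10.0, 8.0, 2.0, n=2))  # 20.0 with n = 2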

  3. Embedded assessment algorithms within home-based cognitive computer game exercises for elders.

    PubMed

    Jimison, Holly; Pavel, Misha

    2006-01-01

    With the recent consumer interest in computer-based activities designed to improve cognitive performance, there is a growing need for scientific assessment algorithms to validate the potential contributions of cognitive exercises. In this paper, we present a novel methodology for incorporating dynamic cognitive assessment algorithms within computer games designed to enhance cognitive performance. We describe how this approach works for a variety of computer applications and describe cognitive monitoring results for one of the computer game exercises. The real-time cognitive assessments also provide a control signal for adapting the difficulty of the game exercises and providing tailored help for elders of varying abilities.
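
    The abstract does not give the control algorithm; one simple way such a control signal can drive difficulty adaptation is a staircase rule that holds the player's recent success rate near a target. A sketch of that idea, as an assumption for illustration rather than the authors' method:

      def update_difficulty(level: int, recent_successes: list[bool],
                            target_rate: float = 0.75, window: int = 10) -> int:
          """Simple staircase controller: raise difficulty when the player's
          recent success rate exceeds the target band, lower it when below."""
          if len(recent_successes) < window:
              return level  # not enough evidence yet
          rate = sum(recent_successes[-window:]) / window
          if rate > target_rate + 0.1:
              return level + 1
          if rate < target_rate - 0.1:
              return max(1, level - 1)
          return level

      history = [True] * 9 + [False]      # 90% recent success (hypothetical)
      print(update_difficulty(3, history))  # 4: increase the challenge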

  4. Evaluation of critical materials for five advanced design photovoltaic cells with an assessment of indium and gallium

    SciTech Connect

    Watts, R.L.; Gurwell, W.E.; Jamieson, W.M.; Long, L.W.; Pawlewicz, W.T.; Smith, S.A.; Teeter, R.R.

    1980-05-01

    The objective of this study is to identify potential material supply constraints due to the large-scale deployment of five advanced photovoltaic (PV) cell designs, and to suggest strategies to reduce the impacts of these production capacity limitations and potential future material shortages. This report presents the results of the screening of the five following advanced PV cell designs: polycrystalline silicon, amorphous silicon, cadmium sulfide/copper sulfide frontwall, polycrystalline gallium arsenide MIS, and advanced concentrator-500X. Each of these five cells is screened individually assuming that it first comes online in 1991 and that 25 GWe of peak capacity is online by the year 2000. A second computer screening assumes that each cell first comes online in 1991 and that each cell has 5 GWe of peak capacity by the year 2000, so that the total online capacity for the five cells is 25 GWe. Based on a review of the preliminary baseline screening results, suggestions were made for varying such parameters as the layer thickness, cell production processes, etc. The resulting PV cell characterizations were then screened again by the CMAP computer code. Earlier DOE-sponsored work on the assessment of critical materials in PV cells conclusively identified indium and gallium as warranting further investigation as to their availability. Therefore, this report includes a discussion of the future availability of gallium and indium. (WHK)

  5. Development of a structural health monitoring system for the life assessment of critical transportation infrastructure.

    SciTech Connect

    Roach, Dennis Patrick; Jauregui, David Villegas; Daumueller, Andrew Nicholas

    2012-02-01

    Recent structural failures such as the I-35W Mississippi River Bridge in Minnesota have underscored the urgent need for improved methods and procedures for evaluating our aging transportation infrastructure. This research seeks to develop a basis for a Structural Health Monitoring (SHM) system that provides quantitative information related to the structural integrity of metallic structures, in order to support appropriate management decisions and ensure public safety. This research employs advanced structural analysis and nondestructive testing (NDT) methods for an accurate fatigue analysis. Metal railroad bridges in New Mexico will be the focus since many of these structures are over 100 years old and classified as fracture-critical. The term fracture-critical indicates that failure of a single component may result in complete collapse of the structure, such as that experienced by the I-35W Bridge. Failure may originate from sources such as loss of section due to corrosion or cracking caused by fatigue loading. Because standard inspection practice is primarily visual, these types of defects can go undetected due to oversight, lack of access to critical areas, or, in riveted members, hidden defects that are beneath fasteners or connection angles. Another issue is that it is difficult to determine the fatigue damage that a structure has experienced and the rate at which damage is accumulating, due to uncertain history and load distribution in supporting members. A SHM system has several advantages that can overcome these limitations. SHM allows critical areas of the structure to be monitored more quantitatively under actual loading. The research needed to apply SHM to metallic structures was performed and a case study was carried out to show the potential of SHM-driven fatigue evaluation to assess the condition of critical transportation infrastructure and to guide inspectors to potential problem areas. This project combines the expertise in transportation infrastructure at New…
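
    For the fatigue-damage accumulation mentioned here, SHM-driven evaluations commonly combine measured stress-cycle counts with an S-N curve through the Palmgren-Miner rule. A sketch under those standard assumptions; the S-N constants and cycle counts below are hypothetical, not the project's calibrated values:

      def cycles_to_failure(stress_range_mpa: float,
                            c: float = 1e12, m: float = 3.0) -> float:
          """S-N curve N = C / S**m (detail-category constants hypothetical)."""
          return c / stress_range_mpa ** m

      def miner_damage(measured_cycles: dict[float, int]) -> float:
          """Palmgren-Miner rule: D = sum(n_i / N_i); D >= 1 implies failure."""
          return sum(n / cycles_to_failure(s) for s, n in measured_cycles.items())

      # Hypothetical cycle counts from monitored strain data {stress range: count}.
      spectrum = {40.0: 2_000_000, 80.0: 150_000, 120.0: 5_000}
      print(f"Accumulated damage index: {miner_damage(spectrum):.3f}")  # ~0.213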

  6. Patients’ Expectations Regarding Medical Treatment: A Critical Review of Concepts and Their Assessment

    PubMed Central

    Laferton, Johannes A. C.; Kube, Tobias; Salzmann, Stefan; Auer, Charlotte J.; Shedden-Mora, Meike C.

    2017-01-01

    Patients’ expectations in the context of medical treatment represent a growing area of research, with accumulating evidence suggesting their influence on health outcomes across a variety of medical conditions. However, the aggregation of evidence is complicated due to an inconsistent and disintegrated application of expectation constructs and the heterogeneity of assessment strategies. Therefore, based on current expectation concepts, this critical review provides an integrated model of patients’ expectations in medical treatment. Moreover, we review existing assessment tools in the context of the integrative model of expectations and provide recommendations for improving future assessment. The integrative model includes expectations regarding treatment and patients’ treatment-related behavior. Treatment and behavior outcome expectations can relate to aspects regarding benefits and side effects and can refer to internal (e.g., symptoms) and external outcomes (e.g., reactions of others). Furthermore, timeline, structural and process expectations are important aspects with respect to medical treatment. Additionally, generalized expectations such as generalized self-efficacy or optimism have to be considered. Several instruments assessing different aspects of expectations in medical treatment can be found in the literature. However, many were developed without conceptual standardization and psychometric evaluation. Moreover, they merely assess single aspects of expectations, thus impeding the integration of evidence regarding the differential aspects of expectations. As many instruments assess treatment-specific expectations, they are not comparable between different conditions. To generate a more comprehensive understanding of expectation effects in medical treatments, we recommend that future research should apply standardized, psychometrically evaluated measures, assessing multidimensional aspects of patients’ expectations that are applicable across various…

  7. Profiling of energy deposition fields in a modular HTHR with annular core: Computational/experimental studies at the ASTRA critical facility

    SciTech Connect

    Boyarinov, V. F.; Garin, V. P.; Glushkov, E. S.; Zimin, A. A.; Kompaniets, G. V.; Nevinitsa, V. A.; Polyakov, D. N.; Ponomarev, A. S.; Ponomarev-Stepnoi, N. N.; Smirnov, O. N.; Fomichenko, P. A.; Chunyaev, E. I.; Marova, E. V.; Sukharev, Yu. P.

    2010-12-15

    The paper presents the results obtained from the computational/experimental studies of the spatial distribution of the ²³⁵U fission reaction rate in a critical assembly with an annular core and poison profiling elements inserted into the inner graphite reflector. The computational analysis was carried out with the codes intended for design computation of an HTHR-type reactor.

  8. Volcanic hazards at distant critical infrastructure: A method for bespoke, multi-disciplinary assessment

    NASA Astrophysics Data System (ADS)

    Odbert, H. M.; Aspinall, W.; Phillips, J.; Jenkins, S.; Wilson, T. M.; Scourse, E.; Sheldrake, T.; Tucker, P.; Nakeshree, K.; Bernardara, P.; Fish, K.

    2015-12-01

    Societies rely on critical services such as power, water, transport networks and manufacturing. Infrastructure may be sited to minimise exposure to natural hazards, but not all hazards can be avoided. The probability of long-range transport of a volcanic plume to a site is comparable to that of other external hazards that must be considered to satisfy safety assessments. Recent advances in numerical models of plume dispersion and stochastic modelling provide a formalized and transparent approach to probabilistic assessment of hazard distribution. To understand the risks to critical infrastructure far from volcanic sources, it is necessary to quantify their vulnerability to different hazard stressors. However, infrastructure assets (e.g. power plants and operational facilities) are typically complex systems in themselves, with interdependent components that may differ in susceptibility to hazard impact. Usually, such complexity means that risk either cannot be estimated formally or that unsatisfactory simplifying assumptions are prerequisite to building a tractable risk model. We present a new approach to quantifying risk by bridging the expertise of physical hazard modellers and infrastructure engineers. We use a joint expert judgment approach to determine hazard model inputs and constrain associated uncertainties. Model outputs are chosen on the basis of engineering or operational concerns. The procedure facilitates an interface between physical scientists, with expertise in volcanic hazards, and infrastructure engineers, with insight into vulnerability to hazards. The result is a joined-up approach to estimating risk from low-probability hazards to critical infrastructure. We describe our methodology and show preliminary results for vulnerability to volcanic hazards at a typical UK industrial facility. We discuss our findings in the context of developing bespoke assessment of hazards from distant sources in collaboration with key infrastructure stakeholders.

  9. Approaches for assessment of vulnerability of critical infrastructures to weather-related hazards

    NASA Astrophysics Data System (ADS)

    Eidsvig, Unni; Uzielli, Marco; Vidar Vangelsten, Bjørn

    2016-04-01

    Critical infrastructures are essential components for the modern society to maintain its function, and malfunctioning of one of the critical infrastructure systems may have far-reaching consequences. Climate change may lead to an increase in the frequency and intensity of weather-related hazards, creating challenges for the infrastructures. This paper outlines approaches to assess the vulnerability posed by weather-related hazards to infrastructures. The approaches assess factors that affect the probability of a malfunctioning of the infrastructure should a weather-related threat occur, as well as factors that affect the societal consequences of the infrastructure malfunctioning. Even if vulnerability factors are normally very infrastructure-specific and hazard-dependent, generic factors can be defined and analyzed. For the vulnerability and resilience of the infrastructure, such factors include e.g. robustness, buffer capacity, protection, quality, age, adaptability and transparency. For the vulnerability of the society in relation to the infrastructure, such factors include e.g. redundancy, substitutes and cascading effects. A semi-quantitative, indicator-based approach is proposed, providing schemes for ranking the most important vulnerability indicators relevant for weather-related hazards on a relative scale. The application of the indicators in a semi-quantitative risk assessment is also demonstrated. In addition, a quantitative vulnerability model is proposed in terms of vulnerability (representing degree of loss) as a function of intensity, which is adaptable to different types of degree of loss (e.g. fraction of infrastructure users that lose their service, fraction of repair costs to full reconstruction costs). The vulnerability model can be calibrated with empirical data using deterministic calibration or a variety of probabilistic calibration approaches to account for the uncertainties within the model. The research leading to these results has received funding…
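
    As one concrete reading of the quantitative model described, vulnerability (degree of loss) can be expressed as a parametric curve over hazard intensity and calibrated to empirical data. A sketch assuming a logistic form with hypothetical parameters; the paper does not fix a specific functional form here:

      import math

      def vulnerability(intensity: float, midpoint: float = 25.0,
                        steepness: float = 0.2) -> float:
          """Degree of loss in [0, 1] as a logistic function of hazard intensity
          (e.g., wind speed in m/s); parameters would be fitted to loss data."""
          return 1.0 / (1.0 + math.exp(-steepness * (intensity - midpoint)))

      for wind in (10, 25, 40):  # hypothetical wind speeds, m/s
          print(wind, round(vulnerability(wind), 2))  # 0.05, 0.50, 0.95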

  10. Identifying Reading Problems with Computer-Adaptive Assessments

    ERIC Educational Resources Information Center

    Merrell, C.; Tymms, P.

    2007-01-01

    This paper describes the development of an adaptive assessment called Interactive Computerised Assessment System (InCAS) that is aimed at children of a wide age and ability range to identify specific reading problems. Rasch measurement has been used to create the equal interval scales that form each part of the assessment. The rationale for the…
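
    Rasch measurement, used here to build equal-interval scales, models the probability of a correct response as a logistic function of the difference between person ability and item difficulty, both expressed in logits. A minimal sketch of the standard dichotomous Rasch model, with illustrative values:

      import math

      def p_correct(ability: float, difficulty: float) -> float:
          """Dichotomous Rasch model: P = exp(b - d) / (1 + exp(b - d))."""
          return 1.0 / (1.0 + math.exp(-(ability - difficulty)))

      # A child of ability 1.0 logits facing items of varying difficulty.
      for d in (-1.0, 1.0, 2.5):
          print(f"difficulty {d:+.1f}: P = {p_correct(1.0, d):.2f}")  # 0.88, 0.50, 0.18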

  11. QAM: A Competency Based Need Assessment Methodology and Computer Program.

    ERIC Educational Resources Information Center

    Gale, Larrie E.

    A needs assessment methodology is described which can be used (1) to assess the competencies required for functioning in a particular position, (2) to provide data for planning inservice and preservice educational programs, (3) to assess job performance, and (4) to provide information for personnel planners. Quadrants are formed using four…

  12. Assessment of gene order computing methods for Alzheimer's disease

    PubMed Central

    2013-01-01

    Background: Computational genomics of Alzheimer disease (AD), the most common form of senile dementia, is a nascent field in AD research. The field includes AD gene clustering by computing gene order, which generates higher-quality gene clustering patterns than most other clustering methods. However, there are few available gene order computing methods, such as Genetic Algorithm (GA) and Ant Colony Optimization (ACO). Further, their performance in gene order computation using AD microarray data is not known. We thus set forth to evaluate the performances of current gene order computing methods with different distance formulas, and to identify additional features associated with gene order computation. Methods: Using different distance formulas (Pearson distance, Euclidean distance, and squared Euclidean distance) and other conditions, gene orders were calculated by ACO and GA (including standard GA and improved GA) methods, respectively. The qualities of the gene orders were compared, and new features from the calculated gene orders were identified. Results: Compared to the GA methods tested in this study, ACO fits the AD microarray data the best when calculating gene order. In addition, the following features were revealed: different distance formulas generated a different quality of gene order, and the commonly used Pearson distance was not the best distance formula when used with both GA and ACO methods for AD microarray data. Conclusion: Compared with Pearson distance and Euclidean distance, the squared Euclidean distance generated the best quality gene order computed by GA and ACO methods. PMID:23369541
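
    Since the comparison turns on the choice of distance formula, the three distances as conventionally defined for expression profiles (with Pearson distance taken as 1 − r) are sketched below; the profiles shown are hypothetical, not AD microarray data:

      import numpy as np

      def pearson_distance(x: np.ndarray, y: np.ndarray) -> float:
          """1 minus the Pearson correlation coefficient."""
          return 1.0 - float(np.corrcoef(x, y)[0, 1])

      def euclidean_distance(x: np.ndarray, y: np.ndarray) -> float:
          return float(np.linalg.norm(x - y))

      def squared_euclidean_distance(x: np.ndarray, y: np.ndarray) -> float:
          return float(np.sum((x - y) ** 2))

      g1 = np.array([2.1, 3.5, 1.2, 4.0])  # hypothetical expression profiles
      g2 = np.array([1.9, 3.8, 1.0, 4.4])
      for f in (pearson_distance, euclidean_distance, squared_euclidean_distance):
          print(f.__name__, round(f(g1, g2), 4))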

  13. Cone beam computed tomography radiation dose and image quality assessments.

    PubMed

    Lofthag-Hansen, Sara

    2010-01-01

    Diagnostic radiology has undergone profound changes in the last 30 years. New technologies are available to the dental field, cone beam computed tomography (CBCT) being one of the most important. CBCT is a catch-all term for a technology comprising a variety of machines differing in many respects: patient positioning, volume size (FOV), radiation quality, image capturing and reconstruction, image resolution and radiation dose. When new technology is introduced, one must make sure that diagnostic accuracy is better or at least as good as that of the technology it can be expected to replace. The CBCT brand tested comprised two versions of Accuitomo (Morita, Japan): 3D Accuitomo with an image intensifier as detector, FOV 3 cm x 4 cm, and 3D Accuitomo FPD with a flat panel detector, FOVs 4 cm x 4 cm and 6 cm x 6 cm. The 3D Accuitomo was compared with intra-oral radiography for endodontic diagnosis in 35 patients with 46 teeth analyzed, of which 41 were endodontically treated. Three observers assessed the images by consensus. The result showed that CBCT imaging was superior, with a higher number of teeth diagnosed with periapical lesions (42 vs 32 teeth). When evaluating 3D Accuitomo examinations in the posterior mandible in 30 patients, visibility of the marginal bone crest and mandibular canal, important anatomic structures for implant planning, was high, with good observer agreement among seven observers. Radiographic techniques have to be evaluated concerning radiation dose, which requires well-defined and easy-to-use methods. Two methods, the CT dose index (CTDI), the prevailing method for CT units, and the dose-area product (DAP), were evaluated for calculating effective dose (E) for both units. An asymmetric dose distribution was revealed when a clinical situation was simulated. Hence, the CTDI method was not applicable for these units with small FOVs. Based on DAP values from 90 patient examinations, effective dose was estimated for three diagnostic tasks: implant planning in the posterior mandible and…
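
    Estimating effective dose from DAP, as described here, reduces to multiplying the measured dose-area product by a conversion coefficient specific to the examination and unit; the coefficient in the sketch below is hypothetical, not one of the thesis's derived values:

      def effective_dose_usv(dap_mgy_cm2: float,
                             coeff_usv_per_mgy_cm2: float) -> float:
          """E = DAP x conversion coefficient. The coefficient depends on FOV,
          beam quality and anatomical region; the value used below is made up."""
          return dap_mgy_cm2 * coeff_usv_per_mgy_cm2

      print(f"{effective_dose_usv(600.0, 0.08):.0f} microsieverts")  # 48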

  14. Critical Thinking in and through Interactive Computer Hypertext and Art Education

    ERIC Educational Resources Information Center

    Taylor, Pamela G.

    2006-01-01

    As part of a two-year study, Pamela G. Taylor's high school art students constructed hypertext webs that linked the subject matter they studied in class to their own independent research and personal experiences, which in turn encouraged them to think critically about the material. Taylor bases this use of hypertext on the thinking of Paulo Freire…

  15. Critical Difference Table for Word Recognition Testing Derived Using Computer Simulation

    ERIC Educational Resources Information Center

    Carney, Edward; Schlauch, Robert S.

    2007-01-01

    Purpose: To construct a table for upper and lower limits of the 95% critical range for changes in word recognition scores obtained with monosyllabic word lists (of lengths 10, 25, 50, and 100 words) using newly available methods. Although such a table has been available for nearly 30 years (A. R. Thornton & M. J. M. Raffin, 1978), the earlier…
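
    The simulation approach referred to treats each word-list score as a binomial outcome, so a 95% critical range for test-retest differences can be read off a simulated difference distribution. A sketch of that general procedure, under the simplifying assumption of equal true ability on both lists (not necessarily the authors' exact method):

      import numpy as np

      def critical_range(p_true: float, n_words: int, alpha: float = 0.05,
                         n_sim: int = 100_000, seed: int = 1) -> tuple[int, int]:
          """Simulate two independent word lists scored by a listener with the
          same true recognition probability; return the central 95% range of
          the score difference, in number of words."""
          rng = np.random.default_rng(seed)
          s1 = rng.binomial(n_words, p_true, n_sim)
          s2 = rng.binomial(n_words, p_true, n_sim)
          diff = s1 - s2
          lo, hi = np.percentile(diff, [100 * alpha / 2, 100 * (1 - alpha / 2)])
          return int(lo), int(hi)

      # Hypothetical case: true score 80% on 25-word lists; differences outside
      # the printed range would be treated as genuine changes.
      print(critical_range(0.80, 25))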

  16. A nuclear criticality safety assessment of the loss of moderation control in 2 1/2- and 10-ton cylinders containing enriched UF₆

    SciTech Connect

    Newvahner, R.L.; Pryor, W.A.

    1991-12-31

    Moderation control for maintaining nuclear criticality safety in 2 1/2-ton, 10-ton, and 14-ton cylinders containing enriched uranium hexafluoride (UF₆) has been used safely within the nuclear industry for over thirty years, and is dependent on cylinder integrity and containment. This assessment evaluates the loss of moderation control by the breaching of containment and entry of water into the cylinders. The first objective of this study was to estimate the amounts of water entering these large UF₆ cylinders required to react with, and to moderate, the uranium compounds sufficiently to cause criticality. Hypothetical accident situations were modeled as a uranyl fluoride (UO₂F₂) slab above a UF₆ hemicylinder, and a UO₂F₂ sphere centered within a UF₆ hemicylinder. These situations were investigated by computational analyses utilizing the KENO V.a Monte Carlo computer code. The results were used to estimate both the masses of water required for criticality and the limiting masses of water that could be considered safe. The second objective of the assessment was to calculate the time available for emergency control actions before a criticality would occur, i.e., a "safetime", for various sources of water and different size openings in a breached cylinder. In the situations considered, except the case of a fire hose, the safetime appears adequate for emergency control actions. The assessment shows that current practices for handling moderation-controlled cylinders of low-enriched UF₆, along with the continuation of established personnel training programs, ensure nuclear criticality safety for routine and emergency operations.
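
    Once a limiting water mass and an inflow rate are estimated, the "safetime" itself is simple arithmetic. A sketch with entirely hypothetical numbers; the report's KENO V.a-derived water masses are not reproduced here:

      def safetime_minutes(critical_water_kg: float,
                           inflow_rate_kg_per_min: float) -> float:
          """Time available before enough water has entered a breached cylinder
          to permit criticality."""
          return critical_water_kg / inflow_rate_kg_per_min

      # Hypothetical: 200 kg of water required; rain entering at 0.5 kg/min
      # versus a fire hose delivering 400 kg/min.
      print(safetime_minutes(200.0, 0.5))    # 400 minutes: ample response time
      print(safetime_minutes(200.0, 400.0))  # 0.5 minutes: effectively none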

  17. A critical assessment of regulatory triggers for products of biotechnology: Product vs. process.

    PubMed

    McHughen, Alan

    2016-10-01

    Regulatory policies governing the safety of genetic engineering (rDNA) and the resulting products (GMOs) have been contentious and divisive, especially in agricultural applications of the technologies. These tensions have led to vastly different approaches to safety regulation in different jurisdictions, even though the intent of regulation (to assure public and environmental safety) is common worldwide, and even though the international scientific communities agree on the basic principles of risk assessment and risk management. So great are the political divisions that jurisdictions cannot even agree on the appropriate trigger for regulatory capture, whether product or process. This paper reviews the historical policy and scientific implications of the agricultural biotechnology regulatory approaches taken by the European Union, USA and Canada, using their respective statutes and regulations, and then critically assesses the scientific underpinnings of each.

  18. [Chemical risk assessment in the construction industry: principles and critical issues].

    PubMed

    Manno, M

    2012-01-01

    Risk assessment (RA) represents the first step to ensure the protection of workers' health in all work sectors, production and services included. For this reason RA has become a legal duty for the occupational physician in his/her professional activity. The basic concepts of RA were developed as a formal procedure for the management of chemical risks, but they are currently applied to protect human health against all types of occupational and environmental risk factors. In the construction industry, in particular, chemical risk assessment is especially difficult due to the complexity of the working conditions and the variability and multiplicity of exposures. The critical aspects of RA in the construction industry are discussed here, in an attempt to highlight how the occupational physician, making use of traditional and new tools, including biological monitoring, can address and partly overcome them.

  19. Evidence based clinical assessment of child and adolescent social phobia: a critical review of rating scales.

    PubMed

    Tulbure, Bogdan T; Szentagotai, Aurora; Dobrean, Anca; David, Daniel

    2012-10-01

    Investigating the empirical support of various assessment instruments, the evidence-based assessment approach expands the scientific basis of psychotherapy. Starting from Hunsley and Mash's evaluative framework, we critically reviewed the rating scales designed to measure social anxiety or phobia in youth. Thirteen of the most researched social anxiety scales for children and adolescents were identified. An overview of the scientific support accumulated by these scales is offered. Our main results are consistent with recent reviews that consider the Social Phobia and Anxiety Scale for Children (SPAI-C) and the Social Anxiety Scale for Adolescents (SAS-A) among the most pertinent and empirically supported measures of social anxiety for youngsters. However, after considering the existing evidence, we also strongly recommend two additional scales that proved to be empirically supported: the Social Phobia Inventory (SPIN) and the Liebowitz Social Anxiety Scale for Children and Adolescents (LSAS-CA).

  20. Spoilt for choice: A critical review on the chemical and biological assessment of current wastewater treatment technologies.

    PubMed

    Prasse, Carsten; Stalter, Daniel; Schulte-Oehlmann, Ulrike; Oehlmann, Jörg; Ternes, Thomas A

    2015-12-15

    The knowledge we have gained in recent years on the presence and effects of compounds discharged by wastewater treatment plants (WWTPs) brings us to a point where we must question the appropriateness of current water quality evaluation methodologies. An increasing number of anthropogenic chemicals is detected in treated wastewater and there is increasing evidence of adverse environmental effects related to WWTP discharges. It has thus become clear that new strategies are needed to assess overall quality of conventional and advanced treated wastewaters. There is an urgent need for multidisciplinary approaches combining expertise from engineering, analytical and environmental chemistry, (eco)toxicology, and microbiology. This review summarizes the current approaches used to assess treated wastewater quality from the chemical and ecotoxicological perspective. Discussed chemical approaches include target, non-target and suspect analysis, sum parameters, identification and monitoring of transformation products, computational modeling as well as effect directed analysis and toxicity identification evaluation. The discussed ecotoxicological methodologies encompass in vitro testing (cytotoxicity, genotoxicity, mutagenicity, endocrine disruption, adaptive stress response activation, toxicogenomics) and in vivo tests (single and multi species, biomonitoring). We critically discuss the benefits and limitations of the different methodologies reviewed. Additionally, we provide an overview of the current state of research regarding the chemical and ecotoxicological evaluation of conventional as well as the most widely used advanced wastewater treatment technologies, i.e., ozonation, advanced oxidation processes, chlorination, activated carbon, and membrane filtration. In particular, possible directions for future research activities in this area are provided.

  1. Development of computer-based analytical tool for assessing physical protection system

    SciTech Connect

    Mardhi, Alim; Pengvanich, Phongphaeth

    2016-01-22

    Assessment of physical protection system effectiveness is the priority for ensuring optimum protection against unlawful acts at a nuclear facility, such as unauthorized removal of nuclear materials and sabotage of the facility itself. Since an assessment based on real exercise scenarios is costly and time-consuming, a computer-based analytical tool can offer a solution for approximating the likely threat scenarios. There are several currently available tools that can be used instantly, such as EASI and SAPE; however, for our research purpose it is more suitable to have a tool that can be customized and enhanced further. In this work, we have developed a computer-based analytical tool by utilizing a network methodological approach for modelling the adversary paths. The inputs are the multiple security elements used to evaluate the effectiveness of the system's detection, delay, and response. The tool has the capability to analyze the most critical path and quantify the probability of effectiveness of the system as a performance measure.
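
    A minimal sketch of the network idea described: enumerate adversary paths through a layered facility model, score each by cumulative detection probability, and report the most critical (least detectable) path. The topology and probabilities are hypothetical placeholders, not the tool's actual inputs:

      from itertools import product

      # Hypothetical facility model: each protection layer offers alternative
      # elements, each with a probability that the adversary is detected while
      # defeating it.
      layers = {
          "perimeter": {"fence": 0.3, "gate": 0.5},
          "building":  {"door": 0.6, "window": 0.4},
          "target":    {"vault": 0.9},
      }

      def detection_probability(path: tuple) -> float:
          """Cumulative detection along a path: 1 - product of (1 - p_i)."""
          p_miss = 1.0
          for layer_name, element in zip(layers, path):
              p_miss *= 1.0 - layers[layer_name][element]
          return 1.0 - p_miss

      # Enumerate every adversary path (one element per layer) and rank them.
      paths = list(product(*(elements.keys() for elements in layers.values())))
      most_critical = min(paths, key=detection_probability)
      print(most_critical, round(detection_probability(most_critical), 3))
      # ('fence', 'window', 'vault') 0.958: the weakest line of defence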

  2. Quality assessment and authentication of virgin olive oil by NMR spectroscopy: a critical review.

    PubMed

    Dais, Photis; Hatzakis, Emmanuel

    2013-02-26

    Nuclear Magnetic Resonance (NMR) spectroscopy has been extensively used for the analysis of olive oil, and it has been established as a valuable tool for quality assessment and authentication. To date, a large number of research and review articles have been published with regard to the analysis of olive oil, reflecting the potential of the NMR technique in these studies. In this critical review, we cover recent results in the field and discuss deficiencies and precautions of the three NMR techniques (¹H, ¹³C, ³¹P) used for the analysis of olive oil. The two methodological approaches of metabonomics, metabolic profiling and metabolic fingerprinting, and the statistical methods applied for the classification of olive oils are discussed in a critical way. Some useful information about sample preparation, the instrumentation required for effective analysis, the experimental conditions and the data processing needed to obtain high-quality spectra is presented as well. Finally, constructive criticism is offered of the present methodologies used for the quality control and authentication of olive oil.

  3. Identifying Critical Learner Traits in a Dynamic Computer-Based Geometry Program.

    ERIC Educational Resources Information Center

    Hannafin, Robert D.; Scott, Barry N.

    1998-01-01

    Investigated the effects of student working-memory capacity, preference for amount of instruction, spatial problem-solving ability, and school mathematics grades on eighth graders' recall of factual information and conceptual understanding. Pairs of students worked through 16 activities using a dynamic, computer-based geometry program. Presents…

  4. The Use of Computer Technology in University Teaching and Learning: A Critical Perspective

    ERIC Educational Resources Information Center

    Selwyn, N.

    2007-01-01

    Despite huge efforts to position information and communication technology (ICT) as a central tenet of university teaching and learning, the fact remains that many university students and faculty make only limited formal academic use of computer technology. Whilst this is usually attributed to a variety of operational deficits on the part of…

  5. Evaluating How the Computer-Supported Collaborative Learning Community Fosters Critical Reflective Practices

    ERIC Educational Resources Information Center

    Ma, Ada W.W.

    2013-01-01

    In recent research, little attention has been paid to issues of methodology and analysis methods to evaluate the quality of the collaborative learning community. To address such issues, an attempt is made to adopt the Activity System Model as an analytical framework to examine the relationship between computer supported collaborative learning…

  6. Fostering Critical Reflection in a Computer-Based, Asynchronously Delivered Diversity Training Course

    ERIC Educational Resources Information Center

    Givhan, Shawn T.

    2013-01-01

    This dissertation study chronicles the creation of a computer-based, asynchronously delivered diversity training course for a state agency. The course format enabled efficient delivery of a mandatory curriculum to the Massachusetts Department of State Police workforce. However, the asynchronous format posed a challenge to achieving the learning…

  7. The statistical-thermodynamic basis for computation of binding affinities: a critical review.

    PubMed Central

    Gilson, M K; Given, J A; Bush, B L; McCammon, J A

    1997-01-01

    Although the statistical thermodynamics of noncovalent binding has been considered in a number of theoretical papers, few methods of computing binding affinities are derived explicitly from this underlying theory. This has contributed to uncertainty and controversy in certain areas. This article therefore reviews and extends the connections of some important computational methods with the underlying statistical thermodynamics. A derivation of the standard free energy of binding forms the basis of this review. This derivation should be useful in formulating novel computational methods for predicting binding affinities. It also permits several important points to be established. For example, it is found that the double-annihilation method of computing binding energy does not yield the standard free energy of binding, but can be modified to yield this quantity. The derivation also makes it possible to define clearly the changes in translational, rotational, configurational, and solvent entropy upon binding. It is argued that molecular mass has a negligible effect upon the standard free energy of binding for biomolecular systems, and that the cratic entropy defined by Gurney is not a useful concept. In addition, the use of continuum models of the solvent in binding calculations is reviewed, and a formalism is presented for incorporating a limited number of solvent molecules explicitly. PMID:9138555
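
    The review's central quantity, the standard free energy of binding, is tied to a measurable dissociation constant by the textbook relation ΔG° = RT ln(Kd/c°), where c° is the 1 M standard-state concentration. A minimal numeric sketch of that relation (the ligand value is hypothetical):

      import math

      R = 8.314462618e-3  # gas constant, kJ/(mol*K)

      def binding_free_energy_kj(kd_molar: float, temp_k: float = 298.15) -> float:
          """Standard binding free energy from a dissociation constant,
          under the 1 M standard-state convention."""
          return R * temp_k * math.log(kd_molar / 1.0)

      # A hypothetical 10 nM ligand binds with about -45.7 kJ/mol.
      print(f"{binding_free_energy_kj(1e-8):.1f} kJ/mol")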

  8. Computer Simulation as a Tool for Assessing Decision-Making in Pandemic Influenza Response Training

    PubMed Central

    Leaming, James M.; Adoff, Spencer; Terndrup, Thomas E.

    2013-01-01

    Introduction: We sought to develop and test a computer-based, interactive simulation of a hypothetical pandemic influenza outbreak. Fidelity was enhanced with integrated video and branching decision trees, built upon the 2007 federal planning assumptions. We conducted a before-and-after study of simulation effectiveness, assessing participants' beliefs regarding their own hospitals' mass casualty incident preparedness. Methods: Development: Using a Delphi process, we finalized a simulation that serves up more than 50 key decisions to 6 role-players on networked laptops in a conference area. The simulation played out an 8-week scenario, beginning with pre-incident decisions. Testing: Role-players and trainees (N=155) were facilitated to make decisions during the pandemic. Because decision responses vary, the simulation plays out differently, and a casualty counter quantifies hypothetical losses. The facilitator reviews and critiques key factors for casualty control, including effective communications, working with external organizations, development of internal policies and procedures, maintaining supplies and services, technical infrastructure support, public relations and training. Pre- and post-survey data were compared for trainees. Results: Post-simulation, trainees indicated a greater likelihood of needing to improve their organization in terms of communications, mass casualty incident planning, public information and training. Participants also recognized which key factors required immediate attention at their own home facilities. Conclusion: The use of a computer simulation was effective in providing a facilitated environment for determining the perception of preparedness, evaluating general preparedness concepts and introducing participants to critical decisions involved in handling a regional pandemic influenza surge. PMID:23687542

  9. Computer Assisted Project-Based Instruction: The Effects on Science Achievement, Computer Achievement and Portfolio Assessment

    ERIC Educational Resources Information Center

    Erdogan, Yavuz; Dede, Dinçer

    2015-01-01

    The purpose of this study is to compare the effects of computer assisted project-based instruction on learners' achievement in a science and technology course, in a computer course and in portfolio development. With this aim in mind, a quasi-experimental design was used and a sample of 70 seventh grade secondary school students from Org. Esref…

  10. Ebola preparedness: a rapid needs assessment of critical care in a tertiary hospital

    PubMed Central

    Sutherland, Stephanie; Robillard, Nicholas; Kim, John; Dupuis, Kirsten; Thornton, Mary; Mansour, Marlene; Cardinal, Pierre

    2015-01-01

    Background: The current outbreak of Ebola has been declared a public health emergency of international concern. We performed a rigorous and rapid needs assessment to identify the desired results, the gaps in current practice, and the barriers and facilitators to the development of solutions in the provision of critical care to patients with suspected or confirmed Ebola. Methods: We conducted a qualitative study with an emergent design at a tertiary hospital in Ontario, Canada, recently designated as an Ebola centre, from Oct. 21 to Nov. 7, 2014. Participants included physicians, nurses, respiratory therapists, and staff from infection control, housekeeping, waste management, administration, facilities, and occupational health and safety. Data collection included document analysis, focus groups, interviews and walk-throughs of critical care areas with key stakeholders. Results: Fifteen themes and 73 desired results were identified, of which 55 had gaps. During the study period, solutions were implemented to fully address 8 gaps and partially address 18 gaps. Themes identified included the following: screening; response team activation; personal protective equipment; postexposure to virus; patient placement, room setup, logging and signage; intrahospital patient movement; interhospital patient movement; critical care management; Ebola-specific diagnosis and treatment; critical care staffing; visitation and contacts; waste management, environmental cleaning and management of linens; postmortem; conflict resolution; and communication. Interpretation: This investigation identified widespread gaps across numerous themes; as such, we have been able to develop a set of credible and measureable results. All hospitals need to be prepared for contact with a patient with Ebola, and the preparedness plan will need to vary based on local context, resources and site designation. PMID:26389098

  11. A Computer Assessment Tool for Structural Communication Grid

    ERIC Educational Resources Information Center

    Durmus, Soner; Karakirik, Erol

    2005-01-01

    Assessment is one of the most important parts of the educational process, directing teaching and learning as well as curriculum development. However, widely used classical assessment techniques, such as multiple-choice tests, are not adequate to provide either a correct picture of students' performances or a measure of the effectiveness of the teaching process.…

  12. Selecting an Architecture for a Safety-Critical Distributed Computer System with Power, Weight and Cost Considerations

    NASA Technical Reports Server (NTRS)

    Torres-Pomales, Wilfredo

    2014-01-01

    This report presents an example of the application of multi-criteria decision analysis to the selection of an architecture for a safety-critical distributed computer system. The design problem includes constraints on minimum system availability and integrity, and the decision is based on the optimal balance of power, weight and cost. The analysis process includes the generation of alternative architectures, evaluation of individual decision criteria, and the selection of an alternative based on overall value. In the example presented here, iterative application of the quantitative evaluation process made it possible to deliberately generate an alternative architecture that is superior to all others regardless of the relative importance of cost.
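
    The aggregation step in such a multi-criteria analysis is often a weighted sum of normalized criterion scores; the report's actual alternatives and weights are not reproduced here, so the names and values below are hypothetical:

      # Candidate architectures with normalized scores in [0, 1], higher is
      # better (power, weight and cost already inverted so that less is more).
      alternatives = {
          "dual-redundant bus":   {"power": 0.7, "weight": 0.6, "cost": 0.8},
          "triplex voting":       {"power": 0.5, "weight": 0.4, "cost": 0.5},
          "braided ring network": {"power": 0.8, "weight": 0.7, "cost": 0.6},
      }
      weights = {"power": 0.3, "weight": 0.3, "cost": 0.4}  # sum to 1

      def overall_value(scores: dict) -> float:
          """Weighted-sum aggregation across the decision criteria."""
          return sum(weights[c] * s for c, s in scores.items())

      for name, scores in alternatives.items():
          print(f"{name}: {overall_value(scores):.2f}")
      best = max(alternatives, key=lambda a: overall_value(alternatives[a]))
      print("selected:", best)  # 'dual-redundant bus' at 0.71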

  13. A Model for Computer-based Assessment: The Catherine Wheel Principle.

    ERIC Educational Resources Information Center

    Zakrzewski, Stan; Steven, Christine

    2000-01-01

    This paper proposes a model for computer-based assessment systems that utilizes a step-wise approach to assessment design and implementation, within which the management and assessment of operational, technical, pedagogic, and financial risks are made explicit. The cyclic model has five components: planning, risk analysis and management,…

  14. An assessment of future computer system needs for large-scale computation

    NASA Technical Reports Server (NTRS)

    Lykos, P.; White, J.

    1980-01-01

    Data ranging from specific computer capability requirements to opinions about the desirability of a national computer facility are summarized. It is concluded that considerable attention should be given to improving the user-machine interface. Otherwise, increased computer power may not improve the overall effectiveness of the machine user. Significant improvement in throughput requires highly concurrent systems plus the willingness of the user community to develop problem solutions for that kind of architecture. An unanticipated result was the expression of need for an on-going cross-disciplinary users group/forum in order to share experiences and to more effectively communicate needs to the manufacturers.

  15. Enabling Wide-Scale Computer Science Education through Improved Automated Assessment Tools

    NASA Astrophysics Data System (ADS)

    Boe, Bryce A.

    There is a proliferating demand for newly trained computer scientists as the number of computer science related jobs continues to increase. University programs will only be able to train enough new computer scientists to meet this demand when two things happen: when there are more primary and secondary school students interested in computer science, and when university departments have the resources to handle the resulting increase in enrollment. To meet these goals, significant effort is being made to both incorporate computational thinking into existing primary school education, and to support larger university computer science class sizes. We contribute to this effort through the creation and use of improved automated assessment tools. To enable wide-scale computer science education we do two things. First, we create a framework called Hairball to support the static analysis of Scratch programs targeted for fourth, fifth, and sixth grade students. Scratch is a popular building-block language utilized to pique interest in and teach the basics of computer science. We observe that Hairball allows for rapid curriculum alterations and thus contributes to wide-scale deployment of computer science curriculum. Second, we create a real-time feedback and assessment system utilized in university computer science classes to provide better feedback to students while reducing assessment time. Insights from our analysis of student submission data show that modifications to the system configuration support the way students learn and progress through course material, making it possible for instructors to tailor assignments to optimize learning in growing computer science classes.

  16. The Application of Web-based Computer-assisted Instruction Courseware within Health Assessment

    NASA Astrophysics Data System (ADS)

    Xiuyan, Guo

    Health assessment is a clinical nursing course that places emphasis on clinical skills. The application of computer-assisted instruction in the field of nursing teaching solves several problems of the traditional lecture class. This article describes the teaching experience of web-based computer-assisted instruction, based upon a two-year study of computer-assisted instruction courseware use within the course Health Assessment. The computer-assisted instruction courseware could develop the teaching structure, simulate clinical situations, create teaching situations and facilitate students' study.

  17. Ensuring critical event sequences in high consequence computer based systems as inspired by path expressions

    SciTech Connect

    Kidd, M.E.C.

    1997-02-01

    The goal of our work is to provide a high level of confidence that critical software-driven event sequences are maintained in the face of hardware failures, malevolent attacks and harsh or unstable operating environments. This will be accomplished by providing dynamic fault management measures directly to the software developer and to their varied development environments. The methodology employed here is inspired by previous work in path expressions. This paper discusses the perceived problems, gives a brief overview of path expressions, presents the proposed methods, and examines the differences between the proposed methods and traditional path expression usage and implementation.
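
    At run time, a path expression can be enforced by a small state machine that admits only the transitions the expression permits and flags everything else as a sequence violation. A sketch of the idea with hypothetical event names, not drawn from the report:

      class SequenceGuard:
          """Enforces an event ordering such as arm -> authorize -> fire,
          rejecting any out-of-sequence event."""

          TRANSITIONS = {
              ("idle", "arm"): "armed",
              ("armed", "authorize"): "authorized",
              ("authorized", "fire"): "idle",  # sequence completes, reset
              ("armed", "disarm"): "idle",
          }

          def __init__(self) -> None:
              self.state = "idle"

          def handle(self, event: str) -> None:
              key = (self.state, event)
              if key not in self.TRANSITIONS:
                  raise RuntimeError(
                      f"event {event!r} violates sequence in state {self.state!r}")
              self.state = self.TRANSITIONS[key]

      guard = SequenceGuard()
      for event in ("arm", "authorize", "fire"):
          guard.handle(event)   # the permitted ordering passes
      try:
          guard.handle("fire")  # out of sequence: no re-arm has occurred
      except RuntimeError as err:
          print(err)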

  18. Implementing and Assessing Computational Modeling in Introductory Mechanics

    ERIC Educational Resources Information Center

    Caballero, Marcos D.; Kohlmyer, Matthew A.; Schatz, Michael F.

    2012-01-01

    Students taking introductory physics are rarely exposed to computational modeling. In a one-semester large lecture introductory calculus-based mechanics course at Georgia Tech, students learned to solve physics problems using the VPython programming environment. During the term, 1357 students in this course solved a suite of 14 computational…
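
    The computational problems in such a course typically have students advance a system's state in a short loop: update momentum from the net force, then position from the momentum. A plain-Python sketch of that iterative pattern, with hypothetical values rather than an item from the course's actual suite:

      # Euler-Cromer update loop for a projectile, the pattern students code
      # in VPython: p = p + F*dt, then r = r + (p/m)*dt.
      m, g, dt = 0.1, 9.8, 0.001        # kg, m/s^2, s (hypothetical values)
      pos = [0.0, 1.5]                   # x, y in metres
      mom = [m * 4.0, m * 3.0]           # initial momentum components

      while pos[1] > 0.0:
          force = (0.0, -m * g)          # net force: gravity only
          mom = [p + f * dt for p, f in zip(mom, force)]
          pos = [r + (p / m) * dt for r, p in zip(pos, mom)]

      print(f"landed at x = {pos[0]:.2f} m")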

  19. Evaluation and Assessment of a Biomechanics Computer-Aided Instruction.

    ERIC Educational Resources Information Center

    Washington, N.; Parnianpour, M.; Fraser, J. M.

    1999-01-01

    Describes the Biomechanics Tutorial, a computer-aided instructional tool that was developed at Ohio State University to expedite the transition from lecture to application for undergraduate students. Reports evaluation results that used statistical analyses and student questionnaires to show improved performance on posttests as well as positive…

  20. Live Application Testing: Performance Assessment with Computer-Based Delivery.

    ERIC Educational Resources Information Center

    Adair, James H.; Berkowitz, Nancy F.

    To measure workplace skills more realistically for certification purposes, two computer-delivered performance examinations, termed "Live Application" exams, were developed to test job-related competencies in a specific software product, Lotus Notes. As in the real world, success on examination tasks was determined by the examinee's final…

  1. Computer Specialist Need Assessment for Oakland Community College: Final Report.

    ERIC Educational Resources Information Center

    Austin, Henry

    Prepared as part of an effort by Oakland Community College (OCC) to monitor the evolution of information technology in relation to the college's mission and purpose, this report describes OCC's current programs in data processing, examines the fit between the current curriculum and the employment market for computer specialists in southeast…

  2. Computational assessment of visual search strategies in volumetric medical images

    PubMed Central

    Wen, Gezheng; Aizenman, Avigael; Drew, Trafton; Wolfe, Jeremy M.; Haygood, Tamara Miner; Markey, Mia K.

    2016-01-01

    When searching through volumetric images [e.g., computed tomography (CT)], radiologists appear to use two different search strategies: “drilling” (restrict eye movements to a small region of the image while quickly scrolling through slices), or “scanning” (search over large areas at a given depth before moving on to the next slice). To computationally identify the type of image information that is used in these two strategies, 23 naïve observers were instructed with either “drilling” or “scanning” when searching for target T’s in 20 volumes of faux lung CTs. We computed saliency maps using both classical two-dimensional (2-D) saliency, and a three-dimensional (3-D) dynamic saliency that captures the characteristics of scrolling through slices. Comparing observers’ gaze distributions with the saliency maps showed that search strategy alters the type of saliency that attracts fixations. Drillers’ fixations aligned better with dynamic saliency and scanners with 2-D saliency. The computed saliency was greater for detected targets than for missed targets. Similar results were observed in data from 19 radiologists who searched five stacks of clinical chest CTs for lung nodules. Dynamic saliency may be superior to the 2-D saliency for detecting targets embedded in volumetric images, and thus “drilling” may be more efficient than “scanning.” PMID:26759815
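
    One common way to compare gaze distributions with saliency maps, consistent with the comparison described here, is to sample z-scored saliency at the fixated locations (the normalized scanpath saliency). A sketch with synthetic data, not the study's maps or fixations:

      import numpy as np

      def normalized_scanpath_saliency(saliency: np.ndarray,
                                       fixations: list) -> float:
          """Mean z-scored saliency at fixated pixels; values above 0 mean
          fixations land on above-average saliency."""
          z = (saliency - saliency.mean()) / saliency.std()
          return float(np.mean([z[r, c] for r, c in fixations]))

      rng = np.random.default_rng(0)
      sal = rng.random((64, 64))            # synthetic saliency map
      sal[30:34, 30:34] += 2.0              # a salient hot spot
      fix = [(31, 31), (32, 33), (10, 50)]  # synthetic fixation coordinates
      print(round(normalized_scanpath_saliency(sal, fix), 2))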

  3. Computers will become increasingly important for psychological assessment: not that there's anything wrong with that!

    PubMed

    Garb, H N

    2000-03-01

    Though one can expect that computer programs will become increasingly important for psychological assessment, current automated assessment programs and statistical-prediction rules are of limited value. Validity has not been clearly established for many automated assessment programs. Statistical-prediction rules are of limited value because they have typically been based on limited information that has not been demonstrated to be optimal and they have almost never been shown to be powerful. Recommendations are made for building and evaluating new computer programs. Finally, comments are made about the ethics of using computers to make judgments.

  4. Critical anatomic region of nasopalatine canal based on tridimensional analysis: cone beam computed tomography.

    PubMed

    Fernández-Alonso, Ana; Suárez-Quintanilla, Juan Antonio; Muinelo-Lorenzo, Juan; Varela-Mallou, Jesús; Smyth Chamosa, Ernesto; Suárez-Cunqueiro, María Mercedes

    2015-08-06

    The aim of this study was to define the critical anatomic region of the premaxilla by evaluating the dimensions of the nasopalatine canal (NC), buccal bone plate (BBP) and palatal bone plate (PBP). 230 CBCTs were selected with both, one, or no upper central incisors present (+/+, -/+, -/-), and periodontal condition was evaluated. Student's t-test, ANOVA, Pearson's correlation and a multivariate linear regression model (MLRM) were used. Regarding gender, significant differences at level 1 (lower NC) were found for buccal-palatal, transversal and sagittal NC diameters, and NC length (NCL). Regarding dental status, significant differences were found for total BBP length (tBL) and PBP width (PW2) at level 2 (NCL midpoint). NCL was correlated with PW2, tBL, and PBP length at level 3 (foramina of Stenson level). An MLRM had a high prediction value for NCL (69.3%). Gender is related to NC dimensions. Dental status has an influence on BBP dimensions, but does not influence the NC or PBP. Periodontal condition should be evaluated for precise premaxilla analysis. NC diameters at the three anatomical planes are related to each other, while NCL is related to BBP and PBP lengths. A third of the premaxilla is taken up by the NC, which thus defines the critical anatomic region.

  5. Quantitative physical models of volcanic phenomena for hazards assessment of critical infrastructures

    NASA Astrophysics Data System (ADS)

    Costa, Antonio

    2016-04-01

    Volcanic hazards may have destructive effects on economy, transport, and natural environments at both local and regional scale. Hazardous phenomena include pyroclastic density currents, tephra fall, gas emissions, lava flows, debris flows and avalanches, and lahars. Volcanic hazards assessment is based on available information to characterize potential volcanic sources in the region of interest and to determine whether specific volcanic phenomena might reach a given site. Volcanic hazards assessment is focussed on estimating the distances that volcanic phenomena could travel from potential sources and their intensity at the considered site. Epistemic and aleatory uncertainties strongly affect the resulting hazards assessment. Within the context of critical infrastructures, volcanic eruptions are rare natural events that can create severe hazards. Moreover, evidence of many past volcanic eruptions is poorly preserved in the geologic record. The models used for describing the impact of volcanic phenomena generally represent a range of model complexities, from simplified physics-based conceptual models to highly coupled thermo-fluid-dynamical approaches. Modelling approaches represent a hierarchy of complexity, which reflects increasing requirements for well-characterized data in order to produce a broader range of output information. In selecting models for the hazard analysis related to a specific phenomenon, the questions that need to be answered by the models must be carefully considered. Independently of the model, the final hazards assessment strongly depends on input derived from detailed volcanological investigations, such as mapping and stratigraphic correlations. For each phenomenon, an overview of currently available approaches for the evaluation of future hazards will be presented, with the aim of providing a foundation for future work in developing an international consensus on volcanic hazards assessment methods.

  6. Use of computer-aided testing in the investigation of pilot response to critical in-flight events. Volume 2: Appendix

    NASA Technical Reports Server (NTRS)

    Rockwell, T. H.; Giffin, W. C.

    1982-01-01

    Computer displays using PLATO are illustrated. Diagnostic scenarios are described. A sample of subject data is presented. Destination diversion displays, a combined destination, diversion scenario, and critical in-flight event (CIFE) data collection/subject testing system are presented.

  7. A critical review of environmental assessment tools for sustainable urban design

    SciTech Connect

    Ameen, Raed Fawzi Mohammed; Mourshed, Monjur; Li, Haijiang

    2015-11-15

    Cities are responsible for the depletion of natural resources and agricultural lands, and 70% of global CO₂ emissions. There are significant risks to cities from the impacts of climate change in addition to existing vulnerabilities, primarily because of rapid urbanization. Urban design and development are generally considered the instruments that shape the future of the city, and they determine the pattern of a city's resource usage and resilience to change, from climate or otherwise. Cities are inherently dynamic and require the participation and engagement of their diverse stakeholders for the effective management of change, which enables wider stakeholder involvement and buy-in at various stages of the development process. Sustainability assessment of urban design and development is increasingly being seen as indispensable for informed decision-making. A sustainability assessment tool also acts as a driver for the uptake of sustainable pathways by recognizing excellence through its rating system and by creating a market demand for sustainable products and processes. This research reviews six widely used sustainability assessment tools for urban design and development: BREEAM Communities, LEED-ND, CASBEE-UD, SBTool PT-UP, Pearl Community Rating System (PCRS) and GSAS/QSAS, to identify, compare and contrast their aim, structure, assessment methodology, scoring, weighting and suitability for application in different geographical contexts. Strengths and weaknesses of each tool are critically discussed. The study highlights the disparity in local and international contexts for global sustainability assessment tools. Despite their similarities in aim on environmental aspects, differences exist in the relative importance and share of mandatory vs optional indicators in both environmental and social dimensions. PCRS and GSAS/QSAS are new incarnations, but have widely varying shares of mandatory indicators, at 45.4% and 11.36% respectively, compared to 30% in…

  8. Research on Computer-Based Education for Reading Teachers: A 1989 Update. Results of the First National Assessment of Computer Competence.

    ERIC Educational Resources Information Center

    Balajthy, Ernest

    Results of the 1985-86 National Assessment of Educational Progress (NAEP) survey of American students' knowledge of computers suggest that American schools have a long way to go before computers can be said to have made a significant impact. The survey covered the 3rd, 7th, and 11th grade levels and assessed competence in knowledge of computers,…

  9. Cone beam computed tomography aided diagnosis and treatment of endodontic cases: Critical analysis

    PubMed Central

    Yılmaz, Funda; Kamburoglu, Kıvanç; Yeta, Naz Yakar; Öztan, Meltem Dartar

    2016-01-01

    Although intraoral radiographs still remain the imaging method of choice for the evaluation of endodontic patients, the utilization of cone beam computed tomography (CBCT) in endodontics has risen sharply in recent years. This case series presentation shows the importance of CBCT-aided diagnosis and treatment of complex endodontic cases such as root resorption, a missed extra canal, fusion, oblique root fracture, undiagnosed periapical pathology and horizontal root fracture. CBCT may be a useful diagnostic method in several endodontic cases where intraoral radiography and clinical examination alone are unable to provide sufficient information. PMID:27551342

  10. Critical analysis of outcome measures used in the assessment of carpal tunnel syndrome

    PubMed Central

    Priyanka, P.; Gul, Arif; Ilango, Balakrishnan

    2007-01-01

    Clinicians and researchers are confounded by the various outcome measures used for the assessment of carpal tunnel syndrome (CTS). In this study, we critically analysed the conceptual framework, validity, reliability, responsiveness and appropriateness of some of the commonly used CTS outcome measures. Initially, we conducted an extensive literature search to identify all of the outcome measures used in the assessment of CTS patients, which revealed six different carpal tunnel outcome measures [Boston Carpal Tunnel Questionnaire (BCTQ), Michigan Hand Outcome Questionnaire (MHQ), Disability of Arm, Shoulder and Hand (DASH), Patient Evaluation Measure (PEM), clinical rating scale (Historical-Objective (Hi-Ob) scale) and Upper Extremity Functional Scale (UEFS)]. We analysed the construction framework, development process, validation process, reliability, internal consistency (IC), responsiveness and limitations of each of these outcome measures. Our analysis reveals that the BCTQ, MHQ and PEM have comprehensive frameworks and good validity, reliability and responsiveness in the hands of both the developers and independent researchers. The UEFS and Hi-Ob scale need validation and reliability testing by independent researchers. Region-specific measures like the DASH have good frameworks and hence a potential role in the assessment of CTS, but they require more validation in exclusively carpal tunnel patients. PMID:17370071

  12. Guidelines and pharmacopoeial standards for pharmaceutical impurities: overview and critical assessment.

    PubMed

    Snodin, David J; McCrossen, Sean D

    2012-07-01

    ICH/regional guidances and agency scrutiny provide the regulatory framework for safety assessment and control of impurities in small-molecule drug substances and drug products. We provide a critical assessment of the principal impurity guidances and, in particular, focus on deficiencies in the derivation of the threshold of toxicological concern (TTC) as applied to genotoxic impurities and the many toxicological anomalies generated by following the current guidelines on impurities. In terms of pharmacopoeial standards, we aim to highlight the fact that strictly controlling numerous impurities, especially those that are minor structural variants of the active substance, is likely to produce minimal improvements in drug safety. It is believed that, wherever possible, there is a need to simplify and rebalance the current impurity paradigm, moving away from standards derived largely from batch analytical data towards structure-based qualification thresholds and risk assessments using readily available safety data. Such changes should also lead to a minimization of in vivo testing for toxicological qualification purposes. Recent improvements in analytical techniques and performance have enabled the detection of ever smaller amounts of impurities with increased confidence. The temptation to translate this information directly to the regulatory sphere without any kind of safety evaluation should be resisted.

  13. Methods for assessing channel conditions related to scour-critical conditions at bridges in Tennessee

    USGS Publications Warehouse

    Bryan, B.A.; Simon, Andrew; Outlaw, G.S.; Thomas, Randy

    1995-01-01

    The ability to assess quickly the potential for scour at a bridge site, to evaluate those bridges with the greatest potential for significant amounts of scour, and then to identify scour-critical structures is important for public protection and bridge maintenance planning. A bridge-scour assessment information form was developed for collecting data describing the bridge site and the hydraulic, geomorphic, and vegetation characteristics of the channel. Information from site assessments of 3,964 bridges in Tennessee was used to develop indexes of potential scour characteristics over broad geographic areas, such as counties, regions, or drainage basins. Channel instability characteristics differ from region to region. In West Tennessee counties, channel instability has progressed from valley bottoms into the uplands through headward degradation. In Middle and East Tennessee counties, channel widening is a dominant process, but widespread degradation has been prevented by stream beds lined with erosion-resistant bedrock, boulders, cobbles, and gravel, and by the absence of channelization. Neither quantifiable headcutting nor degradation in bedrock channels was noted at any site in the State. However, the potential for lateral scour is prevalent in Middle and East Tennessee.

  14. The Sixth Rhino: A Taxonomic Re-Assessment of the Critically Endangered Northern White Rhinoceros

    PubMed Central

    Groves, Colin P.; Fernando, Prithiviraj; Robovský, Jan

    2010-01-01

    Background The two forms of white rhinoceros, northern and southern, have had contrasting conservation histories. The northern form, once fairly numerous, is now critically endangered, while the southern form has recovered from a few individuals to a population of a few thousand. Since their last taxonomic assessment over three decades ago, new material and analytical techniques have become available, necessitating a review of available information and a re-assessment of the taxonomy. Results Dental morphology and cranial anatomy clearly diagnosed the southern and northern forms. The differentiation was well supported by dental metrics, cranial growth and craniometry, and corresponded with differences in the post-cranial skeleton, external measurements and external features. No distinctive differences were found in the limited descriptions of their behavior and ecology. The fossil history indicated the antiquity of the genus, dating back at least to the early Pliocene, and its evolution into a number of diagnosable forms. The fossil skulls examined fell outside the two extant forms in the craniometric analysis. Genetic divergence between the two forms was consistent across both nuclear and mitochondrial genomes and indicated a separation of over a million years. Conclusions On re-assessing the taxonomy of the two forms, we find them to be morphologically and genetically distinct, warranting the recognition of the taxa formerly designated as subspecies, Ceratotherium simum simum (the southern form) and Ceratotherium simum cottoni (the northern form), as two distinct species: Ceratotherium simum and Ceratotherium cottoni, respectively. The recognition of the northern form as a distinct species has profound implications for its conservation. PMID:20383328

  15. [Assessment of the critical patient at admission. An indicator of quality of care].

    PubMed

    Miró Bonet, M; Amorós Cerdá, S M; De Juan Sánchez, S; Fortea Cabo, E; Frau Morro, J; Moragues Mas, J; Pastor Picornell, C I

    2000-01-01

    The type of information recorded by nurses at admission of critical patients to the Intensive Care Unit was described, and the relation between the information recorded and the presence or absence of endotracheal intubation in the patient admitted was analyzed. A sample of 214 admission records of patients admitted to our unit in 1998 was studied using a data sheet based on Virginia Henderson assessment questionnaires. The presence or absence of 71 variables classified into four sections was analyzed: personal data, general data, Virginia Henderson basic needs, and other assessment data. Most data collected at admission were objective data obtained by observation and/or physical examination of the patient. These data were contained in two sections: "Virginia Henderson basic needs" (normal breathing, food and water intake, excretion, mobility, maintaining posture, conserving body temperature, skin hygiene and integrity, and avoiding danger) and "other assessment data" (medical treatment, diagnostic and therapeutic tests, and hemodynamic monitoring). Information about the patient's background in the "general data" section was obtained less frequently. Subjective data obtained from interviews were clearly limited. These data are included in the "Virginia Henderson basic needs" section (sleep, rest, dressing and undressing, communicating, values and beliefs, feeling of satisfaction, absence of boredom, and intellectual stimulation).

  16. Academic physicians' assessment of the effects of computers on health care.

    PubMed Central

    Detmer, W. M.; Friedman, C. P.

    1994-01-01

    We assessed the attitudes of academic physicians towards computers in health care at two academic medical centers that are in the early stages of clinical information-system deployment. We distributed a 4-page questionnaire to 470 subjects, and a total of 272 physicians (58%) responded. Our results show that respondents use computers frequently, primarily to perform academic-oriented tasks as opposed to clinical tasks. Overall, respondents viewed computers as being slightly beneficial to health care. They perceive self-education and access to up-to-date information as the most beneficial aspects of computers and are most concerned about privacy issues and the effect of computers on the doctor-patient relationship. Physicians with prior computer training and greater knowledge of informatics concepts had more favorable attitudes towards computers in health care. We suggest that negative attitudes towards computers can be addressed by careful system design as well as targeted educational activities. PMID:7949990

  17. [Computer-assisted histocompatibility assessment in mixed lymphocyte culture (MLC)].

    PubMed

    Schwartz, D; Hajek-Rosenmayr, A

    1987-02-20

    Analysis of the results of mixed lymphocyte culture (MLC) for compatibility testing preceding transplantation of bone marrow and other organs has so far required a vast input of laboratory staff time and work hours. We have developed a computer programme which performs this work rapidly. Moreover, graphics of the reaction patterns can be obtained, and these can prove a helpful tool in the interpretation of the results.

  18. A Micro-Computer Based System for the Management of the Critically Ill

    PubMed Central

    Comerchero, Harry; Thomas, Gregory; Shapira, Gaby; Greatbatch, Mennen; Hoyt, John W.

    1978-01-01

    A central station based system is described which employs a micro-computer for continuous monitoring of hemodynamic parameters for multiple patients. Monitored vital signs are displayed on a “WARD STATUS” video monitor and processed for long-term trend storage and retrieval. Alarm events and changes in module settings at the bedside are immediately reflected on the WARD STATUS display. Medication administration can be indicated and presented together with the graphical trends of any monitored parameter. Optional features of the system include on-line determination of Cardiac Output, Pulmonary Wedge Pressure Measurements, Arrhythmia and Respiratory Monitoring. An alphanumeric terminal connected to the micro-computer facilitates “background” programming in high level languages. This facility can be used to provide tailored patient data management capability to the medical staff or can be used as a tool for in-house development of special purpose application programs. The system is currently implemented on a Digital Equipment Corporation LSI-11 with 28K memory and dual floppy disks.

  19. Computational fluid dynamics approaches in quality and hygienic production of semisolid low-moisture foods: a review of critical factors.

    PubMed

    Mondal, Arpita; Buchanan, Robert L; Lo, Y Martin

    2014-10-01

    Low-moisture foods have been responsible for a number of salmonellosis outbreaks worldwide over the last few decades, with cross contamination from contaminated equipment being the most predominant source. To date, actions have focused on stringent hygienic practices prior to production, namely periodic sanitization of the processing equipment and lines. Optimum sanitization requires in-depth knowledge of the type and source of contaminants; moreover, the heat resistance of microorganisms is unique and often depends on the heat transfer characteristics of the low-moisture food. Rheological properties, including viscosity, degree of turbulence, and flow characteristics (for example, Newtonian or non-Newtonian) of both liquid and semisolid foods are critical factors impacting the flow behavior, which in turn interferes with heat transfer and related control elements. The demand for progressively more accurate prediction of complex fluid phenomena has called for the employment of computational fluid dynamics (CFD) to model mass and heat transfer during the processing of various food products, ranging from drying to baking. With the aim of improving the quality and safety of low-moisture foods, this article critically reviews the published literature concerning microbial survival in semisolid low-moisture foods, including chocolate, honey, and peanut butter. Critical rheological properties and state-of-the-art CFD applications relevant to quality production of those products are also addressed. It is anticipated that adequate prediction of specific transport properties during optimum sanitization through CFD could be used to solve current and future food safety challenges.
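
    To make the role of heat transfer modelling concrete, the sketch below solves the one-dimensional transient conduction equation for a semisolid food layer heated from one wall, the kind of calculation that underlies CFD-based sanitization studies. It is a minimal illustration, not a method from the review; the material properties, layer thickness, and temperatures are assumed values.

        import numpy as np

        # Explicit finite-difference solution of T_t = alpha * T_xx for a
        # semisolid layer heated from one side during sanitization.
        alpha = 1.2e-7            # assumed thermal diffusivity, m^2/s
        L, nx = 0.01, 51          # 10 mm layer discretized into 51 nodes
        dx = L / (nx - 1)
        dt = 0.4 * dx**2 / alpha  # keeps the explicit scheme stable (r <= 0.5)

        T = np.full(nx, 25.0)     # initial product temperature, deg C
        T_wall, t_end = 90.0, 600.0

        t = 0.0
        while t < t_end:
            T[0] = T_wall                     # heated wall (Dirichlet)
            T[1:-1] += alpha * dt / dx**2 * (T[2:] - 2.0 * T[1:-1] + T[:-2])
            T[-1] = T[-2]                     # insulated far side (zero flux)
            t += dt

        print(f"coldest point after {t_end:.0f} s: {T.min():.1f} C")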

  20. Geomagnetic Excursions: A Critical Assessment of the Evidence as Recorded in Sediments of the Brunhes Epoch

    NASA Astrophysics Data System (ADS)

    Verosub, K. L.

    1982-08-01

    Geomagnetic excursions have tantalized geophysicists since the earliest suggestion of their occurrence over 15 years ago. Whether as large-scale geomagnetic secular variation, geomagnetic reversals of short duration, or aborted reversals, they have held great promise of providing new insights into the origin of the geomagnetic field. Unfortunately, the evidence for geomagnetic excursions from the palaeomagnetic record of Brunhes-age sediments is not as compelling as the theoretical arguments. A critical assessment of the available data indicates that the Gothenburg excursion is unlikely to have occurred and the Erieau excursion is very unlikely. The Mono Lake excursion probably occurred, but its absence at nearby contemporaneous sites creates profound problems. The Blake Event appears to be an actual short reversal of complex character, but confirmation of its global nature may be quite difficult.

  1. Regulatory assessment of safety critical software used in upgrades to analog systems

    SciTech Connect

    Taylor, R.P.

    1994-12-31

    As a result of the difficulties encountered by both licensee and regulator during the licensing of the Darlington nuclear generating station software-based shutdown systems, Ontario Hydro was directed by the Atomic Energy Control Board (AECB) to produce improved company standards and procedures for safety-critical software development. In partnership with Atomic Energy of Canada Ltd. (AECL), a joint committee called OASES (Ontario Hydro/AECL Software Engineering Standards) has developed a suite of standards and procedures for software specification, design, implementation, verification, testing, and safety analysis. These standards are now being applied to new systems and are being adapted for use on upgrades to existing systems. Several digital protection systems have been installed recently in Canadian nuclear plants, such as a primary heat transport pump trip and an emergency powerhouse venting system. We have learned from the experience of assessing these systems and are now applying these lessons to systems developed under the new OASES standards.

  2. Assessment of hygiene standards and Hazard Analysis Critical Control Points implementation on passenger ships.

    PubMed

    Mouchtouri, Varavara; Malissiova, Eleni; Zisis, Panagiotis; Paparizou, Evina; Hadjichristodoulou, Christos

    2013-01-01

    The level of hygiene on ferries can have an impact on travellers' health. The aim of this study was to assess the hygiene standards of ferries in Greece and to investigate whether Hazard Analysis Critical Control Points (HACCP) implementation contributes to the hygiene status, and particularly food safety, aboard passenger ships. Hygiene inspections on 17 ferries in Greece were performed using a standardized inspection form with a 135-point scale. Thirty-four water and 17 food samples were collected and analysed. About 65% (11/17) of the ferries scored more than 100 points. Ferries with HACCP received higher scores during inspection than those without HACCP (p value <0.001). All 34 microbiological water test results were negative and, of the 17 food samples, only one was positive for Salmonella spp. Implementation of management systems including HACCP principles can help to raise the level of hygiene aboard passenger ships.

  3. Quantifying and modelling the carbon sequestration capacity of seagrass meadows--a critical assessment.

    PubMed

    Macreadie, P I; Baird, M E; Trevathan-Tackett, S M; Larkum, A W D; Ralph, P J

    2014-06-30

    Seagrasses are among the planet's most effective natural ecosystems for sequestering (capturing and storing) carbon (C); but if degraded, they could leak stored C into the atmosphere and accelerate global warming. Quantifying and modelling the C sequestration capacity is therefore critical for successfully managing seagrass ecosystems to maintain their substantial abatement potential. At present, there is no mechanism to support carbon financing linked to seagrass. For seagrasses to be recognised by the IPCC and the voluntary C market, standard stock assessment methodologies and inventories of seagrass C stocks are required. Developing accurate C budgets for seagrass meadows is indeed complex; we discuss these complexities, and, in addition, we review techniques and methodologies that will aid development of C budgets. We also consider a simple process-based data assimilation model for predicting how seagrasses will respond to future change, accompanied by a practical list of research priorities.
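
    As a pointer to what such a process-based model involves, the sketch below integrates a one-pool sediment carbon stock with constant burial and first-order remineralisation loss. It is a deliberately minimal illustration of the model class discussed, and every rate in it is an invented placeholder rather than a measured seagrass value.

        # One-pool sediment carbon model: dC/dt = burial - decay * C
        def carbon_stock(years, burial=50.0, decay=0.005, c0=0.0):
            """C stock (g C/m^2) after `years`, with burial in g C/m^2/yr
            and a first-order loss rate in 1/yr (all values illustrative)."""
            c = c0
            for _ in range(years):
                c += burial - decay * c
            return c

        print(round(carbon_stock(100)))   # stock after a century
        print(round(50.0 / 0.005))        # steady-state stock, burial/decay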

  4. A Tool for Music Preference Assessment in Critically Ill Patients Receiving Mechanical Ventilatory Support

    PubMed Central

    CHLAN, LINDA; HEIDERSCHEIT, ANNIE

    2010-01-01

    Music is an ideal intervention to reduce anxiety and promote relaxation in critically ill patients. This article reviews the research studies on music-listening interventions to manage distressful symptoms in this population, and describes the development and implementation of the Music Assessment Tool (MAT) to assist professionals in ascertaining patients’ music preferences in the challenging, dynamic clinical environment of the intensive care unit (ICU). The MAT is easy to use with these patients who experience profound communication challenges due to fatigue and inability to speak because of endotracheal tube placement. The music therapist and ICU nursing staff are encouraged to work collaboratively to implement music in a personalized manner to ensure the greatest benefit for mechanically ventilated patients. PMID:24489432

  5. A systematic review and critical assessment of incentive strategies for discovery and development of novel antibiotics

    PubMed Central

    Renwick, Matthew J; Brogan, David M; Mossialos, Elias

    2016-01-01

    Despite the growing threat of antimicrobial resistance, pharmaceutical and biotechnology firms are reluctant to develop novel antibiotics because of a host of market failures. This problem is complicated by public health goals that demand antibiotic conservation and equitable patient access. Thus, an innovative incentive strategy is needed to encourage sustainable investment in antibiotics. This systematic review consolidates, classifies and critically assesses a total of 47 proposed incentives. Given the large number of possible strategies, a decision framework is presented to assist with the selection of incentives. This framework focuses on addressing market failures that result in limited investment, public health priorities regarding antibiotic stewardship and patient access, and implementation constraints and operational realities. The flexible nature of this framework allows policy makers to tailor an antibiotic incentive package that suits a country's health system structure and needs. PMID:26464014

  6. PROBABILISTIC ASSESSMENT OF A CRITICALITY IN A WASTE CONTAINER AT SRS

    SciTech Connect

    Eghbali, D

    2006-12-26

    Transuranic solid waste generated as a result of the production of nuclear material for the United States defense program at the Savannah River Site (SRS) has been stored in more than 30,000 55-gallon drums and various-size carbon steel boxes since 1953. Nearly two thirds of those containers have been processed and shipped to the Waste Isolation Pilot Plant. Among the containers assayed so far, the results indicate several drums with fissile inventories significantly higher (600-1000 grams of ²³⁹Pu) than their originally assigned values. While part of this discrepancy can be attributed to past limited assay capabilities, human errors are believed to be the primary contributor. This paper summarizes an assessment of the probability of occurrence of a criticality accident during handling of the remaining transuranic waste containers at SRS.
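
    A back-of-the-envelope version of the probabilistic argument can be written in a few lines: if each remaining container independently carries some small chance of combining a grossly under-reported fissile inventory with a handling error that produces a favorable geometry, the campaign-level probability follows from the binomial complement. All of the numbers below are hypothetical placeholders, not values from the assessment.

        # Campaign-level probability of at least one criticality event,
        # assuming independent, identically distributed container handlings.
        n_drums = 10_000            # remaining containers (assumed)
        p_high_fissile = 1e-3       # P(fissile mass grossly under-reported)
        p_config_error = 1e-4       # P(handling yields a favorable geometry)
        p_event = p_high_fissile * p_config_error

        p_any = 1.0 - (1.0 - p_event) ** n_drums
        print(f"P(at least one event) ~ {p_any:.1e}")   # ~1e-3 with these inputs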

  7. Temporal discounting in life cycle assessment: A critical review and theoretical framework

    SciTech Connect

    Yuan, Chris; Wang, Endong; Zhai, Qiang; Yang, Fan

    2015-02-15

    Temporal homogeneity of inventory data is one of the major problems in life cycle assessment (LCA). Addressing the temporal homogeneity of life cycle inventory data is important for reducing the uncertainties and improving the reliability of LCA results. This paper presents a critical review and discussion of the fundamental issues of temporal homogeneity in conventional LCA and proposes a theoretical framework for temporal discounting in LCA. Theoretical perspectives on temporal discounting in life cycle inventory analysis are discussed first, based on the key elements of a scientific mechanism for temporal discounting. Generic procedures for performing temporal discounting in LCA are then derived and proposed, based on the nature of the LCA method and the identified key elements of a scientific temporal discounting method. A five-step framework is proposed and reported in detail, based on the technical methods and procedures needed to perform temporal discounting in life cycle inventory analysis. Challenges and possible solutions are also identified and discussed for the technical procedure and scientific accomplishment of each step within the framework. - Highlights: • A critical review of the temporal homogeneity problem of life cycle inventory data. • A theoretical framework for performing temporal discounting on inventory data. • Methods provided to accomplish each step of the temporal discounting framework.
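
    The core operation such a framework formalises is conventional discounting applied to emission flows. The sketch below shows the basic calculation; the emission series and discount rate are invented for illustration, and the framework itself concerns how such a rate could be justified scientifically rather than this arithmetic.

        # Present-equivalent impact of a yearly emission series E_t:
        #   sum over t of E_t / (1 + r)^t
        def discounted_impact(emissions, rate):
            return sum(e / (1.0 + rate) ** t for t, e in enumerate(emissions))

        flows = [100.0, 80.0, 60.0, 40.0, 20.0]    # kg CO2-eq, years 0..4
        print(discounted_impact(flows, 0.03))      # vs. undiscounted sum of 300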

  8. Real Space Migdal-Kadanoff Renormalisation of Glassy Systems: Recent Results and a Critical Assessment

    NASA Astrophysics Data System (ADS)

    Angelini, Maria Chiara; Biroli, Giulio

    2017-03-01

    In this manuscript, in honour of L. Kadanoff, we present recent progress obtained in the description of finite-dimensional glassy systems thanks to the Migdal-Kadanoff renormalisation group (MK-RG). We provide a critical assessment of the method, in particular discussing its limitations in describing situations in which an infinite number of pure states might be present, and analyse the MK-RG flow in the limit of infinite dimensions. MK-RG predicts that the spin-glass transition in a field and the glass transition are governed by zero-temperature fixed points of the renormalization group flow. This implies a typical energy scale that grows, approaching the transition, as a power of the correlation length, thus leading to enormously large time scales, as expected from experiments and simulations. These fixed points exist only above a critical dimension d_L > 3, but they nevertheless influence the RG flow below it, in particular in three dimensions. MK-RG thus predicts a similar behavior for spin-glasses in a field and models of glasses and relates it to the presence of avoided critical points.
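
    The scaling statement can be made explicit in the standard activated-dynamics form; the notation below (barrier amplitude Υ and exponent θ) is the conventional droplet-style parametrisation, supplied here for illustration rather than taken from the manuscript:

        \Delta(\xi) \sim \Upsilon\,\xi^{\theta},
        \qquad
        \tau \sim \tau_0 \exp\!\left[\frac{\Delta(\xi)}{k_B T}\right]
             = \tau_0 \exp\!\left[\frac{\Upsilon\,\xi^{\theta}}{k_B T}\right],

    so a barrier growing as a power of the correlation length produces time scales that diverge exponentially as the transition is approached.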

  9. Faculty Approaches to Assessing Critical Thinking in the Humanities and the Natural and Social Sciences: Implications for General Education

    ERIC Educational Resources Information Center

    Nicholas, Mark C.; Labig, Chalmer E., Jr.

    2013-01-01

    An analysis of interviews, focus-group discussions, assessment instruments, and assignment prompts revealed that within general education, faculty assessed critical thinking as faceted using methods and criteria that varied epistemically across disciplines. Faculty approaches were misaligned with discipline-general institutional approaches.…

  10. Assessing a Critical Aspect of Construct Continuity when Test Specifications Change or Test Forms Deviate from Specifications

    ERIC Educational Resources Information Center

    Liu, Jinghua; Dorans, Neil J.

    2013-01-01

    We make a distinction between two types of test changes: inevitable deviations from specifications versus planned modifications of specifications. We describe how score equity assessment (SEA) can be used as a tool to assess a critical aspect of construct continuity, the equivalence of scores, whenever planned changes are introduced to testing…

  11. Computer-Aided Argument Mapping in an EFL Setting: Does Technology Precede Traditional Paper and Pencil Approach in Developing Critical Thinking?

    ERIC Educational Resources Information Center

    Eftekhari, Maryam; Sotoudehnama, Elaheh; Marandi, S. Susan

    2016-01-01

    Developing higher-order critical thinking skills as one of the central objectives of education has been recently facilitated via software packages. Whereas one such technology as computer-aided argument mapping is reported to enhance levels of critical thinking (van Gelder 2001), its application as a pedagogical tool in English as a Foreign…

  12. Contemporary issues for experimental design in assessment of medical imaging and computer-assist systems

    NASA Astrophysics Data System (ADS)

    Wagner, Robert F.; Beiden, Sergey V.; Campbell, Gregory; Metz, Charles E.; Sacks, William M.

    2003-05-01

    The dialog among investigators in academia, industry, NIH, and the FDA has grown in recent years on topics of historic interest to attendees of these SPIE sub-conferences on Image Perception, Observer Performance, and Technology Assessment. Several of the most visible issues in this regard have been the emergence of digital mammography and modalities for computer-assisted detection and diagnosis in breast and lung imaging. These issues appear to be only the “tip of the iceberg,” foreshadowing a number of emerging advances in imaging technology, so it is timely to make some general remarks looking back and looking ahead at the landscape (or seascape). The advances have been facilitated and documented in several forums. The major role of the SPIE Medical Imaging Conferences is well known to all of us. Many of us were also present at the Medical Image Perception Society conference co-sponsored by CDRH and NCI in September 2001 at Airlie House, VA. The workshops and discussions held at that conference addressed some critical contemporary issues related to how society - and in particular industry and the FDA - approach the general assessment problem. A great deal of inspiration for these discussions was also drawn from several workshops in recent years sponsored by the Biomedical Imaging Program of the National Cancer Institute on these issues, in particular the problem of “The Moving Target” of imaging technology. Another notable development is that the Fourth National Forum on Biomedical Imaging in Oncology was recently held in Bethesda, MD, February 6-7, 2003. These forums are presented by the National Cancer Institute (NCI), the Food and Drug Administration (FDA), the Centers for Medicare and Medicaid Services (CMS), and the National Electrical Manufacturers Association (NEMA), and are sponsored by the National Institutes of Health/Foundation for Advanced Education in the Sciences (NIH/FAES). These forums led to the development of the NCI…

  13. Long-Term Assessment of Critical Radionuclides and Associated Environmental Media at the Savannah River Site

    SciTech Connect

    Jannik, G. T.; Baker, R. A.; Lee, P. L.; Eddy, T. P.; Blount, G. C.; Whitney, G. R.

    2012-11-06

    During the operational history of the Savannah River Site (SRS), many different radionuclides have been released from site facilities. However, only a relatively small number of the released radionuclides have been significant contributors to doses and risks to the public. At SRS, dose and risk assessments indicate that tritium oxide in air and surface water, and Cs-137 in fish and deer, have been, and continue to be, the critical radionuclides and pathways. In this assessment, in-depth statistical analyses of the long-term trends of tritium oxide in atmospheric and surface-water releases and of Cs-137 concentrations in fish and deer are provided. Correlations are also provided with (1) operational changes and improvements, (2) geopolitical events (Cold War cessation), and (3) recent environmental remediation projects and decommissioning of excess facilities. For example, environmental remediation of the F- and H-Area Seepage Basins and the Solid Waste Disposal Facility has had a measurable impact on the tritium oxide flux to the onsite Fourmile Branch stream. Airborne releases of tritium oxide have been greatly affected by operational improvements and the end of the Cold War in 1991. However, the effects of SRS environmental remediation activities and ongoing tritium operations on tritium concentrations in the environment are measurable and documented in this assessment. Controlled hunts of deer and feral hogs are conducted at SRS for approximately six weeks each year. Before any harvested animal is released to a hunter, SRS personnel perform a field analysis of Cs-137 concentrations to ensure the hunter's dose does not exceed the SRS administrative game limit of 0.22 millisievert (22 mrem). However, most of the Cs-137 found in SRS onsite deer is not from site operations but from nuclear weapons testing fallout of the 1950s and early 1960s. This legacy source term is trended in the SRS deer, and an assessment of the "effective" half-life of Cs-137 in deer…
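
    The "effective" half-life referred to combines physical decay with environmental (ecological) loss; in the standard formulation the rates add, a relation worth stating explicitly. In the numerical reading below, only the 30.1-year physical half-life of Cs-137 is an established value; the observed half-life is a hypothetical example:

        \frac{1}{T_{\text{eff}}} = \frac{1}{T_{\text{phys}}} + \frac{1}{T_{\text{eco}}}

    For instance, if monitoring of deer indicated T_eff ≈ 15 years, then with T_phys = 30.1 years the implied ecological half-life would be T_eco = 1 / (1/15 − 1/30.1) ≈ 30 years.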

  14. Application of Computational Toxicology to Prospective and Diagnostic Ecological Risk Assessment

    EPA Pesticide Factsheets

    Application of Computational Toxicology to Prospective and Diagnostic Ecological Risk Assessment (Presented by: Dan Villeneuve, Ph.D., Research Toxicologist, US-EPA Mid-Continent Ecology Division) (1/24/2013)

  15. Assessing the impact of ionizing radiation on aquatic invertebrates: a critical review.

    PubMed

    Dallas, Lorna J; Keith-Roach, Miranda; Lyons, Brett P; Jha, Awadhesh N

    2012-05-01

    There is growing scientific, regulatory and public concern over anthropogenic input of radionuclides to the aquatic environment, especially given the issues surrounding existing nuclear waste, future energy demand and past or potential nuclear accidents. A change in the approach to how we protect the environment from ionizing radiation has also underlined the importance of assessing its impact on nonhuman biota. This review presents a thorough and critical examination of the available information on the effects of ionizing radiation on aquatic invertebrates, which constitute approximately 90% of extant life on the planet and play vital roles in ecosystem functioning. The aim of the review was to assess the progress made so far, addressing any concerns and identifying the knowledge gaps in the field. The critical analysis of the available information included determining yearly publications in the field, qualities of radiation used, group(s) of animals studied, and levels of biological organization at which effects were examined. The overwhelming conclusion from analysis of the available information is that more data are needed in almost every area. However, in light of the current priorities in human and environmental health, and considering regulatory developments, the following are areas of particular interest for future research on the effects of ionizing radiation on nonhuman biota in general and aquatic invertebrates in particular: (1) studies that use end points across multiple levels of biological organization, including an ecosystem level approach where appropriate, (2) multiple species studies that produce comparable data across phylogenetic groups, and (3) determination of the modifying (i.e. antagonistic, additive or synergistic) effects of biotic and abiotic factors on the impact of ionizing radiation. It is essential that all of these issues are examined in the context of well-defined radiation exposure and total doses received and consider the life

  16. Application of a modified sequential organ failure assessment score to critically ill patients.

    PubMed

    Namendys-Silva, S A; Silva-Medina, M A; Vásquez-Barahona, G M; Baltazar-Torres, J A; Rivero-Sigarroa, E; Fonseca-Lazcano, J A; Domínguez-Cherit, G

    2013-02-01

    The purpose of the present study was to explore the usefulness of the Mexican sequential organ failure assessment (MEXSOFA) score for assessing the risk of mortality for critically ill patients in the ICU. A total of 232 consecutive patients admitted to an ICU were included in the study. The MEXSOFA was calculated using the original SOFA scoring system with two modifications: the PaO2/FiO2 ratio was replaced with the SpO2/FiO2 ratio, and the evaluation of neurologic dysfunction was excluded. The ICU mortality rate was 20.2%. Patients with an initial MEXSOFA score of 9 points or less calculated during the first 24 h after admission to the ICU had a mortality rate of 14.8%, while those with an initial MEXSOFA score of 10 points or more had a mortality rate of 40%. The MEXSOFA score at 48 h was also associated with mortality: patients with a score of 9 points or less had a mortality rate of 14.1%, while those with a score of 10 points or more had a mortality rate of 50%. In a multivariate analysis, only the MEXSOFA score at 48 h was an independent predictor for in-ICU death with an OR = 1.35 (95%CI = 1.14-1.59, P < 0.001). The SOFA and MEXSOFA scores calculated 24 h after admission to the ICU demonstrated a good level of discrimination for predicting the in-ICU mortality risk in critically ill patients. The MEXSOFA score at 48 h was an independent predictor of death; with each 1-point increase, the odds of death increased by 35%.
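
    To read the reported effect size: an odds ratio of 1.35 per point means the odds of in-ICU death multiply by 1.35 for each additional MEXSOFA point at 48 h, so the odds compound over larger score differences. The snippet below just makes that arithmetic explicit; the score differences chosen are illustrative, not comparisons drawn from the study.

        # Compounding of a per-point odds ratio across score differences
        OR_PER_POINT = 1.35       # reported for the 48-h MEXSOFA score
        for delta in (1, 5, 10):
            print(f"+{delta} points -> odds x {OR_PER_POINT ** delta:.2f}")
        # +1 -> 1.35, +5 -> ~4.48, +10 -> ~20.1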

  17. Critical issues in the formation of quantum computer test structures by ion implantation

    SciTech Connect

    Schenkel, T.; Lo, C. C.; Weis, C. D.; Schuh, A.; Persaud, A.; Bokor, J.

    2009-04-06

    The formation of quantum computer test structures in silicon by ion implantation enables the characterization of spin readout mechanisms with ensembles of dopant atoms and the development of single atom devices. We briefly review recent results in the characterization of spin dependent transport and single ion doping and then discuss the diffusion and segregation behaviour of phosphorus, antimony and bismuth ions from low fluence, low energy implantations as characterized through depth profiling by secondary ion mass spectrometry (SIMS). Both phosphorus and bismuth are found to segregate to the SiO₂/Si interface during activation anneals, while antimony diffusion is found to be minimal. An effect of the ion charge state on the range of antimony ions, ¹²¹Sb²⁵⁺, in SiO₂/Si is also discussed.

  18. Regulating fatty acids in infant formula: critical assessment of U.S. policies and practices

    PubMed Central

    2014-01-01

    Background Fatty acids in breast-milk such as docosahexaenoic acid and arachidonic acid, commonly known as DHA and ARA, contribute to the healthy development of children in various ways. However, the manufactured versions that are added to infant formula might not have the same health benefits as those in breast-milk. There is evidence that the manufactured additives might cause harm to infants’ health, and they might lead to unwarranted increases in the cost of infant formula. The addition of such fatty acids to infant formula needs to be regulated. In the U.S., the Food and Drug Administration has primary responsibility for regulating the composition of infant formula. The central purpose of this study is to assess the FDA’s efforts with regard to the regulation of fatty acids in infant formula. Methods This study is based on critical analysis of policies and practices described in publicly available documents of the FDA, the manufacturers of fatty acids, and other relevant organizations. The broad framework for this work was set out by the author in his book on Regulating Infant Formula, published in 2011. Results The FDA does not assess the safety or the health impacts of fatty acid additives to infant formula before they are marketed, and there is no systematic assessment after marketing is underway. Rather than making its own independent assessments, the FDA accepts the manufacturers’ claims regarding their products’ safety and effectiveness. Conclusions The FDA is not adequately regulating the use of fatty acid additives to infant formula. This results in exposure of infants to potential risks. Adverse reactions are already on record. Also, the additives have led to increasing costs of infant formula despite the lack of proven benefits to normal, full term infants. There is a need for more effective regulation of DHA and ARA additives to infant formula. PMID:24433303

  19. Computer-Based Assessment of School Readiness and Early Reasoning

    ERIC Educational Resources Information Center

    Csapó, Beno; Molnár, Gyöngyvér; Nagy, József

    2014-01-01

    This study explores the potential of using online tests for the assessment of school readiness and for monitoring early reasoning. Four tests of a face-to-face-administered school readiness test battery (speech sound discrimination, relational reasoning, counting and basic numeracy, and deductive reasoning) and a paper-and-pencil inductive…

  20. Computer-Based Screening for the New Modified Alternate Assessment

    ERIC Educational Resources Information Center

    Kettler, Ryan J.

    2011-01-01

    The final regulations of the "No Child Left Behind" Act (U.S. Department of Education, 2007a, 2007b) indicate that a small group of students with disabilities can be counted as proficient through an alternate assessment based on modified academic achievement standards. This new policy gives individualized education program teams the…

  1. An Assessment of Nursing Attitudes toward Computers in Health Care.

    ERIC Educational Resources Information Center

    Carl, David L.; And Others

    The attitudes and perceptions of practicing nurses, student nurses, and nurse educators toward computerization of health care were assessed using questionnaires sent to two general hospitals and five nursing education programs. The sample consisted of 83 first-year nursing students, 84 second-year nursing students, 52 practicing nurses, and 26…

  2. Assessment of toxic metals in waste personal computers

    SciTech Connect

    Kolias, Konstantinos; Hahladakis, John N. Gidarakos, Evangelos

    2014-08-15

    Highlights: • Waste personal computers were collected and dismantled into their main parts. • Motherboards, monitors and plastic housings were examined for their metal content. • Measured concentrations were compared to the limits of the RoHS Directive, 2002/95/EC. • Pb in motherboards and funnel glass of devices released before 2006 was above the limit. • Waste personal computers need to be recycled and managed in an environmentally sound way. - Abstract: Considering the enormous production of waste personal computers nowadays, it is obvious that the study of their composition is necessary in order to regulate their management and prevent any environmental contamination caused by their inappropriate disposal. This study aimed at determining the toxic metal content of the motherboards (printed circuit boards), monitor glass and monitor plastic housing of two Cathode Ray Tube (CRT) monitors, three Liquid Crystal Display (LCD) monitors, one LCD touch-screen monitor and six motherboards, all of which had been discarded. In addition, concentrations of chromium (Cr), cadmium (Cd), lead (Pb) and mercury (Hg) were compared with the respective limits set by the RoHS 2002/95/EC Directive, recently renewed by the 2012/19/EU recast, in order to verify manufacturers' compliance with the regulation. The research included disassembly, pulverization, digestion and chemical analyses of all the aforementioned devices. The toxic metal content of all samples was determined using Inductively Coupled Plasma-Mass Spectrometry (ICP-MS). The results demonstrated that concentrations of Pb in the motherboards and funnel glass of devices with release dates before 2006, when the RoHS Directive came into force, exceeded the permissible limit. In general, apart from Pb, higher metal concentrations were detected in motherboards than in plastic housing and glass samples. Finally, the results of this work were encouraging, since concentrations of the metals referred to in the RoHS Directive were found in…
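
    The comparison step described - measured concentrations against regulatory limits - reduces to a table lookup. The sketch below uses the RoHS limits for homogeneous materials (0.1 wt% for Pb, Hg and Cr(VI); 0.01 wt% for Cd) with invented sample values; note that ICP-MS reports total Cr, whereas the directive limit applies specifically to hexavalent Cr.

        # RoHS limits in homogeneous materials, expressed in mg/kg
        ROHS_LIMITS = {"Pb": 1000, "Hg": 1000, "Cd": 100, "Cr(VI)": 1000}

        # Hypothetical ICP-MS results for one sample (mg/kg)
        sample = {"Pb": 28_500, "Cd": 12, "Hg": 0.4}

        for metal, measured in sample.items():
            limit = ROHS_LIMITS.get(metal)
            verdict = "EXCEEDS limit" if limit and measured > limit else "ok"
            print(f"{metal}: {measured} mg/kg (limit {limit}) -> {verdict}")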

  3. An Assessment of the Computer Science Activities of the Office of Naval Research

    DTIC Science & Technology

    1986-01-01

    A Panel of the Naval Studies Board of the National Research Council met for two days in October 1985 to assess the computer science programs of the...well as their knowledge of the field, the panel offers comments on the computer science research of the Navy and suggestions for further improvement...for example, was encouraged to take further advantage of its entrepreneurial flexibility to improve technology transfer between computer science and

  4. Assessing Affordances of Selected Cloud Computing Tools for Language Teacher Education in Nigeria

    ERIC Educational Resources Information Center

    Ofemile, Abdulmalik Yusuf

    2015-01-01

    This paper reports part of a study that sought to understand Teacher Educators' (TEs') assessment of the affordances of selected cloud computing tools ranked among the top 100 for the year 2010. Research has shown that ICT, and by extension cloud computing, has positive impacts on daily life, and this informed the Nigerian government's policy to…

  5. Assessment of Techniques for Evaluating Computer Systems for Federal Agency Procurements. Final Report.

    ERIC Educational Resources Information Center

    Letmanyi, Helen

    Developed to identify and qualitatively assess computer system evaluation techniques for use during acquisition of general purpose computer systems, this document presents several criteria for comparison and selection. An introduction discusses the automatic data processing (ADP) acquisition process and the need to plan for uncertainty through…

  6. Performance of a computer-based assessment of cognitive function measures in two cohorts of seniors

    Technology Transfer Automated Retrieval System (TEKTRAN)

    Computer-administered assessment of cognitive function is being increasingly incorporated in clinical trials, however its performance in these settings has not been systematically evaluated. The Seniors Health and Activity Research Program (SHARP) pilot trial (N=73) developed a computer-based tool f...

  7. Randomised Items in Computer-Based Tests: Russian Roulette in Assessment?

    ERIC Educational Resources Information Center

    Marks, Anthony M.; Cronje, Johannes C.

    2008-01-01

    Computer-based assessments are becoming more commonplace, perhaps as a necessity for faculty to cope with large class sizes. These tests often occur in large computer testing venues in which test security may be compromised. In an attempt to limit the likelihood of cheating in such venues, randomised presentation of items is automatically…

  8. Impacts of Mobile Computing on Student Learning in the University: A Comparison of Course Assessment Data

    ERIC Educational Resources Information Center

    Hawkes, Mark; Hategekimana, Claver

    2010-01-01

    This study focuses on the impact of wireless, mobile computing tools on student assessment outcomes. In a campus-wide wireless, mobile computing environment at an upper Midwest university, an empirical analysis is applied to understand the relationship between student performance and Tablet PC use. An experimental/control group comparison of…

  9. A Computer Services Program for Residents of a Continuing Care Retirement Community: Needs Assessment and Program Design

    ERIC Educational Resources Information Center

    Grad, Jane; Berdes, Celia

    2005-01-01

    Preparatory to establishing a computer services program for residents, Presbyterian Homes, a multi-campus continuing care retirement community, conducted an assessment of residents' computer needs, ownership, and usage. Based on the results of the needs assessment, computer resource rooms were established at each facility, with computer hardware…

  10. Online training course on critical appraisal for nurses: adaptation and assessment

    PubMed Central

    2014-01-01

    Background Research is an essential activity for improving quality and efficiency in healthcare. The objective of this study was to train nurses from the public Basque Health Service (Osakidetza) in critical appraisal, promoting continuous training and the use of research in clinical practice. Methods This was a prospective pre-post test study. The InfoCritique course on critical appraisal was translated and adapted. A sample of 50 nurses and 3 tutors was recruited. Educational strategies and assessment instruments were established for the course. A course website was created that contained contact details of the teaching team and coordinator, as well as a course handbook and videos introducing the course. Assessment comprised the administration of questionnaires before and after the course, in order to explore the main intervention outcomes: knowledge acquired and self-learning readiness. Satisfaction was also measured at the end of the course. Results Of the 50 health professionals recruited, 3 did not complete the course for personal or work-related reasons. The mean score on the pre-course knowledge questionnaire was 70.5 out of 100, with a standard deviation of 11.96. In general, participants’ performance on the knowledge questionnaire improved after the course, as reflected in the notable increase of the mean score, to 86.6, with a standard deviation of 10.00. Further, analyses confirmed statistically significant differences between pre- and post-course results (p < 0.001). With regard to self-learning readiness, after the course, participants reported a greater readiness and ability for self-directed learning. Lastly, in terms of level of satisfaction with the course, the mean score was 7 out of 10. Conclusions Participants significantly improved their knowledge score and self-directed learning readiness after the educational intervention, and they were overall satisfied with the course. For the health system and nursing professionals, this type of

  11. Computational models of ethanol-induced neurodevelopmental toxicity across species: Implications for risk assessment.

    PubMed

    Gohlke, Julia M; Griffith, William C; Faustman, Elaine M

    2008-02-01

    Computational, systems-based approaches can provide a quantitative construct for evaluating risk in the context of mechanistic data. Previously, we developed computational models for the rat, mouse, rhesus monkey, and human, describing the acquisition of adult neuron number in the neocortex during the key neurodevelopmental processes of neurogenesis and synaptogenesis. Here we apply mechanistic data from the rat describing ethanol-induced toxicity in the developing neocortex to evaluate the utility of these models for analyzing neurodevelopmental toxicity across species. Our model can explain long-term neocortical neuronal loss in the rodent model after in utero exposure to ethanol based on inhibition of proliferation during neurogenesis. Our human model predicts a significant neuronal deficit after daily peak blood ethanol concentrations (BECs) reaching 10-20 mg/dl, which is the approximate BEC reached after drinking one standard drink within one hour. In contrast, peak daily BECs of 100 mg/dl are necessary to predict similar deficits in the rat. Our model's prediction of the increased sensitivity of primate species to ethanol-induced inhibition of proliferation is based on in vivo experimental data from primates showing a prolonged rapid-growth period in the primate versus the rodent neuronal progenitor population. To place our predictions into a broader context, we evaluate the evidence for functional low-dose effects across rats, monkeys, and humans. Results from this critical evaluation suggest subtle effects are evident at doses causing peak BECs of approximately 20 mg/dl daily, corroborating our model predictions. Our example highlights the utility of a systems-based modeling approach in risk assessment.
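
    The species-sensitivity argument - a longer rapid-growth period lets the same fractional inhibition of proliferation compound into a larger final deficit - can be illustrated with a toy exponential-growth calculation. Everything below (rates, durations, the 5% inhibition) is an invented placeholder, not a parameter from the published models.

        import math

        def final_neurons(n0, rate, days, inhibition):
            """Progenitor pool after exponential expansion at a division
            rate reduced by a fractional inhibition (toy model)."""
            return n0 * math.exp(rate * (1.0 - inhibition) * days)

        for species, days in (("rodent", 10), ("primate", 60)):
            base = final_neurons(1e6, 0.2, days, 0.00)
            dosed = final_neurons(1e6, 0.2, days, 0.05)  # 5% rate inhibition
            print(f"{species}: final deficit = {1 - dosed / base:.0%}")
        # the same 5% inhibition yields ~10% (rodent) vs ~45% (primate)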

  12. A Structural and Functional Assessment of the Lung via Multidetector-Row Computed Tomography

    PubMed Central

    Hoffman, Eric A.; Simon, Brett A.; McLennan, Geoffrey

    2006-01-01

    With advances in multidetector-row computed tomography (MDCT), it is now possible to image the lung in 10 s or less and accurately extract the lungs, lobes, and airway tree to the fifth- through seventh-generation bronchi and to regionally characterize lung density, texture, ventilation, and perfusion. These methods are now being used to phenotype the lung in health and disease and to gain insights into the etiology of pathologic processes. This article outlines the application of these methodologies with specific emphasis on chronic obstructive pulmonary disease. We demonstrate the use of our methods for assessing regional ventilation and perfusion and demonstrate early data that show, in a sheep model, a regionally intact hypoxic pulmonary vasoconstrictor (HPV) response with an apparent inhibition of HPV regionally in the presence of inflammation. We present the hypothesis that, in subjects with pulmonary emphysema, one major contributing factor leading to parenchymal destruction is the lack of a regional blunting of HPV when the regional hypoxia is related to regional inflammatory events (bronchiolitis or alveolar flooding). If maintaining adequate blood flow to inflamed lung regions is critical to the nondestructive resolution of inflammatory events, the pathologic condition whereby HPV is sustained in regions of inflammation would likely have its greatest effect in the lung apices where blood flow is already reduced in the upright body posture. PMID:16921136

  13. Criticality safety assessment of a TRIGA reactor spent-fuel pool under accident conditions

    SciTech Connect

    Glumac, B; Ravnik, M.; Logar, M.

    1997-02-01

    Additional criticality safety analysis of a pool-type storage for TRIGA spent fuel at the Jozef Stefan Institute in Ljubljana, Slovenia, is presented. Previous results have shown that subcriticality is not guaranteed for some postulated accidents (earthquake with subsequent fuel rack disintegration resulting in contact fuel pitch) under the assumption that the fuel rack is loaded with fresh 12 wt% standard fuel. To mitigate this deficiency, a study was done on replacing a certain number of fuel elements in the rack with cadmium-loaded absorber rods. The Monte Carlo computer code MCNP4A with an ENDF/B-V library and a detailed three-dimensional geometrical model of the spent-fuel rack was used for this purpose. First, a minimum critical number of fuel elements was determined for contact pitch, and two possible geometries of rack disintegration were considered. Next, it was shown that subcriticality can be ensured when pitch is decreased from the rack design pitch of 8 cm to contact if a certain number of fuel elements (8 to 20 out of 70) are replaced by absorber rods, which are uniformly mixed into the lattice. To account for the possibility that random mixing of fuel elements and absorber rods can occur during rack disintegration and result in a supercritical configuration, a probabilistic study was made to sample the probability density functions for random absorber rod lattice loadings. Results of the calculations show that reasonably low probabilities for supercriticality can be achieved (down to 10⁻⁶ per severe earthquake, which would result in rack disintegration and subsequent maximum possible pitch decrease) even in the case where fresh 12 wt% standard TRIGA fuel would be stored in the spent-fuel pool.
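
    The probabilistic step - estimating how often a random mixing of fuel and absorber rods produces a dangerous pattern - lends itself to a simple Monte Carlo illustration. The sketch below scores a random loading as unsafe when none of the absorber rods lands in a hypothetical tightly packed cluster of positions; this stand-in criterion replaces the MCNP k-eff calculations of the paper, and all counts are assumptions.

        import random

        N_POS, N_ABS = 70, 12        # rack positions, absorber rods (assumed)
        CLUSTER = set(range(9))      # hypothetical critical cluster of cells

        def unsafe(rng):
            rods = set(rng.sample(range(N_POS), N_ABS))
            return not (rods & CLUSTER)   # no absorber inside the cluster

        rng = random.Random(1)
        trials = 100_000
        hits = sum(unsafe(rng) for _ in range(trials))
        print(f"P(unsafe random loading) ~ {hits / trials:.3f}")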

  14. Sugaring the Pill: Assessing Rhetorical Strategies Designed to Minimize Defensive Reactions to Group Criticism

    ERIC Educational Resources Information Center

    Hornsey, Matthew J.; Robson, Erin; Smith, Joanne; Esposo, Sarah; Sutton, Robbie M.

    2008-01-01

    People are considerably more defensive in the face of group criticism when the criticism comes from an out-group rather than an in-group member (the intergroup sensitivity effect). We tested three strategies that out-group critics can use to reduce this heightened defensiveness. In all studies, Australians received criticism of their country…

  15. High Performance Computing Facility Operational Assessment, FY 2010 Oak Ridge Leadership Computing Facility

    SciTech Connect

    Bland, Arthur S Buddy; Hack, James J; Baker, Ann E; Barker, Ashley D; Boudwin, Kathlyn J.; Kendall, Ricky A; Messer, Bronson; Rogers, James H; Shipman, Galen M; White, Julia C

    2010-08-01

    Oak Ridge National Laboratory's (ORNL's) Cray XT5 supercomputer, Jaguar, kicked off the era of petascale scientific computing in 2008 with applications that sustained more than a thousand trillion floating point calculations per second - or 1 petaflop. Jaguar continues to grow even more powerful as it helps researchers broaden the boundaries of knowledge in virtually every domain of computational science, including weather and climate, nuclear energy, geosciences, combustion, bioenergy, fusion, and materials science. Their insights promise to broaden our knowledge in areas that are vitally important to the Department of Energy (DOE) and the nation as a whole, particularly energy assurance and climate change. The science of the 21st century, however, will demand further revolutions in computing, supercomputers capable of a million trillion calculations a second - 1 exaflop - and beyond. These systems will allow investigators to continue attacking global challenges through modeling and simulation and to unravel longstanding scientific questions. Creating such systems will also require new approaches to daunting challenges. High-performance systems of the future will need to be codesigned for scientific and engineering applications with best-in-class communications networks and data-management infrastructures and teams of skilled researchers able to take full advantage of these new resources. The Oak Ridge Leadership Computing Facility (OLCF) provides the nation's most powerful open resource for capability computing, with a sustainable path that will maintain and extend national leadership for DOE's Office of Science (SC). The OLCF has engaged a world-class team to support petascale science and to take a dramatic step forward, fielding new capabilities for high-end science. This report highlights the successful delivery and operation of a petascale system and shows how the OLCF fosters application development teams, developing cutting-edge tools and resources for next

  16. Cognitive Assessment of Movement-Based Computer Games

    NASA Technical Reports Server (NTRS)

    Kearney, Paul

    2008-01-01

    This paper examines the possibility that dance games such as Dance Dance Revolution or StepMania enhance the cognitive abilities that are critical to academic achievement. These games appear to place a high cognitive load on working memory, requiring the player to convert a visual signal to a physical movement up to 7 times per second. Players see a pattern of directions displayed on the screen and memorise these as a dance sequence. Other researchers have found that attention span and memory ability, both cognitive abilities required for academic achievement, are improved through the use of physical movement and exercise. This paper reviews these claims and documents tool development for the author's ongoing research.

  17. Assessment of metabolic bone diseases by quantitative computed tomography

    SciTech Connect

    Richardson, M.L.; Genant, H.K.; Cann, C.E.; Ettinger, B.; Gordan, G.S.; Kolb, F.O.; Reiser, U.J.

    1985-05-01

    Advances in the radiologic sciences have permitted the development of numerous noninvasive techniques for measuring the mineral content of bone, with varying degrees of precision, accuracy, and sensitivity. The techniques of standard radiography, radiogrammetry, photodensitometry, Compton scattering, neutron activation analysis, single and dual photon absorptiometry, and quantitative computed tomography (QCT) are described and reviewed in depth. Results from previous cross-sectional and longitudinal QCT investigations are given. The authors then describe a current investigation in which they studied 269 subjects, including 173 normal women, 34 patients with hyperparathyroidism, 24 patients with steroid-induced osteoporosis, and 38 men with idiopathic osteoporosis. Spinal quantitative computed tomography, radiogrammetry, and single photon absorptiometry were performed, and a spinal fracture index was calculated for all patients. The authors found a disproportionate loss of spinal trabecular mineral compared to appendicular mineral in the men with idiopathic osteoporosis and the patients with steroid-induced osteoporosis. They observed roughly equivalent mineral loss in both the appendicular and axial regions in the hyperparathyroid patients. The appendicular cortical measurements correlated moderately well with each other but less well with spinal trabecular QCT. The spinal fracture index correlated well with QCT and less well with the appendicular measurements.

  18. Assessing The Impact Of Computed Radiography And PACS

    NASA Astrophysics Data System (ADS)

    Hedgcock, Marcus W.; Kehr, Katherine

    1989-05-01

    Our institution (San Francisco VA Medical Center) is a VA pilot center for total digital imaging and PACS. Quantitative information about PACS impact on health care is limited because no centers have done rigorous preimplementation studies. We are gathering quantitative service delivery and cost data before, during, and after stepwise implementation of computed radiography and PACS at our institution to define the impact on imaging service delivery. We designed a simple audit method using the x-ray request and time clocks to determine patient waiting time, imaging time, film use, image availability to the radiologist, matching of current with previous images, image availability to clinicians, and time to final interpretation. Our department model is a multichannel, multiserver patient queue. Our current radiograph file is space-limited, containing only one year of images; older images are kept in a remote file area in another building. In addition, there are 16 subfile areas within the Radiology Service and the medical center. Our preimplementation audit showed some long waiting times (40 minutes, average 20) and immediate retrieval of prior films in only 42% of cases, with an average retrieval time of 22 hours. Computed radiography and the optical archive have the potential to improve these figures. The audit will be ongoing and automated as implementation of PACS progresses, to measure service improvement and learning curve with the new equipment. We present the audit format and baseline preimplementation figures.
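
    The department model named above is a multichannel, multiserver patient queue; a minimal sketch of the standard Erlang-C mean-wait calculation for such an M/M/c queue follows (illustrative parameters, not the audit's actual figures).

```python
from math import factorial

def erlang_c_wait(arrival_rate, service_rate, servers):
    """Mean time waiting in queue (same time units as the rates) for M/M/c."""
    a = arrival_rate / service_rate            # offered load (Erlangs)
    rho = a / servers                          # utilization, must be < 1
    if rho >= 1:
        raise ValueError("queue is unstable: utilization >= 1")
    # Erlang-C probability that an arriving patient must wait
    term_c = a**servers / (factorial(servers) * (1 - rho))
    denom = sum(a**k / factorial(k) for k in range(servers)) + term_c
    p_wait = term_c / denom
    return p_wait / (servers * service_rate - arrival_rate)

# Illustrative numbers only: 12 patients/hour, 5 exams/hour/room, 3 rooms
print(f"mean queue wait: {erlang_c_wait(12, 5, 3) * 60:.1f} minutes")
```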

  19. Defining a Computational Framework for the Assessment of ...

    EPA Pesticide Factsheets

    The Adverse Outcome Pathway (AOP) framework describes the effects of environmental stressors across multiple scales of biological organization and function. This includes an evaluation of the potential for each key event to occur across a broad range of species in order to determine the taxonomic applicability of each AOP. Computational tools are needed to facilitate this process. Recently, we developed a tool that uses sequence homology to evaluate the applicability of molecular initiating events across species (Lalone et al., Toxicol. Sci., 2016). To extend our ability to make computational predictions at higher levels of biological organization, we have created the AOPdb. This database links molecular targets associated with key events in the AOP-Wiki to publicly available data (e.g. gene-protein, pathway, species orthology, ontology, chemical, disease), including ToxCast assay information. The AOPdb combines different data types in order to characterize the impacts of chemicals on human health and the environment and serves as a decision support tool for case study development in the area of taxonomic applicability. As a proof of concept, the AOPdb allows identification of relevant molecular targets, biological pathways, and chemical and disease associations across species for four AOPs from the AOP-Wiki (https://aopwiki.org): Estrogen receptor antagonism leading to reproductive dysfunction (Aop:30); Aromatase inhibition leading to reproductive d

  20. A computer services program for residents of a continuing care retirement community: needs assessment and program design.

    PubMed

    Grad, Jane; Berdes, Celia

    2005-01-01

    Preparatory to establishing a computer services program for residents, Presbyterian Homes, a multi-campus continuing care retirement community, conducted an assessment of residents' computer needs, ownership, and usage. Based on the results of the needs assessment, computer resource rooms were established at each facility, with computer hardware and software adapted for the use of seniors. We also deliver adapted computer education for residents, including small-group training in basic software skills; classes on software for computer accessibility; and workshops on themes motivating computer use. Ongoing evaluation shows that half of residents make use of computer resources, and that their computer skills have opened the door to other educational opportunities.

  1. The Identification, Implementation, and Evaluation of Critical User Interface Design Features of Computer-Assisted Instruction Programs in Mathematics for Students with Learning Disabilities

    ERIC Educational Resources Information Center

    Seo, You-Jin; Woo, Honguk

    2010-01-01

    Critical user interface design features of computer-assisted instruction programs in mathematics for students with learning disabilities and corresponding implementation guidelines were identified in this study. Based on the identified features and guidelines, a multimedia computer-assisted instruction program, "Math Explorer", which delivers…

  2. Evaluation of perfusion index as a tool for pain assessment in critically ill patients.

    PubMed

    Hasanin, Ahmed; Mohamed, Sabah Abdel Raouf; El-Adawy, Akram

    2016-09-24

    Pain is a common and undertreated problem in critically ill patients. Pain assessment in critically ill patients is challenging and relies on complex scoring systems. The aim of this work was to find out the possible role of the perfusion index (PI) measured by a pulse oximeter (Masimo Radical 7; Masimo Corp., Irvine, CA, USA) in pain assessment in critically ill patients. A prospective observational study was carried out on 87 sedated non-intubated patients in a surgical intensive care unit. In addition to routine monitoring, a Masimo pulse oximeter probe was used for PI measurement. The sedation level of the patients was assessed using the Richmond Agitation-Sedation Scale (RASS). The pain intensity was determined by applying the behavioral pain scale for non-intubated (BPS-NI) patients. The PI, arterial blood pressure, heart rate, RASS, and BPS-NI values before and after the application of a standard painful stimulus (changing the patient position) were reported. Correlations between the PI and the other variables were computed at the two measurement points. Correlations between changes in the PI (delta PI) and changes in the hemodynamic variables, RASS, and BPS-NI were also computed. Changing the patient position resulted in a significant increase in SBP (128 ± 20 vs 120.4 ± 20.6, P = 0.009), DBP (71.3 ± 11.2 vs 68.7 ± 11.3, P = 0.021), heart rate (99.5 ± 19 vs 92.7 ± 18.2, P = 0.013), and BPS-NI (7[6-8] vs 3[3-3], P < 0.001) values and a significant decrease in the PI (1[0.5-1.9] vs 2.2[0.97-3.6], P < 0.001) value compared to the baseline readings. There was no correlation between the values of the PI and the ABP, BPS-NI, and RASS at the two measurements. A good correlation was found between the delta PI and delta BPS-NI (r = -0.616, P < 0.001). A weak correlation was observed between the PI and heart rate after the patient positioning (r = -0.249, P < 0.02). In surgical critically ill non-intubated patients, the application of a painful
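
    The study's key analytic step, correlating the change in PI with the change in BPS-NI, is a plain Pearson correlation; a minimal sketch on synthetic data (not the study's data) follows.

```python
import numpy as np
from scipy.stats import pearsonr

rng = np.random.default_rng(0)
pi_before = rng.uniform(0.5, 3.6, size=87)       # baseline perfusion index
bps_before = rng.integers(3, 5, size=87)         # baseline BPS-NI scores
# After a painful stimulus: PI tends to fall as BPS-NI rises (synthetic link)
bps_after = bps_before + rng.integers(2, 6, size=87)
pi_after = pi_before - 0.2 * (bps_after - bps_before) + rng.normal(0, 0.2, 87)

delta_pi = pi_after - pi_before
delta_bps = bps_after - bps_before
r, p = pearsonr(delta_pi, delta_bps)
print(f"r = {r:.3f}, p = {p:.3g}")   # expect a negative correlation
```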

  3. 48 CFR 52.223-16 - IEEE 1680 Standard for the Environmental Assessment of Personal Computer Products.

    Code of Federal Regulations, 2010 CFR

    2010-10-01

    ... Environmental Assessment of Personal Computer Products. 52.223-16 Section 52.223-16 Federal Acquisition... Assessment of Personal Computer Products. As prescribed in 23.706(b)(1), insert the following clause: IEEE 1680 Standard for the Environmental Assessment of Personal Computer Products (DEC 2007) (a)...

  4. 48 CFR 52.223-16 - IEEE 1680 Standard for the Environmental Assessment of Personal Computer Products.

    Code of Federal Regulations, 2011 CFR

    2011-10-01

    ... Environmental Assessment of Personal Computer Products. 52.223-16 Section 52.223-16 Federal Acquisition... Assessment of Personal Computer Products. As prescribed in 23.705(b)(1), insert the following clause: IEEE 1680 Standard for the Environmental Assessment of Personal Computer Products (DEC 2007) (a)...

  5. 48 CFR 52.223-16 - IEEE 1680 Standard for the Environmental Assessment of Personal Computer Products.

    Code of Federal Regulations, 2012 CFR

    2012-10-01

    ... Environmental Assessment of Personal Computer Products. 52.223-16 Section 52.223-16 Federal Acquisition... Assessment of Personal Computer Products. As prescribed in 23.705(b)(1), insert the following clause: IEEE 1680 Standard for the Environmental Assessment of Personal Computer Products (DEC 2007) (a)...

  6. 48 CFR 52.223-16 - IEEE 1680 Standard for the Environmental Assessment of Personal Computer Products.

    Code of Federal Regulations, 2013 CFR

    2013-10-01

    ... Environmental Assessment of Personal Computer Products. 52.223-16 Section 52.223-16 Federal Acquisition... Assessment of Personal Computer Products. As prescribed in 23.705(b)(1), insert the following clause: IEEE 1680 Standard for the Environmental Assessment of Personal Computer Products (DEC 2007) (a)...

  7. Neurophysiological assessment of brain dysfunction in critically ill patients: an update.

    PubMed

    Azabou, Eric; Fischer, Catherine; Guerit, Jean Michel; Annane, Djillali; Mauguiere, François; Lofaso, Fréderic; Sharshar, Tarek

    2017-01-21

    The aim of this review was to provide up-to-date information about the usefulness of clinical neurophysiology testing in the management of critically ill patients. Evoked potentials (EPs) and the electroencephalogram (EEG) are non-invasive clinical neurophysiology tools that allow objective assessment of central nervous system function at the bedside in the intensive care unit (ICU). These tests are useful for diagnosing cerebral complications and establishing the vital and functional prognosis in the ICU. EEG is particularly important for detecting seizure phenomena such as subclinical seizures and non-convulsive status epilepticus. Quantitative EEG (QEEG) analysis techniques, commonly called EEG brain mapping, provide topographic displays of digital EEG signal characteristics, showing the potential distribution over the entire scalp through filtering, frequency and amplitude analysis, and color mapping. Evidence of the usefulness of QEEG for seizure detection in the ICU is provided by several recent studies. Furthermore, beyond the detection of epileptic phenomena, changes in some QEEG panels are early warning indicators of sedation level as well as of brain damage or dysfunction in the ICU. EPs offer the opportunity to assess the functional integrity of the brainstem as well as of subcortical and cortical brain areas. Multimodal use, combining EEG with various EP modalities, is recommended since it allows more accurate functional exploration of the brain and helps caregivers tailor therapeutic measures to neurological worsening trends and anticipate the prognosis in the ICU.
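
    As a minimal illustration of one QEEG quantity mentioned above, the sketch below (synthetic signal, an editor's construction) computes per-band EEG power from a Welch power spectral density.

```python
import numpy as np
from scipy.signal import welch

fs = 256                                      # sampling rate (Hz)
t = np.arange(0, 30, 1 / fs)                  # 30 s of synthetic "EEG"
rng = np.random.default_rng(0)
eeg = np.sin(2 * np.pi * 10 * t) + 0.5 * rng.normal(size=t.size)  # 10 Hz alpha + noise

freqs, psd = welch(eeg, fs=fs, nperseg=2 * fs)
df = freqs[1] - freqs[0]

bands = {"delta": (0.5, 4), "theta": (4, 8), "alpha": (8, 13), "beta": (13, 30)}
for name, (lo, hi) in bands.items():
    mask = (freqs >= lo) & (freqs < hi)
    power = psd[mask].sum() * df              # integrate PSD over the band
    print(f"{name:5s} band power: {power:.3f}")   # alpha should dominate
```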

  8. An impact assessment and critical appraisal of the ISO standard for wheelchair vocabulary.

    PubMed

    Dolan, Michael J; Henderson, Graham I

    2013-07-01

    Wheelchairs are, for users, a primary means of mobility and an important means of performing activities of daily living. A common, accepted vocabulary is required to support and foster evidence-based practice and communication amongst professionals and with users. The international standard for wheelchair vocabulary, ISO 7176-26:2007, specifies terms and definitions with the purpose of eliminating confusion from the duplication or inappropriate use of terms. The aim of this study was to assess its impact and, based on that assessment, critically appraise the standard. Two databases were searched returning 189 and 283 unique articles with wheelchair in the title published between 2004-2006 and 2009-2011 respectively. Compliance, based on title and abstract usage, was poor, ranging from 0 to 50% correct usage, with no significant difference between pre- and post-publication. A review of prescription forms found only 9% correct usage. A survey of NHS wheelchair managers found that only 30% were positive that they had a copy despite 67% agreeing that the standard is important. The ISO wheelchair vocabulary standard was found not to be achieving its stated purpose. It is recommended that it be revised taking into account the findings of this study including the need for targeted dissemination and increased awareness.

  9. Life Cycle Assessment of Pavements: A Critical Review of Existing Literature and Research

    SciTech Connect

    Santero, Nicholas; Masanet, Eric; Horvath, Arpad

    2010-04-20

    This report provides a critical review of existing literature and modeling tools related to life-cycle assessment (LCA) applied to pavements. The review finds that pavement LCA is an expanding but still limited research topic in the literature, and that the existing body of work exhibits methodological deficiencies and incompatibilities that serve as barriers to the widespread utilization of LCA by pavement engineers and policy makers. This review identifies five key issues in the current body of work: inconsistent functional units, improper system boundaries, imbalanced data for asphalt and cement, use of limited inventory and impact assessment categories, and poor overall utility. This review also identifies common data and modeling gaps in pavement LCAs that should be addressed in future work. These gaps include: the use phase (rolling resistance, albedo, carbonation, lighting, leachate, and tire wear and emissions), asphalt fumes, feedstock energy of bitumen, traffic delay, the maintenance phase, and the end-of-life phase. This review concludes with a comprehensive list of recommendations for future research, which shed light on where improvements in knowledge can be made that will benefit the accuracy and comprehensiveness of pavement LCAs moving forward.

  10. Professional nursing practice in critical units: assessment of work environment characteristics

    PubMed Central

    Maurício, Luiz Felipe Sales; Okuno, Meiry Fernanda Pinto; Campanharo, Cássia Regina Vancini; Lopes, Maria Carolina Barbosa Teixeira; Belasco, Angélica Gonçalves Silva; Batista, Ruth Ester Assayag

    2017-01-01

    ABSTRACT Objective: assess the autonomy, control over the environment, and organizational support of nurses' work process, and the relationships between physicians and nurses in critical care units. Method: cross-sectional study conducted with 162 nurses working in the intensive care units and emergency service of a university hospital. The workers' satisfaction with their work environment was assessed using the Brazilian Nursing Work Index - Revised, translated and adapted for the Brazilian culture. Results: average age was 31.6 ± 3.9 years; 80.2% were women; 68.5% were Caucasians and 71.6% worked in intensive care units. The nurses considered autonomy (2.38 ± 0.64) and their relationship with physicians (2.24 ± 0.62) to be characteristics of the work environment that favored professional practice. Control over the environment (2.78 ± 0.62) and organizational support (2.51 ± 0.54), however, were considered unfavorable. No statistically significant differences were found between the units based on the scores obtained by the professionals on the Brazilian Nursing Work Index - Revised. Conclusion: autonomy and the relationship between physicians and nurses were considered characteristics that favored nurses' professional practice, whereas control over the environment and organizational support were considered unfavorable. PMID:28301034

  11. Critical parameters of a noise model that affect fault tolerant quantum computation on a single qubit

    NASA Astrophysics Data System (ADS)

    Iyer, Pavithran; da Silva, Marcus P.; Poulin, David

    In this work, we aim to determine the parameters of a single qubit channel that can tightly bound the logical error rate of the Steane code. We do not assume any a priori structure for the quantum channel, except that it is a CPTP map and we use a concatenated Steane code to encode a single qubit. Unlike the standard Monte Carlo technique that requires many iterations to estimate the logical error rate with sufficient accuracy, we use techniques to compute the complete effect of a physical CPTP map, at the logical level. Using this, we have studied the predictive power of several physical noise metrics on the logical error rate, and show, through numerical simulations with random quantum channels, that, on their own, none of the natural physical metrics lead to accurate predictions about the logical error rate. We then show how machine learning techniques help us to explore which features of a random quantum channel are important in predicting its logical error rate.
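
    As a concrete example of the kind of physical noise metric examined above, the sketch below (an editor's construction, not the authors' code) draws a random single-qubit CPTP map from a random Stinespring isometry and computes its average gate fidelity via the standard entanglement-fidelity relation.

```python
import numpy as np

rng = np.random.default_rng(1)
d, n_kraus = 2, 4   # single-qubit channel with 4 Kraus operators

# Random isometry V: C^d -> C^(d*n_kraus) via QR of a complex Gaussian matrix
g = rng.normal(size=(d * n_kraus, d)) + 1j * rng.normal(size=(d * n_kraus, d))
v, _ = np.linalg.qr(g)
kraus = [v[k * d:(k + 1) * d, :] for k in range(n_kraus)]

# Sanity check: completeness, sum_k K_k^dag K_k = I
assert np.allclose(sum(k.conj().T @ k for k in kraus), np.eye(d))

# Entanglement fidelity F_e = (1/d^2) sum_k |Tr K_k|^2, and the standard
# relation F_avg = (d F_e + 1) / (d + 1)
f_e = sum(abs(np.trace(k)) ** 2 for k in kraus) / d**2
f_avg = (d * f_e + 1) / (d + 1)
print(f"average gate fidelity: {f_avg:.4f}")
```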

  12. Computational and experimental analysis of TMS-induced electric field vectors critical to neuronal activation

    NASA Astrophysics Data System (ADS)

    Krieg, Todd D.; Salinas, Felipe S.; Narayana, Shalini; Fox, Peter T.; Mogul, David J.

    2015-08-01

    Objective. Transcranial magnetic stimulation (TMS) represents a powerful technique to noninvasively modulate cortical neurophysiology in the brain. However, the relationship between the magnetic fields created by TMS coils and neuronal activation in the cortex is still not well-understood, making predictable cortical activation by TMS difficult to achieve. Our goal in this study was to investigate the relationship between induced electric fields and cortical activation measured by blood flow response. Particularly, we sought to discover the E-field characteristics that lead to cortical activation. Approach. Subject-specific finite element models (FEMs) of the head and brain were constructed for each of six subjects using magnetic resonance image scans. Positron emission tomography (PET) measured each subject’s cortical response to image-guided robotically-positioned TMS to the primary motor cortex. FEM models that employed the given coil position, orientation, and stimulus intensity in experimental applications of TMS were used to calculate the electric field (E-field) vectors within a region of interest for each subject. TMS-induced E-fields were analyzed to better understand what vector components led to regional cerebral blood flow (CBF) responses recorded by PET. Main results. This study found that decomposing the E-field into orthogonal vector components based on the cortical surface geometry (and hence, cortical neuron directions) led to significant differences between the regions of cortex that were active and nonactive. Specifically, active regions had significantly higher E-field components in the normal inward direction (i.e., parallel to pyramidal neurons in the dendrite-to-axon orientation) and in the tangential direction (i.e., parallel to interneurons) at high gradient. In contrast, nonactive regions had higher E-field vectors in the outward normal direction suggesting inhibitory responses. Significance. These results provide critical new
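
    The key operation described above, splitting an induced E-field vector into components normal and tangential to the cortical surface, reduces to simple vector algebra; a minimal sketch with hypothetical values follows.

```python
import numpy as np

def decompose_efield(e_field, inward_normal):
    """Return (normal component, tangential magnitude) of an E-field vector."""
    n = inward_normal / np.linalg.norm(inward_normal)
    e_normal = np.dot(e_field, n)                 # > 0 means pointing inward
    e_tangential = np.linalg.norm(e_field - e_normal * n)
    return e_normal, e_tangential

e = np.array([30.0, -5.0, 12.0])   # V/m, e.g. sampled from an FEM solution
n_in = np.array([0.0, 0.0, 1.0])   # inward surface normal at that point
print(decompose_efield(e, n_in))   # -> (12.0, ~30.4)
```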

  13. Single-molecule protein sequencing through fingerprinting: computational assessment

    NASA Astrophysics Data System (ADS)

    Yao, Yao; Docter, Margreet; van Ginkel, Jetty; de Ridder, Dick; Joo, Chirlmin

    2015-10-01

    Proteins are vital in all biological systems as they constitute the main structural and functional components of cells. Recent advances in mass spectrometry have brought the promise of complete proteomics by helping draft the human proteome. Yet, this commonly used protein sequencing technique has fundamental limitations in sensitivity. Here we propose a method for single-molecule (SM) protein sequencing. A major challenge lies in the fact that proteins are composed of 20 different amino acids, which demands 20 molecular reporters. We computationally demonstrate that it suffices to measure only two types of amino acids to identify proteins and suggest an experimental scheme using SM fluorescence. When achieved, this highly sensitive approach will result in a paradigm shift in proteomics, with major impact in the biological and medical sciences.
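
    A toy illustration of the two-amino-acid fingerprinting idea (an editor's construction, not the authors' algorithm): reduce each protein to the ordered pattern of just two residue types and look the observed pattern up in a proteome index.

```python
# Identify a protein from the ordered pattern of only cysteines (C)
# and lysines (K) along its sequence.
def fingerprint(seq, letters=("C", "K")):
    return "".join(a for a in seq if a in letters)

proteome = {                      # toy proteome
    "prot_A": "MKTACLLIKACK",
    "prot_B": "MATCCKLLQRSC",
    "prot_C": "MKKLAQCRSTIK",
}
index = {}
for name, seq in proteome.items():
    index.setdefault(fingerprint(seq), []).append(name)

observed = fingerprint("MKTACLLIKACK")   # as read off a fluorescence trace
print(index.get(observed, "no match"))   # -> ['prot_A'] if the pattern is unique
```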

  14. Assessment of feline abdominal adipose tissue using computed tomography.

    PubMed

    Lee, Hyeyeon; Kim, Mieun; Choi, Mihyun; Lee, Namsoon; Chang, Jinhwa; Yoon, Junghee; Choi, Mincheol

    2010-12-01

    Obesity is a common nutritional disorder in cats and it increases the risk factors for various diseases. The aim of this study was to suggest a method for the evaluation of feline obesity using computed tomography. The attenuation range from -156 to -106 HU was determined to represent feline abdominal adipose tissue. With this range, total (TAT), visceral (VAT) and subcutaneous (SAT) adipose tissues were measured. The best correlation between the adipose tissue in a single cross-sectional image and that of the entire abdominal volume was obtained at the L3 and L5 levels. The mean VAT/SAT ratio was 1.18±0.32, which was much higher than in humans. The cats with an overweight body condition had a significantly lower VAT/SAT ratio than cats with an ideal body condition. This technique may contribute to both the clinical diagnosis and the experimental study of feline obesity.
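
    A minimal sketch (synthetic volume, toy anatomy) of the measurement described above: mask voxels inside the adipose attenuation range and form a VAT/SAT ratio from visceral and subcutaneous region masks.

```python
import numpy as np

FAT_RANGE = (-156, -106)   # HU range for feline adipose tissue (per abstract)

rng = np.random.default_rng(0)
ct = rng.integers(-300, 200, size=(64, 64, 64))      # fake HU volume
visceral_mask = np.zeros(ct.shape, dtype=bool)
visceral_mask[16:48, 16:48, :] = True                # toy anatomy split
subcut_mask = ~visceral_mask                         # everything else

fat = (ct >= FAT_RANGE[0]) & (ct <= FAT_RANGE[1])
vat = np.count_nonzero(fat & visceral_mask)          # voxel counts ~ volume
sat = np.count_nonzero(fat & subcut_mask)
print(f"VAT/SAT = {vat / sat:.2f}")
```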

  15. Radiological Assessment of Bioengineered Bone in a Muscle Flap for the Reconstruction of Critical-Size Mandibular Defect

    PubMed Central

    Al-Fotawei, Randa; Ayoub, Ashraf F.; Heath, Neil; Naudi, Kurt B.; Tanner, K. Elizabeth; Dalby, Matthew J.; McMahon, Jeremy

    2014-01-01

    This study presents a comprehensive radiographic evaluation of bone regeneration within a pedicled muscle flap for the reconstruction of a critical-size mandibular defect. The surgical defect (20 mm×15 mm) was created in the mandible of ten experimental rabbits. The masseter muscle was adapted to fill the surgical defect, and a combination of calcium sulphate/hydroxyapatite cement (CERAMENT™|SPINE SUPPORT), BMP-7 and rabbit mesenchymal stromal cells (rMSCs) was injected inside the muscle tissue. Radiographic assessment was carried out on the day of surgery and at 4, 8, and 12 weeks postoperatively. At 12 weeks, the animals were sacrificed and cone beam computed tomography (CBCT) scanning and micro-computed tomography (µ-CT) were carried out. Clinically, a clear layer of bone tissue was identified closely adherent to the border of the surgical defect. Sporadic radio-opaque areas within the surgical defect were detected radiographically. In comparison with the opposite non-operated control side, the estimated quantitative scoring of the radio-opacity was 46.6% ±15, and the mean volume of the radio-opaque areas was 63.4% ±20. Areas of a bone density higher than that of the mandibular bone (+35% ±25%) were detected at the borders of the surgical defect. The micro-CT analysis revealed thinner trabeculae of the regenerated bone with a more condensed trabecular pattern than the surrounding native bone. These findings suggest a rapid deposition rate of the mineralised tissue and an active remodelling process of the newly regenerated bone within the muscle flap. The novel surgical model of this study has potential clinical application; the assessment of bone regeneration using the presented radiographic protocol is descriptive and comprehensive. The findings of this research confirm the remarkable potential of local muscle flaps as local bioreactors to induce bone formation for reconstruction of maxillofacial bony defects. PMID:25226170

  16. A Comparative Assessment of Computer Literacy of Private and Public Secondary School Students in Lagos State, Nigeria

    ERIC Educational Resources Information Center

    Osunwusi, Adeyinka Olumuyiwa; Abifarin, Michael Segun

    2013-01-01

    The aim of this study was to conduct a comparative assessment of computer literacy of private and public secondary school students. Although the definition of computer literacy varies widely, this study treated computer literacy in terms of access to, and use of, computers and the internet, basic knowledge and skills required to use computers and…

  17. Computer-Based Assessment: Can It Deliver on Its Promise? Knowledge Brief.

    ERIC Educational Resources Information Center

    Rabinowitz, Stanley; Brandt, Tamara

    Computer-based assessment appears to offer the promise of radically improving both how assessments are implemented and the quality of the information they can deliver. However, as many states consider whether to embrace this new technology, serious concerns remain about the fairness of the new systems and the readiness of states (and districts and…

  18. Evaluation of a Computer-Controlled Videodisc Program To Teach Pediatric Neuromotor Assessment. Revised.

    ERIC Educational Resources Information Center

    Huntley, Joan Sustik; And Others

    To assess the instructional effectiveness and user acceptance of a computer-controlled videodisc program designed to teach medical students to recognize and assess motor dysfunction in infants, an experimental group of third year medical students (N=65) were instructed using the videodisc program; a corresponding control group (N=70) did not use…

  19. Assessing Visibility, Legibility and Comprehension for Interactive Whiteboards (IWBS) vs. Computers

    ERIC Educational Resources Information Center

    Megalakaki, Olga; Aparicio, Xavier; Porion, Alexandre; Pasqualotti, Léa; Baccino, Thierry

    2016-01-01

    The usability of interactive whiteboards vs. computers was evaluated on three dimensions (visibility, legibility and comprehension) in secondary school pupils. The visibility assessment consisted of detecting a visual stimulus varying in luminance using a staircase procedure, legibility was assessed with a target-search task, and we…

  20. Supporting Student Learning: The Use of Computer-Based Formative Assessment Modules.

    ERIC Educational Resources Information Center

    Peat, Mary; Franklin, Sue

    2002-01-01

    Describes the development of a variety of computer-based assessment opportunities, both formative and summative, that are available to a large first-year biology class at the University of Sydney (Australia). Discusses online access to weekly quizzes, a mock exam, and special self-assessment modules that are beneficial to student learning.…

  1. Development of an Integrated Computer-Based Assessment System for Early Childhood Programs

    ERIC Educational Resources Information Center

    Bennett, Deborah E.; Arvidson, Helen H.; Giorgetti, Karen

    2004-01-01

    This article describes the development of a computer-based assessment system for children in early childhood programs, The Indiana Assessment System of Educational Proficiencies: Early Childhood (IASEP: EC). Skills in five developmental domains (i.e., cognitive, communication, social, sensory motor, and self-help) were selected and content…

  2. Staff and Student Perceptions of Computer-Assisted Assessment for Physiology Practical Classes

    ERIC Educational Resources Information Center

    Sheader, Elizabeth; Gouldsborough, Ingrid; Grady, Ruth

    2006-01-01

    Effective assessment of laboratory practicals is a challenge for large-size classes. To reduce the administrative burden of staff members without compromising the student learning experience, we utilized dedicated computer software for short-answer question assessment for nearly 300 students and compared it with the more traditional, paper-based…

  3. Interactive Computer Based Assessment Tasks: How Problem-Solving Process Data Can Inform Instruction

    ERIC Educational Resources Information Center

    Zoanetti, Nathan

    2010-01-01

    This article presents key steps in the design and analysis of a computer based problem-solving assessment featuring interactive tasks. The purpose of the assessment is to support targeted instruction for students by diagnosing strengths and weaknesses at different stages of problem-solving. The first focus of this article is the task piloting…

  4. Synchronous Computer-Mediated Dynamic Assessment: A Case Study of L2 Spanish Past Narration

    ERIC Educational Resources Information Center

    Darhower, Mark Anthony

    2014-01-01

    In this study, dynamic assessment is employed to help understand the developmental processes of two university Spanish learners as they produce a series of past narrations in a synchronous computer mediated environment. The assessments were conducted in six weekly one-hour chat sessions about various scenes of a Spanish language film. The analysis…

  5. Formative Computer-Based Assessment in Higher Education: The Effectiveness of Feedback in Supporting Student Learning

    ERIC Educational Resources Information Center

    Miller, Tess

    2009-01-01

    A formative computer-based assessment (CBA) was one of three instruments used for assessment in a Bachelor of Education course at Queen's University (Ontario, Canada) with an enrolment of approximately 700 students. The formative framework fostered a self-regulated learning environment whereby feedback on the CBA was used to support rather than…

  6. The physiological and emotional effects of touch: Assessing a hand-massage intervention with high self-critics.

    PubMed

    Maratos, Frances A; Duarte, Joana; Barnes, Christopher; McEwan, Kirsten; Sheffield, David; Gilbert, Paul

    2017-01-25

    Research demonstrates that highly self-critical individuals can respond negatively to the initial introduction of a range of therapeutic interventions. Yet touch as a form of therapeutic intervention in self-critical individuals has received limited prior investigation, despite documentation of its beneficial effects for well-being. Using the Forms of Self-Criticism/Self-Reassuring Scale, 15 high- and 14 low-self-critical individuals (from a sample of 139 females) were recruited to assess how self-criticism impacts upon a single instance of focused touch. All participants took part in a hand-massage and a haptic-control intervention. Salivary cortisol and alpha amylase, as well as questionnaire measures of emotional responding, were taken before and after the interventions. Following hand massage, analyses revealed that cortisol decreased significantly across all participants, and that significant changes in emotional responding reflected well-being improvements across all participants. Supplementary analyses further revealed decreased alpha amylase responding to hand massage as compared to a compassion-focused intervention in the same (highly self-critical) individuals. Taken together, the physiological and emotional data indicate that high self-critical individuals responded in a manner comparable to low self-critical individuals to a single instance of hand massage. This highlights that focused touch may be beneficial when first engaging highly self-critical individuals with specific interventions.

  7. The critical role of culture and environment as determinants of women's participation in computer science

    NASA Astrophysics Data System (ADS)

    Frieze, Carol

    This thesis proposes the need for, and illustrates, a new approach to how we think about, and act on, issues relating to women's participation, or lack of participation, in computer science (CS). This approach is based on a cultural perspective arguing that many of the reasons for women entering---or not entering---CS programs have little to do with gender and a lot to do with environment and culture. Evidence for this approach comes primarily from a qualitative research study, which shows the effects of changes in the micro-culture on CS undergraduates at Carnegie Mellon, and from studies of other cultural contexts that illustrate a "Women-CS fit". We also discuss the interventions that have been crucial to the evolution of this specific micro-culture. Our argument goes against the grain of many gender and CS studies which conclude that the reasons for women's low participation in CS are based in gender, particularly in gender differences in how men and women relate to the field. Such studies tend to focus on gender differences and recommend accommodating (what are perceived to be) women's different ways of relating to CS. This is often interpreted as contextualizing the curriculum to make it "female-friendly". The CS curriculum at Carnegie Mellon was not contextualized to be "female-friendly". Nevertheless, over the past few years, the school has attracted and graduated well above the US national average for women in undergraduate CS programs. We argue that this is due in large part to changes in the culture and environment of the department. As the environment has shifted from an unbalanced to a more balanced environment (balanced in terms of gender, breadth of student personalities, and professional support for women) the way has been opened for a range of students, including a significant number of women, to participate, and be successful, in the CS major. Our research shows that as men and women inhabit, and participate in, a more balanced environment

  8. Computational Modeling and Assessment Of Nanocoatings for Ultra Supercritical Boilers

    SciTech Connect

    David W. Gandy; John P. Shingledecker

    2011-04-11

    Forced outages and boiler unavailability in conventional coal-fired fossil power plants are most often caused by fireside corrosion of boiler waterwalls. Industry-wide, the rate of wall thickness corrosion wastage of fireside waterwalls in fossil-fired boilers has been of concern for many years. It is significant that the introduction of nitrogen oxide (NOx) emission controls with staged burner systems has increased reported waterwall wastage rates to as much as 120 mils (3 mm) per year. Moreover, the reducing environment produced by the low-NOx combustion process is the primary cause of accelerated corrosion rates of waterwall tubes made of carbon and low alloy steels. Improved coatings, such as the MCrAl nanocoatings evaluated here (where M is Fe, Ni, and Co), are needed to reduce or eliminate waterwall damage in subcritical, supercritical, and ultra-supercritical (USC) boilers. The first two tasks of this six-task project, jointly sponsored by EPRI and the U.S. Department of Energy (DE-FC26-07NT43096), focused on computational modeling of an advanced MCrAl nanocoating system and evaluation of two nanocrystalline (iron and nickel base) coatings, which will significantly improve the corrosion and erosion performance of tubing used in USC boilers. The computational model results showed that about 40 wt.% Ni is required in Fe-based nanocrystalline coatings for long-term durability, leading to a coating composition of Fe-25Cr-40Ni-10 wt.% Al. In addition, the long-term thermal exposure test results further showed accelerated inward diffusion of Al from the nanocrystalline coatings into the substrate. In order to enhance the durability of these coatings, it is necessary to develop a diffusion barrier interlayer coating such as TiN and/or AlN. The third task, 'Process Advanced MCrAl Nanocoating Systems', of the six-task project jointly sponsored by the Electric Power Research Institute (EPRI) and the U.S. Department of Energy (DE-FC26-07NT43096), has focused on processing of

  9. Computational assessment of several hydrogen-free high energy compounds.

    PubMed

    Tan, Bisheng; Huang, Ming; Long, Xinping; Li, Jinshan; Fan, Guijuan

    2016-01-01

    Tetrazino-tetrazine-tetraoxide (TTTO) is an attractive high energy compound, but unfortunately, it has not been experimentally synthesized so far. Isomerization of TTTO leads to five isomers. Bond-separation energies were employed to compare the global stability of the six compounds; isomer 1 was found to have the highest bond-separation energy (1204.6 kJ/mol), compared with TTTO (1151.2 kJ/mol). Thermodynamic properties of the six compounds were theoretically calculated, including standard formation enthalpies (solid and gaseous), standard fusion, vaporization, and sublimation enthalpies, lattice energies, normal melting points, and normal boiling points. Their detonation performances were also computed, including detonation heat (Q, J/g), detonation velocity (D, km/s), detonation pressure (P, GPa) and impact sensitivity (h50, cm). Compared with TTTO (Q = 1311.01 J/g, D = 9.228 km/s, P = 40.556 GPa, h50 = 12.7 cm), isomer 5 exhibits better detonation performance (Q = 1523.74 J/g, D = 9.389 km/s, P = 41.329 GPa, h50 = 28.4 cm).
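
    Detonation velocity and pressure figures like those above are conventionally estimated with the Kamlet-Jacobs empirical relations; whether this paper used exactly that scheme is an assumption, and the inputs below are illustrative, not TTTO's actual values.

```python
def kamlet_jacobs(n_gas, m_gas, q_cal, rho):
    """Kamlet-Jacobs estimates of detonation performance.

    n_gas: mol of gaseous detonation products per g of explosive
    m_gas: mean molar mass of the gaseous products (g/mol)
    q_cal: detonation heat (cal/g; the original K-J fit uses calories)
    rho:   loading density (g/cm^3)
    Returns (D in km/s, P in GPa).
    """
    phi = n_gas * m_gas**0.5 * q_cal**0.5
    d = 1.01 * phi**0.5 * (1 + 1.30 * rho)   # detonation velocity, km/s
    p = 1.558 * rho**2 * phi                 # detonation pressure, GPa
    return d, p

# Illustrative inputs for a dense high-nitrogen explosive
d, p = kamlet_jacobs(n_gas=0.034, m_gas=29.0, q_cal=1500.0, rho=1.9)
print(f"D = {d:.2f} km/s, P = {p:.1f} GPa")
```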

  10. Assessment of toxic metals in waste personal computers.

    PubMed

    Kolias, Konstantinos; Hahladakis, John N; Gidarakos, Evangelos

    2014-08-01

    Considering the enormous production of waste personal computers nowadays, the study of their composition is necessary in order to regulate their management and prevent environmental contamination caused by their inappropriate disposal. This study aimed at determining the toxic metals content of motherboards (printed circuit boards), monitor glass and monitor plastic housing of two Cathode Ray Tube (CRT) monitors, three Liquid Crystal Display (LCD) monitors, one LCD touch screen monitor and six motherboards, all of which were discarded. In addition, concentrations of chromium (Cr), cadmium (Cd), lead (Pb) and mercury (Hg) were compared with the respective limits set by the RoHS 2002/95/EC Directive, which was recently renewed by the 2012/19/EU recast, in order to verify manufacturers' compliance with the regulation. The research included disassembly, pulverization, digestion and chemical analyses of all the aforementioned devices. The toxic metals content of all samples was determined using Inductively Coupled Plasma-Mass Spectrometry (ICP-MS). The results demonstrated that concentrations of Pb in motherboards and funnel glass of devices with release dates before 2006, that is when the RoHS Directive came into force, exceeded the permissible limit. In general, except for Pb, higher metal concentrations were detected in motherboards in comparison with plastic housing and glass samples. Finally, the results of this work were encouraging, since concentrations of the metals referred to in the RoHS Directive were found at lower levels than the legislative limits.

  11. Assessing Computational Methods of Cis-Regulatory Module Prediction

    PubMed Central

    Su, Jing; Teichmann, Sarah A.; Down, Thomas A.

    2010-01-01

    Computational methods attempting to identify instances of cis-regulatory modules (CRMs) in the genome face a challenging problem of searching for potentially interacting transcription factor binding sites while knowledge of the specific interactions involved remains limited. Without a comprehensive comparison of their performance, the reliability and accuracy of these tools remain unclear. Faced with a large number of different tools that address this problem, we summarized and categorized them based on search strategy and input data requirements. Twelve representative methods were chosen and applied to predict CRMs from the Drosophila CRM database REDfly, and across the human ENCODE regions. Our results show that the optimal choice of method varies depending on species and composition of the sequences in question. When discriminating CRMs from non-coding regions, those methods considering evolutionary conservation have a stronger predictive power than methods designed to be run on a single genome. Different CRM representations and search strategies rely on different CRM properties, and different methods can complement one another. For example, some favour homotypical clusters of binding sites, while others perform best on short CRMs. Furthermore, most methods appear to be sensitive to the composition and structure of the genome to which they are applied. We analyze the principal features that distinguish the methods that performed well, identify weaknesses leading to poor performance, and provide a guide for users. We also propose key considerations for the development and evaluation of future CRM-prediction methods. PMID:21152003

  12. Cone Beam Computed Tomographic Assessment of Bifid Mandibular Condyle

    PubMed Central

    Khojastepour, Leila; Kolahi, Shirin; Panahi, Nazi

    2015-01-01

    Objectives: Differential diagnosis of bifid mandibular condyle (BMC) is important, since it may play a role in temporomandibular joint (TMJ) dysfunction and joint symptoms. In addition, the radiographic appearance of BMC may mimic tumors and/or fractures. The aim of this study was to evaluate the prevalence and orientation of BMC based on cone beam computed tomography (CBCT) scans. Materials and Methods: This cross-sectional study was performed on CBCT scans of the paranasal sinuses of 425 patients. In a designated NNT station, all CBCT scans were evaluated in the axial, coronal and sagittal planes to determine the frequency of BMC. The condylar head horizontal angulations were also determined in the transverse plane. The t-test was used to compare the frequency of BMC between the left and right sides and between males and females. Results: In total, 309 patients with acceptable visibility of the condyles on CBCT scans were included in the study, consisting of 170 (55%) females and 139 (45%) males with a mean age of 39.43±9.7 years. BMC was detected in 14 cases (4.53%). Differences between males and females, between sides, and in the horizontal angulations of the condyle between normal and BMC cases were not significant. Conclusion: The prevalence of BMC in the studied population was 4.53%. No significant difference was observed between males and females, sides, or horizontal angulations of the involved and uninvolved condyles. PMID:27559345

  13. Computational Assessment of the Aerodynamic Performance of a Variable-Speed Power Turbine for Large Civil Tilt-Rotor Application

    NASA Technical Reports Server (NTRS)

    Welch, Gerard E.

    2011-01-01

    The main rotors of the NASA Large Civil Tilt-Rotor notional vehicle operate over a wide speed range, from 100% at take-off to 54% at cruise. The variable-speed power turbine offers one approach by which to effect this speed variation. Key aero-challenges include high work factors at cruise and wide (40 to 60 deg.) incidence variations in blade and vane rows over the speed range. The turbine design approach must optimize cruise efficiency and minimize off-design penalties at take-off. The accuracy of the off-design incidence loss model is therefore critical to the turbine design. In this effort, 3-D computational analyses are used to assess the variation of turbine efficiency with speed change. The conceptual design of a 4-stage variable-speed power turbine for the Large Civil Tilt-Rotor application is first established at the meanline level. The design of 2-D airfoil sections and the resulting 3-D blade and vane rows is documented. Three-dimensional Reynolds-Averaged Navier-Stokes computations are used to assess the design and off-design performance of an embedded 1.5-stage portion (Rotor 1, Stator 2, and Rotor 2) of the turbine. The 3-D computational results yield the same efficiency-versus-speed trends predicted by meanline analyses, supporting the design choice to execute the turbine design at the cruise operating speed.

  14. High Performance Computing Facility Operational Assessment, CY 2011 Oak Ridge Leadership Computing Facility

    SciTech Connect

    Baker, Ann E; Barker, Ashley D; Bland, Arthur S Buddy; Boudwin, Kathlyn J.; Hack, James J; Kendall, Ricky A; Messer, Bronson; Rogers, James H; Shipman, Galen M; Wells, Jack C; White, Julia C; Hudson, Douglas L

    2012-02-01

    Oak Ridge National Laboratory's Leadership Computing Facility (OLCF) continues to deliver the most powerful resources in the U.S. for open science. At 2.33 petaflops peak performance, the Cray XT Jaguar delivered more than 1.4 billion core hours in calendar year (CY) 2011 to researchers around the world for computational simulations relevant to national and energy security; advancing the frontiers of knowledge in physical sciences and areas of biological, medical, environmental, and computer sciences; and providing world-class research facilities for the nation's science enterprise. Users reported more than 670 publications this year arising from their use of OLCF resources. Of these, we report in this review the 300 that are consistent with the guidance provided. Scientific achievements by OLCF users cut across all range scales from atomic to molecular to large-scale structures. At the atomic scale, researchers discovered that the anomalously long half-life of Carbon-14 can be explained by calculating, for the first time, the very complex three-body interactions between all the neutrons and protons in the nucleus. At the molecular scale, researchers combined experimental results from LBL's light source and simulations on Jaguar to discover how DNA replication continues past a damaged site so a mutation can be repaired later. Other researchers combined experimental results from ORNL's Spallation Neutron Source and simulations on Jaguar to reveal the molecular structure of ligno-cellulosic material used in bioethanol production. This year, Jaguar has been used to perform billion-cell CFD calculations to develop shock wave compression turbo machinery as a means to meet DOE goals for reducing carbon sequestration costs. General Electric used Jaguar to calculate the unsteady flow through turbo machinery to learn what efficiencies the traditional steady flow assumption is hiding from designers. Even a 1% improvement in turbine design can save the nation billions of gallons of

  15. Towards a dynamic assessment of raw materials criticality: linking agent-based demand--with material flow supply modelling approaches.

    PubMed

    Knoeri, Christof; Wäger, Patrick A; Stamp, Anna; Althaus, Hans-Joerg; Weil, Marcel

    2013-09-01

    Emerging technologies such as information and communication, photovoltaic, or battery technologies are expected to increase significantly the demand for scarce metals in the near future. The recently developed methods to evaluate the criticality of mineral raw materials typically provide a 'snapshot' of the criticality of a certain material at one point in time by using static indicators both for supply risk and for the impacts of supply restrictions. While allowing for insights into the mechanisms behind the criticality of raw materials, these methods cannot account for dynamic changes in products and/or activities over time. In this paper we propose a conceptual framework intended to overcome these limitations by including the dynamic interactions between different possible demand and supply configurations. The framework integrates an agent-based behaviour model, where demand emerges from individual agent decisions and interaction, into a dynamic material flow model, representing the materials' stocks and flows. Within the framework, the environmental implications of substitution decisions are evaluated by applying life-cycle assessment methodology. The approach makes a first step towards a dynamic criticality assessment and will enhance the understanding of industrial substitution decisions and environmental implications related to critical metals. We discuss the potential and limitations of such an approach in contrast to state-of-the-art methods and how it might lead to criticality assessments tailored to the specific circumstances of single industrial sectors or individual companies.
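
    A minimal sketch (an editor's construction, far simpler than the paper's framework) of the coupling described above: agents decide each period whether to adopt a metal-bearing technology, adoption draws on a finite material stock, and growing scarcity feeds back into the agents' decisions.

```python
import random

random.seed(0)
N_AGENTS, YEARS = 1000, 20
stock = 5000.0                   # remaining primary supply (tonnes)
metal_per_unit = 0.5             # tonnes of critical metal per adoption

adopted = [False] * N_AGENTS
for year in range(YEARS):
    scarcity = max(0.0, 1 - stock / 5000.0)         # 0 = plentiful, 1 = exhausted
    demand = 0.0
    for i in range(N_AGENTS):
        # agents adopt with a base propensity, deterred by scarcity
        if not adopted[i] and random.random() < 0.1 * (1 - scarcity):
            adopted[i] = True
            demand += metal_per_unit
    stock = max(0.0, stock - demand)                # material flow update
    print(f"year {year:2d}: demand {demand:6.1f} t, stock {stock:7.1f} t")
```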

  16. An investigation of the reliability of Rapid Upper Limb Assessment (RULA) as a method of assessment of children's computing posture.

    PubMed

    Dockrell, Sara; O'Grady, Eleanor; Bennett, Kathleen; Mullarkey, Clare; Mc Connell, Rachel; Ruddy, Rachel; Twomey, Seamus; Flannery, Colleen

    2012-05-01

    Rapid Upper Limb Assessment (RULA) is a quick observation method of posture analysis. RULA has been used to assess children's computer-related posture, but the reliability of RULA on a paediatric population has not been established. The purpose of this study was to investigate the inter-rater and intra-rater reliability of the use of RULA with children. Video recordings of 24 school children were independently viewed by six trained raters who assessed their postures using RULA, on two separate occasions. RULA demonstrated higher intra-rater reliability than inter-rater reliability although both were moderate to good. RULA was more reliable when used for assessing the older children (8-12 years) than with the younger children (4-7 years). RULA may prove useful as part of an ergonomic assessment, but its level of reliability warrants caution for its sole use when assessing children, and in particular, younger children.
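
    The abstract does not specify which agreement statistic was used; as an illustration of inter-rater reliability on ordinal RULA grand scores, a minimal Cohen's kappa computation on synthetic ratings follows.

```python
from collections import Counter

rater1 = [3, 4, 4, 5, 3, 4, 6, 5, 4, 3, 4, 5]   # synthetic RULA grand scores
rater2 = [3, 4, 5, 5, 3, 4, 6, 4, 4, 3, 4, 5]

n = len(rater1)
observed = sum(a == b for a, b in zip(rater1, rater2)) / n
c1, c2 = Counter(rater1), Counter(rater2)
# chance agreement expected from each rater's marginal score distribution
expected = sum(c1[k] * c2[k] for k in set(c1) | set(c2)) / n**2
kappa = (observed - expected) / (1 - expected)
print(f"observed agreement {observed:.2f}, Cohen's kappa {kappa:.2f}")
```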

  17. Computational DNA hole spectroscopy: A new tool to predict mutation hotspots, critical base pairs, and disease 'driver' mutations.

    PubMed

    Villagrán, Martha Y Suárez; Miller, John H

    2015-08-27

    We report on a new technique, computational DNA hole spectroscopy, which creates spectra of electron hole probabilities vs. nucleotide position. A hole is a site of positive charge created when an electron is removed. Peaks in the hole spectrum depict sites where holes tend to localize and potentially trigger a base pair mismatch during replication. Our studies of mitochondrial DNA reveal a correlation between L-strand hole spectrum peaks and spikes in the human mutation spectrum. Importantly, we also find that hole peak positions that do not coincide with large variant frequencies often coincide with disease-implicated mutations and/or (for coding DNA) encoded conserved amino acids. This enables combining hole spectra with variant data to identify critical base pairs and potential disease 'driver' mutations. Such integration of DNA hole and variance spectra could ultimately prove invaluable for pinpointing critical regions of the vast non-protein-coding genome. An observed asymmetry in correlations, between the spectrum of human mtDNA variations and the L- and H-strand hole spectra, is attributed to asymmetric DNA replication processes that occur for the leading and lagging strands.
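
    A toy tight-binding sketch in the spirit of the technique above (an editor's construction, not the authors' model): diagonalize a chain Hamiltonian whose site energies are base-dependent ionization estimates and display the lowest-energy hole state's probability versus nucleotide position; guanine-rich runs should trap the hole.

```python
import numpy as np

# Illustrative base ionization energies (eV); guanine is easiest to oxidize
site_e = {"G": 7.75, "A": 8.24, "C": 8.87, "T": 9.14}
hop = -0.3                                   # nearest-neighbour coupling (eV)

seq = "ATGGGCATTAGCCGGGTA"
h = np.diag([site_e[b] for b in seq])
for i in range(len(seq) - 1):
    h[i, i + 1] = h[i + 1, i] = hop

vals, vecs = np.linalg.eigh(h)
hole_state = vecs[:, 0]          # smallest eigenvalue: where the hole localizes
prob = hole_state**2
for i, (b, p) in enumerate(zip(seq, prob)):
    print(f"{i:2d} {b} {'#' * int(40 * p)}")   # crude text 'hole spectrum'
```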

  18. Evaluation of atmospheric nitrogen deposition model performance in the context of U.S. critical load assessments

    NASA Astrophysics Data System (ADS)

    Williams, Jason J.; Chung, Serena H.; Johansen, Anne M.; Lamb, Brian K.; Vaughan, Joseph K.; Beutel, Marc

    2017-02-01

    Air quality models are widely used to estimate pollutant deposition rates and thereby calculate critical loads and critical load exceedances (model deposition > critical load). However, model operational performance is not always quantified specifically to inform these applications. We developed a performance assessment approach designed to inform critical load and exceedance calculations, and applied it to the Pacific Northwest region of the U.S. We quantified wet inorganic N deposition performance of several widely-used air quality models, including five different Community Multiscale Air Quality Model (CMAQ) simulations, the Tdep model, and the 'PRISM x NTN' model. Modeled wet inorganic N deposition estimates were compared to wet inorganic N deposition measurements at 16 National Trends Network (NTN) monitoring sites, and to annual bulk inorganic N deposition measurements at Mount Rainier National Park. Model bias (model - observed) and error (|model - observed|) were expressed as a percentage of regional critical load values for diatoms and lichens. This novel approach demonstrated that wet inorganic N deposition bias in the Pacific Northwest approached or exceeded 100% of regional diatom and lichen critical load values at several individual monitoring sites, and approached or exceeded 50% of critical loads when averaged regionally. Even models that adjusted deposition estimates based on deposition measurements to reduce bias, or that spatially interpolated measurement data, had bias that approached or exceeded critical loads at some locations. While wet inorganic N deposition model bias is only one source of uncertainty that can affect critical load and exceedance calculations, the results demonstrate that expressing bias as a percentage of critical loads at a spatial scale consistent with the calculations may be a useful exercise for those performing them. It may help decide if model performance is adequate for a particular calculation, help assess confidence in
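
    The evaluation metric described above is simple to state in code; a minimal sketch with synthetic site values (units kg N/ha/yr, not the study's data) follows.

```python
# Express model bias and error in wet inorganic N deposition
# as a percentage of a critical load (CL).
modeled = [2.1, 3.4, 1.8, 4.0, 2.9]       # model estimates per site
observed = [1.6, 3.9, 1.5, 2.8, 3.1]      # NTN-style measurements per site
critical_load = 3.0                        # e.g., a regional lichen CL

for m, o in zip(modeled, observed):
    bias_pct = 100 * (m - o) / critical_load
    error_pct = 100 * abs(m - o) / critical_load
    print(f"bias {bias_pct:+6.1f}% of CL, error {error_pct:5.1f}% of CL")
```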

  19. Computer-aided design of dry powder inhalers using computational fluid dynamics to assess performance.

    PubMed

    Suwandecha, Tan; Wongpoowarak, Wibul; Srichana, Teerapol

    2016-01-01

    Dry powder inhalers (DPIs) are gaining popularity for the delivery of drugs. A cost-effective and efficient delivery device is necessary. Developing new DPIs by modifying an existing device may be the simplest way to improve the performance of the devices. The aim of this research was to produce a new DPI using computational fluid dynamics (CFD). The new DPI took advantage of the Cyclohaler® and the Rotahaler®. We chose a combination of the capsule chamber of the Cyclohaler® and the mouthpiece and grid of the Rotahaler®. Computer-aided design models of the devices were created and evaluated using CFD. Prototype models were created and tested in the DPI dispersion experiments. The proposed model 3 device produced high turbulence with a good degree of deagglomeration in both the CFD and the experimental data. The fine particle fraction (FPF) was around 50% at 60 L/min. The mass median aerodynamic diameter was around 2.8-4 μm. The FPF was strongly correlated with the CFD-predicted turbulence and the mechanical impaction parameters. The drug retention in the capsule was only 5-7%. In summary, a simple modification of the Cyclohaler® and Rotahaler® could produce a better-performing inhaler using CFD-assisted design.