Science.gov

Sample records for computer criticality assessments

  1. Making Student Thinking Visible through a Concept Map in Computer-Based Assessment of Critical Thinking

    ERIC Educational Resources Information Center

    Rosen, Yigal; Tager, Maryam

    2014-01-01

    Major educational initiatives in the world place great emphasis on fostering rich computer-based environments of assessment that make student thinking and reasoning visible. Using thinking tools engages students in a variety of critical and complex thinking, such as evaluating, analyzing, and decision making. The aim of this study was to explore…

  2. Content Analysis in Computer-Mediated Communication: Analyzing Models for Assessing Critical Thinking through the Lens of Social Constructivism

    ERIC Educational Resources Information Center

    Buraphadeja, Vasa; Dawson, Kara

    2008-01-01

    This article reviews content analysis studies aimed at assessing critical thinking in computer-mediated communication. It also discusses theories and content analysis models that encourage critical thinking skills in asynchronous learning environments and reviews theories and factors that may foster critical thinking skills and new knowledge…

  3. Conversion of Input Data between KENO and MCNP File Formats for Computer Criticality Assessments

    SciTech Connect

    Schwarz, Randolph A.; Carter, Leland L.; Schwarz, Alysia L.

    2006-11-30

    KENO is a Monte Carlo criticality code maintained by Oak Ridge National Laboratory (ORNL). KENO is included in the SCALE (Standardized Computer Analysis for Licensing Evaluation) package. KENO is often used because it was specifically designed for criticality calculations. Because KENO has convenient geometry input, including the treatment of lattice arrays of materials, it is frequently used for production calculations. Monte Carlo N-Particle (MCNP) is a Monte Carlo transport code maintained by Los Alamos National Laboratory (LANL). MCNP has a powerful 3D geometry package and an extensive cross-section database. It is a general-purpose code and may be used for calculations involving shielding or medical facilities, for example, but it can also be used for criticality calculations, and it is becoming increasingly popular for production criticality calculations. Both codes have their own specific advantages. After a criticality calculation has been performed with one of the codes, it is often desirable (or may be a safety requirement) to repeat the calculation with the other code to compare the important parameters using a different geometry treatment and cross-section database. Manual conversion of input files between the two codes is labor intensive, and the industry needs the capability to convert geometry models between MCNP and KENO without a large investment in manpower. The proposed conversion package aids the user in converting between the codes. It is not intended to be used as a "black box": the resulting input file must be carefully inspected by criticality safety personnel to verify that the intent of the calculation is preserved in the conversion. The purpose of this package is to help the criticality specialist in the conversion process by converting the geometry, materials, and pertinent data cards.
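
    The abstract describes the converter only at a high level. As a minimal sketch of the materials portion of such a conversion, the snippet below emits an MCNP-style material card from a simplified in-memory nuclide inventory; the data layout and function name are hypothetical stand-ins for the package's actual KENO/SCALE parsing:

      # Hypothetical sketch: emit an MCNP-style material card from a simplified
      # nuclide inventory. A real converter must parse SCALE standard-composition
      # records, lattice geometry, and cross-section library specifiers.

      def mcnp_material_card(mat_id, nuclides, library="70c"):
          """nuclides maps ZAID (e.g., 92235 for U-235) to atom fraction."""
          lines = [f"m{mat_id}"]
          for zaid, fraction in sorted(nuclides.items()):
              lines.append(f"      {zaid}.{library}  {fraction:.6e}")
          return "\n".join(lines)

      # Enriched uranium, 95 at% U-235 / 5 at% U-238:
      print(mcnp_material_card(1, {92235: 0.95, 92238: 0.05}))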

  4. Computer-Based Assessment in Safety-Critical Industries: The Case of Shipping

    ERIC Educational Resources Information Center

    Gekara, Victor Oyaro; Bloor, Michael; Sampson, Helen

    2011-01-01

    Vocational education and training (VET) concerns the cultivation and development of specific skills and competencies, in addition to broad underpinning knowledge relating to paid employment. VET assessment is, therefore, designed to determine the extent to which a trainee has effectively acquired the knowledge, skills, and competencies required by…

  5. Carahunge - A Critical Assessment

    NASA Astrophysics Data System (ADS)

    González-García, A. César

    Carahunge is a megalithic monument in southern Armenia that has often been acclaimed as the oldest observatory. The monument, composed of dozens of standing stones, includes some perforated stones. The directions of the holes have been measured, and claims relate their orientations to the sun, moon, and stars, thereby yielding a date for the construction of such devices. After a critical review of these methods and conclusions, the claims are shown to be untenable.

  6. Critical Assessment of Function Annotation Meeting, 2011

    SciTech Connect

    Friedberg, Iddo

    2015-01-21

    The Critical Assessment of Function Annotation meeting was held July 14-15, 2011, at the Austria Conference Center in Vienna, Austria, with 73 registered delegates. We thank the DOE for this award, which helped us organize and support the AFP 2011 scientific meeting as a special interest group (SIG) meeting associated with the ISMB 2011 conference in Vienna, Austria, in July 2011. The AFP SIG was held on July 15-16, 2011 (immediately preceding the conference). The meeting consisted of two components: the first was a series of talks (invited and contributed) and discussion sections dedicated to protein function research, with an emphasis on the theory and practice of computational methods utilized in functional annotation; the second provided a large-scale assessment of computational methods through participation in the Critical Assessment of Functional Annotation (CAFA).

  7. NASA Critical Facilities Maintenance Assessment

    NASA Technical Reports Server (NTRS)

    Oberhettinger, David J.

    2006-01-01

    Critical Facilities Maintenance Assessment (CFMA) was first implemented by NASA following the March 2000 overtest of the High Energy Solar Spectroscopic Imager (HESSI) spacecraft. A sine-burst dynamic test using a 40-year-old shaker failed: mechanical binding/slippage of the slip table imparted 10 times the planned force to the test article, causing major structural damage to HESSI. The mechanical "health" of the shaker had not been assessed and tracked to assure that the test equipment was in good working order. Similar incidents have occurred at NASA facilities due to inadequate maintenance (e.g., rainwater from a leaky roof contaminated an assembly facility that housed a spacecraft). The HESSI incident alerted NASA to the urgent need to identify inadequacies in ground facility readiness and maintenance practices. The consequences of failures of ground facilities that service NASA systems are severe due to the high unit value of NASA products.

  8. Computer Security Risk Assessment

    Energy Science and Technology Software Center (ESTSC)

    1992-02-11

    LAVA/CS (LAVA for Computer Security) is an application of the Los Alamos Vulnerability Assessment (LAVA) methodology specific to computer and information security. The software serves as a generic tool for identifying vulnerabilities in computer and information security safeguards systems. Although it does not perform a full risk assessment, the results from its analysis may provide valuable insights into security problems. LAVA/CS assumes that the system is exposed both to natural and environmental hazards and to deliberate malevolent actions by either insiders or outsiders. In the process of answering the LAVA/CS questionnaire, the user identifies missing safeguards in 34 areas ranging from password management to personnel security and internal audit practices. Specific safeguards protecting a generic set of assets (or targets) from a generic set of threats (or adversaries) are considered. There are four generic assets: the facility, the organization's environment; the hardware, all computer-related hardware; the software, the information in machine-readable form stored on-line or on transportable media; and the documents and displays, the information in human-readable form stored as hard-copy materials (manuals, reports, listings in full size or microform), film, and screen displays. Two generic threats are considered: natural and environmental hazards (storms, fires, power abnormalities, water, and accidental maintenance damage); and on-site human threats, both intentional and accidental acts attributable to a perpetrator on the facility's premises.
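
    As a toy illustration of the bookkeeping such a questionnaire drives, the sketch below tallies missing safeguards per generic asset/threat pair; the area names, answers, and flat data layout are invented, and the real questionnaire spans 34 areas:

      # Toy bookkeeping in the spirit of LAVA/CS: tally missing safeguards per
      # generic asset/threat pair from questionnaire answers (data invented).

      ASSETS = ["facility", "hardware", "software", "documents and displays"]
      THREATS = ["natural/environmental hazards", "on-site human threats"]

      # Each answer: (questionnaire area, asset, threat, safeguard present?)
      answers = [
          ("password management", "software", "on-site human threats", False),
          ("internal audit practices", "documents and displays", "on-site human threats", True),
          ("fire protection", "facility", "natural/environmental hazards", False),
      ]

      missing = {(asset, threat): 0 for asset in ASSETS for threat in THREATS}
      for area, asset, threat, present in answers:
          if not present:
              missing[(asset, threat)] += 1

      for (asset, threat), count in sorted(missing.items()):
          if count:
              print(f"{asset} vs. {threat}: {count} missing safeguard(s)")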

  9. Critical services in the LHC computing

    NASA Astrophysics Data System (ADS)

    Sciabà, A.

    2010-04-01

    The LHC experiments (ALICE, ATLAS, CMS and LHCb) rely on complex computing systems for data acquisition, processing, distribution, analysis and simulation. These systems run using a variety of services provided by the experiments, the Worldwide LHC Computing Grid and the different computing centres. The services range from the most basic (network, batch systems, file systems) to the mass storage services and the Grid information system, up to the different workload management systems, data catalogues and data transfer tools, often developed internally by the collaborations. In this contribution we review the status of the services most critical to the experiments by quantitatively measuring their readiness with respect to the start of LHC operations. Shortcomings are identified and common recommendations are offered.

  10. Formative Assessment: A Critical Review

    ERIC Educational Resources Information Center

    Bennett, Randy Elliot

    2011-01-01

    This paper covers six interrelated issues in formative assessment (also known as "assessment for learning"). The issues concern the definition of formative assessment, the claims commonly made for its effectiveness, the limited attention given to domain considerations in its conceptualisation, the under-representation of measurement principles in that…

  11. Criticality assessment of LLRWDF closure

    SciTech Connect

    Sarrack, A.G.; Weber, J.H.; Woody, N.D.

    1992-10-06

    During the operation of the Low Level Radioactive Waste Disposal Facility (LLRWDF), large amounts (greater than 100 kg) of enriched uranium (EU) were buried. This EU came primarily from the closing and decontamination of the Naval Fuels Facility in the period 1987-1989. Waste Management Operations (WMO) procedures were used to keep the EU boxes separated to prevent possible criticality during normal operation. Closure of the LLRWDF is currently being planned, and waste stabilization by dynamic compaction (DC) is proposed. Dynamic compaction will crush the containers in the LLRWDF and change their geometry. Reviews of LLRWDF operations and record-keeping practices have shown that the EU contents of the trenches are known, but the details of how those contents are arranged cannot be proven. Reviews of the trench contents, combined with analysis of potential critical configurations, revealed that some portions of the LLRWDF can be expected to be free of criticality concerns, while other sections have credible probabilities for the assembly of a critical mass, even in the uncompacted configuration. This will have an impact on the closure options and on which trenches can be compacted.

  12. Computer-assisted education for critical care nurses.

    PubMed

    Bove, L A

    2001-03-01

    Technology is changing rapidly, and health care is just beginning to see the wave of technological advances. Computer-assisted educational software is available for many topics and in many media. Educators and learners need to explore these media and determine how they can best fit into a total learning experience. Computers should be used to enhance education and training, rather than replace the human instructor. The latest software and hardware are interesting to learners, but technology needs to be weighed along with the outcomes of education. Over the next 10 years, many of the materials we use today for critical care education will be replaced with more advanced technologies. Subject matter experts should work with computer experts to design and improve computer-assisted technology. In addition, all educators should assess the return on investment of these newer technologies before embracing them. PMID:11863142

  13. Software Safety Risk in Legacy Safety-Critical Computer Systems

    NASA Technical Reports Server (NTRS)

    Hill, Janice; Baggs, Rhoda

    2007-01-01

    Safety-critical computer systems must be engineered to meet system and software safety requirements. For legacy safety-critical computer systems, software safety requirements may not have been formally specified during development. When process-oriented software safety requirements are levied on a legacy system after the fact, and software development artifacts don't exist or are incomplete, the question becomes 'how can this be done?' The risks associated with meeting only some software safety requirements in a legacy safety-critical computer system must be addressed should such systems be selected as candidates for reuse. This paper proposes a method for formally ascertaining a software safety risk assessment that provides measurements of software safety for legacy systems, which may or may not have the suite of software engineering documentation now normally required. It relies upon the NASA Software Safety Standard, risk assessment methods based upon the Taxonomy-Based Questionnaire, and the application of reverse engineering CASE tools to produce original design documents for legacy systems.

  14. Assessment of Critical-Analytic Thinking

    ERIC Educational Resources Information Center

    Brown, Nathaniel J.; Afflerbach, Peter P.; Croninger, Robert G.

    2014-01-01

    National policy and standards documents, including the National Assessment of Educational Progress frameworks, the "Common Core State Standards" and the "Next Generation Science Standards," assert the need to assess critical-analytic thinking (CAT) across subject areas. However, assessment of CAT poses several challenges for…

  15. Assessing Postgraduate Students' Critical Thinking Ability

    ERIC Educational Resources Information Center

    Javed, Muhammad; Nawaz, Muhammad Atif; Qurat-Ul-Ain, Ansa

    2015-01-01

    This paper assesses the critical thinking ability of postgraduate students. The target population was male and female students at the university level in Pakistan. A small sample of 45 male and 45 female students was selected randomly from The Islamia University of Bahawalpur, Pakistan. Cornell Critical Thinking Test Series, The…

  16. Equivalent damage: A critical assessment

    NASA Technical Reports Server (NTRS)

    Laflen, J. R.; Cook, T. S.

    1982-01-01

    Concepts in equivalent damage were evaluated to determine their applicability to the life prediction of hot-path components of aircraft gas turbine engines. Equivalent damage was defined as those effects which influence the crack initiation lifetime beyond the damage that is measured in uniaxial, fully reversed sinusoidal and isothermal experiments at low homologous temperatures. Three areas of equivalent damage were examined: mean stress, cumulative damage, and multiaxiality. For each area, a literature survey was conducted to aid in selecting the most appropriate theories. Where possible, data correlations were also used in the evaluation process. A set of criteria was developed for ranking the theories in each equivalent damage regime. These criteria considered aspects of engine utilization as well as the theoretical basis and correlative ability of each theory. In addition, consideration was given to the complex nature of the loading cycle at fatigue-critical locations of hot-path components; this loading includes non-proportional multiaxial stressing, combined temperature and strain fluctuations, and general creep-fatigue interactions. Through application of selected equivalent damage theories to suitable data sets, it was found that there are insufficient data to allow specific recommendations of preferred theories for general applications. A series of experiments and areas for further investigation were identified.
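
    The record is a survey and does not reproduce the theories it ranks; for concreteness, two classical mean-stress corrections of the kind evaluated are sketched below (familiar textbook choices, not necessarily among those the study ranked):

        \sigma_{ar} = \frac{\sigma_a}{1 - \sigma_m/\sigma_u} \quad \text{(Goodman)}, \qquad
        \sigma_{ar} = \sqrt{\sigma_{\max}\,\sigma_a} \quad \text{(Smith-Watson-Topper)}

    where \sigma_a is the stress amplitude, \sigma_m the mean stress, \sigma_{\max} = \sigma_m + \sigma_a, and \sigma_u the ultimate strength; each maps a cycle with nonzero mean stress onto an equivalent fully reversed amplitude \sigma_{ar} for use with baseline fatigue data.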

  17. Intelligence Assessment with Computer Simulations

    ERIC Educational Resources Information Center

    Kröner, S.; Plass, J.L.; Leutner, D.

    2005-01-01

    It has been suggested that computer simulations may be used for intelligence assessment. This study investigates what relationships exist between intelligence and computer-simulated tasks that mimic real-world problem-solving behavior, and discusses design requirements that simulations have to meet in order to be suitable for intelligence…

  18. Climate Modeling Computing Needs Assessment

    NASA Astrophysics Data System (ADS)

    Petraska, K. E.; McCabe, J. D.

    2011-12-01

    This paper discusses early findings of an assessment of computing needs for NASA science, engineering and flight communities. The purpose of this assessment is to document a comprehensive set of computing needs that will allow us to better evaluate whether our computing assets are adequately structured to meet evolving demand. The early results are interesting, already pointing out improvements we can make today to get more out of the computing capacity we have, as well as potential game changing innovations for the future in how we apply information technology to science computing. Our objective is to learn how to leverage our resources in the best way possible to do more science for less money. Our approach in this assessment is threefold: development of use case studies for science workflows; creating a taxonomy and structure for describing science computing requirements; and characterizing agency computing, analysis, and visualization resources. As projects evolve, science data sets increase in a number of ways: in size, scope, timelines, complexity, and fidelity. Generating, processing, moving, and analyzing these data sets places distinct and discernible requirements on underlying computing, analysis, storage, and visualization systems. The initial focus group for this assessment is the Earth Science modeling community within NASA's Science Mission Directorate (SMD). As the assessment evolves, this focus will expand to other science communities across the agency. We will discuss our use cases, our framework for requirements and our characterizations, as well as our interview process, what we learned and how we plan to improve our materials after using them in the first round of interviews in the Earth Science modeling community. We will describe our plans for how to expand this assessment, first into the Earth Science data analysis and remote sensing communities, and then throughout the full community of science, engineering and flight at NASA.

  19. Recent Use of Covariance Data for Criticality Safety Assessment

    SciTech Connect

    Rearden, Bradley T; Mueller, Don

    2008-01-01

    The TSUNAMI codes of the Oak Ridge National Laboratory SCALE code system were applied to a burnup credit application to demonstrate the use of sensitivity and uncertainty analysis with recent cross section covariance data for criticality safety code and data validation. The use of sensitivity and uncertainty analysis provides for the assessment of a defensible computational bias, bias uncertainty, and gap analysis for a complex system that otherwise could be assessed only through the use of expert judgment and conservative assumptions.
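
    The abstract takes the underlying mathematics as given; the standard first-order propagation used with cross-section covariance data is the "sandwich rule", sketched here in assumed notation rather than quoted from the report:

        \sigma_k^2 \;\approx\; \mathbf{S}^{\mathsf{T}}\,\mathbf{C}_{\alpha\alpha}\,\mathbf{S}, \qquad S_i = \frac{\alpha_i}{k}\,\frac{\partial k}{\partial \alpha_i}

    where S collects the relative sensitivities of k_eff to the nuclear data parameters \alpha_i (the quantities TSUNAMI computes) and C_{\alpha\alpha} is their relative covariance matrix; the resulting uncertainty in k_eff feeds the bias and bias-uncertainty assessment.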

  20. Critical Elements of Computer Literacy for Teachers.

    ERIC Educational Resources Information Center

    Overbaugh, Richard C.

    A definition of computer literacy is developed that is broad enough to apply to educators in general, but which leaves room for specificity for particular situations and content areas. The following general domains that comprise computer literacy for all educators are addressed: (1) general computer operations; (2) software, including computer…

  21. To assess the reparative ability of differentiated mesenchymal stem cells in a rat critical size bone repair defect model using high frequency co-registered photoacoustic/ultrasound imaging and micro computed tomography

    NASA Astrophysics Data System (ADS)

    Zafar, Haroon; Gaynard, Sean; O'Flatharta, Cathal; Doroshenkova, Tatiana; Devine, Declan; Sharif, Faisal; Barry, Frank; Hayes, Jessica; Murphy, Mary; Leahy, Martin J.

    2016-03-01

    Stem cell based treatments hold great potential and promise to address many unmet clinical needs. Non-invasive imaging techniques are crucial for monitoring transplanted stem cells qualitatively and quantitatively. The objective of this study was to create a critical size bone defect in the rat femur and then assess the ability of differentiated mesenchymal stem cells (MSCs) to repair the defect using high frequency co-registered photoacoustic (PA)/ultrasound (US) imaging and micro computed tomography (μCT) over an 8 week period. Combined PA and US imaging was performed using a 256-element, 21 MHz linear-array transducer combined with a multichannel collecting system. In vivo 3D PA and US images of the defect bone in the rat femur were acquired 4 and 8 weeks after surgery. Co-registered 3D structural images, such as of the microvasculature, and functional images, such as of the total concentration of haemoglobin (HbT) and the haemoglobin oxygen saturation (sO2), were obtained using PA and US imaging. Bone formation was assessed 4 and 8 weeks after surgery by μCT. High frequency linear-array based co-registered PA/US imaging was found promising in terms of non-invasiveness, sensitivity, adaptability, and high spatial and temporal resolution at sufficient depths for assessing the reparative ability of MSCs in a rat critical size bone repair defect model.

  22. Critical care computing. Past, present, and future.

    PubMed

    Seiver, A

    2000-10-01

    With rapidly increasing processing power, networks, and bandwidth, we have ever more powerful tools for ICU computing. The challenge is to use these tools to build on the work of the Innovators and Early Adopters, who pioneered the first three generations of systems, and extend computing to the Majority, who still rely on paper. What is needed is compelling evidence that these systems reduce cost and improve quality. The experience of other industries suggests that we need to address fundamental issues, such as clinical organization, roles, behavior, and incentives, before we will be able to prove the benefits of computing technology. When these preconditions are met, the promise of computing will be realized, perhaps with the upcoming fourth-generation systems. ICU computing can then finally cross the chasm and become the standard of care. PMID:11070807

  23. Risk-Assessment Computer Program

    NASA Technical Reports Server (NTRS)

    Dias, William C.; Mittman, David S.

    1993-01-01

    RISK D/C is a prototype computer program that assists in program risk modeling for the Space Exploration Initiative (SEI) architectures proposed in the Synthesis Group Report. Risk assessment is performed with respect to risk events, probabilities, and severities of potential results. The program enables ranking, with respect to effectiveness, of the risk-mitigation strategies proposed for an exploration program architecture, and it allows for the fact that risk assessment in the early phases of planning is subjective. Although specific to the SEI in its present form, it can also serve as a software framework for the development of risk-assessment programs for other specific uses. RISK D/C was developed for the Macintosh(TM) series of computers and requires HyperCard(TM) 2.0 or later, 2 Mb of random-access memory, and System 6.0.8 or later.
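
    As an invented illustration of ranking mitigation strategies by expected-severity reduction (the events, numbers, and scoring below are not from RISK D/C, which is a HyperCard application not reproduced here):

      # Toy ranking: order mitigation strategies by how much each reduces the
      # total expected severity over a set of risk events (data invented).

      risk_events = {                 # event: (probability, severity on a 0-10 scale)
          "launch delay": (0.30, 4),
          "life support failure": (0.02, 10),
          "budget overrun": (0.50, 3),
      }

      def expected_severity(events):
          return sum(p * s for p, s in events.values())

      def with_strategy(events, changes):
          """changes maps an event to its mitigated (probability, severity)."""
          mitigated = dict(events)
          mitigated.update(changes)
          return mitigated

      strategies = {
          "redundant scrubbers": {"life support failure": (0.005, 10)},
          "schedule margin": {"launch delay": (0.15, 4)},
      }

      baseline = expected_severity(risk_events)
      ranking = sorted(
          strategies.items(),
          key=lambda item: baseline - expected_severity(with_strategy(risk_events, item[1])),
          reverse=True,
      )
      for name, _ in ranking:
          print(name)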

  24. Criticality assessment of TRU burial ground culverts

    SciTech Connect

    Winn, W.G.

    1990-09-26

    An effort to assess the criticality risks of ²³⁹Pu in TRU Burial Ground culverts has been underway for several years. The concern arose from discrepancies between two types of monitors that have been used to assay the ²³⁹Pu waste prior to storage in 55-gallon drums that are placed in the culverts. One type is the solid waste monitor (SWM), which is based on gamma-ray measurements; the other is the neutron coincidence counter (NCC), which is based on neutron measurements. The NCC was put into routine service after 1985 and has generally yielded higher ²³⁹Pu assays than the SWM. Culverts with pre-1986 waste only had SWM assays of ²³⁹Pu; thus, it was questioned whether their actual ²³⁹Pu loadings could be high enough to pose criticality concerns. Studies to characterize the culvert criticality potential have included appraisal of the NCC vs. the SWM, neutron measurements atop the culverts, gamma-ray measurements atop the culverts, and probabilistic risk analyses (PRAs). Overall, these studies have implied that the culverts are safe from criticality; however, their results had not been examined collectively. The present report uses the collective information of the preceding studies to arrive at a more complete assessment of the culvert criticality aspects. A conservative k_eff is estimated for an individual suspicious culvert, and a PRA is evaluated for its "worst" drum. These two pieces of information form the basis of the appraisal, but other supporting evidence is also included.

  25. A Synthesis and Survey of Critical Success Factors for Computer Technology Projects

    ERIC Educational Resources Information Center

    Baker, Ross A.

    2012-01-01

    The author investigated the existence of critical success factors for computer technology projects. Current research literature and a survey of experienced project managers indicate that there are 23 critical success factors (CSFs) that correlate with project success. The survey gathered an assessment of project success and the degree to which…

  26. DOE/EM Criticality Safety Needs Assessment

    SciTech Connect

    Westfall, Robert Michael; Hopper, Calvin Mitchell

    2011-02-01

    The issue of nuclear criticality safety (NCS) in Department of Energy Environmental Management (DOE/EM) fissionable material operations presents challenges because of the large quantities of material present in the facilities and equipment that are committed to storage and/or material conditioning and dispositioning processes. Given the uncertainty associated with the material and conditions for many DOE/EM fissionable material operations, ensuring safety while maintaining operational efficiency requires the application of the most effective criticality safety practices. In turn, more efficient implementation of these practices can be achieved if the best NCS technologies are utilized. In 2002, DOE/EM-1 commissioned a survey of criticality safety technical needs at the major EM sites. These needs were documented in the report Analysis of Nuclear Criticality Safety Technology Supporting the Environmental Management Program, issued May 2002. Subsequent to this study, EM safety management personnel made a commitment to applying the best and latest criticality safety technology, as described by the DOE Nuclear Criticality Safety Program (NCSP). Over the past 7 years, this commitment has enabled the transfer of several new technologies to EM operations. In 2008, it was decided to broaden the basis of the EM NCS needs assessment to include not only current needs for technologies but also NCS operational areas with potential for improvements in controls, analysis, and regulations. A series of NCS workshops has been conducted over the past several years, and needs have been identified and addressed by EM staff and contractor personnel. These workshops were organized and conducted by the EM Criticality Safety Program Manager with administrative and technical support by staff at Oak Ridge National Laboratory (ORNL). This report records the progress made in identifying the needs, determining the approaches for addressing these needs, and assimilating new NCS technologies into EM…

  27. Radiation exposure and risk assessment for critical female body organs

    NASA Technical Reports Server (NTRS)

    Atwell, William; Weyland, Mark D.; Hardy, Alva C.

    1991-01-01

    Space radiation exposure limits for astronauts are based on recommendations of the National Council on Radiation Protection and Measurements. These limits now include the age at exposure and sex of the astronaut. A recently developed computerized anatomical female (CAF) model is discussed in detail. Computer-generated, cross-sectional data are presented to illustrate the completeness of the CAF model. By applying ray-tracing techniques, shield distribution functions have been computed to calculate absorbed dose and dose equivalent values for a variety of critical body organs (e.g., breasts, lungs, thyroid gland, etc.) and mission scenarios. Specific risk assessments, i.e., cancer induction and mortality, are reviewed.

  28. Problem Solving and Critical Thinking for Computer Science Educators.

    ERIC Educational Resources Information Center

    Norris, Cathleen A., Ed.; Poirot, James L., Ed.

    The eight papers presented in this monograph are a result of the Problem Solving and Critical Thinking Research Workshop that was held in conjunction with the 1990 National Educational Computing Conference (NECC). The intent of the workshop was to provide a unique forum for researchers to share ideas in a special area of educational computing. The…

  29. Computer Assisted Instruction: Current Trends and Critical Issues.

    ERIC Educational Resources Information Center

    Chambers, Jack A.; Sprecher, Jerry W.

    1980-01-01

    The use of computers to assist in learning situations is reviewed on an international basis, evaluation studies and costs are examined, and critical issues are analyzed as they pertain to hardware, software, and courseware development. Recommendations are offered for educationally cost-effective uses of computer-assisted instruction. A 67-item…

  30. Nutritional assessment in the critically ill.

    PubMed

    Manning, E M; Shenkin, A

    1995-07-01

    Although many of the measurements and techniques outlined in this article may be epidemiologically useful and correlate with morbidity and mortality, no single indicator is of consistent value in the nutritional assessment of critically ill patients. Measurements such as anthropometrics, total body fat estimation, or delayed hypersensitivity skin testing either are liable to non-nutritional influences or lack accuracy and precision in individual patients. Plasma concentrations of hepatic proteins are affected significantly by the patient's underlying disease state and therapeutic interventions and therefore lack specificity. Although the measurement of these proteins is of little value in the initial nutritional assessment of the critically ill, serial measurement, particularly of plasma pre-albumin, may be useful in monitoring the response to nutritional support. Nitrogen balance is a widely used and valuable nutritional indicator in the critically ill. Direct measurement of urine nitrogen is the preferred test, although nitrogen excretion often is derived from 24-hour urine urea measurement, an inexpensive and easy procedure, but one that is less accurate. More accurate techniques of assessing change in nutritional status, such as in vivo neutron activation analysis (IVNAA) of total body nitrogen or isotopic measurement of exchangeable potassium or sodium, are more expensive, less available, unsuitable for repeated analyses, and less feasible in severely ill patients. Total body nitrogen measured using IVNAA and total-body potassium, however, are the most accurate ways of measuring body composition in the presence of large amounts of edema fluid. The application of body composition measurements to patient care remains poorly defined because of the many problems encountered with the various techniques, including cost, availability, and radiation exposure. Improved, more sensitive and, preferably, bedside methods for the measurement of body composition are needed. It is of paramount importance that…
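
    Where nitrogen excretion is derived from 24-hour urine urea, one common clinical approximation (a textbook formula, not one stated in this article) is:

        \text{N balance (g/day)} \;\approx\; \frac{\text{protein intake (g/day)}}{6.25} \;-\; \left(\text{UUN (g/day)} + 4\right)

    where UUN is the 24-hour urinary urea nitrogen and the constant of roughly 4 g/day approximates non-urea urinary, fecal, and insensible losses; as the abstract notes, direct measurement of urine nitrogen is preferred where available.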

  31. Nutritional Assessment in Critically Ill Patients

    PubMed Central

    Hejazi, Najmeh; Mazloom, Zohreh; Zand, Farid; Rezaianzadeh, Abbas; Amini, Afshin

    2016-01-01

    Background: Malnutrition is an important factor in the survival of critically ill patients. The purpose of the present study was to assess the nutritional status of patients in the intensive care unit (ICU) on the days of admission and discharge via a detailed nutritional assessment. Methods: In total, 125 patients were followed up from admission to discharge at 8 ICUs in Shiraz, Iran. The patients' nutritional status was assessed using subjective global assessment (SGA), anthropometric measurements, biochemical indices, and body composition indicators. Diet prescription and intake were also evaluated. Results: Malnutrition prevalence increased significantly on the day of discharge (58.62%) compared to the day of admission (28.8%) according to SGA (P<0.001). The patients' weight, mid-upper-arm circumference, mid-arm muscle circumference, triceps skinfold thickness, and calf circumference decreased significantly as well (P<0.001). Lean mass weight and body cell mass also decreased significantly (P<0.001). Biochemical indices showed no notable changes except for magnesium, which decreased significantly (P=0.013). A significant negative correlation was observed between malnutrition on discharge day and anthropometric measurements. Positive and significant correlations were observed between malnutrition on discharge day and the number of days without enteral feeding, the delay from ICU admission to the commencement of enteral feeding, and the length of ICU stay. Energy and protein intakes were significantly less than the prescribed amounts (by 26.26% and 26.48%, respectively). Conclusion: According to SGA, malnutrition on discharge day increased in the patients in the ICU. Anthropometric measurements were better predictors of the nutritional outcome of our critically ill patients than were biochemical tests. PMID:27217600

  32. Assessing Terrorist Motivations for Attacking Critical Infrastructure

    SciTech Connect

    Ackerman, G; Abhayaratne, P; Bale, J; Bhattacharjee, A; Blair, C; Hansell, L; Jayne, A; Kosal, M; Lucas, S; Moran, K; Seroki, L; Vadlamudi, S

    2006-12-04

    Certain types of infrastructure--critical infrastructure (CI)--play vital roles in underpinning our economy, security and way of life. These complex and often interconnected systems have become so ubiquitous and essential to day-to-day life that they are easily taken for granted. Often it is only when the important services provided by such infrastructure are interrupted--when we lose easy access to electricity, health care, telecommunications, transportation or water, for example--that we are conscious of our great dependence on these networks and of the vulnerabilities that stem from such dependence. Unfortunately, it must be assumed that many terrorists are all too aware that CI facilities pose high-value targets that, if successfully attacked, have the potential to dramatically disrupt the normal rhythm of society, cause public fear and intimidation, and generate significant publicity. Indeed, revelations emerging at the time of this writing about Al Qaida's efforts to prepare for possible attacks on major financial facilities in New York, New Jersey, and the District of Columbia remind us just how real and immediate such threats to CI may be. Simply being aware that our nation's critical infrastructure presents terrorists with a plethora of targets, however, does little to mitigate the dangers of CI attacks. In order to prevent and preempt such terrorist acts, better understanding of the threats and vulnerabilities relating to critical infrastructure is required. The Center for Nonproliferation Studies (CNS) presents this document as both a contribution to the understanding of such threats and an initial effort at "operationalizing" its findings for use by analysts who work on issues of critical infrastructure protection. Specifically, this study focuses on a subsidiary aspect of CI threat assessment that has thus far remained largely unaddressed by contemporary terrorism research: the motivations and related factors that determine whether a terrorist…

  33. Critical Emergency Medicine Procedural Skills: A Comparative Study of Methods for Teaching and Assessment.

    ERIC Educational Resources Information Center

    Chapman, Dane M.; And Others

    Three critical procedural skills in emergency medicine were evaluated using three assessment modalities--written, computer, and animal model. The effects of computer practice and previous procedure experience on skill competence were also examined in an experimental sequential assessment design. Subjects were six medical students, six residents,…

  34. Assessment of Situated Learning Using Computer Environments.

    ERIC Educational Resources Information Center

    Young, Michael

    1995-01-01

    Suggests that, based on a theory of situated learning, assessment must emphasize process as much as product. Several assessment examples are given, including a computer-based planning assistant for a mathematics and science video, suggestions for computer-based portfolio assessment, and speculations about embedded assessment of virtual situations.…

  35. HSE's safety assessment principles for criticality safety.

    PubMed

    Simister, D N; Finnerty, M D; Warburton, S J; Thomas, E A; Macphail, M R

    2008-06-01

    The Health and Safety Executive (HSE) published its revised Safety Assessment Principles for Nuclear Facilities (SAPs) in December 2006. The SAPs are primarily intended for use by HSE's inspectors when judging the adequacy of safety cases for nuclear facilities. The revised SAPs relate to all aspects of safety in nuclear facilities including the technical discipline of criticality safety. The purpose of this paper is to set out for the benefit of a wider audience some of the thinking behind the final published words and to provide an insight into the development of UK regulatory guidance. The paper notes that it is HSE's intention that the Safety Assessment Principles should be viewed as a reflection of good practice in the context of interpreting primary legislation such as the requirements under site licence conditions for arrangements for producing an adequate safety case and for producing a suitable and sufficient risk assessment under the Ionising Radiations Regulations 1999 (SI1999/3232 www.opsi.gov.uk/si/si1999/uksi_19993232_en.pdf). PMID:18495990

  36. CRITICAL ISSUES IN HIGH END COMPUTING - FINAL REPORT

    SciTech Connect

    Corones, James

    2013-09-23

    High-end computing (HEC) has been a driver for advances in science and engineering for the past four decades. Increasingly, HEC has become a significant element in the national security, economic vitality, and competitiveness of the United States. Advances in HEC provide results that cut across traditional disciplinary and organizational boundaries. This program provides opportunities to share information about HEC systems and computational techniques across multiple disciplines and organizations through conferences and exhibitions of HEC advances held in Washington DC, so that mission agency staff, scientists, and industry can come together with White House, Congressional and Legislative staff in an environment conducive to the sharing of technical information, accomplishments, goals, and plans. A common thread across this series of conferences is the understanding of computational science and applied mathematics techniques across a diverse set of application areas of interest to the Nation. The specific objectives of this program are: Program Objective 1. To provide opportunities to share information about advances in high-end computing systems and computational techniques between mission critical agencies, agency laboratories, academics, and industry. Program Objective 2. To gather pertinent data and address specific topics of wide interest to mission critical agencies. Program Objective 3. To promote a continuing discussion of critical issues in high-end computing. Program Objective 4. To provide a venue where a multidisciplinary scientific audience can discuss the difficulties of applying computational science techniques to specific problems and can specify future research that, if successful, will eliminate these problems.

  37. Critical infrastructure systems of systems assessment methodology.

    SciTech Connect

    Sholander, Peter E.; Darby, John L.; Phelan, James M.; Smith, Bryan; Wyss, Gregory Dane; Walter, Andrew; Varnado, G. Bruce; Depoy, Jennifer Mae

    2006-10-01

    Assessing the risk of malevolent attacks against large-scale critical infrastructures requires modifications to existing methodologies that separately consider physical security and cyber security. This research has developed a risk assessment methodology that explicitly accounts for both physical and cyber security, while preserving the traditional security paradigm of detect, delay, and respond. This methodology also accounts for the condition that a facility may be able to recover from or mitigate the impact of a successful attack before serious consequences occur. The methodology uses evidence-based techniques (which are a generalization of probability theory) to evaluate the security posture of the cyber protection systems. Cyber threats are compared against cyber security posture using a category-based approach nested within a path-based analysis to determine the most vulnerable cyber attack path. The methodology summarizes the impact of a blended cyber/physical adversary attack in a conditional risk estimate where the consequence term is scaled by a "willingness to pay" avoidance approach.
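
    The conditional risk estimate the abstract summarizes is consistent with the form widely used in physical security analysis (the notation here is assumed, not quoted from the report):

        R \;=\; P_A \cdot \left(1 - P_E\right) \cdot C

    where P_A is the likelihood of adversary attack, P_E the effectiveness of the detect-delay-respond (and recovery/mitigation) functions along the most vulnerable attack path, and C the consequence term, scaled in this methodology by a willingness-to-pay avoidance value.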

  38. Program computes single-point failures in critical system designs

    NASA Technical Reports Server (NTRS)

    Brown, W. R.

    1967-01-01

    Computer program analyzes the designs of critical systems that will either prove the design is free of single-point failures or detect each member of the population of single-point failures inherent in a system design. This program should find application in the checkout of redundant circuits and digital systems.
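
    A minimal sketch of the underlying idea, finding components whose single failure disconnects a system, is shown below; the graph representation and brute-force search are illustrative assumptions, not the 1967 program's actual method:

      # Illustrative sketch: flag single-point failures in a redundancy graph by
      # removing one component at a time and testing whether the source can
      # still reach the sink.

      def reachable(edges, nodes, source, sink):
          seen, stack = {source}, [source]
          while stack:
              current = stack.pop()
              for a, b in edges:
                  if a == current and b in nodes and b not in seen:
                      seen.add(b)
                      stack.append(b)
          return sink in seen

      def single_point_failures(nodes, edges, source, sink):
          return sorted(
              n for n in nodes - {source, sink}
              if not reachable(edges, nodes - {n}, source, sink)
          )

      # Two redundant branches (A, B) that both feed one shared component C:
      nodes = {"source", "A", "B", "C", "sink"}
      edges = [("source", "A"), ("source", "B"), ("A", "C"), ("B", "C"), ("C", "sink")]
      print(single_point_failures(nodes, edges, "source", "sink"))  # ['C']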

  39. Auroral weak double layers: A critical assessment

    NASA Astrophysics Data System (ADS)

    Koskinen, Hannu E. J.; Mälkki, Anssi M.

    Weak double layers (WDLs) were first observed in the mid-altitude auroral magnetosphere in 1976 by the S3-3 satellite. The observations were confirmed by Viking in 1986, when more detailed information on these small-scale plasma structures became available. WDLs are upward-moving rarefactive solitary structures with negative electric potential. The potential drop over a WDL is typically 0-1 V, with the electric field pointing predominantly upward. The structures are usually found in relatively weak (≤2 kV) auroral acceleration regions where the field-aligned current is upward but sometimes very small. The observations suggest that WDLs exist in regions with a cool electron and ion background. Most likely the potential structures are embedded in a background ion population that may drift slowly upward. There have been several attempts at a plasma-physical explanation of WDLs, but so far success has been limited. Computer simulations have been able to produce similar structures, but usually for somewhat unrealistic plasma parameters. A satisfactory understanding of the phenomenon requires consideration of the role of WDLs in magnetosphere-ionosphere (MI) coupling, including the large-scale electric fields, both parallel and perpendicular to the magnetic field, and the Alfvén waves mediating the coupling. In this report we give a critical review of our present understanding of WDLs. We try to find out what can be safely deduced from the observations, what are just educated guesses, and where we may be going wrong.

  40. Radiation exposure and risk assessment for critical female body organs

    SciTech Connect

    Atwell, W.; Weyland, M.D.; Hardy, A.C. (NASA, Johnson Space Center, Houston, TX)

    1991-07-01

    Space radiation exposure limits for astronauts are based on recommendations of the National Council on Radiation Protection and Measurements. These limits now include the age at exposure and sex of the astronaut. A recently developed computerized anatomical female (CAF) model is discussed in detail. Computer-generated, cross-sectional data are presented to illustrate the completeness of the CAF model. By applying ray-tracing techniques, shield distribution functions have been computed to calculate absorbed dose and dose equivalent values for a variety of critical body organs (e.g., breasts, lungs, thyroid gland, etc.) and mission scenarios. Specific risk assessments, i.e., cancer induction and mortality, are reviewed. 13 refs.

  41. Adapting the Critical Thinking Assessment Test for Palestinian Universities

    ERIC Educational Resources Information Center

    Basha, Sami; Drane, Denise; Light, Gregory

    2016-01-01

    Critical thinking is a key learning outcome for Palestinian students. However, there are no validated critical thinking tests in Arabic. Suitability of the US developed Critical Thinking Assessment Test (CAT) for use in Palestine was assessed. The test was piloted with university students in English (n = 30) and 4 questions were piloted in Arabic…

  42. Cryptographic Key Management and Critical Risk Assessment

    SciTech Connect

    Abercrombie, Robert K

    2014-05-01

    The Department of Energy Office of Electricity Delivery and Energy Reliability (DOE-OE) CyberSecurity for Energy Delivery Systems (CSEDS) industry-led program (DE-FOA-0000359), entitled "Innovation for Increasing CyberSecurity for Energy Delivery Systems (12CSEDS)," awarded a contract to Sypris Electronics LLC to develop a Cryptographic Key Management System for the smart grid (Scalable Key Management Solutions for Critical Infrastructure Protection). As a result of that award, Oak Ridge National Laboratory (ORNL) and Sypris Electronics, LLC entered into a CRADA (NFE-11-03562). ORNL provided its Cyber Security Econometrics System (CSES) as a tool to be modified and used as a metric to address risks and vulnerabilities in the management of cryptographic keys within the Advanced Metering Infrastructure (AMI) domain of the electric sector. ORNL concentrated its analysis on the AMI domain, for which the National Electric Sector Cybersecurity Organization Resource (NESCOR) Working Group 1 (WG1) has documented 29 failure scenarios. The computational infrastructure of this metric involves system stakeholders, security requirements, system components, and security threats. To compute this metric, we estimated the stakes that each stakeholder associates with each security requirement, as well as stochastic matrices that represent the probability of a threat causing a component failure and the probability of a component failure causing a security requirement violation. We applied this model to estimate the security of the AMI by leveraging the recently established National Institute of Standards and Technology Interagency Report (NISTIR) 7628 guidelines for smart grid security and International Electrotechnical Commission (IEC) 62351, Part 9, to identify the life cycle for cryptographic key management, resulting in a vector that assigns to each stakeholder an estimate of their average loss in terms of dollars per day of system…
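
    The chain of stakes and stochastic matrices the abstract describes multiplies out to a per-stakeholder expected loss. A minimal numerical sketch of that computation follows; the matrix shapes and numbers are invented for illustration (the real inputs come from the NESCOR failure scenarios and stakeholder estimates):

      # Mean-failure-cost style computation: stakes times the chain of
      # conditional failure probabilities yields expected dollars/day per
      # stakeholder. All values below are invented for illustration.
      import numpy as np

      # Stakes: dollars/day each stakeholder loses if a requirement fails
      # (rows: stakeholders; columns: security requirements).
      ST = np.array([[900.0, 200.0],
                     [100.0, 800.0]])
      # P(security requirement violated | component fails)
      # (rows: requirements; columns: components).
      DP = np.array([[0.6, 0.1],
                     [0.2, 0.7]])
      # P(component fails | threat materializes)
      # (rows: components; columns: threats).
      IM = np.array([[0.3, 0.0],
                     [0.1, 0.5]])
      # P(threat materializes during one day of operation).
      PT = np.array([0.05, 0.01])

      mean_failure_cost = ST @ DP @ IM @ PT   # dollars/day per stakeholder
      print(mean_failure_cost)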

  43. Computer Interview Problem Assessment of Psychiatric Patients

    PubMed Central

    Angle, Hugh V.; Ellinwood, Everett H.; Carroll, Judith

    1978-01-01

    Behavioral Assessment information, a more general form of Problem-Oriented Record data, appears to have many useful clinical qualities and was selected as the information content for a computer interview system. This interview system was designed to assess the problematic behaviors of psychiatric patients. The computer interview covered 29 life problem areas and took patients from four to eight hours to complete. In two reliability studies, the computer interview was compared to human interviews. A greater number of general and specific patient problems were identified in the computer interview than in the human interviews. The attitudes of the patients who completed the computer interview and of the clinicians who received the computer reports were surveyed.

  44. ACIDIC DEPOSITION PHENOMENON AND ITS EFFECTS: CRITICAL ASSESSMENT DOCUMENT

    EPA Science Inventory

    The Acidic Deposition Phenomenon and Its Effects: Critical Assessment Document (CAD) is a summary, integration, and interpretation of the current scientific understanding of acidic deposition. It is firmly based upon The Acidic Deposition Phenomenon and Its Effects: Critical Asse...

  45. Predictive Dynamic Security Assessment through Advanced Computing

    SciTech Connect

    Huang, Zhenyu; Diao, Ruisheng; Jin, Shuangshuang; Chen, Yousu

    2014-11-30

    Traditional dynamic security assessment is limited by several factors and thus falls short of providing real-time information that is predictive for power system operation. These factors include the steady-state assumption of current operating points, static transfer limits, and low computational speed. This paper addresses these factors and frames predictive dynamic security assessment. The primary objective of predictive dynamic security assessment is to enhance the functionality and computational process of dynamic security assessment through the use of high-speed phasor measurements and the application of advanced computing technologies for faster-than-real-time simulation. This paper presents algorithms, computing platforms, and simulation frameworks that constitute the predictive dynamic security assessment capability. Examples of phasor application and fast computation for dynamic security assessment are included to demonstrate the feasibility and speed enhancement for real-time applications.

  46. Assessing Vulnerabilities, Risks, and Consequences of Damage to Critical Infrastructure

    SciTech Connect

    Suski, N; Wuest, C

    2011-02-04

    The Pre-Assessment Phase brings together infrastructure owners and operators to identify critical assets and help the team create a structured information request. During this phase, we gain information about the critical assets from those who are most familiar with operations and interdependencies, making the time we spend on the ground conducting the assessment much more productive and enabling the team to make actionable recommendations. The Assessment Phase analyzes 10 areas: threat environment, cyber architecture, cyber penetration, physical security, physical penetration, operations security, policies and procedures, interdependencies, consequence analysis, and risk characterization. Each of these individual tasks uses direct and indirect data collection, site inspections, and structured and facilitated workshops to gather data. Because of the importance of understanding the cyber threat, LLNL has built both fixed and mobile cyber penetration, wireless penetration, and supporting tools that can be tailored to fit customer needs. The Post-Assessment Phase brings vulnerability and risk assessments to the customer in a format that facilitates implementation of mitigation options. Often the assessment findings and recommendations are briefed and discussed with several levels of management and, if appropriate, across jurisdictional boundaries. The end result is enhanced awareness and informed protective measures. Over the last 15 years, we have continued to refine our methodology and capture lessons learned and best practices. The resulting risk and decision framework thus takes into consideration real-world constraints, including regulatory, operational, and economic realities. In addition to 'on the ground' assessments focused on mitigating vulnerabilities, we have integrated our computational and atmospheric dispersion capability with easy-to-use geo-referenced visualization tools to support emergency planning and response operations. LLNL is home to the National Atmospheric Release…

  47. A Review of Computer-Assisted Assessment

    ERIC Educational Resources Information Center

    Conole, Grainne; Warburton, Bill

    2005-01-01

    Pressure for better measurement of stated learning outcomes has resulted in a demand for more frequent assessment. The resources available are seen to be static or dwindling, but Information and Communications Technology is seen to increase productivity by automating assessment tasks. This paper reviews computer-assisted assessment (CAA) and…

  48. Assessing Quality of Critical Thought in Online Discussion

    ERIC Educational Resources Information Center

    Weltzer-Ward, Lisa; Baltes, Beate; Lynn, Laura Knight

    2009-01-01

    Purpose: The purpose of this paper is to describe a theoretically based coding framework for an integrated analysis and assessment of critical thinking in online discussion. Design/methodology/approach: The critical thinking assessment framework (TAF) is developed through review of theory and previous research, verified by comparing results to…

  49. A Novel Instrument for Assessing Students' Critical Thinking Abilities

    ERIC Educational Resources Information Center

    White, Brian; Stains, Marilyne; Escriu-Sune, Marta; Medaglia, Eden; Rostamnjad, Leila; Chinn, Clark; Sevian, Hannah

    2011-01-01

    Science literacy involves knowledge of both science content and science process skills. In this study, we describe the Assessment of Critical Thinking Ability survey and its preliminary application to assess the critical thinking skills of undergraduate students, graduate students, and postdoctoral fellows. This survey is based on a complex and…

  50. CRITICAL ASSESSMENT OF AUTOMATED FLOW CYTOMETRY DATA ANALYSIS TECHNIQUES

    PubMed Central

    Aghaeepour, Nima; Finak, Greg; Hoos, Holger; Mosmann, Tim R.; Gottardo, Raphael; Brinkman, Ryan; Scheuermann, Richard H.

    2013-01-01

    Traditional methods for flow cytometry (FCM) data processing rely on subjective manual gating. Recently, several groups have developed computational methods for identifying cell populations in multidimensional FCM data. The Flow Cytometry: Critical Assessment of Population Identification Methods (FlowCAP) challenges were established to compare the performance of these methods on two tasks – mammalian cell population identification to determine if automated algorithms can reproduce expert manual gating, and sample classification to determine if analysis pipelines can identify characteristics that correlate with external variables (e.g., clinical outcome). This analysis presents the results of the first of these challenges. Several methods performed well compared to manual gating or external variables using statistical performance measures, suggesting that automated methods have reached a sufficient level of maturity and accuracy for reliable use in FCM data analysis. PMID:23396282

  51. Critical assessment of automated flow cytometry data analysis techniques.

    PubMed

    Aghaeepour, Nima; Finak, Greg; Hoos, Holger; Mosmann, Tim R; Brinkman, Ryan; Gottardo, Raphael; Scheuermann, Richard H

    2013-03-01

    Traditional methods for flow cytometry (FCM) data processing rely on subjective manual gating. Recently, several groups have developed computational methods for identifying cell populations in multidimensional FCM data. The Flow Cytometry: Critical Assessment of Population Identification Methods (FlowCAP) challenges were established to compare the performance of these methods on two tasks: (i) mammalian cell population identification, to determine whether automated algorithms can reproduce expert manual gating and (ii) sample classification, to determine whether analysis pipelines can identify characteristics that correlate with external variables (such as clinical outcome). This analysis presents the results of the first FlowCAP challenges. Several methods performed well as compared to manual gating or external variables using statistical performance measures, which suggests that automated methods have reached a sufficient level of maturity and accuracy for reliable use in FCM data analysis. PMID:23396282
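
    One of the statistical performance measures commonly used for the population identification task is the F-measure between an expert's manual gate and an automated cluster. The sketch below computes it on invented cell-index sets; FlowCAP's actual scoring aggregates such scores over populations and samples:

      # F-measure between a manual gate and an automated cluster, treating each
      # as a set of cell indices (sets invented for illustration).

      def f_measure(manual, automated):
          tp = len(manual & automated)
          if tp == 0:
              return 0.0
          precision = tp / len(automated)
          recall = tp / len(manual)
          return 2 * precision * recall / (precision + recall)

      manual_gate = {1, 2, 3, 4, 5, 6}   # cell indices gated by the expert
      auto_cluster = {4, 5, 6, 7, 8}     # cells in one automated cluster
      print(round(f_measure(manual_gate, auto_cluster), 3))  # 0.545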

  52. Inequalities, Assessment and Computer Algebra

    ERIC Educational Resources Information Center

    Sangwin, Christopher J.

    2015-01-01

    The goal of this paper is to examine single variable real inequalities that arise as tutorial problems and to examine the extent to which current computer algebra systems (CAS) can (1) automatically solve such problems and (2) determine whether students' own answers to such problems are correct. We review how inequalities arise in…
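
    As an illustration of the two capabilities the abstract names, the snippet below uses the SymPy CAS to solve a single-variable real inequality and to check a hypothetical student answer against the solution set (SymPy is our choice of CAS here; the paper itself surveys several systems):

      import sympy as sp

      x = sp.Symbol('x', real=True)

      # (1) Automatically solve a tutorial-style inequality.
      solution = sp.solve_univariate_inequality(x**2 - 4 < 0, x, relational=False)
      print(solution)                    # Interval.open(-2, 2)

      # (2) Decide whether a student's answer is correct by comparing sets.
      student_answer = sp.Interval.open(-2, 2)
      print(solution == student_answer)  # True -> mark as correct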

  13. The Collegiate Learning Assessment: A Critical Perspective

    ERIC Educational Resources Information Center

    Shermis, Mark D.

    2008-01-01

    This article describes the Collegiate Learning Assessment (CLA), a postsecondary assessment tool designed to evaluate the "value-added" component of institutional contributions to student learning outcomes. Developed by the Council for Aid to Education (CAE), the instrument ostensibly focuses on the contributions of general education coursework…

  14. Bad Actors Criticality Assessment for Pipeline system

    NASA Astrophysics Data System (ADS)

    Nasir, Meseret; Chong, Kit wee; Osman, Sabtuni; Siaw Khur, Wee

    2015-04-01

    Failure of a pipeline system can bring huge economic loss. To mitigate such catastrophic loss, it is necessary to evaluate and rank the impact of each bad actor of the pipeline system. In this study, bad actors are defined as the root causes or any potential factors leading to system downtime. Fault Tree Analysis (FTA) is used to analyze the probability of occurrence of each bad actor. Birnbaum's importance and criticality measure (BICM) is also employed to rank the impact of each bad actor on pipeline system failure. The results demonstrate that internal corrosion, external corrosion, and construction damage are critical and contribute strongly to pipeline system failure, at 48.0%, 12.4%, and 6.0% respectively. Thus, a minor improvement in internal corrosion, external corrosion, or construction damage would bring significant changes in pipeline system performance and reliability. These results could also be used to develop an efficient maintenance strategy by identifying the critical bad actors.
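
    A minimal sketch of the ranking idea described above, assuming a top event that is the OR of independent basic events; the event names and failure probabilities below are illustrative placeholders, not data from the study:

    def top_event_prob(q):
        """P(top event) for an OR gate over independent basic events."""
        p_ok = 1.0
        for qi in q.values():
            p_ok *= (1.0 - qi)
        return 1.0 - p_ok

    def birnbaum(q, name):
        """Birnbaum importance: I_B(i) = Q(q_i = 1) - Q(q_i = 0)."""
        return top_event_prob(dict(q, **{name: 1.0})) - top_event_prob(dict(q, **{name: 0.0}))

    def criticality_importance(q, name):
        """I_C(i) = I_B(i) * q_i / Q: share of system risk driven by event i."""
        return birnbaum(q, name) * q[name] / top_event_prob(q)

    q = {"internal corrosion": 0.05, "external corrosion": 0.02, "construction damage": 0.01}
    for actor in q:
        print(f"{actor}: I_B={birnbaum(q, actor):.4f}, I_C={criticality_importance(q, actor):.4f}")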

  15. Criticality of Water: Aligning Water and Mineral Resources Assessment.

    PubMed

    Sonderegger, Thomas; Pfister, Stephan; Hellweg, Stefanie

    2015-10-20

    The concept of criticality has been used to assess whether a resource may become a limiting factor to economic activities. It has been primarily applied to nonrenewable resources, in particular to metals. However, renewable resources such as water may also be overused and become a limiting factor. In this paper, we therefore developed a water criticality method that allows for a new, user-oriented assessment of water availability and accessibility. Comparability of criticality across resources is desirable, which is why the presented adaptation of the criticality approach to water is based on a metal criticality method, whose basic structure is maintained. With respect to the necessary adaptations to the water context, a transparent water criticality framework is proposed that may pave the way for future integrated criticality assessment of metals, water, and other resources. Water criticality scores were calculated for 159 countries subdivided into 512 geographic units for the year 2000. Results allow for a detailed analysis of criticality profiles, revealing locally specific characteristics of water criticality. This is useful for the screening of sites and their related water criticality, for indication of water related problems and possible mitigation options and water policies, and for future water scenario analysis. PMID:26392153
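
    The paper's indicator set and weighting are not reproduced here; the sketch below only shows the generic shape of such an aggregation, a weighted mean of indicator scores normalized to [0, 1], with hypothetical indicator names and weights:

    def criticality_score(indicators, weights):
        """Weighted mean of normalized indicator scores (illustrative)."""
        assert set(indicators) == set(weights)
        total_weight = sum(weights.values())
        return sum(indicators[k] * weights[k] for k in indicators) / total_weight

    # One geographic unit's normalized scores (invented numbers).
    unit = {"availability": 0.8, "accessibility": 0.4, "regulatory_risk": 0.6}
    weights = {"availability": 0.5, "accessibility": 0.3, "regulatory_risk": 0.2}
    print(f"water criticality: {criticality_score(unit, weights):.2f}")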

  16. RHIC CRITICAL POINT SEARCH: ASSESSING STAR'S CAPABILITIES.

    SciTech Connect

    Sorensen, P.

    2006-07-03

    In this report we discuss the capabilities and limitations of the STAR detector to search for signatures of the QCD critical point in a low energy scan at RHIC. We find that a RHIC low energy scan will cover a broad region of interest in the nuclear matter phase diagram and that the STAR detector--a detector designed to measure the quantities that will be of interest in this search--will provide new observables and improve on previous measurements in this energy range.

  17. A Critical Evaluation of Cognitive Style Assessment.

    ERIC Educational Resources Information Center

    Richter, Ricka

    This document reviews theories of cognitive style and methods of cognitive style assessment as they relate to the context of South Africa, where sociopolitical changes call for reassessment of theoretical assumptions in education and training. The report consists of six chapters. After a brief introductory chapter, the second chapter gives an…

  18. The change in critical technologies for computational physics

    NASA Technical Reports Server (NTRS)

    Watson, Val

    1990-01-01

    It is noted that the types of technology required for computational physics are changing as the field matures. Emphasis has shifted from computer technology to algorithm technology and, finally, to visual analysis technology as areas of critical research for this field. High-performance graphical workstations tied to a supercomputer with high-speed communications, along with the development of specially tailored visualization software, have enabled analysis of highly complex fluid-dynamics simulations. Particular reference is made here to the development of visual analysis tools at NASA's Numerical Aerodynamic Simulation Facility. The next technology this field requires is one that eliminates visual clutter by extracting the key features of physics simulations in order to create displays that clearly portray those key features. Research in the tuning of visual displays to human cognitive abilities is proposed. The immediate transfer of technology to all levels of computers, specifically the inclusion of visualization primitives in basic software development for all workstations and PCs, is recommended.

  19. Fuzzy architecture assessment for critical infrastructure resilience

    SciTech Connect

    Muller, George

    2012-12-01

    This paper presents an approach for the selection of alternative architectures in a connected infrastructure system to increase resilience of the overall infrastructure system. The paper begins with a description of resilience and critical infrastructure, then summarizes existing approaches to resilience, and presents a fuzzy-rule based method of selecting among alternative infrastructure architectures. This methodology includes considerations which are most important when deciding on an approach to resilience. The paper concludes with a proposed approach which builds on existing resilience architecting methods by integrating key system aspects using fuzzy memberships and fuzzy rule sets. This novel approach aids the systems architect in considering resilience for the evaluation of architectures for adoption into the final system architecture.
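
    A minimal sketch of fuzzy-rule scoring of architecture alternatives in the spirit of the approach described above; the membership functions, rules, attribute names, and weights are invented for illustration:

    def tri(x, a, b, c):
        """Triangular membership function on [a, c], peaking at b."""
        if x <= a or x >= c:
            return 0.0
        return (x - a) / (b - a) if x <= b else (c - x) / (c - b)

    def resilience_score(redundancy, coupling):
        """Fire two fuzzy rules and defuzzify by weighted average."""
        # Rule 1: IF redundancy is high AND coupling is low THEN resilience is high.
        r1 = min(tri(redundancy, 0.4, 1.0, 1.6), tri(coupling, -0.6, 0.0, 0.6))
        # Rule 2: IF redundancy is low THEN resilience is low.
        r2 = tri(redundancy, -0.6, 0.0, 0.6)
        # Weighted average of rule outputs (high = 1.0, low = 0.0).
        return (r1 * 1.0 + r2 * 0.0) / (r1 + r2) if (r1 + r2) > 0 else 0.5

    for name, red, coup in [("architecture_A", 0.9, 0.2), ("architecture_B", 0.3, 0.7)]:
        print(name, round(resilience_score(red, coup), 2))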

  20. Assessment of critical-fluid extractions in the process industries

    NASA Technical Reports Server (NTRS)

    1982-01-01

    The potential for critical-fluid extraction as a separation process for improving the productive use of energy in the process industries is assessed. Critical-fluid extraction involves the use of fluids, normally gaseous at ambient conditions, as extraction solvents at temperatures and pressures around the critical point. Equilibrium and kinetic properties in this regime are very favorable for solvent applications and generally allow major reductions in the energy requirements for separating and purifying the chemical components of a mixture.

  1. Assessment as Critical Praxis: A Community College Experience.

    ERIC Educational Resources Information Center

    Cameron, Jeanne; Walsh, Philip; Stavenhagen Helgren, Tina; Kobritz, Barbara

    2002-01-01

    Describes a program for assessing student learning at a community college using critical theories of knowledge and the learning process. Finds all measures of learning indicate significant improvement in learning outcomes. Records and discusses the program methodology within the framework of critical pedagogical theory. (Author/KDR)

  2. Using Writing to Develop and Assess Critical Thinking.

    ERIC Educational Resources Information Center

    Wade, Carole

    1995-01-01

    Asserts that written work has advantages over oral discussion in the development and assessment of students' critical thinking skills. Describes a set of short writing assignments that focuses on eight essential aspects of critical and creative thought. Provides examples of how to use writing assignments in college psychology courses. (CFR)

  3. Assessment of Critical Thinking Ability in Medical Students

    ERIC Educational Resources Information Center

    Macpherson, Karen; Owen, Cathy

    2010-01-01

    In this study conducted with 80 first-year students in a graduate medical course at the Australian National University, Canberra, students' critical thinking skills were assessed using the Watson-Glaser Critical Thinking Appraisal (Forms A and B) in a test-retest design. Results suggested that overall subjects retained consistent patterns of…

  4. Assessment of community contamination: a critical approach.

    PubMed

    Clark, Lauren; Barton, Judith A; Brown, Nancy J

    2002-01-01

    The purpose of this paper is to review data from two Superfund sites and describe the latitude of interpretation of "environmental risk" by residents living in the area, governmental agencies, and the media. The first community was located within a 5-mi perimeter of the Rocky Flats Environmental Technology Site (RFETS) outside Denver, Colorado. The second community was located on the south side of Tucson, Arizona, adjacent to the Tucson International Airport area (TIAA) Superfund site. Critical theory was the perspective used in this analysis and proposal of public health actions to attain social justice. Differences between the two populations' experiences with risk and contamination coincided with divergent levels of trust in government. RFETS residents demanded monitoring, whereas the minority residents at TIAA were ambivalent about their trust in government cleanup activities. Unraveling the purpose of "facts" and the social force of "truth" can direct nurses to address environmental justice issues. By policing governmental and business activities in halting or cleaning up environmental contamination, nurses may become mouthpieces for the concerns underlying the fragile surface of "virtual trust" in contaminated communities. Cutting through competing rhetoric to police environmental safety, the core function of assurance becomes what nurses do, not what they say. PMID:12182695

  5. Software Safety Risk in Legacy Safety-Critical Computer Systems

    NASA Technical Reports Server (NTRS)

    Hill, Janice L.; Baggs, Rhoda

    2007-01-01

    Safety standards contain technical and process-oriented safety requirements. Technical requirements address "must work" and "must not work" functions in the system; process-oriented requirements cover software engineering and safety management processes. Some standards address the system perspective, while others cover only the software within the system. NASA-STD-8719.13B, the Software Safety Standard, is the current standard of interest; NASA programs and projects derive their own sets of safety requirements from it. A safety case is a documented demonstration that a system complies with the specified safety requirements, in which evidence on the integrity of the system is gathered and put forward as an argued case [Gardener (ed.)]. Problems occur when trying to meet safety standards, and thus to make retrospective safety cases, for legacy safety-critical computer systems.

  6. Criticality safety assessment of tank 241-C-106 remediation

    SciTech Connect

    Waltar, A.E., Westinghouse Hanford

    1996-07-19

    A criticality safety assessment was performed in support of Project 320 for the retrieval of waste from tank 241-C-106 to tank 241-AY-102. The assessment was performed by a multidisciplinary team with expertise covering the range of nuclear engineering, plutonium and nuclear waste chemistry, and physical mixing hydraulics. Technical analysis was performed to evaluate the physical and chemical behavior of fissile material in neutralized Hanford waste, as well as modeling of the fluid dynamics of the retrieval activity. The team found no evidence of any credible mechanism for attaining neutronic criticality in either tank and concluded that a criticality accident is incredible.

  7. A COMPUTER-ASSIST MATERIAL TRACKING SYSTEM AS A CRITICALITY SAFETY AID TO OPERATORS

    SciTech Connect

    Claybourn, R V; Huang, S T

    2007-03-30

    In today's compliance-driven environment, fissionable material handlers are inundated with work control rules and procedures in carrying out nuclear operations. Historically, human error has been one of the key contributors to criticality accidents. Since moving and handling fissionable materials are key components of their job functions, any means provided to assist operators in facilitating fissionable material moves will help improve operational efficiency and enhance criticality safety implementation. From the criticality safety perspective, operational issues have been encountered in Lawrence Livermore National Laboratory (LLNL) plutonium operations. Those issues included a lack of adequate historical record keeping for the fissionable material stored in containers, a need for a better way of accommodating operations in a research and development setting, and better means of helping material handlers carry out various criticality safety controls. Through the years, effective measures were implemented, including a better work control process, standardized criticality control conditions (SCCC), and relocation of criticality safety engineers to the plutonium facility. Another important measure was the development of a computer data acquisition system for criticality safety assessment, which is the subject of this paper. The purpose of the Criticality Special Support System (CSSS) is to integrate many of the proven operational support protocols into a software system to assist operators with assessing compliance to procedures during the handling and movement of fissionable materials. Many nuclear facilities utilize mass cards or a computer program to track fissionable material mass data in operations. Additional item-specific data, such as the presence of moderators or close-fitting reflectors, could help fissionable material handlers assess compliance to SCCCs. Computer-assist checking of a workstation material inventory against the
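
    The CSSS itself is not described in implementable detail above, so the following is only a schematic sketch of computer-assist checking of a workstation inventory against a single hypothetical mass limit and moderator condition; the field names and the limit value are assumptions:

    WORKSTATION_LIMIT_G = 450.0   # hypothetical fissionable mass limit for one workstation

    def check_move(inventory, new_item):
        """Return (allowed, reason) for moving new_item to this workstation."""
        total = sum(item["mass_g"] for item in inventory) + new_item["mass_g"]
        if total > WORKSTATION_LIMIT_G:
            return False, f"total {total:.0f} g would exceed the {WORKSTATION_LIMIT_G:.0f} g limit"
        if new_item.get("moderator_present"):
            return False, "moderated item not permitted under this control set"
        return True, "move permitted"

    inventory = [{"id": "PU-0123", "mass_g": 210.0}, {"id": "PU-0456", "mass_g": 175.0}]
    print(check_move(inventory, {"id": "PU-0789", "mass_g": 90.0, "moderator_present": False}))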

  8. Guidelines for a Scientific Approach to Critical Thinking Assessment

    ERIC Educational Resources Information Center

    Bensley, D. Alan; Murtagh, Michael P.

    2012-01-01

    Assessment of student learning outcomes can be a powerful tool for improvement of instruction when a scientific approach is taken; unfortunately, many educators do not take full advantage of this approach. This article examines benefits of taking a scientific approach to critical thinking assessment and proposes guidelines for planning,…

  9. Criticism and Assessment Applied to New Media Art

    ERIC Educational Resources Information Center

    Ursyn, Anna

    2015-01-01

    This text examines educational criticism and assessment with an emphasis on the new media arts. The article shares with readers versatile criteria, abridged to four points, based on research on assessment conducted with students, faculty, and non-art-related professionals, thus providing a preliminary tool for use in the classroom environment.…

  10. Establishing the Critical Elements That Determine Authentic Assessment

    ERIC Educational Resources Information Center

    Ashford-Rowe, Kevin; Herrington, Janice; Brown, Christine

    2014-01-01

    This study sought to determine the critical elements of an authentic learning activity, design them into an applicable framework and then use this framework to guide the design, development and application of work-relevant assessment. Its purpose was to formulate an effective model of task design and assessment. The first phase of the study…

  11. Mobile sources critical review: 1998 NARSTO assessment

    NASA Astrophysics Data System (ADS)

    Sawyer, R. F.; Harley, R. A.; Cadle, S. H.; Norbeck, J. M.; Slott, R.; Bravo, H. A.

    Mobile sources of air pollutants encompass a range of vehicle, engine, and fuel combinations. They emit both of the photochemical ozone precursors, hydrocarbons and oxides of nitrogen. The most important sources of hydrocarbons and oxides of nitrogen are light- and heavy-duty on-road vehicles and heavy-duty off-road vehicles, utilizing spark- and compression-ignition engines burning gasoline and diesel, respectively. Fuel consumption data provide a convenient starting point for assessing current and future emissions. Modern light-duty gasoline vehicles, when new, have very low emissions. The in-use fleet, due largely to emissions from a small "high emitter" fraction, has significantly larger emissions. Hydrocarbon and carbon monoxide emissions are higher than reported in current inventories. Other gasoline-powered mobile sources (motorcycles, recreational vehicles, lawn, garden, and utility equipment, and light aircraft) have high emissions per quantity of fuel consumed, but their contribution to total emissions is small. Additional uncertainties in the spatial and temporal distribution of emissions exist. Heavy-duty diesel vehicles are becoming the dominant mobile source of oxides of nitrogen. Oxides of nitrogen emissions may be greater than reported in current inventories, but the evidence for this is mixed. Oxides of nitrogen emissions on a fuel-consumed basis are much greater from diesel mobile sources than from gasoline mobile sources. This is largely the result of stringent control of gasoline vehicle emissions and lesser (heavy-duty trucks) or no control (construction equipment, locomotives, ships) of heavy-duty mobile sources. The use of alternative fuels, natural gas, propane, alcohols, and oxygenates in motor vehicles is increasing but remains small. Vehicles utilizing these fuels can be, but are not necessarily, cleaner than their gasoline or diesel counterparts. Historical vehicle kilometers traveled growth rates of about 2% annually in both the United States

  12. Critical Assessment of Correction Methods for Fisheye Lens Distortion

    NASA Astrophysics Data System (ADS)

    Liu, Y.; Tian, C.; Huang, Y.

    2016-06-01

    A fisheye lens is widely used to create wide panoramic or hemispherical images. It is an ultra-wide-angle lens that produces strong visual distortion. Modeling and estimating the distortion of a fisheye lens are the crucial steps for fisheye lens calibration and image rectification in computer vision and close-range photogrammetry. There are two kinds of distortion: radial and tangential. Radial distortion is large for fisheye imaging and critical for subsequent image processing. Although many researchers have developed calibration algorithms for the radial distortion of fisheye lenses, quantitative evaluation of correction performance has remained a challenge. This is the first paper that intuitively and objectively evaluates the performance of five different calibration algorithms. Up-to-date research on fisheye lens calibration is comprehensively reviewed to identify the research need. To differentiate their performance in terms of precision and ease of use, the five methods are tested using a diverse set of actual images of a checkerboard taken at Wuhan University, China under varying lighting conditions, shadows, and shooting angles. The rational function model, which is generally used for wide-angle lens correction, outperforms the other methods. However, the one-parameter division model is easy to use in practice without compromising precision too much, because it depends on the linear structure in the image and requires no preceding calibration; it represents a tradeoff between correction precision and ease of use. By critically assessing the strengths and limitations of the existing algorithms, the paper provides valuable insight and guidelines for future practice and algorithm development that are important for fisheye lens calibration. It is promising for the optimal design of lens correction models suitable for the millions of portable imaging devices.
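
    For concreteness, a sketch of the one-parameter division model mentioned above, which maps a distorted radius r_d to an undistorted one via r_u = r_d / (1 + k * r_d^2); the distortion center and coefficient below are illustrative assumptions, not values from the paper:

    import numpy as np

    def undistort_division(points, center, k):
        """Map distorted pixel coordinates to undistorted ones (division model)."""
        p = np.asarray(points, dtype=float) - center   # coordinates relative to center
        r2 = np.sum(p**2, axis=1, keepdims=True)       # squared distorted radius
        return p / (1.0 + k * r2) + center

    pts = [(900.0, 100.0), (640.0, 480.0)]             # sample distorted points
    print(undistort_division(pts, center=np.array([640.0, 480.0]), k=-1e-7))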

  13. Assessing Moderator Variables: Two Computer Simulation Studies.

    ERIC Educational Resources Information Center

    Mason, Craig A.; And Others

    1996-01-01

    A strategy is proposed for conceptualizing moderating relationships based on their type (strictly correlational and classically correlational) and form, whether continuous, noncontinuous, logistic, or quantum. Results of computer simulations comparing three statistical approaches for assessing moderator variables are presented, and advantages of…

  14. Assessment of (Computer-Supported) Collaborative Learning

    ERIC Educational Resources Information Center

    Strijbos, J. -W.

    2011-01-01

    Within the (Computer-Supported) Collaborative Learning (CS)CL research community, there has been an extensive dialogue on theories and perspectives on learning from collaboration, approaches to scaffold (script) the collaborative process, and most recently research methodology. In contrast, the issue of assessment of collaborative learning has…

  15. APPLICATION OF FETAX IN ECOLOGICAL RISK ASSESSMENTS: A CRITICAL ASSESSMENT

    EPA Science Inventory

    A workshop sponsored by NIEHS in 2000 evaluated the use of FETAX as a screening method for identifying the developmental toxicity potential of chemical and environmental samples. Workshop recommendations pertinent to environmental risk assessment suggested that additional comparat...

  16. Research on computer aided testing of pilot response to critical in-flight events

    NASA Technical Reports Server (NTRS)

    Giffin, W. C.; Rockwell, T. H.; Smith, P. J.

    1984-01-01

    Experiments on pilot decision making are described. The development of models of pilot decision making in critical in-flight events (CIFE) is emphasized. Results are reported on the development of: (1) a frame-system representation describing how pilots use their knowledge in a fault-diagnosis task; (2) assessment of script norms, distance measures, and Markov models developed from computer-aided testing (CAT) data; and (3) performance ranking of subject data. It is demonstrated that interactive computer-aided testing, whether by touch CRTs or personal computers, is a useful research and training device for measuring pilot information management in diagnosing system failures in simulated flight situations. Performance is dictated by knowledge of aircraft subsystems, initial pilot structuring of the failure symptoms, and efficient testing of plausible causal hypotheses.

  17. Computer assisted blast design and assessment tools

    SciTech Connect

    Cameron, A.R.; Kleine, T.H.; Forsyth, W.W.

    1995-12-31

    In general, the software required by a blast designer includes tools that graphically present blast designs (surface and underground), can analyze a design or predict its result, and can assess blasting results. As computers develop and computer literacy continues to rise, the development and use of such tools will spread. Examples of the tools that are becoming available include: automatic blast pattern generation and underground ring design; blast design evaluation in terms of explosive distribution and detonation simulation; fragmentation prediction; blast vibration prediction and minimization; blast monitoring for assessment of dynamic performance; vibration measurement, display, and signal processing; evaluation of blast results in terms of fragmentation; and risk- and reliability-based blast assessment. The authors have identified a set of criteria that are essential in choosing appropriate software blasting tools.

  18. Critical thinking traits of top-tier experts and implications for computer science education

    NASA Astrophysics Data System (ADS)

    Bushey, Dean E.

    Findings of this study suggest a need to examine how critical-thinking abilities are learned in the undergraduate computer science curriculum and the need to foster these abilities in order to produce the high-level, critical-thinking professionals necessary to fill the growing need for such experts. Because current measures of academic performance do not adequately depict students' cognitive abilities, assessment of these skills must be incorporated into existing curricula.

  19. Computer Software Training and HRD: What Are the Critical Issues?

    ERIC Educational Resources Information Center

    Altemeyer, Brad

    2005-01-01

    The paper explores critical issues for HRD practice from a Parsonian framework across the HRD legs of organizational development, adult learning, and training and development. Insights into the critical issues emerge from this approach, identifying successful transfer of training as critical for organizational, group, and individual success.…

  20. Accessible high performance computing solutions for near real-time image processing for time critical applications

    NASA Astrophysics Data System (ADS)

    Bielski, Conrad; Lemoine, Guido; Syryczynski, Jacek

    2009-09-01

    High Performance Computing (HPC) hardware solutions such as grid computing and General-Purpose computing on a Graphics Processing Unit (GPGPU) are now accessible to users with general computing needs. Grid computing infrastructures in the form of computing clusters or blades are becoming commonplace, and GPGPU solutions that leverage the processing power of the video card are quickly being integrated into personal workstations. Our interest in these HPC technologies stems from the need to produce near real-time maps from a combination of pre- and post-event satellite imagery in support of post-disaster management. Faster processing provides a twofold gain in this situation: (1) critical information can be provided sooner, and (2) more elaborate automated processing can be performed before the critical information is provided. In our particular case, we test the use of the PANTEX index, which is based on the analysis of image textural measures extracted using anisotropic, rotation-invariant GLCM statistics. The use of this index, applied in a moving window, has been shown to successfully identify built-up areas in remotely sensed imagery. Built-up index image masks are important input to the structuring of damage assessment interpretation because they help optimise the workload. The performance of computing the PANTEX workflow is compared on two different HPC hardware architectures: (1) a blade server with 4 blades, each having dual quad-core CPUs, and (2) a CUDA-enabled GPU workstation. The reference platform is a dual-CPU quad-core workstation, and the total computing time of the PANTEX workflow is measured. Furthermore, as part of a qualitative evaluation, the differences in setting up and configuring the various hardware solutions and the related software coding effort are presented.
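
    A single-window sketch in the spirit of the GLCM-based measure described above, taking the minimum contrast over several displacements as a rotation-invariant texture score (one common choice; the operational PANTEX parameters are not reproduced here). Assumes scikit-image 0.19 or later for these function names:

    import numpy as np
    from skimage.feature import graycomatrix, graycoprops

    def builtup_score(window, levels=32):
        """Min GLCM contrast over displacements; higher suggests built-up texture."""
        q = (window.astype(float) / window.max() * (levels - 1)).astype(np.uint8)
        glcm = graycomatrix(q, distances=[1, 2],
                            angles=[0, np.pi / 4, np.pi / 2, 3 * np.pi / 4],
                            levels=levels, symmetric=True, normed=True)
        return graycoprops(glcm, "contrast").min()

    window = np.random.randint(0, 255, (64, 64)).astype(np.uint8)  # toy image patch
    print(f"texture score: {builtup_score(window):.2f}")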

  1. An Exploration of Three-Dimensional Integrated Assessment for Computational Thinking

    ERIC Educational Resources Information Center

    Zhong, Baichang; Wang, Qiyun; Chen, Jie; Li, Yi

    2016-01-01

    Computational thinking (CT) is a fundamental skill for students, and assessment is a critical factor in education. However, there is a lack of effective approaches to CT assessment. Therefore, we designed the Three-Dimensional Integrated Assessment (TDIA) framework in this article. The TDIA has two aims: one was to integrate three dimensions…

  2. Effects of Computer-Aided Personalized System of Instruction in Developing Knowledge and Critical Thinking in Blended Learning Courses

    ERIC Educational Resources Information Center

    Svenningsen, Louis; Pear, Joseph J.

    2011-01-01

    Two experiments were conducted to assess an online version of Keller's personalized system of instruction, called computer-aided personalized system of instruction (CAPSI), as part of a blended learning design with regard to course knowledge and critical thinking development. In Experiment 1, two lecture sections of an introduction to University…

  3. Total Quality Management in Higher Education: A Critical Assessment.

    ERIC Educational Resources Information Center

    Seymour, Daniel; Collett, Casey

    This study attempted a comprehensive, critical assessment of Total Quality Management (TQM) initiatives in higher education. A survey of 25 institutions (including community colleges, private four-year colleges and universities, and public colleges) with experience with TQM was developed and used. The survey utilized attitude scales designed to…

  4. Antiracist Education in Theory and Practice: A Critical Assessment

    ERIC Educational Resources Information Center

    Niemonen, Jack

    2007-01-01

    "Antiracist Education in Theory and Practice: A Critical Assessment" As a set of pedagogical, curricular, and organizational strategies, antiracist education claims to be the most progressive way today to understand race relations. Constructed from whiteness studies and the critique of colorblindness, its foundational core is located in…

  5. What Is a Good School? Critical Thoughts about Curriculum Assessments

    ERIC Educational Resources Information Center

    Zierer, Klaus

    2013-01-01

    Within the educational field, measurements such as the Programme for International Student Assessment (PISA), the Trends in International Mathematics and Science Study (TIMSS), and the Progress in International Reading Literacy Study (PIRLS) suggest we are living in a time of competition. This article takes a critical view of the modern drive to…

  6. Conceptualising, Developing and Assessing Critical Thinking in Law

    ERIC Educational Resources Information Center

    James, Nickolas; Hughes, Clair; Cappa, Clare

    2010-01-01

    "Critical thinking" is commonly included in the lists of graduate attributes (GAs), which all Australian universities are now required to develop and implement. That efforts to do so have met with limited success is due to a range of factors including inconsistent or naive conceptualisations, the failure to explicitly develop or assess GAs, and…

  7. Implementation and Critical Assessment of the Flipped Classroom Experience

    ERIC Educational Resources Information Center

    Scheg, Abigail G., Ed.

    2015-01-01

    In the past decade, traditional classroom teaching models have been transformed in order to better promote active learning and learner engagement. "Implementation and Critical Assessment of the Flipped Classroom Experience" seeks to capture the momentum of non-traditional teaching methods and provide a necessary resource for individuals…

  8. Critical issues using brain-computer interfaces for augmentative and alternative communication.

    PubMed

    Hill, Katya; Kovacs, Thomas; Shin, Sangeun

    2015-03-01

    Brain-computer interfaces (BCIs) may potentially be of significant practical value to patients in advanced stages of amyotrophic lateral sclerosis and locked-in syndrome for whom conventional augmentative and alternative communication (AAC) systems, which require some measure of consistent voluntary muscle control, are not satisfactory options. However, BCIs have primarily been used for communication in laboratory research settings. This article discusses 4 critical issues that should be addressed as BCIs are translated out of laboratory settings to become fully functional BCI/AAC systems that may be implemented clinically. These issues include (1) identification of primary, secondary, and tertiary system features; (2) integrating BCI/AAC systems in the World Health Organization's International Classification of Functioning, Disability and Health framework; (3) implementing language-based assessment and intervention; and (4) performance measurement. A clinical demonstration project is presented as an example of research beginning to address these critical issues. PMID:25721552

  9. Assessing Terrorist Motivations for Attacking Critical "Chemical" Infrastructure

    SciTech Connect

    Ackerman, G; Bale, J; Moran, K

    2004-12-14

    Certain types of infrastructure--critical infrastructure (CI)--play vital roles in underpinning our economy, security, and way of life. One particular type of CI--that relating to chemicals--constitutes both an important element of our nation's infrastructure and a particularly attractive set of potential targets. This is primarily because of the large quantities of toxic industrial chemicals (TICs) it employs in various operations and because of the essential economic functions it serves. This study attempts to minimize some of the ambiguities that presently impede chemical infrastructure threat assessments by providing new insight into the key motivational factors that affect terrorist organizations' propensity to attack chemical facilities. Prepared as a companion piece to the Center for Nonproliferation Studies' August 2004 study--"Assessing Terrorist Motivations for Attacking Critical Infrastructure"--it investigates three overarching research questions: (1) why do terrorists choose to attack chemical-related infrastructure over other targets; (2) what specific factors influence their target selection decisions concerning chemical facilities; and (3) which, if any, types of groups are most inclined to attack chemical infrastructure targets? The study involved a multi-pronged research design, which made use of four discrete investigative techniques to answer the above questions as comprehensively as possible. These include: (1) a review of terrorism and threat assessment literature to glean expert consensus regarding terrorist interest in targeting chemical facilities; (2) the preparation of case studies to help identify internal group factors and contextual influences that have played a significant role in leading some terrorist groups to attack chemical facilities; (3) an examination of data from the Critical Infrastructure Terrorist Incident Catalog (CrITIC) to further illuminate the nature of terrorist attacks against chemical facilities to date; and (4

  10. Computational methods for criticality safety analysis within the SCALE system

    SciTech Connect

    Parks, C.V.; Petrie, L.M.; Landers, N.F.; Bucholz, J.A.

    1986-01-01

    The criticality safety analysis capabilities within the SCALE system are centered around the Monte Carlo codes KENO IV and KENO V.a, which are both included in SCALE as functional modules. The XSDRNPM-S module is also an important tool within SCALE for obtaining multiplication factors for one-dimensional system models. This paper reviews the features and modeling capabilities of these codes along with their implementation within the Criticality Safety Analysis Sequences (CSAS) of SCALE. The CSAS modules provide automated cross-section processing and user-friendly input that allow criticality safety analyses to be done in an efficient and accurate manner. 14 refs., 2 figs., 3 tabs.
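
    As a back-of-the-envelope illustration of the quantity such codes compute, the one-group infinite-medium multiplication factor reduces to k_inf = nu * Sigma_f / Sigma_a; the cross sections below are illustrative numbers, not evaluated data, and real analyses require the full multigroup transport treatment described above:

    NU = 2.43          # average neutrons per fission (thermal U-235, approximate)
    SIGMA_F = 0.090    # macroscopic fission cross section, 1/cm (illustrative)
    SIGMA_A = 0.180    # macroscopic absorption cross section, 1/cm (illustrative)

    k_inf = NU * SIGMA_F / SIGMA_A
    print(f"k_inf = {k_inf:.3f}")  # k_inf > 1.0 indicates a supercritical infinite medium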

  11. Assessment of computational prediction of tail buffeting

    NASA Technical Reports Server (NTRS)

    Edwards, John W.

    1990-01-01

    Assessments of the viability of computational methods and the computer resource requirements for the prediction of tail buffeting are made. Issues involved in the use of Euler and Navier-Stokes equations in modeling vortex-dominated and buffet flows are discussed and the requirement for sufficient grid density to allow accurate, converged calculations is stressed. Areas in need of basic fluid dynamics research are highlighted: vorticity convection, vortex breakdown, dynamic turbulence modeling for free shear layers, unsteady flow separation for moderately swept, rounded leading-edge wings, vortex flows about wings at high subsonic speeds. An estimate of the computer run time for a buffeting response calculation for a full span F-15 aircraft indicates that an improvement in computer and/or algorithm efficiency of three orders of magnitude is needed to enable routine use of such methods. Attention is also drawn to significant uncertainties in the estimates, in particular with regard to nonlinearities contained within the modeling and the question of the repeatability or randomness of buffeting response.

  12. The Role of Computer Assisted Fluid Balance in Critical Care

    PubMed Central

    Ciccolella, Sergio A.; Halloran, Mark J.; Brimm, John E.; O'Hara, Michael R.

    1978-01-01

    Computational, reporting, and database management needs, along with growth in sophistication, have propelled the application of computers in medicine. These elements satisfy specific clinical needs in the fluid balance program design that was undertaken. Significant potential exists for extending the computer's intervention by using available transducing techniques to obtain information that is currently derived manually. Thus, the design currently satisfies the goal of maximizing information while minimizing labor-intensive overhead and will continue to evolve in that direction.

  13. VOXMAT: Hybrid Computational Phantom for Dose Assessment

    SciTech Connect

    Akkurt, Hatice; Eckerman, Keith F

    2007-01-01

    The Oak Ridge National Laboratory (ORNL) computational phantoms have been the standard for assessing the radiation dose due to internal and external exposure over the past three decades. In these phantoms, the body surface and each organ are approximated by mathematical equations; hence, some of the organs are not necessarily realistic in their shape. Over the past two decades, these phantoms have been revised and updated: some of the missing internal organs have been added and the locations of the existing organs have been revised (e.g., the thyroid). In the original phantom, only three elemental compositions were used to describe all body tissues. Recently, the compositions of the organs have been updated based on ICRP-89 standards. During the past decade, phantoms based on CT scans were developed for use in dose assessment. Although their shapes are realistic, some computational challenges are noted, including increased computational times and increased memory requirements. For good spatial resolution, more than several million voxels are used to represent the human body. Moreover, when CT scans are obtained, the subject is in a supine position with arms at the side. In some occupational exposure cases, it is necessary to evaluate the dose with the arms and legs in different positions. It would be very difficult and inefficient to reposition the voxels defining the arms and legs to simulate these exposure geometries. In this paper, a new approach to computational phantom development is presented. This approach utilizes the combination of a mathematical phantom and a voxelized phantom for the representation of the anatomy.

  14. Critical thinking: assessing the risks to the future security of supply of critical metals

    NASA Astrophysics Data System (ADS)

    Gunn, Gus

    2015-04-01

    Increasing world population, the spread of prosperity across the globe and the demands of new technologies have led to a revival of concerns about the availability of raw materials needed by society. Despite scare stories about resource depletion, physical exhaustion of minerals is considered to be unlikely. However, we do need to know which materials might be of concern so that we can develop strategies to secure adequate supplies and to mitigate the effects of supply disruption. This requirement has led to renewed interest in criticality, a term that is generally used to refer to metals and minerals of high economic importance that have a relatively high likelihood of supply disruption. The European Union (EU) developed a quantitative methodology for the assessment of criticality which led to the definition of 14 raw materials as critical to the EU economy (EC, 2010). This has succeeded in raising awareness of potential supply issues and in helping to prioritise requirements for new policies and supporting research. The EU has recently assessed a larger number of candidate materials of which 20 are now identified as critical to the EU (EC, 2014). These include metals such as indium, mostly used in flat-screen displays, antimony for flame retardants and cobalt for rechargeable batteries, alloys and a host of other products. Although there is no consensus on the methodology for criticality assessments and broad analyses at this scale are inevitably imperfect, they can, nevertheless, provide early warning of supply problems. However, in order to develop more rigorous and dynamic assessments of future availability detailed analysis of the whole life-cycle of individual metals to identify specific problems and develop appropriate solutions is required. New policies, such as the Raw Materials Initiative (2008) and the European Innovation Partnership on Raw Materials (2013), have been developed by the European Commission (EC) and are aimed at securing sustainable

  15. Some Techniques for Computer-Based Assessment in Medical Education.

    ERIC Educational Resources Information Center

    Mooney, G. A.; Bligh, J. G.; Leinster, S. J.

    1998-01-01

    Presents a system of classification for describing computer-based assessment techniques based on the level of action and educational activity they offer. Illustrates 10 computer-based assessment techniques and discusses their educational value. Contains 14 references. (Author)

  16. Geospatial decision support framework for critical infrastructure interdependency assessment

    NASA Astrophysics Data System (ADS)

    Shih, Chung Yan

    Critical infrastructures, such as telecommunications, energy, banking and finance, transportation, water systems, and emergency services, are the foundations of modern society. There is a heavy dependence on critical infrastructures at multiple levels within the supply chain of any good or service. Any disruption in the supply chain may cause profound cascading effects on other critical infrastructures. A 1997 report by the President's Commission on Critical Infrastructure Protection states that a serious interruption in freight rail service would bring the coal mining industry to a halt within approximately two weeks and that the availability of electric power could be reduced in a matter of one to two months. Therefore, this research aimed at representing and assessing the interdependencies between coal supply, transportation, and energy production. A proposed geospatial decision support framework was established and applied to analyze interdependency-related disruption impacts. By utilizing the data warehousing approach, geospatial and non-geospatial data were retrieved, integrated, and analyzed based on the transportation model and geospatial disruption analysis developed in the research. The results showed that by utilizing this framework, disruption impacts can be estimated at various levels (e.g., power plant, county, state) for preventative or emergency response efforts. The information derived from the framework can be used for data mining analysis (e.g., assessing transportation mode usage; finding alternative coal suppliers).
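
    A toy sketch of the kind of disruption analysis described above, using an invented coal-supply network and the networkx package: remove a rail link and compare the maximum coal flow that can still reach a plant. Node names and capacities are illustrative, not data from the study:

    import networkx as nx

    G = nx.DiGraph()
    G.add_edge("mine_A", "rail_1", capacity=100)
    G.add_edge("mine_B", "rail_1", capacity=60)
    G.add_edge("mine_B", "barge", capacity=40)
    G.add_edge("rail_1", "plant", capacity=120)
    G.add_edge("barge", "plant", capacity=40)
    # Tie both mines to a super-source; edges without a capacity attribute
    # are treated as unbounded by networkx's max-flow routines.
    G.add_edge("source", "mine_A")
    G.add_edge("source", "mine_B")

    baseline, _ = nx.maximum_flow(G, "source", "plant")
    G.remove_edge("rail_1", "plant")          # simulated rail disruption
    degraded, _ = nx.maximum_flow(G, "source", "plant")
    print(f"deliverable coal: {baseline} -> {degraded} units")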

  17. Critical assessment of phospholipid measurement in amniotic fluid.

    PubMed

    Badham, L P; Worth, H G

    1975-09-01

    We assessed several methods of inorganic phosphate assay for their suitability in estimating phospholipids in digested extracts of amniotic fluids. The extraction and digestion procedures used for phospholipids from amniotic fluid were also examined critically. The effect of contamination by blood or obstetric cream has been examined. Accordingly, we suggest a method for measuring total phospholipids in amniotic fluids, and results of it are compared with the lecithin/sphingomyelin ratio measurement in some clinical situations. PMID:1157310

  18. Critical evaluation of oxygen-uptake assessment in swimming.

    PubMed

    Sousa, Ana; Figueiredo, Pedro; Pendergast, David; Kjendlie, Per-Ludvik; Vilas-Boas, João P; Fernandes, Ricardo J

    2014-03-01

    Swimming has become an important area of sport science research since the 1970s, with bioenergetic factors assuming a fundamental performance-influencing role. The purpose of this study was to conduct a critical evaluation of the literature concerning oxygen-uptake (VO2) assessment in swimming, by describing the equipment and methods used and emphasizing recent work conducted in ecological conditions. Particularly in swimming, due to the inherent technical constraints imposed by exercising in a water environment, assessment of VO2max was not accomplished until the 1960s. Later, the development of automated portable measurement devices allowed VO2max to be assessed more easily, even in ecological swimming conditions, but few studies have been conducted in swimming-pool conditions with portable breath-by-breath telemetric systems. An inverse relationship exists between the velocity corresponding to VO2max and the time a swimmer can sustain this velocity. The energy cost of swimming varies according to its association with velocity variability. As, in the end, the supply of oxygen (whose limitation may be due to central factors, i.e., O2 delivery and transport to the working muscles, or peripheral factors, i.e., O2 diffusion and utilization in the muscles) is one of the critical factors that determine swimming performance, VO2 kinetics and its maximal values are critical to understanding swimmers' behavior in competition and to developing efficient training programs. PMID:24414133
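
    Two textbook quantities discussed above can be made concrete with a short sketch: a mono-exponential VO2 on-kinetics model and the energy cost of swimming expressed per unit distance. All parameter values are illustrative, not data from the review:

    import math

    def vo2_on_kinetics(t, baseline, amplitude, time_delay, tau):
        """Mono-exponential on-kinetics: VO2(t) = b + A * (1 - exp(-(t - TD)/tau))."""
        if t < time_delay:
            return baseline
        return baseline + amplitude * (1.0 - math.exp(-(t - time_delay) / tau))

    vo2 = vo2_on_kinetics(t=120, baseline=5.0, amplitude=45.0, time_delay=15.0, tau=25.0)  # ml/kg/min
    velocity = 1.4                       # swimming velocity, m/s
    energy_cost = vo2 / (velocity * 60)  # ml O2 per kg per metre swum
    print(f"VO2 = {vo2:.1f} ml/kg/min, energy cost = {energy_cost:.2f} ml/kg/m")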

  19. Computational Tools to Assess Turbine Biological Performance

    SciTech Connect

    Richmond, Marshall C.; Serkowski, John A.; Rakowski, Cynthia L.; Strickler, Brad; Weisbeck, Molly; Dotson, Curtis L.

    2014-07-24

    Public Utility District No. 2 of Grant County (GCPUD) operates the Priest Rapids Dam (PRD), a hydroelectric facility on the Columbia River in Washington State. The dam contains 10 Kaplan-type turbine units that are now more than 50 years old. Plans are underway to refit these aging turbines with new runners. The Columbia River at PRD is a migratory pathway for several species of juvenile and adult salmonids, so passage of fish through the dam is a major consideration when upgrading the turbines. In this paper, a method for turbine biological performance assessment (BioPA) is demonstrated. Using this method, a suite of biological performance indicators is computed based on simulated data from a CFD model of a proposed turbine design. Each performance indicator is a measure of the probability of exposure to a certain dose of an injury mechanism. Using known relationships between the dose of an injury mechanism and frequency of injury (dose–response) from laboratory or field studies, the likelihood of fish injury for a turbine design can be computed from the performance indicator. By comparing the values of the indicators from proposed designs, the engineer can identify the more-promising alternatives. We present an application of the BioPA method for baseline risk assessment calculations for the existing Kaplan turbines at PRD that will be used as the minimum biological performance that a proposed new design must achieve.
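
    A schematic sketch of the BioPA combination step as described above: the simulated probability of exposure to each dose bin of an injury mechanism is folded with a laboratory dose-response curve to estimate the likelihood of fish injury. All numbers are invented placeholders:

    # P(exposure to dose bin) from simulated CFD particle statistics (hypothetical)
    exposure = {"low": 0.85, "medium": 0.12, "high": 0.03}
    # P(injury | dose bin) from laboratory dose-response studies (hypothetical)
    injury_given_dose = {"low": 0.001, "medium": 0.05, "high": 0.40}

    p_injury = sum(exposure[d] * injury_given_dose[d] for d in exposure)
    print(f"estimated injury probability per passage: {p_injury:.4f}")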

  20. Laptop Computer - Based Facial Recognition System Assessment

    SciTech Connect

    R. A. Cain; G. B. Singleton

    2001-03-01

    The objective of this project was to assess the performance of the leading commercial-off-the-shelf (COTS) facial recognition software package when used as a laptop application. We performed the assessment to determine the system's usefulness for enrolling facial images in a database from remote locations and conducting real-time searches against a database of previously enrolled images. The assessment involved creating a database of 40 images and conducting 2 series of tests to determine the product's ability to recognize and match subject faces under varying conditions. This report describes the test results and includes a description of the factors affecting the results. After an extensive market survey and a review of the Facial Recognition Vendor Test 2000 (FRVT 2000), we selected Visionics' FaceIt® software package for evaluation. The FRVT 2000 was co-sponsored by the US Department of Defense (DOD) Counterdrug Technology Development Program Office, the National Institute of Justice, and the Defense Advanced Research Projects Agency (DARPA). Administered in May-June 2000, the FRVT 2000 assessed the capabilities of facial recognition systems that were then available for purchase on the US market. Our selection of this Visionics product does not indicate that it is the "best" facial recognition software package for all uses; it was the most appropriate package based on the specific applications and requirements for this specific application. In this assessment, the system configuration was evaluated for effectiveness in identifying individuals by searching for facial images captured from video displays against those stored in a facial image database. An additional criterion was that the system be capable of operating discretely. For this application, an operational facial recognition system would consist of one central computer hosting the master image database with multiple standalone systems configured with duplicates of the master operating in

  1. Man-Computer Symbiosis Through Interactive Graphics: A Survey and Identification of Critical Research Areas.

    ERIC Educational Resources Information Center

    Knoop, Patricia A.

    The purpose of this report was to determine the research areas that appear most critical to achieving man-computer symbiosis. An operational definition of man-computer symbiosis was developed by: (1) reviewing and summarizing what others have said about it, and (2) attempting to distinguish it from other types of man-computer relationships. From…

  2. Critical Issues Forum: A multidisciplinary educational program integrating computer technology

    SciTech Connect

    Alexander, R.J.; Robertson, B.; Jacobs, D.

    1998-09-01

    The Critical Issues Forum (CIF), funded by the US Department of Energy, is a collaborative effort between the Science Education Team of Los Alamos National Laboratory (LANL) and New Mexico high schools to improve science education throughout the state of New Mexico as well as nationally. By creating an education relationship between LANL, with its unique scientific resources, and New Mexico high schools, students and teachers participate in programs that increase not only their science content knowledge but also their critical thinking and problem-solving skills. The CIF program focuses on current, globally oriented topics crucial to the security not only of the US but of all nations. The CIF is an academic-year program that involves both teachers and students in the process of seeking solutions for real-world concerns. Built around issues tied to LANL's mission, participating students and teachers are asked to critically investigate and examine the interactions among the political, social, economic, and scientific domains while considering diversity issues that include geopolitical entities and cultural and ethnic groupings. Participants are expected to collaborate through telecommunications during the research phase and participate in a culminating multimedia activity, where they produce and deliver recommendations for the current issues being studied. The CIF was evaluated and found to be an effective approach for teacher professional training, especially in the development of skills for critical thinking and questioning. The CIF contributed to students' ability to integrate diverse disciplinary content about science-related topics and supported teachers in facilitating the understanding of their students using the CIF approach. Networking technology in CIF has been used as an information repository, resource delivery mechanism, and communication medium.

  3. Quality assessment of clinical computed tomography

    NASA Astrophysics Data System (ADS)

    Berndt, Dorothea; Luckow, Marlen; Lambrecht, J. Thomas; Beckmann, Felix; Müller, Bert

    2008-08-01

    Three-dimensional images are vital for diagnosis in dentistry and cranio-maxillofacial surgery. Artifacts caused by highly absorbing components such as metallic implants, however, limit the value of the tomograms. The dominant artifacts observed are blowout and streaks. By investigating the artifacts generated by metallic implants in a pig jaw, the data acquisition for dental patients should be optimized in a quantitative manner. A freshly explanted pig jaw, including related soft tissues, served as a model system. Images were recorded varying the accelerating voltage and the beam current. The comparison with multi-slice and micro computed tomography (CT) helps to validate the approach with the dental CT system (3D-Accuitomo, Morita, Japan). The data are rigidly registered to comparatively quantify their quality. The micro CT data provide a reasonable standard for quantitative data assessment of clinical CT.

  4. Transesophageal echocardiographic assessment in trauma and critical care

    PubMed Central

    Tousignant, Claude

    1999-01-01

    Cardiac ultrasonography, in particular transesophageal echocardiography (TEE) provides high-quality real-time images of the beating heart and mediastinal structures. The addition of Doppler technology introduces a qualitative and quantitative assessment of blood flow in the heart and vascular structures. Because of its ease of insertion and ready accessibility, TEE has become an important tool in the routine management of critically ill patients, as a monitor in certain operative settings and in the aortic and cardiac evaluation of trauma patients. The rapid assessment of cardiac preload, contractility and valve function are invaluable in patients with acute hemodynamic decompensation in the intensive care unit as well as in the operating room. Because of its ease and portability, the TEE assessment of traumatic aortic injury after blunt chest trauma can be rapidly undertaken even in patients undergoing life-saving procedures. The role of TEE in the surgical and critical care setting will no doubt increase as more people become aware of its potential. PMID:10372012

  5. Computer-assisted learning in critical care: from ENIAC to HAL.

    PubMed

    Tegtmeyer, K; Ibsen, L; Goldstein, B

    2001-08-01

    Computers are commonly used to serve many functions in today's modern intensive care unit. One of the most intriguing and perhaps most challenging applications of computers has been the attempt to improve medical education. With the introduction of the first computer, medical educators began looking for ways to incorporate its use into the modern curriculum. The cost and complexity of computers, which once limited their use, have decreased steadily since their introduction, making it increasingly feasible to incorporate computers into medical education. Simultaneously, the capabilities and capacities of computers have increased. Combining the computer with other modern digital technology has allowed the development of more intricate and realistic educational tools. The purpose of this article is to briefly describe the history and use of computers in medical education with special reference to critical care medicine. In addition, we examine the role of computers in teaching and learning and discuss the types of interaction between the computer user and the computer. PMID:11496040

  6. Adaptive critic design for computer intrusion detection system

    NASA Astrophysics Data System (ADS)

    Novokhodko, Alexander; Wunsch, Donald C., II; Dagli, Cihan H.

    2001-03-01

    This paper summarizes ongoing research. A neural network is used to detect a computer system intrusion based on data from the system audit trail generated by the Solaris Basic Security Module. The data have been provided by Lincoln Labs, MIT. The system alerts the human operator when it encounters suspicious activity logged in the audit trail. To reduce the false-alarm rate and accommodate the temporal uncertainty of the moment of attack, a reinforcement learning approach is chosen to train the network.
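
    The paper's actual network and training scheme are not reproduced here; the toy sketch below only illustrates the alerting idea, scoring audit-record feature vectors and adjusting the alert threshold from false-alarm and miss feedback in a loosely reinforcement-style way. Feature meanings and all values are assumptions:

    import numpy as np

    rng = np.random.default_rng(0)
    weights = rng.normal(size=4)   # toy feature weights (untrained)
    threshold = 0.5

    def suspicion(x):
        """Suspicion score in (0, 1) for one audit-record feature vector."""
        return 1.0 / (1.0 + np.exp(-weights @ x))

    def update_threshold(alerted, was_attack, lr=0.05):
        """Reward-driven tweak: false alarms raise the alert threshold,
        missed attacks lower it."""
        global threshold
        if alerted and not was_attack:
            threshold += lr
        elif not alerted and was_attack:
            threshold -= lr

    x = rng.normal(size=4)         # e.g. counts of failed logins, setuid calls, ...
    alerted = suspicion(x) > threshold
    update_threshold(alerted, was_attack=False)
    print(alerted, round(threshold, 3))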

  7. Testbeds for Assessing Critical Scenarios in Power Control Systems

    NASA Astrophysics Data System (ADS)

    Dondossola, Giovanna; Deconinck, Geert; Garrone, Fabrizio; Beitollahi, Hakem

    The paper presents a set of control system scenarios implemented in two testbeds developed in the context of the European project CRUTIAL - CRitical UTility InfrastructurAL Resilience. The selected scenarios refer to power control systems and encompass the information and communication security of SCADA systems for grid teleoperation; the impact of attacks on inter-operator communications in power emergency conditions; and the impact of intentional faults on the secondary and tertiary control in power grids with distributed generators. Two testbeds have been developed for assessing the effects of the attacks and for prototyping resilient architectures.

  8. Critical Assessment of Endoscopic Techniques for Gastroesophageal Reflux Disease.

    PubMed

    Lo, Wai-Kit; Mashimo, Hiroshi

    2015-10-01

    Over the past 2 decades, a number of new endoscopic techniques have been developed for management of gastroesophageal (GE) reflux disease symptoms as alternatives to medical management and surgical fundoplication. These devices include application of radiofrequency treatment (Stretta), endoscopic plication (EndoCinch, Plicator, Esophyx, MUSE), and injection of bulking agents (Enteryx, Gatekeeper, Plexiglas, Duragel). Their goal was symptom relief through reduction of tissue compliance and enhancement of anatomic resistance at the GE junction. In this review, we critically assess the research behind the efficacy, safety, and durability of these treatments to better understand their roles in contemporary GE reflux disease management. PMID:26241152

  9. Computer Use in Psychometric Assessment: Evaluating Benefits and Potential Problems.

    ERIC Educational Resources Information Center

    Merrell, Kenneth W.

    1985-01-01

    The expansion of computer technology has created many possibilities for computer applications in the area of psychological testing and assessment. The ways that computers can be used in psychometric assessment, the benefits of such use, and the problems that may be encountered are discussed. (Author/BL)

  10. Breadth-Oriented Outcomes Assessment in Computer Science.

    ERIC Educational Resources Information Center

    Cordes, David; And Others

    Little work has been done regarding the overall assessment of quality of computer science graduates at the undergraduate level. This paper reports on a pilot study at the University of Alabama of a prototype computer science outcomes assessment designed to evaluate the breadth of knowledge of computer science seniors. The instrument evaluated two…

  11. Computer-Based Assessments. Information Capsule. Volume 0918

    ERIC Educational Resources Information Center

    Blazer, Christie

    2010-01-01

    This Information Capsule reviews research conducted on computer-based assessments. Advantages and disadvantages associated with computer-based testing programs are summarized and research on the comparability of computer-based and paper-and-pencil assessments is reviewed. Overall, studies suggest that for most students, there are few if any…

  12. Literary and Electronic Hypertext: Borges, Criticism, Literary Research, and the Computer.

    ERIC Educational Resources Information Center

    Davison, Ned J.

    1991-01-01

    Examines what "hypertext" means to literary criticism on the one hand (i.e., intertextuality) and computing on the other, to determine how the two concepts may serve each other in a mutually productive way. (GLR)

  13. Assessing Computer Knowledge among College Students.

    ERIC Educational Resources Information Center

    Parrish, Allen; And Others

    This paper reports on a study involving the administration of two examinations that were designed to evaluate student knowledge in several areas of computing. The tests were given both to computer science majors and to those enrolled in computer science classes from other majors. They sought to discover whether computer science majors demonstrated…

  14. Planning the Unplanned Experiment: Assessing the Efficacy of Standards for Safety Critical Software

    NASA Technical Reports Server (NTRS)

    Graydon, Patrick J.; Holloway, C. Michael

    2015-01-01

    We need well-founded means of determining whether software is fit for use in safety-critical applications. While software in industries such as aviation has an excellent safety record, the fact that software flaws have contributed to deaths illustrates the need for justifiably high confidence in software. It is often argued that software is fit for safety-critical use because it conforms to a standard for software in safety-critical systems. But little is known about whether such standards 'work.' Reliance upon a standard without knowing whether it works is an experiment; without collecting data to assess the standard, this experiment is unplanned. This paper reports on a workshop intended to explore how standards could practicably be assessed. Planning the Unplanned Experiment: Assessing the Efficacy of Standards for Safety Critical Software (AESSCS) was held on 13 May 2014 in conjunction with the European Dependable Computing Conference (EDCC). We summarize and elaborate on the workshop's discussion of the topic, including both the presented positions and the dialogue that ensued.

  15. 78 FR 29375 - Protected Critical Infrastructure Information (PCII) Office Self-Assessment Questionnaire

    Federal Register 2010, 2011, 2012, 2013, 2014

    2013-05-20

    ... SECURITY Protected Critical Infrastructure Information (PCII) Office Self-Assessment Questionnaire AGENCY... Information Collection Division (IICD), Protected Critical Infrastructure Information (PCII) Program will...: The PCII Program was created by Congress under the Critical Infrastructure Information Act of...

  16. Clinical significance of computed tomography assessment for third molar surgery.

    PubMed

    Nakamori, Kenji; Tomihara, Kei; Noguchi, Makoto

    2014-07-28

    Surgical extraction of the third molar is the most commonly performed surgical procedure in the clinical practice of oral surgery. Third molar surgery is warranted when there is inadequate space for eruption, malpositioning, or risk for cyst or odontogenic tumor formation. Preoperative assessment should include a detailed morphologic analysis of the third molar and its relationship to adjacent structures and surrounding tissues. Due to developments in medical engineering technology, computed tomography (CT) now plays a critical role in providing the clear images required for adequate assessment prior to third molar surgery. Removal of the maxillary third molar is associated with a risk for maxillary sinus perforation, whereas removal of the mandibular third molar can put patients at risk for a neurosensory deficit from damage to the lingual nerve or inferior alveolar nerve. Multiple factors, including demographic, anatomic, and treatment-related factors, influence the incidence of nerve injury during or following removal of the third molar. CT assessment of the third molar prior to surgery can identify some of these risk factors, such as the absence of cortication between the mandibular third molar and the inferior alveolar canal, prior to surgery to reduce the risk for nerve damage. This topic highlight presents an overview of the clinical significance of CT assessment in third molar surgery. PMID:25071882

  17. Clinical significance of computed tomography assessment for third molar surgery

    PubMed Central

    Nakamori, Kenji; Tomihara, Kei; Noguchi, Makoto

    2014-01-01

    Surgical extraction of the third molar is the most commonly performed surgical procedure in the clinical practice of oral surgery. Third molar surgery is warranted when there is inadequate space for eruption, malpositioning, or risk for cyst or odontogenic tumor formation. Preoperative assessment should include a detailed morphologic analysis of the third molar and its relationship to adjacent structures and surrounding tissues. Due to developments in medical engineering technology, computed tomography (CT) now plays a critical role in providing the clear images required for adequate assessment prior to third molar surgery. Removal of the maxillary third molar is associated with a risk for maxillary sinus perforation, whereas removal of the mandibular third molar can put patients at risk for a neurosensory deficit from damage to the lingual nerve or inferior alveolar nerve. Multiple factors, including demographic, anatomic, and treatment-related factors, influence the incidence of nerve injury during or following removal of the third molar. CT assessment of the third molar prior to surgery can identify some of these risk factors, such as the absence of cortication between the mandibular third molar and the inferior alveolar canal, prior to surgery to reduce the risk for nerve damage. This topic highlight presents an overview of the clinical significance of CT assessment in third molar surgery. PMID:25071882

  18. A Critical Assessment of Vector Control for Dengue Prevention

    PubMed Central

    Achee, Nicole L.; Gould, Fred; Perkins, T. Alex; Reiner, Robert C.; Morrison, Amy C.; Ritchie, Scott A.; Gubler, Duane J.; Teyssou, Remy; Scott, Thomas W.

    2015-01-01

    Recently, the Vaccines to Vaccinate (v2V) initiative was reconfigured into the Partnership for Dengue Control (PDC), a multi-sponsored and independent initiative. This redirection is consistent with the growing consensus among the dengue-prevention community that no single intervention will be sufficient to control dengue disease. The PDC's expectation is that when an effective dengue virus (DENV) vaccine is commercially available, the public health community will continue to rely on vector control because the two strategies complement and enhance one another. Although the concept of integrated intervention for dengue prevention is gaining increasingly broader acceptance, to date, no consensus has been reached regarding the details of how and what combination of approaches can be most effectively implemented to manage disease. To fill that gap, the PDC proposed a three-step process: (1) a critical assessment of current vector control tools and those under development, (2) outlining a research agenda for determining, in a definitive way, what existing tools work best, and (3) determining how to combine the best vector control options, which have systematically been defined in this process, with DENV vaccines. To address the first step, the PDC convened a meeting of international experts during November 2013 in Washington, DC, to critically assess existing vector control interventions and tools under development. This report summarizes those deliberations. PMID:25951103

  19. A critical assessment of vector control for dengue prevention.

    PubMed

    Achee, Nicole L; Gould, Fred; Perkins, T Alex; Reiner, Robert C; Morrison, Amy C; Ritchie, Scott A; Gubler, Duane J; Teyssou, Remy; Scott, Thomas W

    2015-05-01

    Recently, the Vaccines to Vaccinate (v2V) initiative was reconfigured into the Partnership for Dengue Control (PDC), a multi-sponsored and independent initiative. This redirection is consistent with the growing consensus among the dengue-prevention community that no single intervention will be sufficient to control dengue disease. The PDC's expectation is that when an effective dengue virus (DENV) vaccine is commercially available, the public health community will continue to rely on vector control because the two strategies complement and enhance one another. Although the concept of integrated intervention for dengue prevention is gaining increasingly broader acceptance, to date, no consensus has been reached regarding the details of how and what combination of approaches can be most effectively implemented to manage disease. To fill that gap, the PDC proposed a three-step process: (1) a critical assessment of current vector control tools and those under development, (2) outlining a research agenda for determining, in a definitive way, what existing tools work best, and (3) determining how to combine the best vector control options, which have systematically been defined in this process, with DENV vaccines. To address the first step, the PDC convened a meeting of international experts during November 2013 in Washington, DC, to critically assess existing vector control interventions and tools under development. This report summarizes those deliberations. PMID:25951103

  20. Assessing monoclonal antibody product quality attribute criticality through clinical studies.

    PubMed

    Goetze, Andrew M; Schenauer, Matthew R; Flynn, Gregory C

    2010-01-01

    Recombinant therapeutic proteins, including antibodies, contain a variety of chemical and physical modifications. Great effort is expended during process and formulation development in controlling and minimizing this heterogeneity, which may not affect safety or efficacy, and, therefore, may not need to be controlled. Many of the chemical conversions also occur in vivo, and knowledge about the alterations can be applied to assessment of the potential impact on characteristics and the biological activity of therapeutic proteins. Other attributes may affect the drug clearance and thereby alter drug efficacy. In this review article, we describe attribute studies conducted using clinical samples and how information gleaned from them is applied to attribute criticality assessment. In general, how fast attributes change in vivo compared to the rate of mAb elimination is the key parameter used in these evaluations. An attribute with more rapidly changing levels may have greater potential to affect safety or efficacy and thereby reach the status of a Critical Quality Attribute (CQA) that should be controlled during production and storage, but the effect will depend on whether compositional changes are due to chemical conversion or differential clearance. PMID:20671426

  1. Critical Thinking Outcomes of Computer-Assisted Instruction versus Written Nursing Process.

    ERIC Educational Resources Information Center

    Saucier, Bonnie L.; Stevens, Kathleen R.; Williams, Gail B.

    2000-01-01

    Nursing students (n=43) who used clinical case studies via computer-assisted instruction (CAI) were compared with 37 who used the written nursing process (WNP). California Critical Thinking Skills Test results did not show significant increases in critical thinking. The WNP method was more time consuming; the CAI group was more satisfied. Use of…

  2. A CAD (Classroom Assessment Design) of a Computer Programming Course

    ERIC Educational Resources Information Center

    Hawi, Nazir S.

    2012-01-01

    This paper presents a CAD (classroom assessment design) of an entry-level undergraduate computer programming course "Computer Programming I". CAD has been the product of a long experience in teaching computer programming courses including teaching "Computer Programming I" 22 times. Each semester, CAD is evaluated and modified for the subsequent…

  3. 24 CFR 901.105 - Computing assessment score.

    Code of Federal Regulations, 2010 CFR

    2010-04-01

    ... 24 Housing and Urban Development 4 2010-04-01 2010-04-01 false Computing assessment score. 901.105 Section 901.105 Housing and Urban Development Regulations Relating to Housing and Urban Development... DEVELOPMENT PUBLIC HOUSING MANAGEMENT ASSESSMENT PROGRAM § 901.105 Computing assessment score. (a)...

  4. The Mass Implementation and Evaluation of Computer-Based Assessments.

    ERIC Educational Resources Information Center

    Zakrzewski, Stan; Bull, Joanna

    1998-01-01

    An interactive, computer-based assessment system implemented at the University of Luton (England) delivers end-of-module examinations, formative assessments, and self-assessment options. Student and faculty response to computer-based objective testing has been positive and suggests the approach is desirable both pedagogically and economically.…

  5. Computer Simulations to Support Science Instruction and Learning: A Critical Review of the Literature

    ERIC Educational Resources Information Center

    Smetana, Lara Kathleen; Bell, Randy L.

    2012-01-01

    Researchers have explored the effectiveness of computer simulations for supporting science teaching and learning during the past four decades. The purpose of this paper is to provide a comprehensive, critical review of the literature on the impact of computer simulations on science teaching and learning, with the goal of summarizing what is…

  6. Assessing Students' Critical Thinking Performance: Urging for Measurements Using Multi-Response Format

    ERIC Educational Resources Information Center

    Ku, Kelly Y. L.

    2009-01-01

    The current paper discusses ambiguities in critical thinking assessment. The paper first reviews the components of critical thinking. It then discusses the features and issues of commonly used critical thinking tests and the extent to which they are compatible with the conceptualization of critical thinking. The paper argues that critical thinking…

  7. Risk assessment for physical and cyber attacks on critical infrastructures.

    SciTech Connect

    Smith, Bryan J.; Sholander, Peter E.; Phelan, James M.; Wyss, Gregory Dane; Varnado, G. Bruce; Depoy, Jennifer Mae

    2005-08-01

    Assessing the risk of malevolent attacks against large-scale critical infrastructures requires modifications to existing methodologies. Existing risk assessment methodologies consider physical security and cyber security separately. As such, they do not accurately model attacks that involve defeating both physical protection and cyber protection elements (e.g., hackers turning off alarm systems prior to forced entry). This paper presents a risk assessment methodology that accounts for both physical and cyber security. It also preserves the traditional security paradigm of detect, delay and respond, while accounting for the possibility that a facility may be able to recover from or mitigate the results of a successful attack before serious consequences occur. The methodology provides a means for ranking those assets most at risk from malevolent attacks. Because the methodology is automated, the analyst can also play 'what if' with mitigation measures to gain a better understanding of how to best expend resources towards securing the facilities. It is simple enough to be applied to large infrastructure facilities without developing highly complicated models. Finally, it is applicable to facilities with extensive security as well as those that are less well-protected.

  8. Calculational assessment of critical experiments with mixed oxide fuel pin arrays moderated by organic solution

    SciTech Connect

    Smolen, G.R.

    1987-01-01

    Critical experiments have been conducted with organic-moderated mixed oxide (MOX) fuel pin assemblies at the Pacific Northwest Laboratory (PNL) Critical Mass Laboratory (CML). These experiments are part of a joint exchange program between the United States Department of Energy (USDOE) and the Power Reactor and Nuclear Fuel Development Corporation (PNC) of Japan in the area of criticality data development. The purpose of these experiments is to benchmark computer codes and cross-section libraries and to assess the reactivity difference between systems moderated by water and those moderated by an organic solution. Past studies have indicated that some organic mixtures may be better moderators than water. This topic is of particular importance to the criticality safety of fuel processing plants, where fissile material is dissolved in organic solutions during the solvent extraction process. In the past, it has been assumed that the codes and libraries benchmarked with water-moderated experiments were adequate when performing design and licensing studies of organic-moderated systems. Calculations presented in this paper indicate that the SCALE code system and the 27-energy-group cross-section library accurately compute k-effectives for organic-moderated MOX fuel-pin assemblies. Furthermore, the reactivity of an organic solution with a 32-vol-% TBP/68-vol-% NPH mixture in a heterogeneous configuration is, for practical purposes, the same as water. 5 refs.

  9. CAROLINA CENTER FOR COMPUTATIONAL TOXICOLOGY: ASSAYS, MODELS AND TOOLS FOR NEXTGEN SAFETY ASSESSMENTS

    EPA Science Inventory

    The Center will develop new methods and tools, and will continue to collaborate closely with EPA, Tox21 and other environmental scientists. New in vitro population-based assays and computer-based models that fill critical gaps in risk assessment will be developed and deliver...

  10. Collected Wisdom: Assessment Tools for Computer Science Programs

    ERIC Educational Resources Information Center

    Sanders, Kathryn E.; McCartney, Robert

    2004-01-01

    In this paper, we investigate the question of what assessment tools are being used in practice by United States computing programs and what the faculty doing the assessment think of the tools they use. After presenting some background with regard to the design, implementation, and use of assessment, with particular attention to assessment tools,…

  11. The Acceptance and Use of Computer Based Assessment

    ERIC Educational Resources Information Center

    Terzis, Vasileios; Economides, Anastasios A.

    2011-01-01

    The effective development of a computer based assessment (CBA) depends on students' acceptance. The purpose of this study is to build a model that demonstrates the constructs that affect students' behavioral intention to use a CBA. The proposed model, Computer Based Assessment Acceptance Model (CBAAM) is based on previous models of technology…

  12. A Framework for Assessing Computer Competence: Defining Objectives.

    ERIC Educational Resources Information Center

    National Assessment of Educational Progress, Princeton, NJ.

    Computer skills objectives have been developed for the 1986 National Assessment of Educational Progress (NAEP). These items will be administered to a large number of American students aged 9, 13, and 17 in grades 3, 7, and 11. For this first national assessment of computer skills, it was necessary to consider the existing expertise of school…

  13. Computer-aided assessment of cardiac computed tomographic images

    NASA Astrophysics Data System (ADS)

    King, Martin; Giger, Maryellen; Suzuki, Kenji; Pan, Xiaochuan

    2007-03-01

    The accurate interpretation of cardiac CT images is commonly hindered by the presence of motion artifacts. Since motion artifacts can obscure the presence of coronary lesions, physicians must spend much effort analyzing images at multiple cardiac phases in order to determine which coronary structures are assessable for potential lesions. In this study, an artificial neural network (ANN) classifier was designed to assign assessability indices to calcified plaques in individual region-of-interest (ROI) images reconstructed at multiple cardiac phases from two cardiac scans obtained at heart rates of 66 bpm and 90 bpm. Six individual features (volume, circularity, mean intensity, margin gradient, velocity, and acceleration) were used for analyzing the images. Visually assigned assessability indices were used as a continuous truth, and jack-knife analysis with four testing sets was used to evaluate the performance of the ANN classifier. In a study in which all six features were input to the ANN classifier, correlation coefficients of 0.962 +/- 0.006 and 0.935 +/- 0.023 between true and ANN-assigned assessability indices were obtained for the databases corresponding to 66 bpm and 90 bpm, respectively.
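
    As a rough sketch of this type of classifier (not the authors' implementation), the following trains a small neural-network regressor on the six named features and scores it by the correlation between predicted and visually assigned indices; the synthetic data and the single train/test split stand in for the study's images and jack-knife design.

    ```python
    import numpy as np
    from scipy.stats import pearsonr
    from sklearn.neural_network import MLPRegressor

    rng = np.random.default_rng(1)
    # One row per plaque ROI image: placeholder values for the six features
    # (volume, circularity, mean intensity, margin gradient, velocity, acceleration).
    X = rng.normal(size=(200, 6))
    # Placeholder for visually assigned assessability indices in [0, 1].
    y = np.clip(0.5 + 0.3 * (X @ rng.normal(size=6)), 0.0, 1.0)

    model = MLPRegressor(hidden_layer_sizes=(8,), max_iter=5000, random_state=0)
    model.fit(X[:150], y[:150])                       # train split
    r, _ = pearsonr(y[150:], model.predict(X[150:]))  # held-out split
    print(f"correlation between true and ANN-assigned indices: {r:.3f}")
    ```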

  14. Data on NAEP 2011 writing assessment prior computer use.

    PubMed

    Tate, Tamara P; Warschauer, Mark; Abedi, Jamal

    2016-09-01

    This data article contains information based on the 2011 National Assessment of Educational Progress in Writing Restricted-Use Data, available from the National Center for Education Statistics (NCES Pub. No. 2014476). https://nces.ed.gov/nationsreportcard/researchcenter/datatools.aspx. The data include the statistical relationships between survey reports of teachers and students regarding prior use of computers and other technology and writing achievement levels on the 2011 computer-based NAEP writing assessment. This data article accompanies "The Effects of Prior Computer Use on Computer-Based Writing: The 2011 NAEP Writing Assessment" [1]. PMID:27508253

  15. Assessing Mathematics Automatically Using Computer Algebra and the Internet

    ERIC Educational Resources Information Center

    Sangwin, Chris

    2004-01-01

    This paper reports some recent developments in mathematical computer-aided assessment which employs computer algebra to evaluate students' work using the Internet. Technical and educational issues raised by this use of computer algebra are addressed. Working examples from core calculus and algebra which have been used with first year university…

  16. Critical assessment of the evidence for striped nanoparticles.

    PubMed

    Stirling, Julian; Lekkas, Ioannis; Sweetman, Adam; Djuranovic, Predrag; Guo, Quanmin; Pauw, Brian; Granwehr, Josef; Lévy, Raphaël; Moriarty, Philip

    2014-01-01

    There is now a significant body of literature which reports that stripes form in the ligand shell of suitably functionalised Au nanoparticles. This stripe morphology has been proposed to strongly affect the physicochemical and biochemical properties of the particles. We critique the published evidence for striped nanoparticles in detail, with a particular focus on the interpretation of scanning tunnelling microscopy (STM) data (as this is the only technique which ostensibly provides direct evidence for the presence of stripes). Through a combination of an exhaustive re-analysis of the original data, in addition to new experimental measurements of a simple control sample comprising entirely unfunctionalised particles, we show that all of the STM evidence for striped nanoparticles published to date can instead be explained by a combination of well-known instrumental artefacts, or by issues with data acquisition/analysis protocols. We also critically re-examine the evidence for the presence of ligand stripes which has been claimed to have been found from transmission electron microscopy, nuclear magnetic resonance spectroscopy, small angle neutron scattering experiments, and computer simulations. Although these data can indeed be interpreted in terms of stripe formation, we show that the reported results can alternatively be explained as arising from a combination of instrumental artefacts and inadequate data analysis techniques. PMID:25402426

  17. Critical Assessment of the Evidence for Striped Nanoparticles

    PubMed Central

    Stirling, Julian; Lekkas, Ioannis; Sweetman, Adam; Djuranovic, Predrag; Guo, Quanmin; Pauw, Brian; Granwehr, Josef; Lévy, Raphaël; Moriarty, Philip

    2014-01-01

    There is now a significant body of literature which reports that stripes form in the ligand shell of suitably functionalised Au nanoparticles. This stripe morphology has been proposed to strongly affect the physicochemical and biochemical properties of the particles. We critique the published evidence for striped nanoparticles in detail, with a particular focus on the interpretation of scanning tunnelling microscopy (STM) data (as this is the only technique which ostensibly provides direct evidence for the presence of stripes). Through a combination of an exhaustive re-analysis of the original data, in addition to new experimental measurements of a simple control sample comprising entirely unfunctionalised particles, we show that all of the STM evidence for striped nanoparticles published to date can instead be explained by a combination of well-known instrumental artefacts, or by issues with data acquisition/analysis protocols. We also critically re-examine the evidence for the presence of ligand stripes which has been claimed to have been found from transmission electron microscopy, nuclear magnetic resonance spectroscopy, small angle neutron scattering experiments, and computer simulations. Although these data can indeed be interpreted in terms of stripe formation, we show that the reported results can alternatively be explained as arising from a combination of instrumental artefacts and inadequate data analysis techniques. PMID:25402426

  18. Critical Zone Experimental Design to Assess Soil Processes and Function

    NASA Astrophysics Data System (ADS)

    Banwart, Steve

    2010-05-01

    Through unsustainable land use practices, mining, deforestation, urbanisation and degradation by industrial pollution, soil losses are now hypothesized to be much faster (100 times or more) than soil formation - with the consequence that soil has become a finite resource. The crucial challenge for the international research community is to understand the rates of processes that dictate soil mass stocks and their function within Earth's Critical Zone (CZ). The CZ is the environment where soils are formed, degrade and provide their essential ecosystem services. Key among these ecosystem services are food and fibre production; the filtering, buffering and transformation of water, nutrients and contaminants; the storage of carbon; and the maintenance of biological habitat and genetic diversity. We have initiated a new research project to address the priority research areas identified in the European Union Soil Thematic Strategy and to contribute to the development of a global network of Critical Zone Observatories (CZO) committed to soil research. Our hypothesis is that the combined physical-chemical-biological structure of soil can be assessed from first principles and the resulting soil functions can be quantified in process models that couple the formation and loss of soil stocks with descriptions of biodiversity and nutrient dynamics. The objectives of this research are to (1) describe from first principles how soil structure influences processes and functions of soils, (2) establish 4 European Critical Zone Observatories to link with established CZOs, (3) develop a CZ Integrated Model of soil processes and function, (4) create a GIS-based modelling framework to assess soil threats and mitigation at EU scale, (5) quantify impacts of changing land use, climate and biodiversity on soil function and its value, and (6) form with international partners a global network of CZOs for soil research and deliver a programme of public outreach and research transfer on soil sustainability.

  19. Comparison of Critical Trajectory Methods for Direct CCT Computation for Transient Stability

    NASA Astrophysics Data System (ADS)

    Priyadi, Ardyono; Yorino, Naoto; Sasaki, Yutaka; Tanaka, Masahide; Fujiwara, Takuma; Zoka, Yoshifumi; Kakui, Hironori; Takeshita, Mitsuhiro

    This paper studies new techniques for the critical trajectory method, a method recently proposed by the authors for obtaining the critical clearing time (CCT) in transient stability analysis. A specific feature of the proposed method lies in its ability to provide the exact CCT without approximation, which no previous method has achieved. The method is based on computation of the critical trajectory, defined as the trajectory that starts from a point on the fault-on trajectory at the CCT and reaches an end point. There are several possible treatments of the end-point conditions; the computational performance of these treatments is investigated in terms of CCT accuracy and computational efficiency. It is shown that the proposed methods successfully provide the exact CCT, in agreement with the conventional numerical simulation method.

  20. Assessing Critical Thinking in Higher Education: The HEIghten™ Approach and Preliminary Validity Evidence

    ERIC Educational Resources Information Center

    Liu, Ou Lydia; Mao, Liyang; Frankel, Lois; Xu, Jun

    2016-01-01

    Critical thinking is a learning outcome highly valued by higher education institutions and the workforce. The Educational Testing Service (ETS) has designed a next generation assessment, the HEIghten™ critical thinking assessment, to measure students' critical thinking skills in analytical and synthetic dimensions. This paper introduces the…

  1. The Effects of Using a Critical Thinking Scoring Rubric to Assess Undergraduate Students' Reading Skills

    ERIC Educational Resources Information Center

    Leist, Cathy W.; Woolwine, Mark A.; Bays, Cathy L.

    2012-01-01

    The purpose of this study was to investigate the use of a critical thinking rubric as an assessment of reading achievement for students enrolled in a reading intervention course. A reading prompt and scoring rubric, based on Richard Paul and Linda Elder's critical thinking framework, were created to assess critical reading in an intervention…

  2. Critical Assessment of Implantable Drug Delivery Devices in Glaucoma Management

    PubMed Central

    Manickavasagam, Dharani; Oyewumi, Moses O.

    2013-01-01

    Glaucoma is a group of heterogeneous disorders involving progressive optic neuropathy that can culminate in visual impairment and irreversible blindness. Effective therapeutic interventions must address the underlying vulnerability of retinal ganglion cells (RGCs) to degeneration in conjunction with correcting other associated risk factors (such as elevated intraocular pressure). However, realization of therapeutic outcomes is heavily dependent on a suitable delivery system that can overcome myriad anatomical and physiological barriers to intraocular drug delivery. Development of clinically viable sustained release systems in glaucoma is a widely recognized unmet need. In this regard, implantable delivery systems may relieve the burden of chronic drug administration while potentially ensuring high intraocular drug bioavailability. Presently there are no FDA-approved implantable drug delivery devices for glaucoma even though there are several ongoing clinical studies. The paper critically assesses the prospects of polymeric implantable delivery systems in glaucoma while identifying factors that can dictate (a) patient tolerability and acceptance, (b) drug stability and drug release profiles, (c) therapeutic efficacy, and (d) toxicity and biocompatibility. The information gathered could be useful in future research and development efforts on implantable delivery systems in glaucoma. PMID:24066234

  3. Application of queueing models to multiprogrammed computer systems operating in a time-critical environment

    NASA Technical Reports Server (NTRS)

    Eckhardt, D. E., Jr.

    1979-01-01

    A model of a central processor (CPU) that services background applications in the presence of time-critical activity is presented. The CPU is viewed as an M/M/1 queueing system subject to periodic interrupts by a deterministic, time-critical process. The Laplace transform of the distribution of service times for the background applications is developed. The use of state-of-the-art queueing models for studying the background processing capability of time-critical computer systems is discussed, and the results of a model validation study which support this application of queueing models are presented.
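
    The paper's contribution is the Laplace-transform analysis of service times; as a simpler back-of-envelope companion (an assumption of this sketch, not the paper's derivation), the periodic time-critical load can be folded into a reduced effective service rate seen by the background M/M/1 queue.

    ```python
    # Invented rates; f is the fraction of each period consumed by the
    # deterministic time-critical process, which the background queue
    # experiences (approximately) as a slower server of rate mu * (1 - f).
    lam = 8.0     # background arrival rate (jobs/s)
    mu = 12.0     # raw CPU service rate (jobs/s)
    f = 0.2       # time-critical duty cycle

    mu_eff = mu * (1.0 - f)
    assert lam < mu_eff, "background queue would be unstable"
    rho = lam / mu_eff                      # server utilisation by background work
    mean_response = 1.0 / (mu_eff - lam)    # M/M/1 mean time in system
    print(f"utilisation = {rho:.2f}, mean background response time = {mean_response:.3f} s")
    ```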

  4. A critical assessment of wind tunnel results for the NACA 0012 airfoil

    NASA Technical Reports Server (NTRS)

    Mccroskey, W. J.

    1987-01-01

    A large body of experimental results, obtained in more than 40 wind tunnels on a single, well-known two-dimensional configuration, has been critically examined and correlated. An assessment of some of the possible sources of error has been made for each facility, and data which are suspect have been identified. It was found that no single experiment provided a complete set of reliable data, although one investigation stands out as superior in many respects. However, from the aggregate of data the representative properties of the NACA 0012 airfoil can be identified with reasonable confidence over wide ranges of Mach number, Reynolds number, and angles of attack. This synthesized information can now be used to assess and validate existing and future wind tunnel results and to evaluate advanced Computational Fluid Dynamics codes.

  5. Experiences of Using Automated Assessment in Computer Science Courses

    ERIC Educational Resources Information Center

    English, John; English, Tammy

    2015-01-01

    In this paper we discuss the use of automated assessment in a variety of computer science courses that have been taught at Israel Academic College by the authors. The course assignments were assessed entirely automatically using Checkpoint, a web-based automated assessment framework. The assignments all used free-text questions (where the students…

  6. Does Computer-Aided Formative Assessment Improve Learning Outcomes?

    ERIC Educational Resources Information Center

    Hannah, John; James, Alex; Williams, Phillipa

    2014-01-01

    Two first-year engineering mathematics courses used computer-aided assessment (CAA) to provide students with opportunities for formative assessment via a series of weekly quizzes. Most students used the assessment until they achieved very high (>90%) quiz scores. Although there is a positive correlation between these quiz marks and the final…

  7. Empirically Assessing the Importance of Computer Skills

    ERIC Educational Resources Information Center

    Baker, William M.

    2013-01-01

    This research determines which computer skills are important for entry-level accountants, and whether some skills are more important than others. Students participated before and after internships in public accounting. Longitudinal analysis is also provided; responses from 2001 are compared to those from 2008-2009. Responses are also compared to…

  8. Teacher Assessment of Elementary Schools' Computer Laboratories.

    ERIC Educational Resources Information Center

    Zollman, Alan; Wyrick, James

    In order to evaluate the effectiveness of the elementary computer laboratories and the Educational Systems Corporation (ESC) software in the Fayette County (Kentucky) Public Schools, a Likert-type questionnaire on teacher attitudes and beliefs was designed, field-tested, revised, and distributed at the end of the 1988 spring semester. Analyses of…

  9. Demonstration Assessment: Measuring Conceptual Understanding and Critical Thinking with Rubrics.

    ERIC Educational Resources Information Center

    Radford, David L.; And Others

    1995-01-01

    Presents the science demonstration assessment as an authentic-assessment technique to assess whether students understand basic science concepts and can use them to solve problems. Uses rubrics to prepare students for the assessment and to assign final grades. Provides examples of science demonstration assessments and the scoring of rubrics in the…

  10. The Effect of Computer Science Instruction on Critical Thinking Skills and Mental Alertness.

    ERIC Educational Resources Information Center

    Norris, Cathleen; And Others

    1992-01-01

    Pretests measuring critical thinking ability and mental alertness were administered to 72 first-year college students at the beginning of an introductory computer programming course. Posttests administered at the end of the semester showed significant improvement in both areas, indicating that instruction in programming improves students' critical…

  11. Nurturing Reflective Teaching During Critical-Thinking Instruction in a Computer Simulation Program

    ERIC Educational Resources Information Center

    Yeh, Yu-Chu

    2004-01-01

    Nurturing reflective teaching and improving critical-thinking instruction are two important goals in teacher education, but these are only achievable when teachers-in-training are provided with opportunities for building professional knowledge and for exhibiting reflective teaching practices. A computer simulation program (CS-TGCTS) was therefore…

  12. Critical Literacy in School-College Collaboration through Computer Networking: A Feminist Research Project.

    ERIC Educational Resources Information Center

    Fey, Marion

    1998-01-01

    Investigates the practice of critical literacy through asynchronous computer networking as students in a school-college collaboration examined assumptions relating to gender issues. Finds the medium proved to be an apt environment--students named experiences relating to gender issues that touched their lives, and students felt free to share ideas…

  13. Computer assessment of atherosclerosis from angiographic images

    NASA Technical Reports Server (NTRS)

    Selzer, R. H.; Blankenhorn, D. H.; Brooks, S. H.; Crawford, D. W.; Cashin, W. L.

    1982-01-01

    A computer method for detection and quantification of atherosclerosis from angiograms has been developed and used to measure lesion change in human clinical trials. The technique involves tracking the vessel edges and measuring individual lesions as well as the overall irregularity of the arterial image. Application of the technique to conventional arterial-injection femoral and coronary angiograms is outlined, and an experimental study to extend the technique to analysis of intravenous angiograms of the carotid and coronary arteries is described.

  14. Computer code to assess accidental pollutant releases

    SciTech Connect

    Pendergast, M.M.; Huang, J.C.

    1980-07-01

    A computer code was developed to calculate the cumulative frequency distributions of relative concentrations of an air pollutant following an accidental release from a stack or from a building penetration such as a vent. The calculations of relative concentration are based on the Gaussian plume equations. The meteorological data used for the calculation are in the form of joint frequency distributions of wind and atmospheric stability.
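
    A minimal sketch of the kind of calculation described, assuming the standard ground-level centerline form of the Gaussian plume equation; the dispersion parameters, release height, and joint frequency data below are invented placeholders, not the report's.

    ```python
    import numpy as np

    def rel_conc(u, sigma_y, sigma_z, release_height):
        """Ground-level centerline relative concentration chi/Q (s/m^3)."""
        return np.exp(-release_height**2 / (2.0 * sigma_z**2)) / (np.pi * sigma_y * sigma_z * u)

    # Assumed sigma_y, sigma_z (m) at the receptor distance for two stability classes.
    sigmas = {"D": (80.0, 30.0), "F": (35.0, 12.0)}
    # Joint frequency distribution: (wind speed m/s, stability class) -> frequency.
    jfd = {(2.0, "F"): 0.15, (5.0, "D"): 0.60, (10.0, "D"): 0.25}

    # Accumulate frequencies in order of increasing chi/Q to form the
    # cumulative frequency distribution of relative concentration.
    samples = sorted((rel_conc(u, *sigmas[s], release_height=30.0), freq)
                     for (u, s), freq in jfd.items())
    cum = 0.0
    for chi_q, freq in samples:
        cum += freq
        print(f"chi/Q <= {chi_q:.2e} s/m^3 with cumulative frequency {cum:.2f}")
    ```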

  15. Critical Thinking Assessment: Measuring a Moving Target. Report & Recommendations of the South Carolina Higher Education Assessment Network Critical Thinking Task Force.

    ERIC Educational Resources Information Center

    Cook, Patricia; Johnson, Reid; Moore, Phil; Myers, Phyllis; Pauly, Susan; Pendarvis, Faye; Prus, Joe; Ulmer-Sottong, Lovely

    This report is part of South Carolina's effort to move toward "100 percent performance funding" for the state's public colleges and universities and results from a task force's investigation of ways to assess critical thinking. The following eight major findings are reported: (1) policy makers must determine priorities; (2) critical thinking lacks…

  16. Developing Critical Thinking Skills: Assessing the Effectiveness of Workbook Exercises

    ERIC Educational Resources Information Center

    Wallace, Elise D.; Jefferson, Renee N.

    2015-01-01

    To address the challenge of developing critical thinking skills in college students, this empirical study examines the effectiveness of cognitive exercises in developing those skills. The study uses Critical Thinking: Building the Basics by Walter, Knudsvig, and Smith (2003). This workbook is specifically designed to exercise and develop critical…

  17. Assessing Critical Thinking Performance of Postgraduate Students in Threaded Discussions

    ERIC Educational Resources Information Center

    Tan, Cheng Lee; Ng, Lee Luan

    2014-01-01

    Critical thinking has increasingly been seen as one of the important attributes where human capital is concerned and in line with this recognition, the tertiary educational institutions worldwide are putting more effort into designing courses that produce university leavers who are critical thinkers. This study aims to investigate the critical…

  18. What Do They Know? A Strategy for Assessing Critical Literacy

    ERIC Educational Resources Information Center

    Morrissette, Rhonda

    2007-01-01

    In this article, the author describes how difficult it is to know how critically literate her students are in the adult senior high school in which she is a teacher-librarian. She assumes that many would have gaps in their learning, including gaps in information and critical literacy skills, and that they were likely to approach all online…

  19. Modelling Critical Thinking through Learning-Oriented Assessment

    ERIC Educational Resources Information Center

    Lombard, B. J. J.

    2008-01-01

    One of the cornerstones peculiar to the outcomes-based approach adopted by the South African education and training sector is the so-called "critical outcomes". Included in one of these outcomes is the ability to think critically. Although this outcome articulates well with the cognitive domain of holistic development, it also gives rise to some…

  20. Overview of Risk Mitigation for Safety-Critical Computer-Based Systems

    NASA Technical Reports Server (NTRS)

    Torres-Pomales, Wilfredo

    2015-01-01

    This report presents a high-level overview of a general strategy to mitigate the risks from threats to safety-critical computer-based systems. In this context, a safety threat is a process or phenomenon that can cause operational safety hazards in the form of computational system failures. This report is intended to provide insight into the safety-risk mitigation problem and the characteristics of potential solutions. The limitations of the general risk mitigation strategy are discussed and some options to overcome these limitations are provided. This work is part of an ongoing effort to enable well-founded assurance of safety-related properties of complex safety-critical computer-based aircraft systems by developing an effective capability to model and reason about the safety implications of system requirements and design.

  1. Conative Feedback in Computer-Based Assessment

    ERIC Educational Resources Information Center

    Economides, Anastasios A.

    2009-01-01

    Feedback is an important educational tool that can support learning and assessment. This article describes types of conative feedback that can support the student's conation, will, volition, or motivation. Any of these types of feedback can be presented to the student before, during, or after an educational activity or a test question.…

  2. Computational Toxicology in Cancer Risk Assessment

    EPA Science Inventory

    Risk assessment over the last half century has, for many individual cases, served us well, but it has proceeded at an extremely slow pace and has left us with considerable uncertainty. There are certainly thousands of compounds and thousands of exposure scenarios that remain unteste...

  3. Assessing Knowledge Change in Computer Science

    ERIC Educational Resources Information Center

    Nash, Jane Gradwohl; Bravaco, Ralph J.; Simonson, Shai

    2006-01-01

    The purpose of this study was to assess structural knowledge change across a two-week workshop designed to provide high-school teachers with training in Java and Object Oriented Programming. Both before and after the workshop, teachers assigned relatedness ratings to pairs of key concepts regarding Java and Object Oriented Programming. Their…
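
    A minimal sketch of how such pairwise relatedness ratings can be compared before and after instruction; the concepts, scale, and ratings below are made up, and the study's actual instrument and analysis may differ.

    ```python
    from itertools import combinations
    from scipy.stats import pearsonr

    concepts = ["class", "object", "inheritance", "interface", "method"]
    pairs = list(combinations(concepts, 2))   # all 10 concept pairs

    # Relatedness ratings (1 = unrelated, 7 = highly related), one per pair.
    pre    = [2, 3, 2, 4, 3, 2, 5, 3, 2, 4]   # before the workshop
    post   = [6, 5, 3, 6, 4, 3, 6, 4, 3, 6]   # after the workshop
    expert = [7, 5, 3, 6, 4, 3, 7, 4, 3, 6]   # an expert referent structure
    assert len(pre) == len(post) == len(expert) == len(pairs)

    r_stability, _ = pearsonr(pre, post)      # how much the structure changed
    r_expert, _ = pearsonr(post, expert)      # convergence toward the referent
    print(f"pre/post r = {r_stability:.2f}; post/expert r = {r_expert:.2f}")
    ```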

  4. Concept Map Assessment for Teaching Computer Programming

    ERIC Educational Resources Information Center

    Keppens, Jeroen; Hay, David

    2008-01-01

    A key challenge of effective teaching is assessing and monitoring the extent to which students have assimilated the material they were taught. Concept mapping is a methodology designed to model what students have learned. In effect, it seeks to produce graphical representations (called concept maps) of the concepts that are important to a given…

  5. A Model-based Framework for Risk Assessment in Human-Computer Controlled Systems

    NASA Technical Reports Server (NTRS)

    Hatanaka, Iwao

    2000-01-01

    The rapid growth of computer technology and innovation has played a significant role in the rise of computer automation of human tasks in modern production systems across all industries. Although the rationale for automation has been to eliminate "human error" or to relieve humans from manual repetitive tasks, various computer-related hazards and accidents have emerged as a direct result of increased system complexity attributed to computer automation. The risk assessment techniques utilized for electromechanical systems are not suitable for today's software-intensive systems or complex human-computer controlled systems. This thesis will propose a new systemic model-based framework for analyzing risk in safety-critical systems where both computers and humans are controlling safety-critical functions. A new systems accident model will be developed based upon modern systems theory and human cognitive processes to better characterize system accidents, the role of human operators, and the influence of software in its direct control of significant system functions. Better risk assessments will then be achievable through the application of this new framework to complex human-computer controlled systems.

  6. International Computer and Information Literacy Study: Assessment Framework

    ERIC Educational Resources Information Center

    Fraillon, Julian; Schulz, Wolfram; Ainley, John

    2013-01-01

    The purpose of the International Computer and Information Literacy Study 2013 (ICILS 2013) is to investigate, in a range of countries, the ways in which young people are developing "computer and information literacy" (CIL) to support their capacity to participate in the digital age. To achieve this aim, the study will assess student…

  7. Use of Computer Assisted Assessment: Benefits to Students and Staff.

    ERIC Educational Resources Information Center

    Stephens, Derek

    2001-01-01

    Compares the use of computers with traditional paper and pencil to deliver objective tests for summative assessment with undergraduates in the United Kingdom. Considers issues of gender differences, objective testing, computer anxiety, and benefits to staff and students, and recommends the need for pre-test preparation and practice testing.…

  8. Using Computer-Assisted Assessment Heuristics for Usability Evaluations

    ERIC Educational Resources Information Center

    Sim, Gavin; Read, Janet C.

    2016-01-01

    Teaching practices within educational institutions have evolved through the increased adoption of technology to deliver the curriculum and the use of computers for assessment purposes. For educational technologists, there is a vast array of commercial computer applications available for the delivery of objective tests, and in some instances,…

  9. Computational assessment of organic photovoltaic candidate compounds

    NASA Astrophysics Data System (ADS)

    Borunda, Mario; Dai, Shuo; Olivares-Amaya, Roberto; Amador-Bedolla, Carlos; Aspuru-Guzik, Alan

    2015-03-01

    Organic photovoltaic (OPV) cells are emerging as a possible renewable alternative to petroleum-based resources and are needed to meet our growing demand for energy. Although not as efficient as silicon-based cells, OPV cells have the advantage that their manufacturing cost is potentially lower. The Harvard Clean Energy Project, using a cheminformatic approach of pattern recognition and machine learning strategies, has ranked a molecular library of more than 2.6 million candidate compounds based on their performance as possible OPV materials. Here, we present a ranking of the top 1000 molecules for use as photovoltaic materials based on their optical absorption properties obtained via time-dependent density functional theory. This computational search has revealed the molecular motifs shared by the set of most promising molecules.

  10. OECD/NEA expert group on uncertainty analysis for criticality safety assessment: Results of benchmark on sensitivity calculation (phase III)

    SciTech Connect

    Ivanova, T.; Laville, C.; Dyrda, J.; Mennerdahl, D.; Golovko, Y.; Raskach, K.; Tsiboulia, A.; Lee, G. S.; Woo, S. W.; Bidaud, A.; Sabouri, P.; Bledsoe, K.; Rearden, B.; Gulliford, J.; Michel-Sendis, F.

    2012-07-01

    The sensitivities of the k_eff eigenvalue to neutron cross sections have become commonly used in similarity studies and as part of the validation algorithm for criticality safety assessments. To test calculations of the sensitivity coefficients, a benchmark study (Phase III) has been established by the OECD-NEA/WPNCS/EG UACSA (Expert Group on Uncertainty Analysis for Criticality Safety Assessment). This paper presents some sensitivity results generated by the benchmark participants using various computational tools based upon different computational methods: SCALE/TSUNAMI-3D and -1D, MONK, APOLLO2-MORET 5, DRAGON-SUSD3D and MMKKENO. The study demonstrates the performance of the tools. It also illustrates how model simplifications impact the sensitivity results and demonstrates the importance of 'implicit' (self-shielding) sensitivities. This work has been a useful step towards verification of the existing and newly developed sensitivity analysis methods. (authors)
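
    For reference, the relative sensitivity coefficient exercised by such benchmarks is commonly defined as the fractional change in k_eff per fractional change in a cross section sigma:

    ```latex
    % Relative sensitivity coefficient of k_eff to a cross section sigma:
    \[
      S_{k,\sigma}
        = \frac{\sigma}{k_{\mathrm{eff}}}\,
          \frac{\partial k_{\mathrm{eff}}}{\partial \sigma}
        \approx \frac{\Delta k_{\mathrm{eff}} / k_{\mathrm{eff}}}{\Delta \sigma / \sigma}.
    \]
    ```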

  11. Comparison of two pain assessment tools in nonverbal critical care patients.

    PubMed

    Paulson-Conger, Melissa; Leske, Jane; Maidl, Carolyn; Hanson, Andrew; Dziadulewicz, Laurel

    2011-12-01

    It is recommended that the patient's self-report of pain be obtained as often as possible, as the "gold standard." Unfortunately, in critical care many factors can alter verbal communication with patients, making pain assessment more difficult. Scientific advances in understanding pain mechanisms, multidimensional methods of pain assessment, and analgesic pharmacology have improved pain management strategies. However, pain assessment for nonverbal patients in critical care continues to present a challenge for clinicians and researchers. The purpose of this study was to compare the Pain Assessment in Advanced Dementia (PAINAD) and the Critical-Care Pain Observation Tool (CPOT) scores for assessment in nonverbal critical care patients. A descriptive, comparative, prospective design was used in this study. A convenience sample of 100 critical care, nonverbal, adult patients of varying medical diagnoses who required pain evaluation were assessed with the PAINAD and CPOT scales. Data were collected over a 6-month period in all critical care areas. Observations of pain assessments for nonverbal patients who required pain evaluation were recorded on the PAINAD and the CPOT successively. Internal consistency reliability for the PAINAD was 0.80 and for the CPOT 0.72. Limits of agreement indicated that there was no difference in PAINAD and CPOT scores for assessing pain in nonverbal patients in critical care. Further research in the area of pain assessment for nonverbal patients in critical care is needed. PMID:22117753
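
    As a sketch of the two statistics reported here (on synthetic scores, not the study data), the following computes Cronbach's alpha for internal consistency and Bland-Altman limits of agreement between the two scale totals.

    ```python
    import numpy as np

    def cronbach_alpha(items):
        """items: (n_subjects, n_items) array of item scores."""
        items = np.asarray(items, dtype=float)
        k = items.shape[1]
        item_var = items.var(axis=0, ddof=1).sum()
        total_var = items.sum(axis=1).var(ddof=1)
        return k / (k - 1) * (1.0 - item_var / total_var)

    rng = np.random.default_rng(2)
    latent = rng.integers(0, 3, size=(100, 1))                   # latent pain level
    painad_items = np.clip(latent + rng.integers(-1, 2, size=(100, 5)), 0, 2)
    painad = painad_items.sum(axis=1)                            # PAINAD total (0-10)
    cpot = np.clip(painad * 0.8 + rng.normal(0, 1, 100), 0, 8)   # CPOT total (0-8)

    diff = painad - cpot
    lo = diff.mean() - 1.96 * diff.std(ddof=1)
    hi = diff.mean() + 1.96 * diff.std(ddof=1)
    print(f"alpha(PAINAD items) = {cronbach_alpha(painad_items):.2f}")
    print(f"Bland-Altman bias = {diff.mean():.2f}, limits of agreement = [{lo:.2f}, {hi:.2f}]")
    ```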

  12. Is Model-Based Development a Favorable Approach for Complex and Safety-Critical Computer Systems on Commercial Aircraft?

    NASA Technical Reports Server (NTRS)

    Torres-Pomales, Wilfredo

    2014-01-01

    A system is safety-critical if its failure can endanger human life or cause significant damage to property or the environment. State-of-the-art computer systems on commercial aircraft are highly complex, software-intensive, functionally integrated, and network-centric systems of systems. Ensuring that such systems are safe and comply with existing safety regulations is costly and time-consuming as the level of rigor in the development process, especially the validation and verification activities, is determined by considerations of system complexity and safety criticality. A significant degree of care and deep insight into the operational principles of these systems is required to ensure adequate coverage of all design implications relevant to system safety. Model-based development methodologies, methods, tools, and techniques facilitate collaboration and enable the use of common design artifacts among groups dealing with different aspects of the development of a system. This paper examines the application of model-based development to complex and safety-critical aircraft computer systems. Benefits and detriments are identified and an overall assessment of the approach is given.

  13. Review of Estelle and LOTOS with respect to critical computer applications

    NASA Technical Reports Server (NTRS)

    Bown, Rodney L.

    1991-01-01

    Man-rated NASA space vehicles seem to represent a set of ultimate critical computer applications. These applications require a high degree of security, integrity, and safety. A variety of formal and/or precise modeling techniques are becoming available for the designer of critical systems. The design phase of the software engineering life cycle includes the modification of non-development components. A review of the Estelle and LOTOS formal description languages is presented. Details of the languages and a set of references are provided. The languages were used to formally describe some of the Open System Interconnect (OSI) protocols.

  14. Assessing the Effectiveness of a Computer-Enhanced Classroom.

    ERIC Educational Resources Information Center

    McLean, Daniel D.; Brayley, Russell E.; Rathbun, Gail

    This paper looks at the process of assessment of a computer-enhanced classroom experience during the implementation phase. It utilizes an assessment model based on Rathbun and Goodrum (1994) that suggests multi-methods of data collection. The use of triangulation to answer a research question fits into the proposed multi-method design. This paper…

  15. Perceptions of University Students regarding Computer Assisted Assessment

    ERIC Educational Resources Information Center

    Jamil, Mubashrah

    2012-01-01

    Computer assisted assessment (CAA) is a common technique of assessment in higher educational institutions in Western countries, but a relatively new concept for students and teachers in Pakistan. It was therefore interesting to investigate students' perceptions about CAA practices from different universities of Pakistan. Information was collected…

  16. eWorkbook: A Computer Aided Assessment System

    ERIC Educational Resources Information Center

    Costagliola, Gennaro; Ferrucci, Filomena; Fuccella, Vittorio; Oliveto, Rocco

    2007-01-01

    Computer aided assessment (CAA) tools are increasingly adopted in academic environments alongside other assessment means. In this article, we present a CAA Web application, named eWorkbook, which can be used to evaluate a learner's knowledge by letting the tutor create, and the learner take, on-line tests based on multiple choice, multiple…

  17. A Critical Examination of PISA's Assessment on Scientific Literacy

    ERIC Educational Resources Information Center

    Lau, Kwok-Chi

    2009-01-01

    The OECD "Programme for International Student Assessment" or (PISA) is one of the largest-scale international efforts that have been launched to assess students' scientific literacy. Such an international assessment would likely exert a profound impact on the science education policies of the participating countries/regions, including Hong Kong.…

  18. An Assessment of Post-Professional Athletic Training Students' Critical Thinking Skills and Dispositions

    ERIC Educational Resources Information Center

    Walter, Jessica Marie

    2013-01-01

    The need for outcome measures in critical thinking skills and dispositions for post-professional athletic training programs (PPATPs) is significant. It has been suggested that athletic trainers who are competent and disposed towards thinking critically will be successful in the profession. The purpose of this study is to assess critical thinking…

  19. Computer-controlled endoscopic performance assessment system.

    PubMed

    Hanna, G B; Drew, T; Clinch, P; Hunter, B; Cuschieri, A

    1998-07-01

    We have devised an advanced computer-controlled system (ADEPT) for the objective evaluation of endoscopic task performance. The system's hardware consists of a dual gimbal mechanism that accepts a variety of 5.0-mm standard endoscopic instruments for manipulation in a precisely mapped and enclosed work space. The target object consists of a sprung base plate incorporating various tasks. It is covered by a sprung perforated transparent top plate that has to be moved and held in the correct position by the operator to gain access to the various tasks. Standard video endoscope equipment provides the visual interface between the operator and the target-instrument field. Different target modules can be used, and the level of task difficulty can be adjusted by varying the manipulation, elevation, and azimuth angles. The system's software is designed to (a) prompt the surgeon with the information necessary to perform the task, (b) collect and collate data on performance during execution of specified tasks, and (c) save the data for future analysis. The system was alpha and beta tested to ensure that all functions operated correctly. PMID:9632879

  20. Workplace Educators' Interpretations of Their Assessment Practices: A View through a Critical Practice Lens

    ERIC Educational Resources Information Center

    Trede, Franziska; Smith, Megan

    2014-01-01

    In this paper, we examine workplace educators' interpretations of their assessment practices. We draw on a critical practice lens to conceptualise assessment practice as a social, relational and situated practice that becomes critical through critique and emancipation. We conducted semi-structured interviews followed by roundtable discussions…

  1. The Halpern Critical Thinking Assessment and Real-World Outcomes: Cross-National Applications

    ERIC Educational Resources Information Center

    Butler, Heather A.; Dwyer, Christopher P.; Hogan, Michael J.; Franco, Amanda; Rivas, Silvia F.; Saiz, Carlos; Almeida, Leandro S.

    2012-01-01

    The Halpern Critical Thinking Assessment (HCTA) is a reliable measure of critical thinking that has been validated with numerous qualitatively different samples and measures of academic success (Halpern, 2010a). This paper presents several cross-national applications of the assessment, and recent work to expand the validation of the HCTA with…

  2. Assessing Reliability: Critical Corrections for a Critical Examination of the Rorschach Comprehensive System.

    ERIC Educational Resources Information Center

    Meyer, Gregory J.

    1997-01-01

    In reply to criticism of the Rorschach Comprehensive System (CS) by J. Wood, M. Nezworski, and W. Stejskal (1996), this article presents a meta-analysis of published data indicating that the CS has excellent chance-corrected interrater reliability. It is noted that the erroneous assumptions of Wood et al. make their assertions about validity…

  3. Actor-critic models of the basal ganglia: new anatomical and computational perspectives.

    PubMed

    Joel, Daphna; Niv, Yael; Ruppin, Eytan

    2002-01-01

    A large number of computational models of information processing in the basal ganglia have been developed in recent years. Prominent among these are actor-critic models of basal ganglia functioning, which build on the strong resemblance between dopamine neuron activity and the temporal difference prediction error signal in the critic, and between dopamine-dependent long-term synaptic plasticity in the striatum and learning guided by a prediction error signal in the actor. We selectively review several actor-critic models of the basal ganglia with an emphasis on two important aspects: the way in which models of the critic reproduce the temporal dynamics of dopamine firing, and the extent to which models of the actor take into account known basal ganglia anatomy and physiology. To complement the efforts to relate basal ganglia mechanisms to reinforcement learning (RL), we introduce an alternative approach to modeling a critic network, which uses Evolutionary Computation techniques to 'evolve' an optimal RL mechanism, and relate the evolved mechanism to the basic model of the critic. We conclude our discussion of models of the critic with a critical examination of the anatomical plausibility of implementations of a critic in basal ganglia circuitry, and conclude that such implementations build on assumptions that are inconsistent with the known anatomy of the basal ganglia. We then return to the actor component of the actor-critic model, which is usually modeled at the striatal level with very little detail. We describe an alternative model of the basal ganglia which takes into account several important, and previously neglected, anatomical and physiological characteristics of basal ganglia-thalamocortical connectivity and suggests that the basal ganglia perform reinforcement-biased dimensionality reduction of cortical inputs. We further suggest that since such selective encoding may bias the representation at the level of the frontal cortex towards the selection of rewarded
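
    The core computational loop shared by these models can be illustrated with a minimal tabular actor-critic on a toy chain task. This is a generic reinforcement-learning sketch, not any of the reviewed basal ganglia models, and all parameter values are illustrative:

      import numpy as np

      rng = np.random.default_rng(0)
      n_states, n_actions = 5, 2
      V = np.zeros(n_states)                    # critic: state-value estimates
      prefs = np.zeros((n_states, n_actions))   # actor: action preferences
      alpha_c, alpha_a, gamma = 0.1, 0.1, 0.95

      def step(s, a):
          # toy chain: action 1 moves right, action 0 moves left; reward at the end
          s2 = min(s + 1, n_states - 1) if a == 1 else max(s - 1, 0)
          return s2, (1.0 if s2 == n_states - 1 else 0.0)

      for episode in range(200):
          s = 0
          for t in range(20):
              p = np.exp(prefs[s]) / np.exp(prefs[s]).sum()   # softmax policy
              a = rng.choice(n_actions, p=p)
              s2, r = step(s, a)
              delta = r + gamma * V[s2] - V[s]   # TD prediction error ("dopamine signal")
              V[s] += alpha_c * delta            # critic update
              prefs[s, a] += alpha_a * delta     # actor update (DA-gated plasticity)
              s = s2
              if r > 0:
                  break
      print(np.round(V, 2))   # values rise toward the rewarded end of the chain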

  4. Does Computer-Based Motor Skill Assessment Training Transfer to Live Assessing?

    ERIC Educational Resources Information Center

    Kelly, Luke E.; Taliaferro, Andrea; Krause, Jennifer

    2012-01-01

    Developing competency in motor skill assessment has been identified as a critical need in physical educator preparation. We conducted this study to evaluate (a) the effectiveness of a web-based instructional program--Motor Skill Assessment Program (MSAP)--for developing assessment competency, and specifically (b) whether competency developed by…

  5. Risk Assessment Methodology for Protecting Our Critical Physical Infrastructures

    SciTech Connect

    BIRINGER,BETTY E.; DANNEELS,JEFFREY J.

    2000-12-13

    Critical infrastructures are central to our national defense and our economic well-being, but many are taken for granted. Presidential Decision Directive (PDD) 63 highlights the importance of eight of our critical infrastructures and outlines a plan for action. Greatly enhanced physical security systems will be required to protect these national assets from new and emerging threats. Sandia National Laboratories has been the lead laboratory for the Department of Energy (DOE) in developing and deploying physical security systems for the past twenty-five years. Many of the tools, processes, and systems employed in the protection of high consequence facilities can be adapted to the civilian infrastructure.

  6. Teaching in the Zone: Formative Assessments for Critical Thinking

    ERIC Educational Resources Information Center

    Maniotes, Leslie K.

    2010-01-01

    This article discusses how a school librarian can help students improve their critical thinking and strengthen their higher order thinking skills through the inquiry process. First, it will use a Guided Inquiry approach to examine how higher order thinking skills are taught within an inquiry paradigm. Next, it will consider how formative…

  7. Assess the Critical Period Hypothesis in Second Language Acquisition

    ERIC Educational Resources Information Center

    Du, Lihong

    2010-01-01

    The Critical Period Hypothesis aims to investigate the reason for significant difference between first language acquisition and second language acquisition. Over the past few decades, researchers carried out a series of studies to test the validity of the hypothesis. Although there were certain limitations in these studies, most of their results…

  8. Assessing the Cultural Proficiency of Teachers: A Critical Perspective

    ERIC Educational Resources Information Center

    Dennie, Deborah A. G.

    2013-01-01

    This critical case study addressed how the achievement gap reflects the culture gap between teachers and historically underrepresented students. This study allows educators to consider how attitudes on culture and diversity impact student achievement. It makes visible existing teacher and student relationships in a rural school system through the…

  9. Assessment of Prospective Teachers' Views Regarding the Concept of Criticism

    ERIC Educational Resources Information Center

    Karakus, Neslihan

    2015-01-01

    Critical thinking is one of the skills that exist in the Turkish course curriculum and that students are expected to acquire. The objective of the study is to determine prospective Turkish teachers' perspectives regarding the concept of criticism, which is both a mental exercise and plays an important role in the world of ideas. In order to assess…

  10. Engaging Faculty in the Assessment and Improvement of Students' Critical Thinking Using the Critical Thinking Assessment Test

    ERIC Educational Resources Information Center

    Stein, Barry; Haynes, Ada

    2011-01-01

    Many assessment experts believe it is essential to develop faculty-driven assessment tools in order to engage faculty in meaningful assessment that can improve student learning. Tennessee Technological University (TTU) has been involved in an extended effort during the last ten years to develop, refine, and nationally disseminate an instrument to…

  11. Evaluation of theoretical critical angle including mass effects for channeling by computer simulation

    NASA Astrophysics Data System (ADS)

    Takeuchi, Wataru

    2011-06-01

    The critical angles calculated with the theory of Zheng et al., which includes mass effects, for the axial channeling of ions have been investigated by computer simulation, with comparisons against the theory of Lindhard and the precise formula from Barrett's numerical simulations. The computer simulations, employing the ACOCT program code, which treats atomic collisions three-dimensionally and is based on the binary collision approximation (BCA), were carried out for the channeling of He, Ne, Ar, Kr, Xe and Rn ions incident along the <1 0 0> axis in Al, Cu, Ag and Pt crystals. The slight dependence of the channeling critical angle on the atomic number of the incident ion in the ACOCT results agrees with that in the angles calculated using the theory with mass effects. The average critical angles in the ACOCT results for the channeling of the six rare gas ions are approximately 5.0/Z2 times the magnitude of the theoretical critical angles with mass effects, where Z2 is the atomic number of the crystal atom. Besides, the results show that the critical angles calculated using the theory with mass effects are substantially larger than those from the theory of Lindhard, Barrett's formula, and the formula from the ACOCT simulations for He ions impinging on Al, Cu, Ag and Pt crystals, and that the channeling critical angles in the ACOCT results agree well with those calculated using Barrett's formula for 0.6-50 MeV He ions incident on Cu and Ag crystals and 5-50 MeV He ions impinging on Al and Pt crystals.
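
    For orientation, the scale of these angles can be estimated from Lindhard's continuum expression for the axial critical angle, psi_1 = sqrt(2*Z1*Z2*e^2/(E*d)), with Z1 and Z2 the ion and target atomic numbers, E the ion energy, and d the atomic spacing along the row. A hedged sketch (this is the plain Lindhard formula without the mass effects discussed above, and the Cu <1 0 0> row spacing is taken as roughly the lattice constant):

      import math

      E2_EV_ANG = 14.4  # e^2 in eV*angstrom (Gaussian units)

      def lindhard_psi1(z1, z2, e_ev, d_ang):
          # Lindhard axial critical angle (radians) for ion energy e_ev (eV)
          # and atomic row spacing d_ang (angstroms)
          return math.sqrt(2.0 * z1 * z2 * E2_EV_ANG / (e_ev * d_ang))

      # 1 MeV He (Z1 = 2) along Cu <1 0 0> rows (Z2 = 29, d ~ 3.61 angstroms)
      print(math.degrees(lindhard_psi1(2, 29, 1.0e6, 3.61)))  # roughly 1.2 degrees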

  12. Criticism or praise? The impact of verbal versus text-only computer feedback on social presence, intrinsic motivation, and recall.

    PubMed

    Bracken, Cheryl Campanella; Jeffres, Leo W; Neuendorf, Kimberly A

    2004-06-01

    The Computers Are Social Actors (CASA) paradigm asserts that human computer users interact socially with computers, and has provided extensive evidence that this is the case. In this experiment (n = 134), participants received either praise or criticism from a computer. Independent variables were the direction of feedback (praise or criticism) and the voice channel (verbal or text-only). Dependent variables, measured via a computer-based questionnaire, were recall, perceived ability, intrinsic motivation, and perceptions of the computer as a social entity. Results demonstrate that participants reacted to computers much as interpersonal communication research would predict, with participants who received text-only criticism reporting higher levels of intrinsic motivation, perceived ability, and recall; they also rated the computer as more intelligent. Implications for theory and application are discussed. PMID:15257835

  13. Providing Formative Feedback From a Summative Computer-aided Assessment

    PubMed Central

    Sewell, Robert D. E.

    2007-01-01

    Objectives. To examine the effectiveness of providing formative feedback for summative computer-aided assessment. Design. Two groups of first-year undergraduate life science students in pharmacy and neuroscience who were studying an e-learning package in a common pharmacology module were presented with a computer-based summative assessment. A sheet with individualized feedback derived from each of the 5 results sections of the assessment was provided to each student. Students were asked via a questionnaire to evaluate the form and method of feedback. Assessment. The students were able to reflect on their performance and use the feedback provided to guide their future study or revision. There was no significant difference between the responses from pharmacy and neuroscience students. Students' responses on the questionnaire indicated a generally positive reaction to this form of feedback. Conclusions. Findings suggest that additional formative assessment conveyed by this style and method would be appreciated and valued by students. PMID:17533442

  14. Assessing Critical Thinking: A College's Journey and Lessons Learned

    ERIC Educational Resources Information Center

    Peach, Brian E.; Mukherjee, Arup; Hornyak, Martin

    2007-01-01

    The business college at University of West Florida is currently in the throes of implementing an assessment initiative to develop student learning outcomes, design assessment devices to measure learning, analyze the measurement results to identify learning shortfalls, and establish feedback mechanisms to modify the curriculum to address the…

  15. Evaluation of the Food and Agriculture Sector Criticality Assessment Tool (FASCAT) and the Collected Data.

    PubMed

    Huff, Andrew G; Hodges, James S; Kennedy, Shaun P; Kircher, Amy

    2015-08-01

    To protect and secure food resources for the United States, it is crucial to have a method to compare food systems' criticality. In 2007, the U.S. government funded development of the Food and Agriculture Sector Criticality Assessment Tool (FASCAT) to determine which food and agriculture systems were most critical to the nation. FASCAT was developed in a collaborative process involving government officials and food industry subject matter experts (SMEs). After development, data were collected using FASCAT to quantify threats, vulnerabilities, consequences, and the impacts on the United States from failure of evaluated food and agriculture systems. To examine FASCAT's utility, linear regression models were used to determine: (1) which groups of questions posed in FASCAT were better predictors of cumulative criticality scores; and (2) whether the items included in FASCAT's criticality method or the smaller subset of FASCAT items included in DHS's risk analysis method predicted similar criticality scores. Akaike's information criterion was used to determine which regression models best described criticality, and a mixed linear model was used to shrink estimates of criticality for individual food and agriculture systems. The results indicated that: (1) some of the questions used in FASCAT strongly predicted food or agriculture system criticality; (2) the FASCAT criticality formula was a stronger predictor of criticality than the DHS risk formula; (3) the cumulative criticality formula predicted criticality more strongly than the weighted criticality formula; and (4) the mixed linear regression model did not change the rank-order of food and agriculture system criticality to a large degree. PMID:25857323
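
    The model-selection step described above can be sketched in a few lines. Here AIC is computed for ordinary least squares fits, one common formulation; the predictors, coefficients, and data are hypothetical stand-ins, not FASCAT items:

      import numpy as np

      def ols_aic(X, y):
          # AIC for an OLS fit: n*log(RSS/n) + 2k (Gaussian likelihood, constant dropped)
          X = np.column_stack([np.ones(len(y)), X])      # add intercept
          beta, *_ = np.linalg.lstsq(X, y, rcond=None)
          rss = ((y - X @ beta) ** 2).sum()
          n, k = len(y), X.shape[1]
          return n * np.log(rss / n) + 2 * k

      rng = np.random.default_rng(1)
      n = 50
      threat, vuln, conseq = rng.random(n), rng.random(n), rng.random(n)
      crit = 2.0 * conseq + 0.5 * threat + rng.normal(0.0, 0.1, n)  # hypothetical scores

      full = np.column_stack([threat, vuln, conseq])   # "criticality method" items
      subset = conseq.reshape(-1, 1)                   # smaller "risk method" subset
      print(ols_aic(full, crit), ols_aic(subset, crit))  # lower AIC = preferred model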

  16. Computer Simulations to Support Science Instruction and Learning: A critical review of the literature

    NASA Astrophysics Data System (ADS)

    Smetana, Lara Kathleen; Bell, Randy L.

    2012-06-01

    Researchers have explored the effectiveness of computer simulations for supporting science teaching and learning during the past four decades. The purpose of this paper is to provide a comprehensive, critical review of the literature on the impact of computer simulations on science teaching and learning, with the goal of summarizing what is currently known and providing guidance for future research. We report on the outcomes of 61 empirical studies dealing with the efficacy of, and implications for, computer simulations in science instruction. The overall findings suggest that simulations can be as effective, and in many ways more effective, than traditional (i.e. lecture-based, textbook-based and/or physical hands-on) instructional practices in promoting science content knowledge, developing process skills, and facilitating conceptual change. As with any other educational tool, the effectiveness of computer simulations is dependent upon the ways in which they are used. Thus, we outline specific research-based guidelines for best practice. Computer simulations are most effective when they (a) are used as supplements; (b) incorporate high-quality support structures; (c) encourage student reflection; and (d) promote cognitive dissonance. Used appropriately, computer simulations involve students in inquiry-based, authentic science explorations. Additionally, as educational technologies continue to evolve, advantages such as flexibility, safety, and efficiency deserve attention.

  17. Optimal recovery sequencing for critical infrastructure resilience assessment.

    SciTech Connect

    Vugrin, Eric D.; Brown, Nathanael J. K.; Turnquist, Mark Alan

    2010-09-01

    Critical infrastructure resilience has become a national priority for the U. S. Department of Homeland Security. System resilience has been studied for several decades in many different disciplines, but no standards or unifying methods exist for critical infrastructure resilience analysis. This report documents the results of a late-start Laboratory Directed Research and Development (LDRD) project that investigated the identification of optimal recovery strategies that maximize resilience. To this goal, we formulate a bi-level optimization problem for infrastructure network models. In the 'inner' problem, we solve for network flows, and we use the 'outer' problem to identify the optimal recovery modes and sequences. We draw from the literature of multi-mode project scheduling problems to create an effective solution strategy for the resilience optimization model. We demonstrate the application of this approach to a set of network models, including a national railroad model and a supply chain for Army munitions production.
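
    A toy version of the outer/inner structure can be sketched by enumerating recovery sequences of failed links (outer problem) and re-solving a max-flow network model after each restoration (inner problem), scoring each sequence by the cumulative flow delivered. The network, capacities, and failed links below are hypothetical, and brute-force enumeration stands in for the report's scheduling formulation:

      import itertools
      import networkx as nx

      CAPS = {("s", "a"): 10, ("a", "t"): 6, ("a", "b"): 5, ("b", "t"): 8, ("s", "b"): 4}
      FAILED = [("a", "t"), ("s", "b")]   # damaged links (hypothetical)

      def flow_after(restored):
          # inner problem: max s-t flow once the links in `restored` are back in service
          g = nx.DiGraph()
          for (u, v), c in CAPS.items():
              if (u, v) not in FAILED or (u, v) in restored:
                  g.add_edge(u, v, capacity=c)
          return nx.maximum_flow(g, "s", "t")[0]

      def resilience(seq):
          # cumulative flow delivered as links are restored one at a time
          return sum(flow_after(set(seq[:i + 1])) for i in range(len(seq)))

      # outer problem: pick the recovery order that maximizes cumulative flow
      best = max(itertools.permutations(FAILED), key=resilience)
      print("best recovery order:", best, "score:", resilience(best))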

  18. Pain assessment and management in critically ill older adults.

    PubMed

    Kirksey, Kenn M; McGlory, Gayle; Sefcik, Elizabeth F

    2015-01-01

    Older adults comprise approximately 50% of patients admitted to critical care units in the United States. This population is particularly susceptible to multiple morbidities that can be exacerbated by confounding factors like age-related safety risks, polypharmacy, poor nutrition, and social isolation. The elderly are particularly vulnerable to health conditions (heart disease, stroke, and diabetes) that put them at greater risk of morbidity and mortality. When an older adult presents to the emergency department with 1 or more of these life-altering diagnoses, an admission to the intensive care unit is often inevitable. Pain is one of the most pervasive manifestations exhibited by intensive care unit patients. There are myriad challenges for critical care nurses in caring for patients experiencing pain: inadequate communication (cognitively impaired or intubated patients), addressing the concerns of family members, and gaps in patients' knowledge. The purpose of this article was to discuss the multidimensional nature of pain and identify concepts innate to pain homeostenosis for elderly patients in the critical care setting. Evidence-based strategies, including an interprofessional team approach and best practice recommendations regarding pharmacological and nonpharmacological pain management, are presented. PMID:26039645

  19. Assessment of Teaching Methods and Critical Thinking in a Course for Science Majors

    NASA Astrophysics Data System (ADS)

    Speck, Angela; Ruzhitskaya, L.; Whittington, A. G.

    2014-01-01

    Ability to think critically is a key ingredient of the scientific mindset. Students who take science courses may or may not be predisposed to critical thinking, the ability to evaluate information analytically. Regardless of their starting point, students can significantly improve their critical thinking through learning and practicing their reasoning skills, critical assessments, conducting and reflecting on observations and experiments, building their questioning and communication skills, and through the use of other techniques. While there are several teaching methods that may help to improve critical thinking, there are only a few assessment instruments that can help in evaluating the efficacy of these methods. Critical thinking skills and improvement in those skills are notoriously difficult to measure. Assessments that are based on multiple-choice questions demonstrate students’ final decisions but not their thinking processes. In addition, during the course of studies students may develop subject-based critical thinking while not being able to extend the skills to general critical thinking. As such, we wanted to design and conduct a study on the efficacy of several teaching methods, in which we would learn how students improve their thinking processes within a science discipline as well as in everyday life situations. We conducted a study among 20 astronomy, physics and geology majors, both graduate and undergraduate students, enrolled in our Solar System Science course (mostly seniors and early graduate students) at the University of Missouri. We used the Ennis-Weir Critical Thinking Essay test to assess students’ general critical thinking and, in addition, we implemented our own subject-based critical thinking assessment. Here, we present the results of this study and share our experience on designing a subject-based critical thinking assessment instrument.

  20. Assessment of examinations in computer science doctoral education

    NASA Astrophysics Data System (ADS)

    Straub, Jeremy

    2014-01-01

    This article surveys the examination requirements for attaining degree candidate (candidacy) status in computer science doctoral programs at all of the computer science doctoral granting institutions in the United States. It presents a framework for program examination requirement categorization, and categorizes these programs by the type or types of candidacy examinations that are required. The performance of computer science departments, estimated via two common surrogate metrics, is compared and contrasted across these different categories of candidacy requirements, and the correlation between candidacy requirements and program/department performance is assessed.

  1. Assessment of Critical Mass Laboratory safeguards and security upgrades

    SciTech Connect

    Merrill, B.J.; DeMyer, J.J.

    1985-05-31

    Pacific Northwest Laboratory (PNL) conducted an evaluation of the safeguards and security systems at the Critical Mass Laboratory (CML) in February 1985, to identify appropriate upgrading actions necessary to ensure that effective and efficient systems consistent with DOE-RL policies, procedures, and site priorities are in place. Since that evaluation, there have been changes in Patrol contingency philosophy, response tactics, and distribution of manpower. Because of these changes, and at the request of DOE-RL, PNL has re-evaluated the safeguards and security systems in place at CML.

  2. Investigation of the "Convince Me" Computer Environment as a Tool for Critical Argumentation about Public Policy Issues

    ERIC Educational Resources Information Center

    Adams, Stephen T.

    2003-01-01

    The "Convince Me" computer environment supports critical thinking by allowing users to create and evaluate computer-based representations of arguments. This study investigates theoretical and design considerations pertinent to using "Convince Me" as an educational tool to support reasoning about public policy issues. Among computer environments…

  3. Critical assessment of Reynolds stress turbulence models using homogeneous flows

    NASA Technical Reports Server (NTRS)

    Shabbir, Aamir; Shih, Tsan-Hsing

    1992-01-01

    In modeling the rapid part of the pressure correlation term in the Reynolds stress transport equations, extensive use has been made of its exact properties, which were first suggested by Rotta. These, for example, have been employed in obtaining the widely used Launder, Reece and Rodi (LRR) model. Some recent proposals have dropped one of these properties to obtain new models. We demonstrate, by computing some simple homogeneous flows, that doing so does not lead to any significant improvements over the LRR model and is not the right direction for improving the performance of existing models. The reason for this, in our opinion, is that violating one of the exact properties cannot bring any new physics into the model. We compute thirteen homogeneous flows using the LRR (with a recalibrated rapid-term constant), IP and SSG models. The flows computed include the flow through an axisymmetric contraction; an axisymmetric expansion; distortion by plane strain; and homogeneous shear flows with and without rotation. Results show that the LRR model, the most general representation linear in the anisotropy tensor, performs either better than or as well as the other two models of the same level.
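
    Although the comparison above concerns the rapid pressure-strain term, the relaxation structure common to this level of closure can be illustrated with the classical Rotta slow term, under which the anisotropy tensor decays toward isotropy. A minimal sketch, assuming fixed k and epsilon and an illustrative Rotta constant (not the paper's computations):

      import numpy as np

      # Rotta-type linear return to isotropy: da_ij/dt = -C1 * (eps/k) * a_ij
      C1, k, eps, dt = 1.8, 1.0, 0.5, 0.01
      a = np.array([[ 0.20,  0.05,  0.00],
                    [ 0.05, -0.10,  0.00],
                    [ 0.00,  0.00, -0.10]])   # initial anisotropy tensor (trace-free)
      for _ in range(500):
          a += dt * (-C1 * (eps / k) * a)     # forward-Euler integration
      print(np.abs(a).max())                  # anisotropy has decayed toward isotropy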

  4. An Overview of a Programme of Research to Support the Assessment of Critical Thinking

    ERIC Educational Resources Information Center

    Black, Beth

    2012-01-01

    Cambridge Assessment has more than 20 years experience in assessing Critical Thinking (CT) in a number of diverse tests and qualifications, unrivalled by any other body within the UK. In recent years, a number of research activities have been carried out in order to support these assessments, with a focus on the validity of measurement. This paper…

  5. Implications of Monte Carlo Statistical Errors in Criticality Safety Assessments

    SciTech Connect

    Pevey, Ronald E.

    2005-09-15

    Most criticality safety calculations are performed using Monte Carlo techniques because of Monte Carlo's ability to handle complex three-dimensional geometries. For Monte Carlo calculations, the more histories sampled, the lower the standard deviation of the resulting estimates. The common intuition is, therefore, that the more histories, the better; as a result, analysts tend to run Monte Carlo analyses as long as possible (or at least to a minimum acceptable uncertainty). For Monte Carlo criticality safety analyses, however, the optimization situation is complicated by the fact that procedures usually require that an extra margin of safety be added because of the statistical uncertainty of the Monte Carlo calculations. This additional safety margin affects the impact of the choice of the calculational standard deviation, both on production and on safety. This paper shows that, under the assumptions of normally distributed benchmarking calculational errors and exact compliance with the upper subcritical limit (USL), the standard deviation that optimizes production is zero, but there is a non-zero value of the calculational standard deviation that minimizes the risk of inadvertently labeling a supercritical configuration as subcritical. Furthermore, this value is shown to be a simple function of the typical benchmarking step outcomes--the bias, the standard deviation of the bias, the upper subcritical limit, and the number of standard deviations added to calculated k-effectives before comparison to the USL.
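
    The trade-off described here can be explored with a small simulation: draw a calculated k-effective with a given bias and calculational standard deviation, apply the usual acceptance test k_calc + n*sigma <= USL, and estimate how often a truly critical configuration would be accepted as subcritical. All parameter values below are illustrative, not taken from the paper:

      import numpy as np

      rng = np.random.default_rng(2)

      def accept_prob(true_k, bias, sigma, usl=0.95, n_sigma=2.0, trials=100_000):
          # calculated k-eff = true k + benchmarking bias + Monte Carlo noise
          k_calc = true_k + bias + rng.normal(0.0, sigma, trials)
          # configuration is accepted as subcritical if k_calc + n*sigma <= USL
          return np.mean(k_calc + n_sigma * sigma <= usl)

      # risk of labeling a truly critical configuration (k = 1.0) as subcritical,
      # as a function of the calculational standard deviation
      for sigma in (0.005, 0.01, 0.02, 0.05):
          print(sigma, accept_prob(1.0, bias=-0.01, sigma=sigma))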

  6. Computation of cross sections and dose conversion factors for criticality accident dosimetry.

    PubMed

    Devine, R T

    2004-01-01

    In the application of criticality accident dosemeters the cross sections and fluence-to-dose conversion factors have to be computed. The cross section and fluence-to-dose conversion factor for the thermal and epi-thermal contributions to neutron dose are well documented; for higher energy regions (>100 keV) these depend on the spectrum assumed. Fluence is determined using threshold detectors. The cross sections require the folding of an expected spectrum with the reaction cross sections. The fluence-to-dose conversion factors also require a similar computation. The true and effective thresholds are used to include the information on the expected spectrum. The spectra can either be taken from compendia or measured at the facility at which the exposures are to be expected. The cross sections can be taken from data computations or analytic representations and the fluence-to-dose conversion factors are determined by various standards making bodies. The problem remaining is the method of computation. The purpose of this paper is to compare two methods for computing these factors: analytic and Monte Carlo. PMID:15353697
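
    The folding operation amounts to a spectrum-weighted average of the reaction cross section. A minimal numerical sketch with hypothetical spectrum and cross-section shapes (not facility data or evaluated nuclear data):

      import numpy as np

      # Energy grid (MeV), a fission-like spectrum shape, and a threshold reaction
      E = np.linspace(0.01, 15.0, 2000)
      phi = np.sqrt(E) * np.exp(-E / 1.3)   # illustrative Maxwellian-like spectrum
      sigma = np.where(E > 1.0, 0.5 * (1.0 - np.exp(-(E - 1.0))), 0.0)  # barns

      # Effective cross section: sum(phi*sigma*dE) / sum(phi*dE); dE cancels on a
      # uniform grid, leaving a spectrum-weighted average
      sigma_eff = (phi * sigma).sum() / phi.sum()
      print(sigma_eff)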

  7. Transfer matrix computation of critical polynomials for two-dimensional Potts models

    DOE PAGESBeta

    Jacobsen, Jesper Lykke; Scullard, Christian R.

    2013-02-04

    We showed, in our previous work, that critical manifolds of the q-state Potts model can be studied by means of a graph polynomial PB(q, v), henceforth referred to as the critical polynomial. This polynomial may be defined on any periodic two-dimensional lattice. It depends on a finite subgraph B, called the basis, and the manner in which B is tiled to construct the lattice. The real roots v = e^K - 1 of PB(q, v) either give the exact critical points for the lattice, or provide approximations that, in principle, can be made arbitrarily accurate by increasing the size of B in an appropriate way. In earlier work, PB(q, v) was defined by a contraction-deletion identity, similar to that satisfied by the Tutte polynomial. Here, we give a probabilistic definition of PB(q, v), which facilitates its computation, using the transfer matrix, on much larger B than was previously possible. We present results for the critical polynomial on the (4, 82), kagome, and (3, 122) lattices for bases of up to respectively 96, 162, and 243 edges, compared to the limit of 36 edges with contraction-deletion. We discuss in detail the role of the symmetries and the embedding of B. The critical temperatures vc obtained for ferromagnetic (v > 0) Potts models are at least as precise as the best available results from Monte Carlo simulations or series expansions. For instance, with q = 3 we obtain vc(4, 82) = 3.742 489 (4), vc(kagome) = 1.876 459 7 (2), and vc(3, 122) = 5.033 078 49 (4), the precision being comparable or superior to the best simulation results. More generally, we trace the critical manifolds in the real (q, v) plane and discuss the intricate structure of the phase diagram in the antiferromagnetic (v < 0) region.
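
    For the square lattice, the smallest basis yields the simple critical polynomial PB(q, v) = v^2 - q, whose positive root recovers the exact ferromagnetic critical point v_c = sqrt(q). A sketch of the root-finding step only (constructing PB for larger bases via the transfer matrix is beyond a snippet):

      import numpy as np

      def critical_v_square_lattice(q):
          # real positive roots of the square-lattice critical polynomial v^2 - q
          roots = np.roots([1.0, 0.0, -q])
          return [r.real for r in roots if abs(r.imag) < 1e-12 and r.real > 0]

      for q in (2, 3, 4):
          # q = 2 is the Ising case: v_c = sqrt(2) ~ 1.4142
          print(q, critical_v_square_lattice(q))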

  8. Critical Inquiry and Writing Centers: A Methodology of Assessment

    ERIC Educational Resources Information Center

    Bell, Diana Calhoun; Frost, Alanna

    2012-01-01

    By examining one writing center's role in student success, this project offers two examples of the way writing centers impact student engagement. This analysis models a methodology that writing and learning center directors can utilize in order to foster effective communication with stakeholders. By conducting data-driven assessment, directors can…

  9. Needs Assessment: A Critical Tool for Guidance Planning.

    ERIC Educational Resources Information Center

    Martin, Susan A.

    This study was conducted to identify what elementary school staff and district parents believed to be important elementary guidance services. A needs assessment questionnaire was given to all 112 staff members (principals, teaching staff, teacher aides, secretaries, and school nurse) in the district's 2 elementary schools. Fifty-eight completed…

  10. Assessing Preservice Teachers' Dispositions: A Critical Dimension of Professional Preparation

    ERIC Educational Resources Information Center

    Rike, Cheryl J.; Sharp, L. Kathryn

    2008-01-01

    The early childhood faculty at the University of Memphis developed the Early Childhood Education Behaviors & Dispositions Checklist for four main purposes: (1) The faculty needed a way to clearly communicate to students the expectations for their dispositions and the means of assessment; (2) It is a professional obligation in preservice teacher…

  11. A critical review of seven selected neighborhood sustainability assessment tools

    SciTech Connect

    Sharifi, Ayyoob; Murayama, Akito

    2013-01-15

    Neighborhood sustainability assessment tools have become widespread since the turn of 21st century and many communities, mainly in the developed world, are utilizing these tools to measure their success in approaching sustainable development goals. In this study, seven tools from Australia, Europe, Japan, and the United States are selected and analyzed with the aim of providing insights into the current situations; highlighting the strengths, weaknesses, successes, and failures; and making recommendations for future improvements. Using a content analysis, the issues of sustainability coverage, pre-requisites, local adaptability, scoring and weighting, participation, reporting, and applicability are discussed in this paper. The results of this study indicate that most of the tools are not doing well regarding the coverage of social, economic, and institutional aspects of sustainability; there are ambiguities and shortcomings in the weighting, scoring, and rating; in most cases, there is no mechanism for local adaptability and participation; and, only those tools which are embedded within the broader planning framework are doing well with regard to applicability. Highlights: Seven widely used assessment tools were analyzed. There is a lack of balanced assessment of sustainability dimensions. Tools are not doing well regarding applicability. Refinements are needed to make the tools more effective. Assessment tools must be integrated into the planning process.

  12. Critical Issues in Assessing Teacher Compensation. Backgrounder. No. 2638

    ERIC Educational Resources Information Center

    Richwine, Jason; Biggs, Andrew G.

    2012-01-01

    A November 2011 Heritage Foundation report--"Assessing the Compensation of Public-School Teachers"--presented data on teacher salaries and benefits in order to inform debates about teacher compensation reform. The report concluded that public-school teacher compensation is far ahead of what comparable private-sector workers enjoy, and that…

  13. Assessment of Critical Events Corridors through Multivariate Cascading Outages Analysis

    SciTech Connect

    Makarov, Yuri V.; Samaan, Nader A.; Diao, Ruisheng; Kumbale, Murali; Chen, Yousu; Singh, Ruchi; Green, Irina; Morgan, Mark P.

    2011-10-17

    Massive blackouts of electrical power systems in North America over the past decade have focused increasing attention upon ways to identify and simulate network events that may potentially lead to widespread network collapse. This paper summarizes a method to simulate power-system vulnerability to cascading failures from a supplied set of initiating events, synonymously termed Extreme Events. The implemented simulation method is currently confined to simulating steady state power-system response to a set of extreme events. The outlined method of simulation is meant to augment and provide a new insight into bulk power transmission network planning that at present remains mainly confined to maintaining power system security for single and double component outages under a number of projected future network operating conditions. Although one of the aims of this paper is to demonstrate the feasibility of simulating network vulnerability to cascading outages, a more important goal has been to determine vulnerable parts of the network that may potentially be strengthened in practice so as to mitigate system susceptibility to cascading failures. This paper proposes to demonstrate a systematic approach to analyze extreme events and identify vulnerable system elements that may be contributing to cascading outages. The hypothesis of critical events corridors is proposed to represent repeating sequential outages that can occur in the system for multiple initiating events. The new concept helps to identify system reinforcements that planners could engineer in order to 'break' the critical events sequences and therefore lessen the likelihood of cascading outages. This hypothesis has been successfully validated with a California power system model.

  14. Incorporating Active-Learning Techniques and Competency Assessment into a Critical Care Elective Course

    PubMed Central

    Hibbs, Jennifer L.

    2012-01-01

    Objective. To design, implement, and measure the effectiveness of a critical care elective course for second-year students in a 3-year accelerated doctor of pharmacy (PharmD) program. Design. A critical care elective course was developed that used active-learning techniques, including cooperative learning and group presentations, to deliver content on critical care topics. Group presentations had to include a disease state overview, practice guidelines, and clinical recommendations, and were evaluated by course faculty members and peers. Assessment. Students’ mean scores on a 20-question critical-care competency assessment administered before and after the course improved by 11% (p < 0.05). Course evaluations and comments were positive. Conclusion. A critical care elective course resulted in significantly improved competency in critical care and was well-received by students. PMID:23049101

  15. Computer-aided testing of pilot response to critical in-flight events

    NASA Technical Reports Server (NTRS)

    Giffin, W. C.; Rockwell, T. H.

    1984-01-01

    This research on pilot response to critical in-flight events employs a unique methodology including an interactive computer-aided scenario-testing system. Navigation displays, instrument-panel displays, and assorted textual material are presented on a touch-sensitive CRT screen. Problem diagnosis scenarios, destination-diversion scenarios and combined destination/diagnostic tests are available. A complete time history of all data inquiries and responses is maintained. Sample results of diagnosis scenarios obtained from testing 38 licensed pilots are presented and discussed.

  16. Ecological risk assessment of acidification in the Northern Eurasia using critical load concept

    SciTech Connect

    Bashkin, V.; Golinets, O.

    1995-12-31

    This research presents a risk analysis of acid-forming compound inputs using critical load (CL) values of sulfur, nitrogen, and acidity, computed for terrestrial and freshwater ecosystems of Northern Eurasia. The CL values are used to set goals for future deposition rates of acidifying and eutrophying compounds so that the environment is protected. CL values for various ecosystems are determined using an EM GIS approach. The most influential sources, such as nitrogen, sulfur and base cation uptake by vegetation, and surface and groundwater leaching from terrestrial to freshwater ecosystems, are described for the whole territory under study with regard to uncertainty analysis and the level of corresponding risk assessment. The uncertainty may be explained by many factors, of which the most important are: the estimation of plant uptake is carried out on the basis of data on the biogeochemical cycling of various elements, for which adequate quantitative characterization for all ecosystems under study is either absent or insufficient; reliable information on the quantitative assessment of the ratio between perennial plant biomass increase and dead matter is absent at the required level of spatial and temporal resolution; reliable data on surface and underground runoff in various ecosystems are rare; and the influence of hydrothermic factors on the above-mentioned processes has not been quantitatively determined at the required level of model resolution.
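
    In the critical load framework, the basic risk computation is exceedance: deposition minus the ecosystem's CL, floored at zero. A minimal sketch on a hypothetical grid (all values illustrative):

      import numpy as np

      # Hypothetical gridded data (eq ha^-1 yr^-1): acidifying deposition and CLs
      deposition = np.array([[ 800, 1200, 400],
                             [1500,  300, 900]])
      critical_load = np.array([[1000, 1000, 500],
                                [1200,  600, 700]])

      exceedance = np.clip(deposition - critical_load, 0, None)  # zero where CL is met
      at_risk = exceedance > 0
      print(exceedance)
      print("fraction of grid cells at risk:", at_risk.mean())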

  17. Nuclear criticality safety assessment of the proposed CFC replacement coolants

    SciTech Connect

    Jordan, W.C.; Dyer, H.R.

    1993-12-01

    The neutron multiplication characteristics of refrigerant-114 (R-114) and the proposed replacement coolants perfluorobutane (C4F10) and cycloperfluorobutane (C4F8) have been compared by evaluating the infinite-media multiplication factors of UF6/H/coolant systems and by replacement calculations considering a 10-MW freezer/sublimer. The results of these comparisons demonstrate that R-114 is a neutron absorber, due to its chlorine content, and that the alternative fluorocarbon coolants are neutron moderators. Estimates of critical spherical geometries considering mixtures of UF6/HF/C4F10 indicate that the fluorocarbon-moderated systems are large compared with water-moderated systems. The freezer/sublimer calculations indicate that the alternative coolants are more reactive than R-114, but that the reactivity remains significantly below the condition of water in the tubes, which was a limiting condition. Based on these results, the alternative coolants appear to be acceptable; however, several follow-up tasks have been recommended, and additional evaluation will be required on an individual equipment basis.

  18. Prediction of State Mandated Assessment Mathematics Scores from Computer Based Mathematics and Reading Preview Assessments

    ERIC Educational Resources Information Center

    Costa-Guerra, Boris

    2012-01-01

    The study sought to understand whether MAPs computer-based assessments of math and language skills, together with MAPs reading scores, can predict student scores on the NMSBA. A key question was whether or not the prediction can be improved by including student language skill scores. The study explored the effectiveness of computer based preview assessments…

  19. CART V: recent advancements in computer-aided camouflage assessment

    NASA Astrophysics Data System (ADS)

    Müller, Thomas; Müller, Markus

    2011-05-01

    In order to facilitate systematic, computer aided improvements of camouflage and concealment assessment methods, the software system CART (Camouflage Assessment in Real-Time) was built up for the camouflage assessment of objects in multispectral image sequences (see contributions to SPIE 2007-2010 [1], [2], [3], [4]). It comprises a semi-automatic marking of target objects (ground truth generation) including their propagation over the image sequence and the evaluation via user-defined feature extractors as well as methods to assess the object's movement conspicuity. In this fifth part in an annual series at the SPIE conference in Orlando, this paper presents the enhancements over the recent year and addresses the camouflage assessment of static and moving objects in multispectral image data that can show noise or image artefacts. The presented methods fathom the correlations between image processing and camouflage assessment. A novel algorithm is presented based on template matching to assess the structural inconspicuity of an object objectively and quantitatively. The results can easily be combined with an MTI (moving target indication) based movement conspicuity assessment function in order to explore the influence of object movement to a camouflage effect in different environments. As the results show, the presented methods contribute to a significant benefit in the field of camouflage assessment.
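
    The structure-based assessment idea can be illustrated with generic normalized cross-correlation template matching in OpenCV. This sketch is illustrative only, not the CART implementation, and synthetic arrays stand in for real multispectral frames; a matching response near 1.0 means the target's structure is easy to find, i.e. it is structurally conspicuous:

      import cv2
      import numpy as np

      rng = np.random.default_rng(3)
      scene = (rng.random((240, 320)) * 255).astype(np.uint8)   # synthetic background
      template = scene[100:140, 150:200].copy()                 # "target" patch cut from it

      # normalized cross-correlation response map over all template positions
      response = cv2.matchTemplate(scene, template, cv2.TM_CCOEFF_NORMED)
      min_val, max_val, min_loc, max_loc = cv2.minMaxLoc(response)
      print("structural conspicuity score:", max_val, "at", max_loc)  # ~1.0 here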

  20. [Risk assessment for pressure ulcer in critical patients].

    PubMed

    Gomes, Flávia Sampaio Latini; Bastos, Marisa Antonini Ribeiro; Matozinhos, Fernanda Penido; Temponi, Hanrieti Rotelli; Velásquez-Meléndez, Gustavo

    2011-04-01

    Bedridden patients are at risk of developing pressure ulcers and represent a priority group to be studied to identify this condition. To reach this goal, specific instruments are used to assess this problem. The objective of this study was to analyze the risk factors for developing pressure ulcers in adult patients hospitalized in ICUs. This is a cross-sectional analytical study, in which evaluations were performed on 140 patients, hospitalized in 22 ICUs, using the Braden scale. Results showed that patients hospitalized for 15 days or more showed some level of risk. The highest frequencies of pressure ulcers were found in patients in the following categories: sensorial perception (completely limited), moistness (constantly moist), mobility (completely immobilized), activity (bedridden), nutrition (adequate) and friction and shear (problem). In conclusion, the use of this scale is an important strategy when providing care to patients in intensive treatment. PMID:21655778
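
    For reference, the Braden scale total is the sum of six subscale scores (range 6 to 23, lower meaning higher risk). A small scoring sketch; the risk cutoffs below are commonly cited ones and should be verified against institutional guidance:

      def braden_total(scores):
          # scores: dict of six subscales; sensory perception, moisture, activity,
          # mobility, nutrition are scored 1-4, friction/shear is scored 1-3
          expected = {"sensory", "moisture", "activity", "mobility", "nutrition", "friction"}
          assert set(scores) == expected
          return sum(scores.values())

      def risk_level(total):
          # commonly cited cutoffs (institutions may differ)
          if total <= 9:  return "very high"
          if total <= 12: return "high"
          if total <= 14: return "moderate"
          if total <= 18: return "mild"
          return "no risk"

      patient = {"sensory": 1, "moisture": 1, "activity": 1,
                 "mobility": 1, "nutrition": 3, "friction": 1}
      print(risk_level(braden_total(patient)))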

  1. Ensuring reliability of safety-critical clinical applications of computational cardiac models

    PubMed Central

    Pathmanathan, Pras; Gray, Richard A.

    2013-01-01

    Computational models of cardiac electrophysiology have been used for over half a century to investigate physiological mechanisms and generate hypotheses for experimental testing, and are now starting to play a role in clinical applications. There is currently a great deal of interest in using models as diagnostic or therapeutic aids, for example using patient-specific whole-heart simulations to optimize cardiac resynchronization therapy, ablation therapy, and defibrillation. However, if models are to be used in safety-critical clinical decision making, the reliability of their predictions needs to be thoroughly investigated. In engineering and the physical sciences, the field of “verification, validation and uncertainty quantification” (VVUQ) [also known as “verification and validation” (V&V)] has been developed for rigorously evaluating the credibility of computational model predictions. In this article we first discuss why it is vital that cardiac models be developed and evaluated within a VVUQ framework, and then consider cardiac models in the context of each of the stages in VVUQ. We identify some of the major difficulties which may need to be overcome for cardiac models to be used in safety-critical clinical applications. PMID:24376423

  3. Criticality Model

    SciTech Connect

    A. Alsaed

    2004-09-14

    The criticality computational method will be used for evaluating the criticality potential of configurations of fissionable materials (in-package and external to the waste package) within the repository at Yucca Mountain, Nevada, for all waste packages/waste forms. The criticality computational method is also applicable to preclosure configurations. The criticality computational method is a component of the methodology presented in ''Disposal Criticality Analysis Methodology Topical Report'' (YMP 2003). How the criticality computational method fits in the overall disposal criticality analysis methodology is illustrated in Figure 1 (YMP 2003, Figure 3). This calculation will not provide direct input to the total system performance assessment for license application. It is to be used as necessary to determine the criticality potential of configuration classes as determined by the configuration probability analysis of the configuration generator model (BSC 2003a).

  4. Advanced criticality assessment method for sewer pipeline assets.

    PubMed

    Syachrani, S; Jeong, H D; Chung, C S

    2013-01-01

    For effective management of water and wastewater infrastructure, the United States Environmental Protection Agency (US-EPA) has long emphasized the significant role of risk in prioritizing and optimizing asset management decisions. High risk assets are defined as assets with a high probability of failure (e.g. soon to fail, old, poor condition) and high consequences of failure (e.g. environmental impact, high expense, safety concerns, social disruption). In practice, the consequences of failure are often estimated by experts through a Delphi method. However, the estimation of the probability of failure has been challenging as it requires the thorough analysis of the historical condition assessment data, repair and replacement records, and other factors influencing the deterioration of the asset. The most common predictor in estimating the probability of failure is calendar age. However, a simple reliance on calendar age as a basis for estimating the asset's deterioration pattern completely ignores the different aging characteristics influenced by various operational and environmental conditions. This paper introduces a new approach of using 'real age' in estimating the probability of failure. Unlike the traditional calendar age method, the real age represents the adjusted age based on the unique operational and environmental conditions of the asset. Depending on the individual deterioration pattern, the real age could be higher or lower than its calendar age. Using the concept of real age, the probability of failure of an asset can be more accurately estimated. PMID:23508155
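
    A hedged sketch of the 'real age' idea follows; the factor names, values, and Weibull failure model are hypothetical illustrations, whereas the paper derives its adjustments from historical condition assessment data:

      import math

      def real_age(calendar_age, factors):
          # Adjust calendar age by multiplicative deterioration factors: > 1 means
          # harsher conditions age the pipe faster, < 1 means milder conditions
          rate = 1.0
          for f in factors.values():
              rate *= f
          return calendar_age * rate

      def prob_failure(age, scale=80.0, shape=2.5):
          # Weibull CDF as an illustrative probability-of-failure model
          return 1.0 - math.exp(-((age / scale) ** shape))

      pipe = {"soil_corrosivity": 1.3, "traffic_load": 1.1, "groundwater": 0.9}
      age = real_age(calendar_age=40, factors=pipe)
      print(age, prob_failure(age))   # compare with prob_failure(40) for calendar age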

  5. Critical comparison of elastography methods to assess chronic liver disease.

    PubMed

    Friedrich-Rust, Mireen; Poynard, Thierry; Castera, Laurent

    2016-07-01

    Staging of liver fibrosis and diagnosis, or exclusion, of early compensated liver cirrhosis are important in the treatment decisions and surveillance of patients with chronic liver disease. Good diagnostic accuracy, increased availability and the possibility to perform follow-up examinations led to the implementation of noninvasive methods into clinical practice. Noninvasive tests are increasingly included in national and international guidelines, leaving liver biopsy reserved for patients with unexplained discordance or suspected additional aetiologies of liver disease. In addition to staging of liver fibrosis, data on the prognostic value of these methods have increased in the past few years and are of great importance for patient care. This Review focuses on elastography methods for noninvasive assessment of liver fibrosis, disease severity and prognosis. Although liver elastography started with transient elastography, at present all large ultrasonography companies offer an elastography technique integrated in their machines. The goal of this Review is to summarize the methodological problems of noninvasive tests in general, in addition to providing an overview on currently available techniques and latest developments in liver elastography. PMID:27273167

  6. Assessing the Amazon Cloud Suitability for CLARREO's Computational Needs

    NASA Technical Reports Server (NTRS)

    Goldin, Daniel; Vakhnin, Andrei A.; Currey, Jon C.

    2015-01-01

    In this document we compare the performance of the Amazon Web Services (AWS), also known as Amazon Cloud, with the CLARREO (Climate Absolute Radiance and Refractivity Observatory) cluster and assess its suitability for the computational needs of the CLARREO mission. A benchmark executable to process one month and one year of PARASOL (Polarization and Anisotropy of Reflectances for Atmospheric Sciences coupled with Observations from a Lidar) data was used. With the optimal AWS configuration, adequate data-processing times, comparable to the CLARREO cluster, were found. The assessment of alternatives to the CLARREO cluster continues and several options, such as a NASA-based cluster, are being considered.

  7. Assessment of nonequilibrium radiation computation methods for hypersonic flows

    NASA Technical Reports Server (NTRS)

    Sharma, Surendra

    1993-01-01

    The present understanding of shock-layer radiation in the low density regime, as appropriate to hypersonic vehicles, is surveyed. Based on the relative importance of electron excitation and radiation transport, the hypersonic flows are divided into three groups: weakly ionized, moderately ionized, and highly ionized flows. In the light of this division, the existing laboratory and flight data are scrutinized. Finally, an assessment of the nonequilibrium radiation computation methods for the three regimes in hypersonic flows is presented. The assessment is conducted by comparing experimental data against the values predicted by the physical model.

  8. Evidence Based Clinical Assessment of Child and Adolescent Social Phobia: A Critical Review of Rating Scales

    ERIC Educational Resources Information Center

    Tulbure, Bogdan T.; Szentagotai, Aurora; Dobrean, Anca; David, Daniel

    2012-01-01

    Investigating the empirical support of various assessment instruments, the evidence based assessment approach expands the scientific basis of psychotherapy. Starting from Hunsley and Mash's evaluative framework, we critically reviewed the rating scales designed to measure social anxiety or phobia in youth. Thirteen of the most researched social…

  9. Using Art to Assess Reading Comprehension and Critical Thinking in Adolescents

    ERIC Educational Resources Information Center

    Holdren, Tara Shoemaker

    2012-01-01

    In the current testing environment, high school reading teachers may often rely on a multiple-choice assessment as the best practice. This study suggests that a visual arts assessment of reading comprehension can rigorously measure critical thinking. This action research study follows 21 high school juniors through the selection, creation, and…

  10. Transfer matrix computation of critical polynomials for two-dimensional Potts models

    SciTech Connect

    Jacobsen, Jesper Lykke; Scullard, Christian R.

    2013-02-04

    We showed in our previous work that critical manifolds of the q-state Potts model can be studied by means of a graph polynomial P_B(q, v), henceforth referred to as the critical polynomial. This polynomial may be defined on any periodic two-dimensional lattice. It depends on a finite subgraph B, called the basis, and the manner in which B is tiled to construct the lattice. The real roots v = e^K - 1 of P_B(q, v) either give the exact critical points for the lattice, or provide approximations that, in principle, can be made arbitrarily accurate by increasing the size of B in an appropriate way. In earlier work, P_B(q, v) was defined by a contraction-deletion identity, similar to that satisfied by the Tutte polynomial. Here, we give a probabilistic definition of P_B(q, v), which facilitates its computation, using the transfer matrix, on much larger B than was previously possible. We present results for the critical polynomial on the (4, 8²), kagome, and (3, 12²) lattices for bases of up to 96, 162, and 243 edges, respectively, compared to the limit of 36 edges with contraction-deletion. We discuss in detail the role of the symmetries and the embedding of B. The critical temperatures v_c obtained for ferromagnetic (v > 0) Potts models are at least as precise as the best available results from Monte Carlo simulations or series expansions. For instance, with q = 3 we obtain v_c(4, 8²) = 3.742 489 (4), v_c(kagome) = 1.876 459 7 (2), and v_c(3, 12²) = 5.033 078 49 (4), the precision being comparable or superior to the best simulation results. More generally, we trace the critical manifolds in the real (q, v) plane and discuss the intricate structure of the phase diagram in the antiferromagnetic (v < 0) region.
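
    The recipe in the abstract, locating the real roots v of the critical polynomial and converting them via v = e^K - 1, can be illustrated on the square lattice, whose minimal basis yields P_B(q, v) = v² - q and hence the exact critical point v_c = √q. This is a textbook case, not one of the lattices studied above; a minimal sketch:

    ```python
    import numpy as np

    q = 3.0
    # P_B(q, v) = v**2 - q for the square-lattice Potts model (minimal basis).
    roots = np.roots([1.0, 0.0, -q])
    v_c = max(r.real for r in roots if abs(r.imag) < 1e-12 and r.real > 0)

    # v = exp(K) - 1, so the critical coupling follows as K_c = ln(1 + v_c).
    K_c = np.log(1.0 + v_c)
    print(v_c, K_c)   # sqrt(3) = 1.73205..., K_c = 1.00505...
    ```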

  11. Does carbon black disaggregate in lung fluid? A critical assessment.

    PubMed

    Levy, Len; Chaudhuri, Ishrat S; Krueger, Nils; McCunney, Robert J

    2012-10-15

    Carbon black is an industrially produced particulate form of nearly pure elemental carbon. The basic building blocks of carbon black are (1) primary particles, minute pieces of matter with defined physical boundaries; (2) aggregates, collections of strongly bound or fused particles; and (3) agglomerates, collections of weakly bound aggregates. Industrial carbon black is produced within a closed reactor where the primary particles form aggregates, which become the indivisible entities of carbon black. These aggregates then form agglomerates, which are the typical form of carbon black in commerce. Carbon black is often used in in vitro and in vivo particle toxicology investigations as a reference nanoparticle. The toxicology studies often report the sizes of the primary particles but rarely the sizes of the aggregates or agglomerates. It appears in many cases that there is a limited understanding of the fact that carbon black typically does not exist as primary particles but instead exists as aggregates and agglomerates. Moreover, many toxicology studies manipulate carbon black particles in order to disperse them so that the form of carbon black used in these toxicology studies may be substantially different from the form that may be encountered in the workplace environment. Since the main exposure route for carbon black is inhalation, the question arose as to whether inhaled carbon black may deagglomerate or disaggregate to either smaller aggregates or primary particles when in contact with lung fluids. This question relates to the concern that there may be additional hazards of smaller particles, such as their ability to translocate to tissues and organs beyond the lung and the ability to pass through the blood-brain barrier. The purpose of this assessment is to review the existing literature for evidence as to whether carbon black deagglomerates or disaggregates into smaller aggregates or primary particles when in contact with lung fluid. On the basis of a review

  12. Cogeneration computer model assessment: Advanced cogeneration research study

    NASA Technical Reports Server (NTRS)

    Rosenberg, L.

    1983-01-01

    Cogeneration computer simulation models were assessed to recommend the most desirable models or their components for use by the Southern California Edison Company (SCE) in evaluating potential cogeneration projects. Existing cogeneration modeling capabilities are described, preferred models are identified, and an approach to the development of a code which will best satisfy SCE requirements is recommended. Five models (CELCAP, COGEN 2, CPA, DEUS, and OASIS) are recommended for further consideration.

  13. Prediction of critical heat flux in water-cooled plasma facing components using computational fluid dynamics.

    SciTech Connect

    Bullock, James H.; Youchison, Dennis Lee; Ulrickson, Michael Andrew

    2010-11-01

    Several commercial computational fluid dynamics (CFD) codes now have the capability to analyze Eulerian two-phase flow using the Rohsenow nucleate boiling model. Analysis of boiling due to one-sided heating in plasma facing components (pfcs) is now receiving attention during the design of water-cooled first wall panels for ITER that may encounter heat fluxes as high as 5 MW/m². Empirical thermal-hydraulic design correlations developed for long fission reactor channels are not reliable when applied to pfcs because fully developed flow conditions seldom exist. Star-CCM+ is one of the commercial CFD codes that can model two-phase flows. Like others, it implements the RPI model for nucleate boiling, but it also seamlessly transitions to a volume-of-fluid model for film boiling. By benchmarking the results of our 3D models against recent experiments on critical heat flux for both smooth rectangular channels and hypervapotrons, we determined the six unique input parameters that accurately characterize the boiling physics for ITER flow conditions under a wide range of absorbed heat flux. We can now exploit this capability to predict the onset of critical heat flux in these components. In addition, the results clearly illustrate the production and transport of vapor and its effect on heat transfer in pfcs from nucleate boiling through transition to film boiling. This article describes the boiling physics implemented in CCM+ and compares the computational results to the benchmark experiments carried out independently in the United States and Russia. Temperature distributions agreed to within 10 °C for a wide range of heat fluxes from 3 MW/m² to 10 MW/m² and flow velocities from 1 m/s to 10 m/s in these devices. Although the analysis is incapable of capturing the stochastic nature of critical heat flux (i.e., time and location may depend on a local materials defect or turbulence phenomenon), it is highly reliable in determining the heat flux where boiling instabilities begin.

  14. Computer assessment of interview data using latent semantic analysis.

    PubMed

    Dam, Gregory; Kaufmann, Stefan

    2008-02-01

    Clinical interviews are a powerful method for assessing students' knowledge and conceptual development. However, the analysis of the resulting data is time-consuming and can create a "bottleneck" in large-scale studies. This article demonstrates the utility of computational methods in supporting such an analysis. Thirty-four 7th-grade student explanations of the causes of Earth's seasons were assessed using latent semantic analysis (LSA). Analyses were performed on transcriptions of student responses during interviews administered prior to (n = 21) and after (n = 13) receiving earth science instruction. An instrument that uses LSA technology was developed to identify misconceptions and assess conceptual change in students' thinking. Its accuracy, as determined by comparing its classifications to the independent coding performed by four human raters, reached 90%. Techniques for adapting LSA technology to support the analysis of interview data, as well as some limitations, are discussed. PMID:18411522
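
    A minimal sketch of the kind of LSA pipeline described above: TF-IDF vectors reduced by truncated SVD, with a student response matched to human-coded reference explanations by cosine similarity. The toy texts, labels, and library choice (scikit-learn) are illustrative assumptions, not details from the study.

    ```python
    from sklearn.feature_extraction.text import TfidfVectorizer
    from sklearn.decomposition import TruncatedSVD
    from sklearn.metrics.pairwise import cosine_similarity

    # Reference explanations pre-coded by human raters (toy examples).
    references = [
        ("tilt", "the tilt of earth's axis changes how directly sunlight strikes"),
        ("distance", "earth is closer to the sun in summer so it is hotter"),
    ]
    student = "summer happens because the earth moves nearer to the sun"

    docs = [text for _, text in references] + [student]
    tfidf = TfidfVectorizer().fit_transform(docs)
    lsa = TruncatedSVD(n_components=2).fit_transform(tfidf)

    # Classify the student response by its nearest reference in LSA space.
    sims = cosine_similarity(lsa[-1:], lsa[:-1])[0]
    print(references[sims.argmax()][0])   # expected: the "distance" misconception
    ```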

  15. Open-ended approaches to science assessment using computers

    NASA Astrophysics Data System (ADS)

    Singley, Mark K.; Taft, Hessy L.

    1995-03-01

    We discuss the potential role of technology in evaluating learning outcomes in large-scale, widespread science assessments of the kind typically done at ETS, such as the GRE, or the College Board SAT II Subject Tests. We describe the current state-of-the-art in this area, as well as briefly outline the history of technology in large-scale science assessment and ponder possibilities for the future. We present examples from our own work in the domain of chemistry, in which we are designing problem solving interfaces and scoring programs for stoichiometric and other kinds of quantitative problem solving. We also present a new scientific reasoning item type that we are prototyping on the computer. It is our view that the technological infrastructure for large-scale constructed response science assessment is well on its way to being available, although many technical and practical hurdles remain.

  16. Computer technology futures for the improvement of assessment

    NASA Astrophysics Data System (ADS)

    Baker, Eva L.; O'Neil, Harold F.

    1995-03-01

    With a focus on the interaction between computer technology and assessment, we first review the typical functions served by technology in the support of various assessment purposes. These include efficiencies in person and item sampling and in administration, analysis, and reporting. Our major interest is the extent to which technology can provide unique opportunities to understand performance. Two examples are described: a tool-based knowledge representation approach to assess content understanding and a team problem-solving task involving negotiation. The first example, using HyperCard as well as paper-and-pencil variations, has been tested in science and history fields. Its continuing challenge is to determine a strategy for creating and validating scoring criteria. The second example, involving a workforce readiness task for secondary school, has used expert-novice comparisons to infer performance standards. These examples serve as the context for the exploration of validity, equity, and utility.

  17. Assessment of asthmatic inflammation using hybrid fluorescence molecular tomography-x-ray computed tomography

    NASA Astrophysics Data System (ADS)

    Ma, Xiaopeng; Prakash, Jaya; Ruscitti, Francesca; Glasl, Sarah; Stellari, Fabio Franco; Villetti, Gino; Ntziachristos, Vasilis

    2016-01-01

    Nuclear imaging plays a critical role in asthma research but is limited in its readings of biology due to the short-lived signals of radio-isotopes. We employed hybrid fluorescence molecular tomography (FMT) and x-ray computed tomography (XCT) for the assessment of asthmatic inflammation based on resolving cathepsin activity and matrix metalloproteinase activity in dust mite, ragweed, and Aspergillus species-challenged mice. The reconstructed multimodal fluorescence distribution showed good correspondence with ex vivo cryosection images and histological images, confirming FMT-XCT as an interesting alternative for asthma research.

  18. Assessment of Zero Power Critical Experiments and Needs for a Fission Surface Power System

    SciTech Connect

    Jim R Parry; John Darrell bess; Brad T. Rearden; Gary A. Harms

    2009-06-01

    The National Aeronautics and Space Administration (NASA) is providing funding to the Department of Energy (DOE) to assess, develop, and test nuclear technologies that could provide surface power to a lunar outpost. Sufficient testing of this fission surface power (FSP) system will need to be completed to enable a decision by NASA for flight development. The near-term goal for the FSP work is to conduct the minimum amount of testing needed to validate the system performance within an acceptable risk. This report attempts to assess the current modeling capabilities and quantify any bias associated with the modeling methods for designing the nuclear reactor. The baseline FSP system is a sodium-potassium (NaK) cooled, fast spectrum reactor with 93% ²³⁵U enriched HEU-O₂ fuel, SS316 cladding, and beryllium reflectors with B₄C control drums. The FSP is to produce approximately 40 kWe net power with a lifetime of at least 8 years at full power. A flight-ready FSP is to be ready for launch and deployment by 2020. Existing benchmarks from the International Criticality Safety Benchmark Evaluation Program (ICSBEP) were reviewed and modeled in MCNP. An average bias of less than 0.6% was determined using the ENDF/B-VII cross-section libraries except in the case of subcritical experiments, which exhibited an average bias of approximately 1.5%. The bias increases with increasing reflector worth of the beryllium. The uncertainties and sensitivities in cross section data for the FSP model and ZPPR-20 configurations were assessed using TSUNAMI-3D. The cross-section covariance uncertainty in the FSP model was calculated as 2.09%, which was dominated by the uncertainty in the ²³⁵U(n,γ) reactions. Global integral indices were generated in TSUNAMI-IP using pre-release SCALE 6 cross-section covariance data. The ZPPR-20 benchmark models exhibit strong similarity with the FSP model. A penalty assessment was performed to determine the degree to which the FSP model could not be characterized
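
    The bias figures quoted above are deviations of calculated k_eff from benchmark values. A minimal sketch of that bookkeeping, assuming a benchmark k_eff of 1.0 for critical configurations and using invented calculation results; "bias" is taken here as the mean absolute deviation, which may differ from the report's exact convention.

    ```python
    # Calculated k_eff from a set of benchmark models -- illustrative values.
    calculated_keff = [0.9952, 0.9968, 1.0021, 0.9940, 0.9983]

    # Deviation of each calculation from the expected critical value of 1.0.
    biases = [abs(k - 1.0) for k in calculated_keff]
    avg_bias_pct = 100.0 * sum(biases) / len(biases)
    print(f"average bias: {avg_bias_pct:.2f}%")   # cf. < 0.6% reported above
    ```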

  19. Improving Educational Assessment: A Computer-Adaptive Multiple Choice Assessment Using NRET as the Scoring Method

    ERIC Educational Resources Information Center

    Sie Hoe, Lau; Ngee Kiong, Lau; Kian Sam, Hong; Bin Usop, Hasbee

    2009-01-01

    Assessment is central to any educational process. Number Right (NR) scoring method is a conventional scoring method for multiple choice items, where students need to pick one option as the correct answer. One point is awarded for the correct response and zero for any other responses. However, it has been heavily criticized for guessing and failure…

  20. Applying analytic hierarchy process to assess healthcare-oriented cloud computing service systems.

    PubMed

    Liao, Wen-Hwa; Qiu, Wan-Li

    2016-01-01

    Numerous differences exist between the healthcare industry and other industries. Difficulties in the business operation of the healthcare industry have continually increased because of the volatility and importance of health care, changes to and requirements of health insurance policies, and the statuses of healthcare providers, which are typically considered not-for-profit organizations. Moreover, because of the financial risks associated with constant changes in healthcare payment methods and constantly evolving information technology, healthcare organizations must continually adjust their business operation objectives; therefore, cloud computing presents both a challenge and an opportunity. As a response to aging populations and the prevalence of the Internet in fast-paced contemporary societies, cloud computing can be used to facilitate the task of balancing the quality and costs of health care. To evaluate cloud computing service systems for use in health care, providing decision makers with a comprehensive assessment method for prioritizing decision-making factors is highly beneficial. Hence, this study applied the analytic hierarchy process, compared items related to cloud computing and health care, executed a questionnaire survey, and then classified the critical factors influencing healthcare cloud computing service systems on the basis of statistical analyses of the questionnaire results. The results indicate that the primary factor affecting the design or implementation of optimal cloud computing healthcare service systems is cost effectiveness, with the secondary factors being practical considerations such as software design and system architecture. PMID:27441149
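
    The core AHP step the study relies on can be sketched compactly: derive criterion weights from a pairwise-comparison matrix via its principal eigenvector, then check the consistency ratio. The three criteria and all judgment values below are illustrative, not the study's survey data.

    ```python
    import numpy as np

    # Pairwise comparisons (Saaty scale) for: cost effectiveness,
    # software design, system architecture -- illustrative judgments only.
    A = np.array([[1.0, 3.0, 5.0],
                  [1/3, 1.0, 2.0],
                  [1/5, 1/2, 1.0]])

    eigvals, eigvecs = np.linalg.eig(A)
    k = np.argmax(eigvals.real)
    weights = np.abs(eigvecs[:, k].real)
    weights /= weights.sum()                 # normalized priority weights

    n = A.shape[0]
    ci = (eigvals.real[k] - n) / (n - 1)     # consistency index
    cr = ci / 0.58                           # 0.58 = random index for n = 3
    print(weights, cr)                       # CR < 0.1 => acceptable consistency
    ```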

  1. Control System Applicable Use Assessment of the Secure Computing Corporation - Secure Firewall (Sidewinder)

    SciTech Connect

    Hadley, Mark D.; Clements, Samuel L.

    2009-01-01

    Battelle’s National Security & Defense objective is “applying unmatched expertise and unique facilities to deliver homeland security solutions. From detection and protection against weapons of mass destruction to emergency preparedness/response and protection of critical infrastructure, we are working with industry and government to integrate policy, operational, technological, and logistical parameters that will secure a safe future”. In an ongoing effort to meet this mission, engagements with industry that are intended to improve operational and technical attributes of commercial solutions related to national security initiatives are necessary. This necessity will ensure that capabilities for protecting critical infrastructure assets are considered by commercial entities in their development, design, and deployment lifecycles, thus addressing the alignment of identified deficiencies with the improvements needed to support national cyber security initiatives. The Secure Firewall (Sidewinder) appliance by Secure Computing was assessed for applicable use in critical infrastructure control system environments, such as electric power, nuclear, and other facilities containing critical systems that require augmented protection from cyber threat. The testing was performed in the Pacific Northwest National Laboratory’s (PNNL) Electric Infrastructure Operations Center (EIOC). The Secure Firewall was tested in a network configuration that emulates a typical control center network and then evaluated. A number of observations and recommendations are included in this report relating to features currently included in the Secure Firewall that support critical infrastructure security needs.

  2. Tongue-Tie Assessment and Division: A Time-Critical Intervention to Optimise Breastfeeding

    PubMed Central

    Donati-Bourne, Jack; Batool, Zainab; Hendrickse, Charles; Bowley, Douglas

    2015-01-01

    Objectives: Recent reports have highlighted the benefits of surgical division of tongue-tie (frenulotomy) in infants with breastfeeding difficulties. There is no clear consensus defining the appropriate age for this procedure to be undertaken in selected infants. We aimed to evaluate the impact of delays in time between referral and frenulotomy in relation to maternal abandonment of breastfeeding. Materials and Methods: This was a prospective cohort study conducted in the out-patient Neonatal Surgery Department, Birmingham Heartlands Hospital, Birmingham, UK, between April 2013 and July 2013. All infants referred to our tongue-tie clinic between April and July 2013 were studied prospectively. Referral time lags were calculated using computer records; details regarding breastfeeding were collected by an independent interviewer completing a questionnaire. Results: Seventy patients were included. The median infant age at clinic was 28.5 days [range 1-126]. Fifty-eight [82%] of the mothers had breastfeeding difficulty and their infants were confirmed to have a prominent tongue-tie. By the time of their clinic attendance, breastfeeding had either not been established or had been abandoned in 21%. Despite difficulty, 61% of mothers persisted with breastfeeding, and all of these mothers consented to frenulotomy. At the time of clinic, the median age of infants whose mothers had abandoned breastfeeding was 37 days [range 1-80], compared to 27 days [range 1-126] for infants whose mothers had persisted. Conclusions: We demonstrated a time-critical dimension for frenulotomy: delay beyond 4 weeks from referral to assessment of neonatal tongue-tie is more likely to be associated with abandonment of breastfeeding. Timely assessment and division of tongue-tie in selected infants can therefore play an important role in a birthing unit’s breastfeeding strategy. PMID:26023527

  3. Computational Fluid Dynamics Framework for Turbine Biological Performance Assessment

    SciTech Connect

    Richmond, Marshall C.; Serkowski, John A.; Carlson, Thomas J.; Ebner, Laurie L.; Sick, Mirjam; Cada, G. F.

    2011-05-04

    In this paper, a method for turbine biological performance assessment is introduced to bridge the gap between field and laboratory studies on fish injury and turbine design. Using this method, a suite of biological performance indicators is computed based on simulated data from a computational fluid dynamics (CFD) model of a proposed turbine design. Each performance indicator is a measure of the probability of exposure to a certain dose of an injury mechanism. If the relationship between the dose of an injury mechanism and frequency of injury (dose-response) is known from laboratory or field studies, the likelihood of fish injury for a turbine design can be computed from the performance indicator. By comparing the values of the indicators from various turbine designs, the engineer can identify the more-promising designs. Discussion here is focused on Kaplan-type turbines, although the method could be extended to other designs. Following the description of the general methodology, we will present sample risk assessment calculations based on CFD data from a model of the John Day Dam on the Columbia River in the USA.
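
    One reading of the method: each performance indicator is a probability of exposure to a dose of an injury mechanism, and folding it through a laboratory dose-response curve yields an expected injury rate for a design. A minimal sketch under that reading; the logistic dose-response parameters and the sampled doses are invented for illustration.

    ```python
    import math

    def dose_response(dose, d50=250.0, slope=0.02):
        """Illustrative logistic dose-response: injury probability vs. dose."""
        return 1.0 / (1.0 + math.exp(-slope * (dose - d50)))

    # Doses of one injury mechanism (e.g., shear) sampled along simulated
    # fish trajectories through the CFD flow field -- invented values.
    cfd_doses = [40.0, 120.0, 310.0, 95.0, 560.0, 180.0]

    # Expected injury rate for the design: mean dose-response over exposures.
    injury_rate = sum(dose_response(d) for d in cfd_doses) / len(cfd_doses)
    print(f"expected injury rate: {injury_rate:.3f}")
    ```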

  4. Computational Pollutant Environment Assessment from Propulsion-System Testing

    NASA Technical Reports Server (NTRS)

    Wang, Ten-See; McConnaughey, Paul; Chen, Yen-Sen; Warsi, Saif

    1996-01-01

    An asymptotic plume growth method based on a time-accurate three-dimensional computational fluid dynamics formulation has been developed to assess the exhaust-plume pollutant environment from a simulated RD-170 engine hot-fire test on the F1 Test Stand at Marshall Space Flight Center. Researchers have long known that rocket-engine hot firing has the potential for forming thermal nitric oxides, as well as producing carbon monoxide when hydrocarbon fuels are used. Because of the complex physics involved, most attempts to predict the pollutant emissions from ground-based engine testing have used simplified methods, which may grossly underpredict and/or overpredict the pollutant formations in a test environment. The objective of this work has been to develop a computational fluid dynamics-based methodology that replicates the underlying test-stand flow physics to accurately and efficiently assess pollutant emissions from ground-based rocket-engine testing. A nominal RD-170 engine hot-fire test was computed, and pertinent test-stand flow physics was captured. The predicted total emission rates compared reasonably well with those of the existing hydrocarbon engine hot-firing test data.

  5. Blending Qualitative and Computational Linguistics Methods for Fidelity Assessment: Experience with the Familias Unidas Preventive Intervention

    PubMed Central

    Gallo, Carlos; Pantin, Hilda; Villamar, Juan; Prado, Guillermo; Tapia, Maria; Ogihara, Mitsunori; Cruden, Gracelyn; Brown, C Hendricks

    2014-01-01

    Careful fidelity monitoring and feedback are critical to implementing effective interventions. A wide range of procedures exist to assess fidelity; most are derived from observational assessments (Schoenwald et al, 2013). However, these fidelity measures are resource intensive for research teams in efficacy/effectiveness trials, and are often unattainable or unmanageable for the host organization to rate when the program is implemented on a large scale. We present a first step towards automated processing of linguistic patterns in fidelity monitoring of a behavioral intervention using an innovative mixed methods approach to fidelity assessment that uses rule-based, computational linguistics to overcome major resource burdens. Data come from an effectiveness trial of the Familias Unidas intervention, an evidence-based, family-centered preventive intervention found to be efficacious in reducing conduct problems, substance use and HIV sexual risk behaviors among Hispanic youth. This computational approach focuses on “joining,” which measures the quality of the working alliance of the facilitator with the family. Quantitative assessments of reliability are provided. Kappa scores between a human rater and a machine rater for the new method for measuring joining reached .83. Early findings suggest that this approach can reduce the high cost of fidelity measurement and the time delay between fidelity assessment and feedback to facilitators; it also has the potential for improving the quality of intervention fidelity ratings. PMID:24500022
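
    The agreement statistic reported above (kappa = .83 between a human and a machine rater) is Cohen's kappa. A minimal sketch of the computation on toy binary "joining" ratings; the data are invented, not the trial's.

    ```python
    from collections import Counter

    def cohens_kappa(rater_a, rater_b):
        """Cohen's kappa: chance-corrected agreement between two raters."""
        n = len(rater_a)
        observed = sum(a == b for a, b in zip(rater_a, rater_b)) / n
        ca, cb = Counter(rater_a), Counter(rater_b)
        expected = sum(ca[lbl] * cb[lbl] for lbl in set(ca) | set(cb)) / n**2
        return (observed - expected) / (1 - expected)

    human = [1, 1, 0, 1, 0, 1, 1, 0, 1, 1]
    machine = [1, 1, 0, 1, 1, 1, 1, 0, 1, 0]
    print(cohens_kappa(human, machine))   # toy data, not the study's ratings
    ```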

  6. Can Dental Cone Beam Computed Tomography Assess Bone Mineral Density?

    PubMed Central

    2014-01-01

    Mineral density distribution of bone tissue is altered by active bone modeling and remodeling due to bone complications, including bone disease and implantation surgery. Clinical cone beam computed tomography (CBCT) has been examined to determine whether it can assess oral bone mineral density (BMD) in patients. It has been indicated that CBCT has the disadvantages of higher noise and lower contrast than conventional medical computed tomography (CT) systems. On the other hand, it has the advantages of relatively lower cost and radiation dose but higher spatial resolution. However, the reliability of CBCT-based mineral density measurement has not yet been fully validated. Thus, the objectives of this review are to discuss 1) why assessment of BMD distribution is important and 2) whether clinical CBCT can be used as a potential tool to measure BMD. Brief descriptions of image artefacts associated with assessment of the gray value, which has been used to account for mineral density, in CBCT images are provided. Techniques to correct local and conversion errors in obtaining the gray values in CBCT images are also introduced. This review can be used as a quick reference for users who may encounter these errors during analysis of CBCT images. PMID:25006568
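
    The gray-value correction techniques mentioned above often amount to calibrating measured gray values against a phantom with inserts of known density. A minimal sketch assuming a linear gray-value-to-BMD relationship; the phantom densities and measured values are invented.

    ```python
    import numpy as np

    # Calibration phantom: known BMD inserts (mg/cm^3) and the CBCT gray
    # values measured for each insert -- illustrative numbers only.
    bmd_known = np.array([0.0, 100.0, 200.0, 400.0, 800.0])
    gray_value = np.array([-15.0, 150.0, 330.0, 660.0, 1340.0])

    slope, intercept = np.polyfit(gray_value, bmd_known, deg=1)

    def gray_to_bmd(gv):
        """Map a measured CBCT gray value to BMD via the linear calibration."""
        return slope * gv + intercept

    print(gray_to_bmd(500.0))   # estimated BMD for a voxel of interest
    ```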

  7. The development and testing of a qualitative instrument designed to assess critical thinking

    NASA Astrophysics Data System (ADS)

    Clauson, Cynthia Louisa

    This study examined a qualitative approach to assess critical thinking. An instrument was developed that incorporates an assessment process based on Dewey's (1933) concepts of self-reflection and critical thinking as problem solving. The study was designed to pilot test the critical thinking assessment process with writing samples collected from a heterogeneous group of students. The pilot test included two phases. Phase 1 was designed to determine the validity and inter-rater reliability of the instrument using two experts in critical thinking, problem solving, and literacy development. Validity of the instrument was addressed by requesting both experts to respond to ten questions in an interview. The inter-rater reliability was assessed by analyzing the consistency of the two experts' scorings of the 20 writing samples to each other, as well as to my scoring of the same 20 writing samples. Statistical analyses included the Spearman Rho and the Kuder-Richardson (Formula 20). Phase 2 was designed to determine the validity and reliability of the critical thinking assessment process with seven science teachers. Validity was addressed by requesting the teachers to respond to ten questions in a survey and interview. Inter-rater reliability was addressed by comparing the seven teachers' scoring of five writing samples with my scoring of the same five writing samples. Again, the Spearman Rho and the Kuder-Richardson (Formula 20) were used to determine the inter-rater reliability. The validity results suggest that the instrument is helpful as a guide for instruction and provides a systematic method to teach and assess critical thinking while problem solving with students in the classroom. The reliability results show the critical thinking assessment instrument to possess fairly high reliability when used by the experts, but weak reliability when used by classroom teachers. A major conclusion was drawn that teachers, as well as students, would need to receive instruction

  8. Computed radionuclide urogram for assessing acute renal failure

    SciTech Connect

    Schlegel, J.U.; Lang, E.K.

    1980-05-01

    The computed radionuclide urogram is advocated as a noninvasive diagnostic method for differentiation of the most common prerenal, renal, and postrenal causes of acute renal failure. On the basis of characteristic changes in the effective renal plasma flow rate, the calculated filtration fraction, and the calculated glomerular filtration rate, prerenal conditions such as renal artery stenosis or thrombosis, renal conditions such as acute rejection or acute tubular necrosis, and postrenal conditions such as obstruction or leakage, which are the most common causes of acute renal failure, can be differentiated. In conjunction with morphologic criteria derived from sonograms, a diagnosis with acceptable confidence can be rendered in most instances. Both the computed radionuclide urogram and sonogram are noninvasive and can be used without adverse effects in the presence of azotemia and even anuria. This also makes feasible reexamination at intervals to assess effect of therapy and offer prognostic information.
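
    For reference, the calculated filtration fraction mentioned above is the ratio of the glomerular filtration rate to the effective renal plasma flow:

    ```latex
    \mathrm{FF} = \frac{\mathrm{GFR}}{\mathrm{ERPF}}
    ```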

  9. Computational techniques for the assessment of fracture repair.

    PubMed

    Anderson, Donald D; Thomas, Thaddeus P; Campos Marin, Ana; Elkins, Jacob M; Lack, William D; Lacroix, Damien

    2014-06-01

    The combination of high-resolution three-dimensional medical imaging, increased computing power, and modern computational methods provide unprecedented capabilities for assessing the repair and healing of fractured bone. Fracture healing is a natural process that restores the mechanical integrity of bone and is greatly influenced by the prevailing mechanical environment. Mechanobiological theories have been proposed to provide greater insight into the relationships between mechanics (stress and strain) and biology. Computational approaches for modelling these relationships have evolved from simple tools to analyze fracture healing at a single point in time to current models that capture complex biological events such as angiogenesis, stochasticity in cellular activities, and cell-phenotype specific activities. The predictive capacity of these models has been established using corroborating physical experiments. For clinical application, mechanobiological models accounting for patient-to-patient variability hold the potential to predict fracture healing and thereby help clinicians to customize treatment. Advanced imaging tools permit patient-specific geometries to be used in such models. Refining the models to study the strain fields within a fracture gap and adapting the models for case-specific simulation may provide more accurate examination of the relationship between strain and fracture healing in actual patients. Medical imaging systems have significantly advanced the capability for less invasive visualization of injured musculoskeletal tissues, but all too often the consideration of these rich datasets has stopped at the level of subjective observation. Computational image analysis methods have not yet been applied to study fracture healing, but two comparable challenges which have been addressed in this general area are the evaluation of fracture severity and of fracture-associated soft tissue injury. CT-based methodologies developed to assess and quantify

  10. Systematic risk assessment methodology for critical infrastructure elements - Oil and Gas subsectors

    NASA Astrophysics Data System (ADS)

    Gheorghiu, A.-D.; Ozunu, A.

    2012-04-01

    The concern for the protection of critical infrastructure has been rapidly growing in the last few years in Europe. The level of knowledge and preparedness in this field is beginning to develop in a lawfully organized manner, for the identification and designation of critical infrastructure elements of national and European interest. Oil and gas production, refining, treatment, storage and transmission by pipelines facilities, are considered European critical infrastructure sectors, as per Annex I of the Council Directive 2008/114/EC of 8 December 2008 on the identification and designation of European critical infrastructures and the assessment of the need to improve their protection. Besides identifying European and national critical infrastructure elements, member states also need to perform a risk analysis for these infrastructure items, as stated in Annex II of the above mentioned Directive. In the field of risk assessment, there are a series of acknowledged and successfully used methods in the world, but not all hazard identification and assessment methods and techniques are suitable for a given site, situation, or type of hazard. As Theoharidou, M. et al. noted (Theoharidou, M., P. Kotzanikolaou, and D. Gritzalis 2009. Risk-Based Criticality Analysis. In Critical Infrastructure Protection III. Proceedings. Third Annual IFIP WG 11.10 International Conference on Critical Infrastructure Protection. Hanover, New Hampshire, USA, March 23-25, 2009: revised selected papers, edited by C. Palmer and S. Shenoi, 35-49. Berlin: Springer.), despite the wealth of knowledge already created, there is a need for simple, feasible, and standardized criticality analyses. The proposed systematic risk assessment methodology includes three basic steps: the first step (preliminary analysis) includes the identification of hazards (including possible natural hazards) for each installation/section within a given site, followed by a criterial analysis and then a detailed analysis step

  11. The Development of a Critical Care Resident Research Curriculum: A Needs Assessment

    PubMed Central

    Jain, Sangeeta; Hutchison, James; Canadian Critical Care Trials Group

    2016-01-01

    Background. Conducting research is expected as part of many clinicians' professional profiles, yet many do not have advanced research degrees. Research training during residency varies amongst institutions, and the research education needs of trainees are not well understood. Objective. To understand the needs of critical care trainees regarding research education. Methods. Canadian critical care trainees, new critical care faculty, program directors, and research coordinators were surveyed regarding research training, research expectations, and support within their programs. Results. Critical care trainees and junior faculty members highlighted many gaps in research knowledge and skills. In contrast, critical care program directors felt that trainees were prepared to undertake research careers. Major differences in opinion amongst program directors and other respondent groups exist regarding preparation for designing a study, navigating research ethics board applications, and managing a research budget. Conclusion. We demonstrated that Canadian critical care trainees and junior faculty reported gaps in knowledge in all areas of research. There was disagreement amongst trainees, junior faculty, research coordinators, and program directors regarding learning needs. Results from this needs assessment will be used to help redesign the education program of the Canadian Critical Care Trials Group to complement local research training offered for critical care trainees. PMID:27610029

  12. The Development of a Critical Care Resident Research Curriculum: A Needs Assessment.

    PubMed

    Jain, Sangeeta; Menon, Kusum; Piquette, Dominique; Gottesman, Ronald; Hutchison, James; Gilfoyle, Elaine; Canadian Critical Care Trials Group

    2016-01-01

    Background. Conducting research is expected as part of many clinicians' professional profiles, yet many do not have advanced research degrees. Research training during residency varies amongst institutions, and the research education needs of trainees are not well understood. Objective. To understand the needs of critical care trainees regarding research education. Methods. Canadian critical care trainees, new critical care faculty, program directors, and research coordinators were surveyed regarding research training, research expectations, and support within their programs. Results. Critical care trainees and junior faculty members highlighted many gaps in research knowledge and skills. In contrast, critical care program directors felt that trainees were prepared to undertake research careers. Major differences in opinion amongst program directors and other respondent groups exist regarding preparation for designing a study, navigating research ethics board applications, and managing a research budget. Conclusion. We demonstrated that Canadian critical care trainees and junior faculty reported gaps in knowledge in all areas of research. There was disagreement amongst trainees, junior faculty, research coordinators, and program directors regarding learning needs. Results from this needs assessment will be used to help redesign the education program of the Canadian Critical Care Trials Group to complement local research training offered for critical care trainees. PMID:27610029

  13. Computer database takes confusion out of multi-property assessments

    SciTech Connect

    Kinworthy, M.L.

    1996-03-01

    Managing environmental site assessments in multi-property transactions poses a special challenge. Multi-site ESAs require a tremendous amount of coordination, data collection and interpretation; often, these tasks must be completed according to accelerated timeframes to meet client deadlines. The tasks can be particularly challenging when several hundred sites are included in the transaction. In such cases, a computer database can be an effective, powerful tool for tracking and managing property data, and generating customized reports for large, multi-site ESAs.

  14. Assessment of a human computer interface prototyping environment

    NASA Technical Reports Server (NTRS)

    Moore, Loretta A.

    1993-01-01

    A Human Computer Interface (HCI) prototyping environment with embedded evaluation capability has been successfully assessed which will be valuable in developing and refining HCI standards and evaluating program/project interface development, especially Space Station Freedom on-board displays for payload operations. The HCI prototyping environment is designed to include four components: (1) a HCI format development tool, (2) a test and evaluation simulator development tool, (3) a dynamic, interactive interface between the HCI prototype and simulator, and (4) an embedded evaluation capability to evaluate the adequacy of an HCI based on a user's performance.

  15. RESRAD-CHEM: A computer code for chemical risk assessment

    SciTech Connect

    Cheng, J.J.; Yu, C.; Hartmann, H.M.; Jones, L.G.; Biwer, B.M.; Dovel, E.S.

    1993-10-01

    RESRAD-CHEM is a computer code developed at Argonne National Laboratory for the U.S. Department of Energy to evaluate chemically contaminated sites. The code is designed to predict human health risks from multipathway exposure to hazardous chemicals and to derive cleanup criteria for chemically contaminated soils. The method used in RESRAD-CHEM is based on the pathway analysis method in the RESRAD code and follows the U.S. Environmental Protection Agency's (EPA's) guidance on chemical risk assessment. RESRAD-CHEM can be used to evaluate a chemically contaminated site and, in conjunction with the use of the RESRAD code, a mixed waste site.

  16. Pain assessment and management in the critically ill: wizardry or science?

    PubMed

    Puntillo, Kathleen

    2003-07-01

    Assessment and management of patients' pain across practice settings have recently received the increased attention of providers, patients, patients' families, and regulatory agencies. Scientific advances in understanding pain mechanisms, multidimensional methods of pain assessment, and analgesic pharmacology have aided in the improvement of pain management practices. However, pain assessment and management for critical care patients, especially those with communication barriers, continue to present challenges to clinicians and researchers. The state of nursing science of pain in critically ill patients, including development and testing of pain assessment methods and clinical trials of pharmacological interventions, is described. Special emphasis is placed on results from the Thunder Project II, a major multisite investigation of procedural pain. PMID:12882060

  17. Assessment of liver ablation using cone beam computed tomography

    PubMed Central

    Abdel-Rehim, Mohamed; Ronot, Maxime; Sibert, Annie; Vilgrain, Valérie

    2015-01-01

    AIM: To investigate the feasibility and accuracy of cone beam computed tomography (CBCT) in assessing the ablation zone after liver tumor ablation. METHODS: Twenty-three patients (17 men and 6 women, range: 45-85 years old, mean age 65 years) with malignant liver tumors underwent ultrasound-guided percutaneous tumor ablation [radiofrequency (n = 14), microwave (n = 9)] followed by intravenous contrast-enhanced CBCT. Baseline multidetector computed tomography (MDCT) and peri-procedural CBCT images were compared. CBCT image quality was assessed as poor, good, or excellent. Image fusion was performed to assess tumor coverage, and quality of fusion was rated as bad, good, or excellent. Ablation zone volumes on peri-procedural CBCT and post-procedural MDCT were compared using the non-parametric Wilcoxon signed-rank test for paired data. RESULTS: The rate of primary ablation effectiveness was 100%. There were no complications related to ablation. Local tumor recurrence and new liver tumors were found 3 mo after initial treatment in one patient (4%). The ablation zone was identified in 21/23 (91.3%) patients on CBCT. The fusion of baseline MDCT and peri-procedural CBCT images was feasible in all patients and showed satisfactory tumor coverage (at least a 5-mm margin). CBCT image quality was poor, good, and excellent in 2 (9%), 8 (35%), and 13 (56%) patients, respectively. Registration quality between peri-procedural CBCT and post-procedural MDCT images was good to excellent in 17/23 (74%) patients. The median ablation volume on peri-procedural CBCT and post-procedural MDCT was 30 cm³ (range: 4-95 cm³) and 30 cm³ (range: 4-124 cm³), respectively (P > 0.2). There was a good correlation (r = 0.79) between the volumes of the two techniques. CONCLUSION: Contrast-enhanced CBCT after tumor ablation of the liver allows early assessment of the ablation zone. PMID:25593467
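
    A minimal sketch of the two statistics reported above (the paired non-parametric volume comparison and the CBCT-MDCT correlation) using SciPy; the per-patient volumes are invented placeholders, not the study data.

    ```python
    from scipy import stats

    # Ablation-zone volumes (cm^3) per patient -- illustrative values only.
    cbct_vol = [30.0, 12.0, 55.0, 8.0, 41.0, 27.0, 63.0]
    mdct_vol = [32.0, 11.0, 60.0, 9.0, 38.0, 30.0, 70.0]

    w_stat, p_value = stats.wilcoxon(cbct_vol, mdct_vol)   # paired, non-parametric
    r, _ = stats.pearsonr(cbct_vol, mdct_vol)              # correlation, cf. r = 0.79
    print(p_value, r)
    ```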

  18. Assessment of spare reliability for multi-state computer networks within tolerable packet unreliability

    NASA Astrophysics Data System (ADS)

    Lin, Yi-Kuei; Huang, Cheng-Fu

    2015-04-01

    From a quality of service viewpoint, the transmission packet unreliability and transmission time are both critical performance indicators in a computer system when assessing the Internet quality for supervisors and customers. A computer system is usually modelled as a network topology where each branch denotes a transmission medium and each vertex represents a station of servers. Almost every branch has multiple capacities/states due to failure, partial failure, maintenance, etc. This type of network is known as a multi-state computer network (MSCN). This paper proposes an efficient algorithm that computes the system reliability, i.e., the probability that a specified amount of data can be sent through k (k ≥ 2) disjoint minimal paths within both the tolerable packet unreliability and time threshold. Furthermore, two routing schemes are established in advance to indicate the main and spare minimal paths to increase the system reliability (referred to as spare reliability). Thus, the spare reliability can be readily computed according to the routing scheme.
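
    The routing idea above, a main minimal path backed by a spare, reduces to one line of probability if the two disjoint paths are treated as independent: the connection succeeds if the main path meets the demand or, failing that, if the spare does. A minimal sketch under that independence assumption (the paper itself works with full multi-state capacity distributions):

    ```python
    def spare_reliability(p_main, p_spare):
        """P(demand met): the spare path takes over on main-path failure.

        Assumes independence of the two disjoint minimal paths, where each
        probability is P(path capacity meets demand within the tolerable
        packet unreliability and time threshold).
        """
        return p_main + (1.0 - p_main) * p_spare

    print(spare_reliability(0.92, 0.85))   # 0.988
    ```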

  19. Assessing executive function using a computer game: computational modeling of cognitive processes.

    PubMed

    Hagler, Stuart; Jimison, Holly Brugge; Pavel, Misha

    2014-07-01

    Early and reliable detection of cognitive decline is one of the most important challenges of current healthcare. In this project, we developed an approach whereby a frequently played computer game can be used to assess a variety of cognitive processes and estimate the results of the pen-and-paper trail making test (TMT)--known to measure executive function, as well as visual pattern recognition, speed of processing, working memory, and set-switching ability. We developed a computational model of the TMT based on a decomposition of the test into several independent processes, each characterized by a set of parameters that can be estimated from play of a computer game designed to resemble the TMT. An empirical evaluation of the model suggests that it is possible to use the game data to estimate the parameters of the underlying cognitive processes and using the values of the parameters to estimate the TMT performance. Cognitive measures and trends in these measures can be used to identify individuals for further assessment, to provide a mechanism for improving the early detection of neurological problems, and to provide feedback and monitoring for cognitive interventions in the home. PMID:25014944
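
    A minimal sketch of the modeling pattern described above: estimate per-player cognitive parameters from game play, then map them to a predicted TMT completion time with a simple linear model. The feature names, data, and use of ordinary least squares are illustrative assumptions; the paper's actual decomposition into independent processes is richer.

    ```python
    import numpy as np
    from sklearn.linear_model import LinearRegression

    # Per-player parameters estimated from game play: visual search speed,
    # set-switching cost, motor time -- illustrative values only.
    X = np.array([[1.2, 0.8, 0.30],
                  [0.9, 1.4, 0.45],
                  [1.5, 0.6, 0.25],
                  [1.0, 1.1, 0.40],
                  [0.8, 1.6, 0.50]])
    tmt_seconds = np.array([31.0, 55.0, 24.0, 42.0, 63.0])   # observed TMT times

    model = LinearRegression().fit(X, tmt_seconds)
    new_player = np.array([[1.1, 0.9, 0.35]])
    print(model.predict(new_player))   # estimated TMT performance
    ```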

  20. Connecting Assessment and Instruction to Help Students Become More Critical Producers of Multimedia

    ERIC Educational Resources Information Center

    Ostenson, Jonathan William

    2012-01-01

    Classroom teachers have been encouraged to incorporate more multimedia production in the classroom as a means of helping students develop critical media literacy skills. However, they have not always been well trained in how to evaluate the work students create; many teachers struggle to know which criteria to use in assessing student work. This…

  1. Assessing Change in Student Critical Thinking for Introduction to Sociology Classes

    ERIC Educational Resources Information Center

    Rickles, Michael L.; Schneider, Rachel Zimmer; Slusser, Suzanne R.; Williams, Dana M.; Zipp, John F.

    2013-01-01

    Although there is widespread agreement among academics that critical thinking is an important component to the college classroom, there is little empirical evidence to verify that it is being taught in courses. Using four sections of introductory sociology, we developed an experimental design using pretests and posttests to assess students'…

  2. Development of Critical Thinking Self-Assessment System Using Wearable Device

    ERIC Educational Resources Information Center

    Gotoh, Yasushi

    2015-01-01

    In this research the author defines critical thinking as skills and dispositions which enable one to solve problems logically and to attempt to reflect autonomously by means of meta-cognitive activities on one's own problem-solving processes. The author focuses on providing meta-cognitive knowledge to help with self-assessment. To develop…

  3. A Study on Critical Thinking Assessment System of College English Writing

    ERIC Educational Resources Information Center

    Dong, Tian; Yue, Lu

    2015-01-01

    This research attempts to discuss the validity of introducing the evaluation of students' critical thinking skills (CTS) into the assessment system of college English writing through an empirical study. In this paper, 30 College English Test Band 4 (CET-4) writing samples were collected and analyzed. Students' CTS and the final scores of collected…

  4. Developing Institutional Standards for Critical Thinking Using the Collegiate Learning Assessment. Research Brief

    ERIC Educational Resources Information Center

    Hardison, Chaitra M.; Vilamovska, Anna-Marie

    2009-01-01

    The Collegiate Learning Assessment (CLA) measures students' critical thinking skills, but some institutions remain uncertain how to interpret the results. RAND researchers designed a method that institutions can use to develop their own standards. It consists of a three-step process and a system of checks to validate the results. This method will…

  5. Critical Thinking and Formative Assessments: Increasing the Rigor in Your Classroom

    ERIC Educational Resources Information Center

    Moore, Betsy; Stanley, Todd

    2010-01-01

    Develop your students' critical thinking skills and prepare them to perform competitively in the classroom, on state tests, and beyond. In this book, Moore and Stanley show you how to effectively instruct your students to think on higher levels, and how to assess their progress. As states move toward common achievement standards, teachers have…

  6. Development and Evaluation of the Diagnostic Power for a Computer-Based Two-Tier Assessment

    ERIC Educational Resources Information Center

    Lin, Jing-Wen

    2016-01-01

    This study adopted a quasi-experimental design with follow-up interview to develop a computer-based two-tier assessment (CBA) regarding the science topic of electric circuits and to evaluate the diagnostic power of the assessment. Three assessment formats (i.e., paper-and-pencil, static computer-based, and dynamic computer-based tests) using…

  7. Primary School Students' Attitudes towards Computer Based Testing and Assessment in Turkey

    ERIC Educational Resources Information Center

    Yurdabakan, Irfan; Uzunkavak, Cicek

    2012-01-01

    This study investigated the attitudes of primary school students towards computer based testing and assessment in terms of different variables. The sample for this research is primary school students attending a computer based testing and assessment application via CITO-OIS. The "Scale on Attitudes towards Computer Based Testing and Assessment" to…

  8. An Assessment of Thermodynamic Models for HFC Refrigerant Mixtures Through the Critical-Point Calculation

    NASA Astrophysics Data System (ADS)

    Akasaka, Ryo

    2008-08-01

    An assessment of thermodynamic models for HFC refrigerant mixtures based on Helmholtz energy equations of state was made through critical-point calculations for ternary and quaternary mixtures. The calculations were performed using critical-point criteria expressed in terms of the Helmholtz free energy. For three ternary mixtures: difluoromethane (R-32) + pentafluoroethane (R-125) + 1,1,1,2-tetrafluoroethane (R-134a), R-125 + R-134a + 1,1,1-trifluoroethane (R-143a), and carbon dioxide (CO2) + R-32 + R-134a, and one quaternary mixture, R-32 + R-125 + R-134a + R-143a, calculated critical points were compared with experimental values, and the capability of the mixture models for representing the critical behavior was discussed.
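
    For context, critical-point criteria expressed through the Helmholtz energy are commonly written in the Heidemann-Khalil form; the sketch below gives that standard statement (not necessarily the paper's exact formulation), where the vector of mole-number perturbations Δn is the eigenvector of Q associated with its zero eigenvalue.

    ```latex
    Q_{ij} = \left(\frac{\partial^{2} A}{\partial n_{i}\,\partial n_{j}}\right)_{T,V},
    \qquad
    \det Q = 0,
    \qquad
    \sum_{i,j,k}
    \left(\frac{\partial^{3} A}{\partial n_{i}\,\partial n_{j}\,\partial n_{k}}\right)_{T,V}
    \Delta n_{i}\,\Delta n_{j}\,\Delta n_{k} = 0
    ```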

  9. Sedimentation equilibria in polydisperse ferrofluids: critical comparisons between experiment, theory, and computer simulation.

    PubMed

    Elfimova, Ekaterina A; Ivanov, Alexey O; Lakhtina, Ekaterina V; Pshenichnikov, Alexander F; Camp, Philip J

    2016-05-14

    The sedimentation equilibrium of dipolar particles in a ferrofluid is studied using experiment, theory, and computer simulation. A theory of the particle-concentration profile in a dipolar hard-sphere fluid is developed, based on the local-density approximation and accurate expressions from a recently introduced logarithmic free energy approach. The theory is tested critically against Monte Carlo simulation results for monodisperse and bidisperse dipolar hard-sphere fluids in homogeneous gravitational fields. In the monodisperse case, the theory is very accurate over broad ranges of gravitational field strength, volume fraction, and dipolar coupling constant. In the bidisperse case, with realistic dipolar coupling constants and compositions, the theory is excellent at low volume fraction, but is slightly inaccurate at high volume fraction in that it does not capture a maximum in the small-particle concentration profile seen in simulations. Possible reasons for this are put forward. Experimental measurements of the magnetic-susceptibility profile in a real ferrofluid are then analysed using the theory. The concentration profile is linked to the susceptibility profile using the second-order modified mean-field theory. It is shown that the experimental results are not consistent with the sample being monodisperse. By introducing polydispersity in the simplest possible way, namely by assuming the system is a binary mixture, almost perfect agreement between theory and experiment is achieved. PMID:27042815
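
    As orientation for the theory above: in the dilute limit the local-density approximation reduces to the barometric law for the concentration profile. The sketch below writes m* for the particle's buoyant mass; the full theory adds dipolar interaction contributions to the local chemical potential.

    ```latex
    \mu\big(\rho(z)\big) + m^{*} g z = \text{const},
    \qquad
    \rho(z) \approx \rho_{0}\, e^{-m^{*} g z / k_{B} T} \quad (\rho \to 0)
    ```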

  10. Crosswords to computers: a critical review of popular approaches to cognitive enhancement.

    PubMed

    Jak, Amy J; Seelye, Adriana M; Jurick, Sarah M

    2013-03-01

    Cognitive enhancement strategies have gained recent popularity and have the potential to benefit clinical and non-clinical populations. As technology advances and the number of cognitively healthy adults seeking methods of improving or preserving cognitive functioning grows, the role of electronic (e.g., computer and video game based) cognitive training becomes more relevant and warrants greater scientific scrutiny. This paper serves as a critical review of empirical evaluations of publicly available electronic cognitive training programs. Many studies have found that electronic training approaches result in significant improvements in trained cognitive tasks. Fewer studies have demonstrated improvements in untrained tasks within the trained cognitive domain, non-trained cognitive domains, or on measures of everyday function. Successful cognitive training programs will elicit effects that generalize to untrained, practical tasks for extended periods of time. Unfortunately, many studies of electronic cognitive training programs are hindered by methodological limitations such as the lack of an adequate control group, long-term follow-up, and ecologically valid outcome measures. Despite these limitations, evidence suggests that computerized cognitive training has the potential to positively impact one's sense of social connectivity and self-efficacy. PMID:23423553

  11. An assessment of criticality safety at the Department of Energy Rocky Flats Plant, Golden, Colorado, July--September 1989

    SciTech Connect

    Mattson, Roger J.

    1989-09-01

    This is a report on the 1989 independent Criticality Safety Assessment of the Rocky Flats Plant, primarily in response to public concerns that nuclear criticality accidents involving plutonium may have occurred at this nuclear weapon component fabrication and processing plant. The report evaluates environmental issues, fissile material storage practices, ventilation system problem areas, and criticality safety practices. While no evidence of a criticality accident was found, several recommendations are made for criticality safety improvements. 9 tabs.

  12. Pain assessment in the critically ill adult: Recent evidence and new trends.

    PubMed

    Gélinas, Céline

    2016-06-01

    Pain assessment in the critically ill adult remains a daily clinical challenge. Position statements and practice guidelines exist to guide the ICU care team in the pain assessment process. The patient's self-report of pain remains the gold standard measure for pain and should be obtained as often as possible. When self-report is impossible to obtain, observational pain scales including the Behavioural Pain Scale (BPS) and the Critical-Care Pain Observation Tool (CPOT) have been recommended for clinical use in the critically ill adult. However, their adaptation and validation in brain-injured and burn ICU patients is required. Family caregivers may help in the identification of pain-related behaviours and should be more involved in the ICU pain assessment process. Fluctuations in vital signs should only be considered as cues for further assessment of pain with appropriate tools, and may better represent adverse events of severe pain. Other physiologic measures of pain should be explored in the ICU, and pupillometry appears as a promising technique to further study. Implementation of systematic pain assessment approaches using tools adapted to the patient's ability to communicate and condition has shown positive effects on ICU pain practices and patient outcomes, but randomised control trials are needed to confirm these conclusions. PMID:27067745

  13. Construct validity of the Chelsea critical care physical assessment tool: an observational study of recovery from critical illness

    PubMed Central

    2014-01-01

    Introduction: Intensive care unit-acquired weakness (ICU-AW) is common in survivors of critical illness, resulting in global weakness and functional deficit. Although ICU-AW is well described subjectively in the literature, the value of objective measures has yet to be established. This project aimed to evaluate the construct validity of the Chelsea Critical Care Physical Assessment tool (CPAx) by analyzing the association between CPAx scores and hospital-discharge location, as a measure of functional outcome. Methods: The CPAx was integrated into practice as a service-improvement initiative in an 11-bed intensive care unit (ICU). For patients admitted for more than 48 hours between 10 May 2010 and 13 November 2013, the last CPAx score within 24 hours of step-down from the ICU or death was recorded (n = 499). At hospital discharge, patients were separated into seven categories based on continued rehabilitation and care needs. Descriptive statistics were used to explore the association between ICU-discharge CPAx score and hospital-discharge location. Results: Of the 499 patients, 171 (34.3%) returned home with no ongoing rehabilitation or care input; 131 (26.2%) required community support; 28 (5.6%) went to inpatient rehabilitation for <6 weeks; 25 (5.0%) went to inpatient rehabilitation for >6 weeks; 27 (5.4%) required nursing-home level of care; 80 (16.0%) died in the ICU; and 37 (7.4%) died in hospital. A significant difference was found in the median CPAx score between groups (P < 0.0001). Four patients (0.8%) scored full marks (50) on the CPAx, all of whom went home with no ongoing needs; 16 patients (3.2%) scored 0 on the CPAx, all of whom died within 24 hours. The CPAx thus showed a 0.8% ceiling effect and a 3.2% floor effect in the ICU. Compliance with completion of the CPAx stabilized at 78% of all ICU admissions. Conclusion: The CPAx score at ICU discharge displayed construct validity by crudely discriminating between groups with different hospital-discharge locations.
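
    The group comparison reported above (a significant difference in median CPAx scores across discharge destinations) can be illustrated with a short sketch. The per-group scores below are invented, and the choice of a Kruskal-Wallis test is an assumption; the abstract itself reports only descriptive statistics.

        # Sketch: compare ICU-discharge CPAx scores (0-50 scale) across
        # hospital-discharge groups. All numbers here are illustrative.
        from scipy import stats

        cpax_by_group = {
            "home_no_support": [48, 50, 45, 44],
            "community_support": [38, 35, 40, 33],
            "inpatient_rehab": [22, 25, 19, 28],
            "nursing_home": [12, 15, 9, 14],
        }

        # Kruskal-Wallis: non-parametric test for a difference in score
        # distributions across independent groups.
        h_stat, p_value = stats.kruskal(*cpax_by_group.values())
        print(f"H = {h_stat:.2f}, p = {p_value:.4f}")

        # Floor/ceiling effects: fraction of patients at the scale bounds.
        scores = [s for group in cpax_by_group.values() for s in group]
        print(f"floor = {sum(s == 0 for s in scores) / len(scores):.1%}, "
              f"ceiling = {sum(s == 50 for s in scores) / len(scores):.1%}")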

  14. Alcohol Withdrawal Syndrome in Critically Ill Patients: Identification, Assessment, and Management.

    PubMed

    Sutton, Lynsey J; Jutel, Annemarie

    2016-02-01

    Management of alcohol withdrawal in critically ill patients is a challenge. The alcohol consumption histories of intensive care patients are often incomplete, limiting identification of patients with alcohol use disorders. Abrupt cessation of alcohol places these patients at risk for alcohol withdrawal syndrome. Typically, benzodiazepines are used as first-line therapy to manage alcohol withdrawal. However, if patients progress to more severe withdrawal or delirium tremens, adjunctive medications in addition to benzodiazepines may be required. Sedation and mechanical ventilation may also be necessary. Withdrawal assessment scales such as the Clinical Institute Withdrawal Assessment are of limited use in these patients. Instead, general sedation-agitation scales and delirium detection tools have been used. The important facets of care are the rapid identification of at-risk patients through histories of alcohol consumption, management with combination therapies, and ongoing diligent assessment and evaluation. (Critical Care Nurse. 2016;36[1]:28-39). PMID:26830178

  15. Quantitative Computed Tomography and Image Analysis for Advanced Muscle Assessment

    PubMed Central

    Edmunds, Kyle Joseph; Gíslason, Magnus K.; Arnadottir, Iris D.; Marcante, Andrea; Piccione, Francesco; Gargiulo, Paolo

    2016-01-01

    Medical imaging is of particular interest in the field of translational myology, as the extant literature describes the use of a wide variety of techniques to non-invasively recapitulate and quantify various internal and external tissue morphologies. In the clinical context, medical imaging remains a vital tool for diagnostics and investigative assessment. This review outlines the results from several investigations on the use of computed tomography (CT) and image analysis techniques to assess muscle conditions and degenerative processes due to aging or pathological conditions. Herein, we detail the acquisition of spiral CT images and the use of advanced image analysis tools to characterize muscles in 2D and 3D. Results from these studies recapitulate changes in tissue composition within muscles, as visualized by the association of tissue types to specified Hounsfield Unit (HU) values for fat, loose connective tissue or atrophic muscle, and normal muscle, including fascia and tendon. We show how results from these analyses can be presented as both average HU values and compositions with respect to total muscle volumes, demonstrating the reliability of these tools to monitor, assess and characterize muscle degeneration. PMID:27478562
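
    The HU-based composition analysis described in this review can be sketched in a few lines: classify each voxel of the segmented muscle by Hounsfield Unit range and report volume fractions. The cut-off values and the synthetic volume below are illustrative assumptions, not figures from the reviewed studies.

        import numpy as np

        # Illustrative HU ranges (assumed for this sketch): fat, loose
        # connective tissue or atrophic muscle, and normal muscle.
        HU_RANGES = {
            "fat": (-200, -10),
            "connective_or_atrophic": (-9, 40),
            "normal_muscle": (41, 200),
        }

        def composition(muscle_hu):
            """Tissue-volume fractions for an array of HU values already
            restricted to the segmented muscle region."""
            out = {t: float(((muscle_hu >= lo) & (muscle_hu <= hi)).mean())
                   for t, (lo, hi) in HU_RANGES.items()}
            out["mean_hu"] = float(muscle_hu.mean())
            return out

        # Synthetic stand-in for a spiral-CT muscle segmentation.
        volume = np.random.default_rng(0).normal(30, 60, size=(64, 64, 64))
        print(composition(volume))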

  16. Prediction of critical illness in elderly outpatients using elder risk assessment: a population-based study

    PubMed Central

    Biehl, Michelle; Takahashi, Paul Y; Cha, Stephen S; Chaudhry, Rajeev; Gajic, Ognjen; Thorsteinsdottir, Bjorg

    2016-01-01

    Rationale: Identifying patients at high risk of critical illness is necessary for the development and testing of strategies to prevent critical illness. The aim of this study was to determine the relationship between a high elder risk assessment (ERA) score and critical illness requiring intensive care, and to see whether the ERA can be used as a prediction tool to identify elderly patients at the primary care visit who are at high risk of critical illness. Methods: A population-based historical cohort study was conducted in elderly patients (age >65 years) identified at the time of primary care visit in Rochester, MN, USA. Predictors including age, previous hospital days, and comorbid health conditions were identified from routine administrative data available in the electronic medical record. The main outcome was critical illness, defined as sepsis, need for mechanical ventilation, or death within 2 years of the initial visit. Patients with an ERA score ≥16 were considered to be at high risk. The discrimination of the ERA score was assessed using the area under the receiver operating characteristic curve. Results: Of the 13,457 eligible patients, 9,872 gave consent for medical record review and had full information on intensive care unit utilization. The mean age was 75.8 years (standard deviation ±7.6 years); 58% were female, 94% were Caucasian, 62% were married, and 13% were living in nursing homes. In the overall group, 417 patients (4.2%) suffered from critical illness. Among the 1,134 patients with an ERA score ≥16, 154 (14%) suffered from critical illness. An ERA score ≥16 predicted critical illness (odds ratio 6.35; 95% confidence interval 3.51-11.48). The area under the receiver operating characteristic curve was 0.75, indicating good discrimination. Conclusion: A simple model based on easily obtainable administrative data predicted critical illness in the next 2 years in elderly outpatients, with up to 14% of the highest-risk population suffering from critical illness.
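
    A sketch of the two headline statistics (odds ratio for ERA ≥16 and area under the ROC curve) on synthetic data; the data generation, the scikit-learn dependency, and the 0.5 continuity correction are all assumptions of this illustration, not the study's code.

        import numpy as np
        from sklearn.metrics import roc_auc_score

        rng = np.random.default_rng(1)
        era = rng.integers(0, 30, size=2000)               # synthetic ERA scores
        p_event = 0.1 / (1 + np.exp(-(era - 20) / 3.0))    # risk rises with score
        critical = rng.random(2000) < p_event              # synthetic outcomes
        high = era >= 16

        # 2x2 table for high ERA vs critical illness; the 0.5 correction
        # guards against empty cells in this small simulation.
        a = np.sum(high & critical);  b = np.sum(high & ~critical)
        c = np.sum(~high & critical); d = np.sum(~high & ~critical)
        odds_ratio = ((a + 0.5) * (d + 0.5)) / ((b + 0.5) * (c + 0.5))

        # Discrimination of the continuous score (the paper reports AUC = 0.75).
        print(f"OR = {odds_ratio:.2f}, AUC = {roc_auc_score(critical, era):.2f}")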

  17. Collaborative mobile sensing and computing for civil infrastructure condition assessment: framework and applications

    NASA Astrophysics Data System (ADS)

    Chen, Jianfei; Chen, ZhiQiang

    2012-04-01

    Multi-function sensing and imaging devices, GPS, communication and computing devices are used ubiquitously in the field by engineers in civil engineering and emergency response practice. Field engineers, however, still have difficulty balancing the ever-increasing demand for data collection against the capacity for real-time data processing and knowledge sharing. In addition, field engineers usually work collaboratively over a geospatially large area, yet the existing sensing and computing modalities used in the field are not designed to accommodate this condition. In this paper, we present a solution framework of collaborative mobile sensing and computing (CMSC) for civil infrastructure condition assessment, with Android-based mobile devices as the basic nodes in the framework and the potential of adding other auxiliary imaging and sensing devices into the network. Difficulties in the mixed C++ and Java programming that is critical to realizing the framework are discussed. With a few prototypes illustrated in this paper, we envisage that the proposed CMSC framework will enable seamless integration of sensing, imaging, real-time processing and knowledge discovery in future engineer-centered field reconnaissance and civil infrastructure condition assessment.

  18. [The importance of assessing the "quality of life" in surgical interventions for critical lower limb ischaemia].

    PubMed

    Papp, László

    2004-02-01

    'Patency' and 'limb salvage' are not automatically valid parameters when the functional outcome of treatment for critical limb ischaemia is assessed. In a small number of patients the functional result is not favourable despite the anatomical patency and limb salvage. The considerable investment of human/financial resources in the treatment of these patients is retrospectively questionable in such cases. Quality of Life questionnaires give valuable information on the functional outcome of any means of treatment for critical ischaemia. The problem with the generic tools in one particular sub-group of patients is the reliability and validity of the tests. The first disease-specific test in critical limb ischaemia is the King's College Vascular Quality of Life (VascuQoL) Questionnaire. Its use is recommended in patients with critical lower limb ischaemia. It is very useful for scientific reporting and is able to show retrospectively that particular group of patients in whom the technical success of the treatment did not result in improvement in quality of life. In general practice the use of the questionnaire can decrease the factor of subjectivity in the assessment of the current status of a patient with newly diagnosed or previously treated critical ischaemia. PMID:15270520

  19. Computer-based assessment for facioscapulohumeral dystrophy diagnosis.

    PubMed

    Chambers, O; Milenković, J; Pražnikar, A; Tasič, J F

    2015-06-01

    The paper presents a computer-based assessment for facioscapulohumeral dystrophy (FSHD) diagnosis through characterisation of the fat and oedema percentages in the muscle region. A novel multi-slice method for muscle-region segmentation in T1-weighted magnetic resonance images is proposed, using principles of the live-wire technique to find the path representing the muscle-region border. For this purpose, an exponential cost function is used that incorporates the edge information obtained after applying an edge-enhancement algorithm originally designed for fingerprint enhancement. The difference between the automatic segmentation and manual segmentation performed by a medical specialist is characterised using the Zijdenbos similarity index, indicating a high accuracy of the proposed method. Finally, the fat and oedema are quantified from the muscle region in the T1-weighted and T2-STIR magnetic resonance images, respectively, using the fuzzy c-means clustering approach for 10 FSHD patients. PMID:25910520
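
    The Zijdenbos similarity index used above to compare automatic and manual segmentations is the overlap measure 2|A∩B| / (|A| + |B|), equivalent to the Dice coefficient. A self-contained sketch on binary masks:

        import numpy as np

        def zijdenbos_si(a, b):
            """Zijdenbos similarity index between two boolean masks:
            2*|A & B| / (|A| + |B|); 1.0 means perfect overlap."""
            a, b = a.astype(bool), b.astype(bool)
            denom = a.sum() + b.sum()
            return 1.0 if denom == 0 else 2.0 * (a & b).sum() / denom

        # Toy example standing in for muscle-region masks on one MR slice.
        auto = np.zeros((8, 8), bool);   auto[2:6, 2:6] = True
        manual = np.zeros((8, 8), bool); manual[3:7, 2:6] = True
        print(f"ZSI = {zijdenbos_si(auto, manual):.3f}")  # 0.750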

  20. Approaches for the computationally efficient assessment of the plug-in HEV impact on the grid

    NASA Astrophysics Data System (ADS)

    Lee, Tae-Kyung; Filipi, Zoran S.

    2012-11-01

    Realistic duty cycles are critical for design and assessment of hybrid propulsion systems, in particular, plug-in hybrid electric vehicles. The analysis of the PHEV impact requires a large amount of data about daily missions for ensuring realism in predicted temporal loads on the grid. This paper presents two approaches for the reduction of the computational effort while assessing the large scale PHEV impact on the grid, namely 1) "response surface modelling" approach; and 2) "daily driving schedule modelling" approach. The response surface modelling approach replaces the time-consuming vehicle simulations by response surfaces constructed off-line with the consideration of the real-world driving. The daily driving modelling approach establishes a correlation between departure and arrival times, and it predicts representative driving patterns with a significantly reduced number of simulation cases. In both cases, representative synthetic driving cycles are used to capture the naturalistic driving characteristics for a given trip length. The proposed approaches enable construction of 24-hour missions, assessments of charging requirements at the time of plugging-in, and temporal distributions of the load on the grid with high computational efficiency.
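
    A toy sketch of the first approach: sample an "expensive" vehicle simulation off-line on a design grid, fit a quadratic response surface, then evaluate thousands of daily missions at negligible cost. The variables (trip length, departure hour) and the stand-in simulator are illustrative assumptions, not the paper's models.

        import numpy as np

        def simulate_energy(trip_km, depart_hr):
            # Stand-in for the time-consuming vehicle simulation: charging
            # energy (kWh) needed at plug-in. Not a real vehicle model.
            return 0.2 * trip_km + 0.5 * np.sin(depart_hr / 24 * 2 * np.pi) + 2.0

        def features(trip_km, depart_hr):
            return np.stack([np.ones_like(trip_km), trip_km, depart_hr,
                             trip_km**2, depart_hr**2, trip_km * depart_hr],
                            axis=-1)

        # Off-line: run the simulator on a 12x12 design grid, fit by least squares.
        t, h = np.meshgrid(np.linspace(5, 80, 12), np.linspace(0, 23, 12))
        X = features(t.ravel(), h.ravel())
        coef, *_ = np.linalg.lstsq(X, simulate_energy(t, h).ravel(), rcond=None)

        # On-line: the surrogate replaces the simulation for each daily mission.
        print(features(np.array(40.0), np.array(17.0)) @ coef)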

  1. Preliminary performance assessment of computer automated facial approximations using computed tomography scans of living individuals.

    PubMed

    Parks, Connie L; Richard, Adam H; Monson, Keith L

    2013-12-10

    ReFace (Reality Enhancement Facial Approximation by Computational Estimation) is a computer-automated facial approximation application jointly developed by the Federal Bureau of Investigation and GE Global Research. The application derives a statistically based approximation of a face from an unidentified skull using a dataset of ~400 human head computed tomography (CT) scans of living adult American individuals from four ancestry groups: African, Asian, European and Hispanic (self-identified). To date, only one unpublished subjective recognition study has been conducted using ReFace approximations. It indicated that approximations produced by ReFace were recognized above chance rates (10%). This preliminary study assesses: (i) the recognizability of five ReFace approximations; (ii) the recognizability of CT-derived skin surface replicas of the same individuals whose skulls were used to create the ReFace approximations; and (iii) the relationship between recognition performance and resemblance ratings of target individuals. All five skin surface replicas were recognized at rates significantly above chance (22-50%). Four of five ReFace approximations were recognized above chance (5-18%), although with statistical significance only at the higher rate. These results suggest reconsidering the usefulness of the type of output format utilized in this study, particularly with regard to facial approximations employed as a means of identifying unknown individuals. PMID:24314512
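
    The "above chance" comparisons amount to one-sided binomial tests of recognition counts against the 10% chance rate. A sketch with hypothetical counts (the abstract reports rates, not raw counts):

        from scipy.stats import binomtest

        # Hypothetical: 15 of 50 participants recognized one replica;
        # chance recognition is 10%.
        result = binomtest(k=15, n=50, p=0.10, alternative="greater")
        print(f"rate = {15 / 50:.0%}, one-sided p = {result.pvalue:.4f}")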

  2. Cyst-based measurements for assessing lymphangioleiomyomatosis in computed tomography

    SciTech Connect

    Lo, P.; Brown, M. S.; Kim, H.; Kim, H.; Goldin, J. G.; Argula, R.; Strange, C.

    2015-05-15

    Purpose: To investigate the efficacy of a new family of measurements made on individual pulmonary cysts extracted from computed tomography (CT) for assessing the severity of lymphangioleiomyomatosis (LAM). Methods: CT images were analyzed using thresholding to identify a cystic region of interest from chest CT of LAM patients. Individual cysts were then extracted from the cystic region by the watershed algorithm, which separates individual cysts based on subtle edges within the cystic regions. A family of measurements was then computed, quantifying the amount, distribution, and boundary appearance of the cysts. Sequential floating feature selection was used to select a small subset of features for quantification of the severity of LAM. Adjusted R² from multiple linear regression and R² from linear regression against measurements from spirometry were used to compare the performance of our proposed measurements with currently used density-based CT measurements in the literature, namely, the relative area measure and the D measure. Results: Volumetric CT data, performed at total lung capacity and residual volume, from a total of 49 subjects enrolled in the MILES trial were used in our study. Our proposed measures had adjusted R² ranging from 0.42 to 0.59 when regressing against the spirometry measures, with p < 0.05. For previously used density-based CT measurements in the literature, the best R² was 0.46 (for only one instance), with the majority being lower than 0.3 or having p > 0.05. Conclusions: The proposed family of CT-based cyst measurements correlates better with spirometric measures than previously used density-based CT measurements. They show potential as a sensitive tool for quantitatively assessing the severity of LAM.
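
    A minimal sketch of the threshold-then-watershed pipeline described in the Methods, using scikit-image; the -950 HU cyst threshold and the distance-transform marker strategy are generic choices for this illustration, not the paper's exact parameters.

        import numpy as np
        from scipy import ndimage
        from skimage.feature import peak_local_max
        from skimage.segmentation import watershed

        def label_cysts(ct_slice_hu, threshold_hu=-950.0):
            """Threshold a CT slice to a cystic mask, then split touching
            cysts with a distance-transform watershed; returns a label image."""
            mask = ct_slice_hu < threshold_hu
            distance = ndimage.distance_transform_edt(mask)
            coords = peak_local_max(distance, labels=mask.astype(int),
                                    min_distance=3)  # one marker per cyst centre
            markers = np.zeros(distance.shape, dtype=int)
            markers[tuple(coords.T)] = np.arange(1, len(coords) + 1)
            return watershed(-distance, markers, mask=mask)

        # Per-cyst measurements then follow directly, e.g. pixel areas:
        # labels = label_cysts(ct_slice); areas = np.bincount(labels.ravel())[1:]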

  3. Planning the Unplanned Experiment: Towards Assessing the Efficacy of Standards for Safety-Critical Software

    NASA Technical Reports Server (NTRS)

    Graydon, Patrick J.; Holloway, C. M.

    2015-01-01

    Safe use of software in safety-critical applications requires well-founded means of determining whether software is fit for such use. While software in industries such as aviation has a good safety record, little is known about whether standards for software in safety-critical applications 'work' (or even what that means). It is often (implicitly) argued that software is fit for safety-critical use because it conforms to an appropriate standard. Without knowing whether a standard works, such reliance is an experiment; without carefully collecting assessment data, that experiment is unplanned. To help plan the experiment, we organized a workshop to develop practical ideas for assessing software safety standards. In this paper, we relate and elaborate on the workshop discussion, which revealed subtle but important study design considerations and practical barriers to collecting appropriate historical data and recruiting appropriate experimental subjects. We discuss assessing standards as written and as applied, several candidate definitions for what it means for a standard to 'work,' and key assessment strategies and study techniques and the pros and cons of each. Finally, we conclude with thoughts about the kinds of research that will be required and how academia, industry, and regulators might collaborate to overcome the noted barriers.

  4. Color calculations for and perceptual assessment of computer graphic images

    SciTech Connect

    Meyer, G.W.

    1986-01-01

    Realistic image synthesis involves the modelling of an environment in accordance with the laws of physics and the production of a final simulation that is perceptually acceptable. To be considered a scientific endeavor, synthetic image generation should also include the final step of experimental verification. This thesis concentrates on the color calculations that are inherent in the production of the final simulation and on the perceptual assessment of the resulting computer graphic images. The fundamental spectral sensitivity functions that are active in the human visual system are introduced and are used to address color-blindness issues in computer graphics. A digitally controlled color television monitor is employed to successfully implement both the Farnsworth-Munsell 100-hue test and a new color vision test that yields more accurate diagnoses. Images that simulate color-blind vision are synthesized and are used to evaluate color scales for data display. Gaussian quadrature is used with a set of opponent fundamentals to select the wavelengths at which to perform synthetic image generation.

  5. Quantitative assessment of computed radiography quality control parameters.

    PubMed

    Rampado, O; Isoardi, P; Ropolo, R

    2006-03-21

    Quality controls for testing the performance of computed radiography (CR) systems have been recommended by manufacturers and medical physicists' organizations. The purpose of this work was to develop a set of image processing tools for quantitative assessment of computed radiography quality control parameters. Automatic image analysis consisted in detecting phantom details, defining regions of interest and acquiring measurements. The tested performance characteristics included dark noise, uniformity, exposure calibration, linearity, low-contrast and spatial resolution, spatial accuracy, laser beam function and erasure thoroughness. CR devices from two major manufacturers were evaluated. We investigated several approaches to quantify the detector response uniformity. We developed methods to characterize the spatial accuracy and resolution properties across the entire image area, based on the Fourier analysis of the image of a fine wire mesh. The implemented methods were sensitive to local blurring and allowed us to detect a local distortion of 4% or greater in any part of an imaging plate. The obtained results showed that the developed image processing tools allow us to implement a quality control program for CR with short processing time and with absence of subjectivity in the evaluation of the parameters. PMID:16510964

  6. Assessing Critical Thinking Outcomes of Dental Hygiene Students Utilizing Virtual Patient Simulation: A Mixed Methods Study.

    PubMed

    Allaire, Joanna L

    2015-09-01

    Dental hygiene educators must determine which educational practices best promote critical thinking, a quality necessary to translate knowledge into sound clinical decision making. The aim of this small pilot study was to determine whether virtual patient simulation had an effect on the critical thinking of dental hygiene students. A pretest-posttest design using the Health Science Reasoning Test was used to evaluate the critical thinking skills of senior dental hygiene students at The University of Texas School of Dentistry at Houston Dental Hygiene Program before and after their experience with computer-based patient simulation cases. Additional survey questions sought to identify the students' perceptions of whether the experience had helped develop their critical thinking skills and improved their ability to provide competent patient care. A convenience sample of 31 senior dental hygiene students completed both the pretest and posttest (81.5% of total students in that class); 30 senior dental hygiene students completed the survey on perceptions of the simulation (78.9% response rate). Although the results did not show a significant increase in mean scores, the students reported feeling that the use of virtual patients was an effective teaching method to promote critical thinking, problem-solving, and confidence in the clinical realm. The results of this pilot study may support the use of virtual patient simulations in dental hygiene education. Future research could include a larger controlled study to validate these findings. PMID:26329033

  7. Computational Performance Assessment of k-mer Counting Algorithms.

    PubMed

    Pérez, Nelson; Gutierrez, Miguel; Vera, Nelson

    2016-04-01

    This article assesses several tools for k-mer counting, with the purpose of creating a reference framework for bioinformatics researchers to identify the computational requirements, parallelization, advantages, disadvantages, and bottlenecks of each of the algorithms implemented in the tools. The k-mer counters evaluated in this article were BFCounter, DSK, Jellyfish, KAnalyze, KHMer, KMC2, MSPKmerCounter, Tallymer, and Turtle. The measured parameters were the following: RAM occupied space, processing time, parallelization, and read and write disk access. A dataset consisting of 36,504,800 reads corresponding to the 14th human chromosome was used. The assessment was performed for two k-mer lengths: 31 and 55. The results obtained were the following: pure Bloom filter-based tools and disk-partitioning techniques showed lower RAM use. The tools that took the least execution time were the ones that used disk-partitioning techniques. The tools that achieved the greatest parallelization were the ones that used disk partitioning, hash tables with a lock-free approach, or multiple hash tables. PMID:26982880
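
    For orientation, the task being benchmarked is simply tallying every length-k substring of the reads; the evaluated tools differ in how they make this scale (Bloom filters, disk partitioning, lock-free hash tables). A naive in-memory baseline for comparison:

        from collections import Counter

        def count_kmers(reads, k):
            """Naive k-mer counter; fine for toy input, but the dict grows
            unboundedly, which is why the benchmarked tools exist."""
            counts = Counter()
            for read in reads:
                for i in range(len(read) - k + 1):
                    kmer = read[i:i + k]
                    if "N" not in kmer:  # skip ambiguous bases
                        counts[kmer] += 1
            return counts

        print(count_kmers(["ACGTACGTGACG", "TTGACGTACGTA"], 4).most_common(3))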

  8. TRECII: a computer program for transportation risk assessment

    SciTech Connect

    Franklin, A.L.

    1980-05-01

    A risk-based fault tree analysis method has been developed at the Pacific Northwest Laboratory (PNL) for analysis of nuclear fuel cycle operations. This methodology was developed for the Department of Energy (DOE) as a risk analysis tool for evaluating high-level waste management systems. A computer package consisting of three programs was written at that time to assist in the performance of risk assessment: ACORN (draws fault trees), MFAULT (analyzes fault trees), and RAFT (calculates risk). This methodology evaluates release consequences and estimates the frequency of occurrence of these consequences. This document describes an additional risk-calculating code which can be used in conjunction with two of the three codes for transportation risk assessment. TRECII modifies the definition of risk used in RAFT (probability × release) to accommodate release consequences in terms of fatalities. Throughout this report, risk is defined as probability times consequences (fatalities being one possible health-effect consequence). This methodology has been applied to a variety of energy material transportation systems. Typically the material shipped has been radioactive, although some adaptation to fossil fuels has occurred. The approach is normally applied to truck or train transport systems, with some adaptation to pipelines and aircraft. TRECII is designed to be used primarily in conjunction with MFAULT; however, with a moderate amount of effort by the user, it can be implemented independently of the risk analysis package developed at PNL. A code description and the user instructions necessary for implementation of the TRECII program are provided.
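
    The report's risk definition (probability times consequences, with fatalities as the consequence measure) reduces to an expected-value sum over release scenarios. A sketch with invented numbers, not TRECII or MFAULT output:

        # Risk = sum over scenarios of (occurrence probability x fatalities).
        scenarios = [
            {"name": "vehicle accident, minor release", "prob": 1e-4, "fatalities": 0.1},
            {"name": "vehicle accident, major release", "prob": 1e-6, "fatalities": 10.0},
            {"name": "handling error at terminal",      "prob": 5e-5, "fatalities": 0.5},
        ]
        risk = sum(s["prob"] * s["fatalities"] for s in scenarios)
        print(f"expected fatalities per shipment-year: {risk:.2e}")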

  9. Computational fluid dynamics framework for aerodynamic model assessment

    NASA Astrophysics Data System (ADS)

    Vallespin, D.; Badcock, K. J.; Da Ronch, A.; White, M. D.; Perfect, P.; Ghoreyshi, M.

    2012-07-01

    This paper reviews the work carried out at the University of Liverpool to assess the use of CFD methods for aircraft flight dynamics applications. Three test cases are discussed in the paper, namely, the Standard Dynamic Model, the Ranger 2000 jet trainer and the Stability and Control Unmanned Combat Air Vehicle. For each of these, a tabular aerodynamic model based on CFD predictions is generated along with validation against wind tunnel experiments and flight test measurements. The main purpose of the paper is to assess the validity of the tables of aerodynamic data for the force and moment prediction of realistic aircraft manoeuvres. This is done by generating a manoeuvre based on the tables of aerodynamic data, and then replaying the motion through a time-accurate computational fluid dynamics calculation. The resulting forces and moments from these simulations were compared with predictions from the tables. As the latter are based on a set of steady-state predictions, the comparisons showed perfect agreement for slow manoeuvres. As manoeuvres became more aggressive some disagreement was seen, particularly during periods of large rates of change in attitudes. Finally, the Ranger 2000 model was used on a flight simulator.
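
    The "tables of aerodynamic data" approach amounts to interpolating steady-state CFD results while replaying a manoeuvre. A sketch of that lookup, with an invented grid over angle of attack and Mach number:

        import numpy as np
        from scipy.interpolate import RegularGridInterpolator

        # Steady-state CFD predictions of a moment coefficient on a coarse
        # grid (values are random placeholders for this sketch).
        alpha = np.deg2rad([-4.0, 0.0, 4.0, 8.0, 12.0])  # angle of attack
        mach = np.array([0.3, 0.5, 0.7])
        cm_table = np.random.default_rng(2).normal(size=(alpha.size, mach.size))
        cm = RegularGridInterpolator((alpha, mach), cm_table)

        # Replay: query the table along the (alpha, Mach) flight trajectory.
        trajectory = [(np.deg2rad(2.0), 0.45), (np.deg2rad(6.5), 0.52)]
        print(cm(trajectory))

    Because the table is built from steady-state solutions, such a replay matches time-accurate CFD for slow manoeuvres and drifts during rapid attitude changes, exactly the behaviour the paper reports.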

  10. In Just What Sense Should I Be Critical? An Exploration into the Notion of "Assumption" and Some Implications for Assessment

    ERIC Educational Resources Information Center

    Mejia D., Andres

    2009-01-01

    The current dominant approach on the assessment of critical thinking takes as a starting point a conception of criticality that does not commit to any substantive view or context of meaning concerning what issues are relevant to be critical about in society or in life. Nevertheless, as a detailed examination of the identification of assumptions…

  11. Development of an Automated Security Risk Assessment Methodology Tool for Critical Infrastructures.

    SciTech Connect

    Jaeger, Calvin D.; Roehrig, Nathaniel S.; Torres, Teresa M.

    2008-12-01

    This document presents the automated security Risk Assessment Methodology (RAM) prototype tool developed by Sandia National Laboratories (SNL). This work leverages SNL's capabilities and skills in security risk analysis and the development of vulnerability assessment/risk assessment methodologies to develop an automated prototype security RAM tool for critical infrastructures (RAM-CI™). The prototype automated RAM tool provides a user-friendly, systematic, and comprehensive risk-based tool to assist CI sector and security professionals in assessing and managing security risk from malevolent threats. The current tool is structured on the basic RAM framework developed by SNL. It is envisioned that this prototype tool will be adapted to meet the requirements of different CI sectors and thereby provide additional capabilities.

  12. Assessment team report on flight-critical systems research at NASA Langley Research Center

    NASA Technical Reports Server (NTRS)

    Siewiorek, Daniel P. (Compiler); Dunham, Janet R. (Compiler)

    1989-01-01

    The quality, coverage, and distribution of effort of the flight-critical systems research program at NASA Langley Research Center was assessed. Within the scope of the Assessment Team's review, the research program was found to be very sound. All tasks under the current research program were at least partially addressing the industry needs. General recommendations made were to expand the program resources to provide additional coverage of high priority industry needs, including operations and maintenance, and to focus the program on an actual hardware and software system that is under development.

  13. Multi-intelligence critical rating assessment of fusion techniques (MiCRAFT)

    NASA Astrophysics Data System (ADS)

    Blasch, Erik

    2015-06-01

    Assessment of multi-intelligence fusion techniques includes credibility of algorithm performance, quality of results against mission needs, and usability in a work-domain context. Situation awareness (SAW) brings together low-level information fusion (tracking and identification), high-level information fusion (threat and scenario-based assessment), and information fusion level 5 user refinement (physical, cognitive, and information tasks). To measure SAW, we discuss the SAGAT (Situational Awareness Global Assessment Technique) for a multi-intelligence fusion (MIF) system assessment that focuses on the advantages of MIF over single intelligence sources. Building on the NASA TLX (Task Load Index), SAGAT probes, SART (Situational Awareness Rating Technique) questionnaires, and CDM (Critical Decision Method) decision points, we highlight these tools for use in a Multi-Intelligence Critical Rating Assessment of Fusion Techniques (MiCRAFT). The focus is to measure user refinement of a situation over the information fusion quality-of-service (QoS) metrics: timeliness, accuracy, confidence, workload (cost), and attention (throughput). A key component of any user analysis includes correlation, association, and summarization of data, so we also seek measures of product quality and QuEST of information. Building a notion of product quality from multi-intelligence tools is typically subjective, and it needs to be aligned with objective machine metrics.

  14. Reshaping Computer Literacy Teaching in Higher Education: Identification of Critical Success Factors

    ERIC Educational Resources Information Center

    Taylor, Estelle; Goede, Roelien; Steyn, Tjaart

    2011-01-01

    Purpose: Acquiring computer skills is more important today than ever before, especially in a developing country. Teaching of computer skills, however, has to adapt to new technology. This paper aims to model factors influencing the success of the learning of computer literacy by means of an e-learning environment. The research question for this…

  15. The Computing Alliance of Hispanic-Serving Institutions: Supporting Hispanics at Critical Transition Points

    ERIC Educational Resources Information Center

    Gates, Ann Quiroz; Hug, Sarah; Thiry, Heather; Alo, Richard; Beheshti, Mohsen; Fernandez, John; Rodriguez, Nestor; Adjouadi, Malek

    2011-01-01

    Hispanics have the highest growth rates among all groups in the U.S., yet they remain considerably underrepresented in computing careers and in the numbers who obtain advanced degrees. Hispanics constituted about 7% of undergraduate computer science and computer engineering graduates and 1% of doctoral graduates in 2007-2008. The small number of…

  16. Computer-Based Testing: An Alternative for the Assessment of Turkish Undergraduate Students

    ERIC Educational Resources Information Center

    Akdemir, Omur; Oguz, Ayse

    2008-01-01

    Virtually errorless high speed data processing feature has made computers popular assessment tools in education. An important concern in developing countries considering integrating computers as an educational assessment tool before making substantial investment is the effects of computer-based testing on students' test scores as compared to…

  17. Computer Science: A Historical Perspective and a Current Assessment

    NASA Astrophysics Data System (ADS)

    Wirth, Niklaus

    We begin with a brief review of the early years of Computer Science. This period was dominated by large, remote computers and the struggle to master the complex problems of programming. The remedy was found in programming languages providing suitable abstractions and programming models. Outstanding was the language Algol 60, designed by an international committee, and intended as a publication language for algorithms. The early period ends with the advent of the microcomputer in the mid 1970s, bringing computing into homes and schools. The outstanding computer was the Alto, the first personal computer with substantial computing power. It changed the world of computing.

  18. Combination of inquiry learning model and computer simulation to improve mastery concept and the correlation with critical thinking skills (CTS)

    NASA Astrophysics Data System (ADS)

    Nugraha, Muhamad Gina; Kaniawati, Ida; Rusdiana, Dadi; Kirana, Kartika Hajar

    2016-02-01

    Among the purposes of physics learning at high school are mastering physics concepts, cultivating scientific attitudes (including critical attitudes), and developing inductive and deductive reasoning skills. According to Ennis et al., inductive and deductive reasoning skills are part of critical thinking. Based on preliminary studies, both competencies are insufficiently achieved: student learning outcomes are low, and learning processes are not conducive to cultivating critical thinking (teacher-centered learning). One learning model predicted to increase mastery of concepts and train CTS is the inquiry learning model aided by computer simulations. In this model, students are given the opportunity to be actively involved in the experiment and also receive clear explanations through the computer simulations. From research with a randomized control-group pretest-posttest design, we found that the inquiry learning model aided by computer simulations significantly improved students' mastery of concepts compared with the conventional (teacher-centered) method. With the inquiry learning model aided by computer simulations, 20% of students had high CTS, 63.3% medium, and 16.7% low. CTS contributed greatly to students' mastery of concepts, with a correlation coefficient of 0.697, and contributed substantially to the enhancement of mastery of concepts, with a correlation coefficient of 0.603.

  19. Developing Critical Thinking Skills in Computer-Aided Extended Reading Classes

    ERIC Educational Resources Information Center

    Daud, Nuraihan Mat; Husin, Zamnah

    2004-01-01

    One of the skills that can be taught in an English proficiency class that adopts literary texts for teaching the language is critical thinking. The background, characters and their motives are among those that invite critical inquiry and interpretation. Although it has been claimed that discussing literary texts in the traditional way can help…

  20. Examining the Critical Thinking Dispositions and the Problem Solving Skills of Computer Engineering Students

    ERIC Educational Resources Information Center

    Özyurt, Özcan

    2015-01-01

    Problem solving is an indispensable part of engineering. Improving critical thinking dispositions for solving engineering problems is one of the objectives of engineering education. In this sense, knowing critical thinking and problem solving skills of engineering students is of importance for engineering education. This study aims to determine…

  1. Benchmark Problems Used to Assess Computational Aeroacoustics Codes

    NASA Technical Reports Server (NTRS)

    Dahl, Milo D.; Envia, Edmane

    2005-01-01

    The field of computational aeroacoustics (CAA) encompasses numerical techniques for calculating all aspects of sound generation and propagation in air directly from fundamental governing equations. Aeroacoustic problems typically involve flow-generated noise, with and without the presence of a solid surface, and the propagation of the sound to a receiver far away from the noise source. It is a challenge to obtain accurate numerical solutions to these problems. The NASA Glenn Research Center has been at the forefront in developing and promoting the development of CAA techniques and methodologies for computing the noise generated by aircraft propulsion systems. To assess the technological advancement of CAA, Glenn, in cooperation with the Ohio Aerospace Institute and the AeroAcoustics Research Consortium, organized and hosted the Fourth CAA Workshop on Benchmark Problems. Participants from industry and academia from both the United States and abroad joined to present and discuss solutions to benchmark problems. These demonstrated technical progress ranging from the basic challenges to accurate CAA calculations to the solution of CAA problems of increasing complexity and difficulty. The results are documented in the proceedings of the workshop. Problems were solved in five categories. In three of the five categories, exact solutions were available for comparison with CAA results. A fourth category of problems representing sound generation from either a single airfoil or a blade row interacting with a gust (i.e., problems relevant to fan noise) had approximate analytical or completely numerical solutions. The fifth category of problems involved sound generation in a viscous flow. In this case, the CAA results were compared with experimental data.

  2. Assessment of metabolic bone diseases by quantitative computed tomography

    NASA Technical Reports Server (NTRS)

    Richardson, M. L.; Genant, H. K.; Cann, C. E.; Ettinger, B.; Gordan, G. S.; Kolb, F. O.; Reiser, U. J.

    1985-01-01

    Advances in the radiologic sciences have permitted the development of numerous noninvasive techniques for measuring the mineral content of bone, with varying degrees of precision, accuracy, and sensitivity. The techniques of standard radiography, radiogrammetry, photodensitometry, Compton scattering, neutron activation analysis, single and dual photon absorptiometry, and quantitative computed tomography (QCT) are described and reviewed in depth. Results from previous cross-sectional and longitudinal QCT investigations are given. The authors then describe a current investigation in which they studied 269 subjects, including 173 normal women, 34 patients with hyperparathyroidism, 24 patients with steroid-induced osteoporosis, and 38 men with idiopathic osteoporosis. Spinal quantitative computed tomography, radiogrammetry, and single photon absorptiometry were performed, and a spinal fracture index was calculated on all patients. The authors found a disproportionate loss of spinal trabecular mineral compared to appendicular mineral in the men with idiopathic osteoporosis and the patients with steroid-induced osteoporosis. They observed roughly equivalent mineral loss in both the appendicular and axial regions in the hyperparathyroid patients. The appendicular cortical measurements correlated moderately well with each other but less well with spinal trabecular QCT. The spinal fracture index correlated well with QCT and less well with the appendicular measurements. Knowledge of appendicular cortical mineral status is important in its own right but is not a valid predictor of axial trabecular mineral status, which may be disproportionately decreased in certain diseases. Quantitative CT provides a reliable means of assessing the latter region of the skeleton, correlates well with the spinal fracture index (a semiquantitative measurement of end-organ failure), and offers the clinician a sensitive means of following the effects of therapy.

  3. Orthogonal arrays for computer experiments to assess important inputs

    SciTech Connect

    Moore, L. M.; McKay, Michael D.

    2002-01-01

    The topic of this paper is experiment planning, particularly fractional factorial designs or orthogonal arrays, for computer experiments to assess important inputs. The work presented in the paper is motivated by considering a non-stochastic computer simulation which has many inputs and which can, in a reasonable period of time, be run thousands of times. With many inputs, information that allows focus on a subset of important inputs is valuable. The characterization of 'importance' is expected to follow suggestions in McKay (1995) or McKay et al. (1992). This analysis approach leads to considering factorial experiment designs. Inputs are associated with a finite number of discrete values, referred to as levels, so if each input has K levels and there are p inputs then there are K^p possible distinct runs, which constitute the K^p factorial design space. The suggested size of p has been 35 to 50, so that even with K = 2 the complete 2^p factorial design space would not be run. Further, it is expected that the complexity of the simulation code and discrete levels possibly associated with equi-probable intervals from the input distribution make it desirable to consider more than 2-level inputs. Input levels of 5 and 7 have been investigated. In this paper, orthogonal array experiment designs, which are subsets of factorial designs also referred to as fractional factorial designs, are suggested as candidate experiments which provide a meaningful basis for calculating and comparing R² across subsets of inputs.
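
    A small sketch of the screening idea: run the simulation over a factorial design on discrete input levels, then compare R² from regressions on subsets of inputs. The 2-level full factorial and the toy response below are illustrative; the paper's designs use orthogonal-array fractions with 5 or 7 levels.

        import itertools
        import numpy as np

        # Full 2-level factorial over p = 4 inputs (2^4 = 16 runs); with
        # p = 35-50 one would use an orthogonal-array fraction instead.
        design = np.array(list(itertools.product([-1.0, 1.0], repeat=4)))

        # Stand-in simulation output: only inputs 0 and 2 actually matter.
        rng = np.random.default_rng(3)
        y = 3.0 * design[:, 0] - 2.0 * design[:, 2] + 0.1 * rng.normal(size=16)

        def r_squared(cols):
            """R^2 of a least-squares fit of y on the chosen input columns."""
            X = np.column_stack([np.ones(len(design)), design[:, cols]])
            beta, *_ = np.linalg.lstsq(X, y, rcond=None)
            resid = y - X @ beta
            return 1.0 - (resid @ resid) / ((y - y.mean()) @ (y - y.mean()))

        for cols in [[0], [1], [0, 2], [1, 3]]:
            print(cols, round(r_squared(cols), 3))  # [0, 2] explains most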

  4. A more efficient computational procedure for calculating the critical current of a multi-junction superconducting interferometer

    NASA Astrophysics Data System (ADS)

    Lutes, C. L.; Gershenson, M.; Schneider, R. J.

    1985-03-01

    The textbook procedure for the solution of the critical current of an N-junction superconducting interferometer is a 2N-1 dimensional steepest descent problem. A solution by this procedure is complicated by the existence of multiple local minima. The equations are reformulated to reduce the problem to a three-dimensional steepest descent problem. From this reduced equation set, a non-steepest-descent procedure is developed. This technique produces a solution by adjusting a trial critical current value until tangency between a straight line and a special error function is achieved. For a 10-junction test case, an 80-to-1 reduction in computer time was achieved.

  5. Intelligent computer based reliability assessment of multichip modules

    NASA Astrophysics Data System (ADS)

    Grosse, Ian R.; Katragadda, Prasanna; Bhattacharya, Sandeepan; Kulkarni, Sarang

    1994-04-01

    To deliver reliable multichip modules (MCMs) in the face of rapidly changing technology, computer-based tools are needed for predicting the thermal and mechanical behavior of various MCM package designs and selecting the most promising design in terms of performance, robustness, and reliability. The design tool must be able to address new design technologies, manufacturing processes, novel materials, application criteria, and thermal environmental conditions. Reliability is one of the most important factors for determining design quality and hence must be a central consideration in the design of multichip module packages. Clearly, design engineers need computer-based simulation tools for rapid and efficient electrical, thermal, and mechanical modeling and optimization of advanced devices. For three-dimensional thermal and mechanical simulation of advanced devices, the finite element method (FEM) is increasingly becoming the numerical method of choice. FEM is a versatile and sophisticated numerical technique for solving the partial differential equations that describe the physical behavior of complex designs. AUTOTHERM(TM) is an MCM design tool developed by Mentor Graphics for Motorola, Inc. This tool performs thermal analysis of MCM packages using finite element analysis techniques. The tool uses the philosophy of object-oriented representation of components and simplified specification of boundary conditions for the thermal analysis, so that the user need not be an expert in finite element techniques. Different package types can be assessed and environmental conditions can be modeled. It also includes a detailed reliability module which allows the user to choose a desired failure mechanism (model). However, the current tools perform thermal and/or stress analysis without addressing the robustness and optimality of MCM designs, and the reliability prediction techniques, being based on closed-form analytical models, can often fail to predict the cycles of failure (N

  6. Development and Evaluation of Computer-Assisted Assessment in Higher Education in Relation to BS7988

    ERIC Educational Resources Information Center

    Shephard, Kerry; Warburton, Bill; Maier, Pat; Warren, Adam

    2006-01-01

    A university-wide project team of academic and administrative staff worked together to prepare, deliver and evaluate a number of diagnostic, formative and summative computer-based assessments. The team also attempted to assess the University of Southampton's readiness to deliver computer-assisted assessment (CAA) within the "Code of practice for…

  7. Improving Student Performance through Computer-Based Assessment: Insights from Recent Research.

    ERIC Educational Resources Information Center

    Ricketts, C.; Wilks, S. J.

    2002-01-01

    Compared student performance on computer-based assessment to machine-graded multiple choice tests. Found that performance improved dramatically on the computer-based assessment when students were not required to scroll through the question paper. Concluded that students may be disadvantaged by the introduction of online assessment unless care is…

  8. Development of computer-based analytical tool for assessing physical protection system

    NASA Astrophysics Data System (ADS)

    Mardhi, Alim; Pengvanich, Phongphaeth

    2016-01-01

    Assessment of physical protection system effectiveness is the priority for ensuring optimum protection against unlawful acts at a nuclear facility, such as unauthorized removal of nuclear materials and sabotage of the facility itself. Since an assessment based on real exercise scenarios is costly and time-consuming, a computer-based analytical tool offers a practical way to evaluate likely threat scenarios. Several tools are readily available, such as EASI and SAPE; however, for our research purposes it is more suitable to have a tool that can be customized and enhanced further. In this work, we have developed a computer-based analytical tool that uses a network methodological approach to model adversary paths. The inputs are the multiple security elements used to evaluate the effectiveness of the system's detection, delay, and response. The tool can analyze the most critical path and quantify the probability of effectiveness of the system as a performance measure.
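
    A minimal sketch of the network approach: represent the facility as a graph whose edges carry detection probabilities, and find the most critical (least-detectable) adversary path by minimizing -ln(1 - p) along edges, which maximizes the overall probability of non-detection. Node names, probabilities, and the networkx dependency are assumptions of this illustration, not the authors' tool.

        import math
        import networkx as nx

        # Edges: (from, to, probability the adversary is detected there).
        edges = [("offsite", "fence", 0.30), ("offsite", "gate", 0.70),
                 ("fence", "building", 0.50), ("gate", "building", 0.40),
                 ("building", "target", 0.60)]

        G = nx.DiGraph()
        for u, v, p in edges:
            # Minimizing the sum of -ln(1 - p) maximizes prod(1 - p).
            G.add_edge(u, v, w=-math.log(1.0 - p), p=p)

        path = nx.shortest_path(G, "offsite", "target", weight="w")
        p_miss = math.prod(1.0 - G[u][v]["p"] for u, v in zip(path, path[1:]))
        print(path, f"P(detection along path) = {1.0 - p_miss:.2f}")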

  9. Quantitative assessment of computational models for retinotopic map formation.

    PubMed

    Hjorth, J J Johannes; Sterratt, David C; Cutts, Catherine S; Willshaw, David J; Eglen, Stephen J

    2015-06-01

    Molecular and activity-based cues acting together are thought to guide retinal axons to their terminal sites in vertebrate optic tectum or superior colliculus (SC) to form an ordered map of connections. The details of mechanisms involved, and the degree to which they might interact, are still not well understood. We have developed a framework within which existing computational models can be assessed in an unbiased and quantitative manner against a set of experimental data curated from the mouse retinocollicular system. Our framework facilitates comparison between models, testing new models against known phenotypes and simulating new phenotypes in existing models. We have used this framework to assess four representative models that combine Eph/ephrin gradients and/or activity-based mechanisms and competition. Two of the models were updated from their original form to fit into our framework. The models were tested against five different phenotypes: wild type, Isl2-EphA3(ki/ki), Isl2-EphA3(ki/+), ephrin-A2,A3,A5 triple knock-out (TKO), and Math5(-/-) (Atoh7). Two models successfully reproduced the extent of the Math5(-/-) anteromedial projection, but only one of those could account for the collapse point in Isl2-EphA3(ki/+). The models needed a weak anteroposterior gradient in the SC to reproduce the residual order in the ephrin-A2,A3,A5 TKO phenotype, suggesting either an incomplete knock-out or the presence of another guidance molecule. Our article demonstrates the importance of testing retinotopic models against as full a range of phenotypes as possible, and we have made available MATLAB software that we wrote to facilitate this process. PMID:25367067

  11. Self-motion perception: assessment by computer-generated animations

    NASA Technical Reports Server (NTRS)

    Parker, D. E.; Harm, D. L.; Sandoz, G. R.; Skinner, N. C.

    1998-01-01

    The goal of this research is more precise description of adaptation to sensory rearrangements, including microgravity, by development of improved procedures for assessing spatial orientation perception. Thirty-six subjects reported perceived self-motion following exposure to complex inertial-visual motion. Twelve subjects were assigned to each of 3 perceptual reporting procedures: (a) animation movie selection, (b) written report selection and (c) verbal report generation. The question addressed was: do reports produced by these procedures differ with respect to complexity and reliability? Following repeated (within-day and across-day) exposures to 4 different "motion profiles," subjects either (a) selected movies presented on a laptop computer, or (b) selected written descriptions from a booklet, or (c) generated self-motion verbal descriptions that corresponded most closely with their motion experience. One "complexity" and 2 reliability "scores" were calculated. Contrary to expectations, reliability and complexity scores were essentially equivalent for the animation movie selection and written report selection procedures. Verbal report generation subjects exhibited less complexity than did subjects in the other conditions and their reports were often ambiguous. The results suggest that, when selecting from carefully written descriptions and following appropriate training, people may be better able to describe their self-motion experience with words than is usually believed.

  12. Combining destination diversion decisions and critical in-flight event diagnosis in computer aided testing of pilots

    NASA Technical Reports Server (NTRS)

    Rockwell, T. H.; Giffin, W. C.; Romer, D. J.

    1984-01-01

    Rockwell and Giffin (1982) and Giffin and Rockwell (1983) have discussed the use of computer aided testing (CAT) in the study of pilot response to critical in-flight events. The present investigation represents an extension of these earlier studies. In testing pilot responses to critical in-flight events, use is made of a PLATO touch CRT system operating on a menu-based format. In connection with the typical diagnostic problem, the pilot was presented with symptoms within a flight scenario. In one problem, the pilot has four minutes to obtain the information needed to make a diagnosis of the problem. In the reported research, an attempt has been made to combine both the diagnosis and the diversion scenarios into a single computer-aided test. Tests with nine subjects were conducted. The obtained results and their significance are discussed.

  13. Computation Modeling and Assessment of Nanocoatings for Ultra Supercritical Boilers

    SciTech Connect

    J. Shingledecker; D. Gandy; N. Cheruvu; R. Wei; K. Chan

    2011-06-21

    Forced outages and boiler unavailability of coal-fired fossil plants are most often caused by fire-side corrosion of boiler waterwalls and tubing. Reliable coatings are required for ultrasupercritical (USC) applications to mitigate corrosion, since these boilers will operate at much higher temperatures and pressures than supercritical (565 °C at 24 MPa) boilers. Computational modeling efforts have been undertaken to design and assess potential Fe-Cr-Ni-Al systems to produce stable nanocrystalline coatings that form a protective, continuous scale of either Al₂O₃ or Cr₂O₃. The computational modeling results identified a new series of Fe-25Cr-40Ni, with or without 10 wt.% Al, nanocrystalline coatings that maintain long-term stability by forming a diffusion-barrier layer at the coating/substrate interface. The computational modeling predictions of microstructure, formation of a continuous Al₂O₃ scale, inward Al diffusion, grain growth, and sintering behavior were validated with experimental results. Advanced coatings, such as MCrAl (where M is Fe, Ni, or Co) nanocrystalline coatings, have been processed using different magnetron sputtering deposition techniques. Several coating trials were performed, and among the processing methods evaluated, the DC pulsed magnetron sputtering technique produced the best quality coating with a minimum number of shallow defects; the results of multiple deposition trials showed that the process is repeatable. The cyclic oxidation test results revealed that the nanocrystalline coatings offer better oxidation resistance, in terms of weight loss, localized oxidation, and formation of mixed oxides in the Al₂O₃ scale, than widely used MCrAlY coatings. However, the ultra-fine grain structure in these coatings, consistent with the computational model predictions, resulted in accelerated Al

  14. Overview of BioCreAtIvE: critical assessment of information extraction for biology

    PubMed Central

    Hirschman, Lynette; Yeh, Alexander; Blaschke, Christian; Valencia, Alfonso

    2005-01-01

    Background The goal of the first BioCreAtIvE challenge (Critical Assessment of Information Extraction in Biology) was to provide a set of common evaluation tasks to assess the state of the art for text mining applied to biological problems. The results were presented in a workshop held in Granada, Spain March 28–31, 2004. The articles collected in this BMC Bioinformatics supplement entitled "A critical assessment of text mining methods in molecular biology" describe the BioCreAtIvE tasks, systems, results and their independent evaluation. Results BioCreAtIvE focused on two tasks. The first dealt with extraction of gene or protein names from text, and their mapping into standardized gene identifiers for three model organism databases (fly, mouse, yeast). The second task addressed issues of functional annotation, requiring systems to identify specific text passages that supported Gene Ontology annotations for specific proteins, given full text articles. Conclusion The first BioCreAtIvE assessment achieved a high level of international participation (27 groups from 10 countries). The assessment provided state-of-the-art performance results for a basic task (gene name finding and normalization), where the best systems achieved a balanced 80% precision / recall or better, which potentially makes them suitable for real applications in biology. The results for the advanced task (functional annotation from free text) were significantly lower, demonstrating the current limitations of text-mining approaches where knowledge extrapolation and interpretation are required. In addition, an important contribution of BioCreAtIvE has been the creation and release of training and test data sets for both tasks. There are 22 articles in this special issue, including six that provide analyses of results or data quality for the data sets, including a novel inter-annotator consistency assessment for the test set used in task 2. PMID:15960821
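
    For readers unfamiliar with the metric quoted above, the sketch below shows how set-based precision and recall are computed for a gene-name finding task and how a "balanced" score arises when the two are equal. It is a minimal Python illustration with hypothetical gene names, not BioCreAtIvE data or scoring code.

        def precision_recall_f1(predicted, gold):
            """Set-based scoring typical of named-entity evaluations."""
            tp = len(predicted & gold)                    # true positives
            precision = tp / len(predicted) if predicted else 0.0
            recall = tp / len(gold) if gold else 0.0
            f1 = (2 * precision * recall / (precision + recall)
                  if precision + recall else 0.0)
            return precision, recall, f1

        # Hypothetical example: 4 of 5 predictions correct, 4 of 5 gold names found,
        # giving the balanced 80% precision/recall mentioned in the abstract.
        gold = {"BRCA1", "TP53", "CDC28", "rad51", "Adh"}
        pred = {"BRCA1", "TP53", "CDC28", "rad51", "kinase"}
        print(precision_recall_f1(pred, gold))            # (0.8, 0.8, 0.8)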

  15. Assessment and Utility of Frailty Measures in Critical Illness, Cardiology, and Cardiac Surgery.

    PubMed

    Rajabali, Naheed; Rolfson, Darryl; Bagshaw, Sean M

    2016-09-01

    Frailty is a clearly emerging theme in acute care medicine, with obvious prognostic and health resource implications. "Frailty" is a term used to describe a multidimensional syndrome of loss of homeostatic reserves that gives rise to a vulnerability to adverse outcomes after relatively minor stressor events. This is conceptually simple, yet there has been little consensus on the operational definition. The gold standard method to diagnose frailty remains a comprehensive geriatric assessment; however, a variety of validated physical performance measures, judgement-based tools, and multidimensional scales are being applied in critical care, cardiology, and cardiac surgery settings, including open cardiac surgery and transcatheter aortic valve replacement. Frailty is common among patients admitted to the intensive care unit and correlates with an increased risk for adverse events, increased resource use, and less favourable patient-centred outcomes. Analogous findings have been described across selected acute cardiology and cardiac surgical settings, in particular those that commonly intersect with critical care services. The optimal methods for screening and diagnosing frailty across these settings remain an active area of investigation. Routine assessment for frailty conceivably has numerous purported benefits for patients, families, health care providers, and health administrators through better informed decision-making regarding treatments or goals of care, prognosis for survival, expectations for recovery, risk of complications, and expected resource use. In this review, we discuss the measurement of frailty and its utility in patients with critical illness and in cardiology and cardiac surgery settings. PMID:27476983

  16. High Performance Computing Facility Operational Assessment, FY 2011 Oak Ridge Leadership Computing Facility

    SciTech Connect

    Baker, Ann E; Bland, Arthur S Buddy; Hack, James J; Barker, Ashley D; Boudwin, Kathlyn J.; Kendall, Ricky A; Messer, Bronson; Rogers, James H; Shipman, Galen M; Wells, Jack C; White, Julia C

    2011-08-01

    Oak Ridge National Laboratory's Leadership Computing Facility (OLCF) continues to deliver the most powerful resources in the U.S. for open science. At 2.33 petaflops peak performance, the Cray XT Jaguar delivered more than 1.5 billion core hours in calendar year (CY) 2010 to researchers around the world for computational simulations relevant to national and energy security; advancing the frontiers of knowledge in physical sciences and areas of biological, medical, environmental, and computer sciences; and providing world-class research facilities for the nation's science enterprise. Scientific achievements by OLCF users range from collaboration with university experimentalists to produce a working supercapacitor that uses atom-thick sheets of carbon materials to finely determining the resolution requirements for simulations of coal gasifiers and their components, thus laying the foundation for development of commercial-scale gasifiers. OLCF users are pushing the boundaries with software applications sustaining more than one petaflop of performance in the quest to illuminate the fundamental nature of electronic devices. Other teams of researchers are working to resolve predictive capabilities of climate models, to refine and validate genome sequencing, and to explore the most fundamental materials in nature - quarks and gluons - and their unique properties. Details of these scientific endeavors - not possible without access to leadership-class computing resources - are detailed in Section 4 of this report and in the INCITE in Review. Effective operations of the OLCF play a key role in the scientific missions and accomplishments of its users. This Operational Assessment Report (OAR) will delineate the policies, procedures, and innovations implemented by the OLCF to continue delivering a petaflop-scale resource for cutting-edge research. The 2010 operational assessment of the OLCF yielded recommendations that have been addressed (Reference Section 1) and where

  17. User Instructions for the Systems Assessment Capability, Rev. 1, Computer Codes Volume 3: Utility Codes

    SciTech Connect

    Eslinger, Paul W.; Aaberg, Rosanne L.; Lopresti, Charles A.; Miley, Terri B.; Nichols, William E.; Strenge, Dennis L.

    2004-09-14

    This document contains detailed user instructions for the suite of utility codes developed for Rev. 1 of the Systems Assessment Capability.

  18. Static and dynamic critical behavior of a symmetrical binary fluid: a computer simulation.

    PubMed

    Das, Subir K; Horbach, Jürgen; Binder, Kurt; Fisher, Michael E; Sengers, Jan V

    2006-07-14

    A symmetrical binary, A+B Lennard-Jones mixture is studied by a combination of semi-grand-canonical Monte Carlo (SGMC) and molecular dynamics (MD) methods near a liquid-liquid critical temperature T(c). Choosing equal chemical potentials for the two species, the SGMC switches identities (A-->B-->A) to generate well-equilibrated configurations of the system on the coexistence curve for T<T(c) and at the critical concentration, x(c)=1/2, for T>T(c). A finite-size scaling analysis of the concentration susceptibility above T(c) and of the order parameter below T(c) is performed, varying the number of particles from N=400 to 12 800. The data are fully compatible with the expected critical exponents of the three-dimensional Ising universality class. The equilibrium configurations from the SGMC runs are used as initial states for microcanonical MD runs, from which transport coefficients are extracted. Self-diffusion coefficients are obtained from the Einstein relation, while the interdiffusion coefficient and the shear viscosity are estimated from Green-Kubo expressions. As expected, the self-diffusion constant does not display a detectable critical anomaly. With appropriate finite-size scaling analysis, we show that the simulation data for the shear viscosity and the mutual diffusion constant are quite consistent both with the theoretically predicted behavior, including the critical exponents and amplitudes, and with the most accurate experimental evidence. PMID:16848591
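
    As a concrete illustration of one of the transport calculations mentioned above, the following Python sketch estimates a self-diffusion coefficient by fitting the long-time slope of the mean-squared displacement, per the Einstein relation. The trajectory array, frame spacing, and fitting window are illustrative assumptions, not details taken from the paper.

        import numpy as np

        def self_diffusion(positions, dt):
            """Einstein relation in 3D: <|r(t) - r(0)|^2> -> 6 D t at long times.

            positions: unwrapped coordinates, shape (n_frames, n_particles, 3).
            dt: simulation time between stored frames.
            """
            disp = positions - positions[0]                # displacement from t = 0
            msd = (disp ** 2).sum(axis=2).mean(axis=1)     # average over particles
            t = np.arange(len(msd)) * dt
            half = len(t) // 2                             # skip the early ballistic regime
            slope = np.polyfit(t[half:], msd[half:], 1)[0]
            return slope / 6.0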

  19. Development and psychometric properties of a questionnaire to assess barriers to feeding critically ill patients

    PubMed Central

    2013-01-01

    Background To successfully implement the recommendations of critical care nutrition guidelines, one potential approach is to identify barriers to providing optimal enteral nutrition (EN) in the intensive care unit (ICU), and then address these barriers systematically. Therefore, the purpose of this study was to develop a questionnaire to assess barriers to enterally feeding critically ill patients and to conduct preliminary validity testing of the new instrument. Methods The content of the questionnaire was guided by a published conceptual framework, literature review, and consultation with experts. The questionnaire was pre-tested on a convenience sample of 32 critical care practitioners, and then field tested with 186 critical care providers working at 5 hospitals in North America. The revised questionnaire was pilot tested at another ICU (n = 43). Finally, the questionnaire was distributed to a random sample of ICU nurses twice, two weeks apart, to determine test-retest reliability (n = 17). Descriptive statistics, exploratory factor analysis, Cronbach alpha, intraclass correlations (ICC), and kappa coefficients were computed to assess validity and reliability. Results We developed a questionnaire with 26 potential barriers to delivery of EN, asking respondents to rate their importance as barriers in their ICU. Face and content validity of the questionnaire were established through literature review and expert input. The factor analysis indicated a five-factor solution that accounted for 72% of the variance in barriers: guideline recommendations and implementation strategies, delivery of EN to the patient, critical care provider attitudes and behavior, dietitian support, and ICU resources. Overall, the indices of internal reliability for the derived factor subscales and the overall instrument were acceptable (subscale Cronbach alphas range 0.84 – 0.89). However, the test-retest reliability was variable and below acceptable thresholds for the majority of
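
    Of the reliability indices named above, Cronbach's alpha is simple enough to show directly. The Python sketch below computes it from a respondents-by-items score matrix; the function name and data shape are assumptions for illustration, not the authors' analysis code.

        import numpy as np

        def cronbach_alpha(scores):
            """alpha = k/(k-1) * (1 - sum(item variances) / variance(total score))."""
            scores = np.asarray(scores, dtype=float)       # shape: (respondents, items)
            k = scores.shape[1]
            item_var = scores.var(axis=0, ddof=1).sum()    # sum of per-item variances
            total_var = scores.sum(axis=1).var(ddof=1)     # variance of the summed scale
            return k / (k - 1) * (1 - item_var / total_var)

    Values near the reported 0.84-0.89 range indicate strong internal consistency among the items of a subscale.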

  20. Development of a structural health monitoring system for the life assessment of critical transportation infrastructure.

    SciTech Connect

    Roach, Dennis Patrick; Jauregui, David Villegas; Daumueller, Andrew Nicholas

    2012-02-01

    Recent structural failures such as the I-35W Mississippi River Bridge in Minnesota have underscored the urgent need for improved methods and procedures for evaluating our aging transportation infrastructure. This research seeks to develop the basis for a Structural Health Monitoring (SHM) system that provides quantitative information related to the structural integrity of metallic structures, supporting appropriate management decisions and ensuring public safety. It employs advanced structural analysis and nondestructive testing (NDT) methods for an accurate fatigue analysis. Metal railroad bridges in New Mexico will be the focus, since many of these structures are over 100 years old and classified as fracture-critical. The term fracture-critical indicates that failure of a single component may result in complete collapse of the structure, such as the one experienced by the I-35W Bridge. Failure may originate from sources such as loss of section due to corrosion or cracking caused by fatigue loading. Because standard inspection practice is primarily visual, these types of defects can go undetected due to oversight, lack of access to critical areas, or, in riveted members, hidden defects that are beneath fasteners or connection angles. Another issue is that it is difficult to determine the fatigue damage that a structure has experienced and the rate at which damage is accumulating, due to uncertain history and load distribution in supporting members. A SHM system has several advantages that can overcome these limitations. SHM allows critical areas of the structure to be monitored more quantitatively under actual loading. The research needed to apply SHM to metallic structures was performed and a case study was carried out to show the potential of SHM-driven fatigue evaluation to assess the condition of critical transportation infrastructure and to guide inspectors to potential problem areas. This project combines the expertise in transportation infrastructure at New

  1. Documentation of the Ecological Risk Assessment Computer Model ECORSK.5

    SciTech Connect

    Anthony F. Gallegos; Gilbert J. Gonzales

    1999-06-01

    The FORTRAN77 ecological risk computer model--ECORSK.5--has been used to estimate the potential toxicity of surficial deposits of radioactive and non-radioactive contaminants to several threatened and endangered (T and E) species at the Los Alamos National Laboratory (LANL). These analyses to date include preliminary toxicity estimates for the Mexican spotted owl, the American peregrine falcon, the bald eagle, and the southwestern willow flycatcher. This work has been performed as required for the Record of Decision for the construction of the Dual Axis Radiographic Hydrodynamic Test (DARHT) Facility at LANL as part of the Environmental Impact Statement. The model is dependent on the use of the geographic information system and associated software--ARC/INFO--and has been used in conjunction with LANL's Facility for Information Management and Display (FIMAD) contaminant database. The integration of FIMAD data and ARC/INFO using ECORSK.5 allows the generation of spatial information from a gridded area of potential exposure called an Ecological Exposure Unit. ECORSK.5 was used to simulate exposures using a modified Environmental Protection Agency Quotient Method. The model can handle a large number of contaminants within the home range of T and E species. This integration results in the production of hazard indices which, when compared to risk evaluation criteria, estimate the potential for impact from consumption of contaminants in food and ingestion of soil. The assessment is considered a Tier-2 type of analysis. This report summarizes and documents the ECORSK.5 code, the mathematical models used in the development of ECORSK.5, and the input and other requirements for its operation. Other auxiliary FORTRAN 77 codes used for processing and graphing output from ECORSK.5 are also discussed. The reader may refer to reports cited in the introduction to obtain greater detail on past applications of ECORSK.5 and assumptions used in deriving model parameters.
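
    ECORSK.5 itself is a FORTRAN 77 code and is not reproduced here, but the quotient-method arithmetic the abstract describes is compact enough to sketch. In the hypothetical Python fragment below, each hazard quotient is an estimated intake divided by a toxicity reference value, and the hazard index is their sum; the contaminant names and numbers are invented for illustration.

        def hazard_index(doses, trvs):
            """EPA-style quotient method: HQ_i = dose_i / TRV_i; HI = sum of HQs."""
            hqs = {c: doses[c] / trvs[c] for c in doses}   # per-contaminant quotients
            return hqs, sum(hqs.values())

        # Hypothetical intake estimates (mg/kg-day) and reference values for one receptor.
        hqs, hi = hazard_index({"Pb": 0.02, "Cs-137": 0.005},
                               {"Pb": 0.08, "Cs-137": 0.04})
        print(hqs, hi)   # an HI above 1 would flag potential for adverse impact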

  2. Volcanic hazards at distant critical infrastructure: A method for bespoke, multi-disciplinary assessment

    NASA Astrophysics Data System (ADS)

    Odbert, H. M.; Aspinall, W.; Phillips, J.; Jenkins, S.; Wilson, T. M.; Scourse, E.; Sheldrake, T.; Tucker, P.; Nakeshree, K.; Bernardara, P.; Fish, K.

    2015-12-01

    Societies rely on critical services such as power, water, transport networks and manufacturing. Infrastructure may be sited to minimise exposure to natural hazards, but not all hazards can be avoided. The probability of long-range transport of a volcanic plume to a site is comparable to that of other external hazards that must be considered to satisfy safety assessments. Recent advances in numerical models of plume dispersion and stochastic modelling provide a formalized and transparent approach to probabilistic assessment of hazard distribution. To understand the risks to critical infrastructure far from volcanic sources, it is necessary to quantify their vulnerability to different hazard stressors. However, infrastructure assets (e.g. power plants and operational facilities) are typically complex systems in themselves, with interdependent components that may differ in susceptibility to hazard impact. Usually, such complexity means that risk either cannot be estimated formally or that unsatisfactory simplifying assumptions are prerequisite to building a tractable risk model. We present a new approach to quantifying risk by bridging the expertise of physical hazard modellers and infrastructure engineers. We use a joint expert judgment approach to determine hazard model inputs and constrain associated uncertainties. Model outputs are chosen on the basis of engineering or operational concerns. The procedure facilitates an interface between physical scientists, with expertise in volcanic hazards, and infrastructure engineers, with insight into vulnerability to hazards. The result is a joined-up approach to estimating risk from low-probability hazards to critical infrastructure. We describe our methodology and show preliminary results for vulnerability to volcanic hazards at a typical UK industrial facility. We discuss our findings in the context of developing bespoke assessment of hazards from distant sources in collaboration with key infrastructure stakeholders.

  3. Approaches for assessment of vulnerability of critical infrastructures to weather-related hazards

    NASA Astrophysics Data System (ADS)

    Eidsvig, Unni; Uzielli, Marco; Vidar Vangelsten, Bjørn

    2016-04-01

    Critical infrastructures are essential components for modern society to maintain its function, and malfunctioning of one of the critical infrastructure systems may have far-reaching consequences. Climate change may lead to an increase in the frequency and intensity of weather-related hazards, creating challenges for the infrastructures. This paper outlines approaches to assess the vulnerability posed by weather-related hazards to infrastructures. The approaches assess factors that affect the probability of a malfunctioning of the infrastructure should a weather-related threat occur, as well as factors that affect the societal consequences of the infrastructure malfunctioning. Even though vulnerability factors are normally highly infrastructure-specific and hazard-dependent, generic factors can be defined and analyzed. For the vulnerability and resilience of the infrastructure, such factors include e.g. robustness, buffer capacity, protection, quality, age, adaptability and transparency. For the vulnerability of the society in relation to the infrastructure, such factors include e.g. redundancy, substitutes and cascading effects. A semi-quantitative, indicator-based approach is proposed, providing schemes for ranking the most important vulnerability indicators relevant for weather-related hazards on a relative scale. The application of the indicators in a semi-quantitative risk assessment is also demonstrated. In addition, a quantitative vulnerability model is proposed in terms of vulnerability (representing degree of loss) as a function of intensity, which is adaptable to different types of degree of loss (e.g. fraction of infrastructure users that lose their service, fraction of repair costs to full reconstruction costs). The vulnerability model can be calibrated with empirical data using deterministic calibration or a variety of probabilistic calibration approaches to account for the uncertainties within the model. The research leading to these results has received funding
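
    As a rough illustration of the quantitative vulnerability model described above (degree of loss as a function of hazard intensity), the Python sketch below fits a curve to empirical intensity-loss pairs. The logistic functional form, parameter names, and data are assumptions made for illustration; the paper does not prescribe this particular form.

        import numpy as np
        from scipy.optimize import curve_fit

        def vulnerability(intensity, i50, k):
            """Degree of loss in [0, 1] as a logistic function of hazard intensity."""
            return 1.0 / (1.0 + np.exp(-k * (intensity - i50)))

        # Hypothetical calibration data: hazard intensity (e.g. wind speed, m/s)
        # versus observed fraction of infrastructure users losing service.
        obs_i = np.array([10.0, 20.0, 30.0, 40.0, 50.0])
        obs_v = np.array([0.02, 0.10, 0.45, 0.80, 0.95])
        (i50, k), _ = curve_fit(vulnerability, obs_i, obs_v, p0=[30.0, 0.1])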

  4. A Critical Evaluation of the Validity and the Reliability of Global Competency Constructs for Supervisor Assessment of Junior Medical Trainees

    ERIC Educational Resources Information Center

    McGill, D. A.; van der Vleuten, C. P. M.; Clarke, M. J.

    2013-01-01

    Supervisor assessments are critical for both formative and summative assessment in the workplace. Supervisor ratings remain an important source of such assessment in many educational jurisdictions even though there is ambiguity about their validity and reliability. The aims of this evaluation are to explore the: (1) construct validity of ward-based…

  5. Evaluation of critical materials for five advanced design photovoltaic cells with an assessment of indium and gallium

    SciTech Connect

    Watts, R.L.; Gurwell, W.E.; Jamieson, W.M.; Long, L.W.; Pawlewicz, W.T.; Smith, S.A.; Teeter, R.R.

    1980-05-01

    The objective of this study is to identify potential material supply constraints due to the large-scale deployment of five advanced photovoltaic (PV) cell designs, and to suggest strategies to reduce the impacts of these production capacity limitations and potential future material shortages. This report presents the results of the screening of the five following advanced PV cell designs: polycrystalline silicon, amorphous silicon, cadmium sulfide/copper sulfide frontwall, polycrystalline gallium arsenide MIS, and advanced concentrator-500X. Each of these five cells is screened individually assuming that they first come online in 1991, and that 25 GWe of peak capacity is online by the year 2000. A second computer screening assumes that each cell first comes online in 1991 and that each cell has 5 GWe of peak capacity by the year 2000, so that the total online capacity for the five cells is 25 GWe. Based on a review of the preliminary baseline screening results, suggestions were made for varying such parameters as the layer thickness, cell production processes, etc. The resulting PV cell characterizations were then screened again by the CMAP computer code. Earlier DOE sponsored work on the assessment of critical materials in PV cells conclusively identified indium and gallium as warranting further investigation as to their availability. Therefore, this report includes a discussion of the future availability of gallium and indium. (WHK)

  6. A critical evaluation of the predictions of the NASA-Lockheed multielement airfoil computer program

    NASA Technical Reports Server (NTRS)

    Brune, G. W.; Manke, J. W.

    1978-01-01

    Theoretical predictions of several versions of the multielement airfoil computer program are evaluated. The computed results are compared with experimental high lift data of general aviation airfoils with a single trailing edge flap, and of airfoils with a leading edge flap and double slotted trailing edge flaps. Theoretical and experimental data include lift, pitching moment, profile drag and surface pressure distributions, boundary layer integral parameters, skin friction coefficients, and velocity profiles.

  7. Developing "The Critic's Corner": Computer-Assisted Language Learning for Upper-Level Russian Students.

    ERIC Educational Resources Information Center

    Nicholas, Mary A.; Toporski, Neil

    1993-01-01

    A Lehigh University project to develop interactive video materials for use in upper-level Russian language courses is reported. Russian film clips on laser disc are used to improve writing and speaking skills, stimulate students' critical thinking, and encourage students' collaborative learning. (Contains 23 references.) (Author/LB)

  8. Critical Thinking in and through Interactive Computer Hypertext and Art Education

    ERIC Educational Resources Information Center

    Taylor, Pamela G.

    2006-01-01

    As part of a two-year study, Pamela G. Taylor's high school art students constructed hypertext webs that linked the subject matter they studied in class to their own independent research and personal experiences, which in turn encouraged them to think critically about the material. Taylor bases this use of hypertext on the thinking of Paulo Freire…

  9. [Assessment of surgical risk in patients with lower limb chronic critical ischaemia].

    PubMed

    Kazakov, Iu I; Lukin, I B; Sokolova, N Iu; Strakhov, M A

    2016-01-01

    Analysed herein are both the immediate and long-term results of surgical treatment in 93 patients presenting with chronic atherosclerotic occlusion of the femoropopliteal-tibial segment in the stage of critical ischaemia. The patients underwent either autovenous femoropopliteal bypass grafting to an isolated arterial segment or balloon angioplasty with stenting of the superficial femoral artery. When choosing the method of arterial reconstruction, we assessed concomitant diseases, primarily lesions of the coronary and cerebral circulation. To evaluate patient state objectively, we developed a scale for assessing surgical risk. Amputation-free survival at three years amounted to 71.4% in low-risk patients, 60.0% in moderate-risk patients, and 43.3% in high-risk patients. Patients with initially high risk were found to have a high incidence of cardiac and cerebrovascular complications, exceeding 40%. The developed surgical-risk scale was shown to objectively reflect the prognosis for patient survival following a reconstructive operation, and may be useful when choosing the optimal method of arterial reconstruction (bypass grafting or endovascular intervention) in patients with atherosclerotic lesions of the arteries of the femoropopliteal-tibial segment and critical ischaemia accompanied by severe concomitant pathology. Patients at high surgical risk should preferably undergo endovascular reconstruction, those at low risk open bypass grafting, and for those at moderate risk either method of arterial reconstruction is acceptable. PMID:27626262

  10. A nuclear criticality safety assessment of the loss of moderation control in 2 1/2 and 10-ton cylinders containing enriched UF{sub 6}

    SciTech Connect

    Newvahner, R.L.; Pryor, W.A.

    1991-12-31

    Moderation control for maintaining nuclear criticality safety in 2 1/2-ton, 10-ton, and 14-ton cylinders containing enriched uranium hexafluoride (UF{sub 6}) has been used safely within the nuclear industry for over thirty years, and is dependent on cylinder integrity and containment. This assessment evaluates the loss of moderation control by the breaching of containment and entry of water into the cylinders. The first objective of this study was to estimate the amounts of water entering these large UF{sub 6} cylinders required to react with, and to moderate, the uranium compounds sufficiently to cause criticality. Hypothetical accident situations were modeled as a uranyl fluoride (UO{sub 2}F{sub 2}) slab above a UF{sub 6} hemicylinder, and a UO{sub 2}F{sub 2} sphere centered within a UF{sub 6} hemicylinder. These situations were investigated by computational analyses utilizing the KENO V.a Monte Carlo computer code. The results were used to estimate both the masses of water required for criticality, and the limiting masses of water that could be considered safe. The second objective of the assessment was to calculate the time available for emergency control actions before a criticality would occur, i.e., a "safetime", for various sources of water and different size openings in a breached cylinder. In the situations considered, except the case of a fire hose, the safetime appears adequate for emergency control actions. The assessment shows that current practices for handling moderation controlled cylinders of low enriched UF{sub 6}, along with the continuation of established personnel training programs, ensure nuclear criticality safety for routine and emergency operations.

  11. Quality assessment and authentication of virgin olive oil by NMR spectroscopy: a critical review.

    PubMed

    Dais, Photis; Hatzakis, Emmanuel

    2013-02-26

    Nuclear Magnetic Resonance (NMR) spectroscopy has been extensively used for the analysis of olive oil, and it has been established as a valuable tool for assessing its quality and authenticity. To date, a large number of research and review articles have been published with regard to the analysis of olive oil, reflecting the potential of the NMR technique in these studies. In this critical review, we cover recent results in the field and discuss deficiencies and precautions of the three NMR techniques ((1)H, (13)C, (31)P) used for the analysis of olive oil. The two methodological approaches of metabonomics, metabolic profiling and metabolic fingerprinting, and the statistical methods applied for the classification of olive oils are discussed in a critical way. Some useful information about sample preparation, the instrumentation required for effective analysis, the experimental conditions and the data processing needed to obtain high-quality spectra is presented as well. Finally, constructive criticism is offered on the present methodologies used for the quality control and authentication of olive oil. PMID:23410622

  12. Resourcefulness training intervention: assessing critical parameters from relocated older adults' perspectives.

    PubMed

    Bekhet, Abir K; Zauszniewski, Jaclene A; Matel-Anderson, Denise M

    2012-07-01

    The population of American elders is increasing rapidly, and relocation to retirement communities has been found to adversely affect their adjustment. This pilot study of 38 relocated elders evaluated, from the elders' perspectives, six critical parameters of a resourcefulness training (RT) intervention designed to help elders adjust to relocation. Within the context of Zauszniewski's theory of resourcefulness, a pre-/post-test design with random assignment to RT or to diversionary activities (DA) was used. Objective questionnaires measured demographic and relocation factors. An intervention evaluation questionnaire was designed and given to the relocated elders in order to assess the six critical parameters--necessity, acceptability, feasibility, safety, fidelity, and effectiveness. Data concerning the critical parameters were collected during structured interviews within a week after the intervention. Seventy-six percent of the elders scored less than 120 on the resourcefulness scale, indicating a strong need for RT. While all non-white elders reported needing RT, 43% of white elders reported the same need. Elders indicated that learning about the experiences of others and taking part in discussions were the most interesting parts of the RT. Approximately 95% of participants mentioned that they learned all parts of the intervention; a few suggested having a stronger leader to keep the group on track. The qualitative findings from this pilot intervention study will inform future, larger clinical trials to help recently relocated elders adjust to relocation. PMID:22757595

  13. Assessment of Examinations in Computer Science Doctoral Education

    ERIC Educational Resources Information Center

    Straub, Jeremy

    2014-01-01

    This article surveys the examination requirements for attaining degree candidate (candidacy) status in computer science doctoral programs at all of the computer science doctoral granting institutions in the United States. It presents a framework for program examination requirement categorization, and categorizes these programs by the type or types…

  14. Assessment of Computer Self-Efficacy: Instrument Development and Validation.

    ERIC Educational Resources Information Center

    Murphy, Christine A.; And Others

    A 32-item Computer Self-Efficacy Scale (CSE) was developed to measure perceptions of capability regarding specific computer-related knowledge and skills. Bandura's theory of self-efficacy (1986) and Schunk's model of classroom learning (1985) guided the development of the CSE. Each of the skill-related items is preceded by the phrase "I feel…

  15. On the use of shannon entropy of the fission distribution for assessing convergence of Monte Carlo criticality calculations

    SciTech Connect

    Brown, F. B.

    2006-07-01

    Monte Carlo calculations of k-eigenvalue problems are based on a power iteration procedure. To obtain correct results free of contamination from the initial guess for the fission distribution, it is imperative to determine when the iteration procedure has converged, so that a sufficient number of the initial batches are discarded prior to beginning the Monte Carlo tallies. In this paper, we examine the convergence behavior using both theory and numerical testing, demonstrating that k{sub eff} may converge before the fission distribution for problems with a high dominance ratio. Thus, it is necessary to assess convergence of both k{sub eff} and the fission distribution to obtain correct results. To this end, the Shannon entropy of the fission distribution has been found to be a highly effective means of characterizing convergence of the fission distribution. The latest version of MCNP5 includes new capabilities for computing and plotting the Shannon entropy of the fission distribution as an important new tool for assessing problem convergence. Examples of the application of this new tool are presented for a variety of practical criticality problems. (authors)
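
    The entropy diagnostic itself is straightforward: superimpose a mesh on the fissionable regions, bin the fission source sites each batch, and compute H = -sum p_i log2 p_i. The Python sketch below illustrates this with a hypothetical mesh and randomly generated source sites; it is not MCNP5 code.

        import numpy as np

        def shannon_entropy(source_sites, edges):
            """H = -sum_i p_i log2(p_i) over spatial mesh bins of fission source sites."""
            counts, _ = np.histogramdd(source_sites, bins=edges)
            p = counts.ravel() / counts.sum()
            p = p[p > 0]                                   # convention: 0 * log(0) = 0
            return -(p * np.log2(p)).sum()

        # Hypothetical 8x8x8 mesh over a 1-m cube; track H batch by batch and discard
        # the batches generated before H settles onto its converged-source plateau.
        edges = [np.linspace(0.0, 100.0, 9)] * 3
        rng = np.random.default_rng(1)
        sites = rng.uniform(0.0, 100.0, size=(10_000, 3))  # stand-in for source sites
        print(shannon_entropy(sites, edges))               # near log2(512) = 9 if uniform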

  16. Pedagogical Utilization and Assessment of the Statistic Online Computational Resource in Introductory Probability and Statistics Courses

    PubMed Central

    Dinov, Ivo D.; Sanchez, Juana; Christou, Nicolas

    2009-01-01

    Technology-based instruction represents a recent pedagogical paradigm rooted in the realization that new generations are much more comfortable with, and excited about, new technologies. The rapid technological advancement over the past decade has fueled an enormous demand for the integration of modern networking, informational and computational tools with classical pedagogical instruments. Consequently, teaching with technology typically involves utilizing a variety of IT and multimedia resources for online learning, course management, electronic course materials, and novel tools for communication, engagement, experimentation, critical thinking and assessment. The NSF-funded Statistics Online Computational Resource (SOCR) provides a number of interactive tools for enhancing instruction in various undergraduate and graduate courses in probability and statistics. These resources include online instructional materials, statistical calculators, interactive graphical user interfaces, computational and simulation applets, and tools for data analysis and visualization. The tools provided as part of SOCR include conceptual simulations and statistical computing interfaces, which are designed to bridge between the introductory and the more advanced computational and applied probability and statistics courses. In this manuscript, we describe our designs for utilizing SOCR technology in instruction in a recent study. In addition, we present the results on the effectiveness of using SOCR tools at two different course intensity levels on three outcome measures: exam scores, student satisfaction and choice of technology to complete assignments. Learning styles assessment was completed at baseline. We used three very different designs for three different undergraduate classes. Each course included a treatment group, using the SOCR resources, and a control group, using classical instruction techniques. Our findings include marginal effects of the SOCR treatment per individual

  17. Profiling of energy deposition fields in a modular HTHR with annular core: Computational/experimental studies at the ASTRA critical facility

    SciTech Connect

    Boyarinov, V. F.; Garin, V. P.; Glushkov, E. S.; Zimin, A. A.; Kompaniets, G. V.; Nevinitsa, V. A.; Polyakov, D. N.; Ponomarev, A. S.; Ponomarev-Stepnoi, N. N.; Smirnov, O. N.; Fomichenko, P. A.; Chunyaev, E. I.; Marova, E. V.; Sukharev, Yu. P.

    2010-12-15

    The paper presents the results obtained from the computational/experimental studies of the spatial distribution of the {sup 235}U fission reaction rate in a critical assembly with an annular core and poison profiling elements inserted into the inner graphite reflector. The computational analysis was carried out with the codes intended for design computation of an HTHR-type reactor.

  18. QAM: A Competency Based Need Assessment Methodology and Computer Program.

    ERIC Educational Resources Information Center

    Gale, Larrie E.

    A needs assessment methodology is described which can be used (1) to assess the competencies required for functioning in a particular position, (2) to provide data for planning inservice and preservice educational programs, (3) to assess job performance, and (4) to provide information for personnel planners. Quadrants are formed using four…

  19. Identifying Reading Problems with Computer-Adaptive Assessments

    ERIC Educational Resources Information Center

    Merrell, C.; Tymms, P.

    2007-01-01

    This paper describes the development of an adaptive assessment called Interactive Computerised Assessment System (InCAS) that is aimed at children of a wide age and ability range to identify specific reading problems. Rasch measurement has been used to create the equal interval scales that form each part of the assessment. The rationale for the…

  20. Ethical Considerations in the Use of Computers in Psychological Testing and Assessment.

    ERIC Educational Resources Information Center

    Walker, N. William; Myrick, Carolyn Cobb

    1985-01-01

    Ethical considerations in the use of computers in psychological testing and assessment are discussed. Existing ethics and standards that provide guidance to users of computerized test interpretation and report-writing programs are reviewed and guidelines are suggested. Areas of appropriate use of computers in testing and assessment are explored.…

  1. Effects of Computer versus Paper Administration of an Adult Functional Writing Assessment

    ERIC Educational Resources Information Center

    Chen, Jing; White, Sheida; McCloskey, Michael; Soroui, Jaleh; Chun, Young

    2011-01-01

    This study investigated the comparability of paper and computer versions of a functional writing assessment administered to adults 16 and older. Three writing tasks were administered in both paper and computer modes to volunteers in the field test of an assessment of adult literacy in 2008. One set of analyses examined mode effects on scoring by…

  2. Continuance Acceptance of Computer Based Assessment through the Integration of User's Expectations and Perceptions

    ERIC Educational Resources Information Center

    Terzis, Vasileios; Moridis, Christos N.; Economides, Anastasios A.

    2013-01-01

    The Information Systems (IS) community has put considerable effort into identifying constructs that may explain the initial/continuance use of computer based learning or assessment systems. This study is a further step toward understanding IS continuance acceptance in a Computer Based Assessment (CBA) context. Specifically, it aims at the exploration of…

  3. Assessment of Computer Literacy of Secondary School Teachers in Ekiti State, Nigeria

    ERIC Educational Resources Information Center

    Oluwatayo, James Ayodele

    2012-01-01

    The study assessed computer literacy of secondary school teachers in Ekiti State. Three hundred teachers (Male = 150; Female = 150) selected from 30 public schools in 15 out of 16 local government areas participated. The instrument for collecting data was a 25-item Self-Assessment of Computer Literacy questionnaire and each item was rated on a…

  4. Computer-Assisted Assessment in Higher Education. Staff and Educational Development Series.

    ERIC Educational Resources Information Center

    Brown, Sally, Ed.; Race, Phil, Ed.; Bull, Joanna, Ed.

    This book profiles how computer-assisted assessment can help both staff and students by drawing on the experience and expertise of practitioners, in the United Kingdom and internationally, who are already using computer-assisted assessment. The publication is organized into three main sections--"Pragmatics and Practicalities of CAA," "Using CAA for…

  5. Critical Factors Affecting the Assessment of Student Learning Outcomes: A Delphi Study of the Opinions of Community College Personnel

    ERIC Educational Resources Information Center

    Somerville, Jerry

    2008-01-01

    The purpose of this qualitative study was to identify critically important factors that affect the meaningful assessment of student learning outcomes and study why these factors were critically important. A three-round Delphi process was used to solicit the opinions of individuals who were actively involved in student learning outcomes assessment…

  6. A critical analysis of mini peer assessment tool (mini-PAT).

    PubMed

    Abdulla, Aza

    2008-01-01

    The structured evaluation of doctors' performance through peer review is a relatively new phenomenon brought about by public demand for accountability to patients. Medical knowledge (as assessed by examination score) is no longer a good predictor of individual performance, humanistic qualities and communication skills. The process of peer review (or multi-source assessment) was developed over the last two decades in the USA and has started to pick up momentum in the UK through the introduction of Modernizing Medical Careers. However, the concept is not new. Driven by market forces, it was initially developed by industrial organizations to improve leadership qualities with a view to increasing productivity through positive behaviour change and self-awareness. Multi-source feedback is not without its problems and may not always produce its desired outcomes. In this article we review the evidence for peer review and critically discuss the current process of the mini peer assessment tool (mini-PAT) as the assessment tool for peer review employed in the UK. PMID:18263910

  7. Implementing Computer Algebra Enabled Questions for the Assessment and Learning of Mathematics

    ERIC Educational Resources Information Center

    Sangwin, Christopher J.; Naismith, Laura

    2008-01-01

    We present principles for the design of an online system to support computer algebra enabled questions for use within the teaching and learning of mathematics in higher education. The introduction of a computer algebra system (CAS) into a computer aided assessment (CAA) system affords sophisticated response processing of student provided answers.…

  8. Developing Computer Model-Based Assessment of Chemical Reasoning: A Feasibility Study

    ERIC Educational Resources Information Center

    Liu, Xiufeng; Waight, Noemi; Gregorius, Roberto; Smith, Erica; Park, Mihwa

    2012-01-01

    This paper reports a feasibility study on developing computer model-based assessments of chemical reasoning at the high school level. The computer models are Flash and NetLogo environments that make three domains of chemistry simultaneously available: macroscopic, submicroscopic, and symbolic. Students interact with the computer models to answer assessment…

  9. Ebola preparedness: a rapid needs assessment of critical care in a tertiary hospital

    PubMed Central

    Sutherland, Stephanie; Robillard, Nicholas; Kim, John; Dupuis, Kirsten; Thornton, Mary; Mansour, Marlene; Cardinal, Pierre

    2015-01-01

    Background: The current outbreak of Ebola has been declared a public health emergency of international concern. We performed a rigorous and rapid needs assessment to identify the desired results, the gaps in current practice, and the barriers and facilitators to the development of solutions in the provision of critical care to patients with suspected or confirmed Ebola. Methods: We conducted a qualitative study with an emergent design at a tertiary hospital in Ontario, Canada, recently designated as an Ebola centre, from Oct. 21 to Nov. 7, 2014. Participants included physicians, nurses, respiratory therapists, and staff from infection control, housekeeping, waste management, administration, facilities, and occupational health and safety. Data collection included document analysis, focus groups, interviews and walk-throughs of critical care areas with key stakeholders. Results: Fifteen themes and 73 desired results were identified, of which 55 had gaps. During the study period, solutions were implemented to fully address 8 gaps and partially address 18 gaps. Themes identified included the following: screening; response team activation; personal protective equipment; postexposure to virus; patient placement, room setup, logging and signage; intrahospital patient movement; interhospital patient movement; critical care management; Ebola-specific diagnosis and treatment; critical care staffing; visitation and contacts; waste management, environmental cleaning and management of linens; postmortem; conflict resolution; and communication. Interpretation: This investigation identified widespread gaps across numerous themes; as such, we have been able to develop a set of credible and measurable results. All hospitals need to be prepared for contact with a patient with Ebola, and the preparedness plan will need to vary based on local context, resources and site designation. PMID:26389098

  10. Cone beam computed tomography radiation dose and image quality assessments.

    PubMed

    Lofthag-Hansen, Sara

    2010-01-01

    Diagnostic radiology has undergone profound changes in the last 30 years. New technologies are available to the dental field, cone beam computed tomography (CBCT) being one of the most important. CBCT is a catch-all term for a technology comprising a variety of machines differing in many respects: patient positioning, volume size (FOV), radiation quality, image capturing and reconstruction, image resolution and radiation dose. When a new technology is introduced, one must make sure that its diagnostic accuracy is better than, or at least as good as, that of the technique it can be expected to replace. Two versions of one CBCT brand, Accuitomo (Morita, Japan), were tested: 3D Accuitomo with an image intensifier as detector, FOV 3 cm x 4 cm, and 3D Accuitomo FPD with a flat panel detector, FOVs 4 cm x 4 cm and 6 cm x 6 cm. The 3D Accuitomo was compared with intra-oral radiography for endodontic diagnosis in 35 patients with 46 teeth analyzed, of which 41 were endodontically treated. Three observers assessed the images by consensus. The result showed that CBCT imaging was superior, with a higher number of teeth diagnosed with periapical lesions (42 vs 32 teeth). When evaluating 3D Accuitomo examinations in the posterior mandible in 30 patients, visibility of the marginal bone crest and mandibular canal, important anatomic structures for implant planning, was high, with good observer agreement among seven observers. Radiographic techniques have to be evaluated with respect to radiation dose, which requires well-defined and easy-to-use methods. Two methods, the CT dose index (CTDI), the prevailing method for CT units, and the dose-area product (DAP), were evaluated for calculating effective dose (E) for both units. An asymmetric dose distribution was revealed when a clinical situation was simulated. Hence, the CTDI method was not applicable for these units with small FOVs. Based on DAP values from 90 patient examinations, effective dose was estimated for three diagnostic tasks: implant planning in posterior mandible and

  11. Self-Regulation of Learning within Computer-Based Learning Environments: A Critical Analysis

    ERIC Educational Resources Information Center

    Winters, Fielding I.; Greene, Jeffrey A.; Costich, Claudine M.

    2008-01-01

    Computer-based learning environments (CBLEs) present important opportunities for fostering learning; however, studies have shown that students have difficulty when learning with these environments. Research has identified that students' self-regulatory learning (SRL) processes may mediate the hypothesized positive relations between CBLEs and…

  12. Fostering Critical Reflection in a Computer-Based, Asynchronously Delivered Diversity Training Course

    ERIC Educational Resources Information Center

    Givhan, Shawn T.

    2013-01-01

    This dissertation study chronicles the creation of a computer-based, asynchronously delivered diversity training course for a state agency. The course format enabled efficient delivery of a mandatory curriculum to the Massachusetts Department of State Police workforce. However, the asynchronous format posed a challenge to achieving the learning…

  13. Embodying Our Values in Our Teaching Practices: Building Open and Critical Discourse through Computer Mediated Communication.

    ERIC Educational Resources Information Center

    Geelan, David R.; Taylor, Peter C.

    2001-01-01

    Describes the use of computer-mediated communication to develop a cooperative learning community among students in a Web-based distance education unit for practicing science and mathematics educators in Australia and Pacific Rim countries. Discusses use of the social constructivist and constructionist conceptions of teaching and learning.…

  14. Embodying Our Values in Our Teaching Practices: Building Open and Critical Discourse through Computer Mediated Communication

    ERIC Educational Resources Information Center

    Geelan, David R.; Taylor, Peter C.

    2004-01-01

    Computer mediated communication--including web pages, email and web-based bulletin boards--was used to support the development of a cooperative learning community among students in a web-based distance education unit for practicing science and mathematics educators. The students lived in several Australian states and a number of Pacific Rim…

  15. Evaluating How the Computer-Supported Collaborative Learning Community Fosters Critical Reflective Practices

    ERIC Educational Resources Information Center

    Ma, Ada W.W.

    2013-01-01

    In recent research, little attention has been paid to issues of methodology and analysis methods to evaluate the quality of the collaborative learning community. To address such issues, an attempt is made to adopt the Activity System Model as an analytical framework to examine the relationship between computer supported collaborative learning…

  16. The Use of Computer Technology in University Teaching and Learning: A Critical Perspective

    ERIC Educational Resources Information Center

    Selwyn, N.

    2007-01-01

    Despite huge efforts to position information and communication technology (ICT) as a central tenet of university teaching and learning, the fact remains that many university students and faculty make only limited formal academic use of computer technology. Whilst this is usually attributed to a variety of operational deficits on the part of…

  17. The statistical-thermodynamic basis for computation of binding affinities: a critical review.

    PubMed Central

    Gilson, M K; Given, J A; Bush, B L; McCammon, J A

    1997-01-01

    Although the statistical thermodynamics of noncovalent binding has been considered in a number of theoretical papers, few methods of computing binding affinities are derived explicitly from this underlying theory. This has contributed to uncertainty and controversy in certain areas. This article therefore reviews and extends the connections of some important computational methods with the underlying statistical thermodynamics. A derivation of the standard free energy of binding forms the basis of this review. This derivation should be useful in formulating novel computational methods for predicting binding affinities. It also permits several important points to be established. For example, it is found that the double-annihilation method of computing binding energy does not yield the standard free energy of binding, but can be modified to yield this quantity. The derivation also makes it possible to define clearly the changes in translational, rotational, configurational, and solvent entropy upon binding. It is argued that molecular mass has a negligible effect upon the standard free energy of binding for biomolecular systems, and that the cratic entropy defined by Gurney is not a useful concept. In addition, the use of continuum models of the solvent in binding calculations is reviewed, and a formalism is presented for incorporating a limited number of solvent molecules explicitly. PMID:9138555
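
    For orientation, the central quantity of the review can be stated compactly. The display below gives the standard textbook relation between the binding constant and the standard free energy of binding, where C° is the 1 M standard concentration that makes the logarithm's argument dimensionless; it is quoted here as background, not as a reproduction of the paper's derivation.

        \Delta G^{\circ}_{\mathrm{bind}} \;=\; -RT \,\ln\!\left( C^{\circ} K_{\mathrm{bind}} \right),
        \qquad
        K_{\mathrm{bind}} \;=\; \frac{[\mathrm{AB}]}{[\mathrm{A}]\,[\mathrm{B}]}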

  18. Selecting an Architecture for a Safety-Critical Distributed Computer System with Power, Weight and Cost Considerations

    NASA Technical Reports Server (NTRS)

    Torres-Pomales, Wilfredo

    2014-01-01

    This report presents an example of the application of multi-criteria decision analysis to the selection of an architecture for a safety-critical distributed computer system. The design problem includes constraints on minimum system availability and integrity, and the decision is based on the optimal balance of power, weight and cost. The analysis process includes the generation of alternative architectures, evaluation of individual decision criteria, and the selection of an alternative based on overall value. In the example presented here, iterative application of the quantitative evaluation process made it possible to deliberately generate an alternative architecture that is superior to all others regardless of the relative importance of cost.
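
    The report's actual criteria, weights, and alternatives are not reproduced here, but the aggregation step in this kind of multi-criteria analysis is easy to sketch. The Python fragment below shows a generic weighted-sum valuation over normalized criterion scores, with hypothetical alternatives and weights; candidates violating the availability or integrity constraints would be filtered out before scoring.

        def overall_value(scores, weights):
            """Weighted-sum aggregation over criterion scores normalized to [0, 1]."""
            assert abs(sum(weights.values()) - 1.0) < 1e-9
            return sum(weights[c] * scores[c] for c in weights)

        # Hypothetical alternatives scored on power, weight, and cost (higher = better).
        weights = {"power": 0.3, "weight": 0.3, "cost": 0.4}
        alternatives = {
            "dual self-checking pair": {"power": 0.7, "weight": 0.6, "cost": 0.8},
            "triple modular redundancy": {"power": 0.5, "weight": 0.4, "cost": 0.6},
        }
        best = max(alternatives, key=lambda a: overall_value(alternatives[a], weights))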

  19. Quantitative physical models of volcanic phenomena for hazards assessment of critical infrastructures

    NASA Astrophysics Data System (ADS)

    Costa, Antonio

    2016-04-01

    Volcanic hazards may have destructive effects on economy, transport, and natural environments at both local and regional scale. Hazardous phenomena include pyroclastic density currents, tephra fall, gas emissions, lava flows, debris flows and avalanches, and lahars. Volcanic hazards assessment is based on available information to characterize potential volcanic sources in the region of interest and to determine whether specific volcanic phenomena might reach a given site. Volcanic hazards assessment is focussed on estimating the distances that volcanic phenomena could travel from potential sources and their intensity at the considered site. Epistemic and aleatory uncertainties strongly affect the resulting hazards assessment. Within the context of critical infrastructures, volcanic eruptions are rare natural events that can create severe hazards. In addition to being rare events, evidence of many past volcanic eruptions is poorly preserved in the geologic record. The models used for describing the impact of volcanic phenomena generally represent a range of model complexities, from simplified physics based conceptual models to highly coupled thermo fluid dynamical approaches. Modelling approaches represent a hierarchy of complexity, which reflects increasing requirements for well characterized data in order to produce a broader range of output information. In selecting models for the hazard analysis related to a specific phenomenon, questions that need to be answered by the models must be carefully considered. Independently of the model, the final hazards assessment strongly depends on input derived from detailed volcanological investigations, such as mapping and stratigraphic correlations. For each phenomenon, an overview of currently available approaches for the evaluation of future hazards will be presented with the aim to provide a foundation for future work in developing an international consensus on volcanic hazards assessment methods.

  20. Computer Simulation as a Tool for Assessing Decision-Making in Pandemic Influenza Response Training

    PubMed Central

    Leaming, James M.; Adoff, Spencer; Terndrup, Thomas E.

    2013-01-01

    Introduction: We sought to develop and test a computer-based, interactive simulation of a hypothetical pandemic influenza outbreak. Fidelity was enhanced with integrated video and branching decision trees, built upon the 2007 federal planning assumptions. We conducted a before-and-after study to assess the simulation's ability to gauge participants' beliefs regarding their own hospitals' mass casualty incident preparedness. Methods: Development: Using a Delphi process, we finalized a simulation that serves up more than 50 key decisions to 6 role-players on networked laptops in a conference area. The simulation played out an 8-week scenario, beginning with pre-incident decisions. Testing: Role-players and trainees (N=155) were facilitated to make decisions during the pandemic. Because decision responses vary, the simulation plays out differently each time, and a casualty counter quantifies hypothetical losses. The facilitator reviews and critiques key factors for casualty control, including effective communications, working with external organizations, development of internal policies and procedures, maintaining supplies and services, technical infrastructure support, public relations and training. Pre- and post-survey data were compared for trainees. Results: Post-simulation, trainees indicated a greater likelihood of needing to improve their organization in terms of communications, mass casualty incident planning, public information and training. Participants also recognized which key factors required immediate attention at their own home facilities. Conclusion: The use of a computer simulation was effective in providing a facilitated environment for determining the perception of preparedness, evaluating general preparedness concepts and introducing participants to critical decisions involved in handling a regional pandemic influenza surge. PMID:23687542

  1. Development and assessment of a clinically viable system for breast ultrasound computer-aided diagnosis

    NASA Astrophysics Data System (ADS)

    Gruszauskas, Nicholas Peter

    The chances of surviving a breast cancer diagnosis, as well as the effectiveness of any potential treatments, increase significantly with early detection of the disease. As such, a considerable amount of research is being conducted to augment the breast cancer detection and diagnosis process. One such area of research involves the investigation and application of sophisticated computer algorithms to assist clinicians in detecting and diagnosing breast cancer on medical images (termed generally "computer-aided diagnosis" or CAD). This study investigated a previously developed breast ultrasound CAD system with the intent of translating it into a clinically viable system. While past studies have demonstrated that breast ultrasound CAD may be a beneficial aid during the diagnosis of breast cancer on ultrasound, there are no investigations concerning its potential clinical translation, and there are currently no commercially available implementations of such systems. This study "bridges the gap" between the laboratory-developed system and the steps necessary for clinical implementation. A novel observer study was conducted that mimicked the clinical use of the breast ultrasound CAD system in order to assess its impact on the diagnostic performance of the user. Several robustness studies were also performed: the sonographic features used by the system were evaluated and the databases used for calibration and testing were characterized; the effect of the user's input was assessed by evaluating the performance of the system under variations in lesion identification and image selection; and the performance of the system on different patient populations was investigated by evaluating it on a database consisting solely of patients of Asian ethnicity. The analyses performed here indicate that the breast ultrasound CAD system under investigation is robust and demonstrates only minor variability when subjected to "real-world" use. All of these results are

  2. Bound on quantum computation time: Quantum error correction in a critical environment

    SciTech Connect

    Novais, E.; Mucciolo, Eduardo R.; Baranger, Harold U.

    2010-08-15

    We obtain an upper bound on the time available for quantum computation for a given quantum computer and decohering environment with quantum error correction implemented. First, we derive an explicit quantum evolution operator for the logical qubits and show that it has the same form as that for the physical qubits but with a reduced coupling strength to the environment. Using this evolution operator, we find the trace distance between the real and ideal states of the logical qubits in two cases. For a super-Ohmic bath, the trace distance saturates, while for Ohmic or sub-Ohmic baths, there is a finite time before the trace distance exceeds a value set by the user.
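
    The figure of merit here, the trace distance between the real and ideal logical-qubit states, has the standard textbook definition (general quantum information, not specific to this paper):

        \[
          D(\rho, \rho_{\mathrm{ideal}})
            = \tfrac{1}{2}\,\mathrm{Tr}\,\bigl|\rho - \rho_{\mathrm{ideal}}\bigr|,
          \qquad |A| \equiv \sqrt{A^{\dagger}A},
        \]

    so the bound on computation time is the time at which D first exceeds the user-chosen threshold for Ohmic or sub-Ohmic baths.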

  3. Ensuring critical event sequences in high consequence computer based systems as inspired by path expressions

    SciTech Connect

    Kidd, M.E.C.

    1997-02-01

    The goal of our work is to provide a high level of confidence that critical software driven event sequences are maintained in the face of hardware failures, malevolent attacks and harsh or unstable operating environments. This will be accomplished by providing dynamic fault management measures directly to the software developer and to their varied development environments. The methodology employed here is inspired by previous work in path expressions. This paper discusses the perceived problems, a brief overview of path expressions, the proposed methods, and a discussion of the differences between the proposed methods and traditional path expression usage and implementation.
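
    As a hedged illustration of the idea, classic path expressions constrain event orderings with patterns such as "path open; (read | write)*; close end" (Campbell-Habermann notation); the sketch below enforces that ordering at run time. The event names and the monitor design are illustrative only, not the paper's implementation.

        # Minimal sketch of run-time enforcement of a critical event sequence,
        # in the spirit of path expressions: open ; (read | write)* ; close.
        ALLOWED = {
            "start": {"open"},
            "open":  {"read", "write", "close"},
            "read":  {"read", "write", "close"},
            "write": {"read", "write", "close"},
            "close": set(),  # terminal state: no further events allowed
        }

        class SequenceViolation(Exception):
            pass

        class SequenceMonitor:
            def __init__(self):
                self.state = "start"

            def fire(self, event):
                if event not in ALLOWED[self.state]:
                    raise SequenceViolation(f"{event!r} illegal after {self.state!r}")
                self.state = event

        m = SequenceMonitor()
        for e in ["open", "write", "read", "close"]:
            m.fire(e)  # accepted in this order
        # SequenceMonitor().fire("read") would raise: 'read' illegal after 'start'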

  4. A critical review of environmental assessment tools for sustainable urban design

    SciTech Connect

    Ameen, Raed Fawzi Mohammed; Mourshed, Monjur; Li, Haijiang

    2015-11-15

    Cities are responsible for the depletion of natural resources and agricultural lands, and 70% of global CO₂ emissions. There are significant risks to cities from the impacts of climate change in addition to existing vulnerabilities, primarily because of rapid urbanization. Urban design and development are generally considered the instruments that shape the future of a city; they determine the pattern of a city's resource usage and its resilience to change, climatic or otherwise. Cities are inherently dynamic and require the participation and engagement of their diverse stakeholders for the effective management of change, which enables wider stakeholder involvement and buy-in at various stages of the development process. Sustainability assessment of urban design and development is increasingly seen as indispensable for informed decision-making. A sustainability assessment tool also acts as a driver for the uptake of sustainable pathways by recognizing excellence through its rating system and by creating a market demand for sustainable products and processes. This research reviews six widely used sustainability assessment tools for urban design and development: BREEAM Communities, LEED-ND, CASBEE-UD, SBTool^PT–UP, Pearl Community Rating System (PCRS) and GSAS/QSAS, to identify, compare and contrast their aim, structure, assessment methodology, scoring, weighting and suitability for application in different geographical contexts. Strengths and weaknesses of each tool are critically discussed. The study highlights the disparity between local and international contexts for global sustainability assessment tools. Despite their similar aims regarding environmental aspects, differences exist in the relative importance and share of mandatory vs optional indicators in both the environmental and social dimensions. PCRS and GSAS/QSAS are new incarnations, but have widely varying shares of mandatory indicators, at 45.4% and 11.36% respectively, compared to 30% in

  5. The Sixth Rhino: A Taxonomic Re-Assessment of the Critically Endangered Northern White Rhinoceros

    PubMed Central

    Groves, Colin P.; Fernando, Prithiviraj; Robovský, Jan

    2010-01-01

    Background The two forms of white rhinoceros, northern and southern, have had contrasting conservation histories. The northern form, once fairly numerous, is now critically endangered, while the southern form has recovered from a few individuals to a population of a few thousand. Since their last taxonomic assessment over three decades ago, new material and analytical techniques have become available, necessitating a review of available information and a re-assessment of the taxonomy. Results Dental morphology and cranial anatomy clearly diagnosed the southern and northern forms. The differentiation was well supported by dental metrics, cranial growth and craniometry, and corresponded with differences in the post-cranial skeleton, external measurements and external features. No distinctive differences were found in the limited descriptions of their behavior and ecology. Fossil history indicated the antiquity of the genus, dating back at least to the early Pliocene, and evolution into a number of diagnosable forms. The fossil skulls examined fell outside the two extant forms in the craniometric analysis. Genetic divergence between the two forms was consistent across both nuclear and mitochondrial genomes, and indicated a separation of over a million years. Conclusions On re-assessing the taxonomy of the two forms we find them to be morphologically and genetically distinct, warranting recognition of the taxa formerly designated as subspecies, Ceratotherium simum simum (the southern form) and Ceratotherium simum cottoni (the northern form), as two distinct species, Ceratotherium simum and Ceratotherium cottoni respectively. The recognition of the northern form as a distinct species has profound implications for its conservation. PMID:20383328

  6. Validity Evidence for a Computer-Based Alternate Assessment Instrument

    ERIC Educational Resources Information Center

    Dyehouse, Melissa A.; Bennett, Deborah E.

    2006-01-01

    This study investigated the validity of a statewide alternate assessment program, IASEP (Indiana Assessment System of Educational Proficiencies) by examining supporting profile patterns on the 100 core IASEP items of individuals with significant disabilities. Participants were 5,192 students ranging in age from 7-21 years with special education…

  7. Computer Assisted Project-Based Instruction: The Effects on Science Achievement, Computer Achievement and Portfolio Assessment

    ERIC Educational Resources Information Center

    Erdogan, Yavuz; Dede, Dinçer

    2015-01-01

    The purpose of this study is to compare the effects of computer assisted project-based instruction on learners' achievement in a science and technology course, in a computer course and in portfolio development. With this aim in mind, a quasi-experimental design was used and a sample of 70 seventh grade secondary school students from Org. Esref…

  8. Current Assessment and Classification of Suicidal Phenomena using the FDA 2012 Draft Guidance Document on Suicide Assessment: A Critical Review

    PubMed Central

    Giddens, Jennifer M.; Sheehan, Kathy Harnett

    2014-01-01

    Objective: Standard international classification criteria require that classification categories be comprehensive to avoid type II error. Categories should be mutually exclusive and definitions should be clear and unambiguous (to avoid type I and type II errors). In addition, the classification system should be robust enough to last over time and provide comparability between data collections. This article was designed to evaluate the extent to which the classification system contained in the United States Food and Drug Administration 2012 Draft Guidance for the prospective assessment and classification of suicidal ideation and behavior in clinical trials meets these criteria. Method: A critical review is used to assess the extent to which the proposed categories contained in the Food and Drug Administration 2012 Draft Guidance are comprehensive, unambiguous, and robust. Assumptions that underlie the classification system are also explored. Results: The Food and Drug Administration classification system contained in the 2012 Draft Guidance does not capture the full range of suicidal ideation and behavior (type II error). Definitions, moreover, are frequently ambiguous (susceptible to multiple interpretations), and the potential for misclassification (type I and type II errors) is compounded by frequent mismatches in category titles and definitions. These issues have the potential to compromise data comparability within clinical trial sites, across sites, and over time. Conclusion: These problems need to be remedied because of the potential for flawed data output and consequent threats to public health, to research on the safety of medications, and to the search for effective medication treatments for suicidality. PMID:25520889

  9. Critical anatomic region of nasopalatine canal based on tridimensional analysis: cone beam computed tomography

    PubMed Central

    Fernández-Alonso, Ana; Antonio Suárez-Quintanilla, Juan; Muinelo-Lorenzo, Juan; Varela-Mallou, Jesús; Smyth Chamosa, Ernesto; Mercedes Suárez-Cunqueiro, María

    2015-01-01

    The aim of this study was to define the critical anatomic region of the premaxilla by evaluating the dimensions of the nasopalatine canal (NC), buccal bone plate (BBP) and palatal bone plate (PBP). 230 CBCTs were selected with both, one, or no upper central incisors present (+/+, −/+, −/−), and periodontal condition was evaluated. Student's t-test, ANOVA, Pearson's correlation and a multivariate linear regression model (MLRM) were used. Regarding gender, significant differences at level 1 (lower NC) were found for buccal-palatal, transversal and sagittal NC diameters, and NC length (NCL). Regarding dental status, significant differences were found for total BBP length (tBL) and PBP width (PW2) at level 2 (NCL midpoint). NCL was correlated with PW2, tBL, and PBP length at level 3 (foramina of Stenson level). An MLRM had a high prediction value for NCL (69.3%). Gender is related to NC dimensions. Dental status has an influence on BBP dimensions, but does not influence NC or PBP. Periodontal condition should be evaluated for precise premaxilla analysis. NC diameters at the three anatomical planes are related to each other, while NCL is related to BBP and PBP lengths. A third of the premaxilla is taken up by the NC, thus establishing the critical anatomic region. PMID:26245884

  10. A Model for Computer-based Assessment: The Catherine Wheel Principle.

    ERIC Educational Resources Information Center

    Zakrzewski, Stan; Steven, Christine

    2000-01-01

    This paper proposes a model for computer-based assessment systems that utilizes a step-wise approach to assessment design and implementation, within which the management and assessment of operational, technical, pedagogic, and financial risks are made explicit. The cyclic model has five components: planning, risk analysis and management,…

  11. Authoring of Adaptive Computer Assisted Assessment of Free-Text Answers

    ERIC Educational Resources Information Center

    Alfonseca, Enrique; Carro, Rosa M.; Freire, Manuel; Ortigosa, Alvaro; Perez, Diana; Rodriguez, Pilar

    2005-01-01

    Adaptation techniques can be applied not only to the multimedia contents or navigational possibilities of a course, but also to the assessment. In order to facilitate the authoring of adaptive free-text assessment and its integration within adaptive web-based courses, Adaptive Hypermedia techniques and Free-text Computer Assisted Assessment are…

  12. Assessment of left ventricular ejection fraction using an ultrasonic stethoscope in critically ill patients

    PubMed Central

    2012-01-01

    Introduction Assessment of cardiac function is key in the management of intensive care unit (ICU) patients and frequently relies on the use of standard transthoracic echocardiography (TTE). A commercially available new-generation ultrasound system with two-dimensional imaging capability, which is roughly the size of a mobile phone, is well suited to extending the physical examination. The primary endpoint of this study was to evaluate the additional value of this new miniaturized device, used as an ultrasonic stethoscope (US), for the determination of left ventricular (LV) systolic function, when compared to conventional clinical assessment by experienced intensivists. The secondary endpoint was to validate the US against TTE for the semi-quantitative assessment of left ventricular ejection fraction (LVEF) in ICU patients. Methods In this single-center prospective descriptive study, LVEF was independently assessed clinically by the attending physician and echocardiographically by two experienced intensivists trained in critical care echocardiography who used the US (size: 135 × 73 × 28 mm; weight: 390 g) and TTE. LVEF was visually estimated semi-quantitatively and classified into one of the following categories: increased (LVEF > 75%), normal (LVEF: 50 to 75%), moderately reduced (LVEF: 30 to 49%), or severely reduced (LVEF < 30%). Biplane LVEF measured using Simpson's rule on TTE loops by an independent investigator was used as the reference. Results A total of 94 consecutive patients were studied (age: 60 ± 17 years; simplified acute physiologic score 2: 41 ± 15), 63 being mechanically ventilated and 36 receiving vasopressors and/or inotropes. Diagnostic concordance between the clinically estimated LVEF and biplane LVEF was poor (Kappa: 0.33; 95% CI: 0.16 to 0.49) and only slightly improved by knowledge of a previously determined LVEF value (Kappa: 0.44; 95% CI: 0.22 to 0.66). In contrast, the diagnostic agreement was good between visually assessed LVEF
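
    The four semi-quantitative LVEF categories used in the study map directly onto a threshold function; a minimal sketch, with the thresholds taken verbatim from the abstract:

        # LVEF categories as defined in the study above.
        def classify_lvef(lvef_percent: float) -> str:
            if lvef_percent > 75:
                return "increased"
            if lvef_percent >= 50:
                return "normal"              # 50 to 75%
            if lvef_percent >= 30:
                return "moderately reduced"  # 30 to 49%
            return "severely reduced"        # < 30%

        assert classify_lvef(62) == "normal"
        assert classify_lvef(28) == "severely reduced"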

  13. A systematic review and critical assessment of incentive strategies for discovery and development of novel antibiotics.

    PubMed

    Renwick, Matthew J; Brogan, David M; Mossialos, Elias

    2016-02-01

    Despite the growing threat of antimicrobial resistance, pharmaceutical and biotechnology firms are reluctant to develop novel antibiotics because of a host of market failures. This problem is complicated by public health goals that demand antibiotic conservation and equitable patient access. Thus, an innovative incentive strategy is needed to encourage sustainable investment in antibiotics. This systematic review consolidates, classifies and critically assesses a total of 47 proposed incentives. Given the large number of possible strategies, a decision framework is presented to assist with the selection of incentives. This framework focuses on addressing market failures that result in limited investment, public health priorities regarding antibiotic stewardship and patient access, and implementation constraints and operational realities. The flexible nature of this framework allows policy makers to tailor an antibiotic incentive package that suits a country's health system structure and needs. PMID:26464014

  14. A Tool for Music Preference Assessment in Critically Ill Patients Receiving Mechanical Ventilatory Support

    PubMed Central

    CHLAN, LINDA; HEIDERSCHEIT, ANNIE

    2010-01-01

    Music is an ideal intervention to reduce anxiety and promote relaxation in critically ill patients. This article reviews the research studies on music-listening interventions to manage distressful symptoms in this population, and describes the development and implementation of the Music Assessment Tool (MAT) to assist professionals in ascertaining patients’ music preferences in the challenging, dynamic clinical environment of the intensive care unit (ICU). The MAT is easy to use with these patients who experience profound communication challenges due to fatigue and inability to speak because of endotracheal tube placement. The music therapist and ICU nursing staff are encouraged to work collaboratively to implement music in a personalized manner to ensure the greatest benefit for mechanically ventilated patients. PMID:24489432

  15. A comparative and critical evaluation of odour assessment methods on a landfill site

    NASA Astrophysics Data System (ADS)

    Capelli, Laura; Sironi, Selena; Del Rosso, Renato; Céntola, Paolo; Il Grande, Massimiliano

    This paper discusses the application of three different odour characterization techniques, i.e. chemical analyses, dynamic olfactometry and electronic noses, for the assessment of odour emissions from a landfill site. The results of the chemical analyses, which are useful for determining the chemical composition of odours, show no correlation with the odour concentration values measured by dynamic olfactometry. Olfactometric analyses made it possible to measure odour concentration and thereby quantify the sensory impact of odours. Finally, continuous ambient air monitoring by electronic noses allowed quantification of the percentage of time during which odours are detected at the landfill boundaries and at a receptor, which always turned out to be lower than 15%. This study represents a critical review of employing three different odour characterization methods for differing reasons on the same site, showing that, whilst the results do not necessarily correlate, they each have an intrinsic value, thereby demonstrating the complexity of environmental odour measurement.

  16. Taste-masking assessment of solid oral dosage forms--a critical review.

    PubMed

    Pein, Miriam; Preis, Maren; Eckert, Carolin; Kiene, Florian E

    2014-04-25

    Approaches to improve the taste of oral dosage forms that contain unpleasant tasting drugs are versatile. Likewise, the analytical in vitro and in vivo methods to assess taste-masking efficacy are diverse. Taste-masking has gained in importance since the EU legislation on medicines for children came into force in 2007, and taste-masking attributes are often required by regulatory authorities. However, standardized guidance for the analytical evaluation is still poor. Published protocols rarely consider real conditions, such as the volume of saliva or the residence time of solid oral dosage forms in the mouth. Methodological limitations and problems regarding time point of evaluation, sampling or sample pretreatment are hardly ever addressed. This critical review aims to evaluate and discuss published strategies in this context. PMID:24509066

  17. Regulatory assessment of safety critical software used in upgrades to analog systems

    SciTech Connect

    Taylor, R.P.

    1994-12-31

    As a result of the difficulties encountered by both licensee and regulator during the licensing of the Darlington nuclear generating station software-based shutdown systems, Ontario Hydro was directed by the Atomic Energy Control Board (AECB) to produce improved company standards and procedures for safety-critical software development. In partnership with Atomic Energy of Canada Ltd. (AECL), a joint committee called OASES (Ontario Hydro/AECL Software Engineering Standards) has developed a suite of standards and procedures for software specification, design, implementation, verification, testing, and safety analysis. These standards are now being applied to new systems and are being adapted for use on upgrades to existing systems. Several digital protection systems have been installed recently in Canadian nuclear plants, such as a primary heat transport pump trip and an emergency powerhouse venting system. We have learned from the experience of assessing these systems and are now applying these lessons to systems developed under the new OASES standards.

  18. Application of fracture mechanics in maintenance of high temperature equipment -- An assessment of critical needs

    SciTech Connect

    Saxena, A.

    1997-12-31

    Extending the operating life of power plant, chemical reactor, and land-, sea- and air-based gas turbine components beyond their original design life has considerable economic advantages. Fracture mechanics is used extensively to predict the remaining life and safe inspection intervals as part of maintenance programs for these systems. The presence of creep deformation and time-dependent damage accumulation in these components presents very significant challenges. Therefore, the emphasis of this paper is on time-dependent fracture mechanics concepts. A critical assessment of the current state of the art in time-dependent fracture mechanics concepts, test techniques, analytical procedures and application tools is made to demonstrate the potential of this technology in maintenance engineering. In addition, future developments needed to enhance the application of this technology are described, and the limits of the current approaches are discussed.

  19. A systematic review and critical assessment of incentive strategies for discovery and development of novel antibiotics

    PubMed Central

    Renwick, Matthew J; Brogan, David M; Mossialos, Elias

    2016-01-01

    Despite the growing threat of antimicrobial resistance, pharmaceutical and biotechnology firms are reluctant to develop novel antibiotics because of a host of market failures. This problem is complicated by public health goals that demand antibiotic conservation and equitable patient access. Thus, an innovative incentive strategy is needed to encourage sustainable investment in antibiotics. This systematic review consolidates, classifies and critically assesses a total of 47 proposed incentives. Given the large number of possible strategies, a decision framework is presented to assist with the selection of incentives. This framework focuses on addressing market failures that result in limited investment, public health priorities regarding antibiotic stewardship and patient access, and implementation constraints and operational realities. The flexible nature of this framework allows policy makers to tailor an antibiotic incentive package that suits a country's health system structure and needs. PMID:26464014

  20. PROBABILISTIC ASSESSMENT OF A CRITICALITY IN A WASTE CONTAINER AT SRS

    SciTech Connect

    Eghbali, D

    2006-12-26

    Transuranic solid waste that has been generated as a result of the production of nuclear material for the United States defense program at the Savannah River Site (SRS) has been stored in more than 30,000 55-gallon drums and various-size carbon steel boxes since 1953. Nearly two thirds of those containers have been processed and shipped to the Waste Isolation Pilot Plant. Among the containers assayed so far, the results indicate several drums with fissile inventories significantly higher (600-1000 grams ²³⁹Pu) than their originally assigned values. While part of this discrepancy can be attributed to past limited assay capabilities, human errors are believed to be the primary contributor. This paper summarizes an assessment of the probability of occurrence of a criticality accident during handling of the remaining transuranic waste containers at SRS.

  1. Assessing a Critical Aspect of Construct Continuity when Test Specifications Change or Test Forms Deviate from Specifications

    ERIC Educational Resources Information Center

    Liu, Jinghua; Dorans, Neil J.

    2013-01-01

    We make a distinction between two types of test changes: inevitable deviations from specifications versus planned modifications of specifications. We describe how score equity assessment (SEA) can be used as a tool to assess a critical aspect of construct continuity, the equivalence of scores, whenever planned changes are introduced to testing…

  2. Faculty Approaches to Assessing Critical Thinking in the Humanities and the Natural and Social Sciences: Implications for General Education

    ERIC Educational Resources Information Center

    Nicholas, Mark C.; Labig, Chalmer E., Jr.

    2013-01-01

    An analysis of interviews, focus-group discussions, assessment instruments, and assignment prompts revealed that within general education, faculty assessed critical thinking as faceted using methods and criteria that varied epistemically across disciplines. Faculty approaches were misaligned with discipline-general institutional approaches.…

  3. Temporal discounting in life cycle assessment: A critical review and theoretical framework

    SciTech Connect

    Yuan, Chris; Wang, Endong; Zhai, Qiang; Yang, Fan

    2015-02-15

    Temporal homogeneity of inventory data is one of the major problems in life cycle assessment (LCA). Addressing temporal homogeneity of life cycle inventory data is important in reducing the uncertainties and improving the reliability of LCA results. This paper attempts to present a critical review and discussion of the fundamental issues of temporal homogeneity in conventional LCA and to propose a theoretical framework for temporal discounting in LCA. Theoretical perspectives for temporal discounting in life cycle inventory analysis are discussed first, based on the key elements of a scientific mechanism for temporal discounting. Generic procedures for performing temporal discounting in LCA are then derived and proposed, based on the nature of the LCA method and the identified key elements of a scientific temporal discounting method. A five-step framework is proposed and reported in detail, based on the technical methods and procedures needed to perform temporal discounting in life cycle inventory analysis. Challenges and possible solutions are also identified and discussed for the technical procedure and scientific accomplishment of each step within the framework. - Highlights: • A critical review of the temporal homogeneity problem of life cycle inventory data • A theoretical framework for performing temporal discounting on inventory data • Methods provided to accomplish each step of the temporal discounting framework.
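
    The framework itself is procedural, but the core idea can be illustrated with the standard exponential discounting form (a generic formula offered for orientation, not the authors' method): an inventory flow I_t occurring t years in the future is weighted as

        \[
          I_{\mathrm{disc}} = \sum_{t} \frac{I_t}{(1+r)^{t}},
        \]

    where r is a chosen temporal discount rate; the framework's five steps then concern how such weights are justified, selected, and applied to inventory data.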

  4. Cone beam computed tomography aided diagnosis and treatment of endodontic cases: Critical analysis

    PubMed Central

    Yılmaz, Funda; Kamburoglu, Kıvanç; Yeta, Naz Yakar; Öztan, Meltem Dartar

    2016-01-01

    Although intraoral radiographs still remain the imaging method of choice for the evaluation of endodontic patients, in recent years the utilization of cone beam computed tomography (CBCT) in endodontics has increased significantly. This case series presentation shows the importance of CBCT-aided diagnosis and treatment of complex endodontic cases such as root resorption, missed extra canals, fusion, oblique root fracture, undiagnosed periapical pathology and horizontal root fracture. CBCT may be a useful diagnostic method in several endodontic cases where intraoral radiography and clinical examination alone are unable to provide sufficient information. PMID:27551342

  5. Cone beam computed tomography aided diagnosis and treatment of endodontic cases: Critical analysis.

    PubMed

    Yılmaz, Funda; Kamburoglu, Kıvanç; Yeta, Naz Yakar; Öztan, Meltem Dartar

    2016-07-28

    Although intraoral radiographs still remain the imaging method of choice for the evaluation of endodontic patients, in recent years the utilization of cone beam computed tomography (CBCT) in endodontics has increased significantly. This case series presentation shows the importance of CBCT-aided diagnosis and treatment of complex endodontic cases such as root resorption, missed extra canals, fusion, oblique root fracture, undiagnosed periapical pathology and horizontal root fracture. CBCT may be a useful diagnostic method in several endodontic cases where intraoral radiography and clinical examination alone are unable to provide sufficient information. PMID:27551342

  6. Long-Term Assessment of Critical Radionuclides and Associated Environmental Media at the Savannah River Site

    SciTech Connect

    Jannik, G. T.; Baker, R. A.; Lee, P. L.; Eddy, T. P.; Blount, G. C.; Whitney, G. R.

    2012-11-06

    During the operational history of the Savannah River Site (SRS), many different radionuclides have been released from site facilities. However, only a relatively small number of the released radionuclides have been significant contributors to doses and risks to the public. At SRS, dose and risk assessments indicate that tritium oxide in air and surface water, and Cs-137 in fish and deer, have been, and continue to be, the critical radionuclides and pathways. In this assessment, in-depth statistical analyses of the long-term trends of tritium oxide in atmospheric and surface water releases and of Cs-137 concentrations in fish and deer are provided. Correlations are also provided with 1) operational changes and improvements, 2) geopolitical events (Cold War cessation), and 3) recent environmental remediation projects and the decommissioning of excess facilities. For example, environmental remediation of the F- and H-Area Seepage Basins and the Solid Waste Disposal Facility has resulted in a measurable impact on the tritium oxide flux to the onsite Fourmile Branch stream. Airborne releases of tritium oxide have been greatly affected by operational improvements and the end of the Cold War in 1991. However, the effects of SRS environmental remediation activities and ongoing tritium operations on tritium concentrations in the environment are measurable and documented in this assessment. Controlled hunts of deer and feral hogs are conducted at SRS for approximately six weeks each year. Before any harvested animal is released to a hunter, SRS personnel perform a field analysis for Cs-137 concentrations to ensure the hunter's dose does not exceed the SRS administrative game limit of 0.22 millisievert (22 mrem). However, most of the Cs-137 found in SRS onsite deer is not from site operations but is from nuclear weapons testing fallout from the 1950's and early 1960's. This legacy source term is trended in the SRS deer, and an assessment of the ''effective'' half-life of Cs-137 in deer

  7. Swimming Training Assessment: The Critical Velocity and the 400-m Test for Age-Group Swimmers.

    PubMed

    Zacca, Rodrigo; Fernandes, Ricardo Jorge P; Pyne, David B; Castro, Flávio Antônio de S

    2016-05-01

    Zacca, R, Fernandes, RJP, Pyne, DB, and Castro, FAdS. Swimming training assessment: the critical velocity and the 400-m test for age-group swimmers. J Strength Cond Res 30(5): 1365-1372, 2016-To verify the metabolic responses of oxygen consumption (V̇O2), heart rate (HR), blood lactate concentrations [La], and rate of perceived exertion (RPE) when swimming at an intensity corresponding to the critical velocity (CV) assessed by a 4-parameter model (CV4par), and to check the reliability when using only a single 400-m maximal front crawl bout (T400) for CV4par assessment in age-group swimmers. Ten age-group swimmers (14-16 years old) performed 50-, 100-, 200-, 400- (T400), 800-, and 1,500-m maximal front crawl bouts to calculate CV4par. V̇O2, HR, [La], and RPE were measured immediately after bouts. Swimmers then performed 3 × 10-minute front crawl (45 seconds rest) at CV4par. V̇O2, HR, [La], and RPE were measured after 10 minutes of rest (Rest), warm-up (Pre), each 10-minute repetition, and at the end of the test (Post). CV4par was 1.33 ± 0.08 m·s⁻¹. V̇O2, HR, [La], and RPE were similar between the first 10-minute repetition and Post time points in the 3 × 10-minute protocol. CV4par was equivalent to 92 ± 2% of the mean swimming speed of T400 (v400) for these swimmers. CV4par calculated from a single T400 (92%v400) showed excellent agreement (r = 0.30; 95% CI: -0.04 to 0.05 m·s⁻¹, p = 0.39), a low coefficient of variation (2%), and a root mean square error of 0.02 ± 0.01 m·s⁻¹ when plotted against CV4par assessed through the 4-parameter model. These results generated the equation CV4par = 0.92 × v400. A single T400 can be used reliably to estimate the CV4par typically derived from 6 efforts in age-group swimmers. PMID:26473520
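
    The reported regression reduces to a one-line estimator. A minimal sketch of the paper's final equation, CV4par = 0.92 × v400, where v400 is the mean speed of a maximal 400-m swim:

        # Estimate critical velocity (m/s) from a single maximal 400-m swim,
        # using the study's regression CV4par = 0.92 * v400.
        def critical_velocity_from_t400(t400_seconds: float) -> float:
            v400 = 400.0 / t400_seconds  # mean speed of the 400-m bout (m/s)
            return 0.92 * v400

        # A 400-m time of 277 s gives v400 = 1.44 m/s and CV = 1.33 m/s,
        # matching the cohort mean reported above.
        print(round(critical_velocity_from_t400(277.0), 2))  # -> 1.33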

  8. An assessment of future computer system needs for large-scale computation

    NASA Technical Reports Server (NTRS)

    Lykos, P.; White, J.

    1980-01-01

    Data ranging from specific computer capability requirements to opinions about the desirability of a national computer facility are summarized. It is concluded that considerable attention should be given to improving the user-machine interface. Otherwise, increased computer power may not improve the overall effectiveness of the machine user. Significant improvement in throughput requires highly concurrent systems plus the willingness of the user community to develop problem solutions for that kind of architecture. An unanticipated result was the expression of need for an on-going cross-disciplinary users group/forum in order to share experiences and to more effectively communicate needs to the manufacturers.

  9. Regulating fatty acids in infant formula: critical assessment of U.S. policies and practices

    PubMed Central

    2014-01-01

    Background Fatty acids in breast-milk such as docosahexaenoic acid and arachidonic acid, commonly known as DHA and ARA, contribute to the healthy development of children in various ways. However, the manufactured versions that are added to infant formula might not have the same health benefits as those in breast-milk. There is evidence that the manufactured additives might cause harm to infants’ health, and they might lead to unwarranted increases in the cost of infant formula. The addition of such fatty acids to infant formula needs to be regulated. In the U.S., the Food and Drug Administration has primary responsibility for regulating the composition of infant formula. The central purpose of this study is to assess the FDA’s efforts with regard to the regulation of fatty acids in infant formula. Methods This study is based on critical analysis of policies and practices described in publicly available documents of the FDA, the manufacturers of fatty acids, and other relevant organizations. The broad framework for this work was set out by the author in his book on Regulating Infant Formula, published in 2011. Results The FDA does not assess the safety or the health impacts of fatty acid additives to infant formula before they are marketed, and there is no systematic assessment after marketing is underway. Rather than making its own independent assessments, the FDA accepts the manufacturers’ claims regarding their products’ safety and effectiveness. Conclusions The FDA is not adequately regulating the use of fatty acid additives to infant formula. This results in exposure of infants to potential risks. Adverse reactions are already on record. Also, the additives have led to increasing costs of infant formula despite the lack of proven benefits to normal, full term infants. There is a need for more effective regulation of DHA and ARA additives to infant formula. PMID:24433303

  10. Use of computer-aided testing in the investigation of pilot response to critical in-flight events. Volume 2: Appendix

    NASA Technical Reports Server (NTRS)

    Rockwell, T. H.; Giffin, W. C.

    1982-01-01

    Computer displays using PLATO are illustrated. Diagnostic scenarios are described. A sample of subject data is presented. Destination diversion displays, a combined destination, diversion scenario, and critical in-flight event (CIFE) data collection/subject testing system are presented.

  11. The Effect on Reasoning, Reading and Number Performance of Computer-Presented Critical Thinking Activities in Five-Year-Old Children.

    ERIC Educational Resources Information Center

    Riding, R. J.; Powell, S. D.

    1987-01-01

    Reports on a study which investigated the possibility of improving five-year-olds' critical thinking skills in reading and mathematics by using computers. Results indicate improvement in the reading area but not in the mathematics area. (RKM)

  12. Implementing and Assessing Computational Modeling in Introductory Mechanics

    ERIC Educational Resources Information Center

    Caballero, Marcos D.; Kohlmyer, Matthew A.; Schatz, Michael F.

    2012-01-01

    Students taking introductory physics are rarely exposed to computational modeling. In a one-semester large lecture introductory calculus-based mechanics course at Georgia Tech, students learned to solve physics problems using the VPython programming environment. During the term, 1357 students in this course solved a suite of 14 computational…

  13. Evaluation and Assessment of a Biomechanics Computer-Aided Instruction.

    ERIC Educational Resources Information Center

    Washington, N.; Parnianpour, M.; Fraser, J. M.

    1999-01-01

    Describes the Biomechanics Tutorial, a computer-aided instructional tool that was developed at Ohio State University to expedite the transition from lecture to application for undergraduate students. Reports evaluation results that used statistical analyses and student questionnaires to show improved performance on posttests as well as positive…

  14. Enabling Wide-Scale Computer Science Education through Improved Automated Assessment Tools

    NASA Astrophysics Data System (ADS)

    Boe, Bryce A.

    There is a proliferating demand for newly trained computer scientists as the number of computer science related jobs continues to increase. University programs will only be able to train enough new computer scientists to meet this demand when two things happen: when there are more primary and secondary school students interested in computer science, and when university departments have the resources to handle the resulting increase in enrollment. To meet these goals, significant effort is being made both to incorporate computational thinking into existing primary school education and to support larger university computer science class sizes. We contribute to this effort through the creation and use of improved automated assessment tools. To enable wide-scale computer science education we do two things. First, we create a framework called Hairball to support the static analysis of Scratch programs targeted for fourth, fifth, and sixth grade students. Scratch is a popular building-block language utilized to pique interest in and teach the basics of computer science. We observe that Hairball allows for rapid curriculum alterations and thus contributes to wide-scale deployment of computer science curriculum. Second, we create a real-time feedback and assessment system utilized in university computer science classes to provide better feedback to students while reducing assessment time. Insights from our analysis of student submission data show that modifications to the system configuration support the way students learn and progress through course material, making it possible for instructors to tailor assignments to optimize learning in growing computer science classes.
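
    Without relying on Hairball's actual plugin API, the flavor of this kind of block-level static analysis can be sketched as follows; the dict-based project representation and the specific check are invented simplifications, not Hairball code.

        # Hypothetical sketch of a Hairball-style static check on a Scratch-like
        # project: flag messages that are broadcast but never received.
        def undefined_broadcasts(project):
            broadcast, receive = set(), set()
            for script in project["scripts"]:
                for block, arg in script:
                    if block == "broadcast":
                        broadcast.add(arg)
                    elif block == "when_i_receive":
                        receive.add(arg)
            return broadcast - receive  # sent but never handled

        demo = {"scripts": [
            [("when_flag_clicked", None), ("broadcast", "start")],
            [("when_i_receive", "go"), ("move", 10)],
        ]}
        print(undefined_broadcasts(demo))  # -> {'start'}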

  15. A Micro-Computer Based System for the Management of the Critically Ill

    PubMed Central

    Comerchero, Harry; Thomas, Gregory; Shapira, Gaby; Greatbatch, Mennen; Hoyt, John W.

    1978-01-01

    A central station based system is described which employs a micro-computer for continuous monitoring of hemodynamic parameters for multiple patients. Monitored vital signs are displayed on a “WARD STATUS” video monitor and processed for long-term trend storage and retrieval. Alarm events and changes in module settings at the bedside are immediately reflected on the WARD STATUS display. Medication administration can be indicated and presented together with the graphical trends of any monitored parameter. Optional features of the system include on-line determination of Cardiac Output, Pulmonary Wedge Pressure Measurements, Arrhythmia and Respiratory Monitoring. An alphanumeric terminal connected to the micro-computer facilitates “background” programming in high level languages. This facility can be used to provide tailored patient data management capability to the medical staff or can be used as a tool for in-house development of special purpose application programs. The system is currently implemented on a Digital Equipment Corporation LSI-11 with 28K memory and dual floppy disks.

  16. The Application of Web-based Computer-assisted Instruction Courseware within Health Assessment

    NASA Astrophysics Data System (ADS)

    Xiuyan, Guo

    Health assessment is a clinical nursing course that places emphasis on clinical skills. The application of computer-assisted instruction to nursing teaching addresses the shortcomings of the traditional lecture class. This article reports teaching experience with web-based computer-assisted instruction, based upon a two-year study of courseware use within the health assessment course. The courseware can improve the teaching structure, simulate clinical situations, create teaching contexts, and facilitate student study.

  17. Computational fluid dynamics approaches in quality and hygienic production of semisolid low-moisture foods: a review of critical factors.

    PubMed

    Mondal, Arpita; Buchanan, Robert L; Lo, Y Martin

    2014-10-01

    Low-moisture foods have been responsible for a number of salmonellosis outbreaks worldwide over the last few decades, with cross-contamination from contaminated equipment being the predominant source. To date, actions have focused on stringent hygienic practices prior to production, namely periodic sanitization of the processing equipment and lines. Not only does optimum sanitization require in-depth knowledge of the type and source of contaminants, but the heat resistance of microorganisms is also unique and often dependent on the heat transfer characteristics of the low-moisture foods. Rheological properties, including viscosity, degree of turbulence, and flow characteristics (for example, Newtonian or non-Newtonian) of both liquid and semisolid foods are critical factors impacting the flow behavior that consequently interferes with heat transfer and related control elements. The demand for progressively more accurate prediction of complex fluid phenomena has called for the employment of computational fluid dynamics (CFD) to model mass and heat transfer during the processing of various food products, ranging from drying to baking. With the aim of improving the quality and safety of low-moisture foods, this article critically reviews the published literature concerning microbial survival in semisolid low-moisture foods, including chocolate, honey, and peanut butter. Critical rheological properties and state-of-the-art CFD applications relevant to quality production of those products are also addressed. It is anticipated that adequate prediction of specific transport properties during optimum sanitization through CFD could be used to solve current and future food safety challenges. PMID:25224872
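
    The Newtonian/non-Newtonian distinction invoked above is commonly captured by the power-law (Ostwald-de Waele) constitutive model, a standard rheological relation rather than anything specific to this review:

        \[
          \tau = K \dot{\gamma}^{\,n},
        \]

    where τ is the shear stress, K the consistency index, γ̇ the shear rate, and n the flow behavior index (n = 1 Newtonian, n < 1 shear-thinning, n > 1 shear-thickening); semisolid foods such as peanut butter are typically strongly shear-thinning.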

  18. Online training course on critical appraisal for nurses: adaptation and assessment

    PubMed Central

    2014-01-01

    Background Research is an essential activity for improving quality and efficiency in healthcare. The objective of this study was to train nurses from the public Basque Health Service (Osakidetza) in critical appraisal, promoting continuous training and the use of research in clinical practice. Methods This was a prospective pre-post test study. The InfoCritique course on critical appraisal was translated and adapted. A sample of 50 nurses and 3 tutors was recruited. Educational strategies and assessment instruments were established for the course. A course website was created that contained contact details of the teaching team and coordinator, as well as a course handbook and videos introducing the course. Assessment comprised the administration of questionnaires before and after the course, in order to explore the main intervention outcomes: knowledge acquired and self-learning readiness. Satisfaction was also measured at the end of the course. Results Of the 50 health professionals recruited, 3 did not complete the course for personal or work-related reasons. The mean score on the pre-course knowledge questionnaire was 70.5 out of 100, with a standard deviation of 11.96. In general, participants’ performance on the knowledge questionnaire improved after the course, as reflected in the notable increase of the mean score, to 86.6, with a standard deviation of 10.00. Further, analyses confirmed statistically significant differences between pre- and post-course results (p < 0.001). With regard to self-learning readiness, after the course, participants reported a greater readiness and ability for self-directed learning. Lastly, in terms of level of satisfaction with the course, the mean score was 7 out of 10. Conclusions Participants significantly improved their knowledge score and self-directed learning readiness after the educational intervention, and they were overall satisfied with the course. For the health system and nursing professionals, this type of
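
    The study's central comparison is a paired pre/post test on the same participants; a minimal sketch of that analysis on invented scores (the individual-level data are not published in this abstract):

        # Paired pre/post comparison in the spirit of the study above,
        # on hypothetical scores; the real data are not reproduced here.
        from scipy.stats import ttest_rel

        pre = [68, 55, 72, 80, 74, 61, 77, 70]    # invented pre-course scores
        post = [85, 78, 88, 93, 90, 79, 91, 86]   # invented post-course scores

        t, p = ttest_rel(post, pre)               # paired-samples t-test
        print(f"t = {t:.2f}, p = {p:.4f}")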

  19. Sugaring the Pill: Assessing Rhetorical Strategies Designed to Minimize Defensive Reactions to Group Criticism

    ERIC Educational Resources Information Center

    Hornsey, Matthew J.; Robson, Erin; Smith, Joanne; Esposo, Sarah; Sutton, Robbie M.

    2008-01-01

    People are considerably more defensive in the face of group criticism when the criticism comes from an out-group rather than an in-group member (the intergroup sensitivity effect). We tested three strategies that out-group critics can use to reduce this heightened defensiveness. In all studies, Australians received criticism of their country…

  20. Criticality safety assessment of a TRIGA reactor spent-fuel pool under accident conditions

    SciTech Connect

    Glumac, B; Ravnik, M.; Logar, M.

    1997-02-01

    Additional criticality safety analysis of a pool-type storage for TRIGA spent fuel at the Jozef Stefan Institute in Ljubljana, Slovenia, is presented. Previous results have shown that subcriticality is not guaranteed for some postulated accidents (earthquake with subsequent fuel rack disintegration resulting in contact fuel pitch) under the assumption that the fuel rack is loaded with fresh 12 wt% standard fuel. To mitigate this deficiency, a study was done on replacing a certain number of fuel elements in the rack with cadmium-loaded absorber rods. The Monte Carlo computer code MCNP4A with an ENDF/B-V library and detailed three-dimensional geometrical model of the spent-fuel rack was used for this purpose. First, a minimum critical number of fuel elements was determined for contact pitch, and two possible geometries of rack disintegration were considered. Next, it was shown that subcriticality can be ensured when pitch is decreased from a rack design pitch of 8 cm to contact, if a certain number of fuel elements (8 to 20 out of 70) are replaced by absorber rods, which are uniformly mixed into the lattice. To account for the possibility that random mixing of fuel elements and absorber rods can occur during rack disintegration and result in a supercritical configuration, a probabilistic study was made to sample the probability density functions for random absorber rod lattice loadings. Results of the calculations show that reasonably low probabilities for supercriticality can be achieved (down to 10{sup {minus}6} per severe earthquake, which would result in rack disintegration and subsequent maximum possible pitch decrease) even in the case where fresh 12 wt% standard TRIGA fuel would be stored in the spent-fuel pool.
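
    The probabilistic step lends itself to a simple Monte Carlo sketch: sample random placements of absorber rods among the 70 lattice positions and count configurations that fail a criticality criterion. The criterion below is a deliberately crude placeholder; the actual study evaluated each configuration with MCNP4A.

        # Hedged sketch of sampling random absorber-rod loadings in a
        # 70-position rack. The supercriticality test is a placeholder.
        import random

        POSITIONS = 70
        N_ABSORBERS = 12  # within the 8-to-20 range cited above

        def is_supercritical(absorber_positions):
            # Placeholder criterion: treat a loading as supercritical if any
            # 10 or more consecutive positions contain no absorber rod.
            occupied = sorted(absorber_positions)
            gaps = [b - a for a, b in zip([-1] + occupied, occupied + [POSITIONS])]
            return max(gaps) > 10

        def supercritical_probability(trials=100_000, seed=1):
            rng = random.Random(seed)
            hits = sum(
                is_supercritical(rng.sample(range(POSITIONS), N_ABSORBERS))
                for _ in range(trials)
            )
            return hits / trials

        print(supercritical_probability())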

  1. Critical issues in the formation of quantum computer test structures by ion implantation

    SciTech Connect

    Schenkel, T.; Lo, C. C.; Weis, C. D.; Schuh, A.; Persaud, A.; Bokor, J.

    2009-04-06

    The formation of quantum computer test structures in silicon by ion implantation enables the characterization of spin readout mechanisms with ensembles of dopant atoms and the development of single atom devices. We briefly review recent results in the characterization of spin dependent transport and single ion doping and then discuss the diffusion and segregation behaviour of phosphorus, antimony and bismuth ions from low fluence, low energy implantations as characterized through depth profiling by secondary ion mass spectrometry (SIMS). Both phosphorus and bismuth are found to segregate to the SiO₂/Si interface during activation anneals, while antimony diffusion is found to be minimal. An effect of the ion charge state on the range of antimony ions, ¹²¹Sb²⁵⁺, in SiO₂/Si is also discussed.

  2. Fractional Flow Reserve and Coronary Computed Tomographic Angiography: A Review and Critical Analysis.

    PubMed

    Hecht, Harvey S; Narula, Jagat; Fearon, William F

    2016-07-01

    Invasive fractional flow reserve (FFR) is now the gold standard for intervention. Noninvasive functional imaging analyses derived from coronary computed tomographic angiography (CTA) offer alternatives for evaluating lesion-specific ischemia. CT-FFR, CT myocardial perfusion imaging, and transluminal attenuation gradient/corrected contrast opacification have been studied using invasive FFR as the gold standard. CT-FFR has demonstrated significant improvement in specificity and positive predictive value compared with CTA alone for predicting FFR of ≤0.80, as well as decreasing the frequency of nonobstructive invasive coronary angiography. High-risk plaque characteristics have also been strongly implicated in abnormal FFR. Myocardial computed tomographic perfusion is an alternative method with promising results; it involves more radiation and contrast. Transluminal attenuation gradient/corrected contrast opacification is more controversial and may be more related to vessel diameter than stenosis. Important considerations remain: (1) improvement of CTA quality to decrease unevaluable studies, (2) is the diagnostic accuracy of CT-FFR sufficient? (3) can CT-FFR guide intervention without invasive FFR confirmation? (4) what are the long-term outcomes of CT-FFR-guided treatment and how do they compare with other functional imaging-guided paradigms? (5) what degree of stenosis on CTA warrants CT-FFR? (6) how should high-risk plaque be incorporated into treatment decisions? (7) how will CT-FFR influence other functional imaging test utilization, and what will be the effect on the practice of cardiology? (8) will a workstation-based CT-FFR be mandatory? Rapid progress to date suggests that CTA-based lesion-specific ischemia will be the gatekeeper to the cardiac catheterization laboratory and will transform the world of intervention. PMID:27390333

  3. Target highlights in CASP9: Experimental target structures for the critical assessment of techniques for protein structure prediction.

    PubMed

    Kryshtafovych, Andriy; Moult, John; Bartual, Sergio G; Bazan, J Fernando; Berman, Helen; Casteel, Darren E; Christodoulou, Evangelos; Everett, John K; Hausmann, Jens; Heidebrecht, Tatjana; Hills, Tanya; Hui, Raymond; Hunt, John F; Seetharaman, Jayaraman; Joachimiak, Andrzej; Kennedy, Michael A; Kim, Choel; Lingel, Andreas; Michalska, Karolina; Montelione, Gaetano T; Otero, José M; Perrakis, Anastassis; Pizarro, Juan C; van Raaij, Mark J; Ramelot, Theresa A; Rousseau, Francois; Tong, Liang; Wernimont, Amy K; Young, Jasmine; Schwede, Torsten

    2011-01-01

    One goal of the CASP community-wide experiment on the critical assessment of techniques for protein structure prediction is to identify the current state of the art in protein structure prediction and modeling. A fundamental principle of CASP is blind prediction on a set of relevant protein targets, that is, the participating computational methods are tested on a common set of experimental target proteins, for which the experimental structures are not known at the time of modeling. Therefore, the CASP experiment would not have been possible without broad support of the experimental protein structural biology community. In this article, several experimental groups discuss the structures of the proteins which they provided as prediction targets for CASP9, highlighting structural and functional peculiarities of these structures: the long tail fiber protein gp37 from bacteriophage T4, the cyclic GMP-dependent protein kinase Iβ dimerization/docking domain, the ectodomain of the JTB (jumping translocation breakpoint) transmembrane receptor, Autotaxin in complex with an inhibitor, the DNA-binding J-binding protein 1 domain essential for biosynthesis and maintenance of DNA base-J (β-D-glucosyl-hydroxymethyluracil) in Trypanosoma and Leishmania, a so far uncharacterized 73-residue domain from Ruminococcus gnavus with a fold typical of PDZ-like domains, a domain from the phycobilisome core-membrane linker phycobiliprotein ApcE from Synechocystis, the heat shock protein 90 activators PFC0360w and PFC0270w from Plasmodium falciparum, and 2-oxo-3-deoxygalactonate kinase from Klebsiella pneumoniae. PMID:22020785

  4. Criticality Safety Assessment: Impact of Tank 40H Sludge Batch 2 Decant No. 2 on the Criticality Safety Assessment of the 242-25H Evaporator System (WSRC-TR-2000-00069)

    SciTech Connect

    Smiley, H.S.

    2001-07-30

    This assessment was done to evaluate the impact of the planned transfer of Decant No. 2 from Sludge Batch 2 in Tank 40H on the potential for solids accumulation in the 242-25H evaporator. It is a nuclear criticality safety (NCS) goal to demonstrate that the evaporator vessel cannot accumulate fissile material in a quantity and configuration that provides a pathway to criticality. The mechanism for accumulation of fissile material is through the formation of aluminosilicate solids.

  5. Computer-Aided Argument Mapping in an EFL Setting: Does Technology Precede Traditional Paper and Pencil Approach in Developing Critical Thinking?

    ERIC Educational Resources Information Center

    Eftekhari, Maryam; Sotoudehnama, Elaheh; Marandi, S. Susan

    2016-01-01

    Developing higher-order critical thinking skills as one of the central objectives of education has been recently facilitated via software packages. Whereas one such technology as computer-aided argument mapping is reported to enhance levels of critical thinking (van Gelder 2001), its application as a pedagogical tool in English as a Foreign…

  6. Life Cycle Assessment of Pavements: A Critical Review of Existing Literature and Research

    SciTech Connect

    Santero, Nicholas; Masanet, Eric; Horvath, Arpad

    2010-04-20

    This report provides a critical review of existing literature and modeling tools related to life-cycle assessment (LCA) applied to pavements. The review finds that pavement LCA is an expanding but still limited research topic in the literature, and that the existing body of work exhibits methodological deficiencies and incompatibilities that serve as barriers to the widespread utilization of LCA by pavement engineers and policy makers. This review identifies five key issues in the current body of work: inconsistent functional units, improper system boundaries, imbalanced data for asphalt and cement, use of limited inventory and impact assessment categories, and poor overall utility. This review also identifies common data and modeling gaps in pavement LCAs that should be addressed in future work. These gaps include: the use phase (rolling resistance, albedo, carbonation, lighting, leachate, and tire wear and emissions), asphalt fumes, feedstock energy of bitumen, traffic delay, the maintenance phase, and the end-of-life phase. This review concludes with a comprehensive list of recommendations for future research, which shed light on where improvements in knowledge can be made that will benefit the accuracy and comprehensiveness of pavement LCAs moving forward.

  7. An impact assessment and critical appraisal of the ISO standard for wheelchair vocabulary.

    PubMed

    Dolan, Michael J; Henderson, Graham I

    2013-07-01

    Wheelchairs are, for users, a primary means of mobility and an important means of performing activities of daily living. A common, accepted vocabulary is required to support and foster evidence-based practice and communication amongst professionals and with users. The international standard for wheelchair vocabulary, ISO 7176-26:2007, specifies terms and definitions with the purpose of eliminating confusion from the duplication or inappropriate use of terms. The aim of this study was to assess its impact and, based on that assessment, critically appraise the standard. Two databases were searched, returning 189 and 283 unique articles with "wheelchair" in the title published in 2004-2006 and 2009-2011, respectively. Compliance, based on title and abstract usage, was poor, ranging from 0 to 50% correct usage, with no significant difference between pre- and post-publication. A review of prescription forms found only 9% correct usage. A survey of NHS wheelchair managers found that only 30% were positive that they had a copy, despite 67% agreeing that the standard is important. The ISO wheelchair vocabulary standard was found not to be achieving its stated purpose. It is recommended that it be revised, taking into account the findings of this study, including the need for targeted dissemination and increased awareness. PMID:23058286

  8. Contemporary issues for experimental design in assessment of medical imaging and computer-assist systems

    NASA Astrophysics Data System (ADS)

    Wagner, Robert F.; Beiden, Sergey V.; Campbell, Gregory; Metz, Charles E.; Sacks, William M.

    2003-05-01

    The dialog among investigators in academia, industry, NIH, and the FDA has grown in recent years on topics of historic interest to attendees of these SPIE sub-conferences on Image Perception, Observer Performance, and Technology Assessment. Several of the most visible issues in this regard have been the emergence of digital mammography and modalities for computer-assisted detection and diagnosis in breast and lung imaging. These issues appear to be only the "tip of the iceberg" foreshadowing a number of emerging advances in imaging technology. So it is timely to make some general remarks looking back and looking ahead at the landscape (or seascape). The advances have been facilitated and documented in several forums. The major role of the SPIE Medical Imaging Conferences is well known to all of us. Many of us were also present at the Medical Image Perception Society conference co-sponsored by CDRH and NCI in September of 2001 at Airlie House, VA. The workshops and discussions held at that conference addressed some critical contemporary issues related to how society - and in particular industry and FDA - approach the general assessment problem. A great deal of inspiration for these discussions was also drawn from several workshops in recent years sponsored by the Biomedical Imaging Program of the National Cancer Institute on these issues, in particular the problem of "The Moving Target" of imaging technology. Another critical phenomenon deserving our attention is the fact that the Fourth National Forum on Biomedical Imaging in Oncology was recently held in Bethesda, MD, February 6-7, 2003. These forums are presented by the National Cancer Institute (NCI), the Food and Drug Administration (FDA), the Centers for Medicare and Medicaid Services (CMS), and the National Electrical Manufacturers Association (NEMA). They are sponsored by the National Institutes of Health/Foundation for Advanced Education in the Sciences (NIH/FAES). These forums led to the development of the NCI

  9. Summative assessment in a doctor of pharmacy program: a critical insight

    PubMed Central

    Wilbur, Kerry

    2015-01-01

    Background The Canadian-accredited post-baccalaureate Doctor of Pharmacy program at Qatar University trains pharmacists to deliver advanced patient care. Emphasis on acquisition and development of the necessary knowledge, skills, and attitudes lies in the curriculum’s extensive experiential component. A campus-based oral comprehensive examination (OCE) was devised to emulate a clinical viva voce and complement the extensive formative assessments conducted at experiential practice sites throughout the curriculum. We describe an evaluation of the final exit summative assessment for this graduate program. Methods OCE results since the inception of the graduate program (3 years ago) were retrieved and recorded into a blinded database. Examination scores among each paired faculty examiner team were analyzed for inter-rater reliability and linearity of agreement using intraclass correlation and Spearman’s correlation coefficient measurements, respectively. Graduate student ranking from individual examiner OCE scores was compared with other relative ranked student performance. Results Sixty-one OCEs were administered to 30 graduate students over 3 years by a composite of eleven different pairs of faculty examiners. Intraclass correlation measures demonstrated low examiner team reliability: only one examiner team in each academic year showed statistically significant inter-rater reliability, and linearity of agreement was inconsistent in all years. No association was found between examination performance rankings and other academic parameters. Conclusion Critical review of our final summative assessment suggests it lacks robustness and defensibility. Measures are in place to continue the quality improvement process and develop and implement an alternative means of evaluation within a more authentic context. PMID:25733948
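
    In outline, the reliability analysis described above is straightforward to reproduce. A minimal Python sketch, assuming paired examiner scores arranged as a students-by-raters matrix; the ICC(2,1) formula and scipy's Spearman test are standard, but the scores and variable names here are hypothetical, not the study's data:

      import numpy as np
      from scipy import stats

      def icc_2_1(ratings):
          """Two-way random-effects, absolute-agreement ICC(2,1).
          ratings: (n_subjects, k_raters) array."""
          n, k = ratings.shape
          grand = ratings.mean()
          ss_rows = k * ((ratings.mean(axis=1) - grand) ** 2).sum()
          ss_cols = n * ((ratings.mean(axis=0) - grand) ** 2).sum()
          ss_err = ((ratings - grand) ** 2).sum() - ss_rows - ss_cols
          msr = ss_rows / (n - 1)                 # between-subject mean square
          msc = ss_cols / (k - 1)                 # between-rater mean square
          mse = ss_err / ((n - 1) * (k - 1))      # residual mean square
          return (msr - mse) / (msr + (k - 1) * mse + k * (msc - mse) / n)

      # Hypothetical OCE scores from one examiner pair across ten students
      examiner_a = np.array([72, 65, 80, 58, 90, 75, 68, 83, 77, 61])
      examiner_b = np.array([70, 60, 85, 55, 88, 79, 62, 80, 70, 65])

      print("ICC(2,1): %.3f" % icc_2_1(np.column_stack([examiner_a, examiner_b])))
      rho, p = stats.spearmanr(examiner_a, examiner_b)
      print("Spearman rho: %.3f (p=%.3f)" % (rho, p))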

  10. Computer Simulation of Human Behavior: Assessment of Creativity.

    ERIC Educational Resources Information Center

    Greene, John F.

    The major purpose of this study is to further the development of procedures which minimize current limitations of creativity instruments, thus yielding a reliable and functional means for assessing creativity. Computerized content analysis and multiple regression are employed to simulate the creativity ratings of trained judges. The computerized…

  11. Computer-Based Assessment of School Readiness and Early Reasoning

    ERIC Educational Resources Information Center

    Csapó, Beno; Molnár, Gyöngyvér; Nagy, József

    2014-01-01

    This study explores the potential of using online tests for the assessment of school readiness and for monitoring early reasoning. Four tests of a face-to-face-administered school readiness test battery (speech sound discrimination, relational reasoning, counting and basic numeracy, and deductive reasoning) and a paper-and-pencil inductive…

  12. An Assessment of Nursing Attitudes toward Computers in Health Care.

    ERIC Educational Resources Information Center

    Carl, David L.; And Others

    The attitudes and perceptions of practicing nurses, student nurses, and nurse educators toward computerization of health care were assessed using questionnaires sent to two general hospitals and five nursing education programs. The sample consisted of 83 first-year nursing students, 84 second-year nursing students, 52 practicing nurses, and 26…

  13. A Computer-Based Intelligent Assessment System for Numeric Disciplines.

    ERIC Educational Resources Information Center

    Patel, Ashok; Kinshuk; Russell, David

    1998-01-01

    Describes an intelligent assessment system for numeric disciplines that works in conjunction with the intelligent tutoring tools developed by Teaching and Learning Technology (TLTP) Byzantium, a consortium of six U.K. universities. Topics include intelligent tutoring tools based on cognitive apprenticeship framework, a history of computerized…

  14. Computational and experimental analysis of TMS-induced electric field vectors critical to neuronal activation

    NASA Astrophysics Data System (ADS)

    Krieg, Todd D.; Salinas, Felipe S.; Narayana, Shalini; Fox, Peter T.; Mogul, David J.

    2015-08-01

    Objective. Transcranial magnetic stimulation (TMS) represents a powerful technique to noninvasively modulate cortical neurophysiology in the brain. However, the relationship between the magnetic fields created by TMS coils and neuronal activation in the cortex is still not well-understood, making predictable cortical activation by TMS difficult to achieve. Our goal in this study was to investigate the relationship between induced electric fields and cortical activation measured by blood flow response. Particularly, we sought to discover the E-field characteristics that lead to cortical activation. Approach. Subject-specific finite element models (FEMs) of the head and brain were constructed for each of six subjects using magnetic resonance image scans. Positron emission tomography (PET) measured each subject’s cortical response to image-guided robotically-positioned TMS to the primary motor cortex. FEM models that employed the given coil position, orientation, and stimulus intensity in experimental applications of TMS were used to calculate the electric field (E-field) vectors within a region of interest for each subject. TMS-induced E-fields were analyzed to better understand what vector components led to regional cerebral blood flow (CBF) responses recorded by PET. Main results. This study found that decomposing the E-field into orthogonal vector components based on the cortical surface geometry (and hence, cortical neuron directions) led to significant differences between the regions of cortex that were active and nonactive. Specifically, active regions had significantly higher E-field components in the normal inward direction (i.e., parallel to pyramidal neurons in the dendrite-to-axon orientation) and in the tangential direction (i.e., parallel to interneurons) at high gradient. In contrast, nonactive regions had higher E-field vectors in the outward normal direction suggesting inhibitory responses. Significance. These results provide critical new
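
    The decomposition step at the heart of this analysis is plain vector algebra: project each E-field vector onto the local inward surface normal and take the remainder as the tangential part. A minimal Python sketch; the arrays, names, and values are illustrative stand-ins, not the authors' FEM pipeline:

      import numpy as np

      def decompose_efield(E, n_inward):
          """Split E-field vectors into inward-normal and tangential parts.
          E, n_inward: (m, 3) arrays; rows of n_inward are unit normals."""
          e_norm = np.einsum("ij,ij->i", E, n_inward)   # signed inward component
          E_tan = E - e_norm[:, None] * n_inward        # the remainder is tangential
          return e_norm, np.linalg.norm(E_tan, axis=1)

      # Illustrative field vectors (V/m) at three cortical surface nodes
      E = np.array([[80.0, 10.0, 5.0], [20.0, 60.0, 0.0], [-40.0, 5.0, 10.0]])
      n = np.array([[1.0, 0.0, 0.0], [0.0, 1.0, 0.0], [1.0, 0.0, 0.0]])

      inward, tangential = decompose_efield(E, n)
      print("inward-normal components:", inward)    # positive = dendrite-to-axon
      print("tangential magnitudes:", tangential)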

  15. Computer Assisted Learning: The Potential for Teaching and Assessing in Nursing.

    ERIC Educational Resources Information Center

    Lowry, Mike; Johnson, Mark

    1999-01-01

    Computer-assisted learning can be an effective medium for undergraduate nursing education, especially through the use of graphics and self-assessment exercises. It also has benefits for patient care and education. (SK)

  16. Classroom Assessment of Computer-Assisted Language Learning: Developing a Strategy for College Faculty.

    ERIC Educational Resources Information Center

    Roman-Odio, Clara; Hartlaub, Bradley A.

    2003-01-01

    Examines trends in computer assisted language learning (CALL) research and postulates strategies for classroom assessment of CALL. Describes a pilot study designed to evaluate a music-based multimedia program. (Author/VWL)

  17. Critical parameters of a noise model that affect fault tolerant quantum computation on a single qubit

    NASA Astrophysics Data System (ADS)

    Iyer, Pavithran; da Silva, Marcus P.; Poulin, David

    In this work, we aim to determine the parameters of a single-qubit channel that can tightly bound the logical error rate of the Steane code. We do not assume any a priori structure for the quantum channel, except that it is a CPTP map, and we use a concatenated Steane code to encode a single qubit. Unlike the standard Monte Carlo technique, which requires many iterations to estimate the logical error rate with sufficient accuracy, we use techniques to compute the complete effect of a physical CPTP map at the logical level. Using this, we have studied the predictive power of several physical noise metrics on the logical error rate and show, through numerical simulations with random quantum channels, that, on their own, none of the natural physical metrics lead to accurate predictions about the logical error rate. We then show how machine learning techniques help us to explore which features of a random quantum channel are important in predicting its logical error rate.
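
    The notions of a random CPTP map and a physical noise metric can be made concrete in a few lines. A minimal Python sketch that draws a random single-qubit channel from a Haar-random isometry (Stinespring dilation) and computes its average gate fidelity; estimating the logical error rate itself would require the concatenated-code machinery the abstract describes, so it is omitted here:

      import numpy as np

      def random_channel(d=2, k=4, rng=np.random.default_rng(7)):
          """Random CPTP map as k Kraus operators on a d-level system."""
          g = rng.normal(size=(d * k, d)) + 1j * rng.normal(size=(d * k, d))
          v, _ = np.linalg.qr(g)                  # isometry: v† v = I_d
          return [v[i * d:(i + 1) * d, :] for i in range(k)]

      def average_gate_fidelity(kraus, d=2):
          """Average fidelity to the identity, from the Kraus representation."""
          s = sum(abs(np.trace(a)) ** 2 for a in kraus)
          return float((s + d) / (d ** 2 + d))

      kraus = random_channel()
      # Sanity check: Kraus operators resolve the identity (trace preservation)
      assert np.allclose(sum(a.conj().T @ a for a in kraus), np.eye(2))
      print("average gate fidelity: %.4f" % average_gate_fidelity(kraus))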

  18. Insights Into Microcirculation Underlying Critical Limb Ischemia by Single-Photon Emission Computed Tomography

    PubMed Central

    Liu, Jung-Tung; Chang, Cheng-Siu; Su, Chen-Hsing; Li, Cho-Shun

    2015-01-01

    Abstract Perfusion difference is used as a parameter to evaluate microcirculation. This study aims to differentiate lower-limb perfusion insufficiency from neuropathy to prevent possible occurrence of failed back surgery syndrome (FBSS). Patients were retrospectively gathered from 134 FBSS cases diagnosed in the past 7 years. A total of 82 cases in which neuralgia had been excluded by radiologic imaging, electrodiagnostic electromyography, and nerve conduction velocity studies were enrolled in this study. Perfusion difference was evaluated by single-photon emission computed tomography, and pain intensities were recorded via visual analog scale (VAS) score. Lower perfusion in the left leg was seen in 51.2% (42 of 82) of the patients. The mean perfusion difference of the 82 patients was 0.86 ± 0.05 (range: 0.75–0.93). Patients with systemic vascular diseases exhibited significantly higher perfusion difference than patients without these related diseases (P < 0.05), except for renal insufficiency (P = 0.134). Significant correlation was observed between perfusion difference and VAS score (r = −0.78; P < 0.0001; n = 82). In this study, we presented perfusion difference as a parameter for evaluating microcirculation, which cannot be detected by ultrasonography or angiography. PMID:26166084

  19. Radiological Assessment of Bioengineered Bone in a Muscle Flap for the Reconstruction of Critical-Size Mandibular Defect

    PubMed Central

    Al-Fotawei, Randa; Ayoub, Ashraf F.; Heath, Neil; Naudi, Kurt B.; Tanner, K. Elizabeth; Dalby, Matthew J.; McMahon, Jeremy

    2014-01-01

    This study presents a comprehensive radiographic evaluation of bone regeneration within a pedicled muscle flap for the reconstruction of a critical-size mandibular defect. The surgical defect (20 mm × 15 mm) was created in the mandible of ten experimental rabbits. The masseter muscle was adapted to fill the surgical defect, and a combination of calcium sulphate/hydroxyapatite cement (CERAMENT™ |SPINE SUPPORT), BMP-7, and rabbit mesenchymal stromal cells (rMSCs) was injected inside the muscle tissue. Radiographic assessment was carried out on the day of surgery and at 4, 8, and 12 weeks postoperatively. At 12 weeks, the animals were sacrificed and cone beam computerized tomography (CBCT) scanning and micro-computed tomography (µ-CT) were carried out. Clinically, a clear layer of bone tissue was identified closely adherent to the border of the surgical defect. Sporadic radio-opaque areas within the surgical defect were detected radiographically. In comparison with the opposite non-operated control side, the estimated quantitative scoring of the radio-opacity was 46.6% ± 15, and the mean volume of the radio-opaque areas was 63.4% ± 20. Areas of a bone density higher than that of the mandibular bone (+35% ± 25%) were detected at the borders of the surgical defect. The micro-CT analysis revealed thinner trabeculae of the regenerated bone with a more condensed trabecular pattern than the surrounding native bone. These findings suggest a rapid deposition rate of the mineralised tissue and an active remodelling process of the newly regenerated bone within the muscle flap. The novel surgical model of this study has potential clinical application; the assessment of bone regeneration using the presented radiographic protocol is descriptive and comprehensive. The findings of this research confirm the remarkable potential of local muscle flaps as local bioreactors to induce bone formation for reconstruction of maxillofacial bony defects. PMID:25226170

  20. Assessment of toxic metals in waste personal computers

    SciTech Connect

    Kolias, Konstantinos; Hahladakis, John N. Gidarakos, Evangelos

    2014-08-15

    Highlights: • Waste personal computers were collected and dismantled into their main parts. • Motherboards, monitors and plastic housing were examined for their metal content. • Concentrations measured were compared to the RoHS Directive, 2002/95/EC. • Pb in motherboards and funnel glass of devices released <2006 was above the limit. • Waste personal computers need to be recycled and managed in an environmentally sound way. - Abstract: Considering the enormous production of waste personal computers nowadays, it is obvious that the study of their composition is necessary in order to regulate their management and prevent any environmental contamination caused by their inappropriate disposal. This study aimed at determining the toxic metals content of motherboards (printed circuit boards), monitor glass and monitor plastic housing of two Cathode Ray Tube (CRT) monitors, three Liquid Crystal Display (LCD) monitors, one LCD touch screen monitor and six motherboards, all of which were discarded. In addition, concentrations of chromium (Cr), cadmium (Cd), lead (Pb) and mercury (Hg) were compared with the respective limits set by the RoHS 2002/95/EC Directive, which was recently renewed by the 2012/19/EU recast, in order to verify manufacturers’ compliance with the regulation. The research included disassembly, pulverization, digestion and chemical analyses of all the aforementioned devices. The toxic metals content of all samples was determined using Inductively Coupled Plasma-Mass Spectrometry (ICP-MS). The results demonstrated that concentrations of Pb in motherboards and funnel glass of devices with release dates before 2006, that is, when the RoHS Directive came into force, exceeded the permissible limit. In general, except for Pb, higher metal concentrations were detected in motherboards in comparison with plastic housing and glass samples. Finally, the results of this work were encouraging, since concentrations of metals referred to in the RoHS Directive were found in
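
    The compliance check in such a study reduces to comparing measured concentrations against the RoHS maximum concentration values for homogeneous materials (0.1% by weight for Pb, Hg, and Cr(VI); 0.01% for Cd). A minimal Python sketch with invented, not measured, concentrations:

      # RoHS maximum concentration values, mg/kg of homogeneous material
      ROHS_LIMITS = {"Pb": 1000.0, "Hg": 1000.0, "Cd": 100.0, "Cr(VI)": 1000.0}

      def check_compliance(sample, measured):
          """Flag metals in one sample that exceed their RoHS limit."""
          for metal, conc in measured.items():
              limit = ROHS_LIMITS[metal]
              status = "EXCEEDS" if conc > limit else "ok"
              print(f"{sample:12s} {metal:6s} {conc:9.1f} / {limit:6.0f} mg/kg  {status}")

      # Hypothetical ICP-MS result for a pre-2006 motherboard
      check_compliance("motherboard", {"Pb": 24500.0, "Cd": 12.3, "Hg": 0.8})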

  1. Strategic computing: a strategic plan for the development of machine-intelligence technology and its application to critical problems in defense

    SciTech Connect

    Conway, L.

    1984-01-01

    To meet the challenge of certain critical problems in defense, the Defense Advanced Research Projects Agency (DARPA) is initiating an important new program in strategic computing. By seizing an opportunity to leverage recent advances in artificial intelligence, computer science, and microelectronics, the agency plans to create a new generation of machine-intelligence technology.

  2. The Identification, Implementation, and Evaluation of Critical User Interface Design Features of Computer-Assisted Instruction Programs in Mathematics for Students with Learning Disabilities

    ERIC Educational Resources Information Center

    Seo, You-Jin; Woo, Honguk

    2010-01-01

    Critical user interface design features of computer-assisted instruction programs in mathematics for students with learning disabilities and corresponding implementation guidelines were identified in this study. Based on the identified features and guidelines, a multimedia computer-assisted instruction program, "Math Explorer", which delivers…

  3. Static and dynamic assessment of myocardial perfusion by computed tomography.

    PubMed

    Danad, Ibrahim; Szymonifka, Jackie; Schulman-Marcus, Joshua; Min, James K

    2016-08-01

    Recent developments in computed tomography (CT) technology have fulfilled the prerequisites for the clinical application of myocardial CT perfusion (CTP) imaging. The evaluation of myocardial perfusion by CT can be achieved by static or dynamic scan acquisitions. Although both approaches have proved clinically feasible, substantial barriers need to be overcome before its routine clinical application. The current review provides an outline of the current status of CTP imaging and also focuses on disparities between static and dynamic CTPs for the evaluation of myocardial blood flow. PMID:27013250

  4. A Structural and Functional Assessment of the Lung via Multidetector-Row Computed Tomography

    PubMed Central

    Hoffman, Eric A.; Simon, Brett A.; McLennan, Geoffrey

    2006-01-01

    With advances in multidetector-row computed tomography (MDCT), it is now possible to image the lung in 10 s or less and accurately extract the lungs, lobes, and airway tree to the fifth- through seventh-generation bronchi and to regionally characterize lung density, texture, ventilation, and perfusion. These methods are now being used to phenotype the lung in health and disease and to gain insights into the etiology of pathologic processes. This article outlines the application of these methodologies with specific emphasis on chronic obstructive pulmonary disease. We demonstrate the use of our methods for assessing regional ventilation and perfusion and demonstrate early data that show, in a sheep model, a regionally intact hypoxic pulmonary vasoconstrictor (HPV) response with an apparent inhibition of HPV regionally in the presence of inflammation. We present the hypothesis that, in subjects with pulmonary emphysema, one major contributing factor leading to parenchymal destruction is the lack of a regional blunting of HPV when the regional hypoxia is related to regional inflammatory events (bronchiolitis or alveolar flooding). If maintaining adequate blood flow to inflamed lung regions is critical to the nondestructive resolution of inflammatory events, the pathologic condition whereby HPV is sustained in regions of inflammation would likely have its greatest effect in the lung apices where blood flow is already reduced in the upright body posture. PMID:16921136

  5. Cognitive Assessment of Movement-Based Computer Games

    NASA Technical Reports Server (NTRS)

    Kearney, Paul

    2008-01-01

    This paper examines the possibility that dance games such as Dance Dance Revolution or StepMania enhance the cognitive abilities that are critical to academic achievement. These games appear to place a high cognitive load on working memory requiring the player to convert a visual signal to a physical movement up to 7 times per second. Players see a pattern of directions displayed on the screen and they memorise these as a dance sequence. Other researchers have found that attention span and memory ability, both cognitive abilities required for academic achievement, are improved through the use of physical movement and exercise. This paper reviews these claims and documents tool development for on-going research by the author.

  6. Benefits and Drawbacks of Computer-Based Assessment and Feedback Systems: Student and Educator Perspectives

    ERIC Educational Resources Information Center

    Debuse, Justin C. W.; Lawley, Meredith

    2016-01-01

    Providing students with high quality feedback is important and can be achieved using computer-based systems. While student and educator perspectives of such systems have been investigated, a comprehensive multidisciplinary study has not yet been undertaken. This study examines student and educator perspectives of a computer-based assessment and…

  7. The Role of Computer Conferencing in Delivery of a Short Course on Assessment of Learning Difficulties.

    ERIC Educational Resources Information Center

    Dwyer, Eamonn

    1991-01-01

    A pilot project at the University of Ulster (Northern Ireland) used the CAUCUS computer conferencing system on the CAMPUS 2000 education network to train teachers to assess young adults with severe learning difficulties. Despite problems with student attrition and system failure, computer conferencing was felt to be a useful medium for providing…

  8. Assessing Affordances of Selected Cloud Computing Tools for Language Teacher Education in Nigeria

    ERIC Educational Resources Information Center

    Ofemile, Abdulmalik Yusuf

    2015-01-01

    This paper reports part of a study that hoped to understand Teacher Educators' (TE) assessment of the affordances of selected cloud computing tools ranked among the top 100 for the year 2010. Research has shown that ICT and by extension cloud computing has positive impacts on daily life and this informed the Nigerian government's policy to…

  9. Performance of a computer-based assessment of cognitive function measures in two cohorts of seniors

    Technology Transfer Automated Retrieval System (TEKTRAN)

    Computer-administered assessment of cognitive function is being increasingly incorporated in clinical trials, however its performance in these settings has not been systematically evaluated. The Seniors Health and Activity Research Program (SHARP) pilot trial (N=73) developed a computer-based tool f...

  10. An Assessment of the Computer Skills of Incoming Freshmen at Two University of Wisconsin Campuses.

    ERIC Educational Resources Information Center

    Smith, Marian A.; Furst-Bowe, Julie A.

    A study was conducted at the University of Wisconsin (UW)-Eau Claire and UW-Stout in 1992 to assess the computer skills of incoming college freshmen. Information about the students' computer skills was obtained through the use of a questionnaire. The questionnaire was distributed to 92 students at UW-Eau Claire and 86 students at UW-Stout.…

  11. Randomised Items in Computer-Based Tests: Russian Roulette in Assessment?

    ERIC Educational Resources Information Center

    Marks, Anthony M.; Cronje, Johannes C.

    2008-01-01

    Computer-based assessments are becoming more commonplace, perhaps as a necessity for faculty to cope with large class sizes. These tests often occur in large computer testing venues in which test security may be compromised. In an attempt to limit the likelihood of cheating in such venues, randomised presentation of items is automatically…

  12. Impacts of Mobile Computing on Student Learning in the University: A Comparison of Course Assessment Data

    ERIC Educational Resources Information Center

    Hawkes, Mark; Hategekimana, Claver

    2010-01-01

    This study focuses on the impact of wireless, mobile computing tools on student assessment outcomes. In a campus-wide wireless, mobile computing environment at an upper Midwest university, an empirical analysis is applied to understand the relationship between student performance and Tablet PC use. An experimental/control group comparison of…

  13. Methods for assessing critical nonroutine mine health and safety skills. Open File Report, October 1984-June 1987

    SciTech Connect

    Cole, H.P.; Berger, P.K.; Vaught, C.; Lacefield, W.G.; Wasielewski, R.D.

    1988-03-01

    A comprehensive review of published research was carried out to identify methods for teaching and assessing critical but nonroutine skills needed for coping with emergency situations. Specific methods for assessing critical skills proficiency in aviation, medicine, organization management, the military, and other industrial/technical workplaces are described. The potential application of these methods for teaching and assessing (1) critical first aid and (2) self-rescue and escape skills to underground coal miners in annual refresher training is explored. Research and development activities that may improve mine health and safety training are suggested. Most of these research and development activities were completed later in the project. The additional work is reported in the project technical report No. 2 and the final report. The research reported in the present document was completed in 1984-85. Many of the initial findings reported in the document are updated in the later project final report.

  14. Assessing wastewater micropollutant loads with approximate Bayesian computations.

    PubMed

    Rieckermann, Jörg; Anta, Jose; Scheidegger, Andreas; Ort, Christoph

    2011-05-15

    Wastewater production, like many other engineered and environmental processes, is inherently stochastic in nature and requires the use of complex stochastic models, for example, to predict realistic patterns of down-the-drain chemicals or pharmaceuticals and personal care products. Up until now, a formal method of statistical inference has been lacking for many of those models, where explicit likelihood functions were intractable. In this article, we investigate Approximate Bayesian Computation (ABC) methods to infer important parameters of stochastic environmental models. ABC methods have been recently suggested to perform model-based inference in a Bayesian setting when model likelihoods are analytically or computationally intractable, and they have not been applied to environmental systems analysis or water quality modeling before. In a case study, we investigate the performance of three different algorithms to infer the number of wastewater pulses contained in three high-resolution data series of benzotriazole and total nitrogen loads in sewers. We find that all algorithms perform well and that the uncertainty in the inferred number of corresponding wastewater pulses varies between 6% and 28%. In our case, the results are more sensitive to substance characteristics than to catchment properties. Although the application of ABC methods requires careful tuning and attention to detail, they have a great general potential to update stochastic model parameters with monitoring data and improve their predictive capabilities. PMID:21504210
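
    The simplest member of the ABC family, rejection sampling, fits in a dozen lines: draw a parameter from the prior, simulate, and keep the draw when the simulated summary statistics land close to the observed ones. A minimal Python sketch for inferring a pulse count, built on a toy pulse simulator rather than the authors' sewer model:

      import numpy as np

      rng = np.random.default_rng(0)
      T = 288                                   # 5-min time steps over one day

      def simulate_loads(n_pulses):
          """Toy forward model: n unit pulses at random times, plus noise."""
          series = np.zeros(T)
          np.add.at(series, rng.integers(0, T, size=n_pulses), 1.0)
          return series + rng.normal(0.0, 0.1, size=T)

      def summary(x):
          return np.array([x.sum(), x.std()])    # crude summary statistics

      observed = simulate_loads(40)              # stand-in for monitoring data
      s_obs = summary(observed)

      # Rejection ABC: keep prior draws whose simulations match the observation
      accepted = [theta for theta in rng.integers(1, 200, size=20000)
                  if np.linalg.norm(summary(simulate_loads(theta)) - s_obs) < 2.5]
      if accepted:
          print("posterior mean pulses: %.1f (accepted %d draws)"
                % (np.mean(accepted), len(accepted)))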

  15. Assessment of metabolic bone diseases by quantitative computed tomography

    SciTech Connect

    Richardson, M.L.; Genant, H.K.; Cann, C.E.; Ettinger, B.; Gordan, G.S.; Kolb, F.O.; Reiser, U.J.

    1985-05-01

    Advances in the radiologic sciences have permitted the development of numerous noninvasive techniques for measuring the mineral content of bone, with varying degrees of precision, accuracy, and sensitivity. The techniques of standard radiography, radiogrammetry, photodensitometry, Compton scattering, neutron activation analysis, single and dual photon absorptiometry, and quantitative computed tomography (QCT) are described and reviewed in depth. Results from previous cross-sectional and longitudinal QCT investigations are given. The authors then describe a current investigation in which they studied 269 subjects, including 173 normal women, 34 patients with hyperparathyroidism, 24 patients with steroid-induced osteoporosis, and 38 men with idiopathic osteoporosis. Spinal quantitative computed tomography, radiogrammetry, and single photon absorptiometry were performed, and a spinal fracture index was calculated for all patients. The authors found a disproportionate loss of spinal trabecular mineral compared to appendicular mineral in the men with idiopathic osteoporosis and the patients with steroid-induced osteoporosis. They observed roughly equivalent mineral loss in both the appendicular and axial regions in the hyperparathyroid patients. The appendicular cortical measurements correlated moderately well with each other but less well with spinal trabecular QCT. The spinal fracture index correlated well with QCT and less well with the appendicular measurements.

  16. Assessing The Impact Of Computed Radiography And PACS

    NASA Astrophysics Data System (ADS)

    Hedgcock, Marcus W.; Kehr, Katherine

    1989-05-01

    Our institution (San Francisco VA Medical Center) is a VA pilot center for total digital imaging and PACS. Quantitative information about PACS impact on health care is limited, because no centers have done rigorous preimplementation studies. We are gathering quantitative service delivery and cost data before, during, and after stepwise implementation of computed radiography and PACS at our institution to define the impact on imaging service delivery. We designed a simple audit method using the x-ray request and time clocks to determine patient waiting time, imaging time, film use, image availability to the radiologist, matching of current with previous images, image availability to clinicians, and time to final interpretation. Our department model is a multichannel, multiserver patient queue. Our current radiograph file is space limited, containing only one year of images; older images are kept in a remote file area in another building. In addition, there are 16 subfile areas within the Radiology Service and the medical center. Our preimplementation audit showed some long waiting times (40 minutes, average 20) and immediate retrieval of prior films in only 42% of cases, with an average retrieval time of 22 hours. Computed radiography and the optical archive have the potential to improve these figures. The audit will be ongoing and automated as implementation of PACS progresses, to measure service improvement and the learning curve with the new equipment. We present the audit format and baseline preimplementation figures.
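
    A multichannel, multiserver queue of this kind can be explored with the textbook Erlang-C formula before any PACS data arrive. A minimal Python sketch, assuming Poisson arrivals and exponential service times; the rates below are invented for illustration, not audit figures:

      from math import factorial

      def erlang_c_wait(lam, mu, c):
          """Mean wait in queue for M/M/c: lam arrivals/hr, mu exams/hr/server."""
          a = lam / mu                           # offered load in Erlangs
          rho = a / c
          assert rho < 1, "queue is unstable"
          top = a ** c / factorial(c)
          denom = (1 - rho) * sum(a ** k / factorial(k) for k in range(c)) + top
          p_wait = top / denom                   # Erlang C: P(arrival must queue)
          return p_wait / (c * mu - lam)         # mean queueing delay, hours

      # Illustrative: 18 patients/hr, 4 rooms, each handling 5 exams/hr
      print("mean wait: %.1f minutes" % (60 * erlang_c_wait(18.0, 5.0, 4)))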

  17. Computed tomography assessment for transcatheter mitral valve interventions.

    PubMed

    Narang, Akhil; Guerrero, Mayra; Feldman, Ted; Pursnani, Amit

    2016-06-01

    Multidetector cardiac computerized tomography (CT) is a robust advanced imaging modality with high spatial resolution that has emerged as an essential tool for the planning of structural heart and electrophysiology interventions. The most notable example has been its important role in the pre-procedural planning of transcatheter aortic valve replacement (TAVR), which has developed to the point that commercial software packages are commonly used for this application. More recently several novel approaches and devices have been developed for transcatheter mitral valve replacement (TMVR). Given the greater complexity of mitral valve anatomy, CT has at least an equally important role for preprocedural planning of TMVR. Similar to TAVR assessment, its utility in TMVR is multi-fold, including assessment of valve and adjacent anatomical structures, determination of accurate annulus dimensions for prosthesis sizing, vascular access planning, and prediction of fluoroscopic angles. PMID:27028331

  18. Towards a dynamic assessment of raw materials criticality: linking agent-based demand with material flow supply modelling approaches.

    PubMed

    Knoeri, Christof; Wäger, Patrick A; Stamp, Anna; Althaus, Hans-Joerg; Weil, Marcel

    2013-09-01

    Emerging technologies such as information and communication-, photovoltaic- or battery technologies are expected to increase significantly the demand for scarce metals in the near future. The recently developed methods to evaluate the criticality of mineral raw materials typically provide a 'snapshot' of the criticality of a certain material at one point in time by using static indicators both for supply risk and for the impacts of supply restrictions. While allowing for insights into the mechanisms behind the criticality of raw materials, these methods cannot account for dynamic changes in products and/or activities over time. In this paper we propose a conceptual framework intended to overcome these limitations by including the dynamic interactions between different possible demand and supply configurations. The framework integrates an agent-based behaviour model, where demand emerges from individual agent decisions and interaction, into a dynamic material flow model, representing the materials' stocks and flows. Within the framework, the environmental implications of substitution decisions are evaluated by applying life-cycle assessment methodology. The approach makes a first step towards a dynamic criticality assessment and will enhance the understanding of industrial substitution decisions and environmental implications related to critical metals. We discuss the potential and limitation of such an approach in contrast to state-of-the-art methods and how it might lead to criticality assessments tailored to the specific circumstances of single industrial sectors or individual companies. PMID:23453658
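
    The proposed coupling can be caricatured in a few lines: agents decide each year whether to substitute away from a critical metal, and their aggregate demand drives a simple stock-and-flow balance with recycling. A deliberately crude Python sketch with invented parameters, far from the calibrated framework the abstract proposes:

      import random

      random.seed(1)
      N_FIRMS, YEARS = 100, 10
      PRIMARY_SUPPLY = 90.0        # tonnes/yr of primary metal (invented)
      DEMAND_PER_FIRM = 1.0        # tonnes/yr per firm still using the metal
      stock_in_use, users = 0.0, N_FIRMS

      for year in range(YEARS):
          demand = users * DEMAND_PER_FIRM
          supplied = min(demand, PRIMARY_SUPPLY + 0.2 * stock_in_use)  # + recycling
          stock_in_use += supplied              # material flow: in-use stock grows
          shortage = demand - supplied
          # Agent decisions: substitution probability rises with the shortage
          p_sub = min(0.5, shortage / demand) if demand else 0.0
          users -= sum(1 for _ in range(users) if random.random() < p_sub)
          print(f"year {year}: demand={demand:.0f} t, "
                f"supplied={supplied:.0f} t, users={users}")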

  19. High Performance Computing Facility Operational Assessment, FY 2010 Oak Ridge Leadership Computing Facility

    SciTech Connect

    Bland, Arthur S Buddy; Hack, James J; Baker, Ann E; Barker, Ashley D; Boudwin, Kathlyn J.; Kendall, Ricky A; Messer, Bronson; Rogers, James H; Shipman, Galen M; White, Julia C

    2010-08-01

    Oak Ridge National Laboratory's (ORNL's) Cray XT5 supercomputer, Jaguar, kicked off the era of petascale scientific computing in 2008 with applications that sustained more than a thousand trillion floating point calculations per second - or 1 petaflop. Jaguar continues to grow even more powerful as it helps researchers broaden the boundaries of knowledge in virtually every domain of computational science, including weather and climate, nuclear energy, geosciences, combustion, bioenergy, fusion, and materials science. Their insights promise to broaden our knowledge in areas that are vitally important to the Department of Energy (DOE) and the nation as a whole, particularly energy assurance and climate change. The science of the 21st century, however, will demand further revolutions in computing, supercomputers capable of a million trillion calculations a second - 1 exaflop - and beyond. These systems will allow investigators to continue attacking global challenges through modeling and simulation and to unravel longstanding scientific questions. Creating such systems will also require new approaches to daunting challenges. High-performance systems of the future will need to be codesigned for scientific and engineering applications with best-in-class communications networks and data-management infrastructures and teams of skilled researchers able to take full advantage of these new resources. The Oak Ridge Leadership Computing Facility (OLCF) provides the nation's most powerful open resource for capability computing, with a sustainable path that will maintain and extend national leadership for DOE's Office of Science (SC). The OLCF has engaged a world-class team to support petascale science and to take a dramatic step forward, fielding new capabilities for high-end science. This report highlights the successful delivery and operation of a petascale system and shows how the OLCF fosters application development teams, developing cutting-edge tools and resources for next

  20. The Interactive Media Package for Assessment of Communication and Critical Thinking (IMPACCT[c]): Testing a Programmatic Online Communication Competence Assessment System

    ERIC Educational Resources Information Center

    Spitzberg, Brian H.

    2011-01-01

    IMPACCT is an online survey covering over 40 self-report types of student communication competency, as well as a test of critical thinking based on cognitive problem-solving. The student nominates two peers who rate the student's interpersonal, computer-mediated, group and leadership, and public speaking communication competence. The student takes…

  1. A Critical Assessment of Hygroscopic Seeding of Convective Clouds for Rainfall Enhancement.

    NASA Astrophysics Data System (ADS)

    Silverman, Bernard A.

    2003-09-01

    During the past decade, statistically positive results have been reported for four major, randomized hygroscopic seeding experiments, each in a different part of the world. Experiments on cold convective clouds using hygroscopic flares were carried out in South Africa and Mexico. Experiments on warm convective clouds using hygroscopic particles were carried out in Thailand and India. The scientific evidence for enhancing rainfall from convective clouds by hygroscopic seeding from these four randomized experiments is examined and critically assessed. The assessment uses, as a measure of proof of concept, the criteria for success of any cloud seeding activity that were recommended in the Scientific Background for the 1998 AMS Policy Statement on Planned and Inadvertent Weather Modification, criteria that required both statistical and physical evidence. Based on a critical examination of the results of these four major, randomized hygroscopic seeding experiments, it has been concluded that they have not yet provided either the statistical or physical evidence required to establish that the effectiveness of hygroscopic seeding of convective clouds to increase precipitation is scientifically proven. The impressive statistical results from these experiments must be viewed with caution because, according to the proof-of-concept criteria, credibility of the results depends on the physical plausibility of the seeding conceptual model that forms the basis for anticipating seeding-induced increases in rainfall. The credibility of the hygroscopic seeding for microphysical effects hypothesis has been seriously undermined because it cannot explain the magnitude and timing of the statistically significant increases in precipitation that were observed. Theories suggesting that the microphysical effects of seeding enhance downdraft circulations to produce longer-lived clouds have been advanced; however, in the absence of any supporting physical or model evidence, they must be

  2. 48 CFR 52.223-16 - IEEE 1680 Standard for the Environmental Assessment of Personal Computer Products.

    Code of Federal Regulations, 2012 CFR

    2012-10-01

    ... Environmental Assessment of Personal Computer Products. 52.223-16 Section 52.223-16 Federal Acquisition... Assessment of Personal Computer Products. As prescribed in 23.705(b)(1), insert the following clause: IEEE 1680 Standard for the Environmental Assessment of Personal Computer Products (DEC 2007) (a)...

  3. 48 CFR 52.223-16 - IEEE 1680 Standard for the Environmental Assessment of Personal Computer Products.

    Code of Federal Regulations, 2011 CFR

    2011-10-01

    ... Environmental Assessment of Personal Computer Products. 52.223-16 Section 52.223-16 Federal Acquisition... Assessment of Personal Computer Products. As prescribed in 23.705(b)(1), insert the following clause: IEEE 1680 Standard for the Environmental Assessment of Personal Computer Products (DEC 2007) (a)...

  4. 48 CFR 52.223-16 - IEEE 1680 Standard for the Environmental Assessment of Personal Computer Products.

    Code of Federal Regulations, 2013 CFR

    2013-10-01

    ... Environmental Assessment of Personal Computer Products. 52.223-16 Section 52.223-16 Federal Acquisition... Assessment of Personal Computer Products. As prescribed in 23.705(b)(1), insert the following clause: IEEE 1680 Standard for the Environmental Assessment of Personal Computer Products (DEC 2007) (a)...

  5. 48 CFR 52.223-16 - IEEE 1680 Standard for the Environmental Assessment of Personal Computer Products.

    Code of Federal Regulations, 2010 CFR

    2010-10-01

    ... Environmental Assessment of Personal Computer Products. 52.223-16 Section 52.223-16 Federal Acquisition... Assessment of Personal Computer Products. As prescribed in 23.706(b)(1), insert the following clause: IEEE 1680 Standard for the Environmental Assessment of Personal Computer Products (DEC 2007) (a)...

  6. The critical role of culture and environment as determinants of women's participation in computer science

    NASA Astrophysics Data System (ADS)

    Frieze, Carol

    This thesis proposes the need for, and illustrates, a new approach to how we think about, and act on, issues relating to women's participation, or lack of participation, in computer science (CS). This approach is based on a cultural perspective arguing that many of the reasons for women entering, or not entering, CS programs have little to do with gender and a lot to do with environment and culture. Evidence for this approach comes primarily from a qualitative research study, which shows the effects of changes in the micro-culture on CS undergraduates at Carnegie Mellon, and from studies of other cultural contexts that illustrate a "Women-CS fit". We also discuss the interventions that have been crucial to the evolution of this specific micro-culture. Our argument goes against the grain of many gender and CS studies which conclude that the reasons for women's low participation in CS are based in gender, and particularly in gender differences in how men and women relate to the field. Such studies tend to focus on gender differences and recommend accommodating (what are perceived to be) women's different ways of relating to CS. This is often interpreted as contextualizing the curriculum to make it "female-friendly". The CS curriculum at Carnegie Mellon was not contextualized to be "female-friendly". Nevertheless, over the past few years, the school has attracted and graduated well above the US national average for women in undergraduate CS programs. We argue that this is due in large part to changes in the culture and environment of the department. As the environment has shifted from an unbalanced to a more balanced environment (balanced in terms of gender, breadth of student personalities, and professional support for women), the way has been opened for a range of students, including a significant number of women, to participate, and be successful, in the CS major. Our research shows that as men and women inhabit, and participate in, a more balanced environment

  7. Single-molecule protein sequencing through fingerprinting: computational assessment

    NASA Astrophysics Data System (ADS)

    Yao, Yao; Docter, Margreet; van Ginkel, Jetty; de Ridder, Dick; Joo, Chirlmin

    2015-10-01

    Proteins are vital in all biological systems as they constitute the main structural and functional components of cells. Recent advances in mass spectrometry have brought the promise of complete proteomics by helping draft the human proteome. Yet, this commonly used protein sequencing technique has fundamental limitations in sensitivity. Here we propose a method for single-molecule (SM) protein sequencing. A major challenge lies in the fact that proteins are composed of 20 different amino acids, which demands 20 molecular reporters. We computationally demonstrate that it suffices to measure only two types of amino acids to identify proteins and suggest an experimental scheme using SM fluorescence. When achieved, this highly sensitive approach will result in a paradigm shift in proteomics, with major impact in the biological and medical sciences.
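
    The identification idea, collapsing each protein to the ordered pattern of just two labeled amino-acid types, is easy to sketch. A minimal Python example, assuming cysteine (C) and lysine (K) as the two reporters and toy sequences in place of a real proteome:

      def fingerprint(seq, residues=("C", "K")):
          """Ordered pattern of only the labeled residues, e.g. 'KCKC'."""
          return "".join(aa for aa in seq if aa in residues)

      # Toy "proteome"; an actual assessment would scan a full reference proteome
      proteome = {
          "protA": "MKTACILLKAVNCED",
          "protB": "MGGACDEKKLMPR",
          "protC": "MKKLACRRCEE",
      }

      prints = {name: fingerprint(s) for name, s in proteome.items()}
      print(prints)
      print("all uniquely identified:", len(set(prints.values())) == len(prints))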

  8. Assessment of Stirling Technology Has Provided Critical Data Leading Toward Flight Readiness of the Stirling Converter

    NASA Technical Reports Server (NTRS)

    Thieme, Lanny G.

    2001-01-01

    The NASA Glenn Research Center is supporting the development of a Stirling converter with the Department of Energy (DOE, Germantown, Maryland) for an advanced Stirling Radioisotope Power System (SRPS) to provide spacecraft onboard electric power for NASA space science missions. A key technology assessment completed by Glenn and DOE has led to the SRPS being identified as a high-efficiency power source for such deep space missions as the Europa Orbiter and the Solar Probe. In addition, the Stirling system is now being considered for unmanned Mars rovers, especially where mission profiles may exclude the use of photovoltaic power systems, such as exploration at high Martian latitudes or for missions of long duration. The SRPS efficiency of over 20 percent will reduce the required amount of radioisotope by more than a factor of 3 in comparison to current radioisotope thermoelectric generators. This significantly reduces radioisotope cost, radiological inventory, and system cost, and it provides efficient use of scarce radioisotope resources. In support of this technology assessment, Glenn conducted a series of independent evaluations and tests to determine the technology readiness of a 55-We Stirling converter developed by Stirling Technology Company (Kennewick, Washington) and DOE. Key areas evaluated by Glenn included: (1) radiation tolerance of materials; (2) random vibration testing of the Stirling converter in Glenn's Structural Dynamics Lab to simulate operation in the launch environment; (3) electromagnetic interference and compatibility (EMI/EMC) of the converter operating in Glenn's EMI lab; (4) independent failure modes, effects, and criticality analysis, and life and reliability assessment; and (5) SRPS cost estimate. The data from these evaluations were presented to NASA Headquarters and the Jet Propulsion Laboratory mission office by a joint industry/Government team

  9. Teaching and Assessing Critical Thinking Skills for Argument Analysis in Psychology

    ERIC Educational Resources Information Center

    Bensley, D. Alan; Crowe, Deborah S.; Bernhardt, Paul; Buckner, Camille; Allman, Amanda L.

    2010-01-01

    Critical thinking is a valued educational outcome; however, little is known about whether psychology courses, especially ones such as research methods courses that might be expected to promote critical thinking skills, actually improve them. We compared the acquisition of critical thinking skills for analyzing psychological arguments in 3 groups…

  10. Assessing the Development of Critical Language Awareness in a Foreign Language Environment.

    ERIC Educational Resources Information Center

    Zinkgraf, Magdalena

    Critical language awareness refers to how conscious people are of the ideologies hidden in the language. A study was carried out to determine whether such a critical perspective towards text could be developed in an English-as-a-Foreign-Language (EFL) context. This paper evaluates the results of the application of methods of critical discourse…

  11. Research to Practice: Testing a Tool for Assessing Critical Thinking in Art Museum Programs

    ERIC Educational Resources Information Center

    Luke, Jessica J.; Stein, Jill; Foutz, Susan; Adams, Marianna

    2007-01-01

    Many art museum programs aim to facilitate the development of young people's critical-thinking skills, but most are unclear in their definitions of critical thinking and their notions of how it is best facilitated. This article shares a diagnostic tool for identifying instances of critical thinking in art museum programs. Specifically, the authors…

  12. Critical assessment of extracellular polymeric substances extraction methods from mixed culture biomass.

    PubMed

    Pellicer-Nàcher, Carles; Domingo-Félez, Carlos; Mutlu, A Gizem; Smets, Barth F

    2013-10-01

    Extracellular polymeric substances (EPS) have a presumed determinant role in the structure, architecture, strength, filterability, and settling behaviour of microbial solids in biological wastewater treatment processes. Consequently, numerous EPS extraction protocols have recently been published that aim to optimize the trade-off between high EPS recovery and low cell lysis. Despite extensive efforts, the obtained results are often contradictory, even when analysing similar biomass samples and using similar experimental conditions, which greatly complicates the selection of an extraction protocol. This study presents a rigorous and critical assessment of existing physical and chemical EPS extraction methods applied to mixed-culture biomass samples (nitrifying, nitritation-anammox, and activated sludge biomass). A novel fluorescence-based method was developed and calibrated to quantify the lysis potential of different EPS extraction protocols. We concluded that commonly used methods to assess cell lysis (DNA concentrations or G6PDH activities in EPS extracts) do not correlate with cell viability. Furthermore, we discovered that the presence of certain chemicals in EPS extracts results in severe underestimation of protein and carbohydrate concentrations by standard analytical methods. Keeping both maximum EPS extraction yield and minimal biomass lysis as criteria, a sonication-based extraction method was identified as the best for determining and comparing tightly-bound EPS fractions in different biomass samples. Protein was consistently the main EPS component in all analysed samples. However, EPS from nitrifying enrichments was richer in DNA, the activated sludge EPS had a higher content of humic acids and carbohydrates, and the nitritation-anammox EPS, while similar in composition to the nitrifier EPS, had a lower fraction of hydrophobic biopolymers. In general, the easily-extractable EPS fraction was more abundant in carbohydrates and humic substances, while

  13. Conceptualizing learning for sustainability through environmental assessment: critical reflections on 15 years of research

    SciTech Connect

    Sinclair, A. John Diduck, Alan Fitzpatrick, Patricia

    2008-10-15

    Numerous scholars are now directing their attention to the education and learning implications of participatory resource and environmental governance because of their potential for generating the social mobilization necessary to achieve sustainability trajectories. Our work, and that of other researchers, establishes that public participation in environmental assessment (EA) provides fertile ground for considering the intricacies of governance as they relate to participation, and for examining the education and learning implications of participation. Since EA law requires in many cases that public voices be part of the decision process, it has resulted in the creation of fascinating, state-sanctioned, deliberative spaces for civic interactions. Our purpose here is to share, and build upon, a framework that conceptualizes the relationships among participation, education, learning and sustainability in an EA context. We do so by considering findings from studies we have undertaken on participation in EA in Canada since the early 1990s. Our approach was interactive and collaborative. We each considered in detail the key results of our earlier work as they relate to education, learning and EA process design. The findings illuminate aspects of the conceptual framework for which there is considerable empirical evidence, such as the link between meaningful participation and critical education and the diversity of individual learning outcomes associated with public participation in EA. The findings also highlight those parts of the framework for which the empirical evidence is relatively sparse, such as the range of possible social learning outcomes, their congruence with sustainability criteria, and the roles of monitoring and cumulative and strategic assessments in shaping EA into an adaptive, learning system.

  14. Quantitative computed tomography for spinal mineral assessment: current status

    NASA Technical Reports Server (NTRS)

    Genant, H. K.; Cann, C. E.; Ettinger, B.; Gordan, G. S.; Kolb, F. O.; Reiser, U.; Arnaud, C. D.

    1985-01-01

    Quantitative CT (QCT) is an established method for the noninvasive assessment of bone mineral content in the vertebral spongiosum and other anatomic locations. The potential strengths of QCT relative to dual photon absorptiometry (DPA) are its capability for precise three-dimensional anatomic localization providing a direct density measurement and its capability for spatial separation of highly responsive cancellous bone from less responsive cortical bone. The extraction of this quantitative information from the CT image, however, requires sophisticated calibration and positioning techniques and careful technical monitoring.

  15. eLearning to facilitate the education and implementation of the Chelsea Critical Care Physical Assessment: a novel measure of function in critical illness

    PubMed Central

    Corner, Evelyn J; Handy, Jonathan M; Brett, Stephen J

    2016-01-01

    Objective To evaluate the efficacy of eLearning in the widespread standardised teaching, distribution and implementation of the Chelsea Critical Care Physical Assessment (CPAx) tool—a validated tool to assess physical function in critically ill patients. Design Prospective educational study. An eLearning module was developed through a conceptual framework, using the four-stage technique for skills teaching to teach clinicians how to use the CPAx. Example and test video case studies of CPAx assessments were embedded within the module. The CPAx scores for the test case studies and demographic data were recorded in a secure area of the website. Data were analysed for inter-rater reliability using intraclass correlation coefficients (ICCs) to see if an eLearning educational package facilitated consistent use of the tool. A utility and content validity questionnaire was distributed after 1 year to eLearning module registrants (n=971). This was to evaluate uptake of the CPAx in clinical practice and content validity of the CPAx from the perspective of clinical users. Setting The module was distributed for use via professional forums (n=2) and direct contacts (n=95). Participants Critical care clinicians. Primary outcome measure ICC of the test case studies. Results Between July and October 2014, 421 candidates from 15 countries registered for the eLearning module. The ICC for case one was 0.996 (95% CI 0.990 to 0.999; n=207). The ICC for case two was 0.988 (0.996 to 1.000; n=184). The CPAx has a strong total scale content validity index (s-CVI) of 0.94 and is well used. Conclusions eLearning is a useful and reliable way of teaching psychomotor skills, such as the CPAx. The CPAx is a well-used measure with high content validity rated by clinicians. PMID:27067895

  16. A critical assessment of UH-60 main rotor blade airfoil data

    NASA Technical Reports Server (NTRS)

    Totah, Joseph

    1993-01-01

    Many current comprehensive rotorcraft analyses employ lifting-line methods that require main rotor blade airfoil data, typically obtained from wind tunnel tests. In order to effectively evaluate these lifting-line methods, it is of the utmost importance to ensure that the airfoil section data are free of inaccuracies. A critical assessment of the SC1095 and SC1094R8 airfoil data used on the UH-60 main rotor blade was performed for that reason. Nine sources of wind tunnel data were examined, all of which contain SC1095 data and four of which also contain SC1094R8 data. Findings indicate that the most accurate data were generated in 1982 at the 11-Foot Wind Tunnel Facility at NASA Ames Research Center and in 1985 at the 6-inch by 22-inch transonic wind tunnel facility at Ohio State University. It has not been determined if data from these two sources are sufficiently accurate for their use in comprehensive rotorcraft analytical models of the UH-60. It is recommended that new airfoil tables be created for both airfoils using the existing data. Additional wind tunnel experimentation is also recommended to provide high quality data for correlation with these new airfoil tables.

  17. A Critical Assessment of the Effects of Bt Transgenic Plants on Parasitoids

    PubMed Central

    Chen, Mao; Zhao, Jian-Zhou; Collins, Hilda L.; Earle, Elizabeth D.; Cao, Jun; Shelton, Anthony M.

    2008-01-01

    The ecological safety of transgenic insecticidal plants expressing crystal proteins (Cry toxins) from the bacterium Bacillus thuringiensis (Bt) continues to be debated. Much of the debate has focused on nontarget organisms, especially predators and parasitoids that help control populations of pest insects in many crops. Although many studies have been conducted on predators, few reports have examined parasitoids, and some of those have reported negative impacts. None of the previous reports were able to clearly characterize the cause of the negative impact. In order to provide a critical assessment, we used a novel paradigm consisting of a strain of the insect pest, Plutella xylostella (herbivore), resistant to Cry1C and allowed it to feed on Bt plants and then become parasitized by Diadegma insulare, an important endoparasitoid of P. xylostella. Our results indicated that the parasitoid was exposed to a biologically active form of the Cry1C protein while in the host but was not harmed by such exposure. Parallel studies conducted with several commonly used insecticides indicated they significantly reduced parasitism rates on strains of P. xylostella resistant to these insecticides. These results provide the first clear evidence of the lack of hazard to a parasitoid by a Bt plant, compared to traditional insecticides, and describe a test to rigorously evaluate the risks Bt plants pose to predators and parasitoids. PMID:18523682

  18. A critical assessment of the ecological assumptions underpinning compensatory mitigation of salmon-derived nutrients

    USGS Publications Warehouse

    Collins, Scott F.; Marcarelli, Amy M.; Baxter, Colden V.; Wipfli, Mark S.

    2015-01-01

    We critically evaluate some of the key ecological assumptions underpinning the use of nutrient replacement as a means of recovering salmon populations and a range of other organisms thought to be linked to productive salmon runs. These assumptions include: (1) nutrient mitigation mimics the ecological roles of salmon, (2) mitigation is needed to replace salmon-derived nutrients and stimulate primary and invertebrate production in streams, and (3) food resources in rearing habitats limit populations of salmon and resident fishes. First, we call into question assumption one because an array of evidence points to the multi-faceted role played by spawning salmon, including disturbance via redd-building, nutrient recycling by live fish, and consumption by terrestrial consumers. Second, we show that assumption two may require qualification based upon a more complete understanding of nutrient cycling and productivity in streams. Third, we evaluate the empirical evidence supporting food limitation of fish populations and conclude it has been only weakly tested. On the basis of this assessment, we urge caution in the application of nutrient mitigation as a management tool. Although applications of nutrients and other materials intended to mitigate for lost or diminished runs of Pacific salmon may trigger ecological responses within treated ecosystems, contributions of these activities toward actual mitigation may be limited.

  19. A Critical Assessment of the Ecological Assumptions Underpinning Compensatory Mitigation of Salmon-Derived Nutrients

    NASA Astrophysics Data System (ADS)

    Collins, Scott F.; Marcarelli, Amy M.; Baxter, Colden V.; Wipfli, Mark S.

    2015-09-01

    We critically evaluate some of the key ecological assumptions underpinning the use of nutrient replacement as a means of recovering salmon populations and a range of other organisms thought to be linked to productive salmon runs. These assumptions include: (1) nutrient mitigation mimics the ecological roles of salmon, (2) mitigation is needed to replace salmon-derived nutrients and stimulate primary and invertebrate production in streams, and (3) food resources in rearing habitats limit populations of salmon and resident fishes. First, we call into question assumption one because an array of evidence points to the multi-faceted role played by spawning salmon, including disturbance via redd-building, nutrient recycling by live fish, and consumption by terrestrial consumers. Second, we show that assumption two may require qualification based upon a more complete understanding of nutrient cycling and productivity in streams. Third, we evaluate the empirical evidence supporting food limitation of fish populations and conclude it has been only weakly tested. On the basis of this assessment, we urge caution in the application of nutrient mitigation as a management tool. Although applications of nutrients and other materials intended to mitigate for lost or diminished runs of Pacific salmon may trigger ecological responses within treated ecosystems, contributions of these activities toward actual mitigation may be limited.

  20. A Critical Assessment of the Ecological Assumptions Underpinning Compensatory Mitigation of Salmon-Derived Nutrients.

    PubMed

    Collins, Scott F; Marcarelli, Amy M; Baxter, Colden V; Wipfli, Mark S

    2015-09-01

    We critically evaluate some of the key ecological assumptions underpinning the use of nutrient replacement as a means of recovering salmon populations and a range of other organisms thought to be linked to productive salmon runs. These assumptions include: (1) nutrient mitigation mimics the ecological roles of salmon, (2) mitigation is needed to replace salmon-derived nutrients and stimulate primary and invertebrate production in streams, and (3) food resources in rearing habitats limit populations of salmon and resident fishes. First, we call into question assumption one because an array of evidence points to the multi-faceted role played by spawning salmon, including disturbance via redd-building, nutrient recycling by live fish, and consumption by terrestrial consumers. Second, we show that assumption two may require qualification based upon a more complete understanding of nutrient cycling and productivity in streams. Third, we evaluate the empirical evidence supporting food limitation of fish populations and conclude it has been only weakly tested. On the basis of this assessment, we urge caution in the application of nutrient mitigation as a management tool. Although applications of nutrients and other materials intended to mitigate for lost or diminished runs of Pacific salmon may trigger ecological responses within treated ecosystems, contributions of these activities toward actual mitigation may be limited. PMID:25968140

  1. Real-time threat assessment for critical infrastructure protection: data incest and conflict in evidential reasoning

    NASA Astrophysics Data System (ADS)

    Brandon, R.; Page, S.; Varndell, J.

    2012-06-01

    This paper presents a novel application of Evidential Reasoning to Threat Assessment for critical infrastructure protection. A fusion algorithm based on the PCR5 Dezert-Smarandache fusion rule is proposed which fuses alerts generated by a vision-based behaviour analysis algorithm and a priori watch-list intelligence data. The fusion algorithm produces a prioritised event list according to a user-defined set of event-type severity or priority weightings. Results generated from application of the algorithm to real data and Behaviour Analysis alerts captured at London's Heathrow Airport under the EU FP7 SAMURAI programme are presented. A web-based demonstrator system is also described which implements the fusion process in real-time. It is shown that this system significantly reduces the data deluge problem, and directs the user's attention to the most pertinent alerts, enhancing their Situational Awareness (SA). The end-user is also able to alter the perceived importance of different event types in real-time, allowing the system to adapt rapidly to changes in priorities as the situation evolves. One of the key challenges associated with fusing information deriving from intelligence data is the issue of Data Incest. Techniques for handling Data Incest within Evidential Reasoning frameworks are proposed, and comparisons are drawn with respect to Data Incest management techniques that are commonly employed within Bayesian fusion frameworks (e.g. Covariance Intersection). The challenges associated with simultaneously dealing with conflicting information and Data Incest in Evidential Reasoning frameworks are also discussed.
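    For readers unfamiliar with the fusion rule named above, the following minimal sketch implements two-source PCR5 combination over a toy frame of discernment: conjunctive (non-conflicting) mass is accumulated as usual, while conflicting mass is redistributed back to the two conflicting focal elements in proportion to the masses that generated the conflict. The basic belief assignments and event labels are illustrative assumptions; the actual SAMURAI fusion engine is not reproduced here.

    ```python
    # Minimal two-source PCR5 fusion sketch (Dezert-Smarandache).
    from itertools import product

    def pcr5(m1, m2):
        """Fuse two BBAs (dict: frozenset -> mass) with the PCR5 rule."""
        fused = {}
        for (a, xa), (b, yb) in product(m1.items(), m2.items()):
            inter = a & b
            if inter:
                # Non-conflicting product mass goes to the intersection.
                fused[inter] = fused.get(inter, 0.0) + xa * yb
            elif xa + yb > 0:
                # Conflicting mass is split back to A and B proportionally.
                fused[a] = fused.get(a, 0.0) + xa ** 2 * yb / (xa + yb)
                fused[b] = fused.get(b, 0.0) + yb ** 2 * xa / (xa + yb)
        return fused

    # Toy frame: two partially conflicting sources about one tracked event.
    A, B = frozenset({"alert"}), frozenset({"benign"})
    m_vision = {A: 0.7, B: 0.3}   # vision-based behaviour analysis
    m_intel = {A: 0.4, B: 0.6}    # a priori watch-list intelligence
    for focal, mass in pcr5(m_vision, m_intel).items():
        print(set(focal), round(mass, 4))   # masses still sum to 1.0
    ```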

  2. A critical assessment of UH-60 main rotor blade airfoil data

    NASA Technical Reports Server (NTRS)

    Totah, Joseph

    1993-01-01

    Many current comprehensive rotorcraft analyses employ lifting-line methods that require main rotor blade airfoil data, typically obtained from wind tunnel tests. In order to effectively evaluate these lifting-line methods, it is of the utmost importance to ensure that the airfoil section data are free of inaccuracies. A critical assessment of the SC1095 and SC1094R8 airfoil data used on the UH-60 main rotor blade was performed for that reason. Nine sources of wind tunnel data were examined, all of which contain SC1095 data and four of which also contain SC1094R8 data. Findings indicate that the most accurate data were generated in 1982 at the 11-Foot Wind Tunnel Facility at NASA Ames Research Center and in 1985 at the 6-inch-by-22-inch transonic wind tunnel facility at Ohio State University. It has not been determined if data from these two sources are sufficiently accurate for their use in comprehensive rotorcraft analytical models of the UH-60. It is recommended that new airfoil tables be created for both airfoils using the existing data. Additional wind tunnel experimentation is also recommended to provide high quality data for correlation with these new airfoil tables.

  3. Interactive Computer Based Assessment Tasks: How Problem-Solving Process Data Can Inform Instruction

    ERIC Educational Resources Information Center

    Zoanetti, Nathan

    2010-01-01

    This article presents key steps in the design and analysis of a computer based problem-solving assessment featuring interactive tasks. The purpose of the assessment is to support targeted instruction for students by diagnosing strengths and weaknesses at different stages of problem-solving. The first focus of this article is the task piloting…

  4. Formative Computer-Based Assessment in Higher Education: The Effectiveness of Feedback in Supporting Student Learning

    ERIC Educational Resources Information Center

    Miller, Tess

    2009-01-01

    A formative computer-based assessment (CBA) was one of three instruments used for assessment in a Bachelor of Education course at Queen's University (Ontario, Canada) with an enrolment of approximately 700 students. The formative framework fostered a self-regulated learning environment whereby feedback on the CBA was used to support rather than…

  5. Synchronous Computer-Mediated Dynamic Assessment: A Case Study of L2 Spanish Past Narration

    ERIC Educational Resources Information Center

    Darhower, Mark Anthony

    2014-01-01

    In this study, dynamic assessment is employed to help understand the developmental processes of two university Spanish learners as they produce a series of past narrations in a synchronous computer-mediated environment. The assessments were conducted in six weekly one-hour chat sessions about various scenes of a Spanish language film. The analysis…

  6. Supporting Student Learning: The Use of Computer-Based Formative Assessment Modules.

    ERIC Educational Resources Information Center

    Peat, Mary; Franklin, Sue

    2002-01-01

    Describes the development of a variety of computer-based assessment opportunities, both formative and summative, that are available to a large first-year biology class at the University of Sydney (Australia). Discusses online access to weekly quizzes, a mock exam, and special self-assessment modules that are beneficial to student learning.…

  7. Staff and Student Perceptions of Computer-Assisted Assessment for Physiology Practical Classes

    ERIC Educational Resources Information Center

    Sheader, Elizabeth; Gouldsborough, Ingrid; Grady, Ruth

    2006-01-01

    Effective assessment of laboratory practicals is a challenge for large-size classes. To reduce the administrative burden of staff members without compromising the student learning experience, we utilized dedicated computer software for short-answer question assessment for nearly 300 students and compared it with the more traditional, paper-based…

  8. Computational Modeling and Assessment Of Nanocoatings for Ultra Supercritical Boilers

    SciTech Connect

    David W. Gandy; John P. Shingledecker

    2011-04-11

    Forced outages and boiler unavailability in conventional coal-fired fossil power plants are most often caused by fireside corrosion of boiler waterwalls. Industry-wide, the rate of wall thickness corrosion wastage of fireside waterwalls in fossil-fired boilers has been of concern for many years. It is significant that the introduction of nitrogen oxide (NOx) emission controls with staged burner systems has increased reported waterwall wastage rates to as much as 120 mils (3 mm) per year. Moreover, the reducing environment produced by the low-NOx combustion process is the primary cause of accelerated corrosion rates of waterwall tubes made of carbon and low alloy steels. Improved coatings, such as the MCrAl nanocoatings evaluated here (where M is Fe, Ni, and Co), are needed to reduce or eliminate waterwall damage in subcritical, supercritical, and ultra-supercritical (USC) boilers. The first two tasks of this six-task project, jointly sponsored by EPRI and the U.S. Department of Energy (DE-FC26-07NT43096), focused on computational modeling of an advanced MCrAl nanocoating system and evaluation of two nanocrystalline (iron and nickel base) coatings, which will significantly improve the corrosion and erosion performance of tubing used in USC boilers. The computational model results showed that about 40 wt.% Ni is required in Fe-based nanocrystalline coatings for long-term durability, leading to a coating composition of Fe-25Cr-40Ni-10 wt.% Al. In addition, the long-term thermal exposure test results showed accelerated inward diffusion of Al from the nanocrystalline coatings into the substrate. In order to enhance the durability of these coatings, it is necessary to develop a diffusion barrier interlayer coating such as TiN and/or AlN. The third task of the project, 'Process Advanced MCrAl Nanocoating Systems', has focused on processing of

  9. A Comparative Assessment of Computer Literacy of Private and Public Secondary School Students in Lagos State, Nigeria

    ERIC Educational Resources Information Center

    Osunwusi, Adeyinka Olumuyiwa; Abifarin, Michael Segun

    2013-01-01

    The aim of this study was to conduct a comparative assessment of computer literacy of private and public secondary school students. Although the definition of computer literacy varies widely, this study treated computer literacy in terms of access to, and use of, computers and the internet, basic knowledge and skills required to use computers and…

  10. Critical assessment of density functional theory for computing vibrational (hyper)polarizabilities

    NASA Astrophysics Data System (ADS)

    Zaleśny, R.; Bulik, I. W.; Mikołajczyk, M.; Bartkowiak, W.; Luis, J. M.; Kirtman, B.; Avramopoulos, A.; Papadopoulos, M. G.

    2012-12-01

    Despite the undisputed success of density functional theory (DFT) in various branches of chemistry and physics, the application of DFT to reliable predictions of nonlinear optical properties of molecules was called into question a decade ago. As shown by Champagne et al. [1, 2, 3], most conventional DFT schemes were unable to qualitatively predict the response of conjugated oligomers to a static electric field. Long-range corrected (LRC) functionals, like LC-BLYP or CAM-B3LYP, have been proposed to alleviate this deficiency. The reliability of LRC functionals for evaluating molecular (hyper)polarizabilities is studied for various groups of organic systems, with a special focus on vibrational corrections to the electric properties.
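    The (hyper)polarizabilities discussed here are commonly extracted by finite-field differentiation of the energy with respect to a static electric field. The sketch below shows that bookkeeping with a synthetic energy expansion standing in for the DFT single-point energies; the field strength and expansion coefficients are illustrative assumptions, not values from this study.

    ```python
    # Hedged sketch of the finite-field route to (hyper)polarizabilities:
    # central-difference derivatives of E(F) from single-point energies.
    def alpha_beta_from_energies(energy, f=0.001):
        """Longitudinal alpha and beta from field-dependent energies E(F)."""
        e0 = energy(0.0)
        ep, em = energy(f), energy(-f)
        e2p, e2m = energy(2 * f), energy(-2 * f)
        alpha = -(ep - 2.0 * e0 + em) / f ** 2                       # -d2E/dF2
        beta = -(e2p - 2.0 * ep + 2.0 * em - e2m) / (2.0 * f ** 3)   # -d3E/dF3
        return alpha, beta

    # Synthetic expansion E(F) = E0 - mu*F - (1/2)a*F^2 - (1/6)b*F^3
    # standing in for DFT single points (atomic units throughout).
    mu, a, b = 0.5, 60.0, 900.0
    E = lambda F: -40.0 - mu * F - 0.5 * a * F ** 2 - b * F ** 3 / 6.0
    print(alpha_beta_from_energies(E))   # ~ (60.0, 900.0) in this toy case
    ```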

  11. Implementation of computerized physician order entry in National Guard hospitals: Assessment of critical success factors

    PubMed Central

    Altuwaijri, Majid M.; Bahanshal, Abdullah; Almehaid, Mona

    2011-01-01

    Objective: The purpose of this study is to describe the needs, process and experience of implementing a computerized physician order entry (CPOE) system in a leading healthcare organization in Saudi Arabia. Materials and Methods: The National Guard Health Affairs (NGHA) deployed the CPOE in a pilot department, the intensive care unit (ICU), in order to assess its benefits and risks and to test the system. After the CPOE was implemented in the ICU area, a survey was sent to the ICU clinicians to assess their perception of the importance of 32 critical success factors (CSFs) acquired from the literature. The project team also had several meetings to gather lessons learned from the pilot project in order to utilize them in the expansion of the project to other NGHA clinics and hospitals. Results: The results of the survey indicated that the selected CSFs, even though they were developed with regard to international settings, are very much applicable to the pilot area. The top three CSFs rated by the survey respondents were: the 'before go-live' training, adequate clinical resources during implementation, and ordering time. After the assessment of the survey and the lessons learned from the pilot project, NGHA decided that the potential benefits of the CPOE were expected to be greater than the risks. The project was then expanded to cover all NGHA clinics and hospitals in a phased approach. Currently, the project is in its final stages and is expected to be completed by the end of 2011. Conclusion: The role of CPOE systems is very important in hospitals in order to reduce medication errors and to improve the quality of care. In spite of their great benefits, many studies suggest that a high percentage of these projects fail. In order to increase the chances of success, and because CPOE is a clinical system, NGHA implemented the system first in a pilot area in order to test the system without putting patients at risk and to

  12. Evaluating social outcomes of HIV/AIDS interventions: a critical assessment of contemporary indicator frameworks

    PubMed Central

    Mannell, Jenevieve; Cornish, Flora; Russell, Jill

    2014-01-01

    Introduction: Contemporary HIV-related theory and policy emphasize the importance of addressing the social drivers of HIV risk and vulnerability for a long-term response. Consequently, increasing attention is being given to social and structural interventions, and to social outcomes of HIV interventions. Appropriate indicators for social outcomes are needed in order to institutionalize the commitment to addressing them. This paper critically assesses the current state of social indicators within international HIV/AIDS monitoring and evaluation frameworks. Methods: We analyzed the indicator frameworks of six international organizations involved in efforts to improve and synchronize the monitoring and evaluation of the HIV/AIDS response. Our analysis classifies the 328 unique indicators according to what they measure and assesses the degree to which they offer comprehensive measurement across three dimensions: domains of the social context, levels of change and organizational capacity. Results and discussion: The majority of indicators focus on individual-level (clinical and behavioural) interventions and outcomes, neglecting structural interventions, community interventions and social outcomes (e.g. stigma reduction; community capacity building; policy-maker sensitization). The main tool used to address social aspects of HIV/AIDS is the disaggregation of data by social group. This raises three main limitations. Indicator frameworks do not provide comprehensive coverage of the diverse social drivers of the epidemic, particularly neglecting criminalization, stigma, discrimination and gender norms. There is a dearth of indicators for evaluating the social impacts of HIV interventions. Indicators of organizational capacity focus on capacity to effectively deliver and manage clinical services, neglecting capacity to respond appropriately and sustainably to complex social contexts. Conclusions: Current indicator frameworks cannot adequately assess the social

  13. Cone Beam Computed Tomographic Assessment of Bifid Mandibular Condyle

    PubMed Central

    Khojastepour, Leila; Kolahi, Shirin; Panahi, Nazi

    2015-01-01

    Objectives: Differential diagnosis of bifid mandibular condyle (BMC) is important, since it may play a role in temporomandibular joint (TMJ) dysfunctions and joint symptoms. In addition, the radiographic appearance of BMC may mimic tumors and/or fractures. The aim of this study was to evaluate the prevalence and orientation of BMC based on cone beam computed tomography (CBCT) scans. Materials and Methods: This cross-sectional study was performed on CBCT scans of the paranasal sinuses of 425 patients. In a designated NNT station, all CBCT scans were evaluated in the axial, coronal and sagittal planes to find the frequency of BMC. The condylar head horizontal angulations were also determined in the transverse plane. The t-test was used to compare the frequency of BMC between the left and right sides and between males and females. Results: In total, 309 patients with acceptable visibility of condyles on CBCT scans were entered into the study, consisting of 170 (55%) females and 139 (45%) males with a mean age of 39.43±9.7 years. BMC was detected in 14 cases (4.53%). Differences between males and females, between sides, and in the horizontal angulations of the condyle between normal and BMC cases were not significant. Conclusion: The prevalence of BMC in the studied population was 4.53%. No significant difference was observed between males and females, sides, or horizontal angulations of the involved and uninvolved condyles.

  14. Assessing abdominal aorta narrowing using computational fluid dynamics.

    PubMed

    Al-Rawi, Mohammad; Al-Jumaily, Ahmed M

    2016-05-01

    This paper investigates the effect of developing arterial blockage at the abdominal aorta on the blood pressure waves at an externally accessible location suitable for invasive measurements, such as the brachial and femoral arteries. Arterial blockages are created surgically within the abdominal aorta of healthy Wistar rats to create conditions resembling arterial narrowing. Blood pressure is measured using a catheter inserted into the right femoral artery. Measurements are taken at the baseline healthy condition as well as at four different severities (20, 50, 80 and 100%) of arterial blockage. In vivo and in vitro measurements of the lumen diameter and wall thickness are taken using magnetic resonance imaging and microscopic techniques, respectively. These data are used to validate a 3D computational fluid dynamics model which is developed to generalize the outcomes of this work and to determine the arterial stress and strain under the blockage conditions. This work indicates that an arterial blockage in excess of 20% of the lumen diameter significantly influences the pressure wave and reduces the systolic blood pressure at the right femoral artery. High wall shear stresses and low circumferential strains are also generated at the blockage site. PMID:26319006
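    As a back-of-envelope companion to such 3D models, the sketch below uses the steady Poiseuille estimate tau_w = 4*mu*Q/(pi*r^3) to show how sharply wall shear stress rises as the lumen narrows. The viscosity, flow rate and baseline radius are assumed placeholder values, not the measured rat data, and the total occlusion case is excluded (no flow, formula undefined).

    ```python
    # Idealized steady Poiseuille estimate of wall shear stress vs narrowing.
    import math

    mu = 3.5e-3           # blood dynamic viscosity, Pa.s (assumed)
    q = 20.0e-6 / 60.0    # flow rate: 20 mL/min in m^3/s (hypothetical)
    r0 = 0.8e-3           # healthy lumen radius, m (assumed)

    for blockage in (0.0, 0.2, 0.5, 0.8):    # fraction of lumen diameter lost
        r = r0 * (1.0 - blockage)
        tau = 4.0 * mu * q / (math.pi * r ** 3)   # tau_w = 4*mu*Q/(pi*r^3)
        print(f"{blockage:4.0%} narrowing -> wall shear stress {tau:8.2f} Pa")
    ```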

  15. Computational intelligence for target assessment in Parkinson's disease

    NASA Astrophysics Data System (ADS)

    Micheli-Tzanakou, Evangelia; Hamilton, J. L.; Zheng, J.; Lehman, Richard M.

    2001-11-01

    Recent advances in image and signal processing have created a new and challenging environment for biomedical engineers. Methods that were developed for different fields are now finding fertile ground in biomedicine, especially in the analysis of bio-signals and in the understanding of images. More and more, these methods are used in the operating room, helping surgeons, and in the physician's office as aids for diagnostic purposes. Neural network (NN) research, on the other hand, has come a long way in the past decade. NNs now consist of many thousands of highly interconnected processing elements that can encode, store and recall relationships between different patterns by altering the weighting coefficients of inputs in a systematic way. Although they can generate reasonable outputs from unknown input patterns and can tolerate a great deal of noise, they are very slow when run on a serial machine. We have used advanced signal processing and innovative image processing methods, along with computational intelligence, for diagnostic purposes and as visualization aids inside and outside the operating room. Applications discussed include EEGs and field potentials in Parkinson's disease, along with 3D reconstruction of MR or fMR brain images of Parkinson's patients, which are currently used in the operating room for pallidotomies and Deep Brain Stimulation (DBS).

  16. Ultrasound attenuation computed tomography assessment of PAGAT gel dose.

    PubMed

    Khoei, S; Trapp, J V; Langton, C M

    2014-08-01

    Ultrasound has been previously investigated as an alternative readout method for irradiated polymer gel dosimeters, with authors reporting varying dose responses. We extend previous work utilizing a new computed tomography ultrasound scanner comprising two identical 5 MHz, 128-element linear-array ultrasound transducers, co-axially aligned and submerged in water as a coupling agent, with rotation of the gel dosimeter between the transducers facilitated by a robotic arm. We have investigated the dose-dependence of both ultrasound bulk attenuation and broadband ultrasound attenuation (BUA) for the PAGAT gel dosimeter. The ultrasound bulk attenuation dose sensitivity was found to be 1.46 ± 0.04 dB m⁻¹ Gy⁻¹, in agreement with previously published results for PAG and MAGIC gels. BUA was also found to be dose dependent and was measured to be 0.024 ± 0.003 dB MHz⁻¹ Gy⁻¹; the advantage of BUA is its insensitivity to frequency-independent attenuation mechanisms, including reflection and refraction, thereby minimizing image reconstruction artefacts. PMID:25049236
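    The BUA readout reduces to a linear regression of attenuation against frequency across the usable band, so frequency-independent losses (reflection, refraction) land in the intercept rather than the reported slope. The sketch below illustrates this with a synthetic attenuation spectrum and the dose sensitivity quoted above; the baseline BUA value is an assumed placeholder.

    ```python
    # Sketch of the BUA analysis idea (assumed analysis, not the authors' code).
    import numpy as np

    freq_mhz = np.linspace(3.0, 7.0, 9)   # band around the 5 MHz transducers
    # Hypothetical attenuation spectrum: frequency-independent offset plus a
    # dose-dependent slope, with a little measurement noise.
    rng = np.random.default_rng(1)
    atten_db = 4.0 + 0.31 * freq_mhz + rng.normal(0.0, 0.02, freq_mhz.size)

    bua, offset = np.polyfit(freq_mhz, atten_db, 1)
    print(f"BUA = {bua:.3f} dB/MHz (frequency-independent offset {offset:.2f} dB)")

    # Converting BUA to dose with the sensitivity quoted in the abstract,
    # relative to an assumed unirradiated baseline (hypothetical value).
    bua_baseline = 0.07                       # dB/MHz, placeholder
    dose = (bua - bua_baseline) / 0.024       # 0.024 dB/MHz/Gy from the abstract
    print(f"implied dose ~ {dose:.1f} Gy (toy numbers)")
    ```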

  17. Assessment of toxic metals in waste personal computers.

    PubMed

    Kolias, Konstantinos; Hahladakis, John N; Gidarakos, Evangelos

    2014-08-01

    Considering the enormous production of waste personal computers nowadays, it is obvious that the study of their composition is necessary in order to regulate their management and prevent any environmental contamination caused by their inappropriate disposal. This study aimed at determining the toxic metals content of motherboards (printed circuit boards), monitor glass and monitor plastic housing of two Cathode Ray Tube (CRT) monitors, three Liquid Crystal Display (LCD) monitors, one LCD touch screen monitor and six motherboards, all of which were discarded. In addition, concentrations of chromium (Cr), cadmium (Cd), lead (Pb) and mercury (Hg) were compared with the respective limits set by the RoHS 2002/95/EC Directive, recently renewed by the 2012/19/EU recast, in order to verify manufacturers' compliance with the regulation. The research included disassembly, pulverization, digestion and chemical analyses of all the aforementioned devices. The toxic metals content of all samples was determined using Inductively Coupled Plasma-Mass Spectrometry (ICP-MS). The results demonstrated that concentrations of Pb in the motherboards and funnel glass of devices with release dates before 2006, when the RoHS Directive came into force, exceeded the permissible limit. In general, except for Pb, higher metal concentrations were detected in motherboards than in plastic housing and glass samples. Finally, the results of this work were encouraging, since concentrations of the metals covered by the RoHS Directive were found at levels lower than the legislative limits. PMID:24816521

  18. Computational assessment of several hydrogen-free high energy compounds.

    PubMed

    Tan, Bisheng; Huang, Ming; Long, Xinping; Li, Jinshan; Fan, Guijuan

    2016-01-01

    Tetrazino-tetrazine-tetraoxide (TTTO) is an attractive high energy compound, but unfortunately it has not yet been synthesized experimentally. Isomerization of TTTO leads to five isomers. Bond-separation energies were employed to compare the global stability of the six compounds; isomer 1 was found to have the highest bond-separation energy (1204.6 kJ/mol), compared with TTTO (1151.2 kJ/mol). Thermodynamic properties of the six compounds were calculated theoretically, including standard formation enthalpies (solid and gaseous), standard fusion, vaporization and sublimation enthalpies, lattice energies, and normal melting and boiling points. Their detonation performances were also computed, including detonation heat (Q), detonation velocity (D), detonation pressure (P) and impact sensitivity (h50). Compared with TTTO (Q = 1311.01 J/g, D = 9.228 km/s, P = 40.556 GPa, h50 = 12.7 cm), isomer 5 exhibits better detonation performance (Q = 1523.74 J/g, D = 9.389 km/s, P = 41.329 GPa, h50 = 28.4 cm). PMID:26705845
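    Detonation velocity and pressure figures of this kind are often obtained from the classical Kamlet-Jacobs relations; the abstract does not state which scheme the authors used, so the sketch below is offered only as an illustration of the arithmetic, with toy CHNO-like inputs rather than TTTO product data.

    ```python
    # Hedged illustration of the Kamlet-Jacobs estimates for D and P.
    import math

    def kamlet_jacobs(n_gas, m_gas, q_cal_g, rho):
        """D (km/s) and P (GPa) from the Kamlet-Jacobs equations.

        n_gas   : moles of gaseous detonation products per gram of explosive
        m_gas   : mean molecular weight of those gaseous products (g/mol)
        q_cal_g : detonation heat in cal/g
        rho     : loading density in g/cm^3
        """
        phi = n_gas * math.sqrt(m_gas) * math.sqrt(q_cal_g)
        d = 1.01 * math.sqrt(phi) * (1.0 + 1.30 * rho)
        p = 1.558 * rho ** 2 * phi   # GPa (the coefficient 15.58 gives kbar)
        return d, p

    # Toy CHNO-like inputs (illustrative only, not TTTO product data).
    d, p = kamlet_jacobs(n_gas=0.034, m_gas=30.0, q_cal_g=1450.0, rho=1.90)
    print(f"D = {d:.2f} km/s, P = {p:.1f} GPa")
    ```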

  19. An Assessment of Supercavitation Transition using Computational Fluid Dynamics

    NASA Astrophysics Data System (ADS)

    Fronzeo, Melissa; Kinzel, Michael

    2015-11-01

    A computational fluid dynamics approach is used to improve the understanding of supercavitation and its physical characteristics. A ventilated disk cavitator is used in several studies to evaluate these physics. The first study focuses on twin-vortex cavities, specifically to understand the correlation between cavity shape and pressure. The study uses validated CFD predictions of the cavity shape and pressure for various ventilation rates and Froude numbers. The data are used to evaluate the semi-empirical formula of L. A. Epstein, where results indicate a potentially improved correlation. In addition, the detailed measurements of the CFD model yield insight on improved experimental measurement techniques for cavity pressure. The second study uses unsteady detached eddy simulations (DES) to predict hysteresis in the transition behavior of the cavity closure from toroidal-vortex to twin-vortex regimes. The solution is initialized as a toroidal-type cavity (low gas ventilation rate), then the ventilation rate is slowly increased until a twin-vortex cavity is formed; the opposite process is also performed. The data are analyzed to develop an understanding of the unknown physical mechanisms involved in the transition process.

  20. CEREAS: an interactive computer-mapping system for environmental assessments

    SciTech Connect

    Levenson, J.B.; Snider, M.A.

    1983-01-01

    CEREAS (Categorical Exclusion Review/Environmental Assessment System) provides the environmental scientist with the tools needed to quickly process an application for permit to drill (APD). It provides an interactive referencing system for producing maps of potential constraint areas surrounding oil and gas activity sites. The design of the system makes it highly flexible. CEREAS was specifically developed to support the CER procedure in processing APDs for oil and gas in the western overthrust belt. It should be emphasized, however, that the system is not limited by geographic boundaries, CER criteria, or oil and gas activities. Of the three system components, only the mapping program remains constant. The command processor and data bases are necessarily project-specific, contributing to the overall system flexibility. Any geographic data that can be defined by longitude/latitude coordinates can be designated as a data base. CEREAS, as described here, is basically a graphic data retrieval system.

  1. Computational Assessment of the Aerodynamic Performance of a Variable-Speed Power Turbine for Large Civil Tilt-Rotor Application

    NASA Technical Reports Server (NTRS)

    Welch, Gerard E.

    2011-01-01

    The main rotors of the NASA Large Civil Tilt-Rotor notional vehicle operate over a wide speed range, from 100% at take-off to 54% at cruise. The variable-speed power turbine offers one approach by which to effect this speed variation. Key aero-challenges include high work factors at cruise and wide (40 to 60 deg.) incidence variations in blade and vane rows over the speed range. The turbine design approach must optimize cruise efficiency and minimize off-design penalties at take-off. The accuracy of the off-design incidence loss model is therefore critical to the turbine design. In this effort, 3-D computational analyses are used to assess the variation of turbine efficiency with speed change. The conceptual design of a 4-stage variable-speed power turbine for the Large Civil Tilt-Rotor application is first established at the meanline level. The design of 2-D airfoil sections and resulting 3-D blade and vane rows is documented. Three-dimensional Reynolds Averaged Navier-Stokes computations are used to assess the design and off-design performance of an embedded 1.5-stage portion (Rotor 1, Stator 2, and Rotor 2) of the turbine. The 3-D computational results yield the same efficiency versus speed trends predicted by meanline analyses, supporting the design choice to execute the turbine design at the cruise operating speed.

  2. High Performance Computing Facility Operational Assessment, CY 2011 Oak Ridge Leadership Computing Facility

    SciTech Connect

    Baker, Ann E; Barker, Ashley D; Bland, Arthur S Buddy; Boudwin, Kathlyn J.; Hack, James J; Kendall, Ricky A; Messer, Bronson; Rogers, James H; Shipman, Galen M; Wells, Jack C; White, Julia C; Hudson, Douglas L

    2012-02-01

    Oak Ridge National Laboratory's Leadership Computing Facility (OLCF) continues to deliver the most powerful resources in the U.S. for open science. At 2.33 petaflops peak performance, the Cray XT Jaguar delivered more than 1.4 billion core hours in calendar year (CY) 2011 to researchers around the world for computational simulations relevant to national and energy security; advancing the frontiers of knowledge in physical sciences and areas of biological, medical, environmental, and computer sciences; and providing world-class research facilities for the nation's science enterprise. Users reported more than 670 publications this year arising from their use of OLCF resources. Of these, we report in this review the 300 that are consistent with the guidance provided. Scientific achievements by OLCF users cut across all scales, from atomic to molecular to large-scale structures. At the atomic scale, researchers discovered that the anomalously long half-life of Carbon-14 can be explained by calculating, for the first time, the very complex three-body interactions between all the neutrons and protons in the nucleus. At the molecular scale, researchers combined experimental results from LBL's light source and simulations on Jaguar to discover how DNA replication continues past a damaged site so a mutation can be repaired later. Other researchers combined experimental results from ORNL's Spallation Neutron Source and simulations on Jaguar to reveal the molecular structure of ligno-cellulosic material used in bioethanol production. This year, Jaguar has been used to do billion-cell CFD calculations to develop shock wave compression turbo machinery as a means to meet DOE goals for reducing carbon sequestration costs. General Electric used Jaguar to calculate the unsteady flow through turbo machinery to learn what efficiencies the traditional steady flow assumption is hiding from designers. Even a 1% improvement in turbine design can save the nation billions of gallons of

  3. Human head-neck computational model for assessing blast injury.

    PubMed

    Roberts, J C; Harrigan, T P; Ward, E E; Taylor, T M; Annett, M S; Merkle, A C

    2012-11-15

    A human head finite element model (HHFEM) was developed to study the effects of a blast to the head. To study both the kinetic and kinematic effects of a blast wave, the HHFEM was attached to a finite element model of a Hybrid III ATD neck. A physical human head surrogate model (HSHM) was developed from solid model files of the HHFEM, which was then attached to a physical Hybrid III ATD neck and exposed to shock tube overpressures. This allowed direct comparison between the HSHM and HHFEM. To develop the temporal and spatial pressures on the HHFEM that would simulate loading to the HSHM, a computational fluid dynamics (CFD) model of the HHFEM in front of a shock tube was generated. CFD simulations were made using loads equivalent to those seen in experimental studies of the HSHM for shock tube driver pressures of 517, 690 and 862 kPa. Using the selected brain material properties, the peak intracranial pressures, temporal and spatial histories of relative brain-skull displacements and the peak relative brain-skull displacements in the brain of the HHFEM compared favorably with results from the HSHM. The HSHM sensors measured the rotations of local areas of the brain as well as displacements, and the rotations of the sensors in the sagittal plane of the HSHM were, in general, correctly predicted from the HHFEM. Peak intracranial pressures were between 70 and 120 kPa, while the peak relative brain-skull displacements were between 0.5 and 3.0 mm. PMID:23010219

  4. Radiological dose assessment for bounding accident scenarios at the Critical Experiment Facility, TA-18, Los Alamos National Laboratory

    SciTech Connect

    1991-09-01

    A computer modeling code, CRIT8, was written to allow prediction of the radiological doses to workers and members of the public resulting from these postulated maximum-effect accidents. The code accounts for the relationships of the initial parent radionuclide inventory at the time of the accident to the growth of radioactive daughter products, and considers the atmospheric conditions at time of release. The code then calculates a dose at chosen receptor locations for the sum of radionuclides produced as a result of the accident. Both criticality and non-criticality accidents are examined.
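    The parent-to-daughter ingrowth bookkeeping that such a code must perform is captured, in its simplest two-member form, by the Bateman solution sketched below. The nuclide pair and half-lives are hypothetical; CRIT8's actual chain handling and atmospheric dispersion treatment are not reproduced here.

    ```python
    # Two-member Bateman ingrowth: daughter population from a pure parent.
    import math

    def daughter_atoms(n1_0, lam1, lam2, t):
        """Atoms of daughter at time t, starting from n1_0 parent atoms."""
        if abs(lam1 - lam2) < 1e-30:
            raise ValueError("degenerate case lam1 == lam2 needs its own form")
        return n1_0 * lam1 / (lam2 - lam1) * (math.exp(-lam1 * t) - math.exp(-lam2 * t))

    # Hypothetical pair: parent half-life 30 min, daughter half-life 8 h.
    lam_p = math.log(2) / (30 * 60)      # parent decay constant, 1/s
    lam_d = math.log(2) / (8 * 3600)     # daughter decay constant, 1/s
    for hours in (0.5, 2, 8, 24):
        nd = daughter_atoms(1.0e20, lam_p, lam_d, hours * 3600)
        print(f"t = {hours:4.1f} h: daughter activity {lam_d * nd:.3e} Bq")
    ```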

  5. Computer-aided design of dry powder inhalers using computational fluid dynamics to assess performance.

    PubMed

    Suwandecha, Tan; Wongpoowarak, Wibul; Srichana, Teerapol

    2016-01-01

    Dry powder inhalers (DPIs) are gaining popularity for the delivery of drugs. A cost-effective and efficient delivery device is necessary. Developing a new DPI by modifying an existing device may be the simplest way to improve device performance. The aim of this research was to produce a new DPI using computational fluid dynamics (CFD). The new DPI took advantage of the Cyclohaler® and the Rotahaler®. We chose a combination of the capsule chamber of the Cyclohaler® and the mouthpiece and grid of the Rotahaler®. Computer-aided design models of the devices were created and evaluated using CFD. Prototype models were created and tested with DPI dispersion experiments. The proposed model 3 device had high turbulence with a good degree of deagglomeration in the CFD and the experimental data. The fine particle fraction (FPF) was around 50% at 60 L/min. The mass median aerodynamic diameter was around 2.8-4 μm. The FPF was strongly correlated with the CFD-predicted turbulence and the mechanical impaction parameters. The drug retention in the capsule was only 5-7%. In summary, a simple modification of the Cyclohaler® and Rotahaler® could produce a better-performing inhaler using CFD-assisted design. PMID:25265389

  6. Using Interactive Simulations in Assessment: The Use of Computer-Based Interactive Simulations in the Assessment of Statistical Concepts

    ERIC Educational Resources Information Center

    Neumann, David L.

    2010-01-01

    Interactive computer-based simulations have been applied in several contexts to teach statistical concepts in university level courses. In this report, the use of interactive simulations as part of summative assessment in a statistics course is described. Students accessed the simulations via the web and completed questions relating to the…

  7. Visual assessment of bayed beach stability with computer software

    NASA Astrophysics Data System (ADS)

    da Fontoura Klein, Antonio Henrique; Vargas, Ariel; Raabe, André Luís Alice; Hsu, John R. C.

    2003-12-01

    The parabolic bay shape model is the only morphological model that has a mechanism for evaluating beach stability and predicting shoreline changes arising from structures built on a curved beach. However, application of this parabolic model has been largely manual, by tracing the calculated bay shape on a map or aerial photograph after hand calculation. To overcome this drawback, a software package called Model for Equilibrium Planform of Bay Beaches (MEPBAY), written in the Object Pascal language, is proposed to facilitate the model's application. MEPBAY calculates the idealized shoreline planform of a headland-bay beach in static equilibrium based on the parabolic model. It then presents the results graphically on a screen display overlaying the image of the existing beach. It thus allows the stability of a headland-bay beach to be assessed visually by comparing the existing shoreline periphery with the static equilibrium planform. The software offers a friendly environment, from simple input to instant visualization of the results. MEPBAY not only helps students understand the morphological process, but also provides engineers with a valuable tool for practical applications in shoreline protection and coastal management.
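    The parabolic model at the core of MEPBAY expresses the equilibrium shoreline radius as R(theta)/R0 = C0 + C1*(beta/theta) + C2*(beta/theta)^2 for angles theta >= beta measured from the wave crest line at the upcoast control point, with coefficients that depend on the wave obliquity beta. The sketch below generates such a planform; the coefficient values here are placeholders, whereas MEPBAY looks them up from the published fits of Hsu and Evans.

    ```python
    # Sketch of the parabolic bay-shape planform generator (coefficients are
    # placeholder values, not the published Hsu-Evans fits).
    import math

    def planform(beta_deg, r0, c0, c1, c2, theta_max_deg=180.0, step=10.0):
        """Yield (theta_deg, R) pairs for the static-equilibrium planform."""
        beta = math.radians(beta_deg)
        theta_deg = beta_deg                     # planform starts at theta = beta
        while theta_deg <= theta_max_deg:
            theta = math.radians(theta_deg)
            ratio = c0 + c1 * (beta / theta) + c2 * (beta / theta) ** 2
            yield theta_deg, r0 * ratio
            theta_deg += step

    # Hypothetical bay: 40 deg obliquity, 500 m control-line length.
    # Note c0 + c1 + c2 = 1 so that R = R0 at theta = beta.
    for theta, r in planform(40.0, 500.0, c0=0.05, c1=1.0, c2=-0.05):
        print(f"theta = {theta:5.1f} deg -> R = {r:6.1f} m")
    ```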

  8. Computer-based assessment of student-constructed responses.

    PubMed

    Magliano, Joseph P; Graesser, Arthur C

    2012-09-01

    Student-constructed responses, such as essays, short-answer questions, and think-aloud protocols, provide a valuable opportunity to gauge student learning outcomes and comprehension strategies. However, given the challenges of grading student-constructed responses, instructors may be hesitant to use them. There have been major advances in the application of natural language processing to student-constructed responses. This literature review focuses on two dimensions that need to be considered when developing new systems. The first is the type of response provided by the student: meaning-making responses (e.g., think-aloud protocols, tutorial dialogue) versus products of comprehension (e.g., essays, open-ended questions). The second concerns the type of natural language processing system used and how it is applied to analyze the student responses. We argue that the appropriateness of the assessment protocols is, in part, constrained by the type of response, and that researchers should use hybrid systems that rely on multiple, convergent natural language algorithms. PMID:22581494
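    As a minimal, runnable stand-in for the natural language processing systems the review surveys, the sketch below scores student responses by TF-IDF cosine similarity to an ideal answer. Real systems use far richer analyses (e.g., latent semantic analysis, dialogue modeling); the texts and the use of scikit-learn here are illustrative assumptions.

    ```python
    # Toy scoring of student-constructed responses by lexical-semantic overlap.
    from sklearn.feature_extraction.text import TfidfVectorizer
    from sklearn.metrics.pairwise import cosine_similarity

    ideal = "Natural selection increases the frequency of advantageous traits."
    responses = [
        "Helpful traits become more common because their carriers survive.",
        "Giraffes stretch their necks and pass the longer necks on.",
    ]

    # Vectorize the ideal answer together with the responses, then compare.
    vec = TfidfVectorizer(stop_words="english")
    matrix = vec.fit_transform([ideal] + responses)
    scores = cosine_similarity(matrix[0], matrix[1:]).ravel()
    for resp, s in zip(responses, scores):
        print(f"{s:.2f}  {resp}")
    ```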

  9. Roughness Based Crossflow Transition Control: A Computational Assessment

    NASA Technical Reports Server (NTRS)

    Li, Fei; Choudhari, Meelan M.; Chang, Chau-Lyan; Streett, Craig L.; Carpenter, Mark H.

    2009-01-01

    A combination of parabolized stability equations and secondary instability theory has been applied to a low-speed swept airfoil model with a chord Reynolds number of 7.15 million, with the goals of (i) evaluating this methodology in the context of transition prediction for a known configuration for which roughness-based crossflow transition control has been demonstrated under flight conditions and (ii) analyzing the mechanism of transition delay via the introduction of discrete roughness elements (DREs). Roughness-based transition control involves controlled seeding of suitable, subdominant crossflow modes, so as to weaken the growth of naturally occurring, linearly more unstable crossflow modes. Therefore, a synthesis of receptivity, linear and nonlinear growth of stationary crossflow disturbances, and the ensuing development of high-frequency secondary instabilities is desirable to understand the experimentally observed transition behavior. With further validation, such higher-fidelity prediction methodology could be utilized to assess the potential for crossflow transition control at even higher Reynolds numbers, where experimental data are currently unavailable.

  10. A Critical Analysis of the International Baccalaureate's Middle Years Programme Assessment Design with Particular Focus on Feedback

    ERIC Educational Resources Information Center

    Hughes, Conrad

    2014-01-01

    The International Baccalaureate's Middle Years Programme (IBMYP) is designed to support the development of creativity, critical thinking, international-mindedness and values. However, close inspection of the programme's assessment structure suggests that many of the competence-related and dispositional elements of the programme's…

  11. Environmental assessment for consolidation of certain materials and machines for nuclear criticality experiments and training

    SciTech Connect

    1996-05-21

    In support of its assigned missions and because of the importance of avoiding nuclear criticality accidents, DOE has adopted a policy to reduce identifiable nuclear criticality safety risks and to protect the public, workers, government property and essential operations from the effects of a criticality accident. In support of this policy, the Los Alamos Critical Experiments Facility (LACEF) at the Los Alamos National Laboratory (LANL) Technical Area (TA) 18 provides a program of general-purpose critical experiments. This program, the only remaining one of its kind in the United States, seeks to maintain a sound basis of information for criticality control in those physical situations that DOE will encounter in handling and storing fissionable material in the future, and to ensure the presence of a community of individuals competent in practicing this control.

  12. Total electron content in the Martian atmosphere: A critical assessment of the Mars Express MARSIS data sets

    NASA Astrophysics Data System (ADS)

    Sánchez-Cano, B.; Morgan, D. D.; Witasse, O.; Radicella, S. M.; Herraiz, M.; Orosei, R.; Cartacci, M.; Cicchetti, A.; Noschese, R.; Kofman, W.; Grima, C.; Mouginot, J.; Gurnett, D. A.; Lester, M.; Blelly, P.-L.; Opgenoorth, H.; Quinsac, G.

    2015-03-01

    The total electron content (TEC) is one of the most useful parameters to evaluate the behavior of the Martian ionosphere because it contains information on the total amount of free electrons, the main component of the Martian ionospheric plasma. The Mars Express Mars Advanced Radar for Subsurface and Ionosphere Sounding (MARSIS) radar is able to derive TEC from both of its operation modes: (1) the active ionospheric sounding (AIS) mode and (2) the subsurface mode. TEC estimates from the subsurface sounding mode can be computed from the same raw data independently using different algorithms, which should yield similar results. Significant differences on the dayside, however, have been found from two of the algorithms. Moreover, both algorithms seem also to disagree with the TEC results from the AIS mode. This paper gives a critical, quantitative, and independent assessment of these discrepancies and indicates the possible uncertainty of these databases. In addition, a comparison between the results given by the empirical model of the Martian ionosphere developed by Sánchez-Cano et al. (2013) and the different data sets has been performed. The main result is that for solar zenith angles higher than 75°, where the maximum plasma frequency is typically small compared with the radar frequencies, the two subsurface algorithms can be confidently used. For solar zenith angles less than 75°, where the maximum plasma frequency is very close to the radar frequencies, both algorithms suffer limitations. Nevertheless, despite the solar zenith angle restrictions, the dayside TEC of one of the two algorithms is consistent with the modeled TEC.
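    Operationally, TEC is the column integral of electron density along the sounding path. The sketch below integrates an idealized Chapman layer standing in for the Martian dayside ionosphere; the layer parameters are illustrative placeholders, not MARSIS-derived values.

    ```python
    # Column-integrating a Chapman-alpha layer to get TEC (toy parameters).
    import numpy as np

    def chapman(h_km, n0=2.0e11, h0=130.0, scale=12.0):
        """Electron density (m^-3) of an idealized Chapman-alpha layer."""
        z = (h_km - h0) / scale
        return n0 * np.exp(0.5 * (1.0 - z - np.exp(-z)))

    h = np.linspace(80.0, 400.0, 2000)   # altitude grid, km
    ne = chapman(h)

    # Trapezoidal column integral over altitude expressed in metres.
    h_m = h * 1.0e3
    tec = np.sum(0.5 * (ne[1:] + ne[:-1]) * np.diff(h_m))
    print(f"TEC = {tec:.3e} el/m^2 = {tec / 1.0e16:.3f} TECU")
    ```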

  13. Use of subjective global assessment and clinical outcomes in critically ill geriatric patients receiving nutrition support.

    PubMed

    Atalay, Betül Gülsen; Yagmur, Cahide; Nursal, Tarik Zafer; Atalay, Hakan; Noyan, Turgut

    2008-01-01

    The objective of this study is to examine the prevalence of malnutrition and evaluate the nutrition status and clinical outcome in hospitalized patients aged 65 years and older receiving enteral-parenteral nutrition. This retrospective study was carried out at Başkent University Hospital, Adana, Turkey. A total of 119 patients older than 65 years were recruited. Patients were classified into 3 groups according to subjective global assessment (SGA) at admission: severe protein-energy malnutrition (PEM), moderate PEM, and well nourished. All patients were fed by the enteral or parenteral route. Acute Physiology and Chronic Health Evaluation (APACHE-2) and Simplified Acute Physiology Score (SAPS 2) scores were recorded in patients followed in the intensive care unit (ICU). Nutrition status was assessed with biochemical parameters (serum albumin, serum prealbumin). These results were compared with mortality rate and length of hospital stay (LOS). The subjects' mean (±SD) age was 73.1 ± 5.4 years. Using SGA, 5.9% (n = 7) of the patients were classified as severely PEM, 27.7% (n = 33) as moderately PEM, and 66.4% (n = 79) as well nourished. Some 73.1% (n = 87) of the patients were followed in the ICU. Among all patients, 42.9% (n = 51) were fed by a combined enteral-parenteral route, 31.1% (n = 37) by an enteral route, 18.5% (n = 22) by a parenteral route, and 7.6% (n = 9) by an oral route. The average length of stay was 18.9 ± 13.7 days. The overall mortality rate was 44.5% (n = 53): 43% (n = 34) in well-nourished patients (n = 79), 48.5% (n = 16) in moderately PEM patients (n = 33), and 42.9% (n = 3) in severely PEM patients (n = 7) (P = .86). The authors observed no difference between well-nourished and malnourished patients with regard to the serum protein values on admission, LOS, and mortality rate. In this study, malnutrition as defined by SGA did not influence the mortality rate of critically ill geriatric

  14. Critical Assessment of Object Segmentation in Aerial Image Using Geo-Hausdorff Distance

    NASA Astrophysics Data System (ADS)

    Sun, H.; Ding, Y.; Huang, Y.; Wang, G.

    2016-06-01

    Aerial imagery records large areas of the earth's surface at ever-improving spatial and radiometric resolution. It has become a powerful tool for earth observation, land-cover survey, geographical census, etc., and helps delineate the boundaries of different kinds of objects on the earth both manually and automatically. In light of the geo-spatial correspondence between pixel locations in an aerial image and the spatial coordinates of ground objects, there is an increasing need for super-pixel segmentation and high-accuracy positioning of objects in aerial images. Besides the commercial software packages eCognition and ENVI, many algorithms have been developed in the literature to segment objects in aerial images. But how to evaluate the segmentation results remains a challenge, especially in the context of this geo-spatial correspondence. The Geo-Hausdorff Distance (GHD) is proposed to measure the geo-spatial distance between the results of various object segmentations, whether produced from manual ground truth or by automatic algorithms. Based on an early-breaking and random-sampling design, the GHD calculates the geographical Hausdorff distance with nearly-linear complexity. Segmentation results of several state-of-the-art algorithms, including those of the commercial packages, are evaluated with a diverse set of aerial images. These have different signal-to-noise ratios around the object boundaries and are hard to trace correctly even for human operators. The GHD value is analyzed to comprehensively measure the suitability of different object segmentation methods for aerial images of different spatial resolutions. By critically assessing the strengths and limitations of the existing algorithms, the paper provides valuable insight and guidelines for extensive research in automating object detection and classification of aerial image in the nation-wide geographic census. It is also promising for the optimal design of operational specification of remote
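    The early-breaking, random-sampling scan described above can be sketched as follows for the directed Hausdorff distance; coordinates are assumed already co-registered, and the geographic metric is simplified to Euclidean distance here rather than a true geodesic.

    ```python
    # Early-breaking directed Hausdorff distance with randomized scan order.
    import numpy as np

    def directed_hausdorff(a, b, rng=None):
        """max over p in a of (min over q in b of |p - q|), with early break."""
        rng = rng or np.random.default_rng()
        a = a[rng.permutation(len(a))]   # random scan order tightens the bound fast
        b = b[rng.permutation(len(b))]
        cmax = 0.0
        for p in a:
            cmin = np.inf
            for q in b:
                d = np.hypot(*(p - q))
                if d < cmax:             # this p cannot raise the running maximum
                    cmin = 0.0
                    break
                cmin = min(cmin, d)
            if cmin > cmax and np.isfinite(cmin):
                cmax = cmin
        return cmax

    seg_a = np.array([[0.0, 0.0], [1.0, 0.0], [2.0, 0.5]])   # boundary A vertices
    seg_b = np.array([[0.0, 0.2], [1.1, 0.1], [2.0, 0.0]])   # boundary B vertices
    h = max(directed_hausdorff(seg_a, seg_b), directed_hausdorff(seg_b, seg_a))
    print(f"Hausdorff distance = {h:.3f}")
    ```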

  15. Assessment of TRAC-PF1/MOD1 version 14. 3 using separate effects critical flow and blowdown experiments

    SciTech Connect

    Spindler, B.; Pellissier, M.

    1990-01-01

    Independent assessment of the TRAC code was conducted at the Centre d'Etudes Nucleaires de Grenoble of the Commissariat a l'Energie Atomique (France) within the framework of the ICAP. This report presents the results of the assessment of TRAC-PF1/MOD1 version 14.3 using critical flow steady-state tests (MOBY-DICK, SUPER-MOBY-DICK) and blowdown tests (CANON, SUPER-CANON, VERTICAL-CANON, MARVIKEN, OMEGA-TUBE, OMEGA-BUNDLE). This document, Volume 1, presents the text and tables from this assessment.

  16. Assessment of TRAC-PF1/MOD1 Version 14. 3 using separate effects critical flow and blowdown experiments

    SciTech Connect

    Spindler, B.; Pellissier, M.

    1990-01-01

    Independent assessment of the TRAC code was conducted at the Centre d'Etudes Nucleaires de Grenoble of the Commissariat a l'Energie Atomique (France) within the framework of the ICAP. This report presents the results of the assessment of TRAC-PF1/MOD1 version 14.3 using critical flow steady-state tests (MOBY-DICK, SUPER-MOBY-DICK) and blowdown tests (CANON, SUPER-CANON, VERTICAL-CANON, MARVIKEN, OMEGA-TUBE, OMEGA-BUNDLE). This document, Volume 2, presents the experimental data and figures from the assessment.

  17. Experiments on small-size fast critical fuel assemblies at the AKSAMIT facility and their use for development of computational models

    NASA Astrophysics Data System (ADS)

    Glushkov, E. S.; Glushkov, A. E.; Gomin, E. A.; Daneliya, S. B.; Zimin, A. A.; Kalugin, M. A.; Kapitonova, A. V.; Kompaniets, G. V.; Moroz, N. P.; Nosov, V. I.; Petrushenko, R. P.; Smirnov, O. N.

    2013-12-01

    Small-size fast critical assemblies with highly enriched fuel at the AKSAMIT facility are described in detail. Computational models of the critical assemblies at room temperature are given. The calculation results for the critical parameters are compared with the experimental data, and good agreement between the calculations and the experiments is shown. The physical models developed for the critical assemblies, as well as the experimental results, can be applied to verify various codes intended for calculation of the neutronic characteristics of small-size fast nuclear reactors. For these experiments, the results computed using the codes of the MCU family demonstrate the high quality of the neutron data and the physical models used.

  18. Theory of Knowledge Aims, Objectives and Assessment Criteria: An Analysis of Critical Thinking Descriptors

    ERIC Educational Resources Information Center

    Hughes, Conrad

    2014-01-01

    This article analyses the construct validity of the International Baccalaureate Diploma Programme's Theory of Knowledge course in the light of claims that it is a course in critical thinking. After discussion around critical thinking--what it is and why it is valuable educationally--the article analyses the extent to which the course aims,…

  19. A Quantitative Assessment of an Application of Halpern's Teaching for Critical Thinking in a Business Class

    ERIC Educational Resources Information Center

    Reid, Joanne R.

    2010-01-01

    Can Critical Thinking be taught and learned? The author used a pre-experimental research method to answer this question. The foundation of this research study was Halpern's Teaching for Critical Thinking model. The instructional design paradigm was the 2003 Cognitive Training Model of Foshay, Silber, and Stelnicki. The author developed a course…

  20. Critical Thinking Assessment across Four Sustainability-Related Experiential Learning Settings

    ERIC Educational Resources Information Center

    Heinrich, William F.; Habron, Geoffrey B.; Johnson, Heather L.; Goralnik, Lissy

    2015-01-01

    Today's complex societal problems require both critical thinking and an engaged citizenry. Current practices in higher education, such as service learning, suggest that experiential learning can serve as a vehicle to encourage students to become engaged citizens. However, critical thinking is not necessarily a part of every experiential learning…