Science.gov

Sample records for computer criticality assessments

  1. Making Student Thinking Visible through a Concept Map in Computer-Based Assessment of Critical Thinking

    ERIC Educational Resources Information Center

    Rosen, Yigal; Tager, Maryam

    2014-01-01

    Major educational initiatives in the world place great emphasis on fostering rich computer-based environments of assessment that make student thinking and reasoning visible. Using thinking tools engages students in a variety of critical and complex thinking, such as evaluating, analyzing, and decision making. The aim of this study was to explore…

  2. Content Analysis in Computer-Mediated Communication: Analyzing Models for Assessing Critical Thinking through the Lens of Social Constructivism

    ERIC Educational Resources Information Center

    Buraphadeja, Vasa; Dawson, Kara

    2008-01-01

    This article reviews content analysis studies aimed to assess critical thinking in computer-mediated communication. It also discusses theories and content analysis models that encourage critical thinking skills in asynchronous learning environments and reviews theories and factors that may foster critical thinking skills and new knowledge…

  3. Conversion of Input Data between KENO and MCNP File Formats for Computer Criticality Assessments

    SciTech Connect

    Schwarz, Randolph A.; Carter, Leland L.; Schwarz, Alysia L.

    2006-11-30

    KENO is a Monte Carlo criticality code that is maintained by Oak Ridge National Laboratory (ORNL). KENO is included in the SCALE (Standardized Computer Analysis for Licensing Evaluation) package. KENO is often used because it was specifically designed for criticality calculations. Because KENO has convenient geometry input, including the treatment of lattice arrays of materials, it is frequently used for production calculations. Monte Carlo N-Particle (MCNP) is a Monte Carlo transport code maintained by Los Alamos National Laboratory (LANL). MCNP has a powerful 3D geometry package and an extensive cross section database. It is a general-purpose code and may be used for calculations involving shielding or medical facilities, for example, but can also be used for criticality calculations. MCNP is becoming increasingly popular for performing production criticality calculations. Both codes have their own specific advantages. After a criticality calculation has been performed with one of the codes, it is often desirable (or may be a safety requirement) to repeat the calculation with the other code to compare the important parameters using a different geometry treatment and cross section database. This manual conversion of input files between the two codes is labor-intensive. The industry needs the capability to convert geometry models between MCNP and KENO without a large investment in manpower. The proposed conversion package will aid the user in converting between the codes. It is not intended to be used as a “black box”. The resulting input file will need to be carefully inspected by criticality safety personnel to verify that the intent of the calculation is preserved in the conversion. The purpose of this package is to help the criticality specialist in the conversion process by converting the geometry, materials, and pertinent data cards.
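
    As a sketch of the kind of translation such a package performs, the fragment below emits a simplified MCNP-style material card (`m<id> <ZAID> <fraction> ...`) from a generic material description. This is illustrative only: real SCALE/KENO and MCNP material inputs carry cross-section library suffixes, densities, and temperatures that are omitted here, and the function name and record layout are assumptions, not the actual conversion package's API.

```python
# Illustrative sketch only: real KENO/SCALE and MCNP material cards carry
# cross-section library identifiers, densities, and temperatures that are
# deliberately omitted from this simplified model.

def to_mcnp_material(mat_id, atom_fractions):
    """Emit a simplified MCNP-style material card from a mapping of
    ZAID strings (e.g. '92235' for U-235) to atom fractions."""
    parts = [f"m{mat_id}"]
    for zaid in sorted(atom_fractions):
        parts.append(f"{zaid} {atom_fractions[zaid]:.4f}")
    return " ".join(parts)
```

    A converter of this shape would parse each KENO material record into such a mapping and then emit the corresponding MCNP card; as the abstract stresses, the output would still require inspection by criticality safety personnel.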

  4. Handheld computers in critical care

    PubMed Central

    Lapinsky, Stephen E; Weshler, Jason; Mehta, Sangeeta; Varkul, Mark; Hallett, Dave; Stewart, Thomas E

    2001-01-01

    Background Computing technology has the potential to improve health care management but is often underutilized. Handheld computers are versatile and relatively inexpensive, bringing the benefits of computers to the bedside. We evaluated the role of this technology for managing patient data and accessing medical reference information, in an academic intensive-care unit (ICU). Methods Palm III series handheld devices were given to the ICU team, each installed with medical reference information, schedules, and contact numbers. Users underwent a 1-hour training session introducing the hardware and software. Various patient data management applications were assessed during the study period. Qualitative assessment of the benefits, drawbacks, and suggestions was performed by an independent company, using focus groups. An objective comparison between a paper and electronic handheld textbook was achieved using clinical scenario tests. Results During the 6-month study period, the 20 physicians and 6 paramedical staff who used the handheld devices found them convenient and functional but suggested more comprehensive training and improved search facilities. Comparison of the handheld computer with the conventional paper text revealed equivalence. Access to computerized patient information improved communication, particularly with regard to long-stay patients, but changes to the software and the process were suggested. Conclusions The introduction of this technology was well received despite differences in users' familiarity with the devices. Handheld computers have potential in the ICU, but systems need to be developed specifically for the critical-care environment. PMID:11511337

  5. Computer-Based Assessment in Safety-Critical Industries: The Case of Shipping

    ERIC Educational Resources Information Center

    Gekara, Victor Oyaro; Bloor, Michael; Sampson, Helen

    2011-01-01

    Vocational education and training (VET) concerns the cultivation and development of specific skills and competencies, in addition to broad underpinning knowledge relating to paid employment. VET assessment is, therefore, designed to determine the extent to which a trainee has effectively acquired the knowledge, skills, and competencies required by…

  6. Carahunge - A Critical Assessment

    NASA Astrophysics Data System (ADS)

    González-García, A. César

    Carahunge is a megalithic monument in southern Armenia that has often been claimed to be the oldest observatory. The monument, composed of dozens of standing stones, includes some perforated stones. The directions of the holes have been measured, and their orientations have been related to the sun, moon, and stars in order to obtain a date for the construction of such devices. After a critical review of the methods and conclusions, these claims are shown to be untenable.

  7. Computer Security Risk Assessment

    1992-02-11

    LAVA/CS (LAVA for Computer Security) is an application of the Los Alamos Vulnerability Assessment (LAVA) methodology specific to computer and information security. The software serves as a generic tool for identifying vulnerabilities in computer and information security safeguards systems. Although it does not perform a full risk assessment, the results from its analysis may provide valuable insights into security problems. LAVA/CS assumes that the system is exposed both to natural and environmental hazards and to deliberate malevolent actions by either insiders or outsiders. In the process of answering the LAVA/CS questionnaire, the user identifies missing safeguards in 34 areas ranging from password management to personnel security and internal audit practices. Specific safeguards protecting a generic set of assets (or targets) from a generic set of threats (or adversaries) are considered. There are four generic assets: the facility, the organization's environment; the hardware, all computer-related hardware; the software, the information in machine-readable form stored either on-line or on transportable media; and the documents and displays, the information in human-readable form stored as hard-copy materials (manuals, reports, listings in full-size or microform), film, and screen displays. Two generic threats are considered: natural and environmental hazards (storms, fires, power abnormalities, water and accidental maintenance damage); and on-site human threats, both intentional and accidental acts attributable to a perpetrator on the facility's premises.
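
    The asset and threat taxonomy enumerated in this abstract fits naturally into a small data structure; the sketch below simply captures the abstract's own categories (the identifiers are illustrative, not LAVA/CS's actual internal representation).

```python
# The four generic assets and two generic threat classes named in the
# LAVA/CS description, as a simple data model; names are illustrative only.
LAVA_CS_ASSETS = {
    "facility": "the organization's environment",
    "hardware": "all computer-related hardware",
    "software": "machine-readable information, on-line or on transportable media",
    "documents_and_displays": "human-readable information: hard copy, film, screen displays",
}

LAVA_CS_THREATS = {
    "natural_environmental": [
        "storms", "fires", "power abnormalities",
        "water damage", "accidental maintenance damage",
    ],
    "on_site_human": ["intentional acts", "accidental acts"],
}
```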

  8. NASA Critical Facilities Maintenance Assessment

    NASA Technical Reports Server (NTRS)

    Oberhettinger, David J.

    2006-01-01

    Critical Facilities Maintenance Assessment (CFMA) was first implemented by NASA following the March 2000 overtest of the High Energy Solar Spectroscopic Imager (HESSI) spacecraft. A sine burst dynamic test using a 40-year-old shaker failed: mechanical binding/slippage of the slip table imparted 10 times the planned force to the test article, causing major structural damage to HESSI. The mechanical "health" of the shaker had not been assessed and tracked to ensure the test equipment was in good working order. Similar incidents have occurred at NASA facilities due to inadequate maintenance (e.g., rainwater from a leaky roof contaminated an assembly facility that housed a spacecraft). The HESSI incident alerted NASA to the urgent need to identify inadequacies in ground facility readiness and maintenance practices. The consequences of failures of the ground facilities that service these NASA systems are severe due to the high unit value of NASA products.

  9. Formative Assessment using Computer-Aided Assessment.

    ERIC Educational Resources Information Center

    Lawson, Duncan

    1999-01-01

    Describes how computer-aided assessment can provide a means of preserving formative assessment within the curriculum at a fraction of the time-cost involved with written work. Illustrates a variety of computer-aided assessment styles. (Author/ASK)

  10. Formative Assessment: A Critical Review

    ERIC Educational Resources Information Center

    Bennett, Randy Elliot

    2011-01-01

    This paper covers six interrelated issues in formative assessment (aka, "assessment for learning"). The issues concern the definition of formative assessment, the claims commonly made for its effectiveness, the limited attention given to domain considerations in its conceptualisation, the under-representation of measurement principles in that…

  11. Criticality assessment of LLRWDF closure

    SciTech Connect

    Sarrack, A.G.; Weber, J.H.; Woody, N.D.

    1992-10-06

    During the operation of the Low Level Radioactive Waste Disposal Facility (LLRWDF), large amounts (greater than 100 kg) of enriched uranium (EU) were buried. This EU came primarily from the closing and decontamination of the Naval Fuels Facility in the period 1987--1989. Waste Management Operations (WMO) procedures were used to keep the EU boxes separated to prevent possible criticality during normal operation. Closure of the LLRWDF is currently being planned, and waste stabilization by Dynamic Compaction (DC) is proposed. Dynamic compaction will crush the containers in the LLRWDF and result in changes in their geometry. Research into LLRWDF operations and record-keeping practices has shown that the EU contents of the trenches are known, but the details of the arrangement of those contents cannot be proven. Reviews of the trench contents, combined with analysis of potential critical configurations, revealed that some portions of the LLRWDF can be expected to be free of criticality concerns, while other sections have credible probabilities for the assembly of a critical mass, even in the uncompacted configuration. This will have an impact on the closure options and on which trenches can be compacted.

  12. Software Safety Risk in Legacy Safety-Critical Computer Systems

    NASA Technical Reports Server (NTRS)

    Hill, Janice; Baggs, Rhoda

    2007-01-01

    Safety-critical computer systems must be engineered to meet system and software safety requirements. For legacy safety-critical computer systems, software safety requirements may not have been formally specified during development. When process-oriented software safety requirements are levied on a legacy system after the fact, where software development artifacts do not exist or are incomplete, the question becomes 'how can this be done?' The risks associated with meeting only certain software safety requirements in a legacy safety-critical computer system must be addressed should such systems be selected as candidates for reuse. This paper proposes a method for formally ascertaining a software safety risk assessment that provides measurements of software safety for legacy systems, which may or may not have the suite of software engineering documentation that is now normally required. It relies upon the NASA Software Safety Standard, risk assessment methods based upon the Taxonomy-Based Questionnaire, and the application of reverse engineering CASE tools to produce original design documents for legacy systems.

  13. Computer Narrative Assessment Reports.

    ERIC Educational Resources Information Center

    Mathews, Walter M.

    The use of narrative test reports overcomes the major barrier to understanding reports: the language that is used. Early attempts to utilize the computer in generating narrative reports include: (1) Teaching Information Processing System (TIPS), involving periodic collection of information from students regarding courses, which is…

  14. Assessment of Critical-Analytic Thinking

    ERIC Educational Resources Information Center

    Brown, Nathaniel J.; Afflerbach, Peter P.; Croninger, Robert G.

    2014-01-01

    National policy and standards documents, including the National Assessment of Educational Progress frameworks, the "Common Core State Standards" and the "Next Generation Science Standards," assert the need to assess critical-analytic thinking (CAT) across subject areas. However, assessment of CAT poses several challenges for…

  15. Assessing Postgraduate Students' Critical Thinking Ability

    ERIC Educational Resources Information Center

    Javed, Muhammad; Nawaz, Muhammad Atif; Qurat-Ul-Ain, Ansa

    2015-01-01

    This paper assesses the critical thinking ability of postgraduate students. The target population was male and female university students in Pakistan. A small sample of 45 male and 45 female students was selected randomly from The Islamia University of Bahawalpur, Pakistan. Cornell Critical Thinking Test Series, The…

  16. Equivalent damage: A critical assessment

    NASA Technical Reports Server (NTRS)

    Laflen, J. R.; Cook, T. S.

    1982-01-01

    Concepts in equivalent damage were evaluated to determine their applicability to the life prediction of hot path components of aircraft gas turbine engines. Equivalent damage was defined as those effects which influence the crack-initiation lifetime beyond the damage that is measured in uniaxial, fully reversed sinusoidal and isothermal experiments at low homologous temperatures. Three areas of equivalent damage were examined: mean stress, cumulative damage, and multiaxiality. For each area, a literature survey was conducted to aid in selecting the most appropriate theories. Where possible, data correlations were also used in the evaluation process. A set of criteria was developed for ranking the theories in each equivalent damage regime. These criteria considered aspects of engine utilization as well as the theoretical basis and correlative ability of each theory. In addition, consideration was given to the complex nature of the loading cycle at fatigue-critical locations of hot path components; this loading includes non-proportional multiaxial stressing, combined temperature and strain fluctuations, and general creep-fatigue interactions. Application of selected equivalent damage theories to suitable data sets showed that there are insufficient data to allow specific recommendations of preferred theories for general applications. A series of experiments and areas for further investigation were identified.

  17. Climate Modeling Computing Needs Assessment

    NASA Astrophysics Data System (ADS)

    Petraska, K. E.; McCabe, J. D.

    2011-12-01

    This paper discusses early findings of an assessment of computing needs for NASA science, engineering, and flight communities. The purpose of this assessment is to document a comprehensive set of computing needs that will allow us to better evaluate whether our computing assets are adequately structured to meet evolving demand. The early results are interesting, already pointing out improvements we can make today to get more out of the computing capacity we have, as well as potentially game-changing innovations for the future in how we apply information technology to science computing. Our objective is to learn how to leverage our resources in the best way possible to do more science for less money. Our approach in this assessment is threefold: developing use case studies for science workflows; creating a taxonomy and structure for describing science computing requirements; and characterizing agency computing, analysis, and visualization resources. As projects evolve, science data sets increase in a number of ways: in size, scope, timelines, complexity, and fidelity. Generating, processing, moving, and analyzing these data sets places distinct and discernible requirements on underlying computing, analysis, storage, and visualization systems. The initial focus group for this assessment is the Earth Science modeling community within NASA's Science Mission Directorate (SMD). As the assessment evolves, this focus will expand to other science communities across the agency. We will discuss our use cases, our framework for requirements, and our characterizations, as well as our interview process, what we learned, and how we plan to improve our materials after using them in the first round of interviews in the Earth Science modeling community. We will describe our plans for expanding this assessment, first into the Earth Science data analysis and remote sensing communities, and then throughout the full community of science, engineering, and flight at NASA.

  18. Reliability of assessment of critical thinking.

    PubMed

    Allen, George D; Rubenfeld, M Gaie; Scheffer, Barbara K

    2004-01-01

    Although clinical critical thinking skills and behaviors are among the most highly sought characteristics of BSN graduates, they remain among the most difficult to teach and assess. Three reasons for this difficulty have been (1) lack of agreement among nurse educators as to the definition of critical thinking, (2) low correlation between clinical critical thinking and existing standardized tests of critical thinking, and (3) poor reliability in scoring other evidence of critical thinking, such as essays. This article first describes a procedure for teaching critical thinking that is based on a consensus definition of 17 dimensions of critical thinking in clinical nursing practice. This procedure is easily taught to nurse educators and can be flexibly and inexpensively incorporated into any undergraduate nursing curriculum. We then show that students' understanding and use of these dimensions can be assessed with high reliability (coefficient alpha between 0.7 and 0.8) and with great time efficiency for both teachers and students. By using this procedure iteratively across semesters, students can develop portfolios demonstrating attainment of competence in clinical critical thinking, and educators can obtain important summary evaluations of the degree to which their graduates have succeeded in this important area of their education.
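
    The reliability figure quoted here (coefficient alpha between 0.7 and 0.8) is Cronbach's alpha, which can be computed directly from a persons-by-items score matrix; a minimal sketch, assuming each student's ratings are stored as a list of item scores:

```python
def cronbach_alpha(scores):
    """Cronbach's coefficient alpha for a persons-by-items score matrix:
    alpha = k/(k-1) * (1 - sum(item variances) / variance(total scores))."""
    k = len(scores[0])              # number of items
    def pvar(xs):                   # population variance
        m = sum(xs) / len(xs)
        return sum((x - m) ** 2 for x in xs) / len(xs)
    item_vars = sum(pvar([row[i] for row in scores]) for i in range(k))
    total_var = pvar([sum(row) for row in scores])
    return (k / (k - 1)) * (1 - item_vars / total_var)
```

    For perfectly consistent items the summed item variances equal 1/k of the total-score variance, giving alpha = 1; values in the 0.7 to 0.8 range, as reported here, are conventionally read as acceptable internal consistency.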

  19. Recent Use of Covariance Data for Criticality Safety Assessment

    NASA Astrophysics Data System (ADS)

    Rearden, B. T.; Mueller, D. E.

    2008-12-01

    The TSUNAMI codes of the Oak Ridge National Laboratory SCALE code system were applied to a burnup credit application to demonstrate the use of sensitivity and uncertainty analysis with recent cross section covariance data for criticality safety code and data validation. The use of sensitivity and uncertainty analysis provides for the assessment of a defensible computational bias, bias uncertainty, and gap analysis for a complex system that otherwise could be assessed only through the use of expert judgment and conservative assumptions.
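
    In its simplest, unweighted form, the computational bias referred to here is the mean difference between calculated and benchmark k-eff values over a validation suite, with the bias uncertainty taken from their spread. The sketch below (with hypothetical numbers) shows only that baseline; TSUNAMI's contribution is to weight each benchmark by its sensitivity/uncertainty-based similarity to the application, which this sketch deliberately omits.

```python
from statistics import mean, stdev

def unweighted_bias(k_calc, k_benchmark):
    """Unweighted computational bias and its uncertainty:
    bias = mean(k_calc - k_benchmark) over a set of critical benchmarks;
    uncertainty = sample standard deviation of those differences.
    Sensitivity/uncertainty methods refine this by weighting benchmarks
    by similarity to the application; that refinement is omitted here."""
    diffs = [c - b for c, b in zip(k_calc, k_benchmark)]
    return mean(diffs), stdev(diffs)
```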

  20. Recent Use of Covariance Data for Criticality Safety Assessment

    SciTech Connect

    Rearden, Bradley T; Mueller, Don

    2008-01-01

    The TSUNAMI codes of the Oak Ridge National Laboratory SCALE code system were applied to a burnup credit application to demonstrate the use of sensitivity and uncertainty analysis with recent cross section covariance data for criticality safety code and data validation. The use of sensitivity and uncertainty analysis provides for the assessment of a defensible computational bias, bias uncertainty, and gap analysis for a complex system that otherwise could be assessed only through the use of expert judgment and conservative assumptions.

  1. To assess the reparative ability of differentiated mesenchymal stem cells in a rat critical size bone repair defect model using high frequency co-registered photoacoustic/ultrasound imaging and micro computed tomography

    NASA Astrophysics Data System (ADS)

    Zafar, Haroon; Gaynard, Sean; O'Flatharta, Cathal; Doroshenkova, Tatiana; Devine, Declan; Sharif, Faisal; Barry, Frank; Hayes, Jessica; Murphy, Mary; Leahy, Martin J.

    2016-03-01

    Stem cell based treatments hold great potential and promise to address many unmet clinical needs. Non-invasive imaging techniques for monitoring transplanted stem cells qualitatively and quantitatively are therefore crucial. The objective of this study was to create a critical size bone defect in the rat femur and then assess the ability of differentiated mesenchymal stem cells (MSCs) to repair the defect using high frequency co-registered photoacoustic (PA)/ultrasound (US) imaging and micro computed tomography (μCT) over an 8-week period. Combined PA and US imaging was performed using a 256-element, 21 MHz linear-array transducer combined with a multichannel collecting system. In vivo 3D PA and US images of the defect bone in the rat femur were acquired 4 and 8 weeks after surgery. Co-registered 3D structural images, such as microvasculature, and functional images, such as total haemoglobin concentration (HbT) and haemoglobin oxygen saturation (sO2), were obtained using PA and US imaging. Bone formation was assessed 4 and 8 weeks after surgery by μCT. High frequency linear-array based co-registered PA/US imaging has been found promising in terms of non-invasiveness, sensitivity, adaptability, and high spatial and temporal resolution at sufficient depths for the assessment of the reparative ability of MSCs in a rat critical size bone repair defect model.

  2. A Synthesis and Survey of Critical Success Factors for Computer Technology Projects

    ERIC Educational Resources Information Center

    Baker, Ross A.

    2012-01-01

    The author investigated the existence of critical success factors for computer technology projects. Current research literature and a survey of experienced project managers indicate that there are 23 critical success factors (CSFs) that correlate with project success. The survey gathered an assessment of project success and the degree to which…

  3. Assessment of paediatric pain: a critical review.

    PubMed

    Manocha, Sachin; Taneja, Navneet

    2016-06-01

    Pain is a complex experience, and its quantification involves many aspects, including physiological, behavioural, and psychological factors. References related to the topic were selected and analysed, along with a PubMed search of recent and earlier reports. Assessment of pain in infants and children has always been a dilemma for clinicians. Unlike in adults, it is difficult to assess and effectively treat pain in paediatric age groups, and it often remains untreated or undertreated. Misperceptions are attributed not only to the difficulties in isolating the specific signs of pain but also to those in recognising and inferring the meaning of the cues available amid the individual differences in children's reaction patterns to pain. In children, several parameters, such as age, cognitive level, and type of pain, must be considered in the selection of appropriate pain assessment tools. Although considerable progress has been made, there is a critical need for a more accurate measurement tool for both research and clinical purposes. This review critically analyses the various techniques available to assess pain in children, with emphasis on current research and the present-day status of paediatric pain assessment.

  4. DOE/EM Criticality Safety Needs Assessment

    SciTech Connect

    Westfall, Robert Michael; Hopper, Calvin Mitchell

    2011-02-01

    The issue of nuclear criticality safety (NCS) in Department of Energy Environmental Management (DOE/EM) fissionable material operations presents challenges because of the large quantities of material present in the facilities and equipment that are committed to storage and/or material conditioning and dispositioning processes. Given the uncertainty associated with the material and conditions for many DOE/EM fissionable material operations, ensuring safety while maintaining operational efficiency requires the application of the most-effective criticality safety practices. In turn, more-efficient implementation of these practices can be achieved if the best NCS technologies are utilized. In 2002, DOE/EM-1 commissioned a survey of criticality safety technical needs at the major EM sites. These needs were documented in the report Analysis of Nuclear Criticality Safety Technology Supporting the Environmental Management Program, issued May 2002. Subsequent to this study, EM safety management personnel made a commitment to applying the best and latest criticality safety technology, as described by the DOE Nuclear Criticality Safety Program (NCSP). Over the past 7 years, this commitment has enabled the transfer of several new technologies to EM operations. In 2008, it was decided to broaden the basis of the EM NCS needs assessment to include not only current needs for technologies but also NCS operational areas with potential for improvements in controls, analysis, and regulations. A series of NCS workshops has been conducted over the past years, and needs have been identified and addressed by EM staff and contractor personnel. These workshops were organized and conducted by the EM Criticality Safety Program Manager with administrative and technical support by staff at Oak Ridge National Laboratory (ORNL). This report records the progress made in identifying the needs, determining the approaches for addressing these needs, and assimilating new NCS technologies into EM

  5. Radiation exposure and risk assessment for critical female body organs

    NASA Technical Reports Server (NTRS)

    Atwell, William; Weyland, Mark D.; Hardy, Alva C.

    1991-01-01

    Space radiation exposure limits for astronauts are based on recommendations of the National Council on Radiation Protection and Measurements. These limits now include the age at exposure and sex of the astronaut. A recently developed computerized anatomical female (CAF) model is discussed in detail. Computer-generated, cross-sectional data are presented to illustrate the completeness of the CAF model. By applying ray-tracing techniques, shield distribution functions have been computed to calculate absorbed dose and dose equivalent values for a variety of critical body organs (e.g., breasts, lungs, thyroid gland, etc.) and mission scenarios. Specific risk assessments, i.e., of cancer induction and mortality, are reviewed.
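
    For each traced ray, the shielding calculation reduces to exponential attenuation through the shield thicknesses encountered along the path; a minimal sketch of that relation (the coefficients, thicknesses, and function name are hypothetical, and real dose calculations add buildup factors and particle-specific physics omitted here):

```python
import math

def transmitted_dose(unshielded_dose, thicknesses, mu):
    """Dose after one ray crosses shield segments of thickness t_i (cm)
    with linear attenuation coefficients mu_i (1/cm):
        D = D0 * exp(-sum(mu_i * t_i)).
    A shield distribution function aggregates such path lengths over many
    rays traced from an organ site through the surrounding geometry."""
    return unshielded_dose * math.exp(-sum(m * t for m, t in zip(mu, thicknesses)))
```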

  6. Critical Reflection on Cultural Difference in the Computer Conference

    ERIC Educational Resources Information Center

    Ziegahn, Linda

    2005-01-01

    Adult educators have a strong interest in designing courses that stimulate learning toward critical, more inclusive cultural perspectives. Critical reflection is a key component of both intercultural learning and a growing medium of instruction, the asynchronous computer conference (CC). This study combined qualitative methodology with a framework…

  7. Assessment of computational performance in nuclear criticality

    SciTech Connect

    Petrie, L.M.; Thomas, J.T.

    1985-01-01

    This report presents the results of a study undertaken to resolve the long-standing discrepancies between calculations and experiments involving arrays of fissile solution units. Room return was found to be sufficient to account for the discrepancy for some bare arrays, but reflected arrays are still in disagreement, and the magnitude of the room return raises other unresolved issues.

  8. Nutritional assessment in the critically ill.

    PubMed

    Manning, E M; Shenkin, A

    1995-07-01

    Although many of the measurements and techniques outlined in this article may be epidemiologically useful and correlate with morbidity and mortality, no single indicator is of consistent value in the nutritional assessment of critically ill patients. Measurements such as anthropometrics, total body fat estimation, or delayed hypersensitivity skin testing either are liable to non-nutritional influences or lack accuracy and precision in individual patients. Plasma concentrations of hepatic proteins are affected significantly by the patient's underlying disease state and therapeutic interventions and therefore lack specificity. Although the measurement of these proteins is of little value in the initial nutritional assessment of the critically ill, serial measurement, particularly of plasma pre-albumin, may be useful in monitoring the response to nutritional support. Nitrogen balance is a widely used and valuable nutritional indicator in the critically ill. Direct measurement of urine nitrogen is the preferred test, although nitrogen excretion often is derived from 24-hour urine urea measurement, an inexpensive and easy procedure, but one that is less accurate. More accurate techniques of assessing change in nutritional status, such as IVNAA of total body nitrogen or isotopic measurement of exchangeable potassium or sodium, are more expensive, less available, unsuitable for repeated analyses, and less feasible in severely ill patients. Total body nitrogen measured using IVNAA and total-body potassium, however, are the most accurate ways of measuring body composition in the presence of large amounts of edema fluid. The application of body composition measurements to patient care remains poorly defined because of the many problems encountered with the various techniques, including cost, availability, and radiation exposure. Improved, more sensitive and, preferably, bedside methods for the measurement of body composition are needed. It is of paramount importance that

  9. Nutritional Assessment in Critically Ill Patients

    PubMed Central

    Hejazi, Najmeh; Mazloom, Zohreh; Zand, Farid; Rezaianzadeh, Abbas; Amini, Afshin

    2016-01-01

    Background: Malnutrition is an important factor in the survival of critically ill patients. The purpose of the present study was to assess the nutritional status of patients in the intensive care unit (ICU) on the days of admission and discharge via a detailed nutritional assessment. Methods: In total, 125 patients were followed up from admission to discharge at 8 ICUs in Shiraz, Iran. The patients' nutritional status was assessed using subjective global assessment (SGA), anthropometric measurements, biochemical indices, and body composition indicators. Diet prescription and intake were also evaluated. Results: Malnutrition prevalence significantly increased on the day of discharge (58.62%) compared to the day of admission (28.8%) according to SGA (P<0.001). The patients' weight, mid-upper-arm circumference, mid-arm muscle circumference, triceps skinfold thickness, and calf circumference decreased significantly as well (P<0.001). Lean mass weight and body cell mass also decreased significantly (P<0.001). Biochemical indices showed no notable changes except for magnesium, which decreased significantly (P=0.013). A significant negative correlation was observed between malnutrition on discharge day and anthropometric measurements. Positive and significant correlations were observed between the number of days without enteral feeding, the delay from ICU admission to the commencement of enteral feeding, the length of ICU stay, and malnutrition on discharge day. Energy and protein intakes were significantly less than the prescribed diet (by 26.26% and 26.48%, respectively). Conclusion: Malnutrition on discharge day increased in the patients in the ICU according to SGA. Anthropometric measurements were better predictors of the nutritional outcome of our critically ill patients than were biochemical tests. PMID:27217600

  10. Assessing Terrorist Motivations for Attacking Critical Infrastructure

    SciTech Connect

    Ackerman, G; Abhayaratne, P; Bale, J; Bhattacharjee, A; Blair, C; Hansell, L; Jayne, A; Kosal, M; Lucas, S; Moran, K; Seroki, L; Vadlamudi, S

    2006-12-04

    Certain types of infrastructure--critical infrastructure (CI)--play vital roles in underpinning our economy, security and way of life. These complex and often interconnected systems have become so ubiquitous and essential to day-to-day life that they are easily taken for granted. Often it is only when the important services provided by such infrastructure are interrupted--when we lose easy access to electricity, health care, telecommunications, transportation or water, for example--that we are conscious of our great dependence on these networks and of the vulnerabilities that stem from such dependence. Unfortunately, it must be assumed that many terrorists are all too aware that CI facilities pose high-value targets that, if successfully attacked, have the potential to dramatically disrupt the normal rhythm of society, cause public fear and intimidation, and generate significant publicity. Indeed, revelations emerging at the time of this writing about Al Qaida's efforts to prepare for possible attacks on major financial facilities in New York, New Jersey, and the District of Columbia remind us just how real and immediate such threats to CI may be. Simply being aware that our nation's critical infrastructure presents terrorists with a plethora of targets, however, does little to mitigate the dangers of CI attacks. In order to prevent and preempt such terrorist acts, better understanding of the threats and vulnerabilities relating to critical infrastructure is required. The Center for Nonproliferation Studies (CNS) presents this document as both a contribution to the understanding of such threats and an initial effort at ''operationalizing'' its findings for use by analysts who work on issues of critical infrastructure protection. Specifically, this study focuses on a subsidiary aspect of CI threat assessment that has thus far remained largely unaddressed by contemporary terrorism research: the motivations and related factors that determine whether a terrorist

  11. A critical analysis of sarcoidosis incidence assessment.

    PubMed

    Reich, Jerome M

    2013-01-01

    Valid sarcoidosis incidence assessment is contingent on access to medical care, thoroughness of reportage, assiduity of radiographic interpretation, employment and health care screening policies, misclassification, and population ethnicity. To diminish ambiguity and foster inter-population comparison, the term "sarcoidosis incidence" must be modified to convey the methodology employed in compiling the numerator. In age-delimited cohorts, valid comparison to population incidence requires age adjustment due to the age-dependency of incidence. The "true incidence" of sarcoidosis is a notional concept: more than 90% of cases are subclinical and radiographically inevident. Occupational causal inference based on incidence differential vs. populations has been undermined by methodological differences in ascertainment and computation.
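The age adjustment the abstract calls for amounts to direct standardization of incidence rates. A minimal sketch of that computation; all rates and population weights below are hypothetical, not taken from the study:

```python
# Direct age standardization: weight each age-specific incidence rate
# by a standard population's age distribution. All numbers hypothetical.

def age_adjusted_rate(stratum_rates, standard_weights):
    """Weighted sum of age-specific rates (per 100,000 person-years)."""
    assert abs(sum(standard_weights) - 1.0) < 1e-9
    return sum(r * w for r, w in zip(stratum_rates, standard_weights))

# Hypothetical cohort: sarcoidosis incidence peaks in middle age.
rates = [2.0, 11.0, 7.0, 3.0]          # ages <25, 25-44, 45-64, 65+
weights = [0.30, 0.28, 0.26, 0.16]     # standard population shares

print(round(age_adjusted_rate(rates, weights), 2))
```

Two age-delimited cohorts standardized to the same weights can then be compared directly, which is the point the abstract makes about age-dependency of incidence.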

  12. CRITICAL ISSUES IN HIGH END COMPUTING - FINAL REPORT

    SciTech Connect

    Corones, James

    2013-09-23

High-End computing (HEC) has been a driver for advances in science and engineering for the past four decades. Increasingly HEC has become a significant element in the national security, economic vitality, and competitiveness of the United States. Advances in HEC provide results that cut across traditional disciplinary and organizational boundaries. This program provides opportunities to share information about HEC systems and computational techniques across multiple disciplines and organizations through conferences and exhibitions of HEC advances held in Washington DC so that mission agency staff, scientists, and industry can come together with White House, Congressional and Legislative staff in an environment conducive to the sharing of technical information, accomplishments, goals, and plans. A common thread across this series of conferences is the understanding of computational science and applied mathematics techniques across a diverse set of application areas of interest to the Nation. The specific objectives of this program are: Program Objective 1. To provide opportunities to share information about advances in high-end computing systems and computational techniques between mission critical agencies, agency laboratories, academics, and industry. Program Objective 2. To gather pertinent data and address specific topics of wide interest to mission critical agencies. Program Objective 3. To promote a continuing discussion of critical issues in high-end computing. Program Objective 4. To provide a venue where a multidisciplinary scientific audience can discuss the difficulties applying computational science techniques to specific problems and can specify future research that, if successful, will eliminate these problems.

  13. Program computes single-point failures in critical system designs

    NASA Technical Reports Server (NTRS)

    Brown, W. R.

    1967-01-01

    Computer program analyzes the designs of critical systems that will either prove the design is free of single-point failures or detect each member of the population of single-point failures inherent in a system design. This program should find application in the checkout of redundant circuits and digital systems.

  14. Countering Deterministic Tools: A Critical Theory Approach to Computers & Composition.

    ERIC Educational Resources Information Center

    Kimme Hea, Amy C.

    A writing instructor has grappled with how both to integrate and to complicate critical perspectives on technology in the writing classroom. In collaboration with another instructor, a computer classroom pedagogy was constructed emphasizing imperatives of cultural studies practice as outlined by James Berlin. The pedagogy is similar to Berlin's…

  15. Critical infrastructure systems of systems assessment methodology.

    SciTech Connect

    Sholander, Peter E.; Darby, John L.; Phelan, James M.; Smith, Bryan; Wyss, Gregory Dane; Walter, Andrew; Varnado, G. Bruce; Depoy, Jennifer Mae

    2006-10-01

    Assessing the risk of malevolent attacks against large-scale critical infrastructures requires modifications to existing methodologies that separately consider physical security and cyber security. This research has developed a risk assessment methodology that explicitly accounts for both physical and cyber security, while preserving the traditional security paradigm of detect, delay, and respond. This methodology also accounts for the condition that a facility may be able to recover from or mitigate the impact of a successful attack before serious consequences occur. The methodology uses evidence-based techniques (which are a generalization of probability theory) to evaluate the security posture of the cyber protection systems. Cyber threats are compared against cyber security posture using a category-based approach nested within a path-based analysis to determine the most vulnerable cyber attack path. The methodology summarizes the impact of a blended cyber/physical adversary attack in a conditional risk estimate where the consequence term is scaled by a ''willingness to pay'' avoidance approach.

  16. An Assessment of Student Computer Ergonomic Knowledge.

    ERIC Educational Resources Information Center

    Alexander, Melody W.

    1997-01-01

    Business students (n=254) were assessed on their knowledge of computers, health and safety, radiation, workstations, and ergonomic techniques. Overall knowledge was low in all categories. In particular, they had not learned computer-use techniques. (SK)

  17. Auroral weak double layers: A critical assessment

    NASA Astrophysics Data System (ADS)

    Koskinen, Hannu E. J.; Mälkki, Anssi M.

Weak double layers (WDLs) were first observed in the mid-altitude auroral magnetosphere in 1976 by the S3-3 satellite. The observations were confirmed by Viking in 1986, when more detailed information of these small-scale plasma structures became available. WDLs are upward moving rarefactive solitary structures with negative electric potential. The potential drop over a WDL is typically 0-1 V with electric field pointing predominantly upward. The structures are usually found in relatively weak (≤2 kV) auroral acceleration regions where the field-aligned current is upward, but sometimes very small. The observations suggest that WDLs exist in regions of cool electron and ion background. Most likely the potential structures are embedded in the background ion population that may drift slowly upward. There have been several attempts at a plasma-physical explanation of WDLs, but so far success has been limited. Computer simulations have been able to produce similar structures, but usually for somewhat unrealistic plasma parameters. A satisfactory understanding of the phenomenon requires consideration of the role of WDLs in the magnetosphere-ionosphere (MI) coupling, including the large-scale electric fields, both parallel and perpendicular to the magnetic field, and the Alfvén waves mediating the coupling. In this report we give a critical review of our present understanding of WDLs. We try to find out what can be safely deduced from the observations, what are just educated guesses, and where we may go wrong.

  18. Radiation exposure and risk assessment for critical female body organs

    SciTech Connect

Atwell, W.; Weyland, M.D.; Hardy, A.C. (NASA, Johnson Space Center, Houston, TX)

    1991-07-01

    Space radiation exposure limits for astronauts are based on recommendations of the National Council on Radiation Protection and Measurements. These limits now include the age at exposure and sex of the astronaut. A recently-developed computerized anatomical female (CAF) model is discussed in detail. Computer-generated, cross-sectional data are presented to illustrate the completeness of the CAF model. By applying ray-tracing techniques, shield distribution functions have been computed to calculate absorbed dose and dose equivalent values for a variety of critical body organs (e.g., breasts, lungs, thyroid gland, etc.) and mission scenarios. Specific risk assessments, i.e., cancer induction and mortality, are reviewed. 13 refs.

  19. Adapting the Critical Thinking Assessment Test for Palestinian Universities

    ERIC Educational Resources Information Center

    Basha, Sami; Drane, Denise; Light, Gregory

    2016-01-01

    Critical thinking is a key learning outcome for Palestinian students. However, there are no validated critical thinking tests in Arabic. Suitability of the US developed Critical Thinking Assessment Test (CAT) for use in Palestine was assessed. The test was piloted with university students in English (n = 30) and 4 questions were piloted in Arabic…

  20. Cryptographic Key Management and Critical Risk Assessment

    SciTech Connect

    Abercrombie, Robert K

    2014-05-01

The Department of Energy Office of Electricity Delivery and Energy Reliability (DOE-OE) CyberSecurity for Energy Delivery Systems (CSEDS) industry-led program (DE-FOA-0000359) entitled "Innovation for Increasing CyberSecurity for Energy Delivery Systems (12CSEDS)," awarded a contract to Sypris Electronics LLC to develop a Cryptographic Key Management System for the smart grid (Scalable Key Management Solutions for Critical Infrastructure Protection). Oak Ridge National Laboratory (ORNL) and Sypris Electronics, LLC as a result of that award entered into a CRADA (NFE-11-03562) between ORNL and Sypris Electronics, LLC. ORNL provided its Cyber Security Econometrics System (CSES) as a tool to be modified and used as a metric to address risks and vulnerabilities in the management of cryptographic keys within the Advanced Metering Infrastructure (AMI) domain of the electric sector. ORNL concentrated its analysis on the AMI domain, for which the National Electric Sector Cybersecurity Organization Resource (NESCOR) Working Group 1 (WG1) has documented 29 failure scenarios. The computational infrastructure of this metric involves system stakeholders, security requirements, system components and security threats. To compute this metric, we estimated the stakes that each stakeholder associates with each security requirement, as well as stochastic matrices that represent the probability of a threat to cause a component failure and the probability of a component failure to cause a security requirement violation. We applied this model to estimate the security of the AMI, by leveraging the recently established National Institute of Standards and Technology Interagency Report (NISTIR) 7628 guidelines for smart grid security and the International Electrotechnical Commission (IEC) 62351, Part 9 to identify the life cycle for cryptographic key management, resulting in a vector that assigned to each stakeholder an estimate of their average loss in terms of dollars per day of system
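The metric the abstract outlines (stakeholder stakes combined with stochastic threat-to-failure and failure-to-violation matrices) reduces to a chain of matrix-vector products. A minimal sketch of that structure; every number and dimension below is invented for illustration and is not drawn from the CSES/AMI analysis:

```python
# Mean-failure-cost style metric: each stakeholder's expected loss
# ($/day) = stakes over requirements x P(requirement violated).
# Structure follows the abstract; every number here is invented.

def mat_vec(m, v):
    """Multiply a matrix (list of rows) by a vector."""
    return [sum(a * b for a, b in zip(row, v)) for row in m]

# P(threat occurs per day) for 2 hypothetical threats
threat_prob = [0.01, 0.05]
# P(component fails | threat): rows = components, cols = threats
comp_given_threat = [[0.5, 0.1],
                     [0.2, 0.4]]
# P(requirement violated | component fails): rows = requirements
req_given_comp = [[0.7, 0.3],
                  [0.1, 0.9]]
# Stakes ($ loss per violation): rows = stakeholders, cols = requirements
stakes = [[1000.0, 5000.0],
          [200.0, 300.0]]

comp_fail = mat_vec(comp_given_threat, threat_prob)  # P(component fails)
req_viol = mat_vec(req_given_comp, comp_fail)        # P(requirement violated)
losses = mat_vec(stakes, req_viol)                   # $/day per stakeholder
print([round(x, 2) for x in losses])
```

The final vector is the per-stakeholder "average loss in dollars per day" the abstract describes; refining any stake or probability estimate propagates directly through the product.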

  1. Predictive Dynamic Security Assessment through Advanced Computing

    SciTech Connect

    Huang, Zhenyu; Diao, Ruisheng; Jin, Shuangshuang; Chen, Yousu

    2014-11-30

Traditional dynamic security assessment is limited by several factors and thus falls short in providing real-time information to be predictive for power system operation. These factors include the steady-state assumption of current operating points, static transfer limits, and low computational speed. This paper addresses these factors and frames predictive dynamic security assessment. The primary objective of predictive dynamic security assessment is to enhance the functionality and computational process of dynamic security assessment through the use of high-speed phasor measurements and the application of advanced computing technologies for faster-than-real-time simulation. This paper presents algorithms, computing platforms, and simulation frameworks that constitute the predictive dynamic security assessment capability. Examples of phasor application and fast computation for dynamic security assessment are included to demonstrate the feasibility and speed enhancement for real-time applications.

  2. Assessing Vulnerabilities, Risks, and Consequences of Damage to Critical Infrastructure

    SciTech Connect

    Suski, N; Wuest, C

    2011-02-04

The Pre-Assessment Phase brings together infrastructure owners and operators to identify critical assets and help the team create a structured information request. During this phase, we gain information about the critical assets from those who are most familiar with operations and interdependencies, making the time we spend on the ground conducting the assessment much more productive and enabling the team to make actionable recommendations. The Assessment Phase analyzes 10 areas: Threat environment, cyber architecture, cyber penetration, physical security, physical penetration, operations security, policies and procedures, interdependencies, consequence analysis, and risk characterization. Each of these individual tasks uses direct and indirect data collection, site inspections, and structured and facilitated workshops to gather data. Because of the importance of understanding the cyber threat, LLNL has built both fixed and mobile cyber penetration, wireless penetration and supporting tools that can be tailored to fit customer needs. The Post-Assessment Phase brings vulnerability and risk assessments to the customer in a format that facilitates implementation of mitigation options. Often the assessment findings and recommendations are briefed and discussed with several levels of management and, if appropriate, across jurisdictional boundaries. The end result is enhanced awareness and informed protective measures. Over the last 15 years, we have continued to refine our methodology and capture lessons learned and best practices. The resulting risk and decision framework thus takes into consideration real-world constraints, including regulatory, operational, and economic realities. In addition to 'on the ground' assessments focused on mitigating vulnerabilities, we have integrated our computational and atmospheric dispersion capability with easy-to-use geo-referenced visualization tools to support emergency planning and response operations. LLNL is home to the National Atmospheric Release

  3. Inequalities, Assessment and Computer Algebra

    ERIC Educational Resources Information Center

    Sangwin, Christopher J.

    2015-01-01

    The goal of this paper is to examine single variable real inequalities that arise as tutorial problems and to examine the extent to which current computer algebra systems (CAS) can (1) automatically solve such problems and (2) determine whether students' own answers to such problems are correct. We review how inequalities arise in…

  4. Assessing Quality of Critical Thought in Online Discussion

    ERIC Educational Resources Information Center

    Weltzer-Ward, Lisa; Baltes, Beate; Lynn, Laura Knight

    2009-01-01

    Purpose: The purpose of this paper is to describe a theoretically based coding framework for an integrated analysis and assessment of critical thinking in online discussion. Design/methodology/approach: The critical thinking assessment framework (TAF) is developed through review of theory and previous research, verified by comparing results to…

  5. A Novel Instrument for Assessing Students' Critical Thinking Abilities

    ERIC Educational Resources Information Center

    White, Brian; Stains, Marilyne; Escriu-Sune, Marta; Medaglia, Eden; Rostamnjad, Leila; Chinn, Clark; Sevian, Hannah

    2011-01-01

    Science literacy involves knowledge of both science content and science process skills. In this study, we describe the Assessment of Critical Thinking Ability survey and its preliminary application to assess the critical thinking skills of undergraduate students, graduate students, and postdoctoral fellows. This survey is based on a complex and…

  6. CriTi-CAL: A computer program for Critical Coiled Tubing Calculations

    SciTech Connect

    He, X.

    1995-12-31

A computer software package for simulating coiled tubing operations has been developed at Rogaland Research. The software is named CriTi-CAL, for Critical Coiled Tubing Calculations. It is a PC program running under Microsoft Windows. CriTi-CAL is designed for predicting force, stress, torque, lockup, circulation pressure losses and along-hole-depth corrections for coiled tubing workover and drilling operations. CriTi-CAL features a user-friendly interface, integrated work string and survey editors, flexible input units and output format, on-line documentation and extensive error trapping. CriTi-CAL was developed by using a combination of Visual Basic and C. Such an approach is an effective way to quickly develop high quality small to medium size software for the oil industry. The software is based on the results of intensive experimental and theoretical studies on buckling and post-buckling of coiled tubing at Rogaland Research. The software has been validated by full-scale test results and field data.

  7. Perspectives on sedation assessment in critical care.

    PubMed

    Olson, Daiwai M; Thoyre, Suzanne M; Auyong, David B

    2007-01-01

Multiple studies have been undertaken to show that neurofunction monitors can correlate to objective sedation assessments. Showing a correlation between these 2 patient assessment tools may not be the correct approach for validation of neurofunction monitors. Two different methods of assessing 2 different modes of the patient's response to sedation should not be expected to precisely correlate unless the desire is to replace one method with the other. We provide a brief summary of several sedation scales, physiologic measures, and neurofunction monitoring tools, and of the correlation literature for bispectral index monitoring, the Ramsay Scale, and the Sedation Agitation Scale. Neurofunction monitors provide near continuous information about a different domain of the sedation response than intermittent observational assessments. Further research should focus on contributions from this technology to the improvement of patient outcomes when neurofunction monitoring is used as a complement, not a replacement, for observational methods of sedation assessment.

  8. Inequalities, assessment and computer algebra

    NASA Astrophysics Data System (ADS)

    Sangwin, Christopher J.

    2015-01-01

    The goal of this paper is to examine single variable real inequalities that arise as tutorial problems and to examine the extent to which current computer algebra systems (CAS) can (1) automatically solve such problems and (2) determine whether students' own answers to such problems are correct. We review how inequalities arise in contemporary curricula. We consider the formal mathematical processes by which such inequalities are solved, and we consider the notation and syntax through which solutions are expressed. We review the extent to which current CAS can accurately solve these inequalities, and the form given to the solutions by the designers of this software. Finally, we discuss the functionality needed to deal with students' answers, i.e. to establish equivalence (or otherwise) of expressions representing unions of intervals. We find that while contemporary CAS accurately solve inequalities there is a wide variety of notation used.
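The final functionality the abstract discusses, establishing equivalence of expressions representing unions of intervals, can be sketched by normalizing each union before comparison. A toy illustration only; it ignores open/closed endpoints, which a real CAS-backed checker would need to track:

```python
# Checking whether two answers denote the same union of real intervals:
# normalize each union (sort, merge overlapping pieces) and compare.
# Toy sketch: intervals are (lo, hi) pairs, endpoint openness ignored.

def normalize(intervals):
    """Sort (lo, hi) pairs and merge overlapping/touching ones."""
    merged = []
    for lo, hi in sorted(intervals):
        if merged and lo <= merged[-1][1]:
            merged[-1] = (merged[-1][0], max(merged[-1][1], hi))
        else:
            merged.append((lo, hi))
    return merged

def equivalent(a, b):
    """True if both unions normalize to the same interval list."""
    return normalize(a) == normalize(b)

# (1,3) U (2,5) describes the same set as (1,4) U (3,5)
print(equivalent([(1, 3), (2, 5)], [(1, 4), (3, 5)]))  # True
```

Normalization gives each union a canonical form, so two syntactically different student answers can be compared structurally rather than textually.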

  9. Criticality of Water: Aligning Water and Mineral Resources Assessment.

    PubMed

    Sonderegger, Thomas; Pfister, Stephan; Hellweg, Stefanie

    2015-10-20

    The concept of criticality has been used to assess whether a resource may become a limiting factor to economic activities. It has been primarily applied to nonrenewable resources, in particular to metals. However, renewable resources such as water may also be overused and become a limiting factor. In this paper, we therefore developed a water criticality method that allows for a new, user-oriented assessment of water availability and accessibility. Comparability of criticality across resources is desirable, which is why the presented adaptation of the criticality approach to water is based on a metal criticality method, whose basic structure is maintained. With respect to the necessary adaptations to the water context, a transparent water criticality framework is proposed that may pave the way for future integrated criticality assessment of metals, water, and other resources. Water criticality scores were calculated for 159 countries subdivided into 512 geographic units for the year 2000. Results allow for a detailed analysis of criticality profiles, revealing locally specific characteristics of water criticality. This is useful for the screening of sites and their related water criticality, for indication of water related problems and possible mitigation options and water policies, and for future water scenario analysis.


  11. RHIC CRITICAL POINT SEARCH: ASSESSING STAR'S CAPABILITIES.

    SciTech Connect

    SORENSEN,P.

    2006-07-03

    In this report we discuss the capabilities and limitations of the STAR detector to search for signatures of the QCD critical point in a low energy scan at RHIC. We find that a RHIC low energy scan will cover a broad region of interest in the nuclear matter phase diagram and that the STAR detector--a detector designed to measure the quantities that will be of interest in this search--will provide new observables and improve on previous measurements in this energy range.

  12. Gender and Equity Issues in Computer-Based Science Assessment.

    ERIC Educational Resources Information Center

    Cheek, Dennis W.; Agruso, Susan

    1995-01-01

    Suggests that computer and related technologies as tools for teaching, learning, and assessment are neither gender neutral nor benign in effect. Examines computers, equity, and access issues, computers as a technology, and the implications for computer-based assessment. (LZ)

  13. A Critical Evaluation of Cognitive Style Assessment.

    ERIC Educational Resources Information Center

    Richter, Ricka

    This document reviews theories of cognitive style and methods of cognitive style assessment as they relate to the context of South Africa, where sociopolitical changes call for reassessment of theoretical assumptions in education and training. The report consists of six chapters. After a brief introductory chapter, the second chapter gives an…

  14. Computer-assisted navigation in knee arthroplasty: a critical appraisal.

    PubMed

    Venkatesan, Muralidharan; Mahadevan, Devendra; Ashford, Robert U

    2013-10-01

    The purpose of this review was to appraise the use of computer-assisted navigation in total knee arthroplasty and to assess whether this technology has improved clinical outcomes. Studies were identified through searches in MEDLINE, Embase, and PubMed. Numerous studies have shown improved leg and component alignment using navigation systems. However, the better alignment achieved in navigated knee arthroplasty has not been shown to lead to better clinical outcomes. Navigated knee arthroplasty had lower calculated blood loss and lower incidence of fat embolism compared with conventional knee arthroplasty using intramedullary jigs. It may be most valued when dealing with complex knee deformities, revision surgery, or minimally invasive surgery. Navigated knee arthroplasty, however, is only cost-effective in centers with a high volume of joint replacements. Overall, computer-assisted navigated knee arthroplasty provides some advantages over conventional surgery, but its clinical benefits to date are unclear and remain to be defined on a larger scale.

  15. Fuzzy architecture assessment for critical infrastructure resilience

    SciTech Connect

    Muller, George

    2012-12-01

    This paper presents an approach for the selection of alternative architectures in a connected infrastructure system to increase resilience of the overall infrastructure system. The paper begins with a description of resilience and critical infrastructure, then summarizes existing approaches to resilience, and presents a fuzzy-rule based method of selecting among alternative infrastructure architectures. This methodology includes considerations which are most important when deciding on an approach to resilience. The paper concludes with a proposed approach which builds on existing resilience architecting methods by integrating key system aspects using fuzzy memberships and fuzzy rule sets. This novel approach aids the systems architect in considering resilience for the evaluation of architectures for adoption into the final system architecture.
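The fuzzy-membership-and-rule-set idea the abstract describes can be sketched in a few lines. The membership functions, system aspects, and the single rule below are invented for illustration and are not the paper's actual rule base:

```python
# Fuzzy-rule scoring of candidate architectures: a toy sketch.
# Membership functions and the rule are invented for illustration.

def tri(x, a, b, c):
    """Triangular membership function rising from a, peaking at b, falling to c."""
    if x <= a or x >= c:
        return 0.0
    return (x - a) / (b - a) if x < b else (c - x) / (c - b)

def resilience_score(redundancy, recovery_hours):
    # Fuzzify two (hypothetical) system aspects.
    high_red = tri(redundancy, 0.4, 1.0, 1.6)       # degree redundancy is "high"
    fast_rec = tri(recovery_hours, -8.0, 0.0, 8.0)  # degree recovery is "fast"
    # Rule: IF redundancy is high AND recovery is fast THEN resilience is high
    # (min models AND, as in Mamdani-style inference).
    return min(high_red, fast_rec)

# Compare two candidate architectures on the fuzzy resilience score.
for name, red, rec in [("A", 0.9, 2.0), ("B", 0.5, 6.0)]:
    print(name, round(resilience_score(red, rec), 3))
```

A full implementation would aggregate many such rules over all key system aspects and defuzzify, but the ranking of alternative architectures already emerges from this structure.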

  16. Software Safety Risk in Legacy Safety-Critical Computer Systems

    NASA Technical Reports Server (NTRS)

    Hill, Janice L.; Baggs, Rhoda

    2007-01-01

Safety standards contain technical and process-oriented safety requirements. Technical requirements are those such as "must work" and "must not work" functions in the system. Process-oriented requirements are software engineering and safety management process requirements. Some standards address the system perspective and some cover just the software in the system. NASA-STD-8719.13B, the Software Safety Standard, is the current standard of interest; NASA programs/projects will have their own set of safety requirements derived from it. A safety case is a documented demonstration that a system complies with the specified safety requirements: evidence is gathered on the integrity of the system and put forward as an argued case [Gardener (ed.)]. Problems occur when trying to meet safety standards, and thus make retrospective safety cases, in legacy safety-critical computer systems.

  17. Assessment of critical-fluid extractions in the process industries

    NASA Technical Reports Server (NTRS)

    1982-01-01

The potential for critical-fluid extraction as a separation process for improving the productive use of energy in the process industries is assessed. Critical-fluid extraction involves the use of fluids, normally gaseous at ambient conditions, as extraction solvents at temperatures and pressures around the critical point. Equilibrium and kinetic properties in this regime are very favorable for solvent applications, and generally allow major reductions in the energy requirements for separating and purifying chemical components of a mixture.

  18. Using Writing to Develop and Assess Critical Thinking.

    ERIC Educational Resources Information Center

    Wade, Carole

    1995-01-01

    Asserts that written work has advantages over oral discussion in the development and assessment of students' critical thinking skills. Describes a set of short writing assignments that focuses on eight essential aspects of critical and creative thought. Provides examples of how to use writing assignments in college psychology courses. (CFR)

  19. Assessment of community contamination: a critical approach.

    PubMed

    Clark, Lauren; Barton, Judith A; Brown, Nancy J

    2002-01-01

    The purpose of this paper is to review data from two Superfund sites and describe the latitude of interpretation of "environmental risk" by residents living in the area, governmental agencies, and the media. The first community was located within a 5-mi perimeter of the Rocky Flats Environmental Technology Site (RFETS) outside Denver, Colorado. The second community was located on the south side of Tucson, Arizona, adjacent to the Tucson International Airport area (TIAA) Superfund site. Critical theory was the perspective used in this analysis and proposal of public health actions to attain social justice. Differences between the two populations' experiences with risk and contamination coincided with divergent levels of trust in government. RFETS residents demanded monitoring, whereas the minority residents at TIAA were ambivalent about their trust in government cleanup activities. Unraveling the purpose of "facts" and the social force of "truth" can direct nurses to address environmental justice issues. By policing governmental and business activities in halting or cleaning up environmental contamination, nurses may become mouthpieces for the concerns underlying the fragile surface of "virtual trust" in contaminated communities. Cutting through competing rhetoric to police environmental safety, the core function of assurance becomes what nurses do, not what they say. PMID:12182695

  1. A COMPUTER-ASSIST MATERIAL TRACKING SYSTEM AS A CRITICALITY SAFETY AID TO OPERATORS

    SciTech Connect

    Claybourn, R V; Huang, S T

    2007-03-30

In today's compliance-driven environment, fissionable material handlers are inundated with work control rules and procedures in carrying out nuclear operations. Historically, human error has been a key contributor to criticality accidents. Since moving and handling fissionable materials are key components of their job functions, any means provided to assist operators in facilitating fissionable material moves will help improve operational efficiency and enhance criticality safety implementation. From the criticality safety perspective, operational issues have been encountered in Lawrence Livermore National Laboratory (LLNL) plutonium operations. Those issues included a lack of adequate historical record keeping for the fissionable material stored in containers, a need for a better way of accommodating operations in a research and development setting, and a need for better means of helping material handlers carry out various criticality safety controls. Through the years, effective measures were implemented, including a better work control process, standardized criticality control conditions (SCCC), and relocation of criticality safety engineers to the plutonium facility. Another important measure was the development of a computer data acquisition system for criticality safety assessment, which is the subject of this paper. The purpose of the Criticality Special Support System (CSSS) is to integrate many of the proven operational support protocols into a software system to assist operators with assessing compliance to procedures during the handling and movement of fissionable materials. Many nuclear facilities utilize mass cards or a computer program to track fissionable material mass data in operations. Additional item-specific data, such as the presence of moderators or close-fitting reflectors, could be helpful to fissionable material handlers in assessing compliance to SCCCs. Computer-assist checking of a workstation material inventory against the

  2. Criticality safety assessment of tank 241-C-106 remediation

    SciTech Connect

    Waltar, A.E., Westinghouse Hanford

    1996-07-19

A criticality safety assessment was performed in support of Project 320 for the retrieval of waste from tank 241-C-106 to tank 241-AY-102. The assessment was performed by a multi-disciplined team with expertise spanning nuclear engineering, plutonium and nuclear waste chemistry, and physical mixing hydraulics. Technical analysis was performed to evaluate the physical and chemical behavior of fissile material in neutralized Hanford waste, as well as modeling of the fluid dynamics for the retrieval activity. The team found no evidence of any credible mechanism for attaining neutronic criticality in either tank and concluded that a criticality accident is not credible.

  3. Assessing Moderator Variables: Two Computer Simulation Studies.

    ERIC Educational Resources Information Center

    Mason, Craig A.; And Others

    1996-01-01

    A strategy is proposed for conceptualizing moderating relationships based on their type (strictly correlational and classically correlational) and form, whether continuous, noncontinuous, logistic, or quantum. Results of computer simulations comparing three statistical approaches for assessing moderator variables are presented, and advantages of…
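The classical statistical test that such simulations compare variants of is moderated multiple regression, in which a moderator appears as a nonzero interaction coefficient. A minimal sketch with synthetic data (not the study's simulation design; `moderated_regression` is an illustrative helper, not from the article):

```python
import numpy as np

def moderated_regression(x, z, y):
    """Fit y = b0 + b1*x + b2*z + b3*(x*z) by ordinary least squares.

    A nonzero interaction coefficient b3 indicates that z moderates
    the relationship between x and y."""
    X = np.column_stack([np.ones_like(x), x, z, x * z])
    coeffs, *_ = np.linalg.lstsq(X, y, rcond=None)
    return coeffs  # [b0, b1, b2, b3]

# Synthetic data in which z strengthens the effect of x on y.
rng = np.random.default_rng(0)
x = rng.normal(size=500)
z = rng.normal(size=500)
y = 1.0 + 2.0 * x + 0.5 * z + 1.5 * x * z + rng.normal(scale=0.1, size=500)

b0, b1, b2, b3 = moderated_regression(x, z, y)
print(b3)  # close to the true interaction effect of 1.5
```

With low noise and 500 observations, the recovered interaction coefficient is a tight estimate of the true value.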

  4. Assessment of (Computer-Supported) Collaborative Learning

    ERIC Educational Resources Information Center

    Strijbos, J. -W.

    2011-01-01

    Within the (Computer-Supported) Collaborative Learning (CS)CL research community, there has been an extensive dialogue on theories and perspectives on learning from collaboration, approaches to scaffold (script) the collaborative process, and most recently research methodology. In contrast, the issue of assessment of collaborative learning has…

  5. Mobile sources critical review: 1998 NARSTO assessment

    NASA Astrophysics Data System (ADS)

    Sawyer, R. F.; Harley, R. A.; Cadle, S. H.; Norbeck, J. M.; Slott, R.; Bravo, H. A.

Mobile sources of air pollutants encompass a range of vehicle, engine, and fuel combinations. They emit both of the photochemical ozone precursors, hydrocarbons and oxides of nitrogen. The most important sources of hydrocarbons and oxides of nitrogen are light- and heavy-duty on-road vehicles and heavy-duty off-road vehicles, utilizing spark and compression ignition engines burning gasoline and diesel, respectively. Fuel consumption data provide a convenient starting point for assessing current and future emissions. Modern light-duty gasoline vehicles, when new, have very low emissions. The in-use fleet, due largely to emissions from a small "high emitter" fraction, has significantly larger emissions. Hydrocarbon and carbon monoxide emissions are higher than reported in current inventories. Other gasoline-powered mobile sources (motorcycles, recreational vehicles, lawn, garden, and utility equipment, and light aircraft) have high emissions per quantity of fuel consumed, but their contribution to total emissions is small. Additional uncertainties in the spatial and temporal distribution of emissions exist. Heavy-duty diesel vehicles are becoming the dominant mobile source of oxides of nitrogen. Oxides of nitrogen emissions may be greater than reported in current inventories, but the evidence for this is mixed. Oxides of nitrogen emissions on a fuel-consumed basis are much greater from diesel mobile sources than from gasoline mobile sources. This is largely the result of stringent control of gasoline vehicle emissions and lesser (heavy-duty trucks) or no control (construction equipment, locomotives, ships) of heavy-duty mobile sources. The use of alternative fuels (natural gas, propane, alcohols, and oxygenates) in motor vehicles is increasing but remains small. Vehicles utilizing these fuels can be, but are not necessarily, cleaner than their gasoline or diesel counterparts. Historical vehicle kilometers traveled growth rates of about 2% annually in both the United States

  6. Criticism and Assessment Applied to New Media Art

    ERIC Educational Resources Information Center

    Ursyn, Anna

    2015-01-01

This text examines educational criticism and assessment with an emphasis on the new media arts. The article shares with readers a versatile set of criteria, abridged to four points, based on research on assessment conducted with students, faculty, and non-art-related professionals, thus providing a preliminary tool for use in the classroom environment.…

  7. Establishing the Critical Elements That Determine Authentic Assessment

    ERIC Educational Resources Information Center

    Ashford-Rowe, Kevin; Herrington, Janice; Brown, Christine

    2014-01-01

    This study sought to determine the critical elements of an authentic learning activity, design them into an applicable framework and then use this framework to guide the design, development and application of work-relevant assessment. Its purpose was to formulate an effective model of task design and assessment. The first phase of the study…

  8. Guidelines for a Scientific Approach to Critical Thinking Assessment

    ERIC Educational Resources Information Center

    Bensley, D. Alan; Murtagh, Michael P.

    2012-01-01

    Assessment of student learning outcomes can be a powerful tool for improvement of instruction when a scientific approach is taken; unfortunately, many educators do not take full advantage of this approach. This article examines benefits of taking a scientific approach to critical thinking assessment and proposes guidelines for planning,…

  9. Critical Assessment of Correction Methods for Fisheye Lens Distortion

    NASA Astrophysics Data System (ADS)

    Liu, Y.; Tian, C.; Huang, Y.

    2016-06-01

A fisheye lens is widely used to create a wide panoramic or hemispherical image. It is an ultra wide-angle lens that produces strong visual distortion. Modeling and estimating the distortion of a fisheye lens are the crucial steps for fisheye lens calibration and image rectification in computer vision and close-range photography. There are two kinds of distortion: radial and tangential. Radial distortion is large for fisheye imaging and critical for subsequent image processing. Although many researchers have developed calibration algorithms for the radial distortion of fisheye lenses, quantitative evaluation of correction performance has remained a challenge. This is the first paper that intuitively and objectively evaluates the performance of five different calibration algorithms. Up-to-date research on fisheye lens calibration is comprehensively reviewed to identify the research need. To differentiate their performance in terms of precision and ease of use, five methods are then tested using a diverse set of actual images of a checkerboard, taken at Wuhan University, China under varying lighting conditions, shadows, and shooting angles. The rational function model, which was generally used for wide-angle lens correction, outperforms the other methods. However, the one-parameter division model is easy for practical use without compromising too much precision, because it depends on the linear structure in the image and requires no preceding calibration; it is a tradeoff between correction precision and ease of use. By critically assessing the strengths and limitations of the existing algorithms, the paper provides valuable insight and guidelines for future practice and algorithm development that are important for fisheye lens calibration. It is promising for the optimal design of lens correction models suitable for the millions of portable imaging devices.
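Of the models compared, the one-parameter division model has a particularly compact form. A minimal sketch, assuming the common formulation r_u = r_d / (1 + λ·r_d²) with a hypothetical distortion centre and parameter (illustrative values, not calibrated from the paper's data):

```python
import numpy as np

def undistort_division(points, centre, lam):
    """One-parameter division model: a point at distorted radius r_d
    maps to undistorted radius r_u = r_d / (1 + lam * r_d**2).

    points : (N, 2) pixel coordinates
    centre : (2,) distortion centre
    lam    : distortion parameter (negative for barrel distortion)
    """
    p = np.asarray(points, float) - centre
    r2 = np.sum(p * p, axis=1, keepdims=True)
    return centre + p / (1.0 + lam * r2)

centre = np.array([320.0, 240.0])
pt = np.array([[420.0, 240.0]])  # 100 px right of the centre
out = undistort_division(pt, centre, lam=-1e-5)
print(out)  # negative lam pushes the point outward (barrel correction)
```

The model's appeal for practical use is visible here: a single scalar parameter controls the entire radial mapping.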

  10. Research on computer aided testing of pilot response to critical in-flight events

    NASA Technical Reports Server (NTRS)

    Giffin, W. C.; Rockwell, T. H.; Smith, P. J.

    1984-01-01

Experiments on pilot decision making are described. The development of models of pilot decision making in critical in-flight events (CIFE) is emphasized. Results are reported on the development of: (1) a frame-system representation describing how pilots use their knowledge in a fault diagnosis task; (2) assessment of script norms, distance measures, and Markov models developed from computer-aided testing (CAT) data; and (3) performance ranking of subject data. It is demonstrated that interactive computer-aided testing, either by touch CRTs or personal computers, is a useful research and training device for measuring pilot information management in diagnosing system failures in simulated flight situations. Performance is dictated by knowledge of aircraft subsystems, initial pilot structuring of the failure symptoms, and efficient testing of plausible causal hypotheses.
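The Markov models mentioned are estimated from observed sequences of pilot actions. Estimating a transition matrix from such a sequence can be sketched as follows (the state coding is hypothetical, not the study's actual data):

```python
import numpy as np

def transition_matrix(sequence, n_states):
    """Maximum-likelihood Markov transition matrix from a state sequence.

    sequence : list of integer state codes in [0, n_states),
               e.g., categories of information-seeking actions
    """
    counts = np.zeros((n_states, n_states))
    for a, b in zip(sequence, sequence[1:]):
        counts[a, b] += 1.0
    rows = counts.sum(axis=1, keepdims=True)
    # Normalize each row to probabilities; rows never visited stay zero.
    return np.divide(counts, rows, out=np.zeros_like(counts), where=rows > 0)

# 0 = check instruments, 1 = consult checklist, 2 = test hypothesis
seq = [0, 1, 0, 1, 2, 0, 1, 0, 2, 0]
P = transition_matrix(seq, 3)
print(P[0])  # from state 0 the most likely next action is state 1
```

Distance measures between subjects can then be defined on these estimated matrices.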

  11. Computer assisted blast design and assessment tools

    SciTech Connect

    Cameron, A.R.; Kleine, T.H.; Forsyth, W.W.

    1995-12-31

    In general the software required by a blast designer includes tools that graphically present blast designs (surface and underground), can analyze a design or predict its result, and can assess blasting results. As computers develop and computer literacy continues to rise the development of and use of such tools will spread. An example of the tools that are becoming available includes: Automatic blast pattern generation and underground ring design; blast design evaluation in terms of explosive distribution and detonation simulation; fragmentation prediction; blast vibration prediction and minimization; blast monitoring for assessment of dynamic performance; vibration measurement, display and signal processing; evaluation of blast results in terms of fragmentation; and risk and reliability based blast assessment. The authors have identified a set of criteria that are essential in choosing appropriate software blasting tools.

  12. Critical thinking traits of top-tier experts and implications for computer science education

    NASA Astrophysics Data System (ADS)

    Bushey, Dean E.

    of this study suggest a need to examine how critical-thinking abilities are learned in the undergraduate computer science curriculum and the need to foster these abilities in order to produce the high-level, critical-thinking professionals necessary to fill the growing need for these experts. Due to the fact that current measures of academic performance do not adequately depict students' cognitive abilities, assessment of these skills must be incorporated into existing curricula.

  13. APPLICATION OF FETAX IN ECOLOGICAL RISK ASSESSMENTS: A CRITICAL ASSESSMENT

    EPA Science Inventory

    A workshop sponsored by NIEHS in 2000 evaluated the use of FETAX as a screening method for identifying the developmental toxicity potenial of chemical and environmental samples. Workshop recommendations pertinent to environmental risk assessment suggested that additional comparat...

  14. Computed Tomography: Image and Dose Assessment

    NASA Astrophysics Data System (ADS)

    Valencia-Ortega, F.; Ruiz-Trejo, C.; Rodríguez-Villafuerte, M.; Buenfil, A. E.; Mora-Hernández, L. A.

    2006-09-01

In this work, an experimental evaluation of image quality and dose imparted during a computed tomography study in a public hospital in Mexico City is presented. The measurements required the design and construction of two phantoms at the Institute of Physics, UNAM, according to the recommendations of the American Association of Physicists in Medicine (AAPM). Image assessment was performed in terms of spatial resolution and image contrast. Dose measurements were carried out using LiF:Mg,Ti (TLD-100) dosimeters and a pencil-shaped ionisation chamber. The results for a computed tomography head study in single- and multiple-detector modes are presented.
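The abstract does not reproduce its dose formulas; the standard weighted and volume CT dose indices used with pencil-chamber phantom measurements can be sketched as follows (the numeric values are illustrative, not the hospital's measured doses):

```python
def ctdi_w(ctdi_centre, ctdi_periphery):
    """Weighted CT dose index (mGy): one third of the central
    measurement plus two thirds of the mean peripheral measurement
    made with a pencil ionisation chamber in a CTDI phantom."""
    return ctdi_centre / 3.0 + 2.0 * ctdi_periphery / 3.0

def ctdi_vol(ctdi_w_value, pitch):
    """Volume CTDI: the weighted index divided by helical pitch."""
    return ctdi_w_value / pitch

print(ctdi_w(30.0, 36.0))         # 34.0 mGy
print(ctdi_vol(34.0, pitch=1.0))  # 34.0 mGy at unit pitch
```

Peripheral doses typically exceed the central dose in head phantoms, which the 1/3-2/3 weighting accounts for.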

  15. Assessment of pain and agitation in critically ill infants.

    PubMed

    Ramelet, A S

    1999-09-01

    Critically ill infants are subjected to many painful experiences that, if inadequately treated, can have severe physiological and psychological consequences. Optimal management of pain relies on the adequacy of nurses' assessment; this, however, is complicated by another common condition, agitation. A multidimensional assessment is therefore necessary to adequately identify pain and agitation. The aim of this descriptive study was to identify the cues that nurses caring for critically ill infants use to assess pain and agitation. A questionnaire, developed from the literature, was distributed to all registered nurses (85) working in the neonatal and paediatric intensive care units of an Australian teaching hospital. Questionnaires were completed by 41 nurses (a 57 per cent response rate). Results revealed that, except for diagnosis, there were no significant differences between the cues participants used to assess pain and those to assess agitation. Nurses used numerous cues from various sources: most importantly, their own judgement (99 per cent); the parents' judgement (90 per cent); the infant's environment; documentation (78 per cent), and the infant's cues (70 per cent). These findings demonstrate the relevance of the nurse's role in assessment of pain and agitation in critically ill infants. Nurses used cues specific to the critically ill rather than the less sick infant. Results of this study also show the difficulty of differentiating between pain and agitation. Further research on ways of distinguishing between the construct of pain and agitation needs to be undertaken.

  16. Accessible high performance computing solutions for near real-time image processing for time critical applications

    NASA Astrophysics Data System (ADS)

    Bielski, Conrad; Lemoine, Guido; Syryczynski, Jacek

    2009-09-01

High Performance Computing (HPC) hardware solutions such as grid computing and General Processing on a Graphics Processing Unit (GPGPU) are now accessible to users with general computing needs. Grid computing infrastructures in the form of computing clusters or blades are becoming commonplace, and GPGPU solutions that leverage the processing power of the video card are quickly being integrated into personal workstations. Our interest in these HPC technologies stems from the need to produce near real-time maps from a combination of pre- and post-event satellite imagery in support of post-disaster management. Faster processing provides a twofold gain in this situation: (1) critical information can be provided faster, and (2) more elaborate automated processing can be performed prior to providing the critical information. In our particular case, we test the use of the PANTEX index, which is based on analysis of image textural measures extracted using anisotropic, rotation-invariant GLCM statistics. The use of this index, applied in a moving window, has been shown to successfully identify built-up areas in remotely sensed imagery. Built-up index image masks are important input to the structuring of damage assessment interpretation because they help optimise the workload. The performance of computing the PANTEX workflow is compared on two different HPC hardware architectures: (1) a blade server with 4 blades, each having dual quad-core CPUs, and (2) a CUDA-enabled GPU workstation. The reference platform is a dual-CPU quad-core workstation, and the PANTEX workflow total computing time is measured. Furthermore, as part of a qualitative evaluation, the differences in setting up and configuring the various hardware solutions and the related software coding effort are presented.
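The GLCM statistics underlying PANTEX can be illustrated with a single-offset contrast measure; the actual index aggregates several offsets anisotropically and rotation-invariantly, so this is only a simplified sketch:

```python
import numpy as np

def glcm_contrast(window, levels=8, offset=(0, 1)):
    """Contrast of the grey-level co-occurrence matrix for one offset.

    window : 2-D array of integer grey levels in [0, levels)
    offset : (dr, dc) pixel displacement between co-occurring pixels
    """
    dr, dc = offset
    # Pair each pixel with its neighbour at the given displacement.
    a = window[max(0, -dr):window.shape[0] - max(0, dr),
               max(0, -dc):window.shape[1] - max(0, dc)]
    b = window[max(0, dr):, max(0, dc):]
    glcm = np.zeros((levels, levels))
    np.add.at(glcm, (a.ravel(), b.ravel()), 1.0)
    glcm /= glcm.sum()
    i, j = np.indices(glcm.shape)
    return float(np.sum(glcm * (i - j) ** 2))

flat = np.zeros((8, 8), dtype=int)           # homogeneous patch
edges = np.indices((8, 8)).sum(0) % 2 * 7    # checkerboard: strong texture
print(glcm_contrast(flat), glcm_contrast(edges))  # 0.0 49.0
```

Applied in a moving window, high contrast flags textured (built-up) areas while homogeneous terrain scores near zero, which is the behaviour the built-up masks exploit.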

  17. An Exploration of Three-Dimensional Integrated Assessment for Computational Thinking

    ERIC Educational Resources Information Center

    Zhong, Baichang; Wang, Qiyun; Chen, Jie; Li, Yi

    2016-01-01

    Computational thinking (CT) is a fundamental skill for students, and assessment is a critical factor in education. However, there is a lack of effective approaches to CT assessment. Therefore, we designed the Three-Dimensional Integrated Assessment (TDIA) framework in this article. The TDIA has two aims: one was to integrate three dimensions…

  18. Effects of Computer-Aided Personalized System of Instruction in Developing Knowledge and Critical Thinking in Blended Learning Courses

    ERIC Educational Resources Information Center

    Svenningsen, Louis; Pear, Joseph J.

    2011-01-01

    Two experiments were conducted to assess an online version of Keller's personalized system of instruction, called computer-aided personalized system of instruction (CAPSI), as part of a blended learning design with regard to course knowledge and critical thinking development. In Experiment 1, two lecture sections of an introduction to University…

  19. Critical issues using brain-computer interfaces for augmentative and alternative communication.

    PubMed

    Hill, Katya; Kovacs, Thomas; Shin, Sangeun

    2015-03-01

    Brain-computer interfaces (BCIs) may potentially be of significant practical value to patients in advanced stages of amyotrophic lateral sclerosis and locked-in syndrome for whom conventional augmentative and alternative communication (AAC) systems, which require some measure of consistent voluntary muscle control, are not satisfactory options. However, BCIs have primarily been used for communication in laboratory research settings. This article discusses 4 critical issues that should be addressed as BCIs are translated out of laboratory settings to become fully functional BCI/AAC systems that may be implemented clinically. These issues include (1) identification of primary, secondary, and tertiary system features; (2) integrating BCI/AAC systems in the World Health Organization's International Classification of Functioning, Disability and Health framework; (3) implementing language-based assessment and intervention; and (4) performance measurement. A clinical demonstration project is presented as an example of research beginning to address these critical issues.

  20. Conceptualising, Developing and Assessing Critical Thinking in Law

    ERIC Educational Resources Information Center

    James, Nickolas; Hughes, Clair; Cappa, Clare

    2010-01-01

    "Critical thinking" is commonly included in the lists of graduate attributes (GAs), which all Australian universities are now required to develop and implement. That efforts to do so have met with limited success is due to a range of factors including inconsistent or naive conceptualisations, the failure to explicitly develop or assess GAs, and…

  1. Antiracist Education in Theory and Practice: A Critical Assessment

    ERIC Educational Resources Information Center

    Niemonen, Jack

    2007-01-01

    "Antiracist Education in Theory and Practice: A Critical Assessment" As a set of pedagogical, curricular, and organizational strategies, antiracist education claims to be the most progressive way today to understand race relations. Constructed from whiteness studies and the critique of colorblindness, its foundational core is located in…

  2. Implementation and Critical Assessment of the Flipped Classroom Experience

    ERIC Educational Resources Information Center

    Scheg, Abigail G., Ed.

    2015-01-01

    In the past decade, traditional classroom teaching models have been transformed in order to better promote active learning and learner engagement. "Implementation and Critical Assessment of the Flipped Classroom Experience" seeks to capture the momentum of non-traditional teaching methods and provide a necessary resource for individuals…

  3. What Is a Good School? Critical Thoughts about Curriculum Assessments

    ERIC Educational Resources Information Center

    Zierer, Klaus

    2013-01-01

    Within the educational field, measurements such as the Programme for International Student Assessment (PISA), the Trends in International Mathematics and Science Study (TIMSS), and the Progress in International Reading Literacy Study (PIRLS) suggest we are living in a time of competition. This article takes a critical view of the modern drive to…

  4. Assessment of computational prediction of tail buffeting

    NASA Technical Reports Server (NTRS)

    Edwards, John W.

    1990-01-01

    Assessments of the viability of computational methods and the computer resource requirements for the prediction of tail buffeting are made. Issues involved in the use of Euler and Navier-Stokes equations in modeling vortex-dominated and buffet flows are discussed and the requirement for sufficient grid density to allow accurate, converged calculations is stressed. Areas in need of basic fluid dynamics research are highlighted: vorticity convection, vortex breakdown, dynamic turbulence modeling for free shear layers, unsteady flow separation for moderately swept, rounded leading-edge wings, vortex flows about wings at high subsonic speeds. An estimate of the computer run time for a buffeting response calculation for a full span F-15 aircraft indicates that an improvement in computer and/or algorithm efficiency of three orders of magnitude is needed to enable routine use of such methods. Attention is also drawn to significant uncertainties in the estimates, in particular with regard to nonlinearities contained within the modeling and the question of the repeatability or randomness of buffeting response.

  5. The Role of Computer Assisted Fluid Balance in Critical Care

    PubMed Central

    Ciccolella, Sergio A.; Halloran, Mark J.; Brimm, John E.; O'Hara, Michael R.

    1978-01-01

Computational, reporting, and database management needs, along with growth in sophistication, have propelled the application of computers in medicine. These elements satisfy specific clinical needs in the fluid balance program design that was undertaken. Significant potential exists for extending the computer's intervention by using available transducing techniques to obtain information that is currently manually derived. Thus, the design currently satisfies the goal of maximizing information while minimizing labor-intensive overhead, and will continue to evolve in that direction.
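The core computation such a program automates reduces to a signed running total of intake and output volumes. A minimal sketch, assuming a hypothetical record layout (not the paper's actual design):

```python
from datetime import datetime

def fluid_balance(events):
    """Net fluid balance in mL: intake counts positive, output negative.

    events : list of (timestamp, kind, volume_ml) tuples, with kind in
             {"intake", "output"} -- an illustrative record layout.
    """
    balance = 0.0
    for _ts, kind, vol in sorted(events):  # chronological order
        balance += vol if kind == "intake" else -vol
    return balance

shift = [
    (datetime(2024, 1, 1, 8), "intake", 500.0),   # IV fluids
    (datetime(2024, 1, 1, 10), "output", 350.0),  # urine
    (datetime(2024, 1, 1, 12), "intake", 250.0),  # oral intake
]
print(fluid_balance(shift))  # 400.0
```

Transduced inputs (e.g., pump and urimeter readings) would simply append more events to the same record stream.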

  6. Computational methods for criticality safety analysis within the scale system

    SciTech Connect

    Parks, C.V.; Petrie, L.M.; Landers, N.F.; Bucholz, J.A.

    1986-01-01

    The criticality safety analysis capabilities within the SCALE system are centered around the Monte Carlo codes KENO IV and KENO V.a, which are both included in SCALE as functional modules. The XSDRNPM-S module is also an important tool within SCALE for obtaining multiplication factors for one-dimensional system models. This paper reviews the features and modeling capabilities of these codes along with their implementation within the Criticality Safety Analysis Sequences (CSAS) of SCALE. The CSAS modules provide automated cross-section processing and user-friendly input that allow criticality safety analyses to be done in an efficient and accurate manner. 14 refs., 2 figs., 3 tabs.
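The multiplication factors these codes compute generalize the textbook one-group, infinite-medium result k_inf = νΣ_f / Σ_a; KENO's Monte Carlo and XSDRNPM's one-dimensional discrete-ordinates treatments are far more detailed, so this is only the limiting formula:

```python
def k_infinity(nu_sigma_f, sigma_a):
    """One-group infinite-medium multiplication factor.

    nu_sigma_f : nu * macroscopic fission cross section (1/cm)
    sigma_a    : macroscopic absorption cross section (1/cm)
    k_inf > 1 means the infinite system is supercritical.
    """
    return nu_sigma_f / sigma_a

# Illustrative one-group constants (not evaluated nuclear data).
print(k_infinity(0.0105, 0.0100))  # supercritical in an infinite medium
```

Finite-geometry k_eff is lower than k_inf because of neutron leakage, which is where the detailed geometry modeling of the CSAS sequences comes in.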

  7. VOXMAT: Hybrid Computational Phantom for Dose Assessment

    SciTech Connect

    Akkurt, Hatice; Eckerman, Keith F

    2007-01-01

The Oak Ridge National Laboratory (ORNL) computational phantoms have been the standard for assessing the radiation dose due to internal and external exposure over the past three decades. In these phantoms, the body surface and each organ are approximated by mathematical equations; hence, some of the organs are not necessarily realistic in their shape. Over the past two decades, these phantoms have been revised and updated: some of the missing internal organs have been added and the locations of the existing organs have been revised (e.g., thyroid). In the original phantom, only three elemental compositions were used to describe all body tissues. Recently, the compositions of the organs have been updated based on ICRP-89 standards. During the past decade, phantoms based on CT scans were developed for use in dose assessment. Although their shapes are realistic, some computational challenges are noted, including increased computational times and increased memory requirements. For good spatial resolution, more than several million voxels are used to represent the human body. Moreover, when CT scans are obtained, the subject is in a supine position with arms at the side. In some occupational exposure cases, it is necessary to evaluate the dose with the arms and legs in different positions. It will be very difficult and inefficient to reposition the voxels defining the arms and legs to simulate these exposure geometries. In this paper, a new approach for computational phantom development is presented. This approach utilizes the combination of a mathematical phantom and a voxelized phantom for the representation of the anatomy.

  8. Assessing Terrorist Motivations for Attacking Critical "Chemical" Infrastructure

    SciTech Connect

    Ackerman, G; Bale, J; Moran, K

    2004-12-14

Certain types of infrastructure--critical infrastructure (CI)--play vital roles in underpinning our economy, security, and way of life. One particular type of CI--that relating to chemicals--constitutes both an important element of our nation's infrastructure and a particularly attractive set of potential targets. This is primarily because of the large quantities of toxic industrial chemicals (TICs) it employs in various operations and because of the essential economic functions it serves. This study attempts to minimize some of the ambiguities that presently impede chemical infrastructure threat assessments by providing new insight into the key motivational factors that affect terrorist organizations' propensity to attack chemical facilities. Prepared as a companion piece to the Center for Nonproliferation Studies August 2004 study, "Assessing Terrorist Motivations for Attacking Critical Infrastructure", it investigates three overarching research questions: (1) why do terrorists choose to attack chemical-related infrastructure over other targets; (2) what specific factors influence their target selection decisions concerning chemical facilities; and (3) which, if any, types of groups are most inclined to attack chemical infrastructure targets? The study involved a multi-pronged research design, which made use of four discrete investigative techniques to answer the above questions as comprehensively as possible. These include: (1) a review of terrorism and threat assessment literature to glean expert consensus regarding terrorist interest in targeting chemical facilities; (2) the preparation of case studies to help identify internal group factors and contextual influences that have played a significant role in leading some terrorist groups to attack chemical facilities; (3) an examination of data from the Critical Infrastructure Terrorist Incident Catalog (CrITIC) to further illuminate the nature of terrorist attacks against chemical facilities to date; and (4

  9. Antibiotic prophylaxis and reflux: critical review and assessment

    PubMed Central

    Baquerizo, Bernarda Viteri

    2014-01-01

    The use of continuous antibiotic prophylaxis (CAP) was critical in the evolution of vesicoureteral reflux (VUR) from a condition in which surgery was the standard of treatment to its becoming a medically managed condition. The efficacy of antibiotic prophylaxis in the management of VUR has been challenged in recent years, and significant confusion exists as to its clinical value. This review summarizes the critical factors in the history, use, and investigation of antibiotic prophylaxis in VUR. This review provides suggestions for assessing the potential clinical utility of prophylaxis. PMID:25580258

  10. Factors confounding the assessment of reflection: a critical review

    PubMed Central

    2011-01-01

Background: Reflection on experience is an increasingly critical part of professional development and lifelong learning. There is, however, continuing uncertainty about how best to put principle into practice, particularly as regards assessment. This article explores those uncertainties in order to find practical ways of assessing reflection. Discussion: We critically review four problems: 1. Inconsistent definitions of reflection; 2. Lack of standards to determine (in)adequate reflection; 3. Factors that complicate assessment; 4. Internal and external contextual factors affecting the assessment of reflection. Summary: To address the problem of inconsistency, we identified processes that were common to a number of widely quoted theories and synthesised a model, which yielded six indicators that could be used in assessment instruments. We arrived at the conclusion that, until further progress has been made in defining standards, assessment must depend on developing and communicating local consensus between stakeholders (students, practitioners, teachers, supervisors, curriculum developers) about what is expected in exercises and formal tests. Major factors that complicate assessment are the subjective nature of reflection's content and the dependency on descriptions by persons being assessed about their reflection process, without any objective means of verification. To counter these validity threats, we suggest that assessment should focus on generic process skills rather than the subjective content of reflection and, where possible, consider objective information about the triggering situation to verify described reflections. Finally, internal and external contextual factors such as motivation, instruction, character of assessment (formative or summative) and the ability of individual learning environments to stimulate reflection should be considered. PMID:22204704

  11. The Effect of Three Computer Conferencing Designs on Critical Thinking Skills of Nursing Students

    ERIC Educational Resources Information Center

    Duphorne, Patsy L.; Gunawardena, Charlotte N.

    2005-01-01

A study of university nursing students tested the effect of computer conference designs and advance organizers on critical thinking skills. Critical thinking, although not significantly different across the three conference groups, was evident in all three conference designs. Those conferences designed to facilitate critical inquiry showed…

  12. Computational Tools to Assess Turbine Biological Performance

    SciTech Connect

    Richmond, Marshall C.; Serkowski, John A.; Rakowski, Cynthia L.; Strickler, Brad; Weisbeck, Molly; Dotson, Curtis L.

    2014-07-24

    Public Utility District No. 2 of Grant County (GCPUD) operates the Priest Rapids Dam (PRD), a hydroelectric facility on the Columbia River in Washington State. The dam contains 10 Kaplan-type turbine units that are now more than 50 years old. Plans are underway to refit these aging turbines with new runners. The Columbia River at PRD is a migratory pathway for several species of juvenile and adult salmonids, so passage of fish through the dam is a major consideration when upgrading the turbines. In this paper, a method for turbine biological performance assessment (BioPA) is demonstrated. Using this method, a suite of biological performance indicators is computed based on simulated data from a CFD model of a proposed turbine design. Each performance indicator is a measure of the probability of exposure to a certain dose of an injury mechanism. Using known relationships between the dose of an injury mechanism and frequency of injury (dose–response) from laboratory or field studies, the likelihood of fish injury for a turbine design can be computed from the performance indicator. By comparing the values of the indicators from proposed designs, the engineer can identify the more-promising alternatives. We present an application of the BioPA method for baseline risk assessment calculations for the existing Kaplan turbines at PRD that will be used as the minimum biological performance that a proposed new design must achieve.
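The dose-to-injury computation the abstract describes can be sketched in a few lines; the exponential dose-response curve and the sampled dose values below are illustrative assumptions, not data from the BioPA study.

```python
import math

def injury_probability(doses, dose_response):
    """Estimate the likelihood of fish injury for a turbine design by
    averaging a dose-response curve over simulated exposure doses."""
    return sum(dose_response(d) for d in doses) / len(doses)

# Hypothetical dose-response relationship from laboratory studies:
# injury frequency rises with the dose of an injury mechanism
# (e.g., shear stress). The 500.0 scale factor is invented.
def example_dose_response(dose):
    return 1.0 - math.exp(-dose / 500.0)

# Doses sampled along simulated particle paths through a CFD model
# of the turbine (illustrative numbers only).
simulated_doses = [120.0, 310.0, 45.0, 600.0, 80.0]
risk = injury_probability(simulated_doses, example_dose_response)
```

Comparing `risk` across proposed runner designs is what lets the engineer identify the more promising alternatives.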

  13. Critical thinking: assessing the risks to the future security of supply of critical metals

    NASA Astrophysics Data System (ADS)

    Gunn, Gus

    2015-04-01

    Increasing world population, the spread of prosperity across the globe and the demands of new technologies have led to a revival of concerns about the availability of raw materials needed by society. Despite scare stories about resource depletion, physical exhaustion of minerals is considered to be unlikely. However, we do need to know which materials might be of concern so that we can develop strategies to secure adequate supplies and to mitigate the effects of supply disruption. This requirement has led to renewed interest in criticality, a term that is generally used to refer to metals and minerals of high economic importance that have a relatively high likelihood of supply disruption. The European Union (EU) developed a quantitative methodology for the assessment of criticality which led to the definition of 14 raw materials as critical to the EU economy (EC, 2010). This has succeeded in raising awareness of potential supply issues and in helping to prioritise requirements for new policies and supporting research. The EU has recently assessed a larger number of candidate materials of which 20 are now identified as critical to the EU (EC, 2014). These include metals such as indium, mostly used in flat-screen displays, antimony for flame retardants and cobalt for rechargeable batteries, alloys and a host of other products. Although there is no consensus on the methodology for criticality assessments and broad analyses at this scale are inevitably imperfect, they can, nevertheless, provide early warning of supply problems. However, in order to develop more rigorous and dynamic assessments of future availability detailed analysis of the whole life-cycle of individual metals to identify specific problems and develop appropriate solutions is required. New policies, such as the Raw Materials Initiative (2008) and the European Innovation Partnership on Raw Materials (2013), have been developed by the European Commission (EC) and are aimed at securing sustainable

  14. Laptop Computer - Based Facial Recognition System Assessment

    SciTech Connect

    R. A. Cain; G. B. Singleton

    2001-03-01

The objective of this project was to assess the performance of the leading commercial-off-the-shelf (COTS) facial recognition software package when used as a laptop application. We performed the assessment to determine the system's usefulness for enrolling facial images in a database from remote locations and conducting real-time searches against a database of previously enrolled images. The assessment involved creating a database of 40 images and conducting 2 series of tests to determine the product's ability to recognize and match subject faces under varying conditions. This report describes the test results and includes a description of the factors affecting the results. After an extensive market survey and a review of the Facial Recognition Vendor Test 2000 (FRVT 2000), we selected Visionics' FaceIt® software package for evaluation. The FRVT 2000 was co-sponsored by the US Department of Defense (DOD) Counterdrug Technology Development Program Office, the National Institute of Justice, and the Defense Advanced Research Projects Agency (DARPA). Administered in May-June 2000, the FRVT 2000 assessed the capabilities of facial recognition systems that were then available for purchase on the US market. Our selection of this Visionics product does not indicate that it is the "best" facial recognition software package for all uses; it was simply the most appropriate package for the specific requirements of this application. In this assessment, the system configuration was evaluated for effectiveness in identifying individuals by searching for facial images captured from video displays against those stored in a facial image database. An additional criterion was that the system be capable of operating discretely. For this application, an operational facial recognition system would consist of one central computer hosting the master image database with multiple standalone systems configured with duplicates of the master operating in

  15. Man-Computer Symbiosis Through Interactive Graphics: A Survey and Identification of Critical Research Areas.

    ERIC Educational Resources Information Center

    Knoop, Patricia A.

    The purpose of this report was to determine the research areas that appear most critical to achieving man-computer symbiosis. An operational definition of man-computer symbiosis was developed by: (1) reviewing and summarizing what others have said about it, and (2) attempting to distinguish it from other types of man-computer relationships. From…

  16. Spatial risk assessment for critical network infrastructure using sensitivity analysis

    NASA Astrophysics Data System (ADS)

    Möderl, Michael; Rauch, Wolfgang

    2011-12-01

The presented spatial risk assessment method allows for managing critical network infrastructure in urban areas under abnormal and future conditions caused, e.g., by terrorist attacks, infrastructure deterioration or climate change. For the spatial risk assessment, vulnerability maps for critical network infrastructure are merged with hazard maps for an interfering process. Vulnerability maps are generated using a spatial sensitivity analysis of network transport models to evaluate performance decrease under the investigated threat scenarios. Thereby parameters are varied according to the specific impact of a particular threat scenario. Hazard maps are generated with a geographical information system using raster data of the same threat scenario derived from structured interviews and cluster analysis of past events. The application of the spatial risk assessment is exemplified by means of a case study for a water supply system, but the principal concept is likewise applicable to other critical network infrastructure. The aim of the approach is to help decision makers in choosing zones for preventive measures.
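The merge of vulnerability and hazard maps can be sketched as a cell-by-cell operation on raster grids; the multiplicative risk rule and the grid values here are illustrative assumptions, not the paper's exact formulation.

```python
def risk_map(vulnerability, hazard):
    """Combine a vulnerability raster (performance decrease under a
    threat scenario) with a hazard raster (likelihood of that scenario
    per cell) into a spatial risk map, cell by cell."""
    return [
        [v * h for v, h in zip(v_row, h_row)]
        for v_row, h_row in zip(vulnerability, hazard)
    ]

# Illustrative 2x3 rasters for a single threat scenario.
vulnerability = [[0.9, 0.2, 0.0],
                 [0.5, 0.7, 0.1]]
hazard = [[0.1, 0.8, 0.5],
          [0.4, 0.4, 0.9]]

risk = risk_map(vulnerability, hazard)

# Cells where high vulnerability and high hazard coincide rank first
# when choosing zones for preventive measures.
hotspot = max((r, (i, j))
              for i, row in enumerate(risk)
              for j, r in enumerate(row))
```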

  17. Critical Issues Forum: A multidisciplinary educational program integrating computer technology

    SciTech Connect

    Alexander, R.J.; Robertson, B.; Jacobs, D.

    1998-09-01

The Critical Issues Forum (CIF), funded by the US Department of Energy, is a collaborative effort between the Science Education Team of Los Alamos National Laboratory (LANL) and New Mexico high schools to improve science education throughout the state of New Mexico as well as nationally. By creating an educational relationship between LANL, with its unique scientific resources, and New Mexico high schools, students and teachers participate in programs that increase not only their science content knowledge but also their critical thinking and problem-solving skills. The CIF program focuses on current, globally oriented topics crucial to the security of not only the US but all nations. The CIF is an academic-year program that involves both teachers and students in the process of seeking solutions for real-world concerns. Built around issues tied to LANL's mission, participating students and teachers are asked to critically investigate and examine the interactions among the political, social, economic, and scientific domains while considering diversity issues that include geopolitical entities and cultural and ethnic groupings. Participants are expected to collaborate through telecommunications during the research phase and participate in a culminating multimedia activity, where they produce and deliver recommendations for the current issues being studied. The CIF was evaluated and found to be an effective approach for teacher professional training, especially in the development of skills for critical thinking and questioning. The CIF contributed to students' ability to integrate diverse disciplinary content about science-related topics and supported teachers in facilitating the understanding of their students using the CIF approach. Networking technology in CIF has been used as an information repository, resource delivery mechanism, and communication medium.

  18. Fool's Gold: A Critical Look at Computers in Childhood.

    ERIC Educational Resources Information Center

    Cordes, Colleen, Ed.; Miller, Edward, Ed.

    Noting that computers are reshaping children's lives in profound and unexpected ways, this report examines potential harms and promised benefits of these changes, focusing on early childhood and elementary education. Chapter 1 argues that popular attempts to hurry children intellectually are at odds with the natural pace of human development.…

  19. Exact computation of the critical exponents of the jamming transition

    NASA Astrophysics Data System (ADS)

    Zamponi, Francesco

    2015-03-01

    The jamming transition marks the emergence of rigidity in a system of amorphous and athermal grains. It is characterized by a divergent correlation length of the force-force correlation and non-trivial critical exponents that are independent of spatial dimension, suggesting that a mean field theory can correctly predict their values. I will discuss a mean field approach to the problem based on the exact solution of the hard sphere model in infinite dimension. An unexpected analogy with the Sherrington-Kirkpatrick spin glass model emerges in the solution: as in the SK model, the glassy states turn out to be marginally stable, and are described by a Parisi equation. Marginal stability has a deep impact on the critical properties of the jamming transition and allows one to obtain analytic predictions for the critical exponents. The predictions are consistent with a recently developed scaling theory of the jamming transition, and with numerical simulations. Finally, I will briefly discuss some possible extensions of this approach to other open issues in the theory of glasses.

  20. Quality assessment of clinical computed tomography

    NASA Astrophysics Data System (ADS)

    Berndt, Dorothea; Luckow, Marlen; Lambrecht, J. Thomas; Beckmann, Felix; Müller, Bert

    2008-08-01

Three-dimensional images are vital for diagnosis in dentistry and cranio-maxillofacial surgery. Artifacts caused by highly absorbing components such as metallic implants, however, limit the value of the tomograms. The dominant artifacts observed are blowout and streaks. By investigating the artifacts generated by metallic implants in a pig jaw, data acquisition for dental patients can be optimized in a quantitative manner. A freshly explanted pig jaw including related soft tissues served as a model system. Images were recorded varying the accelerating voltage and the beam current. The comparison with multi-slice and micro computed tomography (CT) helps to validate the approach with the dental CT system (3D-Accuitomo, Morita, Japan). The data are rigidly registered to comparatively quantify their quality. The micro CT data provide a reasonable standard for quantitative data assessment of clinical CT.

  1. Adaptive critic design for computer intrusion detection system

    NASA Astrophysics Data System (ADS)

    Novokhodko, Alexander; Wunsch, Donald C., II; Dagli, Cihan H.

    2001-03-01

This paper summarizes ongoing research. A neural network is used to detect a computer system intrusion based on data from the system audit trail generated by the Solaris Basic Security Module. The data have been provided by Lincoln Labs, MIT. The system alerts the human operator when it encounters suspicious activity logged in the audit trail. To reduce the false-alarm rate and accommodate the temporal uncertainty of the moment of attack, a reinforcement learning approach is chosen to train the network.

  2. Computer-Assisted Assessment: Suggested Guidelines for an Institutional Strategy.

    ERIC Educational Resources Information Center

    Stephens, Derek; Bull, Joanna; Wade, Winnie

    1998-01-01

    Reviews lessons learned from experience with computer-aided learning that can inform the use of computers in student assessment in college teaching, describes the experience of two institutions with computer-assisted student assessment, and makes recommendations for developing and implementing effective institution-wide systems. (Author/MSE)

  3. Literary and Electronic Hypertext: Borges, Criticism, Literary Research, and the Computer.

    ERIC Educational Resources Information Center

    Davison, Ned J.

    1991-01-01

    Examines what "hypertext" means to literary criticism on the one hand (i.e., intertextuality) and computing on the other, to determine how the two concepts may serve each other in a mutually productive way. (GLR)

  4. A CAD (Classroom Assessment Design) of a Computer Programming Course

    ERIC Educational Resources Information Center

    Hawi, Nazir S.

    2012-01-01

This paper presents a CAD (classroom assessment design) of an entry-level undergraduate computer programming course, "Computer Programming I". CAD is the product of long experience in teaching computer programming courses, including teaching "Computer Programming I" 22 times. Each semester, CAD is evaluated and modified for the subsequent…

  5. Clinical significance of computed tomography assessment for third molar surgery.

    PubMed

    Nakamori, Kenji; Tomihara, Kei; Noguchi, Makoto

    2014-07-28

    Surgical extraction of the third molar is the most commonly performed surgical procedure in the clinical practice of oral surgery. Third molar surgery is warranted when there is inadequate space for eruption, malpositioning, or risk for cyst or odontogenic tumor formation. Preoperative assessment should include a detailed morphologic analysis of the third molar and its relationship to adjacent structures and surrounding tissues. Due to developments in medical engineering technology, computed tomography (CT) now plays a critical role in providing the clear images required for adequate assessment prior to third molar surgery. Removal of the maxillary third molar is associated with a risk for maxillary sinus perforation, whereas removal of the mandibular third molar can put patients at risk for a neurosensory deficit from damage to the lingual nerve or inferior alveolar nerve. Multiple factors, including demographic, anatomic, and treatment-related factors, influence the incidence of nerve injury during or following removal of the third molar. CT assessment of the third molar prior to surgery can identify some of these risk factors, such as the absence of cortication between the mandibular third molar and the inferior alveolar canal, prior to surgery to reduce the risk for nerve damage. This topic highlight presents an overview of the clinical significance of CT assessment in third molar surgery. PMID:25071882

  6. Clinical significance of computed tomography assessment for third molar surgery

    PubMed Central

    Nakamori, Kenji; Tomihara, Kei; Noguchi, Makoto

    2014-01-01

    Surgical extraction of the third molar is the most commonly performed surgical procedure in the clinical practice of oral surgery. Third molar surgery is warranted when there is inadequate space for eruption, malpositioning, or risk for cyst or odontogenic tumor formation. Preoperative assessment should include a detailed morphologic analysis of the third molar and its relationship to adjacent structures and surrounding tissues. Due to developments in medical engineering technology, computed tomography (CT) now plays a critical role in providing the clear images required for adequate assessment prior to third molar surgery. Removal of the maxillary third molar is associated with a risk for maxillary sinus perforation, whereas removal of the mandibular third molar can put patients at risk for a neurosensory deficit from damage to the lingual nerve or inferior alveolar nerve. Multiple factors, including demographic, anatomic, and treatment-related factors, influence the incidence of nerve injury during or following removal of the third molar. CT assessment of the third molar prior to surgery can identify some of these risk factors, such as the absence of cortication between the mandibular third molar and the inferior alveolar canal, prior to surgery to reduce the risk for nerve damage. This topic highlight presents an overview of the clinical significance of CT assessment in third molar surgery. PMID:25071882

  7. 24 CFR 901.105 - Computing assessment score.

    Code of Federal Regulations, 2010 CFR

    2010-04-01

    ... 24 Housing and Urban Development 4 2010-04-01 2010-04-01 false Computing assessment score. 901.105 Section 901.105 Housing and Urban Development Regulations Relating to Housing and Urban Development... DEVELOPMENT PUBLIC HOUSING MANAGEMENT ASSESSMENT PROGRAM § 901.105 Computing assessment score. (a)...

  8. Computer Aided Assessment and Development of Basic Skills

    ERIC Educational Resources Information Center

    Macleod, Iain; Overheu, Don

    1977-01-01

    The advantages of applying computer techniques to assessment and development of basic skills in mildly intellectually handicapped children are discussed, and several applications of computer devices to instruction are described. (SBH)

  9. Planning the Unplanned Experiment: Assessing the Efficacy of Standards for Safety Critical Software

    NASA Technical Reports Server (NTRS)

    Graydon, Patrick J.; Holloway, C. Michael

    2015-01-01

We need well-founded means of determining whether software is fit for use in safety-critical applications. While software in industries such as aviation has an excellent safety record, the fact that software flaws have contributed to deaths illustrates the need for justifiably high confidence in software. It is often argued that software is fit for safety-critical use because it conforms to a standard for software in safety-critical systems. But little is known about whether such standards 'work.' Reliance upon a standard without knowing whether it works is an experiment; without collecting data to assess the standard, this experiment is unplanned. This paper reports on a workshop intended to explore how standards could practicably be assessed. Planning the Unplanned Experiment: Assessing the Efficacy of Standards for Safety Critical Software (AESSCS) was held on 13 May 2014 in conjunction with the European Dependable Computing Conference (EDCC). We summarize and elaborate on the workshop's discussion of the topic, including both the presented positions and the dialogue that ensued.

  10. A Critical Assessment of Vector Control for Dengue Prevention

    PubMed Central

    Achee, Nicole L.; Gould, Fred; Perkins, T. Alex; Reiner, Robert C.; Morrison, Amy C.; Ritchie, Scott A.; Gubler, Duane J.; Teyssou, Remy; Scott, Thomas W.

    2015-01-01

    Recently, the Vaccines to Vaccinate (v2V) initiative was reconfigured into the Partnership for Dengue Control (PDC), a multi-sponsored and independent initiative. This redirection is consistent with the growing consensus among the dengue-prevention community that no single intervention will be sufficient to control dengue disease. The PDC's expectation is that when an effective dengue virus (DENV) vaccine is commercially available, the public health community will continue to rely on vector control because the two strategies complement and enhance one another. Although the concept of integrated intervention for dengue prevention is gaining increasingly broader acceptance, to date, no consensus has been reached regarding the details of how and what combination of approaches can be most effectively implemented to manage disease. To fill that gap, the PDC proposed a three step process: (1) a critical assessment of current vector control tools and those under development, (2) outlining a research agenda for determining, in a definitive way, what existing tools work best, and (3) determining how to combine the best vector control options, which have systematically been defined in this process, with DENV vaccines. To address the first step, the PDC convened a meeting of international experts during November 2013 in Washington, DC, to critically assess existing vector control interventions and tools under development. This report summarizes those deliberations. PMID:25951103

  11. A critical assessment of vector control for dengue prevention.

    PubMed

    Achee, Nicole L; Gould, Fred; Perkins, T Alex; Reiner, Robert C; Morrison, Amy C; Ritchie, Scott A; Gubler, Duane J; Teyssou, Remy; Scott, Thomas W

    2015-05-01

    Recently, the Vaccines to Vaccinate (v2V) initiative was reconfigured into the Partnership for Dengue Control (PDC), a multi-sponsored and independent initiative. This redirection is consistent with the growing consensus among the dengue-prevention community that no single intervention will be sufficient to control dengue disease. The PDC's expectation is that when an effective dengue virus (DENV) vaccine is commercially available, the public health community will continue to rely on vector control because the two strategies complement and enhance one another. Although the concept of integrated intervention for dengue prevention is gaining increasingly broader acceptance, to date, no consensus has been reached regarding the details of how and what combination of approaches can be most effectively implemented to manage disease. To fill that gap, the PDC proposed a three step process: (1) a critical assessment of current vector control tools and those under development, (2) outlining a research agenda for determining, in a definitive way, what existing tools work best, and (3) determining how to combine the best vector control options, which have systematically been defined in this process, with DENV vaccines. To address the first step, the PDC convened a meeting of international experts during November 2013 in Washington, DC, to critically assess existing vector control interventions and tools under development. This report summarizes those deliberations.

  12. Computer Simulations to Support Science Instruction and Learning: A Critical Review of the Literature

    ERIC Educational Resources Information Center

    Smetana, Lara Kathleen; Bell, Randy L.

    2012-01-01

    Researchers have explored the effectiveness of computer simulations for supporting science teaching and learning during the past four decades. The purpose of this paper is to provide a comprehensive, critical review of the literature on the impact of computer simulations on science teaching and learning, with the goal of summarizing what is…

  13. Risk assessment for physical and cyber attacks on critical infrastructures.

    SciTech Connect

    Smith, Bryan J.; Sholander, Peter E.; Phelan, James M.; Wyss, Gregory Dane; Varnado, G. Bruce; Depoy, Jennifer Mae

    2005-08-01

Assessing the risk of malevolent attacks against large-scale critical infrastructures requires modifications to existing methodologies. Existing risk assessment methodologies consider physical security and cyber security separately. As such, they do not accurately model attacks that involve defeating both physical protection and cyber protection elements (e.g., hackers turning off alarm systems prior to forced entry). This paper presents a risk assessment methodology that accounts for both physical and cyber security. It also preserves the traditional security paradigm of detect, delay and respond, while accounting for the possibility that a facility may be able to recover from or mitigate the results of a successful attack before serious consequences occur. The methodology provides a means for ranking those assets most at risk from malevolent attacks. Because the methodology is automated, the analyst can also play 'what if' with mitigation measures to gain a better understanding of how to best expend resources towards securing the facilities. It is simple enough to be applied to large infrastructure facilities without developing highly complicated models. Finally, it is applicable to facilities with extensive security as well as those that are less well-protected.
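A minimal sketch of ranking assets by risk, assuming a simple product of attack-success likelihood, non-recovery probability, and consequence; the asset names and numbers are invented for illustration, and the actual methodology is considerably richer (it models detect, delay and respond explicitly).

```python
def asset_risk(p_attack_success, consequence, p_recovery=0.0):
    """Rank-score for one asset: likelihood that an attack defeats the
    combined physical/cyber protection, discounted by the chance the
    facility recovers or mitigates before serious consequences occur."""
    return p_attack_success * (1.0 - p_recovery) * consequence

# Hypothetical assets with combined physical/cyber attack paths.
assets = {
    "control_room": asset_risk(0.30, 100.0, p_recovery=0.5),
    "pump_station": asset_risk(0.60, 40.0, p_recovery=0.1),
    "alarm_server": asset_risk(0.45, 70.0, p_recovery=0.2),
}

# Highest-risk assets first; re-running with modified inputs is the
# "what if" analysis for candidate mitigation measures.
ranking = sorted(assets, key=assets.get, reverse=True)
```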

  14. Assessing Students' Critical Thinking Performance: Urging for Measurements Using Multi-Response Format

    ERIC Educational Resources Information Center

    Ku, Kelly Y. L.

    2009-01-01

The current paper discusses ambiguities in critical thinking assessment. The paper first reviews the components of critical thinking. It then discusses the features and issues of commonly used critical thinking tests and to what extent they align with the conceptualization of critical thinking. The paper argues that critical thinking…

  15. Computer Viruses: An Assessment of Student Perceptions.

    ERIC Educational Resources Information Center

    Jones, Mary C.; Arnett, Kirk P.

    1992-01-01

    A majority of 213 college business students surveyed had knowledge of computer viruses; one-fourth had been exposed to them. Many believed that computer professionals are responsible for prevention and cure. Educators should make students aware of multiple sources of infection, the breadth and extent of possible damage, and viral detection and…

  16. Report on the 2011 Critical Assessment of Function Annotation (CAFA) meeting

    SciTech Connect

    Friedberg, Iddo

    2015-01-21

    The Critical Assessment of Function Annotation meeting was held July 14-15, 2011 at the Austria Conference Center in Vienna, Austria. There were 73 registered delegates at the meeting. We thank the DOE for this award. It helped us organize and support a scientific meeting AFP 2011 as a special interest group (SIG) meeting associated with the ISMB 2011 conference. The conference was held in Vienna, Austria, in July 2011. The AFP SIG was held on July 15-16, 2011 (immediately preceding the conference). The meeting consisted of two components, the first being a series of talks (invited and contributed) and discussion sections dedicated to protein function research, with an emphasis on the theory and practice of computational methods utilized in functional annotation. The second component provided a large-scale assessment of computational methods through participation in the Critical Assessment of Functional Annotation (CAFA). The meeting was exciting and, based on feedback, quite successful. There were 73 registered participants. The schedule was only slightly different from the one proposed, due to two cancellations. Dr. Olga Troyanskaya has canceled and we invited Dr. David Jones instead. Similarly, instead of Dr. Richard Roberts, Dr. Simon Kasif gave a closing keynote. The remaining invited speakers were Janet Thornton (EBI) and Amos Bairoch (University of Geneva).

  17. The Acceptance and Use of Computer Based Assessment

    ERIC Educational Resources Information Center

    Terzis, Vasileios; Economides, Anastasios A.

    2011-01-01

    The effective development of a computer based assessment (CBA) depends on students' acceptance. The purpose of this study is to build a model that demonstrates the constructs that affect students' behavioral intention to use a CBA. The proposed model, Computer Based Assessment Acceptance Model (CBAAM) is based on previous models of technology…

  18. Assessing Mathematics Automatically Using Computer Algebra and the Internet

    ERIC Educational Resources Information Center

    Sangwin, Chris

    2004-01-01

    This paper reports some recent developments in mathematical computer-aided assessment which employs computer algebra to evaluate students' work using the Internet. Technical and educational issues raised by this use of computer algebra are addressed. Working examples from core calculus and algebra which have been used with first year university…
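Systems of this kind hand the student's answer to a computer algebra system and test whether it is equivalent to the model answer. A stdlib-only stand-in for that equivalence test is to spot-check both expressions numerically at random points; the expressions, tolerance, and sampling range below are illustrative assumptions, not the system the paper describes.

```python
import random

def equivalent(expr_a, expr_b, var="x", trials=50, tol=1e-9):
    """Spot-check whether two single-variable expressions agree by
    evaluating both at random points -- a lightweight stand-in for a
    full computer-algebra equivalence test."""
    for _ in range(trials):
        point = {var: random.uniform(-10.0, 10.0)}
        try:
            a = eval(expr_a, {"__builtins__": {}}, dict(point))
            b = eval(expr_b, {"__builtins__": {}}, dict(point))
        except (ZeroDivisionError, OverflowError):
            continue  # skip points outside either expression's domain
        if abs(a - b) > tol * max(1.0, abs(a), abs(b)):
            return False
    return True

random.seed(0)  # reproducible sampling for this sketch

# A student's expanded answer vs the model answer (x + 1)**2.
correct = equivalent("x**2 + 2*x + 1", "(x + 1)**2")
wrong = equivalent("x**2 + 2*x", "(x + 1)**2")
```

Numerical spot-checking can be fooled by carefully chosen expressions, which is one reason production systems prefer a symbolic equivalence test.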

  19. Data on NAEP 2011 writing assessment prior computer use.

    PubMed

    Tate, Tamara P; Warschauer, Mark; Abedi, Jamal

    2016-09-01

    This data article contains information based on the 2011 National Assessment of Educational Progress in Writing Restricted-Use Data, available from the National Center for Education Statistics (NCES Pub. No. 2014476). https://nces.ed.gov/nationsreportcard/researchcenter/datatools.aspx. The data include the statistical relationships between survey reports of teachers and students regarding prior use of computers and other technology and writing achievement levels on the 2011 computer-based NAEP writing assessment. This data article accompanies "The Effects of Prior Computer Use on Computer-Based Writing: The 2011 NAEP Writing Assessment" [1]. PMID:27508253

  20. Critical Assessment of the Evidence for Striped Nanoparticles

    PubMed Central

    Stirling, Julian; Lekkas, Ioannis; Sweetman, Adam; Djuranovic, Predrag; Guo, Quanmin; Pauw, Brian; Granwehr, Josef; Lévy, Raphaël; Moriarty, Philip

    2014-01-01

    There is now a significant body of literature which reports that stripes form in the ligand shell of suitably functionalised Au nanoparticles. This stripe morphology has been proposed to strongly affect the physicochemical and biochemical properties of the particles. We critique the published evidence for striped nanoparticles in detail, with a particular focus on the interpretation of scanning tunnelling microscopy (STM) data (as this is the only technique which ostensibly provides direct evidence for the presence of stripes). Through a combination of an exhaustive re-analysis of the original data, in addition to new experimental measurements of a simple control sample comprising entirely unfunctionalised particles, we show that all of the STM evidence for striped nanoparticles published to date can instead be explained by a combination of well-known instrumental artefacts, or by issues with data acquisition/analysis protocols. We also critically re-examine the evidence for the presence of ligand stripes which has been claimed to have been found from transmission electron microscopy, nuclear magnetic resonance spectroscopy, small angle neutron scattering experiments, and computer simulations. Although these data can indeed be interpreted in terms of stripe formation, we show that the reported results can alternatively be explained as arising from a combination of instrumental artefacts and inadequate data analysis techniques. PMID:25402426

  3. Critical Zone Experimental Design to Assess Soil Processes and Function

    NASA Astrophysics Data System (ADS)

    Banwart, Steve

    2010-05-01

Through unsustainable land use practices, mining, deforestation, urbanisation and degradation by industrial pollution, soil losses are now hypothesized to be much faster (100 times or more) than soil formation, with the consequence that soil has become a finite resource. The crucial challenge for the international research community is to understand the rates of processes that dictate soil mass stocks and their function within Earth's Critical Zone (CZ). The CZ is the environment where soils are formed, degrade and provide their essential ecosystem services. Key among these ecosystem services are food and fibre production, filtering, buffering and transformation of water, nutrients and contaminants, storage of carbon and maintaining biological habitat and genetic diversity. We have initiated a new research project to address the priority research areas identified in the European Union Soil Thematic Strategy and to contribute to the development of a global network of Critical Zone Observatories (CZO) committed to soil research. Our hypothesis is that the combined physical-chemical-biological structure of soil can be assessed from first principles and the resulting soil functions can be quantified in process models that couple the formation and loss of soil stocks with descriptions of biodiversity and nutrient dynamics. The objectives of this research are to 1. Describe from first principles how soil structure influences processes and functions of soils, 2. Establish 4 European Critical Zone Observatories to link with established CZOs, 3. Develop a CZ Integrated Model of soil processes and function, 4. Create a GIS-based modelling framework to assess soil threats and mitigation at EU scale, 5. Quantify impacts of changing land use, climate and biodiversity on soil function and its value and 6. Form with international partners a global network of CZOs for soil research and deliver a programme of public outreach and research transfer on soil sustainability.

  4. A critically appraised topic review of computer-aided design/computer-aided machining of removable partial denture frameworks.

    PubMed

    Lang, Lisa A; Tulunoglu, Ibrahim

    2014-01-01

    A critically appraised topic (CAT) review is presented about the use of computer-aided design (CAD)/computer-aided machining (CAM) removable partial denture (RPD) frameworks. A systematic search of the literature supporting CAD/CAM RPD systems revealed no randomized clinical trials, hence the CAT review was performed. A PubMed search yielded 9 articles meeting the inclusion criteria. Each article was characterized by study design and level of evidence. No clinical outcomes research has been published on the use of CAD/CAM RPDs. Low levels of evidence were found in the available literature. Clinical research studies are needed to determine the efficacy of this treatment modality.

  5. A Computer Assessment Tool for Concept Mapping

    ERIC Educational Resources Information Center

    Akkaya, Recai; Karakirik, Erol; Durmus, Soner

    2005-01-01

    Current educational theories emphasize assessment as a vital part of teaching-learning process. Alternative assessment techniques aim to expose and promote the process of the learning rather than the final outcome. Concept mapping is a technique for representing conceptual knowledge and relationships between concepts in a graphical form. Requiring…

  6. Application of queueing models to multiprogrammed computer systems operating in a time-critical environment

    NASA Technical Reports Server (NTRS)

    Eckhardt, D. E., Jr.

    1979-01-01

A model of a central processor (CPU) which services background applications in the presence of time-critical activity is presented. The CPU is viewed as an M/M/1 queueing system subject to periodic interrupts by a deterministic, time-critical process. The Laplace transform of the distribution of service times for the background applications is developed. The use of state-of-the-art queueing models for studying the background processing capability of time-critical computer systems is discussed, and the results of a model validation study which support this application of queueing models are presented.
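The standard M/M/1 relations behind such a model can be illustrated with a toy calculation. The sketch below is not the paper's Laplace-transform treatment: the "interrupt adjustment" (scaling the service rate by the fraction of CPU time left over after the periodic time-critical task) is a crude illustrative approximation, and all rates and durations are hypothetical.

```python
# Illustrative M/M/1 background-processing sketch (not the paper's model).
# Background jobs arrive at rate lam and are served at rate mu; a periodic
# time-critical task of duration d pre-empts the CPU every T time units.

def mm1_metrics(lam: float, mu: float) -> dict:
    """Standard M/M/1 steady-state results (requires lam < mu)."""
    rho = lam / mu                            # server utilization
    if rho >= 1.0:
        raise ValueError("unstable queue: lam must be < mu")
    return {
        "utilization": rho,
        "mean_in_system": rho / (1 - rho),    # L
        "mean_response_time": 1 / (mu - lam), # W; Little's law gives L = lam * W
    }

def interrupt_adjusted(lam: float, mu: float, d: float, T: float) -> dict:
    """Crude approximation: periodic interrupts of length d every T
    reduce the CPU capacity available to background work."""
    available = 1.0 - d / T
    return mm1_metrics(lam, mu * available)

base = mm1_metrics(lam=2.0, mu=5.0)
loaded = interrupt_adjusted(lam=2.0, mu=5.0, d=0.02, T=0.1)
```

With the hypothetical numbers above, the interrupts cut the effective service rate from 5.0 to 4.0, raising utilization from 0.4 to 0.5 and mean response time from 1/3 to 1/2.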

  7. Does Computer-Aided Formative Assessment Improve Learning Outcomes?

    ERIC Educational Resources Information Center

    Hannah, John; James, Alex; Williams, Phillipa

    2014-01-01

    Two first-year engineering mathematics courses used computer-aided assessment (CAA) to provide students with opportunities for formative assessment via a series of weekly quizzes. Most students used the assessment until they achieved very high (>90%) quiz scores. Although there is a positive correlation between these quiz marks and the final…

  8. Experiences of Using Automated Assessment in Computer Science Courses

    ERIC Educational Resources Information Center

    English, John; English, Tammy

    2015-01-01

    In this paper we discuss the use of automated assessment in a variety of computer science courses that have been taught at Israel Academic College by the authors. The course assignments were assessed entirely automatically using Checkpoint, a web-based automated assessment framework. The assignments all used free-text questions (where the students…

  9. Assessing Critical Thinking in Higher Education: The HEIghten™ Approach and Preliminary Validity Evidence

    ERIC Educational Resources Information Center

    Liu, Ou Lydia; Mao, Liyang; Frankel, Lois; Xu, Jun

    2016-01-01

    Critical thinking is a learning outcome highly valued by higher education institutions and the workforce. The Educational Testing Service (ETS) has designed a next generation assessment, the HEIghten™ critical thinking assessment, to measure students' critical thinking skills in analytical and synthetic dimensions. This paper introduces the…

  10. Empirically Assessing the Importance of Computer Skills

    ERIC Educational Resources Information Center

    Baker, William M.

    2013-01-01

    This research determines which computer skills are important for entry-level accountants, and whether some skills are more important than others. Students participated before and after internships in public accounting. Longitudinal analysis is also provided; responses from 2001 are compared to those from 2008-2009. Responses are also compared to…

  11. Critical Assessment of Current Force Fields. Short Peptide Test Case.

    PubMed

    Vymětal, Jiří; Vondrášek, Jiří

    2013-01-01

The applicability of molecular dynamics simulations for studies of protein folding or intrinsically disordered proteins critically depends on the quality of the energy functions (force fields). The four popular force fields for biomolecular simulations, CHARMM22/CMAP, AMBER FF03, AMBER FF99SB, and OPLS-AA/L, were compared in prediction of conformational propensities of all common proteinogenic amino acids. The minimalistic model of terminally blocked amino acids (dipeptides) was chosen for assessment of side chain effects on backbone propensities. The precise metadynamics simulations revealed a striking inconsistency of trends in conformational preferences as manifested by the investigated force fields for both backbone and side chains. To trace this disagreement between force fields, the two related AMBER force fields were studied more closely. In the cases of FF99SB and FF03, we uncovered that the distinct trends were driven by different charge models. Additionally, the effects of a recent correction for side chain torsions (FF99SB-ILDN) were examined for the affected amino acids and exposed significant coupling between free energy profiles and propensities of backbone and side chain conformers. These findings have important consequences for further force field development.

  12. A critical assessment of wind tunnel results for the NACA 0012 airfoil

    NASA Technical Reports Server (NTRS)

    Mccroskey, W. J.

    1987-01-01

    A large body of experimental results, obtained in more than 40 wind tunnels on a single, well-known two-dimensional configuration, has been critically examined and correlated. An assessment of some of the possible sources of error has been made for each facility, and data which are suspect have been identified. It was found that no single experiment provided a complete set of reliable data, although one investigation stands out as superior in many respects. However, from the aggregate of data the representative properties of the NACA 0012 airfoil can be identified with reasonable confidence over wide ranges of Mach number, Reynolds number, and angles of attack. This synthesized information can now be used to assess and validate existing and future wind tunnel results and to evaluate advanced Computational Fluid Dynamics codes.

  13. Critical Literacy in School-College Collaboration through Computer Networking: A Feminist Research Project.

    ERIC Educational Resources Information Center

    Fey, Marion

    1998-01-01

    Investigates the practice of critical literacy through asynchronous computer networking as students in a school-college collaboration examined assumptions relating to gender issues. Finds the medium proved to be an apt environment--students named experiences relating to gender issues that touched their lives, and students felt free to share ideas…

  14. Two Configurations for Accessing Classroom Computers: Differential Impact on Students' Critical Reflections and Their Empowerment

    ERIC Educational Resources Information Center

    Solhaug, T.

    2009-01-01

    The context of this article is the new technological environment and the struggle to use meaningful teaching practices in Norwegian schools. Students' critical reflections in two different technological learning environments in six upper secondary schools are compared. Three of these schools offer Internet-connected computers in special computer…

  15. Computer assessment of atherosclerosis from angiographic images

    NASA Technical Reports Server (NTRS)

    Selzer, R. H.; Blankenhorn, D. H.; Brooks, S. H.; Crawford, D. W.; Cashin, W. L.

    1982-01-01

A computer method for detection and quantification of atherosclerosis from angiograms has been developed and used to measure lesion change in human clinical trials. The technique involves tracking the vessel edges and measuring individual lesions as well as the overall irregularity of the arterial image. Application of the technique to conventional arterial-injection femoral and coronary angiograms is outlined, and an experimental study to extend the technique to analysis of intravenous angiograms of the carotid and coronary arteries is described.

  16. Overview of Risk Mitigation for Safety-Critical Computer-Based Systems

    NASA Technical Reports Server (NTRS)

    Torres-Pomales, Wilfredo

    2015-01-01

    This report presents a high-level overview of a general strategy to mitigate the risks from threats to safety-critical computer-based systems. In this context, a safety threat is a process or phenomenon that can cause operational safety hazards in the form of computational system failures. This report is intended to provide insight into the safety-risk mitigation problem and the characteristics of potential solutions. The limitations of the general risk mitigation strategy are discussed and some options to overcome these limitations are provided. This work is part of an ongoing effort to enable well-founded assurance of safety-related properties of complex safety-critical computer-based aircraft systems by developing an effective capability to model and reason about the safety implications of system requirements and design.

  17. Demonstration Assessment: Measuring Conceptual Understanding and Critical Thinking with Rubrics.

    ERIC Educational Resources Information Center

    Radford, David L.; And Others

    1995-01-01

    Presents the science demonstration assessment as an authentic- assessment technique to assess whether students understand basic science concepts and can use them to solve problems. Uses rubrics to prepare students for the assessment and to assign final grades. Provides examples of science demonstration assessments and the scoring of rubrics in the…

  18. Computational Toxicology in Cancer Risk Assessment

    EPA Science Inventory

Risk assessment over the last half century has, for many individual cases, served us well, but has proceeded at an extremely slow pace and has left us with considerable uncertainty. There are certainly thousands of compounds and thousands of exposure scenarios that remain untested…

  19. Assessing Knowledge Change in Computer Science

    ERIC Educational Resources Information Center

    Nash, Jane Gradwohl; Bravaco, Ralph J.; Simonson, Shai

    2006-01-01

    The purpose of this study was to assess structural knowledge change across a two-week workshop designed to provide high-school teachers with training in Java and Object Oriented Programming. Both before and after the workshop, teachers assigned relatedness ratings to pairs of key concepts regarding Java and Object Oriented Programming. Their…

  20. Developing a Critical Lens among Preservice Teachers while Working within Mandated Performance-Based Assessment Systems

    ERIC Educational Resources Information Center

    Moss, Glenda

    2008-01-01

    This article addresses the dilemma of promoting critical pedagogy within portfolio assessment, which has been implemented in many teacher education programs to meet state and national mandates for performance-based assessment. It explores how one teacher educator works to move portfolio assessment to a level of critical self-reflection that…

  1. Perforated Appendicitis: Assessment With Multidetector Computed Tomography.

    PubMed

    Iacobellis, Francesca; Iadevito, Isabella; Romano, Federica; Altiero, Michele; Bhattacharjee, Bikram; Scaglione, Mariano

    2016-02-01

Appendicitis is one of the most common abdominal surgical emergencies. In some cases, the correct diagnosis may be challenging, owing to different conditions that can mimic this pathology. In this context, abdominal computed tomography (CT) is the imaging modality of choice, leading to an accurate diagnosis and to a reduction in unnecessary laparotomies. The diagnosis of perforated appendix is crucial, but the detection of the perforation signs by CT may not be so simple early in the process. The aim of this article is to review the multidetector CT signs of perforated appendicitis.

  2. A Model-based Framework for Risk Assessment in Human-Computer Controlled Systems

    NASA Technical Reports Server (NTRS)

    Hatanaka, Iwao

    2000-01-01

The rapid growth of computer technology and innovation has played a significant role in the rise of computer automation of human tasks in modern production systems across all industries. Although the rationale for automation has been to eliminate "human error" or to relieve humans from manual repetitive tasks, various computer-related hazards and accidents have emerged as a direct result of increased system complexity attributed to computer automation. The risk assessment techniques utilized for electromechanical systems are not suitable for today's software-intensive systems or complex human-computer controlled systems. This thesis will propose a new systemic model-based framework for analyzing risk in safety-critical systems where both computers and humans are controlling safety-critical functions. A new systems accident model will be developed based upon modern systems theory and human cognitive processes to better characterize system accidents, the role of human operators, and the influence of software in its direct control of significant system functions. Better risk assessments will then be achievable through the application of this new framework to complex human-computer controlled systems.

  3. Use of Computer Assisted Assessment: Benefits to Students and Staff.

    ERIC Educational Resources Information Center

    Stephens, Derek

    2001-01-01

    Compares the use of computers with traditional paper and pencil to deliver objective tests for summative assessment with undergraduates in the United Kingdom. Considers issues of gender differences, objective testing, computer anxiety, and benefits to staff and students, and recommends the need for pre-test preparation and practice testing.…

  4. Using Computer-Assisted Assessment Heuristics for Usability Evaluations

    ERIC Educational Resources Information Center

    Sim, Gavin; Read, Janet C.

    2016-01-01

    Teaching practices within educational institutions have evolved through the increased adoption of technology to deliver the curriculum and the use of computers for assessment purposes. For educational technologists, there is a vast array of commercial computer applications available for the delivery of objective tests, and in some instances,…

  5. International Computer and Information Literacy Study: Assessment Framework

    ERIC Educational Resources Information Center

    Fraillon, Julian; Schulz, Wolfram; Ainley, John

    2013-01-01

    The purpose of the International Computer and Information Literacy Study 2013 (ICILS 2013) is to investigate, in a range of countries, the ways in which young people are developing "computer and information literacy" (CIL) to support their capacity to participate in the digital age. To achieve this aim, the study will assess student…

  6. Modelling Critical Thinking through Learning-Oriented Assessment

    ERIC Educational Resources Information Center

    Lombard, B. J. J.

    2008-01-01

    One of the cornerstones peculiar to the outcomes-based approach adopted by the South African education and training sector is the so-called "critical outcomes". Included in one of these outcomes is the ability to think critically. Although this outcome articulates well with the cognitive domain of holistic development, it also gives rise to some…

  7. Developing Critical Thinking Skills: Assessing the Effectiveness of Workbook Exercises

    ERIC Educational Resources Information Center

    Wallace, Elise D.; Jefferson, Renee N.

    2015-01-01

    To address the challenge of developing critical thinking skills in college students, this empirical study examines the effectiveness of cognitive exercises in developing those skills. The study uses Critical Thinking: Building the Basics by Walter, Knudsvig, and Smith (2003). This workbook is specifically designed to exercise and develop critical…

  8. Assessing Critical Thinking Performance of Postgraduate Students in Threaded Discussions

    ERIC Educational Resources Information Center

    Tan, Cheng Lee; Ng, Lee Luan

    2014-01-01

    Critical thinking has increasingly been seen as one of the important attributes where human capital is concerned and in line with this recognition, the tertiary educational institutions worldwide are putting more effort into designing courses that produce university leavers who are critical thinkers. This study aims to investigate the critical…

  9. Critical Thinking Assessment: Measuring a Moving Target. Report & Recommendations of the South Carolina Higher Education Assessment Network Critical Thinking Task Force.

    ERIC Educational Resources Information Center

    Cook, Patricia; Johnson, Reid; Moore, Phil; Myers, Phyllis; Pauly, Susan; Pendarvis, Faye; Prus, Joe; Ulmer-Sottong, Lovely

    This report is part of South Carolina's effort to move toward "100 percent performance funding" for the state's public colleges and universities and results from a task force's investigation of ways to assess critical thinking. The following eight major findings are reported: (1) policy makers must determine priorities; (2) critical thinking lacks…

  10. Assessment of the current computer literacy and future computer needs of emergency medicine residents and faculty.

    PubMed

    Debehnke, D J; Valley, V T

    1993-07-01

The purpose of this study was to assess the current computer literacy and future computer needs of emergency medicine residents and faculty to aid in developing a computer literacy curriculum. All emergency medicine residents and full-time faculty from a random sample of emergency medicine residencies were mailed questionnaires assessing current computer familiarity and future computer needs. Twenty-one residencies were surveyed; 15 resident and 17 faculty questionnaires were returned. Thirty-seven percent (116 of 314) faculty and 29% (135 of 470) resident questionnaires were completed and returned. Eighty percent (12 of 15) of residencies had a designated computer for resident use; 93% (14 of 15) had a computer for use in the emergency department. Forty-seven percent of residents owned their own computer; 68% of faculty had a computer in their home, and 52% had computers in their office. Less than 30% of residents and faculty had formal computer training. Residents and faculty rated the current familiarity and future needs for various software applications on a five-point scale. Data were analyzed using the Wilcoxon rank-sum test. Residents and faculty had the most anticipated need for word processing, graphics, literature searching, data base, and patient management programs. Future computer need was rated significantly higher than current computer familiarity in all computer application areas (P < or = .0002). It seems that emergency medicine residents and faculty have adequate access to computers, but minimal computer training. Residents and faculty have a high anticipated need for various basic computer applications.(ABSTRACT TRUNCATED AT 250 WORDS) PMID:8216519

  11. Computational assessment of organic photovoltaic candidate compounds

    NASA Astrophysics Data System (ADS)

    Borunda, Mario; Dai, Shuo; Olivares-Amaya, Roberto; Amador-Bedolla, Carlos; Aspuru-Guzik, Alan

    2015-03-01

    Organic photovoltaic (OPV) cells are emerging as a possible renewable alternative to petroleum based resources and are needed to meet our growing demand for energy. Although not as efficient as silicon based cells, OPV cells have as an advantage that their manufacturing cost is potentially lower. The Harvard Clean Energy Project, using a cheminformatic approach of pattern recognition and machine learning strategies, has ranked a molecular library of more than 2.6 million candidate compounds based on their performance as possible OPV materials. Here, we present a ranking of the top 1000 molecules for use as photovoltaic materials based on their optical absorption properties obtained via time-dependent density functional theory. This computational search has revealed the molecular motifs shared by the set of most promising molecules.

  12. Portfolios Plus: A Critical Guide to Alternative Assessment.

    ERIC Educational Resources Information Center

    Mabry, Linda

    This book explains some basic assumptions that underlie different assessment systems, some connections between education and assessment, and some assessment options that have gone unrecognized. The discussion serves as a guide to designing a custom assessment program individualized to fit the students, school, and community. Part 2 contains…

  13. Critical Assessment Issues in Work-Integrated Learning

    ERIC Educational Resources Information Center

    Ferns, Sonia; Zegwaard, Karsten E.

    2014-01-01

    Assessment has long been a contentious issue in work-integrated learning (WIL) and cooperative education. Despite assessment being central to the integrity and accountability of a university and long-standing theories around best practice in assessment, enacting quality assessment practices has proven to be more difficult. Authors in this special…

  14. Is Model-Based Development a Favorable Approach for Complex and Safety-Critical Computer Systems on Commercial Aircraft?

    NASA Technical Reports Server (NTRS)

    Torres-Pomales, Wilfredo

    2014-01-01

    A system is safety-critical if its failure can endanger human life or cause significant damage to property or the environment. State-of-the-art computer systems on commercial aircraft are highly complex, software-intensive, functionally integrated, and network-centric systems of systems. Ensuring that such systems are safe and comply with existing safety regulations is costly and time-consuming as the level of rigor in the development process, especially the validation and verification activities, is determined by considerations of system complexity and safety criticality. A significant degree of care and deep insight into the operational principles of these systems is required to ensure adequate coverage of all design implications relevant to system safety. Model-based development methodologies, methods, tools, and techniques facilitate collaboration and enable the use of common design artifacts among groups dealing with different aspects of the development of a system. This paper examines the application of model-based development to complex and safety-critical aircraft computer systems. Benefits and detriments are identified and an overall assessment of the approach is given.

  15. Review of Estelle and LOTOS with respect to critical computer applications

    NASA Technical Reports Server (NTRS)

    Bown, Rodney L.

    1991-01-01

Man-rated NASA space vehicles seem to represent a set of ultimate critical computer applications. These applications require a high degree of security, integrity, and safety. A variety of formal and/or precise modeling techniques are becoming available for the designer of critical systems. The design phase of the software engineering life cycle includes the modification of non-development components. A review of the Estelle and LOTOS formal description languages is presented. Details of the languages and a set of references are provided. The languages were used to formally describe some of the Open System Interconnect (OSI) protocols.

  16. Perceptions of University Students regarding Computer Assisted Assessment

    ERIC Educational Resources Information Center

    Jamil, Mubashrah

    2012-01-01

    Computer assisted assessment (CAA) is a common technique of assessment in higher educational institutions in Western countries, but a relatively new concept for students and teachers in Pakistan. It was therefore interesting to investigate students' perceptions about CAA practices from different universities of Pakistan. Information was collected…

  17. eWorkbook: A Computer Aided Assessment System

    ERIC Educational Resources Information Center

    Costagliola, Gennaro; Ferrucci, Filomena; Fuccella, Vittorio; Oliveto, Rocco

    2007-01-01

    Computer aided assessment (CAA) tools are more and more widely adopted in academic environments mixed to other assessment means. In this article, we present a CAA Web application, named eWorkbook, which can be used for evaluating learner's knowledge by creating (the tutor) and taking (the learner) on-line tests based on multiple choice, multiple…

  18. Computer-Aided Self Assessment: The Intelligent Answer?

    ERIC Educational Resources Information Center

    Waite, Alice; Goodman, Linda M.

    1989-01-01

    Describes the development of a computer-assisted self assessment system in the United Kingdom that was designed to explore the use of artificial intelligence techniques in the area of self assessment for training applications. The expert systems used are explained, development of a pilot prototype is outlined, and field tests are described. (eight…

  19. OECD/NEA expert group on uncertainty analysis for criticality safety assessment: Results of benchmark on sensitivity calculation (phase III)

    SciTech Connect

    Ivanova, T.; Laville, C.; Dyrda, J.; Mennerdahl, D.; Golovko, Y.; Raskach, K.; Tsiboulia, A.; Lee, G. S.; Woo, S. W.; Bidaud, A.; Sabouri, P.; Bledsoe, K.; Rearden, B.; Gulliford, J.; Michel-Sendis, F.

    2012-07-01

The sensitivities of the k_eff eigenvalue to neutron cross sections have become commonly used in similarity studies and as part of the validation algorithm for criticality safety assessments. To test calculations of the sensitivity coefficients, a benchmark study (Phase III) has been established by the OECD-NEA/WPNCS/EG UACSA (Expert Group on Uncertainty Analysis for Criticality Safety Assessment). This paper presents some sensitivity results generated by the benchmark participants using various computational tools based upon different computational methods: SCALE/TSUNAMI-3D and -1D, MONK, APOLLO2-MORET 5, DRAGON-SUSD3D and MMKKENO. The study demonstrates the performance of the tools. It also illustrates how model simplifications impact the sensitivity results and demonstrates the importance of 'implicit' (self-shielding) sensitivities. This work has been a useful step towards verification of the existing and developed sensitivity analysis methods. (authors)
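The relative sensitivity coefficients discussed above, (σ/k)(∂k/∂σ), can be illustrated on a one-group infinite-medium toy problem, where the analytic values are +1 with respect to the production cross section and -1 with respect to absorption. This is only an illustrative finite-difference sketch with hypothetical cross-section values; production tools such as TSUNAMI obtain these coefficients from adjoint-based perturbation theory, not by direct perturbation.

```python
# Toy finite-difference sensitivity sketch (illustrative only).
# One-group infinite-medium multiplication factor: k = nu_sigma_f / sigma_a.

def k_inf(nu_sigma_f: float, sigma_a: float) -> float:
    return nu_sigma_f / sigma_a

def sensitivity(f, x: float, rel: float = 1e-6) -> float:
    """Relative sensitivity (x/k)(dk/dx) by central difference."""
    h = x * rel
    dk = (f(x + h) - f(x - h)) / (2 * h)
    return x * dk / f(x)

# hypothetical one-group data: nu*sigma_f = 1.2, sigma_a = 0.5
s_prod = sensitivity(lambda s: k_inf(s, 0.5), 1.2)  # w.r.t. nu*sigma_f
s_abs = sensitivity(lambda s: k_inf(1.2, s), 0.5)   # w.r.t. sigma_a
# analytic values: +1 and -1
```

Because k is linear in the production term and inversely proportional to absorption, the computed coefficients recover +1 and -1 to within the finite-difference error, a useful sanity check for any sensitivity tool.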

  20. Critical phenomena in communication/computation networks with various topologies and suboptimal to optimal resource allocation

    NASA Astrophysics Data System (ADS)

    Cogoni, Marco; Busonera, Giovanni; Anedda, Paolo; Zanetti, Gianluigi

    2015-01-01

    We generalize previous studies on critical phenomena in communication networks [1,2] by adding computational capabilities to the nodes. In our model, a set of tasks with random origin, destination and computational structure is distributed on a computational network, modeled as a graph. By varying the temperature of a Metropolis Monte Carlo algorithm, we explore the global latency for an optimal to suboptimal resource assignment at a given time instant. By computing the two-point correlation function for the local overload, we study the behavior of the correlation distance (both for links and nodes) while approaching the congested phase: a transition from a peaked to a spread g(r) is seen above a critical (Monte Carlo) temperature Tc. The average latency trend of the system is predicted by averaging over several network traffic realizations while maintaining spatially detailed information for each node: a sharp decrease in performance is found above Tc independently of the workload. The globally optimized computational resource allocation and network routing define a baseline for a future comparison of the transition behavior with respect to existing routing strategies [3,4] for different network topologies.
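
As a toy illustration of the Metropolis scheme described above (the cost model and all names here are hypothetical, not taken from the paper), one can anneal a random task-to-node assignment toward the minimum-latency one; the temperature T then interpolates between the suboptimal and optimal regimes:

```python
import math
import random

def metropolis_assign(costs, n_tasks, n_nodes, T, steps, seed=0):
    """Toy Metropolis search over task-to-node assignments.

    costs[t][n] is a hypothetical latency of running task t on node n;
    the global latency is the sum over all tasks.  A proposed move is
    accepted with probability min(1, exp(-delta/T)), so a high T yields
    a near-random (suboptimal) assignment while T -> 0 approaches the
    optimal one.
    """
    rng = random.Random(seed)
    assign = [rng.randrange(n_nodes) for _ in range(n_tasks)]
    total = sum(costs[t][assign[t]] for t in range(n_tasks))
    for _ in range(steps):
        t = rng.randrange(n_tasks)        # pick a random task
        node = rng.randrange(n_nodes)     # propose a random node for it
        delta = costs[t][node] - costs[t][assign[t]]
        if delta <= 0 or rng.random() < math.exp(-delta / T):
            assign[t] = node              # accept the move
            total += delta
    return assign, total
```

Sweeping T over many random `costs` realizations and recording `total` would trace out the latency-versus-temperature behavior studied in the paper.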

  1. Actor-critic models of the basal ganglia: new anatomical and computational perspectives.

    PubMed

    Joel, Daphna; Niv, Yael; Ruppin, Eytan

    2002-01-01

    A large number of computational models of information processing in the basal ganglia have been developed in recent years. Prominent in these are actor-critic models of basal ganglia functioning, which build on the strong resemblance between dopamine neuron activity and the temporal difference prediction error signal in the critic, and between dopamine-dependent long-term synaptic plasticity in the striatum and learning guided by a prediction error signal in the actor. We selectively review several actor-critic models of the basal ganglia with an emphasis on two important aspects: the way in which models of the critic reproduce the temporal dynamics of dopamine firing, and the extent to which models of the actor take into account known basal ganglia anatomy and physiology. To complement the efforts to relate basal ganglia mechanisms to reinforcement learning (RL), we introduce an alternative approach to modeling a critic network, which uses Evolutionary Computation techniques to 'evolve' an optimal RL mechanism, and relate the evolved mechanism to the basic model of the critic. We conclude our discussion of models of the critic by a critical discussion of the anatomical plausibility of implementations of a critic in basal ganglia circuitry, and conclude that such implementations build on assumptions that are inconsistent with the known anatomy of the basal ganglia. We return to the actor component of the actor-critic model, which is usually modeled at the striatal level with very little detail. We describe an alternative model of the basal ganglia which takes into account several important, and previously neglected, anatomical and physiological characteristics of basal ganglia-thalamocortical connectivity and suggests that the basal ganglia performs reinforcement-biased dimensionality reduction of cortical inputs. We further suggest that since such selective encoding may bias the representation at the level of the frontal cortex towards the selection of rewarded

  2. A Critical Examination of PISA's Assessment on Scientific Literacy

    ERIC Educational Resources Information Center

    Lau, Kwok-Chi

    2009-01-01

    The OECD "Programme for International Student Assessment" (PISA) is one of the largest-scale international efforts that have been launched to assess students' scientific literacy. Such an international assessment would likely exert a profound impact on the science education policies of the participating countries/regions, including Hong Kong.…

  3. Critical Thinking and Political Participation: Development and Assessment of a Causal Model.

    ERIC Educational Resources Information Center

    Guyton, Edith M.

    1988-01-01

    This study assessed a model of the relationship between critical thinking and political participation. Findings indicated that critical thinking has indirect positive effects on orientations toward political participation, that critical thinking positively affects personal control, political efficacy, and democratic attitude, and that personal…

  4. An Assessment of Post-Professional Athletic Training Students' Critical Thinking Skills and Dispositions

    ERIC Educational Resources Information Center

    Walter, Jessica Marie

    2013-01-01

    The need for outcome measures in critical thinking skills and dispositions for post-professional athletic training programs (PPATPs) is significant. It has been suggested that athletic trainers who are competent and disposed towards thinking critically will be successful in the profession. The purpose of this study is to assess critical thinking…

  5. Does Computer-Based Motor Skill Assessment Training Transfer to Live Assessing?

    ERIC Educational Resources Information Center

    Kelly, Luke E.; Taliaferro, Andrea; Krause, Jennifer

    2012-01-01

    Developing competency in motor skill assessment has been identified as a critical need in physical educator preparation. We conducted this study to evaluate (a) the effectiveness of a web-based instructional program--Motor Skill Assessment Program (MSAP)--for developing assessment competency, and specifically (b) whether competency developed by…

  6. Criticism or praise? The impact of verbal versus text-only computer feedback on social presence, intrinsic motivation, and recall.

    PubMed

    Bracken, Cheryl Campanella; Jeffres, Leo W; Neuendorf, Kimberly A

    2004-06-01

    The Computers Are Social Actors (CASA) paradigm asserts that human computer users interact socially with computers, and has provided extensive evidence that this is the case. In this experiment (n = 134), participants received either praise or criticism from a computer. Independent variables were the direction of feedback (praise or criticism) and the voice channel (verbal or text-only). Dependent variables, measured via a computer-based questionnaire, were recall, perceived ability, intrinsic motivation, and perceptions of the computer as a social entity. Results demonstrate that participants reacted to computers as predicted by interpersonal communication research, with participants who received text-only criticism reporting higher levels of intrinsic motivation, perceived ability, and recall; additionally, the computer was seen as more intelligent. Implications for theory and application are discussed.

  7. Evaluation of theoretical critical angle including mass effects for channeling by computer simulation

    NASA Astrophysics Data System (ADS)

    Takeuchi, Wataru

    2011-06-01

    The critical angles calculated with the theory of Zheng et al., which includes mass effects, for the axial channeling of ions have been investigated by computer simulation, in comparison with the theory of Lindhard and with the precise formula from Barrett's numerical simulations. The computer simulations employed the ACOCT program code, which treats atomic collisions three-dimensionally and is based on the binary collision approximation (BCA), and covered the channeling of He, Ne, Ar, Kr, Xe and Rn ions incident along the <1 0 0> axis in Al, Cu, Ag and Pt crystals. The slight dependence of the channeling critical angle on the atomic number of the incident ion in the ACOCT results agrees with that calculated from the theory with mass effects. The average critical angles in the ACOCT results for the channeling of the six rare gas ions are approximately 5.0/Z2 times the theoretical critical angles with mass effects, where Z2 is the atomic number of the crystal atom. In addition, the results show that the critical angles calculated with the theory with mass effects are substantially larger than those from the theory of Lindhard, Barrett's formula and the formula from the ACOCT simulations for He ions impinging on Al, Cu, Ag and Pt crystals, and that the channeling critical angles in the ACOCT results agree well with those calculated from Barrett's formula for 0.6-50 MeV He ions incident on Cu and Ag crystals and 5-50 MeV He ions impinging on Al and Pt crystals.
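
For reference, Lindhard's axial critical angle mentioned above takes the standard textbook form (high-energy limit, Gaussian units; stated for context, not quoted from this record) for an ion of atomic number Z1 and energy E channeled along an atomic row of atomic number Z2 with interatomic spacing d:

```latex
\psi_1 \;=\; \sqrt{\frac{2\,Z_1 Z_2\, e^2}{E\, d}}
```

Note that this expression depends on the projectile only through its charge and energy; the Zheng et al. theory cited in the abstract adds mass-dependent corrections on top of it.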

  8. Using student writing assignments to assess critical thinking skills: a holistic approach.

    PubMed

    Niedringhaus, L K

    2001-04-01

    This work offers an example of one school's holistic approach to the evaluation of critical thinking by using student writing assignments. Faculty developed tools to assess achievement of critical thinking competencies, such as analysis, synthesis, insight, reflection, open mindedness, and depth, breadth, and appropriateness of clinical interventions. Faculty created a model for the development of program-specific critical thinking competencies, selected appropriate writing assignments that demonstrate critical thinking, and implemented a holistic assessment plan for data collection and analysis. Holistic assessment involves the identification of shared values and practices, and the use of concepts and language important to nursing.

  9. Assessment and treatment of hyperglycemia in critically ill patients

    PubMed Central

    Viana, Marina Verçoza; Moraes, Rafael Barberena; Fabbrin, Amanda Rodrigues; Santos, Manoella Freitas; Gerchman, Fernando

    2014-01-01

    Hyperglycemia is a commonly encountered issue in critically ill patients in the intensive care setting. The presence of hyperglycemia is associated with increased morbidity and mortality, regardless of the reason for admission (e.g., acute myocardial infarction, status post-cardiovascular surgery, stroke, sepsis). However, the pathophysiology and, in particular, the treatment of hyperglycemia in the critically ill patient remain controversial. In clinical practice, several aspects must be taken into account in the management of these patients, including blood glucose targets, history of diabetes mellitus, the route of nutrition (enteral or parenteral), and available monitoring equipment, which substantially increases the workload of providers involved in the patients' care. This review describes the epidemiology, pathophysiology, management, and monitoring of hyperglycemia in the critically ill adult patient. PMID:24770692

  11. Multiple Measures of Critical Thinking Skills and Predisposition in Assessment of Critical Thinking.

    ERIC Educational Resources Information Center

    Spicer, Karin-Leigh; Hanks, William E.

    A panel of 46 experts from philosophy and education defines critical thinking as "purposeful, self-regulatory judgment which results in interpretation, analysis, evaluation, and inference, as well as explanation of the evidential, conceptual, methodological, criteriological, or contextual considerations upon which that judgment is based." At…

  12. Assessing Reliability: Critical Corrections for a Critical Examination of the Rorschach Comprehensive System.

    ERIC Educational Resources Information Center

    Meyer, Gregory J.

    1997-01-01

    In reply to criticism of the Rorschach Comprehensive System (CS) by J. Wood, M. Nezworski, and W. Stejskal (1996), this article presents a meta-analysis of published data indicating that the CS has excellent chance-corrected interrater reliability. It is noted that the erroneous assumptions of Wood et al. make their assertions about validity…

  13. Moving beyond Assessment to Improving Students' Critical Thinking Skills: A Model for Implementing Change

    ERIC Educational Resources Information Center

    Haynes, Ada; Lisic, Elizabeth; Goltz, Michele; Stein, Barry; Harris, Kevin

    2016-01-01

    This research examines how the use of the CAT (Critical thinking Assessment Test) and involvement in CAT-Apps (CAT Applications within the discipline) training can serve as an important part of a faculty development model that assists faculty in the assessment of students' critical thinking skills and in the development of these skills within…

  14. Workplace Educators' Interpretations of Their Assessment Practices: A View through a Critical Practice Lens

    ERIC Educational Resources Information Center

    Trede, Franziska; Smith, Megan

    2014-01-01

    In this paper, we examine workplace educators' interpretations of their assessment practices. We draw on a critical practice lens to conceptualise assessment practice as a social, relational and situated practice that becomes critical through critique and emancipation. We conducted semi-structured interviews followed by roundtable discussions…

  15. The Halpern Critical Thinking Assessment and Real-World Outcomes: Cross-National Applications

    ERIC Educational Resources Information Center

    Butler, Heather A.; Dwyer, Christopher P.; Hogan, Michael J.; Franco, Amanda; Rivas, Silvia F.; Saiz, Carlos; Almeida, Leandro S.

    2012-01-01

    The Halpern Critical Thinking Assessment (HCTA) is a reliable measure of critical thinking that has been validated with numerous qualitatively different samples and measures of academic success (Halpern, 2010a). This paper presents several cross-national applications of the assessment, and recent work to expand the validation of the HCTA with…

  16. Providing Formative Feedback From a Summative Computer-aided Assessment

    PubMed Central

    Sewell, Robert D. E.

    2007-01-01

    Objectives: To examine the effectiveness of providing formative feedback for summative computer-aided assessment. Design: Two groups of first-year undergraduate life science students in pharmacy and neuroscience who were studying an e-learning package in a common pharmacology module were presented with a computer-based summative assessment. A sheet with individualized feedback derived from each of the 5 results sections of the assessment was provided to each student. Students were asked via a questionnaire to evaluate the form and method of feedback. Assessment: The students were able to reflect on their performance and use the feedback provided to guide their future study or revision. There was no significant difference between the responses from pharmacy and neuroscience students. Students' responses on the questionnaire indicated a generally positive reaction to this form of feedback. Conclusions: Findings suggest that additional formative assessment conveyed by this style and method would be appreciated and valued by students. PMID:17533442

  17. Risk Assessment Methodology for Protecting Our Critical Physical Infrastructures

    SciTech Connect

    BIRINGER,BETTY E.; DANNEELS,JEFFREY J.

    2000-12-13

    Critical infrastructures are central to our national defense and our economic well-being, but many are taken for granted. Presidential Decision Directive (PDD) 63 highlights the importance of eight of our critical infrastructures and outlines a plan for action. Greatly enhanced physical security systems will be required to protect these national assets from new and emerging threats. Sandia National Laboratories has been the lead laboratory for the Department of Energy (DOE) in developing and deploying physical security systems for the past twenty-five years. Many of the tools, processes, and systems employed in the protection of high consequence facilities can be adapted to the civilian infrastructure.

  18. Computer Simulations to Support Science Instruction and Learning: A critical review of the literature

    NASA Astrophysics Data System (ADS)

    Smetana, Lara Kathleen; Bell, Randy L.

    2012-06-01

    Researchers have explored the effectiveness of computer simulations for supporting science teaching and learning during the past four decades. The purpose of this paper is to provide a comprehensive, critical review of the literature on the impact of computer simulations on science teaching and learning, with the goal of summarizing what is currently known and providing guidance for future research. We report on the outcomes of 61 empirical studies dealing with the efficacy of, and implications for, computer simulations in science instruction. The overall findings suggest that simulations can be as effective, and in many ways more effective, than traditional (i.e. lecture-based, textbook-based and/or physical hands-on) instructional practices in promoting science content knowledge, developing process skills, and facilitating conceptual change. As with any other educational tool, the effectiveness of computer simulations is dependent upon the ways in which they are used. Thus, we outline specific research-based guidelines for best practice. Computer simulations are most effective when they (a) are used as supplements; (b) incorporate high-quality support structures; (c) encourage student reflection; and (d) promote cognitive dissonance. Used appropriately, computer simulations involve students in inquiry-based, authentic science explorations. Additionally, as educational technologies continue to evolve, advantages such as flexibility, safety, and efficiency deserve attention.

  19. Assessing Program Impact with the Critical Incident Technique

    ERIC Educational Resources Information Center

    O'Neill, Barbara

    2013-01-01

    The critical incident technique (CIT) is a qualitative research method where subjects are encouraged to tell personal stories that provide descriptive data. Researchers who use the CIT employ a structured methodology to encourage respondents to share their experiences regarding a particular topic. Incidents are considered effective/successful when…

  20. Teaching in the Zone: Formative Assessments for Critical Thinking

    ERIC Educational Resources Information Center

    Maniotes, Leslie K.

    2010-01-01

    This article discusses how a school librarian can help students improve their critical thinking and strengthen their higher order thinking skills through the inquiry process. First, it will use a Guided Inquiry approach to examine how higher order thinking skills are taught within an inquiry paradigm. Next, it will consider how formative…

  1. Assessment of Prospective Teachers' Views Regarding the Concept of Criticism

    ERIC Educational Resources Information Center

    Karakus, Neslihan

    2015-01-01

    Critical thinking is one of the skills included in the Turkish course curriculum that students are expected to acquire. The objective of the study is to determine prospective Turkish teachers' perspectives regarding the concept of criticism, which is both a mental exercise and plays an important role in the world of ideas. In order to assess…

  2. Assess the Critical Period Hypothesis in Second Language Acquisition

    ERIC Educational Resources Information Center

    Du, Lihong

    2010-01-01

    The Critical Period Hypothesis aims to investigate the reason for significant difference between first language acquisition and second language acquisition. Over the past few decades, researchers carried out a series of studies to test the validity of the hypothesis. Although there were certain limitations in these studies, most of their results…

  3. The role of computer modelling in participatory integrated assessments

    SciTech Connect

    Siebenhuener, Bernd . E-mail: bernd.siebenhuener@uni-oldenburg.de; Barth, Volker . E-mail: volker.barth@uni-oldenburg.de

    2005-05-15

    In a number of recent research projects, computer models have been included in participatory procedures to assess global environmental change. The intention was to support knowledge production and to help the involved non-scientists develop a deeper understanding of the interactions between natural and social systems. This paper analyses the experiences gained in three projects with the use of computer models, from a participatory and a risk-management perspective. Our cross-cutting analysis of the objectives, the employed project designs and moderation schemes, and the observed learning processes in participatory processes with model use shows that models play a mixed role in informing participants and stimulating discussions. However, no deeper reflection on values and belief systems could be achieved. In terms of the risk management phases, computer models serve best the purposes of problem definition and option assessment within participatory integrated assessment (PIA) processes.

  4. Assessment of computer game as a psychological stressor.

    PubMed

    Sharma, Ratna; Khera, Shveta; Mohan, Amit; Gupta, Nidhi; Ray, Rooma Basu

    2006-01-01

    To simulate the effects of acute psychological stress, the effects of a stressful computer game on young adult subjects were assessed by various physiological, psychological and biochemical parameters. The results showed a significant increase in the physiological and psychological markers of stress. It is concluded from these results that a computer game can be used as an acute laboratory psychological stressor for future studies on the physiological effects of stress.

  5. Engaging Faculty in the Assessment and Improvement of Students' Critical Thinking Using the Critical Thinking Assessment Test

    ERIC Educational Resources Information Center

    Stein, Barry; Haynes, Ada

    2011-01-01

    Many assessment experts believe it is essential to develop faculty-driven assessment tools in order to engage faculty in meaningful assessment that can improve student learning. Tennessee Technological University (TTU) has been involved in an extended effort during the last ten years to develop, refine, and nationally disseminate an instrument to…

  6. Assessing Critical Thinking: A College's Journey and Lessons Learned

    ERIC Educational Resources Information Center

    Peach, Brian E.; Mukherjee, Arup; Hornyak, Martin

    2007-01-01

    The business college at University of West Florida is currently in the throes of implementing an assessment initiative to develop student learning outcomes, design assessment devices to measure learning, analyze the measurement results to identify learning shortfalls, and establish feedback mechanisms to modify the curriculum to address the…

  7. Assessment of examinations in computer science doctoral education

    NASA Astrophysics Data System (ADS)

    Straub, Jeremy

    2014-01-01

    This article surveys the examination requirements for attaining degree candidate (candidacy) status in computer science doctoral programs at all of the computer science doctoral granting institutions in the United States. It presents a framework for categorizing program examination requirements, and categorizes these programs by the type or types of candidacy examinations that are required. The performance of computer science departments in these different categories of candidacy requirements, estimated via two common surrogate metrics, is compared and contrasted, and the correlation between candidacy requirements and program/department performance is assessed.

  8. Evaluation of the Food and Agriculture Sector Criticality Assessment Tool (FASCAT) and the Collected Data.

    PubMed

    Huff, Andrew G; Hodges, James S; Kennedy, Shaun P; Kircher, Amy

    2015-08-01

    To protect and secure food resources for the United States, it is crucial to have a method to compare food systems' criticality. In 2007, the U.S. government funded development of the Food and Agriculture Sector Criticality Assessment Tool (FASCAT) to determine which food and agriculture systems were most critical to the nation. FASCAT was developed in a collaborative process involving government officials and food industry subject matter experts (SMEs). After development, data were collected using FASCAT to quantify threats, vulnerabilities, consequences, and the impacts on the United States from failure of evaluated food and agriculture systems. To examine FASCAT's utility, linear regression models were used to determine: (1) which groups of questions posed in FASCAT were better predictors of cumulative criticality scores; and (2) whether the items included in FASCAT's criticality method or the smaller subset of FASCAT items included in DHS's risk analysis method predicted similar criticality scores. Akaike's information criterion was used to determine which regression models best described criticality, and a mixed linear model was used to shrink estimates of criticality for individual food and agriculture systems. The results indicated that: (1) some of the questions used in FASCAT strongly predicted food or agriculture system criticality; (2) the FASCAT criticality formula was a stronger predictor of criticality than the DHS risk formula; (3) the cumulative criticality formula predicted criticality more strongly than the weighted criticality formula; and (4) the mixed linear regression model did not change the rank-order of food and agriculture system criticality to a large degree.
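
The AIC-based comparison of regression models described above can be sketched generically (made-up data and a simple one-predictor fit, not the FASCAT models themselves): for Gaussian errors, AIC reduces to n·ln(RSS/n) + 2k, and the candidate model with the lower AIC is preferred.

```python
import math

def ols_rss(x, y):
    """Residual sum of squares of the one-predictor OLS fit y ~ a + b*x."""
    n = len(x)
    mx, my = sum(x) / n, sum(y) / n
    b = sum((xi - mx) * (yi - my) for xi, yi in zip(x, y)) \
        / sum((xi - mx) ** 2 for xi in x)
    a = my - b * mx
    return sum((yi - (a + b * xi)) ** 2 for xi, yi in zip(x, y))

def aic(rss, n, k):
    """Gaussian-likelihood AIC (up to an additive constant) for a linear
    model with k fitted parameters (intercept, slope, error variance)."""
    return n * math.log(rss / n) + 2 * k
```

Given two candidate predictors of a criticality score, one would compute `aic(ols_rss(x1, y), n, 3)` and `aic(ols_rss(x2, y), n, 3)` and keep the predictor with the smaller value.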

  10. Optimal recovery sequencing for critical infrastructure resilience assessment.

    SciTech Connect

    Vugrin, Eric D.; Brown, Nathanael J. K.; Turnquist, Mark Alan

    2010-09-01

    Critical infrastructure resilience has become a national priority for the U. S. Department of Homeland Security. System resilience has been studied for several decades in many different disciplines, but no standards or unifying methods exist for critical infrastructure resilience analysis. This report documents the results of a late-start Laboratory Directed Research and Development (LDRD) project that investigated the identification of optimal recovery strategies that maximize resilience. To this goal, we formulate a bi-level optimization problem for infrastructure network models. In the 'inner' problem, we solve for network flows, and we use the 'outer' problem to identify the optimal recovery modes and sequences. We draw from the literature of multi-mode project scheduling problems to create an effective solution strategy for the resilience optimization model. We demonstrate the application of this approach to a set of network models, including a national railroad model and a supply chain for Army munitions production.
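The 'outer' search over recovery sequences can be illustrated with a brute-force toy (hypothetical component data; the report's actual model couples this outer search to an inner network-flow problem): each ordering of sequential repairs is scored by the service-weighted completion times, i.e. the area above the recovery curve, and the minimizing order is kept.

```python
from itertools import permutations

def best_recovery_sequence(components):
    """Brute-force the repair order that minimizes total lost service.

    components maps a name to (repair_time, service_rate_restored).
    The lost service of an order is the sum over components of
    (completion time of that repair) * (service it restores).
    """
    def lost_service(order):
        t = 0.0
        loss = 0.0
        for name in order:
            duration, rate = components[name]
            t += duration          # repairs run one after another
            loss += t * rate       # this service was down until time t
        return loss
    return min(permutations(components), key=lost_service)
```

For this particular objective the brute-force answer coincides with the classic weighted-shortest-processing-time rule: repair the component with the highest restored-service-per-repair-hour first.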

  11. Investigation of the "Convince Me" Computer Environment as a Tool for Critical Argumentation about Public Policy Issues

    ERIC Educational Resources Information Center

    Adams, Stephen T.

    2003-01-01

    The "Convince Me" computer environment supports critical thinking by allowing users to create and evaluate computer-based representations of arguments. This study investigates theoretical and design considerations pertinent to using "Convince Me" as an educational tool to support reasoning about public policy issues. Among computer environments…

  12. Pain assessment and management in critically ill older adults.

    PubMed

    Kirksey, Kenn M; McGlory, Gayle; Sefcik, Elizabeth F

    2015-01-01

    Older adults comprise approximately 50% of patients admitted to critical care units in the United States. This population is particularly susceptible to multiple morbidities that can be exacerbated by confounding factors like age-related safety risks, polypharmacy, poor nutrition, and social isolation. The elderly are particularly vulnerable to health conditions (heart disease, stroke, and diabetes) that put them at greater risk of morbidity and mortality. When an older adult presents to the emergency department with 1 or more of these life-altering diagnoses, an admission to the intensive care unit is often inevitable. Pain is one of the most pervasive manifestations exhibited by intensive care unit patients. There are myriad challenges for critical care nurses in caring for patients experiencing pain-inadequate communication (cognitively impaired or intubated patients), addressing the concerns of family members, or gaps in patients' knowledge. The purpose of this article was to discuss the multidimensional nature of pain and identify concepts innate to pain homeostenosis for elderly patients in the critical care setting. Evidence-based strategies, including an interprofessional team approach and best practice recommendations regarding pharmacological and nonpharmacological pain management, are presented. PMID:26039645

  13. The Cranfield II Relevance Assessments: A Critical Evaluation

    ERIC Educational Resources Information Center

    Harter, Stephen P.

    1971-01-01

    The relevance assessments belonging to the Cranfield II document/query collection are shown to be faulty, in the sense that many "relevant" documents were not so identified by the Cranfield judges. 9 references. (Author)

  14. Assessment of Teaching Methods and Critical Thinking in a Course for Science Majors

    NASA Astrophysics Data System (ADS)

    Speck, Angela; Ruzhitskaya, L.; Whittington, A. G.

    2014-01-01

    Ability to think critically is a key ingredient to the scientific mindset. Students who take science courses may or may not be predisposed to critical thinking - the ability to evaluate information analytically. Regardless of their starting point, students can significantly improve their critical thinking through learning and practicing their reasoning skills, critical assessments, conducting and reflecting on observations and experiments, building their questioning and communication skills, and through the use of other techniques. While there are several teaching methods that may help improve critical thinking, there are only a few assessment instruments that can help in evaluating the efficacy of these methods. Critical thinking skills and improvement in those skills are notoriously difficult to measure. Assessments that are based on multiple-choice questions demonstrate students' final decisions but not their thinking processes. In addition, during the course of studies students may develop subject-based critical thinking while not being able to extend these skills to general critical thinking. As such, we wanted to design and conduct a study of the efficacy of several teaching methods in which we would learn how students improve their thinking processes within a science discipline as well as in everyday life situations. We conducted a study among 20 astronomy, physics and geology majors -- both graduate and undergraduate students -- enrolled in our Solar System Science course (mostly seniors and early graduate students) at the University of Missouri. We used the Ennis-Weir Critical Thinking Essay test to assess students' general critical thinking and, in addition, we implemented our own subject-based critical thinking assessment. Here, we present the results of this study and share our experience on designing a subject-based critical thinking assessment instrument.

  15. Transfer matrix computation of critical polynomials for two-dimensional Potts models

    DOE PAGESBeta

    Jacobsen, Jesper Lykke; Scullard, Christian R.

    2013-02-04

    We showed, in our previous work, that critical manifolds of the q-state Potts model can be studied by means of a graph polynomial PB(q, v), henceforth referred to as the critical polynomial. This polynomial may be defined on any periodic two-dimensional lattice. It depends on a finite subgraph B, called the basis, and the manner in which B is tiled to construct the lattice. The real roots v = e^K - 1 of PB(q, v) either give the exact critical points for the lattice, or provide approximations that, in principle, can be made arbitrarily accurate by increasing the size of B in an appropriate way. In earlier work, PB(q, v) was defined by a contraction-deletion identity, similar to that satisfied by the Tutte polynomial. Here, we give a probabilistic definition of PB(q, v), which facilitates its computation, using the transfer matrix, on much larger B than was previously possible. We present results for the critical polynomial on the (4, 8^2), kagome, and (3, 12^2) lattices for bases of up to respectively 96, 162, and 243 edges, compared to the limit of 36 edges with contraction-deletion. We discuss in detail the role of the symmetries and the embedding of B. The critical temperatures vc obtained for ferromagnetic (v > 0) Potts models are at least as precise as the best available results from Monte Carlo simulations or series expansions. For instance, with q = 3 we obtain vc(4, 8^2) = 3.742 489 (4), vc(kagome) = 1.876 459 7 (2), and vc(3, 12^2) = 5.033 078 49 (4), the precision being comparable or superior to the best simulation results. More generally, we trace the critical manifolds in the real (q, v) plane and discuss the intricate structure of the phase diagram in the antiferromagnetic (v < 0) region.
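A small numeric sketch of the change of variables quoted in the abstract: since v = e^K - 1, each reported critical root v_c corresponds to a critical coupling K_c = ln(1 + v_c). The q = 3 roots are the values quoted above; the helper function name is ours, not the paper's.

```python
import math

# For the q-state Potts model, the temperature variable is v = exp(K) - 1,
# so a critical root v_c corresponds to a critical coupling K_c = ln(1 + v_c).
def coupling_from_root(vc: float) -> float:
    return math.log(1.0 + vc)

# Critical roots for q = 3 reported in the abstract.
vc_values = {"(4, 8^2)": 3.742489, "kagome": 1.8764597, "(3, 12^2)": 5.03307849}
for lattice, v in vc_values.items():
    print(f"{lattice}: K_c = {coupling_from_root(v):.6f}")
```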

  16. Validation of a scenario-based assessment of critical thinking using an externally validated tool.

    PubMed

    Buur, Jennifer L; Schmidt, Peggy; Smylie, Dean; Irizarry, Kris; Crocker, Carlos; Tyler, John; Barr, Margaret

    2012-01-01

    With medical education transitioning from knowledge-based curricula to competency-based curricula, critical thinking skills have emerged as a major competency. While there are validated external instruments for assessing critical thinking, many educators have created their own custom assessments of critical thinking. However, the face validity of these assessments has not been challenged. The purpose of this study was to compare results from a custom assessment of critical thinking with the results from a validated external instrument of critical thinking. Students from the College of Veterinary Medicine at Western University of Health Sciences were administered a custom assessment of critical thinking (ACT) examination and the externally validated instrument, California Critical Thinking Skills Test (CCTST), in the spring of 2011. Total scores and sub-scores from each exam were analyzed for significant correlations using Pearson correlation coefficients. Significant correlations between ACT Blooms 2 and deductive reasoning and total ACT score and deductive reasoning were demonstrated with correlation coefficients of 0.24 and 0.22, respectively. No other statistically significant correlations were found. The lack of significant correlation between the two examinations illustrates the need in medical education to externally validate internal custom assessments. Ultimately, the development and validation of custom assessments of non-knowledge-based competencies will produce higher quality medical professionals.
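The correlations reported above (0.24 and 0.22) are Pearson correlation coefficients. A minimal sketch of the computation follows; the paired scores are hypothetical stand-ins, since the study's data are not reproduced here.

```python
import math

def pearson_r(x, y):
    """Pearson correlation coefficient between two equal-length score lists."""
    n = len(x)
    mx, my = sum(x) / n, sum(y) / n
    cov = sum((a - mx) * (b - my) for a, b in zip(x, y))
    sx = math.sqrt(sum((a - mx) ** 2 for a in x))
    sy = math.sqrt(sum((b - my) ** 2 for b in y))
    return cov / (sx * sy)

# Hypothetical paired scores: a custom ACT sub-score vs. CCTST deductive reasoning.
act = [12, 15, 9, 14, 11, 16]
cctst = [8, 10, 7, 9, 9, 11]
print(round(pearson_r(act, cctst), 2))
```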

  17. Assessment of Critical Mass Laboratory safeguards and security upgrades

    SciTech Connect

    Merrill, B.J.; DeMyer, J.J.

    1985-05-31

    Pacific Northwest Laboratory (PNL) conducted an evaluation of the safeguards and security systems at the Critical Mass Laboratory (CML) in February 1985, to identify appropriate upgrading actions necessary to ensure that effective and efficient systems consistent with DOE-RL policies, procedures, and site priorities are in place. Since that evaluation, there have been changes in Patrol contingency philosophy, response tactics, and distribution of manpower. Because of these changes, and at the request of DOE-RL, PNL has re-evaluated the safeguards and security systems in place at CML.

  18. How Effective Is Feedback in Computer-Aided Assessments?

    ERIC Educational Resources Information Center

    Gill, Mundeep; Greenhow, Martin

    2008-01-01

    Computer-Aided Assessments (CAAs) have been used increasingly at Brunel University for over 10 years to test students' mathematical abilities. Recently, we have focussed on providing very rich feedback to the students; given the work involved in designing and coding such feedback, it is important to study the impact of the interaction between…

  19. Computer-aided testing of pilot response to critical in-flight events

    NASA Technical Reports Server (NTRS)

    Giffin, W. C.; Rockwell, T. H.

    1984-01-01

    This research on pilot response to critical in-flight events employs a unique methodology including an interactive computer-aided scenario-testing system. Navigation displays, instrument-panel displays, and assorted textual material are presented on a touch-sensitive CRT screen. Problem diagnosis scenarios, destination-diversion scenarios and combined destination/diagnostic tests are available. A complete time history of all data inquiries and responses is maintained. Sample results of diagnosis scenarios obtained from testing 38 licensed pilots are presented and discussed.

  20. Assessment of Critical Business Skill Development by MBA Alumni

    ERIC Educational Resources Information Center

    Glynn, Joseph G.; Wood, Gregory R.

    2008-01-01

    Six years of survey data were analyzed to assess, among other things, the degree to which an AACSB accredited graduate business program successfully developed student skills in a variety of areas deemed important for career success. The study illustrates a methodology institutions can use to respond to increasing demands for program evaluation and…

  1. Assessing Preservice Teachers' Dispositions: A Critical Dimension of Professional Preparation

    ERIC Educational Resources Information Center

    Rike, Cheryl J.; Sharp, L. Kathryn

    2008-01-01

    The early childhood faculty at the University of Memphis developed the Early Childhood Education Behaviors & Dispositions Checklist for four main purposes: (1) The faculty needed a way to clearly communicate to students the expectations for their dispositions and the means of assessment; (2) It is a professional obligation in preservice teacher…

  2. Critical Issues in Assessing Teacher Compensation. Backgrounder. No. 2638

    ERIC Educational Resources Information Center

    Richwine, Jason; Biggs, Andrew G.

    2012-01-01

    A November 2011 Heritage Foundation report--"Assessing the Compensation of Public-School Teachers"--presented data on teacher salaries and benefits in order to inform debates about teacher compensation reform. The report concluded that public-school teacher compensation is far ahead of what comparable private-sector workers enjoy, and that…

  3. Needs Assessment: A Critical Tool for Guidance Planning.

    ERIC Educational Resources Information Center

    Martin, Susan A.

    This study was conducted to identify what elementary school staff and district parents believed to be important elementary guidance services. A needs assessment questionnaire was given to all 112 staff members (principals, teaching staff, teacher aides, secretaries, and school nurse) in the district's 2 elementary schools. Fifty-eight completed…

  4. Critical Inquiry and Writing Centers: A Methodology of Assessment

    ERIC Educational Resources Information Center

    Bell, Diana Calhoun; Frost, Alanna

    2012-01-01

    By examining one writing center's role in student success, this project offers two examples of the way writing centers impact student engagement. This analysis models a methodology that writing and learning center directors can utilize in order to foster effective communication with stakeholders. By conducting data-driven assessment, directors can…

  5. A critical review of seven selected neighborhood sustainability assessment tools

    SciTech Connect

    Sharifi, Ayyoob; Murayama, Akito

    2013-01-15

    Neighborhood sustainability assessment tools have become widespread since the turn of the 21st century and many communities, mainly in the developed world, are utilizing these tools to measure their success in approaching sustainable development goals. In this study, seven tools from Australia, Europe, Japan, and the United States are selected and analyzed with the aim of providing insights into the current situations; highlighting the strengths, weaknesses, successes, and failures; and making recommendations for future improvements. Using a content analysis, the issues of sustainability coverage, pre-requisites, local adaptability, scoring and weighting, participation, reporting, and applicability are discussed in this paper. The results of this study indicate that most of the tools are not doing well regarding the coverage of social, economic, and institutional aspects of sustainability; there are ambiguities and shortcomings in the weighting, scoring, and rating; in most cases, there is no mechanism for local adaptability and participation; and only those tools which are embedded within the broader planning framework are doing well with regard to applicability. Highlights: • Seven widely used assessment tools were analyzed. • There is a lack of balanced assessment of sustainability dimensions. • Tools are not doing well regarding the applicability. • Refinements are needed to make the tools more effective. • Assessment tools must be integrated into the planning process.

  6. Embodied cognition and mirror neurons: a critical assessment.

    PubMed

    Caramazza, Alfonso; Anzellotti, Stefano; Strnad, Lukas; Lingnau, Angelika

    2014-01-01

    According to embodied cognition theories, higher cognitive abilities depend on the reenactment of sensory and motor representations. In the first part of this review, we critically analyze the central claims of embodied theories and argue that the existing behavioral and neuroimaging data do not allow investigators to discriminate between embodied cognition and classical cognitive accounts, which assume that conceptual representations are amodal and symbolic. In the second part, we review the main claims and the core electrophysiological findings typically cited in support of the mirror neuron theory of action understanding, one of the most influential examples of embodied cognition theories. In the final part, we analyze the claim that mirror neurons subserve action understanding by mapping visual representations of observed actions on motor representations, trying to clarify in what sense the representations carried by these neurons can be claimed to be motor.

  7. Prospective evaluation of an internet-linked handheld computer critical care knowledge access system

    PubMed Central

    Lapinsky, Stephen E; Wax, Randy; Showalter, Randy; Martinez-Motta, J Carlos; Hallett, David; Mehta, Sangeeta; Burry, Lisa; Stewart, Thomas E

    2004-01-01

    Introduction Critical care physicians may benefit from immediate access to medical reference material. We evaluated the feasibility and potential benefits of a handheld computer-based knowledge access system linking a central academic intensive care unit (ICU) to multiple community-based ICUs. Methods Four community hospital ICUs with 17 physicians participated in this prospective interventional study. Following training in the use of an internet-linked, updateable handheld computer knowledge access system, the physicians used the handheld devices in their clinical environment for a 12-month intervention period. Feasibility of the system was evaluated by tracking use of the handheld computer and by conducting surveys and focus group discussions. Before and after the intervention period, participants underwent simulated patient care scenarios designed to evaluate the information sources they accessed, as well as the speed and quality of their decision making. Participants generated admission orders during each scenario, which were scored by blinded evaluators. Results Ten physicians (59%) used the system regularly, predominantly for nonmedical applications (median 32.8/month, interquartile range [IQR] 28.3–126.8), with medical software accessed less often (median 9/month, IQR 3.7–13.7). Eight out of 13 physicians (62%) who completed the final scenarios chose to use the handheld computer for information access. The median time to access information on the handheld computer was 19 s (IQR 15–40 s). This group exhibited a significant improvement in admission order score as compared with those who used other resources (P = 0.018). Benefits and barriers to use of this technology were identified. Conclusion An updateable handheld computer system is feasible as a means of point-of-care access to medical reference material and may improve clinical decision making. However, during the study, acceptance of the system was variable. Improved training and new

  8. Scoring Systems in Assessing Survival of Critically Ill ICU Patients

    PubMed Central

    Sekulic, Ana D.; Trpkovic, Sladjana V.; Pavlovic, Aleksandar P.; Marinkovic, Olivera M.; Ilic, Aleksandra N.

    2015-01-01

    Background The aim of this study was to determine which of the most commonly used scoring systems for evaluation of critically ill patients in the ICU is the best and simplest to use in our hospital. Material/Methods This prospective study included 60 critically ill patients. After admittance to the ICU, APACHE II, SAPS II, and MPM II-0 were calculated. During further treatment in the ICU, SOFA and MPM II were calculated at 24 h, 48 h, and 72 h and 7 days after admittance using laboratory and radiological measures. Results In comparison with survivors, non-survivors were older (p<0.01) and spent significantly more days on mechanical ventilation (p<0.01). ARDS was significantly more common in patients who survived compared to those who did not (chi-square=7.02, p<0.01), which is not the case with sepsis (chi-square=0.388, p=0.53). The AUROC for SAPS II was 0.690, only slightly higher than those of the other 2 initial scoring systems, MPM II-0 and APACHE II (0.654 and 0.623). The APACHE II has the highest specificity (81.8%) and MPM II the highest sensitivity (85.2%). MPM II-7day AUROC (1.0) shows the best discrimination between patients who survived and those who did not. MPM II-48 (0.836), SOFA-72 (0.821) and MPM II-72 (0.817) also had good discrimination scores. Conclusions APACHE II and SAPS II measured on admission to the ICU were significant predictors of complications. MPM II-7day has the best discriminatory power, followed by SOFA-7day and MPM II-48. MPM II-7day has the best calibration followed by SOFA-7day and APACHE II. PMID:26336861
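The AUROC values reported in this abstract can be computed with a simple rank-based estimator: the probability that a randomly chosen patient with the outcome of interest (label 1, e.g. death) has a higher severity score than a randomly chosen patient without it, with ties counting one-half. This is an illustrative sketch, not the study's actual software.

```python
def auroc(scores, labels):
    """AUROC via pairwise comparison: P(score_pos > score_neg), ties count 0.5.
    labels: 1 for the outcome of interest, 0 otherwise."""
    pos = [s for s, l in zip(scores, labels) if l == 1]
    neg = [s for s, l in zip(scores, labels) if l == 0]
    wins = sum(1.0 if p > n else 0.5 if p == n else 0.0
               for p in pos for n in neg)
    return wins / (len(pos) * len(neg))

# Hypothetical severity scores and outcomes (1 = died, 0 = survived).
print(auroc([0.9, 0.8, 0.3, 0.2], [1, 1, 0, 0]))
```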

  9. Prediction of State Mandated Assessment Mathematics Scores from Computer Based Mathematics and Reading Preview Assessments

    ERIC Educational Resources Information Center

    Costa-Guerra, Boris

    2012-01-01

    The study sought to understand whether MAPs computer based assessment of math and language skills using MAPs reading scores can predict student scores on the NMSBA. A key question was whether or not the prediction can be improved by including student language skill scores. The study explored the effectiveness of computer based preview assessments…

  10. Ecological risk assessment of acidification in the Northern Eurasia using critical load concept

    SciTech Connect

    Bashkin, V.; Golinets, O.

    1995-12-31

    This research presents a risk analysis of acid-forming compound inputs using critical load (CL) values of sulfur, nitrogen, and acidity, based on computer calculations for terrestrial and freshwater ecosystems of Northern Eurasia. The CL values are used to set goals for future deposition rates of acidifying and eutrophying compounds so that the environment is protected. CL values for various ecosystems are determined using an EM GIS approach. The most influential sources, such as nitrogen, sulfur, and base cation uptake by vegetation, and surface and groundwater leaching from terrestrial to freshwater ecosystems, are described for the whole territory under study with regard to uncertainty analysis and the level of corresponding risk assessment. This may be explained by many factors, of which the most important are: the estimation of plant uptake is carried out on the basis of data on the biogeochemical cycling of various elements, for which adequate quantitative characterization for all ecosystems under study is either absent or insufficient; reliable information on the quantitative assessment of the ratio between perennial plant biomass increase and dead matter is absent at the required level of spatial and temporal resolution; reliable data on surface and underground runoff in various ecosystems are rare; and the influence of hydrothermic factors on the above-mentioned processes has not been quantitatively determined at the required level of model resolution.

  11. Assessing computer waste generation in Chile using material flow analysis.

    PubMed

    Steubing, Bernhard; Böni, Heinz; Schluep, Mathias; Silva, Uca; Ludwig, Christian

    2010-03-01

    The quantities of e-waste are expected to increase sharply in Chile. The purpose of this paper is to provide a quantitative data basis on generated e-waste quantities. A material flow analysis was carried out assessing the generation of e-waste from computer equipment (desktop and laptop PCs as well as CRT and LCD-monitors). Import and sales data were collected from the Chilean Customs database as well as from publications by the International Data Corporation. A survey was conducted to determine consumers' choices with respect to storage, re-use and disposal of computer equipment. The generation of e-waste was assessed in a baseline as well as upper and lower scenarios until 2020. The results for the baseline scenario show that about 10,000 and 20,000 tons of computer waste may be generated in the years 2010 and 2020, respectively. The cumulative e-waste generation will be four to five times higher in the upcoming decade (2010-2019) than during the current decade (2000-2009). By 2020, the shares of LCD-monitors and laptops will increase more rapidly replacing other e-waste including the CRT-monitors. The model also shows the principal flows of computer equipment from production and sale to recycling and disposal. The re-use of computer equipment plays an important role in Chile. An appropriate recycling scheme will have to be introduced to provide adequate solutions for the growing rate of e-waste generation.
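Material flow analyses of this kind typically convolve past sales with a lifespan (discard-delay) distribution to estimate waste generation. The paper's exact model and parameters are not given here; the sketch below is a generic sales-delay model with illustrative numbers.

```python
def waste_generated(sales, lifespan_pmf):
    """sales[t] = units sold in year t; lifespan_pmf[k] = P(discard after k years).
    Returns waste[t] = sum over k of sales[t - k] * lifespan_pmf[k]."""
    horizon = len(sales)
    waste = [0.0] * horizon
    for t in range(horizon):
        for k, p in enumerate(lifespan_pmf):
            if 0 <= t - k < horizon:
                waste[t] += sales[t - k] * p
    return waste

sales = [100, 120, 150, 180, 200]      # units sold per year (illustrative)
lifespan = [0.0, 0.1, 0.3, 0.4, 0.2]   # discard probability by age in years
print(waste_generated(sales, lifespan))
```

Storage and re-use, which the survey found important in Chile, would enter such a model as a longer effective lifespan distribution.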

  12. CART V: recent advancements in computer-aided camouflage assessment

    NASA Astrophysics Data System (ADS)

    Müller, Thomas; Müller, Markus

    2011-05-01

    In order to facilitate systematic, computer-aided improvements of camouflage and concealment assessment methods, the software system CART (Camouflage Assessment in Real-Time) was built up for the camouflage assessment of objects in multispectral image sequences (see contributions to SPIE 2007-2010 [1], [2], [3], [4]). It comprises a semi-automatic marking of target objects (ground truth generation) including their propagation over the image sequence and the evaluation via user-defined feature extractors, as well as methods to assess the object's movement conspicuity. In this fifth part in an annual series at the SPIE conference in Orlando, this paper presents the enhancements over the recent year and addresses the camouflage assessment of static and moving objects in multispectral image data that can show noise or image artefacts. The presented methods explore the correlations between image processing and camouflage assessment. A novel algorithm is presented, based on template matching, to assess the structural inconspicuity of an object objectively and quantitatively. The results can easily be combined with an MTI (moving target indication) based movement conspicuity assessment function in order to explore the influence of object movement on a camouflage effect in different environments. As the results show, the presented methods provide a significant benefit in the field of camouflage assessment.
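The structural-inconspicuity algorithm above is described only as "based on template matching"; the paper's actual method is not reproduced here. As a generic illustration of the core operation, the following sketch does exhaustive template matching by sum of squared differences (SSD) in pure Python.

```python
def best_match(image, template):
    """Exhaustive template matching by sum of squared differences (SSD).
    image, template: 2D lists of pixel intensities.
    Returns (row, col) of the best-matching top-left position."""
    ih, iw = len(image), len(image[0])
    th, tw = len(template), len(template[0])
    best, best_pos = float("inf"), (0, 0)
    for r in range(ih - th + 1):
        for c in range(iw - tw + 1):
            ssd = sum((image[r + i][c + j] - template[i][j]) ** 2
                      for i in range(th) for j in range(tw))
            if ssd < best:
                best, best_pos = ssd, (r, c)
    return best_pos
```

A well-camouflaged object would yield many near-equal SSD minima across the background, i.e. no single conspicuous match.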

  13. Assessment technique for computer-aided manufactured sockets.

    PubMed

    Sanders, Joan E; Severance, Michael R

    2011-01-01

    This article presents an assessment technique for testing the quality of prosthetic socket fabrication processes at computer-aided manufacturing facilities. The assessment technique is potentially useful to both facilities making sockets and companies marketing manufacturing equipment seeking to assess and improve product quality. To execute the assessment technique, an evaluator fabricates a collection of test models and sockets using the manufacturing suite under evaluation, then measures their shapes using scanning equipment. Overall socket quality is assessed by comparing socket shapes with electronic file (e-file) shapes. To characterize carving performance, model shapes are compared with e-file shapes. To characterize forming performance, socket shapes are compared with model shapes. The mean radial error (MRE), which is the average difference in radii between the two compared shapes, provides insight into sizing quality. Interquartile range (IQR), the range of radial error for the best-matched half of the points on the compared socket surfaces, provides insight into regional shape quality. The source(s) of socket shape error may be pinpointed by separately determining MRE and IQR for carving and forming. The developed assessment technique may provide a useful tool to the prosthetics community and industry to help identify problems and limitations in computer-aided manufacturing and give insight into appropriate modifications to overcome them. PMID:21938663
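The MRE and IQR metrics defined in this abstract read almost directly as code. A minimal sketch follows, under the assumption that the two shapes are sampled at corresponding surface points; the function names are ours.

```python
def shape_errors(measured_radii, reference_radii):
    """Signed radial errors between two shapes sampled at corresponding points."""
    return [m - r for m, r in zip(measured_radii, reference_radii)]

def mre(errors):
    """Mean radial error: the average difference in radii between the shapes."""
    return sum(errors) / len(errors)

def iqr(errors):
    """Range of radial error over the best-matched half of the points
    (the half with the smallest absolute error), per the definition above."""
    best_half = sorted(errors, key=abs)[: max(1, len(errors) // 2)]
    return max(best_half) - min(best_half)
```

A positive MRE would indicate the fabricated socket is oversized overall, while a large IQR flags regional shape deviations even when overall sizing is good.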

  15. FORTRAN 4 computer program for calculating critical speeds of rotating shafts

    NASA Technical Reports Server (NTRS)

    Trivisonno, R. J.

    1973-01-01

    A FORTRAN 4 computer program, written for the IBM DCS 7094/7044 computer, that calculates the critical speeds of rotating shafts is described. The shaft may include bearings, couplings, extra masses (nonshaft mass), and disks for the gyroscopic effect. Shear deflection is also taken into account, and provision is made in the program for sections of the shaft that are tapered. The boundary conditions at the ends of the shaft can be fixed (deflection and slope equal to zero) or free (shear and moment equal to zero). The fixed end condition enables the program to calculate the natural frequencies of cantilever beams. Instead of using the lumped-parameter method, the program uses continuous integration of the differential equations of beam flexure across different shaft sections. The advantages of this method over the usual lumped-parameter method are less data preparation and better approximation of the distribution of the mass of the shaft. A main feature of the program is the nature of the output. The Calcomp plotter is used to produce a drawing of the shaft with superimposed deflection curves at the critical speeds, together with all pertinent information related to the shaft.
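The program above integrates the differential equations of beam flexure across shaft sections; reproducing that is beyond a sketch. As a much simpler point of reference (a textbook special case, not the report's method), the closed-form bending critical speeds of a uniform, simply supported Euler-Bernoulli shaft are w_n = (n*pi/L)^2 * sqrt(E*I/(rho*A)). The shaft dimensions below are illustrative.

```python
import math

def critical_speeds_uniform_shaft(E, I, rho, A, L, n_modes=3):
    """Closed-form bending critical speeds (rad/s) of a uniform, simply
    supported Euler-Bernoulli shaft: w_n = (n*pi/L)^2 * sqrt(E*I/(rho*A))."""
    return [(n * math.pi / L) ** 2 * math.sqrt(E * I / (rho * A))
            for n in range(1, n_modes + 1)]

# Illustrative steel shaft: 50 mm diameter, 1 m span.
d, L = 0.05, 1.0
A = math.pi * d ** 2 / 4          # cross-sectional area
I = math.pi * d ** 4 / 64         # area moment of inertia
speeds = critical_speeds_uniform_shaft(E=200e9, I=I, rho=7850, A=A, L=L)
print([f"{w:.0f} rad/s" for w in speeds])
```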

  16. Assessment of optical localizer accuracy for computer aided surgery systems.

    PubMed

    Elfring, Robert; de la Fuente, Matías; Radermacher, Klaus

    2010-01-01

    The technology for localization of surgical tools with respect to the patient's reference coordinate system in three to six degrees of freedom is one of the key components in computer aided surgery. Several tracking methods are available, of which optical tracking is the most widespread in clinical use. Optical tracking technology has proven to be a reliable method for intra-operative position and orientation acquisition in many clinical applications; however, the accuracy of such localizers is still a topic of discussion. In this paper, the accuracy of three optical localizer systems, the NDI Polaris P4, the NDI Polaris Spectra (in active and passive mode) and the Stryker Navigation System II camera, is assessed and compared critically. Static tests revealed that only the Polaris P4 shows significant warm-up behavior, with a significant shift of accuracy being observed within 42 minutes of being switched on. Furthermore, the intrinsic localizer accuracy was determined for single markers as well as for tools using a volumetric measurement protocol on a coordinate measurement machine. To determine the relative distance error within the measurement volume, the Length Measurement Error (LME) was determined at 35 test lengths. As accuracy depends strongly on the marker configuration employed, the error to be expected in typical clinical setups was estimated in a simulation for different tool configurations. The two active localizer systems, the Stryker Navigation System II camera and the Polaris Spectra (active mode), showed the best results, with trueness values (mean +/- standard deviation) of 0.058 +/- 0.033 mm and 0.089 +/- 0.061 mm, respectively. The Polaris Spectra (passive mode) showed a trueness of 0.170 +/- 0.090 mm, and the Polaris P4 showed the lowest trueness at 0.272 +/- 0.394 mm with a higher number of outliers than for the other cameras. 
The simulation of the different tool configurations in a typical clinical setup revealed that the tracking error can
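A sketch of the two accuracy measures mentioned in this abstract. Note the assumptions: reading the Length Measurement Error as the signed difference per test length, and "trueness" as the mean and standard deviation of absolute errors, are our interpretations, not necessarily the paper's exact definitions.

```python
import math

def length_measurement_error(measured, reference):
    """LME per test length: signed difference between measured and reference lengths."""
    return [m - r for m, r in zip(measured, reference)]

def trueness(errors):
    """Mean and (population) standard deviation of absolute errors."""
    abs_err = [abs(e) for e in errors]
    mean = sum(abs_err) / len(abs_err)
    var = sum((e - mean) ** 2 for e in abs_err) / len(abs_err)
    return mean, math.sqrt(var)
```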

  17. Standards for chemical quality of drinking water: a critical assessment.

    PubMed

    Zielhuis, R L

    1982-01-01

    The author critically reviews present standards for the chemical quality of drinking water, particularly the limits proposed by the Commission of the European Communities (CEC) in 1979. In particular, the general principles of standard setting are discussed. It appears that there exists a surprisingly high similarity in drinking water limits issued by various national and international authorities, although for other environmental compartments important discrepancies exist. Usually, drinking water limits lack adequate documentation, and often appear to be copied from other existing lists. There is an apparent lack of logical consistency in limits set for food, ambient or workroom air, and drinking water, probably due to lack of communication between health experts and decision-making authorities. Moreover, there is a lack of toxicologic studies explicitly aimed at setting limits. Extrapolation from the acceptable daily intakes (ADI) for food or the Threshold Limit Value (TLV)-Maximum Acceptable Concentration (MAC) for workroom air could be undertaken to derive tentative drinking water limits, as long as explicitly designed studies for drinking water are not yet available. PMID:6749691

  18. Nuclear criticality safety assessment of the proposed CFC replacement coolants

    SciTech Connect

    Jordan, W.C.; Dyer, H.R.

    1993-12-01

    The neutron multiplication characteristics of refrigerant-114 (R-114) and the proposed replacement coolants perfluorobutane (C₄F₁₀) and cycloperfluorobutane (C₄F₈) have been compared by evaluating the infinite media multiplication factors of UF₆/H/coolant systems and by replacement calculations considering a 10-MW freezer/sublimer. The results of these comparisons demonstrate that R-114 is a neutron absorber, due to its chlorine content, and that the alternative fluorocarbon coolants are neutron moderators. Estimates of critical spherical geometries considering mixtures of UF₆/HF/C₄F₁₀ indicate that the fluorocarbon-moderated systems are large compared with water-moderated systems. The freezer/sublimer calculations indicate that the alternative coolants are more reactive than R-114, but that the reactivity remains significantly below the condition of water in the tubes, which was a limiting condition. Based on these results, the alternative coolants appear to be acceptable; however, several follow-up tasks have been recommended, and additional evaluation will be required on an individual equipment basis.

  19. A conceptual framework for developing a critical thinking self-assessment scale.

    PubMed

    Nair, Girija G; Stamler, Lynnette Leeseberg

    2013-03-01

    Nurses must be talented critical thinkers to cope with the challenges related to the ever-changing health care system, population trends, and extended role expectations. Several countries now recognize critical thinking skills (CTS) as an expected outcome of nursing education programs. Critical thinking has been defined in multiple ways by philosophers, critical thinking experts, and educators. Nursing experts conceptualize critical thinking as a process involving cognitive and affective domains of reasoning. Nurse educators are often challenged with teaching and measuring CTS because of their latent nature and the lack of a uniform definition of the concept. In this review of the critical thinking literature, we examine various definitions, identify a set of constructs that define critical thinking, and suggest a conceptual framework on which to base a self-assessment scale for measuring CTS.

  20. Content Validation and Utility of a Critical Reflective Inquiry Assessment Tool.

    PubMed

    Asselin, Marilyn E; Fain, James A

    2016-01-01

A Critical Reflective Inquiry (CRI) Assessment Tool was developed based on the CRI Model to assess reflection in nursing practice. Experienced clinicians evaluated the CRI Assessment Tool for clarity, relevance to the CRI Model, and utility for nursing practice. A content validity index was calculated for each item in the scale and then averaged across all items. The tool has potential in education and orientation for assessing the depth and focus of reflection and what is learned. PMID:27648898

  1. Criticality Model

    SciTech Connect

    A. Alsaed

    2004-09-14

A computational method will be used for evaluating the criticality potential of configurations of fissionable materials (in-package and external to the waste package) within the repository at Yucca Mountain, Nevada, for all waste packages/waste forms. The criticality computational method is also applicable to preclosure configurations. The criticality computational method is a component of the methodology presented in ''Disposal Criticality Analysis Methodology Topical Report'' (YMP 2003). How the criticality computational method fits into the overall disposal criticality analysis methodology is illustrated in Figure 1 (YMP 2003, Figure 3). This calculation will not provide direct input to the total system performance assessment for license application. It is to be used as necessary to determine the criticality potential of configuration classes as determined by the configuration probability analysis of the configuration generator model (BSC 2003a).

  2. [Risk assessment for pressure ulcer in critical patients].

    PubMed

    Gomes, Flávia Sampaio Latini; Bastos, Marisa Antonini Ribeiro; Matozinhos, Fernanda Penido; Temponi, Hanrieti Rotelli; Velásquez-Meléndez, Gustavo

    2011-04-01

Bedridden patients are at risk of developing pressure ulcers and represent a priority group to be studied to identify this condition. Specific instruments are used to assess this problem. The objective of this study was to analyze the risk factors for developing pressure ulcers in adult patients hospitalized in ICUs. This is a cross-sectional analytical study in which 140 patients, hospitalized in 22 ICUs, were evaluated using the Braden scale. Results showed that patients hospitalized for 15 days or more showed some level of risk. The highest frequencies of pressure ulcers were found in patients in the following categories: sensorial perception (completely limited), moistness (constantly moist), mobility (completely immobilized), activity (bedridden), nutrition (adequate), and friction and shear (problem). In conclusion, the use of this scale is an important strategy when providing care to patients in intensive care.
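The Braden scale used in the study sums six subscale ratings (friction/shear is rated 1-3, the other five 1-4), with lower totals indicating higher pressure-ulcer risk. A minimal sketch; the risk bands below are commonly cited cut-offs, not thresholds reported by this study, so they should be checked against the local protocol:

```python
def braden_total(scores):
    """Total Braden score from the six subscale ratings (sensory
    perception, moisture, activity, mobility, nutrition, friction/shear).
    Lower totals mean higher pressure-ulcer risk (range 6-23)."""
    assert len(scores) == 6
    return sum(scores)

def braden_risk(total):
    """Map a total to a risk band using commonly cited cut-offs
    (an assumption here, not taken from this study)."""
    if total <= 9:
        return "very high"
    if total <= 12:
        return "high"
    if total <= 14:
        return "moderate"
    if total <= 18:
        return "mild"
    return "not at risk"

# A profile like the high-frequency categories in the abstract:
# completely limited (1), constantly moist (1), bedridden (1),
# completely immobilized (1), adequate nutrition (3), problem (1)
print(braden_risk(braden_total([1, 1, 1, 1, 3, 1])))  # -> very high
```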

  3. Complexity theory and geographies of health: a critical assessment.

    PubMed

    Gatrell, Anthony C

    2005-06-01

    The interest of social scientists in complexity theory has developed rapidly in recent years. Here, I consider briefly the primary characteristics of complexity theory, with particular emphasis given to relations and networks, non-linearity, emergence, and hybrids. I assess the 'added value' compared with other, existing perspectives that emphasise relationality and connectedness. I also consider the philosophical underpinnings of complexity theory and its reliance on metaphor. As a vehicle for moving away from reductionist accounts, complexity theory potentially has much to say to those interested in research on health inequalities, spatial diffusion, emerging and resurgent infections, and risk. These and other applications in health geography that have invoked complexity theory are examined in the paper. Finally, I consider some of the missing elements in complexity theory and argue that while it is refreshing to see a fruitful line of theoretical debate in health geography, we need good empirical work to illuminate it.

  4. Assessing the Amazon Cloud Suitability for CLARREO's Computational Needs

    NASA Technical Reports Server (NTRS)

    Goldin, Daniel; Vakhnin, Andrei A.; Currey, Jon C.

    2015-01-01

In this document we compare the performance of the Amazon Web Services (AWS), also known as Amazon Cloud, with the CLARREO (Climate Absolute Radiance and Refractivity Observatory) cluster and assess its suitability for the computational needs of the CLARREO mission. A benchmark executable to process one month and one year of PARASOL (Polarization and Anisotropy of Reflectances for Atmospheric Sciences coupled with Observations from a Lidar) data was used. With the optimal AWS configuration, adequate data-processing times, comparable to the CLARREO cluster, were found. The assessment of alternatives to the CLARREO cluster continues, and several options, such as a NASA-based cluster, are being considered.

  5. Assessment methodology for computer-based instructional simulations.

    PubMed

    Koenig, Alan; Iseli, Markus; Wainess, Richard; Lee, John J

    2013-10-01

    Computer-based instructional simulations are becoming more and more ubiquitous, particularly in military and medical domains. As the technology that drives these simulations grows ever more sophisticated, the underlying pedagogical models for how instruction, assessment, and feedback are implemented within these systems must evolve accordingly. In this article, we review some of the existing educational approaches to medical simulations, and present pedagogical methodologies that have been used in the design and development of games and simulations at the University of California, Los Angeles, Center for Research on Evaluation, Standards, and Student Testing. In particular, we present a methodology for how automated assessments of computer-based simulations can be implemented using ontologies and Bayesian networks, and discuss their advantages and design considerations for pedagogical use.

  6. Transfer matrix computation of critical polynomials for two-dimensional Potts models

    SciTech Connect

    Jacobsen, Jesper Lykke; Scullard, Christian R.

    2013-02-04

We showed, in our previous work, that critical manifolds of the q-state Potts model can be studied by means of a graph polynomial PB(q, v), henceforth referred to as the critical polynomial. This polynomial may be defined on any periodic two-dimensional lattice. It depends on a finite subgraph B, called the basis, and on the manner in which B is tiled to construct the lattice. The real roots v = e^K - 1 of PB(q, v) either give the exact critical points for the lattice, or provide approximations that, in principle, can be made arbitrarily accurate by increasing the size of B in an appropriate way. In earlier work, PB(q, v) was defined by a contraction-deletion identity, similar to that satisfied by the Tutte polynomial. Here, we give a probabilistic definition of PB(q, v), which facilitates its computation, using the transfer matrix, on much larger B than was previously possible. We present results for the critical polynomial on the (4, 8^2), kagome, and (3, 12^2) lattices for bases of up to 96, 162, and 243 edges, respectively, compared to the limit of 36 edges with contraction-deletion. We discuss in detail the role of the symmetries and of the embedding of B. The critical temperatures vc obtained for ferromagnetic (v > 0) Potts models are at least as precise as the best available results from Monte Carlo simulations or series expansions. For instance, with q = 3 we obtain vc(4, 8^2) = 3.742 489(4), vc(kagome) = 1.876 459 7(2), and vc(3, 12^2) = 5.033 078 49(4), the precision being comparable or superior to the best simulation results. More generally, we trace the critical manifolds in the real (q, v) plane and discuss the intricate structure of the phase diagram in the antiferromagnetic (v < 0) region.
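The contraction-deletion identity mentioned in the abstract also defines the Potts partition function itself, Z(G; q, v) = Z(G - e; q, v) + v Z(G / e; q, v), with a loop contributing a factor (1 + v). A minimal sketch of that recursion (the function name and graph encoding are illustrative, not the authors' code):

```python
def potts_Z(vertices, edges, q, v):
    """Potts partition function Z(G; q, v) by contraction-deletion:
    Z(G) = Z(G - e) + v * Z(G / e); a loop contributes a factor (1 + v).
    vertices: set of labels; edges: list of (u, w) pairs (multigraph)."""
    if not edges:
        return q ** len(vertices)          # base case: q per isolated vertex
    (u, w), rest = edges[0], edges[1:]
    if u == w:                             # loop: endpoints already identified
        return (1 + v) * potts_Z(vertices, rest, q, v)
    deleted = potts_Z(vertices, rest, q, v)            # drop the edge
    relabel = lambda x: u if x == w else x             # merge w into u
    contracted = potts_Z(vertices - {w},
                         [(relabel(a), relabel(b)) for a, b in rest], q, v)
    return deleted + v * contracted

# Triangle K3: Z = q^3 + 3*v*q^2 + 3*v^2*q + v^3*q
print(potts_Z({1, 2, 3}, [(1, 2), (2, 3), (1, 3)], 3, 2))  # -> 141
```

This exponential-time recursion reproduces the subset expansion exactly, but it is infeasible for bases of 96-243 edges; the authors' transfer-matrix formulation exists precisely to push past that limit.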

  7. Cogeneration computer model assessment: Advanced cogeneration research study

    NASA Technical Reports Server (NTRS)

    Rosenberg, L.

    1983-01-01

Cogeneration computer simulation models were assessed in order to recommend the most desirable models or model components for use by the Southern California Edison Company (SCE) in evaluating potential cogeneration projects. Existing cogeneration modeling capabilities are described, preferred models are identified, and an approach to the development of a code which will best satisfy SCE requirements is recommended. Five models (CELCAP, COGEN 2, CPA, DEUS, and OASIS) are recommended for further consideration.

  8. Use of writing portfolios for interdisciplinary assessment of critical thinking outcomes of nursing students.

    PubMed

    Sorrell, J M; Brown, H N; Silva, M C; Kohlenberg, E M

    1997-01-01

This article discusses an interdisciplinary research project in which faculty from nursing and English collaborated in the assessment of students' critical thinking skills as reflected in writing portfolios. Faculty at two different universities reviewed students' writing portfolios and then corresponded via email about evidence of critical thinking in the portfolios. Findings suggest that writing portfolios can provide important evidence of critical thinking outcomes. To do this, however, faculty need to design writing assignments to foster critical thinking skills, helping students to think not only about learning to write, but also about using writing to learn.

  9. Need Assessment of Computer Science and Engineering Graduates

    NASA Astrophysics Data System (ADS)

    Surakka, Sami; Malmi, Lauri

    2005-06-01

This case study considered the syllabus of the first- and second-year studies in computer science. The aim of the study was to reveal which topics covered in the syllabi were really needed during the following years of study or in working life. The program assessed in the study was a Master's program in computer science and engineering at a university of technology in Finland. The necessity of different subjects for the advanced studies (years 3-5) and for working life was assessed using four content analyses: (a) the course catalog of the institution where this study was carried out, (b) employment reports that were attached to applications for internship credits, (c) master's theses, and (d) job advertisements in a newspaper. The results of the study imply that the necessity of physics for advanced study and work was very low compared to the extent to which it was studied. On the other hand, the necessity for mathematics was moderate, and it had remained quite steady during the period 1989-2002. The most necessary computer science topic was programming. Telecommunications and networking was also needed often, whereas theoretical computer science was needed quite rarely.

  10. Prediction of critical heat flux in water-cooled plasma facing components using computational fluid dynamics.

    SciTech Connect

    Bullock, James H.; Youchison, Dennis Lee; Ulrickson, Michael Andrew

    2010-11-01

Several commercial computational fluid dynamics (CFD) codes now have the capability to analyze Eulerian two-phase flow using the Rohsenow nucleate boiling model. Analysis of boiling due to one-sided heating in plasma facing components (pfcs) is now receiving attention during the design of water-cooled first wall panels for ITER that may encounter heat fluxes as high as 5 MW/m2. Empirical thermal-hydraulic design correlations developed for long fission reactor channels are not reliable when applied to pfcs because fully developed flow conditions seldom exist. Star-CCM+ is one of the commercial CFD codes that can model two-phase flows. Like others, it implements the RPI model for nucleate boiling, but it also seamlessly transitions to a volume-of-fluid model for film boiling. By benchmarking the results of our 3-D models against recent experiments on critical heat flux for both smooth rectangular channels and hypervapotrons, we determined the six unique input parameters that accurately characterize the boiling physics for ITER flow conditions under a wide range of absorbed heat flux. We can now exploit this capability to predict the onset of critical heat flux in these components. In addition, the results clearly illustrate the production and transport of vapor and its effect on heat transfer in pfcs from nucleate boiling through the transition to film boiling. This article describes the boiling physics implemented in CCM+ and compares the computational results to the benchmark experiments carried out independently in the United States and Russia. Temperature distributions agreed to within 10 °C for a wide range of heat fluxes from 3 MW/m2 to 10 MW/m2 and flow velocities from 1 m/s to 10 m/s in these devices. Although the analysis is incapable of capturing the stochastic nature of critical heat flux (i.e., time and location may depend on a local material defect or turbulence phenomenon), it is highly reliable in determining the heat flux at which boiling instabilities begin.

  11. Blending Qualitative and Computational Linguistics Methods for Fidelity Assessment: Experience with the Familias Unidas Preventive Intervention.

    PubMed

    Gallo, Carlos; Pantin, Hilda; Villamar, Juan; Prado, Guillermo; Tapia, Maria; Ogihara, Mitsunori; Cruden, Gracelyn; Brown, C Hendricks

    2015-09-01

    Careful fidelity monitoring and feedback are critical to implementing effective interventions. A wide range of procedures exist to assess fidelity; most are derived from observational assessments (Schoenwald and Garland, Psycholog Assess 25:146-156, 2013). However, these fidelity measures are resource intensive for research teams in efficacy/effectiveness trials, and are often unattainable or unmanageable for the host organization to rate when the program is implemented on a large scale. We present a first step towards automated processing of linguistic patterns in fidelity monitoring of a behavioral intervention using an innovative mixed methods approach to fidelity assessment that uses rule-based, computational linguistics to overcome major resource burdens. Data come from an effectiveness trial of the Familias Unidas intervention, an evidence-based, family-centered preventive intervention found to be efficacious in reducing conduct problems, substance use and HIV sexual risk behaviors among Hispanic youth. This computational approach focuses on "joining," which measures the quality of the working alliance of the facilitator with the family. Quantitative assessments of reliability are provided. Kappa scores between a human rater and a machine rater for the new method for measuring joining reached 0.83. Early findings suggest that this approach can reduce the high cost of fidelity measurement and the time delay between fidelity assessment and feedback to facilitators; it also has the potential for improving the quality of intervention fidelity ratings.
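The reported agreement of 0.83 between the human and machine raters is a Cohen's kappa, which corrects raw agreement for the agreement expected by chance from each rater's label frequencies. A minimal sketch of the statistic (the labels and ratings below are hypothetical, not data from the trial):

```python
from collections import Counter

def cohens_kappa(rater_a, rater_b):
    """Cohen's kappa for two raters over the same items:
    kappa = (p_o - p_e) / (1 - p_e), where p_o is the observed agreement
    and p_e is the chance agreement implied by the raters' marginals."""
    n = len(rater_a)
    p_o = sum(a == b for a, b in zip(rater_a, rater_b)) / n
    freq_a, freq_b = Counter(rater_a), Counter(rater_b)
    p_e = sum(freq_a[k] * freq_b.get(k, 0) for k in freq_a) / (n * n)
    return (p_o - p_e) / (1 - p_e)

# Hypothetical "joining" ratings for six sessions
human   = ["joined", "joined", "not", "joined", "not", "not"]
machine = ["joined", "joined", "not", "not",    "not", "not"]
print(round(cohens_kappa(human, machine), 3))  # -> 0.667
```

Five of six raw agreements would be 0.833, but kappa discounts the 0.5 chance-agreement rate, which is why 0.83 kappa (as reported) indicates substantially stronger reliability than 83% raw agreement.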

  12. Evaluation and critical assessment of putative MCL-1 inhibitors

    PubMed Central

    Varadarajan, S; Vogler, M; Butterworth, M; Dinsdale, D; Walensky, L D; Cohen, G M

    2013-01-01

    High levels of BCL-2 family proteins are implicated in a failed/ineffective apoptotic programme, often resulting in diseases, including cancer. Owing to their potential as drug targets in cancer therapy, several inhibitors of BCL-2 family proteins have been developed. These primarily target specific members of the BCL-2 family, particularly BCL-2 and BCL-XL but are ineffective against MCL-1. Major efforts have been invested in developing inhibitors of MCL-1, which is commonly amplified in human tumours and associated with tumour relapse and chemoresistance. In this report, the specificity of several BCL-2 family inhibitors (ABT-263, UCB-1350883, apogossypol and BH3I-1) was investigated and compared with putative MCL-1 inhibitors designed to exhibit improved or selective binding affinities for MCL-1 (TW-37, BI97C1, BI97C10, BI112D1, compounds 6 and 7, and MCL-1 inhibitor molecule (MIM-1)). ABT-263, BI97C1, BI112D1, MIM-1 and TW-37 exhibited specificity in inducing apoptosis in a Bax/Bak- and caspase-9-dependent manner, whereas the other agents showed no killing activity, or little or no specificity. Of these inhibitors, only ABT-263 and UCB-1350883 induced apoptosis in a BCL-2- or BCL-XL-dependent system. In cells that depend on MCL-1 for survival, ABT-263 and TW-37 induced extensive apoptosis, suggesting that at high concentrations these inhibitors have the propensity to inhibit MCL-1 in a cellular context. TW-37 induced apoptosis, assessed by chromatin condensation, caspase processing and phosphatidylserine externalisation, in a BAK-dependent manner and in cells that require MCL-1 for survival. TW-37-mediated apoptosis was also partly dependent on NOXA, suggesting that derivatives of TW-37, if engineered to exhibit better selectivity and efficacy at low nanomolar concentrations, may provide useful lead compounds for further synthetic programmes. Expanded medicinal chemistry iteration, as performed for the ABT series, may likewise improve the potency and

  13. Response to "Critical Assessment of the Evidence for Striped Nanoparticles".

    PubMed

    Ong, Quy Khac; Stellacci, Francesco

    2015-01-01

Stirling et al. (10.1371/journal.pone.0108482) presented an analysis of some of our publications on the formation of stripe-like domains on mixed-ligand-coated gold nanoparticles. The authors shed doubt on some of our results; however, no valid argument is provided against what we have shown since our first publication: scanning tunneling microscopy (STM) images of striped nanoparticles show stripe-like domains that are independent of imaging parameters, and in particular of imaging speed. We have consistently ruled out the presence of artifacts by comparing sets of images acquired at different tip speeds, finding invariance of the stripe-like domains. Stirling and co-workers incorrectly analyzed this key control, using a different microscope and imaging conditions that do not compare to ours. We show here data proving that our approach is rigorous. Furthermore, we never relied solely on image analysis to draw our conclusions; we have always used the chemical nature of the particles to assess the veracity of our images. Stirling et al. do not provide any justification for the spacing of the features that we find on nanoparticles: ~1 nm for mixed-ligand particles and ~0.5 nm for homoligand particles. Hence our two central arguments remain unmodified: independence from imaging parameters and dependence on ligand shell chemical composition. The paper reports observations on our STM images; none is a sufficient condition to prove that our images are artifacts. We thoroughly addressed issues related to STM artifacts throughout our microscopy work. Stirling et al. provide guidelines for what they consider good STM images of nanoparticles; such images are indeed present in our literature. They conclude that the evidence we have provided to date is insufficient; this is a departure from one of the authors' previous articles, which concluded that our images were composed of artifacts. Given that four independent laboratories have reproduced our measurements and that no

  14. Assessing Critical Thinking in Middle and High Schools: Meeting the Common Core

    ERIC Educational Resources Information Center

    Stobaugh, Rebecca

    2013-01-01

    This practical, very effective resource helps middle and high school teachers and curriculum leaders develop the skills to design instructional tasks and assessments that engage students in higher-level critical thinking, as recommended by the Common Core State Standards. Real examples of formative and summative assessments from a variety of…

  15. Evidence Based Clinical Assessment of Child and Adolescent Social Phobia: A Critical Review of Rating Scales

    ERIC Educational Resources Information Center

    Tulbure, Bogdan T.; Szentagotai, Aurora; Dobrean, Anca; David, Daniel

    2012-01-01

    Investigating the empirical support of various assessment instruments, the evidence based assessment approach expands the scientific basis of psychotherapy. Starting from Hunsley and Mash's evaluative framework, we critically reviewed the rating scales designed to measure social anxiety or phobia in youth. Thirteen of the most researched social…

  16. What Can You Learn in Three Minutes? Critical Reflection on an Assessment Task that Embeds Technology

    ERIC Educational Resources Information Center

    Brown, Natalie Ruth

    2009-01-01

    Purpose: The purpose of this paper is to critically examine an assessment task, undertaken by pre-service science teachers, that integrates the use of technology (in this case digital video-recorders and video-editing software) whilst scaffolding skill development. The embedding of technology into the assessment task is purposeful, aiming to…

  17. Concepts and techniques: Active electronics and computers in safety-critical accelerator operation

    SciTech Connect

    Frankel, R.S.

    1995-12-31

The Relativistic Heavy Ion Collider (RHIC), under construction at Brookhaven National Laboratory, requires an extensive access control system to protect personnel from radiation, oxygen deficiency, and electrical hazards. In addition, the complicated nature of operating the Collider as part of a complex of other accelerators necessitates the use of active electronic measurement circuitry to ensure compliance with established operational safety limits. Solutions were devised that permit the use of modern computer and interconnection technology for safety-critical applications while preserving and enhancing tried-and-proven protection methods. In addition, a set of guidelines regarding required performance for accelerator safety systems and a handbook of design criteria and rules were developed to assist future system designers and to provide a framework for internal review and regulation.

  18. Open-ended approaches to science assessment using computers

    NASA Astrophysics Data System (ADS)

    Singley, Mark K.; Taft, Hessy L.

    1995-03-01

    We discuss the potential role of technology in evaluating learning outcomes in large-scale, widespread science assessments of the kind typically done at ETS, such as the GRE, or the College Board SAT II Subject Tests. We describe the current state-of-the-art in this area, as well as briefly outline the history of technology in large-scale science assessment and ponder possibilities for the future. We present examples from our own work in the domain of chemistry, in which we are designing problem solving interfaces and scoring programs for stoichiometric and other kinds of quantitative problem solving. We also present a new scientific reasoning item type that we are prototyping on the computer. It is our view that the technological infrastructure for large-scale constructed response science assessment is well on its way to being available, although many technical and practical hurdles remain.

  19. Assessment of asthmatic inflammation using hybrid fluorescence molecular tomography-x-ray computed tomography

    NASA Astrophysics Data System (ADS)

    Ma, Xiaopeng; Prakash, Jaya; Ruscitti, Francesca; Glasl, Sarah; Stellari, Fabio Franco; Villetti, Gino; Ntziachristos, Vasilis

    2016-01-01

    Nuclear imaging plays a critical role in asthma research but is limited in its readings of biology due to the short-lived signals of radio-isotopes. We employed hybrid fluorescence molecular tomography (FMT) and x-ray computed tomography (XCT) for the assessment of asthmatic inflammation based on resolving cathepsin activity and matrix metalloproteinase activity in dust mite, ragweed, and Aspergillus species-challenged mice. The reconstructed multimodal fluorescence distribution showed good correspondence with ex vivo cryosection images and histological images, confirming FMT-XCT as an interesting alternative for asthma research.

  20. Applying analytic hierarchy process to assess healthcare-oriented cloud computing service systems.

    PubMed

    Liao, Wen-Hwa; Qiu, Wan-Li

    2016-01-01

    Numerous differences exist between the healthcare industry and other industries. Difficulties in the business operation of the healthcare industry have continually increased because of the volatility and importance of health care, changes to and requirements of health insurance policies, and the statuses of healthcare providers, which are typically considered not-for-profit organizations. Moreover, because of the financial risks associated with constant changes in healthcare payment methods and constantly evolving information technology, healthcare organizations must continually adjust their business operation objectives; therefore, cloud computing presents both a challenge and an opportunity. As a response to aging populations and the prevalence of the Internet in fast-paced contemporary societies, cloud computing can be used to facilitate the task of balancing the quality and costs of health care. To evaluate cloud computing service systems for use in health care, providing decision makers with a comprehensive assessment method for prioritizing decision-making factors is highly beneficial. Hence, this study applied the analytic hierarchy process, compared items related to cloud computing and health care, executed a questionnaire survey, and then classified the critical factors influencing healthcare cloud computing service systems on the basis of statistical analyses of the questionnaire results. The results indicate that the primary factor affecting the design or implementation of optimal cloud computing healthcare service systems is cost effectiveness, with the secondary factors being practical considerations such as software design and system architecture. PMID:27441149
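The analytic hierarchy process applied here derives criterion weights from a matrix of pairwise comparisons. A minimal sketch using the common column-normalization approximation to the principal eigenvector (the three criteria and the matrix values are hypothetical stand-ins, not the survey results):

```python
def ahp_priorities(M):
    """Approximate AHP priority vector for a pairwise-comparison matrix M
    (M[i][j] = how strongly criterion i dominates criterion j, with
    M[j][i] = 1 / M[i][j]): normalize each column, then average the rows."""
    n = len(M)
    col_sums = [sum(M[i][j] for i in range(n)) for j in range(n)]
    return [sum(M[i][j] / col_sums[j] for j in range(n)) / n
            for i in range(n)]

# Hypothetical comparison of cost effectiveness, software design,
# and system architecture (Saaty-style 1-9 judgments)
M = [[1,     3,   5],
     [1 / 3, 1,   2],
     [1 / 5, 1 / 2, 1]]
w = ahp_priorities(M)
print([round(x, 2) for x in w])  # -> [0.65, 0.23, 0.12]
```

The weights sum to 1 and rank cost effectiveness first, mirroring the ordering the study reports; a full AHP analysis would also compute the consistency ratio of M before trusting the weights.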

  2. Guiding dental student learning and assessing performance in critical thinking with analysis of emerging strategies.

    PubMed

    Johnsen, David C; Lipp, Mitchell J; Finkelstein, Michael W; Cunningham-Ford, Marsha A

    2012-12-01

    Patient-centered care involves an inseparable set of knowledge, abilities, and professional traits on the part of the health care provider. For practical reasons, health professions education is segmented into disciplines or domains like knowledge, technical skills, and critical thinking, and the culture of dental education is weighted toward knowledge and technical skills. Critical thinking, however, has become a growing presence in dental curricula. To guide student learning and assess performance in critical thinking, guidelines have been developed over the past several decades in the educational literature. Prominent among these guidelines are the following: engage the student in multiple situations/exercises reflecting critical thinking; for each exercise, emulate the intended activity for validity; gain agreement of faculty members across disciplines and curriculum years on the learning construct, application, and performance assessment protocol for reliability; and use the same instrument to guide learning and assess performance. The purposes of this article are 1) to offer a set of concepts from the education literature potentially helpful to guide program design or corroborate existing programs in dental education; 2) to offer an implementation model consolidating these concepts as a guide for program design and execution; 3) to cite specific examples of exercises and programs in critical thinking in the dental education literature analyzed against these concepts; and 4) to discuss opportunities and challenges in guiding student learning and assessing performance in critical thinking for dentistry.

  3. Control System Applicable Use Assessment of the Secure Computing Corporation - Secure Firewall (Sidewinder)

    SciTech Connect

    Hadley, Mark D.; Clements, Samuel L.

    2009-01-01

Battelle’s National Security & Defense objective is “applying unmatched expertise and unique facilities to deliver homeland security solutions. From detection and protection against weapons of mass destruction to emergency preparedness/response and protection of critical infrastructure, we are working with industry and government to integrate policy, operational, technological, and logistical parameters that will secure a safe future.” In an ongoing effort to meet this mission, engagements with industry intended to improve the operational and technical attributes of commercial solutions related to national security initiatives are necessary. Such engagements ensure that capabilities for protecting critical infrastructure assets are considered by commercial entities in their development, design, and deployment lifecycles, thus addressing the alignment of identified deficiencies and improvements needed to support national cyber security initiatives. The Secure Firewall (Sidewinder) appliance by Secure Computing was assessed for applicable use in critical infrastructure control system environments, such as electric power, nuclear, and other facilities containing critical systems that require augmented protection from cyber threats. The testing was performed in the Pacific Northwest National Laboratory’s (PNNL) Electric Infrastructure Operations Center (EIOC). The Secure Firewall was tested in a network configuration that emulates a typical control center network and then evaluated. A number of observations and recommendations relating to features currently included in the Secure Firewall that support critical infrastructure security needs are included in this report.

  4. Improving Educational Assessment: A Computer-Adaptive Multiple Choice Assessment Using NRET as the Scoring Method

    ERIC Educational Resources Information Center

    Sie Hoe, Lau; Ngee Kiong, Lau; Kian Sam, Hong; Bin Usop, Hasbee

    2009-01-01

    Assessment is central to any educational process. Number Right (NR) scoring method is a conventional scoring method for multiple choice items, where students need to pick one option as the correct answer. One point is awarded for the correct response and zero for any other responses. However, it has been heavily criticized for guessing and failure…
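The NR rule described above is simple enough to state directly in code. The following minimal sketch (function name and sample responses hypothetical) shows the scoring rule, and its comment notes the guessing problem the abstract criticizes:

```python
def number_right_score(responses, key):
    """Number Right (NR) scoring: one point for each response matching the
    answer key, zero for any other response."""
    return sum(1 for r, k in zip(responses, key) if r == k)

# A student who guesses blindly on a 4-option item still has a 25% chance
# of earning the point -- the main criticism of NR scoring noted above.
print(number_right_score(["A", "C", "B", "D"], ["A", "B", "B", "D"]))  # 3
```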

  5. Computational Fluid Dynamics Framework for Turbine Biological Performance Assessment

    SciTech Connect

    Richmond, Marshall C.; Serkowski, John A.; Carlson, Thomas J.; Ebner, Laurie L.; Sick, Mirjam; Cada, G. F.

    2011-05-04

    In this paper, a method for turbine biological performance assessment is introduced to bridge the gap between field and laboratory studies on fish injury and turbine design. Using this method, a suite of biological performance indicators is computed based on simulated data from a computational fluid dynamics (CFD) model of a proposed turbine design. Each performance indicator is a measure of the probability of exposure to a certain dose of an injury mechanism. If the relationship between the dose of an injury mechanism and frequency of injury (dose-response) is known from laboratory or field studies, the likelihood of fish injury for a turbine design can be computed from the performance indicator. By comparing the values of the indicators from various turbine designs, the engineer can identify the more-promising designs. Discussion here is focused on Kaplan-type turbines, although the method could be extended to other designs. Following the description of the general methodology, we will present sample risk assessment calculations based on CFD data from a model of the John Day Dam on the Columbia River in the USA.
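Once the dose-response relationship is known, combining it with the exposure probabilities from the CFD model reduces to a weighted sum over dose bins. A minimal sketch, with entirely hypothetical dose bins and probabilities:

```python
def injury_likelihood(exposure_prob, dose_response):
    """Expected injury probability for one injury mechanism:
    sum over dose bins of P(exposure to dose d) * P(injury | dose d)."""
    return sum(p, for_ in []) if False else sum(
        p * r for p, r in zip(exposure_prob, dose_response))

# Hypothetical values for one mechanism (e.g. shear exposure):
exposure = [0.70, 0.20, 0.10]   # fraction of simulated fish paths per dose bin (CFD)
response = [0.00, 0.05, 0.40]   # injury frequency per dose bin (lab dose-response)
print(injury_likelihood(exposure, response))  # ~0.05
```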

  6. Computational Pollutant Environment Assessment from Propulsion-System Testing

    NASA Technical Reports Server (NTRS)

    Wang, Ten-See; McConnaughey, Paul; Chen, Yen-Sen; Warsi, Saif

    1996-01-01

    An asymptotic plume growth method based on a time-accurate three-dimensional computational fluid dynamics formulation has been developed to assess the exhaust-plume pollutant environment from a simulated RD-170 engine hot-fire test on the F1 Test Stand at Marshall Space Flight Center. Researchers have long known that rocket-engine hot firing has the potential for forming thermal nitric oxides, as well as producing carbon monoxide when hydrocarbon fuels are used. Because of the complex physics involved, most attempts to predict the pollutant emissions from ground-based engine testing have used simplified methods, which may grossly underpredict and/or overpredict the pollutant formations in a test environment. The objective of this work has been to develop a computational fluid dynamics-based methodology that replicates the underlying test-stand flow physics to accurately and efficiently assess pollutant emissions from ground-based rocket-engine testing. A nominal RD-170 engine hot-fire test was computed, and pertinent test-stand flow physics was captured. The predicted total emission rates compared reasonably well with those of the existing hydrocarbon engine hot-firing test data.

  7. Assessment of Zero Power Critical Experiments and Needs for a Fission Surface Power System

    SciTech Connect

    Parry, Jim R.; Bess, John Darrell; Rearden, Brad T.; Harms, Gary A.

    2009-06-01

    The National Aeronautics and Space Administration (NASA) is providing funding to the Department of Energy (DOE) to assess, develop, and test nuclear technologies that could provide surface power to a lunar outpost. Sufficient testing of this fission surface power (FSP) system will need to be completed to enable a decision by NASA for flight development. The near-term goal for the FSP work is to conduct the minimum amount of testing needed to validate the system performance within an acceptable risk. This report attempts to assess the current modeling capabilities and quantify any bias associated with the modeling methods for designing the nuclear reactor. The baseline FSP system is a sodium-potassium (NaK) cooled, fast spectrum reactor with 93% 235U enriched HEU-O2 fuel, SS316 cladding, and beryllium reflectors with B4C control drums. The FSP is to produce approximately 40 kWe net power with a lifetime of at least 8 years at full power. A flight-ready FSP is to be ready for launch and deployment by 2020. Existing benchmarks from the International Criticality Safety Benchmark Evaluation Program (ICSBEP) were reviewed and modeled in MCNP. An average bias of less than 0.6% was determined using the ENDF/B-VII cross-section libraries except in the case of subcritical experiments, which exhibited an average bias of approximately 1.5%. The bias increases with increasing reflector worth of the beryllium. The uncertainties and sensitivities in cross section data for the FSP model and ZPPR-20 configurations were assessed using TSUNAMI-3D. The cross-section covariance uncertainty in the FSP model was calculated as 2.09%, which was dominated by the uncertainty in the 235U(n,γ) reactions. Global integral indices were generated in TSUNAMI-IP using pre-release SCALE 6 cross-section covariance data. The ZPPR-20 benchmark models exhibit strong similarity with the FSP model. A penalty assessment was performed to determine the degree to which the FSP model could not be characterized…
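The average bias figures quoted above come from comparing calculated k-eff values against their benchmark values. The bookkeeping can be illustrated as follows (k-eff values hypothetical, not taken from the report):

```python
def average_bias_percent(calculated, benchmark):
    """Average absolute bias, in percent, between calculated k-eff values
    and the corresponding benchmark-model k-eff values."""
    diffs = [abs(c - b) / b * 100.0 for c, b in zip(calculated, benchmark)]
    return sum(diffs) / len(diffs)

calc  = [0.9985, 1.0032, 0.9991]   # hypothetical MCNP k-eff results
bench = [1.0000, 1.0000, 1.0000]   # benchmark-model k-eff values
print(round(average_bias_percent(calc, bench), 2))  # 0.19
```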

  8. Blending Qualitative and Computational Linguistics Methods for Fidelity Assessment: Experience with the Familias Unidas Preventive Intervention

    PubMed Central

    Gallo, Carlos; Pantin, Hilda; Villamar, Juan; Prado, Guillermo; Tapia, Maria; Ogihara, Mitsunori; Cruden, Gracelyn; Brown, C Hendricks

    2014-01-01

    Careful fidelity monitoring and feedback are critical to implementing effective interventions. A wide range of procedures exist to assess fidelity; most are derived from observational assessments (Schoenwald et al, 2013). However, these fidelity measures are resource intensive for research teams in efficacy/effectiveness trials, and are often unattainable or unmanageable for the host organization to rate when the program is implemented on a large scale. We present a first step towards automated processing of linguistic patterns in fidelity monitoring of a behavioral intervention using an innovative mixed methods approach to fidelity assessment that uses rule-based, computational linguistics to overcome major resource burdens. Data come from an effectiveness trial of the Familias Unidas intervention, an evidence-based, family-centered preventive intervention found to be efficacious in reducing conduct problems, substance use and HIV sexual risk behaviors among Hispanic youth. This computational approach focuses on “joining,” which measures the quality of the working alliance of the facilitator with the family. Quantitative assessments of reliability are provided. Kappa scores between a human rater and a machine rater for the new method for measuring joining reached .83. Early findings suggest that this approach can reduce the high cost of fidelity measurement and the time delay between fidelity assessment and feedback to facilitators; it also has the potential for improving the quality of intervention fidelity ratings. PMID:24500022
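The kappa score of .83 reported above measures chance-corrected agreement between the human and machine raters. As a self-contained illustration (ratings hypothetical), Cohen's kappa for two raters over the same items can be computed as:

```python
def cohens_kappa(rater_a, rater_b):
    """Cohen's kappa: observed agreement corrected for the agreement
    expected by chance, given each rater's label frequencies."""
    n = len(rater_a)
    labels = set(rater_a) | set(rater_b)
    p_observed = sum(a == b for a, b in zip(rater_a, rater_b)) / n
    p_expected = sum((rater_a.count(l) / n) * (rater_b.count(l) / n) for l in labels)
    return (p_observed - p_expected) / (1 - p_expected)

# Hypothetical dichotomous "joining present" ratings on 8 session segments:
human   = [1, 1, 0, 1, 0, 1, 1, 0]
machine = [1, 1, 0, 1, 0, 1, 0, 0]
print(round(cohens_kappa(human, machine), 2))  # 0.75
```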

  9. Use of a computer-assisted administrative control to enhance criticality safety in LLNL for fissile material disposition operations

    SciTech Connect

    Huang, Song T.; Lappa, D.A.; Chiao, Tang

    1997-04-01

    This paper deals primarily with the use of a two-person rule for mass limit control. The main emphasis is placed on the appropriate use of a computer program to assist operators in carrying out mass control. The use of a mass control card system under a two-person rule is compared with a computer-assisted two-person system. The interface points relevant to criticality safety between the computer and human operators are identified. Features that make a computer program useful in a multiple-workstation application environment are discussed, along with the merits of using the computer program. How such computer-assisted administrative control may be incorporated in the overall infrastructure for criticality safety is analyzed. Suggestions for future development of using a computer program to enhance the safety margin are also made to stimulate further discussion on the application of computer technology for real-time criticality safety control.
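As an illustration only (names, units, and limit are hypothetical, and this is not the LLNL implementation), the core of a computer-assisted mass-limit check under a two-person rule can be sketched as a guard that requires both the mass limit and two independent confirmations:

```python
def approve_fissile_move(current_mass_g, added_mass_g, mass_limit_g,
                         operator_confirms, verifier_confirms):
    """Two-person-rule sketch: a mass addition is approved only when it keeps
    the workstation under its mass limit AND two independent people confirm
    the entry. A real criticality control involves much more than this."""
    within_limit = current_mass_g + added_mass_g <= mass_limit_g
    return within_limit and operator_confirms and verifier_confirms

print(approve_fissile_move(300.0, 100.0, 500.0, True, True))   # True
print(approve_fissile_move(300.0, 250.0, 500.0, True, True))   # False: exceeds limit
print(approve_fissile_move(300.0, 100.0, 500.0, True, False))  # False: no second check
```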

  10. Can Dental Cone Beam Computed Tomography Assess Bone Mineral Density?

    PubMed Central

    2014-01-01

    Mineral density distribution of bone tissue is altered by active bone modeling and remodeling due to bone complications, including bone disease and implantation surgery. Clinical cone beam computed tomography (CBCT) has been examined to determine whether it can assess oral bone mineral density (BMD) in patients. CBCT has the disadvantages of higher noise and lower contrast than conventional medical computed tomography (CT) systems, but the advantages of relatively lower cost and radiation dose and higher spatial resolution. However, the reliability of CBCT-based mineral density measurement has not yet been fully validated. Thus, the objectives of this review are to discuss 1) why assessment of BMD distribution is important and 2) whether clinical CBCT can be used as a potential tool to measure BMD. Brief descriptions are provided of the image artefacts associated with assessing gray values, which have been used to account for mineral density, in CBCT images. Techniques to correct local and conversion errors in obtaining gray values in CBCT images are also introduced. This review can be used as a quick reference for users who may encounter these errors during analysis of CBCT images. PMID:25006568
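One common correction technique of the kind alluded to above is calibrating gray values against phantom inserts of known mineral density scanned alongside the patient. A least-squares sketch (all numbers hypothetical):

```python
def fit_linear_calibration(gray_values, known_densities):
    """Ordinary least-squares fit of density = a * gray + b, using phantom
    inserts of known mineral density to calibrate CBCT gray values."""
    n = len(gray_values)
    mx = sum(gray_values) / n
    my = sum(known_densities) / n
    a = sum((x - mx) * (y - my) for x, y in zip(gray_values, known_densities)) \
        / sum((x - mx) ** 2 for x in gray_values)
    b = my - a * mx
    return a, b

# Hypothetical phantom data: CBCT gray values vs. density in mg HA/cm^3
gray    = [200.0, 600.0, 1000.0]
density = [100.0, 500.0, 900.0]
a, b = fit_linear_calibration(gray, density)
print(a, b)  # slope 1.0, intercept -100.0
```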

  11. Performance Assessment of OVERFLOW on Distributed Computing Environment

    NASA Technical Reports Server (NTRS)

    Djomehri, M. Jahed; Rizk, Yehia M.

    2000-01-01

    The aerodynamic computer code, OVERFLOW, with a multi-zone overset grid feature, has been parallelized to enhance its performance on distributed- and shared-memory paradigms. Practical application benchmarks have been set to assess the efficiency of the code's parallelism on high-performance architectures. The code's performance has also been evaluated in the context of the distributed computing paradigm on distant computer resources using the Information Power Grid (IPG) toolkit, Globus. Two parallel versions of the code, namely OVERFLOW-MPI and -MLP, have been developed around the natural coarse-grained parallelism inherent in a multi-zonal domain decomposition paradigm. The algorithm invokes a strategy that forms a number of groups, each consisting of a zone, a cluster of zones, and/or a partition of a large zone. Each group can be thought of as a process with one or more threads assigned to it; all groups run in parallel. The -MPI version of the code uses explicit message passing based on the standard MPI library for sending and receiving interzonal boundary data across processors. The -MLP version employs no message-passing paradigm; the boundary data are transferred through shared memory. The -MPI code is suited for both distributed- and shared-memory architectures, while the -MLP code can only be used on shared-memory platforms. The IPG applications are implemented by the -MPI code using the Globus toolkit. While a computational task is distributed across multiple computer resources, the parallelism can be exploited on each resource alone. Performance studies were conducted with some practical aerodynamic problems with complex geometries, consisting of 2.5 up to 33 million grid points and a large number of zonal blocks. The computations were executed primarily on SGI Origin 2000 multiprocessors and on the Cray T3E. OVERFLOW's IPG applications are carried out on NASA homogeneous metacomputing machines located at three sites: Ames, Langley, and Glenn. Plans…
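The grouping strategy described above (a zone, a cluster of zones, or a partition of a large zone per group) is at heart a load-balancing problem. A greedy sketch with hypothetical zone names and sizes, omitting the partitioning of oversized zones:

```python
def group_zones(zone_sizes, n_groups):
    """Greedy load balancing: assign each zone (by grid-point count,
    largest first) to the currently lightest group, so that groups of
    zones running in parallel carry roughly equal work."""
    groups = [[] for _ in range(n_groups)]
    loads = [0] * n_groups
    for name, size in sorted(zone_sizes.items(), key=lambda kv: -kv[1]):
        i = loads.index(min(loads))   # lightest group so far
        groups[i].append(name)
        loads[i] += size
    return groups, loads

# Hypothetical overset zones with grid-point counts:
zones = {"wing": 900_000, "fuselage": 700_000, "tail": 400_000,
         "nacelle": 300_000, "wake": 200_000}
groups, loads = group_zones(zones, 2)
print(groups, loads)  # loads balance to [1200000, 1300000]
```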

  12. Nuclear criticality safety assessment of the Consolidated Edison Uranium-Solidification Program Facility

    SciTech Connect

    Thomas, J.T.

    1984-01-01

    A nuclear criticality assessment of the Consolidated Edison Uranium-Solidification Program facility confirms that all operations involved in the process may be conducted with an acceptable margin of subcriticality. Normal operation presents no concern since subcriticality is maintained by design. Several recommendations are presented to prevent, or mitigate the consequences of, any abnormal events that might occur in the various portions of the process. These measures would also serve to reduce to a minimum the administrative controls required to prevent criticality.

  13. Computer database takes confusion out of multi-property assessments

    SciTech Connect

    Kinworthy, M.L.

    1996-03-01

    Managing environmental site assessments in multi-property transactions poses a special challenge. Multi-site ESAs require a tremendous amount of coordination, data collection and interpretation; often, these tasks must be completed according to accelerated timeframes to meet client deadlines. The tasks can be particularly challenging when several hundred sites are included in the transaction. In such cases, a computer database can be an effective, powerful tool for tracking and managing property data, and generating customized reports for large, multi-site ESAs.

  14. Assessment of a human computer interface prototyping environment

    NASA Technical Reports Server (NTRS)

    Moore, Loretta A.

    1993-01-01

    A Human Computer Interface (HCI) prototyping environment with embedded evaluation capability has been successfully assessed; it will be valuable in developing and refining HCI standards and in evaluating program/project interface development, especially Space Station Freedom on-board displays for payload operations. The HCI prototyping environment is designed to include four components: (1) an HCI format development tool, (2) a test and evaluation simulator development tool, (3) a dynamic, interactive interface between the HCI prototype and the simulator, and (4) an embedded evaluation capability to evaluate the adequacy of an HCI based on a user's performance.

  15. RESRAD-CHEM: A computer code for chemical risk assessment

    SciTech Connect

    Cheng, J.J.; Yu, C.; Hartmann, H.M.; Jones, L.G.; Biwer, B.M.; Dovel, E.S.

    1993-10-01

    RESRAD-CHEM is a computer code developed at Argonne National Laboratory for the U.S. Department of Energy to evaluate chemically contaminated sites. The code is designed to predict human health risks from multipathway exposure to hazardous chemicals and to derive cleanup criteria for chemically contaminated soils. The method used in RESRAD-CHEM is based on the pathway analysis method in the RESRAD code and follows the U.S. Environmental Protection Agency's (EPA's) guidance on chemical risk assessment. RESRAD-CHEM can be used to evaluate a chemically contaminated site and, in conjunction with the RESRAD code, a mixed waste site.

  16. Assessment of liver ablation using cone beam computed tomography

    PubMed Central

    Abdel-Rehim, Mohamed; Ronot, Maxime; Sibert, Annie; Vilgrain, Valérie

    2015-01-01

    AIM: To investigate the feasibility and accuracy of cone beam computed tomography (CBCT) in assessing the ablation zone after liver tumor ablation. METHODS: Twenty-three patients (17 men and 6 women, range: 45-85 years old, mean age 65 years) with malignant liver tumors underwent ultrasound-guided percutaneous tumor ablation [radiofrequency (n = 14), microwave (n = 9)] followed by intravenous contrast-enhanced CBCT. Baseline multidetector computed tomography (MDCT) and peri-procedural CBCT images were compared. CBCT image quality was assessed as poor, good, or excellent. Image fusion was performed to assess tumor coverage, and quality of fusion was rated as bad, good, or excellent. Ablation zone volumes on peri-procedural CBCT and post-procedural MDCT were compared using the non-parametric paired Wilcoxon t-test. RESULTS: Rate of primary ablation effectiveness was 100%. There were no complications related to ablation. Local tumor recurrence and new liver tumors were found 3 mo after initial treatment in one patient (4%). The ablation zone was identified in 21/23 (91.3%) patients on CBCT. The fusion of baseline MDCT and peri-procedural CBCT images was feasible in all patients and showed satisfactory tumor coverage (at least 5-mm margin). CBCT image quality was poor, good, and excellent in 2 (9%), 8 (35%), and 13 (56%), patients respectively. Registration quality between peri-procedural CBCT and post-procedural MDCT images was good to excellent in 17/23 (74%) patients. The median ablation volume on peri-procedural CBCT and post-procedural MDCT was 30 cm3 (range: 4-95 cm3) and 30 cm3 (range: 4-124 cm3), respectively (P-value > 0.2). There was a good correlation (r = 0.79) between the volumes of the two techniques. CONCLUSION: Contrast-enhanced CBCT after tumor ablation of the liver allows early assessment of the ablation zone. PMID:25593467
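The volume comparison above uses the non-parametric paired Wilcoxon signed-rank test. A hand-rolled sketch of the test statistic (volumes hypothetical; a real analysis would use a statistics package, which also supplies the p-value):

```python
def wilcoxon_signed_rank_stat(x, y):
    """Minimal Wilcoxon signed-rank statistic W (the smaller of the two
    signed-rank sums) for paired samples. Zero differences are dropped;
    tied absolute differences share their mean rank."""
    diffs = [a - b for a, b in zip(x, y) if a != b]
    order = sorted(range(len(diffs)), key=lambda i: abs(diffs[i]))
    ranks = [0.0] * len(diffs)
    i = 0
    while i < len(order):
        j = i
        while j + 1 < len(order) and abs(diffs[order[j + 1]]) == abs(diffs[order[i]]):
            j += 1
        mean_rank = (i + j) / 2.0 + 1.0   # mean rank of the tied run
        for k in range(i, j + 1):
            ranks[order[k]] = mean_rank
        i = j + 1
    w_plus = sum(r for d, r in zip(diffs, ranks) if d > 0)
    w_minus = sum(r for d, r in zip(diffs, ranks) if d < 0)
    return min(w_plus, w_minus)

# Hypothetical paired ablation volumes (cm^3): peri-procedural CBCT vs. MDCT
cbct = [30.0, 12.0, 45.0, 8.0, 60.0]
mdct = [32.0, 11.0, 45.0, 10.0, 58.0]
print(wilcoxon_signed_rank_stat(cbct, mdct))  # 4.0
```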

  17. Assessment of spare reliability for multi-state computer networks within tolerable packet unreliability

    NASA Astrophysics Data System (ADS)

    Lin, Yi-Kuei; Huang, Cheng-Fu

    2015-04-01

    From a quality of service viewpoint, the transmission packet unreliability and transmission time are both critical performance indicators in a computer system when assessing the Internet quality for supervisors and customers. A computer system is usually modelled as a network topology where each branch denotes a transmission medium and each vertex represents a station of servers. Almost every branch has multiple capacities/states due to failure, partial failure, maintenance, etc. This type of network is known as a multi-state computer network (MSCN). This paper proposes an efficient algorithm that computes the system reliability, i.e., the probability that a specified amount of data can be sent through k (k ≥ 2) disjoint minimal paths within both the tolerable packet unreliability and time threshold. Furthermore, two routing schemes are established in advance to indicate the main and spare minimal paths to increase the system reliability (referred to as spare reliability). Thus, the spare reliability can be readily computed according to the routing scheme.
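The paper's exact algorithm also enforces the packet-unreliability and time thresholds; setting those aside, the basic quantity, the probability that at least one of the disjoint minimal paths can carry a given demand when each branch's capacity is multi-state, can be estimated by Monte Carlo (topology and state distributions hypothetical):

```python
import random

def estimate_reliability(paths, state_dists, demand, trials=20000, seed=1):
    """Monte Carlo sketch: probability that at least one of the disjoint
    minimal paths can carry `demand` units, with each branch's capacity drawn
    from its multi-state distribution {capacity: probability}."""
    rng = random.Random(seed)

    def draw(dist):
        u, acc = rng.random(), 0.0
        for cap, p in sorted(dist.items()):
            acc += p
            if u <= acc:
                return cap
        return max(dist)   # guard against floating-point rounding

    hits = 0
    for _ in range(trials):
        caps = {b: draw(d) for b, d in state_dists.items()}
        # A path carries the demand if its bottleneck capacity is sufficient.
        if any(min(caps[b] for b in path) >= demand for path in paths):
            hits += 1
    return hits / trials

# Hypothetical 4-branch MSCN: main path (b1, b2) and spare path (b3, b4)
dists = {b: {0: 0.05, 1: 0.25, 2: 0.70} for b in ("b1", "b2", "b3", "b4")}
paths = [("b1", "b2"), ("b3", "b4")]
print(estimate_reliability(paths, dists, demand=2))  # ~0.74 analytically
```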

  18. The development and testing of a qualitative instrument designed to assess critical thinking

    NASA Astrophysics Data System (ADS)

    Clauson, Cynthia Louisa

    This study examined a qualitative approach to assess critical thinking. An instrument was developed that incorporates an assessment process based on Dewey's (1933) concepts of self-reflection and critical thinking as problem solving. The study was designed to pilot test the critical thinking assessment process with writing samples collected from a heterogeneous group of students. The pilot test included two phases. Phase 1 was designed to determine the validity and inter-rater reliability of the instrument using two experts in critical thinking, problem solving, and literacy development. Validity of the instrument was addressed by requesting both experts to respond to ten questions in an interview. The inter-rater reliability was assessed by analyzing the consistency of the two experts' scorings of the 20 writing samples to each other, as well as to my scoring of the same 20 writing samples. Statistical analyses included the Spearman Rho and the Kuder-Richardson (Formula 20). Phase 2 was designed to determine the validity and reliability of the critical thinking assessment process with seven science teachers. Validity was addressed by requesting the teachers to respond to ten questions in a survey and interview. Inter-rater reliability was addressed by comparing the seven teachers' scoring of five writing samples with my scoring of the same five writing samples. Again, the Spearman Rho and the Kuder-Richardson (Formula 20) were used to determine the inter-rater reliability. The validity results suggest that the instrument is helpful as a guide for instruction and provides a systematic method to teach and assess critical thinking while problem solving with students in the classroom. The reliability results show the critical thinking assessment instrument to possess fairly high reliability when used by the experts, but weak reliability when used by classroom teachers. A major conclusion was drawn that teachers, as well as students, would need to receive instruction…
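Of the two reliability statistics used above, the Kuder-Richardson Formula 20 (KR-20) is the less familiar; it applies to dichotomous (0/1) item scores. A minimal sketch (scores hypothetical):

```python
def kr20(item_scores):
    """Kuder-Richardson Formula 20 for dichotomous (0/1) items;
    item_scores[i][j] is person i's score on item j."""
    k = len(item_scores[0])           # number of items
    n = len(item_scores)              # number of persons
    totals = [sum(person) for person in item_scores]
    mean_t = sum(totals) / n
    var_t = sum((t - mean_t) ** 2 for t in totals) / n
    pq = 0.0
    for j in range(k):
        p = sum(person[j] for person in item_scores) / n   # proportion correct
        pq += p * (1 - p)
    return (k / (k - 1)) * (1 - pq / var_t)

# Hypothetical: four scorings of three dichotomous rubric items
scores = [[1, 1, 1], [1, 1, 0], [1, 0, 0], [0, 0, 0]]
print(round(kr20(scores), 2))  # 0.75
```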

  19. A Structured Security Assessment Methodology for Manufacturers of Critical Infrastructure Components

    NASA Astrophysics Data System (ADS)

    Brandstetter, Thomas; Knorr, Konstantin; Rosenbaum, Ute

    Protecting our critical infrastructures like energy generation and distribution, telecommunication, production, and traffic against cyber attacks is one of the major challenges of the new millennium. However, because security is such a complex and multilayered topic, manufacturers often lack the structured foundation needed to assess the current security level of a system. This paper introduces a methodology for structured security assessments which has been successfully applied during the development of several products for critical infrastructures. The methodology is described in detail, together with the lessons learned from applying it to several systems during their development.

  20. Assessing Executive Function Using a Computer Game: Computational Modeling of Cognitive Processes

    PubMed Central

    Hagler, Stuart; Jimison, Holly B.; Pavel, Misha

    2014-01-01

    Early and reliable detection of cognitive decline is one of the most important challenges of current healthcare. In this project we developed an approach whereby a frequently played computer game can be used to assess a variety of cognitive processes and estimate the results of the pen-and-paper Trail-Making Test (TMT) – known to measure executive function, as well as visual pattern recognition, speed of processing, working memory, and set-switching ability. We developed a computational model of the TMT based on a decomposition of the test into several independent processes, each characterized by a set of parameters that can be estimated from play of a computer game designed to resemble the TMT. An empirical evaluation of the model suggests that it is possible to use the game data to estimate the parameters of the underlying cognitive processes and to use the values of those parameters to estimate TMT performance. Cognitive measures and trends in these measures can be used to identify individuals for further assessment, to provide a mechanism for improving the early detection of neurological problems, and to provide feedback and monitoring for cognitive interventions in the home. PMID:25014944

  1. The Development of a Critical Care Resident Research Curriculum: A Needs Assessment

    PubMed Central

    Jain, Sangeeta; Hutchison, James; Canadian Critical Care Trials Group

    2016-01-01

    Background. Conducting research is an expected part of many clinicians' professional profiles, yet many do not have advanced research degrees. Research training during residency is variable amongst institutions, and the research education needs of trainees are not well understood. Objective. To understand the needs of critical care trainees regarding research education. Methods. Canadian critical care trainees, new critical care faculty, program directors, and research coordinators were surveyed regarding research training, research expectations, and support within their programs. Results. Critical care trainees and junior faculty members highlighted many gaps in research knowledge and skills. In contrast, critical care program directors felt that trainees were prepared to undertake research careers. Major differences in opinion amongst program directors and other respondent groups exist regarding preparation for designing a study, navigating research ethics board applications, and managing a research budget. Conclusion. We demonstrated that Canadian critical care trainees and junior faculty reported gaps in knowledge in all areas of research. There was disagreement amongst trainees, junior faculty, research coordinators, and program directors regarding learning needs. Results from this needs assessment will be used to help redesign the education program of the Canadian Critical Care Trials Group to complement local research training offered for critical care trainees.

  2. The Development of a Critical Care Resident Research Curriculum: A Needs Assessment.

    PubMed

    Jain, Sangeeta; Menon, Kusum; Piquette, Dominique; Gottesman, Ronald; Hutchison, James; Gilfoyle, Elaine; Canadian Critical Care Trials Group

    2016-01-01

    Background. Conducting research is an expected part of many clinicians' professional profiles, yet many do not have advanced research degrees. Research training during residency is variable amongst institutions, and the research education needs of trainees are not well understood. Objective. To understand the needs of critical care trainees regarding research education. Methods. Canadian critical care trainees, new critical care faculty, program directors, and research coordinators were surveyed regarding research training, research expectations, and support within their programs. Results. Critical care trainees and junior faculty members highlighted many gaps in research knowledge and skills. In contrast, critical care program directors felt that trainees were prepared to undertake research careers. Major differences in opinion amongst program directors and other respondent groups exist regarding preparation for designing a study, navigating research ethics board applications, and managing a research budget. Conclusion. We demonstrated that Canadian critical care trainees and junior faculty reported gaps in knowledge in all areas of research. There was disagreement amongst trainees, junior faculty, research coordinators, and program directors regarding learning needs. Results from this needs assessment will be used to help redesign the education program of the Canadian Critical Care Trials Group to complement local research training offered for critical care trainees. PMID:27610029

  4. Sustainable Assessment? Critical Features of the Assessment Process in a Modularised Engineering Programme

    ERIC Educational Resources Information Center

    Lindberg-Sand, Asa; Olsson, Thomas

    2008-01-01

    This paper reports a project researching the interplay between a formal assessment system on the one hand and the development of students' and teachers' work in the actual assessment process on the other. Applying a social practice perspective, empirical data from the first year of an engineering programme mapped the assessment process through…

  5. Development and Evaluation of the Diagnostic Power for a Computer-Based Two-Tier Assessment

    ERIC Educational Resources Information Center

    Lin, Jing-Wen

    2016-01-01

    This study adopted a quasi-experimental design with follow-up interview to develop a computer-based two-tier assessment (CBA) regarding the science topic of electric circuits and to evaluate the diagnostic power of the assessment. Three assessment formats (i.e., paper-and-pencil, static computer-based, and dynamic computer-based tests) using…

  6. Primary School Students' Attitudes towards Computer Based Testing and Assessment in Turkey

    ERIC Educational Resources Information Center

    Yurdabakan, Irfan; Uzunkavak, Cicek

    2012-01-01

    This study investigated the attitudes of primary school students towards computer based testing and assessment in terms of different variables. The sample for this research is primary school students attending a computer based testing and assessment application via CITO-OIS. The "Scale on Attitudes towards Computer Based Testing and Assessment" to…

  7. Pain assessment and management in the critically ill: wizardry or science?

    PubMed

    Puntillo, Kathleen

    2003-07-01

    Assessment and management of patients' pain across practice settings have recently received the increased attention of providers, patients, patients' families, and regulatory agencies. Scientific advances in understanding pain mechanisms, multidimensional methods of pain assessment, and analgesic pharmacology have aided in the improvement of pain management practices. However, pain assessment and management for critical care patients, especially those with communication barriers, continue to present challenges to clinicians and researchers. The state of nursing science of pain in critically ill patients, including development and testing of pain assessment methods and clinical trials of pharmacological interventions, is described. Special emphasis is placed on results from the Thunder Project II, a major multisite investigation of procedural pain. PMID:12882060

  8. Sedimentation equilibria in polydisperse ferrofluids: critical comparisons between experiment, theory, and computer simulation.

    PubMed

    Elfimova, Ekaterina A; Ivanov, Alexey O; Lakhtina, Ekaterina V; Pshenichnikov, Alexander F; Camp, Philip J

    2016-05-14

    The sedimentation equilibrium of dipolar particles in a ferrofluid is studied using experiment, theory, and computer simulation. A theory of the particle-concentration profile in a dipolar hard-sphere fluid is developed, based on the local-density approximation and accurate expressions from a recently introduced logarithmic free energy approach. The theory is tested critically against Monte Carlo simulation results for monodisperse and bidisperse dipolar hard-sphere fluids in homogeneous gravitational fields. In the monodisperse case, the theory is very accurate over broad ranges of gravitational field strength, volume fraction, and dipolar coupling constant. In the bidisperse case, with realistic dipolar coupling constants and compositions, the theory is excellent at low volume fraction, but is slightly inaccurate at high volume fraction in that it does not capture a maximum in the small-particle concentration profile seen in simulations. Possible reasons for this are put forward. Experimental measurements of the magnetic-susceptibility profile in a real ferrofluid are then analysed using the theory. The concentration profile is linked to the susceptibility profile using the second-order modified mean-field theory. It is shown that the experimental results are not consistent with the sample being monodisperse. By introducing polydispersity in the simplest possible way, namely by assuming the system is a binary mixture, almost perfect agreement between theory and experiment is achieved.

  9. Experimental evidence validating the computational inference of functional associations from gene fusion events: a critical survey.

    PubMed

    Promponas, Vasilis J; Ouzounis, Christos A; Iliopoulos, Ioannis

    2014-05-01

    More than a decade ago, a number of methods were proposed for the inference of protein interactions, using whole-genome information from gene clusters, gene fusions and phylogenetic profiles. This structural and evolutionary view of entire genomes has provided a valuable approach for the functional characterization of proteins, especially those without sequence similarity to proteins of known function. Furthermore, this view has raised the real possibility of detecting functional associations of genes and their corresponding proteins for any entire genome sequence. Yet, despite these exciting developments, there have been relatively few cases of real use of these methods outside the computational biology field, as reflected by citation analysis. These methods have the potential to be used in high-throughput experimental settings in functional genomics and proteomics to validate results with very high accuracy and good coverage. In this critical survey, we provide a comprehensive overview of the 30 most prominent examples of single pairwise protein interaction cases in small-scale studies, where protein interactions have either been detected by gene fusion or yielded additional, corroborating evidence from biochemical observations. Our conclusion is that with the derivation of a validated gold-standard corpus and better data integration with big experiments, gene fusion detection can truly become a valuable tool for large-scale experimental biology.

  12. Electronic Quality of Life Assessment Using Computer-Adaptive Testing

    PubMed Central

    2016-01-01

    Background Quality of life (QoL) questionnaires are desirable for clinical practice but can be time-consuming to administer and interpret, making their widespread adoption difficult. Objective Our aim was to assess the performance of the World Health Organization Quality of Life (WHOQOL)-100 questionnaire as four item banks to facilitate adaptive testing using simulated computer adaptive tests (CATs) for physical, psychological, social, and environmental QoL. Methods We used data from the UK WHOQOL-100 questionnaire (N=320) to calibrate item banks using item response theory, which included psychometric assessments of differential item functioning, local dependency, unidimensionality, and reliability. We simulated CATs to assess the number of items administered before prespecified levels of reliability were met. Results The item banks (40 items) all displayed good model fit (P>.01) and were unidimensional (fewer than 5% of t tests significant), reliable (Person Separation Index>.70), and free from differential item functioning (no significant analysis of variance interaction) or local dependency (residual correlations < +.20). When matched for reliability, the item banks were between 45% and 75% shorter than paper-based WHOQOL measures. Across the four domains, a high standard of reliability (alpha>.90) could be gained with a median of 9 items. Conclusions Using CAT, simulated assessments were as reliable as paper-based forms of the WHOQOL with a fraction of the number of items. These properties suggest that these item banks are suitable for computerized adaptive assessment. These item banks have the potential for international development using existing alternative language versions of the WHOQOL items. PMID:27694100
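
    The adaptive-testing loop the study simulates (administer the most informative remaining item, re-estimate the trait, stop when reliability is reached) can be sketched generically. This is an illustrative toy, not the WHOQOL item banks or the authors' code: it assumes an equal-slope two-parameter logistic model, a made-up 40-item bank, and a simple grid-based Bayesian (EAP) trait estimate.

```python
import numpy as np

A = 2.0  # assumed common item discrimination (equal-slope 2PL); illustrative

def prob(theta, b):
    """Probability of endorsing an item of difficulty b at trait level theta."""
    return 1.0 / (1.0 + np.exp(-A * (theta - b)))

def simulate_cat(true_theta, bank_b, se_target=0.35, seed=0):
    """Administer items adaptively until the posterior SE falls below se_target."""
    rng = np.random.default_rng(seed)
    grid = np.linspace(-4, 4, 161)                 # latent-trait grid
    post = np.full_like(grid, 1.0 / grid.size)     # flat prior
    available = list(range(len(bank_b)))
    administered, theta_hat, se = [], 0.0, np.inf
    while available and se > se_target:
        # pick the unused item with maximum Fisher information at theta_hat
        info = lambda i: A**2 * prob(theta_hat, bank_b[i]) * (1 - prob(theta_hat, bank_b[i]))
        j = max(available, key=info)
        available.remove(j)
        administered.append(j)
        # simulate the respondent's answer, then update the posterior on the grid
        x = rng.random() < prob(true_theta, bank_b[j])
        p = prob(grid, bank_b[j])
        post *= p if x else (1.0 - p)
        post /= post.sum()
        theta_hat = float((grid * post).sum())                       # EAP estimate
        se = float(np.sqrt(((grid - theta_hat) ** 2 * post).sum()))
    return theta_hat, se, administered

bank = np.linspace(-3, 3, 40)       # 40 items spread across difficulty
theta_hat, se, used = simulate_cat(true_theta=1.0, bank_b=bank)
print(f"{len(used)} items, theta={theta_hat:.2f}, SE={se:.2f}")
```

    With well-targeted items, the stopping rule is met long before the full bank is exhausted, which is the mechanism behind the 45%-75% reduction in test length reported above.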

  13. Connecting Assessment and Instruction to Help Students Become More Critical Producers of Multimedia

    ERIC Educational Resources Information Center

    Ostenson, Jonathan William

    2012-01-01

    Classroom teachers have been encouraged to incorporate more multimedia production in the classroom as a means of helping students develop critical media literacy skills. However, they have not always been well trained in how to evaluate the work students create; many teachers struggle to know which criteria to use in assessing student work. This…

  14. Critical Thinking and Formative Assessments: Increasing the Rigor in Your Classroom

    ERIC Educational Resources Information Center

    Moore, Betsy; Stanley, Todd

    2010-01-01

    Develop your students' critical thinking skills and prepare them to perform competitively in the classroom, on state tests, and beyond. In this book, Moore and Stanley show you how to effectively instruct your students to think on higher levels, and how to assess their progress. As states move toward common achievement standards, teachers have…

  15. Assessing Change in Student Critical Thinking for Introduction to Sociology Classes

    ERIC Educational Resources Information Center

    Rickles, Michael L.; Schneider, Rachel Zimmer; Slusser, Suzanne R.; Williams, Dana M.; Zipp, John F.

    2013-01-01

    Although there is widespread agreement among academics that critical thinking is an important component to the college classroom, there is little empirical evidence to verify that it is being taught in courses. Using four sections of introductory sociology, we developed an experimental design using pretests and posttests to assess students'…

  16. Critical Thinking and Political Participation: The Development and Assessment of a Causal Model.

    ERIC Educational Resources Information Center

    Guyton, Edith M.

    An assessment of a four-stage conceptual model reveals that critical thinking has indirect positive effects on political participation through its direct effects on personal control, political efficacy, and democratic attitudes. The model establishes causal relationships among selected personality variables (self-esteem, personal control, and…

  17. Development of Critical Thinking Self-Assessment System Using Wearable Device

    ERIC Educational Resources Information Center

    Gotoh, Yasushi

    2015-01-01

    In this research the author defines critical thinking as skills and dispositions which enable one to solve problems logically and to attempt to reflect autonomously by means of meta-cognitive activities on one's own problem-solving processes. The author focuses on providing meta-cognitive knowledge to help with self-assessment. To develop…

  18. Controversies of Standardized Assessment in School Accountability Reform: A Critical Synthesis of Multidisciplinary Research Evidence

    ERIC Educational Resources Information Center

    Wang, Lihshing; Beckett, Gulbahar H.; Brown, Lionel

    2006-01-01

    Standardized assessment in school systems has been the center of debate for decades. Although the voices of opponents of standardized tests have dominated the public forum, only a handful of scholars and practitioners have argued in defense of standardized tests. This article provides a critical synthesis of the controversial issues on…

  19. 77 FR 68795 - Protected Critical Infrastructure Information (PCII) Office Self-Assessment Questionnaire

    Federal Register 2010, 2011, 2012, 2013, 2014

    2012-11-16

    ...: $0 (This assessment will reside on existing PCII information storage systems). Total Burden Cost... information sharing by owners and operators of critical infrastructure and protected systems. IICD administers... protected systems, which is voluntarily submitted to DHS for homeland security purposes and validated...

  20. A Study on Critical Thinking Assessment System of College English Writing

    ERIC Educational Resources Information Center

    Dong, Tian; Yue, Lu

    2015-01-01

    This research attempts to discuss the validity of introducing the evaluation of students' critical thinking skills (CTS) into the assessment system of college English writing through an empirical study. In this paper, 30 College English Test Band 4 (CET-4) writing samples were collected and analyzed. Students' CTS and the final scores of collected…

  1. Quantitative Computed Tomography and Image Analysis for Advanced Muscle Assessment

    PubMed Central

    Edmunds, Kyle Joseph; Gíslason, Magnus K.; Arnadottir, Iris D.; Marcante, Andrea; Piccione, Francesco; Gargiulo, Paolo

    2016-01-01

    Medical imaging is of particular interest in the field of translational myology, as extant literature describes the utilization of a wide variety of techniques to non-invasively recapitulate and quantify various internal and external tissue morphologies. In the clinical context, medical imaging remains a vital tool for diagnostics and investigative assessment. This review outlines the results from several investigations on the use of computed tomography (CT) and image analysis techniques to assess muscle conditions and degenerative processes due to aging or pathological conditions. Herein, we detail the acquisition of spiral CT images and the use of advanced image analysis tools to characterize muscles in 2D and 3D. Results from these studies recapitulate changes in tissue composition within muscles, as visualized by the association of tissue types to specified Hounsfield Unit (HU) values for fat, loose connective tissue or atrophic muscle, and normal muscle, including fascia and tendon. We show how results from these analyses can be presented as both average HU values and compositions with respect to total muscle volumes, demonstrating the reliability of these tools to monitor, assess and characterize muscle degeneration. PMID:27478562

  2. Assessing the delivery of patient critical laboratory results to primary care providers.

    PubMed

    Montes, Angelica; Francis, Michelle; Ciulla, Anna P

    2014-01-01

    Approximately 60% to 70% of all health care decisions are based on laboratory test results; therefore, it is important to ensure that patient laboratory results are communicated to the physician in a timely fashion. The objective of this study was to assess the delivery of critical laboratory results in outpatient physician offices in Delaware. Contact information for physician offices was obtained using the Highmark Blue Cross Blue Shield physician provider directory. A survey was created using a series of questions regarding the procurement and timely communication of critical laboratory results. Of the offices surveyed, 61.4% indicated that they did not utilize a standard operating procedure specifying who is able to receive the critical laboratory test results and how they should be delivered to the physician. These findings indicate that a change may be necessary to improve the way that critical test results are managed by physician offices. PMID:25219070

  3. Collaborative mobile sensing and computing for civil infrastructure condition assessment: framework and applications

    NASA Astrophysics Data System (ADS)

    Chen, Jianfei; Chen, ZhiQiang

    2012-04-01

    Multi-function sensing and imaging devices, GPS, communication and computing devices are being ubiquitously used in the field by engineers in civil engineering and emergency response practice. Field engineers, however, still have difficulty balancing ever-increasing data-collection demands against the capacity for real-time data processing and knowledge sharing. In addition, field engineers usually work collaboratively over a geospatially large area; however, the existing sensing and computing modalities used in the field are not designed to accommodate this condition. In this paper, we present a solution framework of collaborative mobile sensing and computing (CMSC) for civil infrastructure condition assessment, with Android-based mobile devices as the basic nodes in the framework and a potential for adding other auxiliary imaging and sensing devices into the network. Difficulties in the mixed C++ and Java programming that is critical to realizing the framework are discussed. With a few prototypes illustrated in this paper, we envisage that the proposed CMSC framework will enable seamless integration of sensing, imaging, real-time processing and knowledge discovery in future engineer-centered field reconnaissance and civil infrastructure condition assessment.

  4. An assessment of criticality safety at the Department of Energy Rocky Flats Plant, Golden, Colorado, July--September 1989

    SciTech Connect

    Mattson, Roger J.

    1989-09-01

    This is a report on the 1989 independent Criticality Safety Assessment of the Rocky Flats Plant, primarily in response to public concerns that nuclear criticality accidents involving plutonium may have occurred at this nuclear weapon component fabrication and processing plant. The report evaluates environmental issues, fissile material storage practices, ventilation system problem areas, and criticality safety practices. While no evidence of a criticality accident was found, several recommendations are made for criticality safety improvements. 9 tabs.

  5. Pain assessment in the critically ill adult: Recent evidence and new trends.

    PubMed

    Gélinas, Céline

    2016-06-01

    Pain assessment in the critically ill adult remains a daily clinical challenge. Position statements and practice guidelines exist to guide the ICU care team in the pain assessment process. The patient's self-report of pain remains the gold standard measure for pain and should be obtained as often as possible. When self-report is impossible to obtain, observational pain scales including the Behavioural Pain Scale (BPS) and the Critical-Care Pain Observation Tool (CPOT) have been recommended for clinical use in the critically ill adult. However, their adaptation and validation in brain-injured and burn ICU patients is required. Family caregivers may help in the identification of pain-related behaviours and should be more involved in the ICU pain assessment process. Fluctuations in vital signs should only be considered as cues for further assessment of pain with appropriate tools, and may better represent adverse events of severe pain. Other physiologic measures of pain should be explored in the ICU, and pupillometry appears as a promising technique to further study. Implementation of systematic pain assessment approaches using tools adapted to the patient's ability to communicate and condition has shown positive effects on ICU pain practices and patient outcomes, but randomised control trials are needed to confirm these conclusions. PMID:27067745

  6. Computer-based assessment for facioscapulohumeral dystrophy diagnosis.

    PubMed

    Chambers, O; Milenković, J; Pražnikar, A; Tasič, J F

    2015-06-01

    The paper presents a computer-based assessment for facioscapulohumeral dystrophy (FSHD) diagnosis through characterisation of the fat and oedema percentages in the muscle region. A novel multi-slice method for muscle-region segmentation in T1-weighted magnetic resonance images is proposed, using principles of the live-wire technique to find the path representing the muscle-region border. For this purpose, an exponential cost function is used that incorporates the edge information obtained after applying an edge-enhancement algorithm originally designed for fingerprint enhancement. The difference between the automatic segmentation and manual segmentation performed by a medical specialist is characterised using the Zijdenbos similarity index, indicating a high accuracy of the proposed method. Finally, the fat and oedema are quantified from the muscle region in the T1-weighted and T2-STIR magnetic resonance images, respectively, using the fuzzy c-means clustering approach for 10 FSHD patients. PMID:25910520
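
    The fuzzy c-means step used above to quantify fat and oedema can be sketched in a few lines. This is a generic textbook re-implementation on synthetic 1-D intensities with two assumed tissue classes, not the authors' code or their MRI data:

```python
import numpy as np

def fuzzy_cmeans(x, c=2, m=2.0, iters=50, seed=0):
    """Fuzzy c-means on 1-D intensities: soft memberships u and class centers."""
    rng = np.random.default_rng(seed)
    u = rng.random((len(x), c))
    u /= u.sum(axis=1, keepdims=True)              # memberships sum to 1
    for _ in range(iters):
        w = u ** m                                 # fuzzified memberships
        centers = (w * x[:, None]).sum(0) / w.sum(0)
        d = np.abs(x[:, None] - centers[None, :]) + 1e-12
        inv = d ** (-2.0 / (m - 1.0))              # standard FCM update
        u = inv / inv.sum(axis=1, keepdims=True)
    return centers, u

# synthetic intensities from two populations, e.g. "muscle" vs "fat" voxels
x = np.concatenate([np.random.default_rng(1).normal(100, 10, 300),
                    np.random.default_rng(2).normal(200, 10, 300)])
centers, u = fuzzy_cmeans(x)
print(np.sort(centers).round(0))
```

    Each voxel keeps a graded membership in every class, which is what makes the method suitable for estimating fat and oedema *percentages* rather than hard labels.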

  7. Computer Aided Safety Assessment (CASA) Tool for ISS Payloads

    NASA Astrophysics Data System (ADS)

    Hochstein, Jason; Festa, Fabrizio

    2010-09-01

    In an effort to streamline the processes established by the partners of the International Space Station (ISS) to certify the safety of hardware and experiments destined for the Station, the European Space Agency’s (ESA) ISS System Safety Team is developing the Computer Aided Safety Assessment (CASA) tool suite. These software tools guide payload developers through the creation process of two types of standard payload hazard reports via a series of questions following a predetermined logic. The responses provided by the user are used by the CASA system to complete the majority of each hazard report requisite for payload flight safety reviews, employing consistent, approved descriptions of most hazards, hazard causes, controls and verification methods. Though some manual inputs will still be required to complete these reports, working with CASA will considerably reduce the amount of time necessary to review the documentation by agency safety authorities.

  8. Japanese technology assessment: Computer science, opto- and microelectronics, mechatronics, biotechnology

    SciTech Connect

    Brandin, D.; Wieder, H.; Spicer, W.; Nevins, J.; Oxender, D.

    1986-01-01

    The series studies Japanese research and development in four high-technology areas - computer science, opto- and microelectronics, mechatronics (a term created by the Japanese to describe the union of mechanical and electronic engineering to produce the next generation of machines, robots, and the like), and biotechnology. The evaluations were conducted by panels of U.S. scientists - chosen from academia, government, and industry - actively involved in research in areas of expertise. The studies were prepared for the purpose of aiding the U.S. response to Japan's technological challenge. The main focus of the assessments is on the current status and long-term direction and emphasis of Japanese research and development. Other aspects covered include evolution of the state of the art; identification of Japanese researchers, R and D organizations, and resources; and comparative U.S. efforts. The general time frame of the studies corresponds to future industrial applications and potential commercial impacts spanning approximately the next two decades.

  9. New Dental Accreditation Standard on Critical Thinking: A Call for Learning Models, Outcomes, Assessments.

    PubMed

    Johnsen, David C; Williams, John N; Baughman, Pauletta Gay; Roesch, Darren M; Feldman, Cecile A

    2015-10-01

    This opinion article applauds the recent introduction of a new dental accreditation standard addressing critical thinking and problem-solving, but expresses a need for additional means for dental schools to demonstrate they are meeting the new standard because articulated outcomes, learning models, and assessments of competence are still being developed. Validated, research-based learning models are needed to define reference points against which schools can design and assess the education they provide to their students. This article presents one possible learning model for this purpose and calls for national experts from within and outside dental education to develop models that will help schools define outcomes and assess performance in educating their students to become practitioners who are effective critical thinkers and problem-solvers.

  10. Approaches for the computationally efficient assessment of the plug-in HEV impact on the grid

    NASA Astrophysics Data System (ADS)

    Lee, Tae-Kyung; Filipi, Zoran S.

    2012-11-01

    Realistic duty cycles are critical for design and assessment of hybrid propulsion systems, in particular, plug-in hybrid electric vehicles. The analysis of the PHEV impact requires a large amount of data about daily missions for ensuring realism in predicted temporal loads on the grid. This paper presents two approaches for the reduction of the computational effort while assessing the large scale PHEV impact on the grid, namely 1) "response surface modelling" approach; and 2) "daily driving schedule modelling" approach. The response surface modelling approach replaces the time-consuming vehicle simulations by response surfaces constructed off-line with the consideration of the real-world driving. The daily driving modelling approach establishes a correlation between departure and arrival times, and it predicts representative driving patterns with a significantly reduced number of simulation cases. In both cases, representative synthetic driving cycles are used to capture the naturalistic driving characteristics for a given trip length. The proposed approaches enable construction of 24-hour missions, assessments of charging requirements at the time of plugging-in, and temporal distributions of the load on the grid with high computational efficiency.
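
    The "response surface modelling" idea above (replace expensive per-trip vehicle simulations with a cheap fitted surrogate) can be sketched with an ordinary least-squares quadratic surface. Everything here is illustrative: the inputs (trip length, initial battery state of charge), the synthetic "simulation" data, and the coefficients are assumptions, not the paper's model.

```python
import numpy as np

rng = np.random.default_rng(7)
# hypothetical training data standing in for detailed vehicle simulations:
# trip length (km) and initial battery SOC -> grid energy drawn at plug-in (kWh)
trip_km = rng.uniform(5, 80, 200)
soc0 = rng.uniform(0.3, 1.0, 200)
energy = 0.18 * trip_km * (1.1 - soc0) + rng.normal(0, 0.3, 200)

# quadratic response surface fitted once, off-line; evaluating it replaces
# a full simulation for each of the thousands of daily missions
X = np.column_stack([np.ones_like(trip_km), trip_km, soc0,
                     trip_km**2, soc0**2, trip_km * soc0])
coef, *_ = np.linalg.lstsq(X, energy, rcond=None)

def predict(km, soc):
    """Cheap surrogate evaluation of charging demand for one trip."""
    x = np.array([1.0, km, soc, km**2, soc**2, km * soc])
    return float(x @ coef)

print(round(predict(40.0, 0.5), 2))
```

    Once fitted, the surface is evaluated in microseconds, which is what makes large-scale 24-hour grid-load aggregation computationally tractable.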

  11. Cyst-based measurements for assessing lymphangioleiomyomatosis in computed tomography

    SciTech Connect

    Lo, P. Brown, M. S.; Kim, H.; Kim, H.; Goldin, J. G.; Argula, R.; Strange, C.

    2015-05-15

    Purpose: To investigate the efficacy of a new family of measurements made on individual pulmonary cysts extracted from computed tomography (CT) for assessing the severity of lymphangioleiomyomatosis (LAM). Methods: CT images were analyzed using thresholding to identify a cystic region of interest from chest CT of LAM patients. Individual cysts were then extracted from the cystic region by the watershed algorithm, which separates individual cysts based on subtle edges within the cystic regions. A family of measurements were then computed, which quantify the amount, distribution, and boundary appearance of the cysts. Sequential floating feature selection was used to select a small subset of features for quantification of the severity of LAM. Adjusted R² from multiple linear regression and R² from linear regression against measurements from spirometry were used to compare the performance of our proposed measurements with currently used density based CT measurements in the literature, namely, the relative area measure and the D measure. Results: Volumetric CT data, performed at total lung capacity and residual volume, from a total of 49 subjects enrolled in the MILES trial were used in our study. Our proposed measures had adjusted R² ranging from 0.42 to 0.59 when regressing against the spirometry measures, with p < 0.05. For previously used density based CT measurements in the literature, the best R² was 0.46 (for only one instance), with the majority being lower than 0.3 or p > 0.05. Conclusions: The proposed family of CT-based cyst measurements have better correlation with spirometric measures than previously used density based CT measurements. They show potential as a sensitive tool for quantitatively assessing the severity of LAM.
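
    The density-based baseline the authors compare against, the relative area measure, is simply the fraction of lung area below a Hounsfield Unit threshold. A minimal sketch on a synthetic slice; the HU values, cyst positions, and the -900 HU cutoff are all illustrative assumptions, not values from the study:

```python
import numpy as np

rng = np.random.default_rng(42)
# synthetic lung CT slice in Hounsfield Units: parenchyma around -700 HU,
# with a few air-filled "cysts" near -980 HU (purely illustrative values)
hu = rng.normal(-700, 60, size=(128, 128))
for cx, cy, r in [(32, 40, 8), (80, 90, 12), (100, 30, 6)]:
    yy, xx = np.ogrid[:128, :128]
    hu[(xx - cx) ** 2 + (yy - cy) ** 2 <= r ** 2] = rng.normal(-980, 10)

threshold = -900                          # assumed cyst/air cutoff in HU
cyst_mask = hu < threshold
relative_area = cyst_mask.mean() * 100    # % of lung area below threshold
print(round(relative_area, 1))
```

    The paper's contribution is to go beyond this single global percentage by splitting `cyst_mask` into individual cysts (via watershed) and measuring their number, sizes, and boundaries.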

  12. Assessing public perceptions of computer-based models.

    PubMed

    Cockerill, Kristan; Tidwell, Vincent; Passell, Howard

    2004-11-01

    Although there is a solid body of research on both collaborative decision-making and on processes using models, there is little research on general public attitudes about models and their use in making policy decisions. This project assessed opinions about computer models in general and attitudes about a specific model being used in water planning in the Middle Rio Grande Region of New Mexico, United States. More than 1000 individuals were surveyed about their perceptions of computer-based models in general. Additionally, more than 150 attendees at public meetings related to the Middle Rio Grande planning effort were surveyed about their perceptions of the specific Rio Grande-based model. The results reveal that the majority of respondents are confident in their ability to understand models and most believe that models are appropriate tools for education and for making policy decisions. Responses also reveal that trust in who develops a model is a key issue related to public support. Regarding the specific model highlighted in this project, the public revealed tremendous support for its usefulness as a public engagement tool as well as a tool to assist decision-makers in regional water planning. Although indicating broad support for models, the results do raise questions about the role of trust in using models in contentious decisions. PMID:15633034

  13. RESRAD-ECORISK: A computer code for ecological risk assessment

    SciTech Connect

    Cheng, J.J.

    1995-12-01

    RESRAD-ECORISK is a PC-based computer code developed by Argonne National Laboratory (ANL) to estimate risks from exposure of ecological receptors at sites contaminated with potentially hazardous chemicals. The code is based on and is consistent with the methodologies of RESRAD-CHEM, an ANL-developed computer code for assessments of human health risk. RESRAD-ECORISK uses environmental fate and transport models to estimate contaminant concentrations in environmental media from an initial contaminated soil source and food-web uptake models to estimate contaminant doses to ecological receptors. The dose estimates are then used to estimate a risk for the ecological receptor and to calculate preliminary soil guidelines for reducing risks to acceptable levels. Specifically, RESRAD-ECORISK calculates (1) a species-specific applied daily dose for each contaminant (using species-specific life history information and site-specific environmental media concentrations), (2) an ecological hazard quotient (EHQ) for each contaminant and species, and (3) preliminary soil cleanup criteria for each contaminant and receptor. RESRAD-ECORISK incorporates a user-friendly menu-driven interface, databases and default values for a variety of ecological and chemical parameters, and on-line help for easy operation. The code is sufficiently flexible to simulate different contaminated sites and incorporate site-specific ecological data.
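
    The dose-to-quotient chain described above can be sketched generically. This is a hedged illustration of the standard ecological hazard quotient arithmetic, with hypothetical parameter names and values; it is not RESRAD-ECORISK's actual models or databases:

```python
def daily_dose(conc_soil, intake_soil, conc_food, intake_food, body_weight):
    """Applied daily dose (mg per kg body weight per day) from soil and food
    ingestion; a real food-web model would sum many more pathways."""
    return (conc_soil * intake_soil + conc_food * intake_food) / body_weight

def ehq(dose, trv):
    """Ecological hazard quotient: applied dose over a toxicity reference value."""
    return dose / trv

# hypothetical receptor: soil at 120 mg/kg, small mammal eating 2 g soil
# and 50 g food per day at 15 mg/kg contaminant, body weight 0.35 kg
dose = daily_dose(conc_soil=120.0, intake_soil=0.002,
                  conc_food=15.0, intake_food=0.05, body_weight=0.35)
q = ehq(dose, trv=5.0)
# if EHQ exceeds 1, scale the soil guideline down proportionally
soil_guideline = 120.0 / q if q > 1 else 120.0
print(round(q, 2))
```

    An EHQ below 1 means the estimated dose is under the toxicity reference value, so in this toy case the existing soil concentration already meets the preliminary cleanup criterion.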

  14. Color calculations for and perceptual assessment of computer graphic images

    SciTech Connect

    Meyer, G.W.

    1986-01-01

    Realistic image synthesis involves the modelling of an environment in accordance with the laws of physics and the production of a final simulation that is perceptually acceptable. To be considered a scientific endeavor, synthetic image generation should also include the final step of experimental verification. This thesis concentrates on the color calculations that are inherent in the production of the final simulation and on the perceptual assessment of the computer graphic images that result. The fundamental spectral sensitivity functions that are active in the human visual system are introduced and are used to address color-blindness issues in computer graphics. A digitally controlled color television monitor is employed to successfully implement both the Farnsworth-Munsell 100 hue test and a new color vision test that yields more accurate diagnoses. Images that simulate color-blind vision are synthesized and are used to evaluate color scales for data display. Gaussian quadrature is used with a set of opponent fundamentals to select the wavelengths at which to perform synthetic image generation.
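
    The Gaussian-quadrature idea, picking a few wavelengths whose weighted samples integrate a spectrum accurately, can be shown with plain Gauss-Legendre nodes mapped onto the visible range. Note the thesis uses quadrature derived from opponent fundamentals as weight functions; the unweighted Legendre version below is a simplification for illustration:

```python
import numpy as np

n = 4                                                  # number of spectral samples
nodes, weights = np.polynomial.legendre.leggauss(n)    # nodes/weights on [-1, 1]
lo, hi = 380.0, 780.0                                  # visible range in nm
wavelengths = 0.5 * (hi - lo) * nodes + 0.5 * (hi + lo)
quad_weights = 0.5 * (hi - lo) * weights

# with n nodes, any polynomial spectrum up to degree 2n-1 integrates exactly;
# check against the closed-form integral of lambda^2 over [380, 780]
approx = float(np.sum(quad_weights * wavelengths**2))
exact = (hi**3 - lo**3) / 3.0
print(abs(approx - exact) < 1e-6 * exact)
```

    Four well-chosen wavelengths thus replace a dense spectral sweep, which is why quadrature-based sampling keeps spectral rendering affordable.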

  15. Prediction of critical illness in elderly outpatients using elder risk assessment: a population-based study

    PubMed Central

    Biehl, Michelle; Takahashi, Paul Y; Cha, Stephen S; Chaudhry, Rajeev; Gajic, Ognjen; Thorsteinsdottir, Bjorg

    2016-01-01

    Rationale Identifying patients at high risk of critical illness is necessary for the development and testing of strategies to prevent critical illness. The aim of this study was to determine the relationship between high elder risk assessment (ERA) score and critical illness requiring intensive care and to see if the ERA can be used as a prediction tool to identify elderly patients at the primary care visit who are at high risk of critical illness. Methods A population-based historical cohort study was conducted in elderly patients (age >65 years) identified at the time of primary care visit in Rochester, MN, USA. Predictors including age, previous hospital days, and comorbid health conditions were identified from routine administrative data available in the electronic medical record. The main outcome was critical illness, defined as sepsis, need for mechanical ventilation, or death within 2 years of initial visit. Patients with an ERA score of ≥16 were considered to be at high risk. The discrimination of the ERA score was assessed using area under the receiver operating characteristic curve. Results Of the 13,457 eligible patients, 9,872 gave consent for medical record review and had full information on intensive care unit utilization. The mean age was 75.8 years (standard deviation ±7.6 years), and 58% were female, 94% were Caucasian, 62% were married, and 13% were living in nursing homes. In the overall group, 417 patients (4.2%) suffered from critical illness. In the 1,134 patients with ERA >16, 154 (14%) suffered from critical illness. An ERA score ≥16 predicted critical illness (odds ratio 6.35; 95% confidence interval 3.51–11.48). The area under the receiver operating characteristic curve was 0.75, which indicated good discrimination. Conclusion A simple model based on easily obtainable administrative data predicted critical illness in the next 2 years in elderly outpatients, with up to 14% of the highest risk population suffering from critical illness.
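
    The discrimination statistic reported above (area under the ROC curve, 0.75) has a simple rank interpretation: the probability that a randomly chosen case scores higher than a randomly chosen control. A minimal sketch with made-up ERA-like scores, not the study's data:

```python
def auc(case_scores, control_scores):
    """Area under the ROC curve via the rank (Mann-Whitney) formulation:
    count case-over-control wins, with ties worth half a win."""
    wins = sum((c > n) + 0.5 * (c == n)
               for c in case_scores for n in control_scores)
    return wins / (len(case_scores) * len(control_scores))

# illustrative scores: higher means higher predicted risk
cases = [18, 20, 16, 12]       # patients who developed critical illness
controls = [5, 8, 16, 10, 3]   # patients who did not
print(auc(cases, controls))    # → 0.925
```

    An AUC of 0.5 is chance-level ranking; the study's 0.75 means the ERA score ranks a true future case above a non-case three times out of four.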

  17. Computational Performance Assessment of k-mer Counting Algorithms.

    PubMed

    Pérez, Nelson; Gutierrez, Miguel; Vera, Nelson

    2016-04-01

    This article is about the assessment of several tools for k-mer counting, with the purpose of creating a reference framework that helps bioinformatics researchers identify the computational requirements, parallelization, advantages, disadvantages, and bottlenecks of each of the algorithms proposed in the tools. The k-mer counters evaluated in this article were BFCounter, DSK, Jellyfish, KAnalyze, KHMer, KMC2, MSPKmerCounter, Tallymer, and Turtle. The measured parameters were occupied RAM space, processing time, parallelization, and read and write disk access. A dataset of 36,504,800 reads corresponding to human chromosome 14 was used. The assessment was performed for two k-mer lengths: 31 and 55. The results were as follows: pure Bloom-filter-based tools and disk-partitioning techniques showed lower RAM use, and the tools that took the least execution time were the ones that used disk-partitioning techniques. The greatest parallelization was achieved by the tools that used disk partitioning, hash tables with a lock-free approach, or multiple hash tables. PMID:26982880
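    For context on the task being benchmarked: a hash-table k-mer counter is conceptually only a few lines of code; the tools above differ in how they scale this up (Bloom filters, disk partitioning, lock-free or multiple hash tables). A minimal Python sketch:

```python
from collections import Counter

def count_kmers(seq: str, k: int) -> Counter:
    """Count every overlapping k-mer in a sequence with a hash table."""
    return Counter(seq[i:i + k] for i in range(len(seq) - k + 1))

counts = count_kmers("ACGTACGTAC", 3)
print(counts["ACG"])         # 2
print(sum(counts.values()))  # 8 windows in a length-10 sequence
```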

  18. Computational fluid dynamics framework for aerodynamic model assessment

    NASA Astrophysics Data System (ADS)

    Vallespin, D.; Badcock, K. J.; Da Ronch, A.; White, M. D.; Perfect, P.; Ghoreyshi, M.

    2012-07-01

    This paper reviews the work carried out at the University of Liverpool to assess the use of CFD methods for aircraft flight dynamics applications. Three test cases are discussed in the paper, namely, the Standard Dynamic Model, the Ranger 2000 jet trainer and the Stability and Control Unmanned Combat Air Vehicle. For each of these, a tabular aerodynamic model based on CFD predictions is generated along with validation against wind tunnel experiments and flight test measurements. The main purpose of the paper is to assess the validity of the tables of aerodynamic data for the force and moment prediction of realistic aircraft manoeuvres. This is done by generating a manoeuvre based on the tables of aerodynamic data, and then replaying the motion through a time-accurate computational fluid dynamics calculation. The resulting forces and moments from these simulations were compared with predictions from the tables. As the latter are based on a set of steady-state predictions, the comparisons showed perfect agreement for slow manoeuvres. As manoeuvres became more aggressive, some disagreement was seen, particularly during periods of large rates of change in attitude. Finally, the Ranger 2000 model was used on a flight simulator.
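    The tabular model being validated is, at its core, interpolation over steady-state CFD samples, which is why slow manoeuvres replay almost exactly and rapid ones do not. A toy sketch with invented lift-coefficient values (not data from the paper):

```python
def interp_table(x, xs, ys):
    """Piecewise-linear lookup, as used in tabular aerodynamic models."""
    for (x0, y0), (x1, y1) in zip(zip(xs, ys), zip(xs[1:], ys[1:])):
        if x0 <= x <= x1:
            return y0 + (y1 - y0) * (x - x0) / (x1 - x0)
    raise ValueError("angle of attack outside table range")

alpha = [0.0, 4.0, 8.0, 12.0]        # angle of attack, deg (hypothetical)
cl    = [0.10, 0.50, 0.90, 1.10]     # steady-state lift coefficients
print(interp_table(6.0, alpha, cl))  # ~0.7, halfway between table entries
```

A time-accurate replay departs from this table whenever the instantaneous flow depends on the motion history, which the steady-state samples cannot capture.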

  19. Computer-aided assessment of scoliosis on posteroanterior radiographs.

    PubMed

    Zhang, Junhua; Lou, Edmond; Hill, Douglas L; Raso, James V; Wang, Yuanyuan; Le, Lawrence H; Shi, Xinling

    2010-02-01

    In order to reduce observer variability in radiographic scoliosis assessment, a computer-aided system was developed. The system semi-automatically measured the Cobb angle and vertebral rotation on posteroanterior radiographs based on the Hough transform and a snake model, respectively. Both algorithms were integrated with shape priors to improve performance. The system was tested twice by each of three observers. The intraobserver and interobserver reliability analyses yielded intraclass correlation coefficients higher than 0.9 and 0.8 for the Cobb measurement on 70 radiographs and the rotation measurement on 156 vertebrae, respectively. Both the Cobb and rotation measurements resulted in average intraobserver and interobserver errors of less than 2 degrees and 3 degrees, respectively. There were no significant differences in measurement variability between groups of curve location, curve magnitude, observer experience, and vertebra location. Compared with the documented results, measurement variability was reduced by using the developed system. This system can help orthopedic surgeons assess scoliosis more reliably.
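    For background, the Cobb angle is the angle between the endplates of the two end vertebrae of a curve, which reduces to comparing two line slopes once the endplate lines are traced. A minimal sketch (the slopes below are hypothetical):

```python
import math

def cobb_angle(slope_upper, slope_lower):
    """Cobb angle in degrees between two endplate lines given their slopes."""
    return abs(math.degrees(math.atan(slope_upper) - math.atan(slope_lower)))

# Hypothetical endplate slopes traced from a posteroanterior radiograph
print(round(cobb_angle(0.35, -0.20), 1))
```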

  20. Assessing Critical Thinking Outcomes of Dental Hygiene Students Utilizing Virtual Patient Simulation: A Mixed Methods Study.

    PubMed

    Allaire, Joanna L

    2015-09-01

    Dental hygiene educators must determine which educational practices best promote critical thinking, a quality necessary to translate knowledge into sound clinical decision making. The aim of this small pilot study was to determine whether virtual patient simulation had an effect on the critical thinking of dental hygiene students. A pretest-posttest design using the Health Science Reasoning Test was used to evaluate the critical thinking skills of senior dental hygiene students at The University of Texas School of Dentistry at Houston Dental Hygiene Program before and after their experience with computer-based patient simulation cases. Additional survey questions sought to identify the students' perceptions of whether the experience had helped develop their critical thinking skills and improved their ability to provide competent patient care. A convenience sample of 31 senior dental hygiene students completed both the pretest and posttest (81.5% of total students in that class); 30 senior dental hygiene students completed the survey on perceptions of the simulation (78.9% response rate). Although the results did not show a significant increase in mean scores, the students reported feeling that the use of virtual patients was an effective teaching method to promote critical thinking, problem-solving, and confidence in the clinical realm. The results of this pilot study may support the use of virtual patient simulations in dental hygiene education. Future research could include a larger controlled study to validate findings from this study.

  1. Life cycle assessment study of a Chinese desktop personal computer.

    PubMed

    Duan, Huabo; Eugster, Martin; Hischier, Roland; Streicher-Porte, Martin; Li, Jinhui

    2009-02-15

    Associated with the tremendous prosperity in the world electronic information and telecommunication industry, there continues to be an increasing awareness of the environmental impacts related to the accelerating mass production, electricity use, and waste management of electronic and electric products (e-products). China's importance as both a consumer and supplier of e-products has grown at an unprecedented pace in the recent decade. Hence, this paper aims to describe the application of life cycle assessment (LCA) to investigate the environmental performance of Chinese e-products from a global level. A desktop personal computer system has been selected to carry out a detailed and modular LCA which follows the ISO 14040 series. The LCA is constructed with SimaPro software version 7.0 and expressed with the Eco-indicator'99 life cycle impact assessment method. For a sensitivity analysis of the overall LCA results, the so-called CML method is used in order to estimate the influence of the choice of the assessment method on the result. Life cycle inventory information is compiled from the ecoinvent 1.3 databases, combined with literature and field investigations on the present Chinese situation. The established LCA study shows that the manufacturing and the use of such devices are of the highest environmental importance. In the manufacturing of such devices, the integrated circuits (ICs) and the Liquid Crystal Display (LCD) are the parts contributing most to the impact. As no other aspects are taken into account during the use phase, the impact there is due to the way the electricity is produced. The final process steps, i.e. the end-of-life phase, lead to a clear environmental benefit if a formal and modern, up-to-date technical system is assumed, as in this study. PMID:19070352

  2. Computer-aided assessment of diagnostic images for epidemiological research

    PubMed Central

    2009-01-01

    Background Diagnostic images are often assessed for clinical outcomes using subjective methods, which are limited by the skill of the reviewer. Computer-aided diagnosis (CAD) algorithms that assist reviewers in their decisions concerning outcomes have been developed to increase sensitivity and specificity in the clinical setting. However, these systems have not been well utilized in research settings to improve the measurement of clinical endpoints. Reductions in bias through their use could have important implications for etiologic research. Methods Using the example of cortical cataract detection, we developed an algorithm for assisting a reviewer in evaluating digital images for the presence and severity of lesions. Available image processing and statistical methods that were easily implementable were used as the basis for the CAD algorithm. The performance of the system was compared to the subjective assessment of five reviewers using 60 simulated images. Cortical cataract severity scores from 0 to 16 were assigned to the images by the reviewers and the CAD system, with each image assessed twice to obtain a measure of variability. Image characteristics that affected reviewer bias were also assessed by systematically varying the appearance of the simulated images. Results The algorithm yielded severity scores with smaller bias on images where cataract severity was mild to moderate (approximately ≤ 6/16ths). On high severity images, the bias of the CAD system exceeded that of the reviewers. The variability of the CAD system was zero on repeated images but ranged from 0.48 to 1.22 for the reviewers. The direction and magnitude of the bias exhibited by the reviewers was a function of the number of cataract opacities, the shape and the contrast of the lesions in the simulated images. Conclusion CAD systems are feasible to implement with available software and can be valuable when medical images contain exposure or outcome information for etiologic research. Our

  3. Planning the Unplanned Experiment: Towards Assessing the Efficacy of Standards for Safety-Critical Software

    NASA Technical Reports Server (NTRS)

    Graydon, Patrick J.; Holloway, C. M.

    2015-01-01

    Safe use of software in safety-critical applications requires well-founded means of determining whether software is fit for such use. While software in industries such as aviation has a good safety record, little is known about whether standards for software in safety-critical applications 'work' (or even what that means). It is often (implicitly) argued that software is fit for safety-critical use because it conforms to an appropriate standard. Without knowing whether a standard works, such reliance is an experiment; without carefully collecting assessment data, that experiment is unplanned. To help plan the experiment, we organized a workshop to develop practical ideas for assessing software safety standards. In this paper, we relate and elaborate on the workshop discussion, which revealed subtle but important study design considerations and practical barriers to collecting appropriate historical data and recruiting appropriate experimental subjects. We discuss assessing standards as written and as applied, several candidate definitions for what it means for a standard to 'work,' and key assessment strategies and study techniques and the pros and cons of each. Finally, we conclude with thoughts about the kinds of research that will be required and how academia, industry, and regulators might collaborate to overcome the noted barriers.

  4. The Computing Alliance of Hispanic-Serving Institutions: Supporting Hispanics at Critical Transition Points

    ERIC Educational Resources Information Center

    Gates, Ann Quiroz; Hug, Sarah; Thiry, Heather; Alo, Richard; Beheshti, Mohsen; Fernandez, John; Rodriguez, Nestor; Adjouadi, Malek

    2011-01-01

    Hispanics have the highest growth rates among all groups in the U.S., yet they remain considerably underrepresented in computing careers and in the numbers who obtain advanced degrees. Hispanics constituted about 7% of undergraduate computer science and computer engineering graduates and 1% of doctoral graduates in 2007-2008. The small number of…

  5. Reshaping Computer Literacy Teaching in Higher Education: Identification of Critical Success Factors

    ERIC Educational Resources Information Center

    Taylor, Estelle; Goede, Roelien; Steyn, Tjaart

    2011-01-01

    Purpose: Acquiring computer skills is more important today than ever before, especially in a developing country. Teaching of computer skills, however, has to adapt to new technology. This paper aims to model factors influencing the success of the learning of computer literacy by means of an e-learning environment. The research question for this…

  6. Computer Science: A Historical Perspective and a Current Assessment

    NASA Astrophysics Data System (ADS)

    Wirth, Niklaus

    We begin with a brief review of the early years of Computer Science. This period was dominated by large, remote computers and the struggle to master the complex problems of programming. The remedy was found in programming languages providing suitable abstractions and programming models. Outstanding was the language Algol 60, designed by an international committee, and intended as a publication language for algorithms. The early period ends with the advent of the microcomputer in the mid 1970s, bringing computing into homes and schools. The outstanding computer was the Alto, the first personal computer with substantial computing power. It changed the world of computing.

  7. The reliability of the intermittent critical velocity test and assessment of critical rest interval in men and women.

    PubMed

    Fukuda, David H; Smith, Abbie E; Kendall, Kristina L; Hetrick, Robert P; Hames, Ryan L; Cramer, Joel T; Stout, Jeffrey R

    2012-04-01

    The purpose of this study was to examine the reliability of the intermittent critical velocity (ICV) test and assess critical rest interval (CRI) during repeated-sprint exercise. The ICV test is used to examine the linear relationship between total distance and time-to-exhaustion during interval exercise, yielding a repeatable, moderate-intensity parameter (ICV), a high-intensity exhaustive parameter (W'), and CRI. CRI is the theoretical rest period needed to complete a series of repeated bouts of exercise without fatigue. Twenty-four healthy college-aged men (mean ± SD; age 22.7 ± 2.9 years; weight 85.8 ± 15.3 kg; VO(2max) 50.7 ± 8.8 ml/kg/min) and women (mean ± SD; age 21.4 ± 2.3 years; weight 58.9 ± 5.2 kg; VO(2max) 46.4 ± 4.4 ml/kg/min) completed two ICV tests (T1 and T2), using 10 s repeated sprints to exhaustion during separate sessions of treadmill running. Linear regression was used to determine ICV and W', while CRI was calculated using the relationship between the number of intervals completed and a variant of ICV. Intraclass correlation coefficients (ICCs) for ICV, W', and CRI were 0.89 (T1 4.42 ± 0.55 m/s; T2 4.34 ± 0.67 m/s), 0.80 (T1 125.6 ± 62.7 m; T2 144.6 ± 65.4 m), and 0.59 (T1 23.9 ± 2.0 s; T2 24.5 ± 2.6 s), respectively. These moderate to high ICC values indicate reliable measurements between ICV trials. Additionally, the evaluation of CRI demonstrated the attainment of a steady-state heart rate (94% of maximum) during a separate session of repeated supramaximal treadmill sprints. The ICV test during treadmill running provides reliable ICV and W' measures, as well as an estimated recovery time via CRI for men and women.
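    The ICV and W' parameters are the slope and intercept of a least-squares fit of total distance against time-to-exhaustion (distance = W' + ICV × time). A sketch with synthetic trials generated from values near the reported means (not the study's data):

```python
def least_squares(ts, ds):
    """Fit d = intercept + slope * t; returns (slope, intercept)."""
    n = len(ts)
    mt, md = sum(ts) / n, sum(ds) / n
    slope = (sum((t - mt) * (d - md) for t, d in zip(ts, ds))
             / sum((t - mt) ** 2 for t in ts))
    return slope, md - slope * mt

# Synthetic trials: time to exhaustion (s) vs. total distance (m),
# generated from ICV = 4.4 m/s and W' = 130 m for illustration
times = [120.0, 240.0, 360.0, 480.0]
dists = [130.0 + 4.4 * t for t in times]
icv, w_prime = least_squares(times, dists)
print(icv, w_prime)  # recovers ~4.4 and ~130
```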

  8. Combination of inquiry learning model and computer simulation to improve mastery concept and the correlation with critical thinking skills (CTS)

    NASA Astrophysics Data System (ADS)

    Nugraha, Muhamad Gina; Kaniawati, Ida; Rusdiana, Dadi; Kirana, Kartika Hajar

    2016-02-01

    The purposes of physics learning in high school include mastering physics concepts, cultivating a scientific (including critical) attitude, and developing inductive and deductive reasoning skills. According to Ennis et al., inductive and deductive reasoning skills are part of critical thinking. Preliminary studies show that both competences are under-achieved, as seen in low student learning outcomes and in teacher-centered learning processes that are not conducive to cultivating critical thinking. One learning model predicted to increase concept mastery and train CTS is the inquiry learning model aided by computer simulations. In this model, students were given the opportunity to be actively involved in the experiment and to receive clear explanations through the computer simulations. From research with a randomized control group pretest-posttest design, we found that the inquiry learning model aided by computer simulations can improve students' concept mastery significantly more than the conventional (teacher-centered) method. With the inquiry learning model aided by computer simulations, 20% of students showed high CTS, 63.3% medium, and 16.7% low. CTS correlated strongly with students' concept mastery (correlation coefficient 0.697) and moderately with the enhancement of concept mastery (correlation coefficient 0.603).
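    The contribution figures quoted are Pearson correlation coefficients; for reference, the computation looks like this (the paired scores below are hypothetical, not the study's data):

```python
import math

def pearson_r(xs, ys):
    """Pearson correlation coefficient between two paired samples."""
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    cov = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
    sx = math.sqrt(sum((x - mx) ** 2 for x in xs))
    sy = math.sqrt(sum((y - my) ** 2 for y in ys))
    return cov / (sx * sy)

# Hypothetical CTS scores vs. mastery-concept gains
cts  = [55, 62, 70, 74, 81]
gain = [0.21, 0.30, 0.42, 0.40, 0.55]
print(round(pearson_r(cts, gain), 3))
```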

  9. Assessment of metabolic bone diseases by quantitative computed tomography

    NASA Technical Reports Server (NTRS)

    Richardson, M. L.; Genant, H. K.; Cann, C. E.; Ettinger, B.; Gordan, G. S.; Kolb, F. O.; Reiser, U. J.

    1985-01-01

    Advances in the radiologic sciences have permitted the development of numerous noninvasive techniques for measuring the mineral content of bone, with varying degrees of precision, accuracy, and sensitivity. The techniques of standard radiography, radiogrammetry, photodensitometry, Compton scattering, neutron activation analysis, single and dual photon absorptiometry, and quantitative computed tomography (QCT) are described and reviewed in depth. Results from previous cross-sectional and longitudinal QCT investigations are given. They then describe a current investigation in which they studied 269 subjects, including 173 normal women, 34 patients with hyperparathyroidism, 24 patients with steroid-induced osteoporosis, and 38 men with idiopathic osteoporosis. Spinal quantitative computed tomography, radiogrammetry, and single photon absorptiometry were performed, and a spinal fracture index was calculated on all patients. The authors found a disproportionate loss of spinal trabecular mineral compared to appendicular mineral in the men with idiopathic osteoporosis and the patients with steroid-induced osteoporosis. They observed roughly equivalent mineral loss in both the appendicular and axial regions in the hyperparathyroid patients. The appendicular cortical measurements correlated moderately well with each other but less well with spinal trabecular QCT. The spinal fracture index correlated well with QCT and less well with the appendicular measurements. Knowledge of appendicular cortical mineral status is important in its own right but is not a valid predictor of axial trabecular mineral status, which may be disproportionately decreased in certain diseases. Quantitative CT provides a reliable means of assessing the latter region of the skeleton, correlates well with the spinal fracture index (a semiquantitative measurement of end-organ failure), and offers the clinician a sensitive means of following the effects of therapy.

  10. Benchmark Problems Used to Assess Computational Aeroacoustics Codes

    NASA Technical Reports Server (NTRS)

    Dahl, Milo D.; Envia, Edmane

    2005-01-01

    The field of computational aeroacoustics (CAA) encompasses numerical techniques for calculating all aspects of sound generation and propagation in air directly from fundamental governing equations. Aeroacoustic problems typically involve flow-generated noise, with and without the presence of a solid surface, and the propagation of the sound to a receiver far away from the noise source. It is a challenge to obtain accurate numerical solutions to these problems. The NASA Glenn Research Center has been at the forefront in developing and promoting the development of CAA techniques and methodologies for computing the noise generated by aircraft propulsion systems. To assess the technological advancement of CAA, Glenn, in cooperation with the Ohio Aerospace Institute and the AeroAcoustics Research Consortium, organized and hosted the Fourth CAA Workshop on Benchmark Problems. Participants from industry and academia from both the United States and abroad joined to present and discuss solutions to benchmark problems. These demonstrated technical progress ranging from the basic challenges to accurate CAA calculations to the solution of CAA problems of increasing complexity and difficulty. The results are documented in the proceedings of the workshop. Problems were solved in five categories. In three of the five categories, exact solutions were available for comparison with CAA results. A fourth category of problems representing sound generation from either a single airfoil or a blade row interacting with a gust (i.e., problems relevant to fan noise) had approximate analytical or completely numerical solutions. The fifth category of problems involved sound generation in a viscous flow. In this case, the CAA results were compared with experimental data.

  11. Orthogonal arrays for computer experiments to assess important inputs

    SciTech Connect

    Moore, L. M.; McKay, Michael D.

    2002-01-01

    The topic of this paper is experiment planning, particularly fractional factorial designs or orthogonal arrays, for computer experiments to assess important inputs. The work presented in the paper is motivated by considering a non-stochastic computer simulation which has many inputs and which can, in a reasonable period of time, be run thousands of times. With many inputs, information that allows focus on a subset of important inputs is valuable. The characterization of 'importance' is expected to follow suggestions in McKay (1995) or McKay, et al. (1992). This analysis approach leads to considering factorial experiment designs. Inputs are associated with a finite number of discrete values, referred to as levels, so if each input has K levels and there are p inputs then there are K^p possible distinct runs, which constitute the K^p factorial design space. The suggested size of p has been 35 to 50, so that even with K=2 the complete 2^p factorial design space could not be run. Further, it is expected that the complexity of the simulation code, and discrete levels possibly associated with equi-probable intervals of the input distribution, make it desirable to consider inputs with more than 2 levels. Input levels of 5 and 7 have been investigated. In this paper, orthogonal array experiment designs, which are subsets of factorial designs also referred to as fractional factorial designs, are suggested as candidate experiments that provide a meaningful basis for calculating and comparing R^2 across subsets of inputs.
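    A two-level half-fraction shows the core idea: aliasing one column to the product of the others halves the run count while keeping every pair of columns orthogonal. A sketch (the 5- and 7-level arrays mentioned above require different constructions):

```python
import itertools
import math

def half_fraction(p):
    """2^(p-1) fractional factorial: the last factor is aliased to the
    product of the first p-1 factors (defining relation D = ABC for p = 4)."""
    return [list(run) + [math.prod(run)]
            for run in itertools.product([-1, 1], repeat=p - 1)]

design = half_fraction(4)  # 8 runs instead of the full 16
cols = list(zip(*design))
# Orthogonality: every pair of columns has zero dot product
ok = all(sum(a * b for a, b in zip(cols[i], cols[j])) == 0
         for i in range(4) for j in range(i + 1, 4))
print(len(design), ok)  # 8 True
```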

  12. Examining the Critical Thinking Dispositions and the Problem Solving Skills of Computer Engineering Students

    ERIC Educational Resources Information Center

    Özyurt, Özcan

    2015-01-01

    Problem solving is an indispensable part of engineering. Improving critical thinking dispositions for solving engineering problems is one of the objectives of engineering education. In this sense, knowing critical thinking and problem solving skills of engineering students is of importance for engineering education. This study aims to determine…

  13. Development of an Automated Security Risk Assessment Methodology Tool for Critical Infrastructures.

    SciTech Connect

    Jaeger, Calvin Dell; Roehrig, Nathaniel S.; Torres, Teresa M.

    2008-12-01

    This document presents the security automated Risk Assessment Methodology (RAM) prototype tool developed by Sandia National Laboratories (SNL). This work leverages SNL's capabilities and skills in security risk analysis and the development of vulnerability assessment/risk assessment methodologies to develop an automated prototype security RAM tool for critical infrastructures (RAM-CI™). The prototype automated RAM tool provides a user-friendly, systematic, and comprehensive risk-based tool to assist CI sector and security professionals in assessing and managing security risk from malevolent threats. The current tool is structured on the basic RAM framework developed by SNL. It is envisioned that this prototype tool will be adapted to meet the requirements of different CI sectors and thereby provide additional capabilities.

  14. Assessment team report on flight-critical systems research at NASA Langley Research Center

    NASA Technical Reports Server (NTRS)

    Siewiorek, Daniel P. (Compiler); Dunham, Janet R. (Compiler)

    1989-01-01

    The quality, coverage, and distribution of effort of the flight-critical systems research program at NASA Langley Research Center was assessed. Within the scope of the Assessment Team's review, the research program was found to be very sound. All tasks under the current research program were at least partially addressing the industry needs. General recommendations made were to expand the program resources to provide additional coverage of high priority industry needs, including operations and maintenance, and to focus the program on an actual hardware and software system that is under development.

  15. Development and Evaluation of Computer-Assisted Assessment in Higher Education in Relation to BS7988

    ERIC Educational Resources Information Center

    Shephard, Kerry; Warburton, Bill; Maier, Pat; Warren, Adam

    2006-01-01

    A university-wide project team of academic and administrative staff worked together to prepare, deliver and evaluate a number of diagnostic, formative and summative computer-based assessments. The team also attempted to assess the University of Southampton's readiness to deliver computer-assisted assessment (CAA) within the "Code of practice for…

  16. Improving Student Performance through Computer-Based Assessment: Insights from Recent Research.

    ERIC Educational Resources Information Center

    Ricketts, C.; Wilks, S. J.

    2002-01-01

    Compared student performance on computer-based assessment to machine-graded multiple choice tests. Found that performance improved dramatically on the computer-based assessment when students were not required to scroll through the question paper. Concluded that students may be disadvantaged by the introduction of online assessment unless care is…

  17. Multi-intelligence critical rating assessment of fusion techniques (MiCRAFT)

    NASA Astrophysics Data System (ADS)

    Blasch, Erik

    2015-06-01

    Assessment of multi-intelligence fusion techniques includes credibility of algorithm performance, quality of results against mission needs, and usability in a work-domain context. Situation awareness (SAW) brings together low-level information fusion (tracking and identification), high-level information fusion (threat and scenario-based assessment), and information fusion level 5 user refinement (physical, cognitive, and information tasks). To measure SAW, we discuss the SAGAT (Situational Awareness Global Assessment Technique) approach for a multi-intelligence fusion (MIF) system assessment that focuses on the advantages of MIF over single intelligence sources. Building on the NASA TLX (Task Load Index), SAGAT probes, SART (Situational Awareness Rating Technique) questionnaires, and CDM (Critical Decision Method) decision points, we highlight these tools for use in a Multi-Intelligence Critical Rating Assessment of Fusion Techniques (MiCRAFT). The focus is to measure user refinement of a situation over the information fusion quality of service (QoS) metrics: timeliness, accuracy, confidence, workload (cost), and attention (throughput). A key component of any user analysis includes correlation, association, and summarization of data, so we also seek measures of product quality and QuEST of information. The notion of product quality built from multi-intelligence tools is typically subjective and needs to be aligned with objective machine metrics.

  18. Quantitative assessment of computational models for retinotopic map formation

    PubMed Central

    Sterratt, David C; Cutts, Catherine S; Willshaw, David J; Eglen, Stephen J

    2014-01-01

    Molecular and activity-based cues acting together are thought to guide retinal axons to their terminal sites in the vertebrate optic tectum or superior colliculus (SC) to form an ordered map of connections. The details of the mechanisms involved, and the degree to which they might interact, are still not well understood. We have developed a framework within which existing computational models can be assessed in an unbiased and quantitative manner against a set of experimental data curated from the mouse retinocollicular system. Our framework facilitates comparison between models, testing new models against known phenotypes, and simulating new phenotypes in existing models. We have used this framework to assess four representative models that combine Eph/ephrin gradients and/or activity-based mechanisms and competition. Two of the models were updated from their original form to fit into our framework. The models were tested against five different phenotypes: wild type, Isl2-EphA3 ki/ki, Isl2-EphA3 ki/+, ephrin-A2,A3,A5 triple knock-out (TKO), and Math5 −/− (Atoh7). Two models successfully reproduced the extent of the Math5 −/− anteromedial projection, but only one of those could account for the collapse point in Isl2-EphA3 ki/+. The models needed a weak anteroposterior gradient in the SC to reproduce the residual order in the ephrin-A2,A3,A5 TKO phenotype, suggesting either an incomplete knock-out or the presence of another guidance molecule. Our article demonstrates the importance of testing retinotopic models against as full a range of phenotypes as possible, and we have made available the MATLAB software we wrote to facilitate this process. © 2014 Wiley Periodicals, Inc. Develop Neurobiol 75: 641–666, 2015 PMID:25367067

  19. Development of computer-based analytical tool for assessing physical protection system

    NASA Astrophysics Data System (ADS)

    Mardhi, Alim; Pengvanich, Phongphaeth

    2016-01-01

    Assessment of physical protection system effectiveness is a priority for ensuring optimum protection against unlawful acts at a nuclear facility, such as unauthorized removal of nuclear materials and sabotage of the facility itself. Since an assessment based on real exercise scenarios is costly and time-consuming, a computer-based analytical tool offers a practical way to model likely threat scenarios. Several tools are currently available for immediate use, such as EASI and SAPE; however, for our research purposes it is more suitable to have a tool that can be customized and enhanced further. In this work, we have developed a computer-based analytical tool using a network-based methodology to model adversary paths. The inputs are the multiple security elements used to evaluate the effectiveness of the system's detection, delay, and response functions. The tool can identify the most critical path and quantify the system's probability of effectiveness as a performance measure.
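    The network approach can be sketched directly: represent the facility as a graph whose edges carry detection probabilities, then search for the adversary path with the highest probability of evading detection, i.e. the most critical path. All nodes and probabilities below are hypothetical:

```python
# Hypothetical protection layers: edge -> probability of detection (Pd)
graph = {
    "outside": {"fence": 0.3, "gate": 0.6},
    "fence":   {"vault": 0.8},
    "gate":    {"vault": 0.4},
    "vault":   {},
}

def most_critical_path(graph, start, goal, p_evade=1.0, path=None):
    """Enumerate simple paths; return (best evasion probability, path)."""
    path = (path or []) + [start]
    if start == goal:
        return p_evade, path
    best = (0.0, None)
    for nxt, pd in graph[start].items():
        if nxt not in path:
            best = max(best, most_critical_path(graph, nxt, goal,
                                                p_evade * (1 - pd), path))
    return best

p, route = most_critical_path(graph, "outside", "vault")
print(route, round(p, 2))  # the gate route evades detection with p = 0.24
```

In practice the tool would minimize the adversary's evasion probability by reinforcing edges on this path, then re-solve.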

  20. Two-layer critical dimensions and overlay process window characterization and improvement in full-chip computational lithography

    NASA Astrophysics Data System (ADS)

    Sturtevant, John L.; Liubich, Vlad; Gupta, Rachit

    2016-04-01

    Edge placement error (EPE) was a term initially introduced to describe the difference between the predicted pattern contour edge and the design target for a single design layer. Strictly speaking, this quantity is not directly measurable in the fab. What is of vital importance is the relative edge placement error between different design layers and, in the era of multipatterning, between the constituent mask sublayers for a single design layer. The critical dimensions (CD) and overlay between two layers can be measured in the fab, and there has always been a strong emphasis on control of overlay between design layers. The progress in this realm has been remarkable, accelerated at least in part by the proliferation of multipatterning, which reduces the available overlay budget by introducing a coupling of overlay and CD errors for the target layer. Computational lithography makes possible the full-chip assessment of two-layer edge-to-edge distances and two-layer contact overlap area. We will investigate examples of via-metal model-based analysis of CD and overlay errors, for both single patterning and double patterning. For single patterning, we show the advantage of contour-to-contour simulation over contour-to-target simulation, and how the addition of aberrations in the optical models can provide a more realistic CD-overlay process window (PW) for edge placement errors. For double patterning, the interaction of 4-layer CD and overlay errors is very complex, but we illustrate that not only can full-chip verification identify potential two-layer hotspots, the optical proximity correction engine can also act to mitigate such hotspots and enlarge the joint CD-overlay PW.
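
The two-layer budget the abstract describes can be illustrated with a toy calculation: each layer's CD error moves its edge by half the CD change, while the overlay error shifts one layer bodily relative to the other. The formulas and numbers below are a simplified sketch under those assumptions, not the paper's full-chip model.

```python
def edge_to_edge_epe(cd_error_a_nm, cd_error_b_nm, overlay_nm):
    """Worst-case two-layer edge placement error: half of each layer's
    CD error plus the overlay error. Illustrative sign conventions."""
    return abs(overlay_nm) + abs(cd_error_a_nm) / 2 + abs(cd_error_b_nm) / 2

def via_metal_enclosure(metal_cd_nm, via_cd_nm, overlay_nm):
    """Remaining metal enclosure of a via on the tight side."""
    return (metal_cd_nm - via_cd_nm) / 2 - abs(overlay_nm)

# A 40 nm metal line over a 30 nm via with 3 nm overlay error leaves
# only 2 nm of enclosure on the tight side.
margin = via_metal_enclosure(40.0, 30.0, 3.0)
```

In this picture the coupling of CD and overlay errors is explicit: tightening either budget directly enlarges the usable process window for the other.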

  1. Self-motion perception: assessment by computer-generated animations

    NASA Technical Reports Server (NTRS)

    Parker, D. E.; Harm, D. L.; Sandoz, G. R.; Skinner, N. C.

    1998-01-01

    The goal of this research is more precise description of adaptation to sensory rearrangements, including microgravity, by development of improved procedures for assessing spatial orientation perception. Thirty-six subjects reported perceived self-motion following exposure to complex inertial-visual motion. Twelve subjects were assigned to each of 3 perceptual reporting procedures: (a) animation movie selection, (b) written report selection and (c) verbal report generation. The question addressed was: do reports produced by these procedures differ with respect to complexity and reliability? Following repeated (within-day and across-day) exposures to 4 different "motion profiles," subjects either (a) selected movies presented on a laptop computer, or (b) selected written descriptions from a booklet, or (c) generated self-motion verbal descriptions that corresponded most closely with their motion experience. One "complexity" and 2 reliability "scores" were calculated. Contrary to expectations, reliability and complexity scores were essentially equivalent for the animation movie selection and written report selection procedures. Verbal report generation subjects exhibited less complexity than did subjects in the other conditions and their reports were often ambiguous. The results suggest that, when selecting from carefully written descriptions and following appropriate training, people may be better able to describe their self-motion experience with words than is usually believed.

  2. Computation Modeling and Assessment of Nanocoatings for Ultra Supercritical Boilers

    SciTech Connect

    J. Shingledecker; D. Gandy; N. Cheruvu; R. Wei; K. Chan

    2011-06-21

    Forced outages and boiler unavailability of coal-fired fossil plants are most often caused by fire-side corrosion of boiler waterwalls and tubing. Reliable coatings are required for Ultrasupercritical (USC) application to mitigate corrosion, since these boilers will operate at much higher temperatures and pressures than supercritical (565 C {at} 24 MPa) boilers. Computational modeling efforts have been undertaken to design and assess potential Fe-Cr-Ni-Al systems to produce stable nanocrystalline coatings that form a protective, continuous scale of either Al{sub 2}O{sub 3} or Cr{sub 2}O{sub 3}. The computational modeling results identified a new series of Fe-25Cr-40Ni with or without 10 wt.% Al nanocrystalline coatings that maintain long-term stability by forming a diffusion barrier layer at the coating/substrate interface. The computational modeling predictions of microstructure, formation of continuous Al{sub 2}O{sub 3} scale, inward Al diffusion, grain growth, and sintering behavior were validated with experimental results. Advanced coatings, such as MCrAl (where M is Fe, Ni, or Co) nanocrystalline coatings, have been processed using different magnetron sputtering deposition techniques. Several coating trials were performed; among the processing methods evaluated, the DC pulsed magnetron sputtering technique produced the best quality coating with a minimum number of shallow defects, and the results of multiple deposition trials showed that the process is repeatable. The cyclic oxidation test results revealed that the nanocrystalline coatings offer better oxidation resistance, in terms of weight loss, localized oxidation, and formation of mixed oxides in the Al{sub 2}O{sub 3} scale, than widely used MCrAlY coatings.
However, the ultra-fine grain structure in these coatings, consistent with the computational model predictions, resulted in accelerated Al

  3. Planned home birth in the United States and professionalism: a critical assessment.

    PubMed

    Chervenak, Frank A; McCullough, Laurence B; Grünebaum, Amos; Arabin, Birgit; Levene, Malcolm I; Brent, Robert L

    2013-01-01

    Planned home birth has been considered by some to be consistent with professional responsibility in patient care. This article critically assesses the ethical and scientific justification for this view and shows it to be unjustified. We critically assess recent statements by professional associations of obstetricians, one that sanctions and one that endorses planned home birth. We base our critical appraisal on the professional responsibility model of obstetric ethics, which is based on the ethical concept of medicine from the Scottish and English Enlightenments of the 18th century. Our critical assessment supports the following conclusions. Because of its significantly increased, preventable perinatal risks, planned home birth in the United States is not clinically or ethically benign. Attending planned home birth, no matter one's training or experience, is not acting in a professional capacity, because this role preventably results in clinically unnecessary and therefore clinically unacceptable perinatal risk. It is therefore not consistent with the ethical concept of medicine as a profession for any attendant to planned home birth to represent himself or herself as a "professional." Obstetric healthcare associations should neither sanction nor endorse planned home birth. Instead, these associations should recommend against planned home birth. Obstetric healthcare professionals should respond to expressions of interest in planned home birth by pregnant women by informing them that it incurs significantly increased, preventable perinatal risks, by recommending strongly against planned home birth, and by recommending strongly for planned hospital birth. Obstetric healthcare professionals should routinely provide excellent obstetric care to all women transferred to the hospital from a planned home birth. The professional responsibility model of obstetric ethics requires obstetricians to address and remedy legitimate dissatisfaction with some hospital settings and

  4. Computers for artificial intelligence a technology assessment and forecast

    SciTech Connect

    Miller, R.K.

    1986-01-01

    This study reviews the development and current state-of-the-art in computers for artificial intelligence, including LISP machines, AI workstations, professional and engineering workstations, minicomputers, mainframes, and supercomputers. Major computer systems for AI applications are reviewed. The use of personal computers for expert system development is discussed, and AI software for the IBM PC, Texas Instruments Professional Computer, and Apple Macintosh is presented. Current research aimed at developing a new computer for artificial intelligence is described, and future technological developments are discussed.

  5. High Performance Computing Facility Operational Assessment, FY 2011 Oak Ridge Leadership Computing Facility

    SciTech Connect

    Baker, Ann E; Bland, Arthur S Buddy; Hack, James J; Barker, Ashley D; Boudwin, Kathlyn J.; Kendall, Ricky A; Messer, Bronson; Rogers, James H; Shipman, Galen M; Wells, Jack C; White, Julia C

    2011-08-01

    Oak Ridge National Laboratory's Leadership Computing Facility (OLCF) continues to deliver the most powerful resources in the U.S. for open science. At 2.33 petaflops peak performance, the Cray XT Jaguar delivered more than 1.5 billion core hours in calendar year (CY) 2010 to researchers around the world for computational simulations relevant to national and energy security; advancing the frontiers of knowledge in physical sciences and areas of biological, medical, environmental, and computer sciences; and providing world-class research facilities for the nation's science enterprise. Scientific achievements by OLCF users range from collaboration with university experimentalists to produce a working supercapacitor that uses atom-thick sheets of carbon materials to finely determining the resolution requirements for simulations of coal gasifiers and their components, thus laying the foundation for development of commercial-scale gasifiers. OLCF users are pushing the boundaries with software applications sustaining more than one petaflop of performance in the quest to illuminate the fundamental nature of electronic devices. Other teams of researchers are working to resolve predictive capabilities of climate models, to refine and validate genome sequencing, and to explore the most fundamental materials in nature - quarks and gluons - and their unique properties. Details of these scientific endeavors - not possible without access to leadership-class computing resources - are detailed in Section 4 of this report and in the INCITE in Review. Effective operations of the OLCF play a key role in the scientific missions and accomplishments of its users. This Operational Assessment Report (OAR) will delineate the policies, procedures, and innovations implemented by the OLCF to continue delivering a petaflop-scale resource for cutting-edge research. The 2010 operational assessment of the OLCF yielded recommendations that have been addressed (Reference Section 1) and where

  6. Assessing the impact of vulnerability modeling in the protection of critical infrastructure

    NASA Astrophysics Data System (ADS)

    Yates, Justin; Sanjeevi, Sujeevraja

    2012-10-01

    This paper examines the impact of arc metrics on computational performance and spatial similarity in network interdiction modeling. Computational impact is measured in the number of iterations and total time required to reach an optimal solution. A combination of spatial analytical tools is offered as a methodology to assess the similarity in defense resource allocation when applying different arc interdiction metrics. An experimental design was devised and implemented using two real-world sub-networks of the Los Angeles County roadway system. This paper shows that arc metric selection has a limited effect on the spatial allocation of defense resources, though metric choice does directly impact computation time. These results have direct implications for public policy and decision-making by enabling modelers to increase their situational awareness and their confidence in resource allocation decisions by selecting metrics that improve their solution capabilities.
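
One simple proxy for the spatial-similarity question the abstract raises is the Jaccard index between the sets of arcs selected for defense under two different arc metrics. The helper below is an illustrative sketch, not the authors' GIS-based methodology.

```python
def allocation_similarity(arcs_a, arcs_b):
    """Jaccard index between the sets of arcs selected for defense
    under two different arc interdiction metrics: |A ∩ B| / |A ∪ B|."""
    a, b = set(arcs_a), set(arcs_b)
    return len(a & b) / len(a | b) if a | b else 1.0

# Two metrics agreeing on 2 of 4 distinct arcs score 0.5.
score = allocation_similarity(["a1", "a2", "a3"], ["a2", "a3", "a4"])
```

A score near 1 would support the paper's finding that metric choice has only a limited effect on where defense resources are placed.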

  7. Critical validity assessment of theoretical models: charge-exchange at intermediate and high energies

    NASA Astrophysics Data System (ADS)

    Belkić, Dževad

    1999-06-01

    Exact comprehensive computations are carried out by means of four leading second-order approximations yielding differential cross sections dQ/ dΩ for the basic charge exchange process H ++H(1s)→H(1s)+H + at intermediate and high energies. The obtained extensive set of results is thoroughly tested against all the existing experimental data with the purpose of critically assessing the validity of the boundary corrected second-Born (CB2), continuum-distorted wave (CDW), impulse approximation (IA) and the reformulated impulse approximation (RIA). The conclusion which emerges from this comparative study clearly indicates that the RIA agrees most favorably with the measurements available over a large energy range 25 keV-5 MeV. Such a finding reaffirms the few-particle quantum scattering theory which imposes several strict conditions on adequate second-order methods. These requirements satisfied by the RIA are: (i) normalisations of all the scattering wave functions, (ii) correct boundary conditions in both entrance and exit channels, (iii) introduction of a mathematically justified two-center continuum state for the sum of an attractive and a repulsive Coulomb potential with the same interaction strength, (iv) inclusion of the multiple scattering effects neglected in the IA, (v) a proper description of the Thomas double scattering in good agreement with the experiments and without any unobserved peak splittings. Nevertheless, the performed comparative analysis of the above four approximations indicates that none of the methods is free from some basic shortcomings. Despite its success, the RIA remains essentially a high-energy model like the other three methods under study. More importantly, their perturbative character leaves virtually no room for further systematic improvements, since the neglected higher-order terms are prohibitively tedious for practical purposes and have never been computed exactly. To bridge this gap, we presently introduce the variational Pad

  8. User Instructions for the Systems Assessment Capability, Rev. 1, Computer Codes Volume 3: Utility Codes

    SciTech Connect

    Eslinger, Paul W.; Aaberg, Rosanne L.; Lopresti, Charles A.; Miley, Terri B.; Nichols, William E.; Strenge, Dennis L.

    2004-09-14

    This document contains detailed user instructions for a suite of utility codes developed for Rev. 1 of the Systems Assessment Capability. The suite of computer codes for Rev. 1 of Systems Assessment Capability performs many functions.

  9. Overview of BioCreAtIvE: critical assessment of information extraction for biology

    PubMed Central

    Hirschman, Lynette; Yeh, Alexander; Blaschke, Christian; Valencia, Alfonso

    2005-01-01

    Background The goal of the first BioCreAtIvE challenge (Critical Assessment of Information Extraction in Biology) was to provide a set of common evaluation tasks to assess the state of the art for text mining applied to biological problems. The results were presented in a workshop held in Granada, Spain March 28–31, 2004. The articles collected in this BMC Bioinformatics supplement entitled "A critical assessment of text mining methods in molecular biology" describe the BioCreAtIvE tasks, systems, results and their independent evaluation. Results BioCreAtIvE focused on two tasks. The first dealt with extraction of gene or protein names from text, and their mapping into standardized gene identifiers for three model organism databases (fly, mouse, yeast). The second task addressed issues of functional annotation, requiring systems to identify specific text passages that supported Gene Ontology annotations for specific proteins, given full text articles. Conclusion The first BioCreAtIvE assessment achieved a high level of international participation (27 groups from 10 countries). The assessment provided state-of-the-art performance results for a basic task (gene name finding and normalization), where the best systems achieved a balanced 80% precision / recall or better, which potentially makes them suitable for real applications in biology. The results for the advanced task (functional annotation from free text) were significantly lower, demonstrating the current limitations of text-mining approaches where knowledge extrapolation and interpretation are required. In addition, an important contribution of BioCreAtIvE has been the creation and release of training and test data sets for both tasks. There are 22 articles in this special issue, including six that provide analyses of results or data quality for the data sets, including a novel inter-annotator consistency assessment for the test set used in task 2. PMID:15960821

  10. Quantum wavepacket ab initio molecular dynamics: an approach for computing dynamically averaged vibrational spectra including critical nuclear quantum effects.

    PubMed

    Sumner, Isaiah; Iyengar, Srinivasan S

    2007-10-18

    We have introduced a computational methodology to study vibrational spectroscopy in clusters inclusive of critical nuclear quantum effects. This approach is based on the recently developed quantum wavepacket ab initio molecular dynamics method that combines quantum wavepacket dynamics with ab initio molecular dynamics. The computational efficiency of the dynamical procedure is drastically improved (by several orders of magnitude) through the utilization of wavelet-based techniques combined with the previously introduced time-dependent deterministic sampling procedure to achieve stable, picosecond length, quantum-classical dynamics of electrons and nuclei in clusters. The dynamical information is employed to construct a novel cumulative flux/velocity correlation function, where the wavepacket flux from the quantized particle is combined with classical nuclear velocities to obtain the vibrational density of states. The approach is demonstrated by computing the vibrational density of states of [Cl-H-Cl]-, inclusive of critical quantum nuclear effects, and our results are in good agreement with experiment. A general hierarchical procedure is also provided, based on electronic structure harmonic frequencies, classical ab initio molecular dynamics, computation of nuclear quantum-mechanical eigenstates, and employing quantum wavepacket ab initio dynamics to understand vibrational spectroscopy in hydrogen-bonded clusters that display large degrees of anharmonicities.
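
Classically, a vibrational density of states is commonly obtained as the Fourier transform of a velocity autocorrelation function; the snippet below sketches that standard classical construction on a synthetic signal, as a simplified stand-in for the paper's quantum flux/velocity correlation function.

```python
import numpy as np

def vibrational_dos(velocities, dt):
    """Vibrational density of states as the Fourier transform of the
    velocity autocorrelation function (classical construction)."""
    v = np.asarray(velocities) - np.mean(velocities)
    n = len(v)
    # Unbiased autocorrelation for lags 0..n-1.
    acf = np.correlate(v, v, mode="full")[n - 1:] / np.arange(n, 0, -1)
    spectrum = np.abs(np.fft.rfft(acf))
    freqs = np.fft.rfftfreq(n, d=dt)
    return freqs, spectrum

# A synthetic 5 Hz "velocity" signal should yield a spectral peak at 5 Hz.
t = np.arange(0.0, 4.0, 0.01)
freqs, spec = vibrational_dos(np.cos(2 * np.pi * 5.0 * t), 0.01)
peak_hz = freqs[np.argmax(spec[1:]) + 1]  # skip the DC bin
```

The paper's contribution replaces the classical velocities here with quantum wavepacket flux, which is what lets the computed spectrum capture nuclear quantum effects.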

  11. Documentation of the Ecological Risk Assessment Computer Model ECORSK.5

    SciTech Connect

    Anthony F. Gallegos; Gilbert J. Gonzales

    1999-06-01

    The FORTRAN77 ecological risk computer model--ECORSK.5--has been used to estimate the potential toxicity of surficial deposits of radioactive and non-radioactive contaminants to several threatened and endangered (T and E) species at the Los Alamos National Laboratory (LANL). These analyses to date include preliminary toxicity estimates for the Mexican spotted owl, the American peregrine falcon, the bald eagle, and the southwestern willow flycatcher. This work has been performed as required for the Record of Decision for the construction of the Dual Axis Radiographic Hydrodynamic Test (DARHT) Facility at LANL as part of the Environmental Impact Statement. The model is dependent on the use of the geographic information system and associated software--ARC/INFO--and has been used in conjunction with LANL's Facility for Information Management and Display (FIMAD) contaminant database. The integration of FIMAD data and ARC/INFO using ECORSK.5 allows the generation of spatial information from a gridded area of potential exposure called an Ecological Exposure Unit. ECORSK.5 was used to simulate exposures using a modified Environmental Protection Agency Quotient Method. The model can handle a large number of contaminants within the home range of T and E species. This integration results in the production of hazard indices which, when compared to risk evaluation criteria, estimate the potential for impact from consumption of contaminants in food and ingestion of soil. The assessment is considered a Tier-2 type of analysis. This report summarizes and documents the ECORSK.5 code, the mathematical models used in the development of ECORSK.5, and the input and other requirements for its operation. Other auxiliary FORTRAN 77 codes used for processing and graphing output from ECORSK.5 are also discussed. The reader may refer to reports cited in the introduction to obtain greater detail on past applications of ECORSK.5 and assumptions used in deriving model parameters.
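
In its simplest form, the EPA Quotient Method that ECORSK.5 modifies reduces to summing dose-to-toxicity ratios across contaminants and exposure pathways (food and soil ingestion). The field names, units, and values below are illustrative, not actual ECORSK.5 inputs or parameters.

```python
def hazard_index(contaminants, body_weight_kg):
    """Sum of hazard quotients: estimated daily dose (mg/kg/day)
    divided by a toxicity reference value (TRV), summed over
    contaminants. Field names and units are illustrative."""
    hi = 0.0
    for c in contaminants:
        dose = (c["conc_food"] * c["food_intake_kg_d"]
                + c["conc_soil"] * c["soil_intake_kg_d"]) / body_weight_kg
        hi += dose / c["trv_mg_kg_d"]
    return hi

# One contaminant for a hypothetical 0.5 kg receptor; an index below 1
# suggests low potential for impact under this screening approach.
hi = hazard_index([{
    "conc_food": 2.0, "food_intake_kg_d": 0.1,
    "conc_soil": 5.0, "soil_intake_kg_d": 0.01,
    "trv_mg_kg_d": 1.0,
}], body_weight_kg=0.5)
```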

  12. A Simple Widespread Computer Help Improves Nutrition Support Orders and Decreases Infection Complications in Critically Ill Patients

    PubMed Central

    Conseil, Mathieu; Carr, Julie; Molinari, Nicolas; Coisel, Yannaël; Cissé, Moussa; Belafia, Fouad; Delay, Jean-Marc; Jung, Boris; Jaber, Samir; Chanques, Gérald

    2013-01-01

    Aims To assess the impact of a simple computer-based decision-support system (computer help) on the quality of nutrition support orders and patients' outcome in Intensive-Care Unit (ICU). Methods This quality-improvement study was carried out in a 16-bed medical-surgical ICU in a French university hospital. All consecutive patients who stayed in ICU more than 10 days with non-oral feeding for more than 5 days were retrospectively included during two 12-month periods. Prescriptions of nutrition support were collected and compared to French national guidelines as a quality-improvement process. A computer help was constructed using a simple Excel sheet (Microsoft™) to guide physicians' prescriptions according to guidelines. This computer help was displayed in computers previously used for medical orders. Physicians were informed but no systematic protocol was implemented. Patients included during the first (control group) and second period (computer help group) were compared for achievement of nutrition goals and ICU outcomes. Results The control and computer help groups respectively included 71 and 95 patients. Patients' characteristics were not significantly different between groups. In the computer help group, prescriptions significantly more often achieved 80% of nutrition goals for calorie (45% vs. 79%, p<0.001) and nitrogen intake (3% vs. 37%, p<0.001). Incidence of nosocomial infections decreased significantly between the two groups (59% vs. 41%, p = 0.03). Mortality did not significantly differ between control (21%) and computer help groups (15%, p = 0.30). Conclusions Use of a widespread inexpensive computer help is associated with significant improvements in nutrition support orders and decreased nosocomial infections in ICU patients. This computer help is provided in electronic supplement. PMID:23737948

  13. The Old Paradigm in Computer Aids to Invention: A Critical Review.

    ERIC Educational Resources Information Center

    Langston, M. Diane

    1986-01-01

    Reviews the major types of computer aids to invention to reveal the paper-based design paradigm that characterizes them. Discusses implications of this "old paradigm" and suggests directions for developing new, uniquely electronic paradigms for future aids. (AEW)

  14. A critical evaluation of the predictions of the NASA-Lockheed multielement airfoil computer program

    NASA Technical Reports Server (NTRS)

    Brune, G. W.; Manke, J. W.

    1978-01-01

    Theoretical predictions of several versions of the multielement airfoil computer program are evaluated. The computed results are compared with experimental high lift data of general aviation airfoils with a single trailing edge flap, and of airfoils with a leading edge flap and double slotted trailing edge flaps. Theoretical and experimental data include lift, pitching moment, profile drag and surface pressure distributions, boundary layer integral parameters, skin friction coefficients, and velocity profiles.

  15. Contemporary assessment of foot perfusion in patients with critical limb ischemia.

    PubMed

    Benitez, Erik; Sumpio, Brandon J; Chin, Jason; Sumpio, Bauer E

    2014-03-01

    Significant progress in limb salvage for patients with peripheral arterial disease and critical limb ischemia has occurred in the past 2 decades. Improved patient outcomes have resulted from increased knowledge and understanding of the disease processes, as well as efforts to improve revascularization techniques and enhance patient care after open and endovascular procedures. An imaging modality that is noninvasive, fast, and safe would be a useful tool for clinicians in assessing lower-extremity perfusion when planning interventions. Among the current and emerging regional perfusion imaging modalities are transcutaneous oxygen monitoring, hyperspectral imaging, indocyanine green dye-based fluorescent angiography, nuclear diagnostic imaging, and laser Doppler. These tests endeavor to delineate regional foot perfusion to guide directed revascularization therapy in patients with critical limb ischemia and foot ulceration. PMID:25812754

  16. Use of Bioelectrical Impedance Analysis for the Assessment of Nutritional Status in Critically Ill Patients

    PubMed Central

    Lee, Yoojin; Kwon, Oran; Shin, Cheung Soo

    2015-01-01

    Malnutrition is common in the critically ill patients and known to cause a variety of negative clinical outcomes. However, various conventional methods for nutrition assessment have several limitations. We hypothesized that body composition data, as measured using bioelectrical impedance analysis (BIA), may have a significant role in evaluating nutritional status and predicting clinical outcomes in critically ill patients. We gathered clinical, biochemical, and BIA data from 66 critically ill patients admitted to an intensive care unit. Patients were divided into three nutritional status groups according to their serum albumin level and total lymphocyte counts. The BIA results, conventional indicators of nutrition status, and clinical outcomes were compared and analyzed retrospectively. Results showed that the BIA indices including phase angle (PhA), extracellular water (ECW), and ECW/total body water (TBW) were significantly associated with the severity of nutritional status. Particularly, PhA, an indicator of the health of the cell membrane, was higher in the well-nourished patient group, whereas the edema index (ECW/TBW) was higher in the severely malnourished patient group. PhA was positively associated with albumin and ECW/TBW was negatively associated with serum albumin, hemoglobin, and duration of mechanical ventilation. In non-survivors, PhA was significantly lower and both ECW/TBW and %TBW/fat free mass were higher than in survivors. In conclusion, several BIA indexes including PhA and ECW/TBW may be useful for nutritional assessment and represent significant prognostic factors in the care of critically ill patients. PMID:25713790
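
The two BIA indices this study highlights have simple closed forms: the phase angle is the arctangent of reactance over resistance, and the edema index is ECW divided by total body water. The example values below are hypothetical, and the interpretive comments are approximate, device-dependent rules of thumb rather than the study's cut-offs.

```python
import math

def phase_angle_deg(resistance_ohm, reactance_ohm):
    """BIA phase angle: arctangent of reactance (Xc) over resistance
    (R), in degrees; lower values suggest poorer cell-membrane
    integrity and nutritional status."""
    return math.degrees(math.atan2(reactance_ohm, resistance_ohm))

def edema_index(ecw_l, icw_l):
    """ECW/TBW ratio; higher values indicate a fluid shift toward the
    extracellular compartment (cut-offs vary by device and population)."""
    return ecw_l / (ecw_l + icw_l)

# Hypothetical 50 kHz measurement: R = 500 ohm, Xc = 50 ohm.
pha = phase_angle_deg(500.0, 50.0)
ratio = edema_index(17.0, 23.0)
```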

  17. Development of a structural health monitoring system for the life assessment of critical transportation infrastructure.

    SciTech Connect

    Roach, Dennis Patrick; Jauregui, David Villegas; Daumueller, Andrew Nicholas

    2012-02-01

    Recent structural failures such as the I-35W Mississippi River Bridge in Minnesota have underscored the urgent need for improved methods and procedures for evaluating our aging transportation infrastructure. This research seeks to develop a basis for a Structural Health Monitoring (SHM) system to provide quantitative information related to the structural integrity of metallic structures, to support appropriate management decisions and ensure public safety. This research employs advanced structural analysis and nondestructive testing (NDT) methods for an accurate fatigue analysis. Metal railroad bridges in New Mexico will be the focus since many of these structures are over 100 years old and classified as fracture-critical. The term fracture-critical indicates that failure of a single component may result in complete collapse of the structure such as the one experienced by the I-35W Bridge. Failure may originate from sources such as loss of section due to corrosion or cracking caused by fatigue loading. Because standard inspection practice is primarily visual, these types of defects can go undetected due to oversight, lack of access to critical areas, or, in riveted members, hidden defects that are beneath fasteners or connection angles. Another issue is that it is difficult to determine the fatigue damage that a structure has experienced and the rate at which damage is accumulating due to uncertain history and load distribution in supporting members. A SHM system has several advantages that can overcome these limitations. SHM allows critical areas of the structure to be monitored more quantitatively under actual loading. The research needed to apply SHM to metallic structures was performed and a case study was carried out to show the potential of SHM-driven fatigue evaluation to assess the condition of critical transportation infrastructure and to guide inspectors to potential problem areas. 
This project combines the expertise in transportation infrastructure at New

  18. The National Education Association's Educational Computer Service. An Assessment.

    ERIC Educational Resources Information Center

    Software Publishers Association, Washington, DC.

    The Educational Computer Service (ECS) of the National Education Association (NEA) evaluates and distributes educational software. An investigation of ECS was conducted by the Computer Education Committee of the Software Publishers Association (SPA) at the request of SPA members. The SPA found that the service, as it is presently structured, is…

  19. Assessment of Examinations in Computer Science Doctoral Education

    ERIC Educational Resources Information Center

    Straub, Jeremy

    2014-01-01

    This article surveys the examination requirements for attaining degree candidate (candidacy) status in computer science doctoral programs at all of the computer science doctoral granting institutions in the United States. It presents a framework for program examination requirement categorization, and categorizes these programs by the type or types…

  20. Approaches for assessment of vulnerability of critical infrastructures to weather-related hazards

    NASA Astrophysics Data System (ADS)

    Eidsvig, Unni; Uzielli, Marco; Vidar Vangelsten, Bjørn

    2016-04-01

    Critical infrastructures are essential components for the modern society to maintain its function, and malfunctioning of one of the critical infrastructure systems may have far-reaching consequences. Climate change may lead to an increase in the frequency and intensity of weather-related hazards, creating challenges for the infrastructures. This paper outlines approaches to assess the vulnerability posed by weather-related hazards to infrastructures. The approaches assess factors that affect the probability of a malfunctioning of the infrastructure should a weather-related threat occur, as well as factors that affect the societal consequences of the infrastructure malfunctioning. Although vulnerability factors are normally highly infrastructure-specific and hazard-dependent, generic factors can be defined and analyzed. For the vulnerability and resilience of the infrastructure, such factors include e.g. robustness, buffer capacity, protection, quality, age, adaptability and transparency. For the vulnerability of the society in relation to the infrastructure, such factors include e.g. redundancy, substitutes and cascading effects. A semi-quantitative, indicator-based approach is proposed, providing schemes for ranking of the most important vulnerability indicators relevant for weather-related hazards on a relative scale. The application of the indicators in a semi-quantitative risk assessment is also demonstrated. In addition, a quantitative vulnerability model is proposed in terms of vulnerability (representing degree of loss) as a function of intensity, which is adaptable to different types of degree of loss (e.g. fraction of infrastructure users that lose their service, fraction of repair costs to full reconstruction costs). The vulnerability model can be calibrated with empirical data using deterministic calibration or a variety of probabilistic calibration approaches to account for the uncertainties within the model. 
The research leading to these results has received funding
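    The quantitative vulnerability model described above, loss as a function of intensity, can be sketched as follows. The Weibull-type curve shape, the parameter names, the grid-search calibration, and the data are illustrative assumptions for this sketch, not the authors' actual model:

```python
import math

def vulnerability(intensity, theta, k):
    """Degree of loss in [0, 1] as a function of hazard intensity.

    Illustrative Weibull-type curve: V(i) = 1 - exp(-(i/theta)^k),
    where theta sets the intensity scale and k the steepness.
    """
    if intensity <= 0:
        return 0.0
    return 1.0 - math.exp(-((intensity / theta) ** k))

def calibrate(observations, theta_grid, k_grid):
    """Deterministic calibration: pick (theta, k) minimising the squared
    error against empirical (intensity, observed loss fraction) pairs."""
    best = None
    for theta in theta_grid:
        for k in k_grid:
            err = sum((vulnerability(i, theta, k) - v) ** 2
                      for i, v in observations)
            if best is None or err < best[0]:
                best = (err, theta, k)
    return best[1], best[2]

# Hypothetical data: fraction of users losing service vs. wind speed (m/s)
obs = [(15, 0.05), (25, 0.3), (35, 0.7), (45, 0.95)]
theta, k = calibrate(obs, [20, 25, 30, 35], [1, 2, 3, 4])
```

    A probabilistic calibration would replace the grid search with, e.g., Bayesian inference over (theta, k), propagating parameter uncertainty into the loss estimate.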

  1. Volcanic hazards at distant critical infrastructure: A method for bespoke, multi-disciplinary assessment

    NASA Astrophysics Data System (ADS)

    Odbert, H. M.; Aspinall, W.; Phillips, J.; Jenkins, S.; Wilson, T. M.; Scourse, E.; Sheldrake, T.; Tucker, P.; Nakeshree, K.; Bernardara, P.; Fish, K.

    2015-12-01

    Societies rely on critical services such as power, water, transport networks and manufacturing. Infrastructure may be sited to minimise exposure to natural hazards, but not all hazards can be avoided. The probability of long-range transport of a volcanic plume to a site is comparable to that of other external hazards that must be considered to satisfy safety assessments. Recent advances in numerical models of plume dispersion and stochastic modelling provide a formalized and transparent approach to probabilistic assessment of hazard distribution. To understand the risks to critical infrastructure far from volcanic sources, it is necessary to quantify their vulnerability to different hazard stressors. However, infrastructure assets (e.g. power plants and operational facilities) are typically complex systems in themselves, with interdependent components that may differ in susceptibility to hazard impact. Usually, such complexity means that risk either cannot be estimated formally or that unsatisfactory simplifying assumptions are prerequisite to building a tractable risk model. We present a new approach to quantifying risk by bridging the expertise of physical hazard modellers and infrastructure engineers. We use a joint expert judgment approach to determine hazard model inputs and constrain associated uncertainties. Model outputs are chosen on the basis of engineering or operational concerns. The procedure facilitates an interface between physical scientists, with expertise in volcanic hazards, and infrastructure engineers, with insight into vulnerability to hazards. The result is a joined-up approach to estimating risk from low-probability hazards to critical infrastructure. We describe our methodology and show preliminary results for vulnerability to volcanic hazards at a typical UK industrial facility. We discuss our findings in the context of developing bespoke assessment of hazards from distant sources in collaboration with key infrastructure stakeholders.

  2. Evaluation of critical materials for five advanced design photovoltaic cells with an assessment of indium and gallium

    SciTech Connect

    Watts, R.L.; Gurwell, W.E.; Jamieson, W.M.; Long, L.W.; Pawlewicz, W.T.; Smith, S.A.; Teeter, R.R.

    1980-05-01

    The objective of this study is to identify potential material supply constraints due to the large-scale deployment of five advanced photovoltaic (PV) cell designs, and to suggest strategies to reduce the impacts of these production capacity limitations and potential future material shortages. This report presents the results of the screening of the five following advanced PV cell designs: polycrystalline silicon, amorphous silicon, cadmium sulfide/copper sulfide frontwall, polycrystalline gallium arsenide MIS, and advanced concentrator-500X. Each of these five cells is screened individually assuming that they first come online in 1991, and that 25 GWe of peak capacity is online by the year 2000. A second computer screening assumes that each cell first comes online in 1991 and that each cell has 5 GWe of peak capacity by the year 2000, so that the total online capacity for the five cells is 25 GWe. Based on a review of the preliminary baseline screening results, suggestions were made for varying such parameters as the layer thickness, cell production processes, etc. The resulting PV cell characterizations were then screened again by the CMAP computer code. Earlier DOE-sponsored work on the assessment of critical materials in PV cells conclusively identified indium and gallium as warranting further investigation as to their availability. Therefore, this report includes a discussion of the future availability of gallium and indium. (WHK)

  3. Critical Thinking in and through Interactive Computer Hypertext and Art Education

    ERIC Educational Resources Information Center

    Taylor, Pamela G.

    2006-01-01

    As part of a two-year study, Pamela G. Taylor's high school art students constructed hypertext webs that linked the subject matter they studied in class to their own independent research and personal experiences, which in turn encouraged them to think critically about the material. Taylor bases this use of hypertext on the thinking of Paulo Freire…

  4. Implementing and assessing computational modeling in introductory mechanics

    NASA Astrophysics Data System (ADS)

    Caballero, Marcos D.; Kohlmyer, Matthew A.; Schatz, Michael F.

    2012-12-01

    Students taking introductory physics are rarely exposed to computational modeling. In a one-semester, large-lecture, introductory calculus-based mechanics course at Georgia Tech, students learned to solve physics problems using the VPython programming environment. During the term, 1357 students in this course solved a suite of 14 computational modeling homework questions delivered through an online commercial course management system. Their proficiency with computational modeling was evaluated with a proctored assignment involving a novel central-force problem. The majority of students (60.4%) successfully completed the evaluation. Analysis of erroneous student-submitted programs indicated that a small set of student errors explained why most programs failed. We discuss the design and implementation of the computational modeling homework and evaluation, the results from the evaluation, and the implications for computational instruction in introductory science, technology, engineering, and mathematics (STEM) courses.
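    The force-momentum-position update loop that such VPython exercises teach can be sketched in plain Python. This is an illustrative Euler-Cromer integration of an attractive inverse-square central force; the constants, initial conditions, and function name are assumptions for this sketch, not the course's actual assignment:

```python
def simulate(pos, vel, steps=1000, dt=0.01, k=1.0, m=1.0):
    """Euler-Cromer integration of 2D motion under a central force
    F = -k * r_hat / r^2, the same compute-force -> update-velocity ->
    update-position pattern used in introductory VPython exercises."""
    x, y = pos
    vx, vy = vel
    for _ in range(steps):
        r = (x * x + y * y) ** 0.5
        fx = -k * x / r**3          # force components toward the origin
        fy = -k * y / r**3
        vx += fx / m * dt           # velocity update from the net force
        vy += fy / m * dt
        x += vx * dt                # position update from the NEW velocity
        y += vy * dt
    return (x, y), (vx, vy)

# With k = m = 1, starting at r = 1 with speed 1 gives a circular orbit
pos, vel = simulate((1.0, 0.0), (0.0, 1.0))
```

    Updating position with the already-updated velocity (Euler-Cromer rather than plain Euler) is what keeps the orbit from spiraling outward over many steps.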

  5. How students measure up: An assessment instrument for introductory computer science

    NASA Astrophysics Data System (ADS)

    Decker, Adrienne

    This dissertation presents an assessment instrument specifically designed for programming-first introductory sequences in computer science as given in Computing Curricula 2001: Computer Science Volume. The first-year computer science course has been the focus of many recent innovations and debates in the computer science curriculum. There is significant disagreement as to effective methodology in the first year of computing, and there has been no shortage of ideas as to what predicts student success in the first year of the computing curriculum. However, most investigations into predictors of success lack an appropriately validated assessment instrument to support or refute their findings, presumably because very few validated assessment instruments are available for assessing student performance in the first year of computing instruction. The instrument presented here is not designed to test particular language constructs, but rather the underlying principles of the first year of computing instruction. It has been administered to students at the end of their first year of an introductory computer science curriculum. Data needed for analysis of the instrument's reliability and validity were collected and analyzed. Use of this instrument enables validated assessment of student progress at the end of the first year, and also enables the study of further innovations in the curriculum for first-year computer science courses.
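    One standard internal-consistency reliability statistic for such an instrument is Cronbach's alpha. The dissertation does not specify which reliability statistics it used, so the following self-contained sketch is illustrative, with made-up item scores:

```python
def cronbach_alpha(item_scores):
    """Cronbach's alpha for an instrument.

    item_scores: one list per item, each holding the scores of the same
    respondents in the same order. alpha = k/(k-1) * (1 - sum of item
    variances / variance of total scores), using sample variance.
    """
    k = len(item_scores)
    n = len(item_scores[0])

    def var(xs):
        m = sum(xs) / len(xs)
        return sum((x - m) ** 2 for x in xs) / (len(xs) - 1)

    item_var_sum = sum(var(item) for item in item_scores)
    totals = [sum(item[i] for item in item_scores) for i in range(n)]
    return (k / (k - 1)) * (1 - item_var_sum / var(totals))

# Hypothetical scores: 3 items, 4 respondents
items = [[2, 4, 3, 5], [3, 5, 4, 5], [2, 4, 4, 5]]
alpha = cronbach_alpha(items)
```

    Values near 1 indicate that the items measure a common underlying construct; perfectly duplicated items give alpha = 1 exactly.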

  6. Profiling of energy deposition fields in a modular HTHR with annular core: Computational/experimental studies at the ASTRA critical facility

    SciTech Connect

    Boyarinov, V. F.; Garin, V. P.; Glushkov, E. S.; Zimin, A. A.; Kompaniets, G. V.; Nevinitsa, V. A.; Polyakov, D. N.; Ponomarev, A. S.; Ponomarev-Stepnoi, N. N.; Smirnov, O. N.; Fomichenko, P. A.; Chunyaev, E. I.; Marova, E. V.; Sukharev, Yu. P.

    2010-12-15

    The paper presents the results obtained from the computational/experimental studies of the spatial distribution of the {sup 235}U fission reaction rate in a critical assembly with an annular core and poison profiling elements inserted into the inner graphite reflector. The computational analysis was carried out with the codes intended for design computation of an HTHR-type reactor.

  7. Pedagogical Utilization and Assessment of the Statistic Online Computational Resource in Introductory Probability and Statistics Courses.

    PubMed

    Dinov, Ivo D; Sanchez, Juana; Christou, Nicolas

    2008-01-01

    Technology-based instruction represents a recent pedagogical paradigm, rooted in the realization that new generations are much more comfortable with, and excited about, new technologies. The rapid technological advancement over the past decade has fueled an enormous demand for the integration of modern networking, informational and computational tools with classical pedagogical instruments. Consequently, teaching with technology typically involves utilizing a variety of IT and multimedia resources for online learning, course management, electronic course materials, and novel tools for communication, engagement, experimentation, critical thinking and assessment. The NSF-funded Statistics Online Computational Resource (SOCR) provides a number of interactive tools for enhancing instruction in various undergraduate and graduate courses in probability and statistics. These resources include online instructional materials, statistical calculators, interactive graphical user interfaces, computational and simulation applets, and tools for data analysis and visualization. The tools provided as part of SOCR include conceptual simulations and statistical computing interfaces designed to bridge the introductory and the more advanced computational and applied probability and statistics courses. In this manuscript, we describe our designs for utilizing SOCR technology in instruction in a recent study. In addition, we present results on the effectiveness of using SOCR tools at two different course intensity levels on three outcome measures: exam scores, student satisfaction and choice of technology to complete assignments. Learning styles assessment was completed at baseline. We used three very different designs for three different undergraduate classes. Each course included a treatment group, using the SOCR resources, and a control group, using classical instruction techniques.
Our findings include marginal effects of the SOCR treatment per individual

  8. A Critical Evaluation of the Validity and the Reliability of Global Competency Constructs for Supervisor Assessment of Junior Medical Trainees

    ERIC Educational Resources Information Center

    McGill, D. A.; van der Vleuten, C. P. M.; Clarke, M. J.

    2013-01-01

    Supervisor assessments are critical for both formative and summative assessment in the workplace. Supervisor ratings remain an important source of such assessment in many educational jurisdictions even though there is ambiguity about their validity and reliability. The aims of this evaluation are to explore the: (1) construct validity of ward-based…

  9. Effects of Computer versus Paper Administration of an Adult Functional Writing Assessment

    ERIC Educational Resources Information Center

    Chen, Jing; White, Sheida; McCloskey, Michael; Soroui, Jaleh; Chun, Young

    2011-01-01

    This study investigated the comparability of paper and computer versions of a functional writing assessment administered to adults 16 and older. Three writing tasks were administered in both paper and computer modes to volunteers in the field test of an assessment of adult literacy in 2008. One set of analyses examined mode effects on scoring by…

  10. Computer-Assisted Assessment in Higher Education. Staff and Educational Development Series.

    ERIC Educational Resources Information Center

    Brown, Sally, Ed.; Race, Phil, Ed.; Bull, Joanna, Ed.

    This book profiles how computer-assisted assessment can help both staff and students by drawing on the experience and expertise of practitioners, in the United Kingdom and internationally, who are already using computer-assisted assessment. The publication is organized into three main sections--"Pragmatics and Practicalities of CAA," "Using CAA for…

  11. Computer-Based Assessment of Cognitive Abilities: Current Status/Future Directions.

    ERIC Educational Resources Information Center

    Eller, Ben F.; And Others

    1987-01-01

    Discusses the evolution of current decision support systems (DSS) computer software and their potential for use in assessing cognitive ability. Current applications of computer software for structured and unstructured problem solving are examined together with the feasibility of computerized adaptive testing. Characteristics of computer-based…

  12. Implementing Computer Algebra Enabled Questions for the Assessment and Learning of Mathematics

    ERIC Educational Resources Information Center

    Sangwin, Christopher J.; Naismith, Laura

    2008-01-01

    We present principles for the design of an online system to support computer algebra enabled questions for use within the teaching and learning of mathematics in higher education. The introduction of a computer algebra system (CAS) into a computer aided assessment (CAA) system affords sophisticated response processing of student provided answers.…

  13. Embedded assessment algorithms within home-based cognitive computer game exercises for elders.

    PubMed

    Jimison, Holly; Pavel, Misha

    2006-01-01

    With the recent consumer interest in computer-based activities designed to improve cognitive performance, there is a growing need for scientific assessment algorithms to validate the potential contributions of cognitive exercises. In this paper, we present a novel methodology for incorporating dynamic cognitive assessment algorithms within computer games designed to enhance cognitive performance. We describe how this approach works for a variety of computer applications and describe cognitive monitoring results for one of the computer game exercises. The real-time cognitive assessments also provide a control signal for adapting the difficulty of the game exercises and providing tailored help for elders of varying abilities.
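    A difficulty-adaptation control signal of the kind described above can be sketched as a simple staircase rule. The step sizes, bounds, and function name here are illustrative assumptions, not the authors' actual algorithm:

```python
def adapt_difficulty(level, correct, step_up=1, step_down=1,
                     min_level=1, max_level=10):
    """Staircase rule: raise difficulty after a success, lower it after a
    failure, clamped to the playable range. This keeps the game near the
    player's ability threshold, where assessment is most informative."""
    level += step_up if correct else -step_down
    return max(min_level, min(max_level, level))

# Feed in a sequence of per-trial outcomes from the game
level = 5
for outcome in [True, True, False, True]:
    level = adapt_difficulty(level, outcome)
```

    A production system would smooth the outcome stream (e.g. with a running estimate of success probability) rather than react to single trials, but the feedback structure is the same.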

  14. Identifying Reading Problems with Computer-Adaptive Assessments

    ERIC Educational Resources Information Center

    Merrell, C.; Tymms, P.

    2007-01-01

    This paper describes the development of an adaptive assessment called Interactive Computerised Assessment System (InCAS) that is aimed at children of a wide age and ability range to identify specific reading problems. Rasch measurement has been used to create the equal interval scales that form each part of the assessment. The rationale for the…
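    Rasch measurement, which InCAS uses to build its equal-interval scales, models the probability of a correct response from the gap between person ability and item difficulty on a shared logit scale. A minimal sketch of the dichotomous Rasch model (illustrative, not InCAS's actual implementation):

```python
import math

def rasch_p(ability, difficulty):
    """Dichotomous Rasch model: probability that a person of the given
    ability (in logits) answers an item of the given difficulty correctly,
    P = 1 / (1 + exp(-(ability - difficulty)))."""
    return 1.0 / (1.0 + math.exp(-(ability - difficulty)))

# A person whose ability equals the item difficulty has a 50% chance
p_matched = rasch_p(0.5, 0.5)
```

    Because only the difference (ability - difficulty) matters, persons and items land on the same equal-interval scale, which is what lets the assessment compare children across a wide age and ability range.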

  15. Assessment of gene order computing methods for Alzheimer's disease

    PubMed Central

    2013-01-01

    Background Computational genomics of Alzheimer's disease (AD), the most common form of senile dementia, is a nascent field in AD research. The field includes AD gene clustering by computing gene order, which generates higher-quality gene clustering patterns than most other clustering methods. However, there are few available gene order computing methods, such as the Genetic Algorithm (GA) and Ant Colony Optimization (ACO), and their performance in gene order computation using AD microarray data is not known. We thus set forth to evaluate the performance of current gene order computing methods with different distance formulas, and to identify additional features associated with gene order computation. Methods Using different distance formulas (Pearson distance, Euclidean distance, and squared Euclidean distance) and other conditions, gene orders were calculated by ACO and GA (including standard GA and improved GA) methods, respectively. The qualities of the gene orders were compared, and new features from the calculated gene orders were identified. Results Compared to the GA methods tested in this study, ACO fits the AD microarray data best when calculating gene order. In addition, the following features were revealed: different distance formulas generated gene orders of different quality, and the commonly used Pearson distance was not the best distance formula when used with either GA or ACO methods for AD microarray data. Conclusion Compared with Pearson distance and Euclidean distance, the squared Euclidean distance generated the best-quality gene order computed by GA and ACO methods. PMID:23369541
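    The three distance formulas the study compares can be written down directly for a pair of gene expression profiles. A self-contained sketch (the profile vectors are made-up data):

```python
import math

def euclidean(a, b):
    """Euclidean distance between two expression profiles."""
    return math.sqrt(sum((x - y) ** 2 for x, y in zip(a, b)))

def squared_euclidean(a, b):
    """Squared Euclidean distance: the same sum of squares, unrooted,
    which penalises large per-condition differences more heavily."""
    return sum((x - y) ** 2 for x, y in zip(a, b))

def pearson_distance(a, b):
    """1 - Pearson correlation: 0 for perfectly correlated profiles,
    regardless of their absolute expression levels."""
    n = len(a)
    ma, mb = sum(a) / n, sum(b) / n
    cov = sum((x - ma) * (y - mb) for x, y in zip(a, b))
    sa = math.sqrt(sum((x - ma) ** 2 for x in a))
    sb = math.sqrt(sum((y - mb) ** 2 for y in b))
    return 1.0 - cov / (sa * sb)

# g2 is a scaled copy of g1: Pearson distance is ~0, Euclidean is not
g1, g2 = [1.0, 2.0, 3.0], [2.0, 4.0, 6.0]
```

    The contrast is why the choice of formula changes the computed gene order: Pearson distance groups genes by the shape of their expression pattern, while the Euclidean variants also respond to magnitude.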

  16. [Chemical risk assessment in the construction industry: principles and critical issues].

    PubMed

    Manno, M

    2012-01-01

    Risk assessment (RA) represents the first step to ensure the protection of workers' health in all work sectors, production and services included. For this reason RA has become a legal duty for the occupational physician in his/her professional activity. The basic concepts of RA were developed as a formal procedure for the management of chemical risks, but they are currently applied to protect human health against all types of occupational and environmental risk factors. In the construction industry, in particular, chemical risk assessment is especially difficult due to the complexity of the working conditions and the variability and multiplicity of exposures. The critical aspects of RA in the construction industry are discussed here, in an attempt to highlight how the occupational physician, making use of traditional and new tools, including biological monitoring, can address and partly overcome them.

  17. A nuclear criticality safety assessment of the loss of moderation control in 2 1/2 and 10-ton cylinders containing enriched UF{sub 6}

    SciTech Connect

    Newvahner, R.L.; Pryor, W.A.

    1991-12-31

    Moderation control for maintaining nuclear criticality safety in 2 1/2-ton, 10-ton, and 14-ton cylinders containing enriched uranium hexafluoride (UF{sub 6}) has been used safely within the nuclear industry for over thirty years, and is dependent on cylinder integrity and containment. This assessment evaluates the loss of moderation control by the breaching of containment and entry of water into the cylinders. The first objective of this study was to estimate the amounts of water entering these large UF{sub 6} cylinders required to react with, and moderate, the uranium compounds sufficiently to cause criticality. Hypothetical accident situations were modeled as a uranyl fluoride (UO{sub 2}F{sub 2}) slab above a UF{sub 6} hemicylinder, and a UO{sub 2}F{sub 2} sphere centered within a UF{sub 6} hemicylinder. These situations were investigated by computational analyses utilizing the KENO V.a Monte Carlo computer code. The results were used to estimate both the masses of water required for criticality and the limiting masses of water that could be considered safe. The second objective of the assessment was to calculate the time available for emergency control actions before a criticality would occur, i.e., a "safetime", for various sources of water and different size openings in a breached cylinder. In the situations considered, except the case of a fire hose, the safetime appears adequate for emergency control actions. The assessment shows that current practices for handling moderation-controlled cylinders of low-enriched UF{sub 6}, along with the continuation of established personnel training programs, ensure nuclear criticality safety for routine and emergency operations.
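    At its simplest, the safetime concept reduces to a limiting water mass divided by an inflow rate. The sketch below uses entirely hypothetical numbers to illustrate the arithmetic only; the actual limiting masses and safetimes come from the validated KENO V.a analyses, not from this illustration:

```python
def safetime_minutes(safe_water_mass_kg, inflow_rate_kg_per_min):
    """Time available for emergency action before the limiting water mass
    is reached, assuming a constant inflow rate through the breach."""
    return safe_water_mass_kg / inflow_rate_kg_per_min

# Hypothetical numbers: a small opening admitting rainwater vs. a
# fire-hose stream into the same breached cylinder
slow = safetime_minutes(100.0, 0.5)    # slow ingress: long safetime
fast = safetime_minutes(100.0, 400.0)  # fire hose: essentially no safetime
```

    The contrast between the two rates is the assessment's point: for most credible water sources the safetime leaves room for emergency control actions, while a fire-hose stream does not.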

  18. [Assessment of surgical risk in patients with lower limb chronic critical ischaemia].

    PubMed

    Kazakov, Iu I; Lukin, I B; Sokolova, N Iu; Strakhov, M A

    2016-01-01

    Analysed herein are both the immediate and long-term results of surgical treatment in 93 patients presenting with chronic atherosclerotic occlusion of the femoropopliteal-tibial segment in the stage of critical ischaemia. The patients underwent either autovenous femoropopliteal bypass grafting to an isolated arterial segment or balloon angioplasty with stenting of the superficial femoral artery. When choosing the method of arterial reconstruction we assessed concomitant diseases, primarily lesions of the coronary and cerebral circulation. In order to evaluate patient state objectively, we developed a scale for assessing surgical risk. Three-year amputation-free survival amounted to 71.4% in low-risk patients, 60.0% in moderate-risk patients, and 43.3% in high-risk patients. Patients with initially high risk had a high incidence of cardiac and cerebrovascular complications, exceeding 40%. The developed system for assessing surgical risk was shown to objectively reflect the prognosis of patient survival following a reconstructive operation, and may be appropriate when choosing the optimal method of arterial reconstruction (bypass grafting or endovascular intervention) in patients with atherosclerotic lesions of arteries of the femoropopliteal-tibial segment and critical ischaemia accompanied by severe concomitant pathology. Patients with high surgical risk should preferably undergo endovascular reconstruction, patients with low surgical risk open bypass grafting, while for those with moderate risk either method of arterial reconstruction is acceptable. PMID:27626262

  19. Spoilt for choice: A critical review on the chemical and biological assessment of current wastewater treatment technologies.

    PubMed

    Prasse, Carsten; Stalter, Daniel; Schulte-Oehlmann, Ulrike; Oehlmann, Jörg; Ternes, Thomas A

    2015-12-15

    The knowledge we have gained in recent years on the presence and effects of compounds discharged by wastewater treatment plants (WWTPs) brings us to a point where we must question the appropriateness of current water quality evaluation methodologies. An increasing number of anthropogenic chemicals is detected in treated wastewater and there is increasing evidence of adverse environmental effects related to WWTP discharges. It has thus become clear that new strategies are needed to assess overall quality of conventional and advanced treated wastewaters. There is an urgent need for multidisciplinary approaches combining expertise from engineering, analytical and environmental chemistry, (eco)toxicology, and microbiology. This review summarizes the current approaches used to assess treated wastewater quality from the chemical and ecotoxicological perspective. Discussed chemical approaches include target, non-target and suspect analysis, sum parameters, identification and monitoring of transformation products, computational modeling as well as effect directed analysis and toxicity identification evaluation. The discussed ecotoxicological methodologies encompass in vitro testing (cytotoxicity, genotoxicity, mutagenicity, endocrine disruption, adaptive stress response activation, toxicogenomics) and in vivo tests (single and multi species, biomonitoring). We critically discuss the benefits and limitations of the different methodologies reviewed. Additionally, we provide an overview of the current state of research regarding the chemical and ecotoxicological evaluation of conventional as well as the most widely used advanced wastewater treatment technologies, i.e., ozonation, advanced oxidation processes, chlorination, activated carbon, and membrane filtration. In particular, possible directions for future research activities in this area are provided. PMID:26431616

  1. Some of the Critical Issues in Introducing Computer Technology into Schools.

    ERIC Educational Resources Information Center

    Heuston, Dustin H.

    This paper discusses some of the significant issues that school districts, superintendents, principals, board members, and faculty will have to face in the acquisition and implementation of educational hardware and software. Strengths and weaknesses of various computer configurations are presented and it is suggested that the use of professional…

  2. Embodying Our Values in Our Teaching Practices: Building Open and Critical Discourse through Computer Mediated Communication.

    ERIC Educational Resources Information Center

    Geelan, David R.; Taylor, Peter C.

    2001-01-01

    Describes the use of computer-mediated communication to develop a cooperative learning community among students in a Web-based distance education unit for practicing science and mathematics educators in Australia and Pacific Rim countries. Discusses use of the social constructivist and constructionist conceptions of teaching and learning.…

  3. Embodying Our Values in Our Teaching Practices: Building Open and Critical Discourse through Computer Mediated Communication

    ERIC Educational Resources Information Center

    Geelan, David R.; Taylor, Peter C.

    2004-01-01

    Computer mediated communication--including web pages, email and web-based bulletin boards--was used to support the development of a cooperative learning community among students in a web-based distance education unit for practicing science and mathematics educators. The students lived in several Australian states and a number of Pacific Rim…

  4. Fostering Critical Reflection in a Computer-Based, Asynchronously Delivered Diversity Training Course

    ERIC Educational Resources Information Center

    Givhan, Shawn T.

    2013-01-01

    This dissertation study chronicles the creation of a computer-based, asynchronously delivered diversity training course for a state agency. The course format enabled efficient delivery of a mandatory curriculum to the Massachusetts Department of State Police workforce. However, the asynchronous format posed a challenge to achieving the learning…

  5. The statistical-thermodynamic basis for computation of binding affinities: a critical review.

    PubMed Central

    Gilson, M K; Given, J A; Bush, B L; McCammon, J A

    1997-01-01

    Although the statistical thermodynamics of noncovalent binding has been considered in a number of theoretical papers, few methods of computing binding affinities are derived explicitly from this underlying theory. This has contributed to uncertainty and controversy in certain areas. This article therefore reviews and extends the connections of some important computational methods with the underlying statistical thermodynamics. A derivation of the standard free energy of binding forms the basis of this review. This derivation should be useful in formulating novel computational methods for predicting binding affinities. It also permits several important points to be established. For example, it is found that the double-annihilation method of computing binding energy does not yield the standard free energy of binding, but can be modified to yield this quantity. The derivation also makes it possible to define clearly the changes in translational, rotational, configurational, and solvent entropy upon binding. It is argued that molecular mass has a negligible effect upon the standard free energy of binding for biomolecular systems, and that the cratic entropy defined by Gurney is not a useful concept. In addition, the use of continuum models of the solvent in binding calculations is reviewed, and a formalism is presented for incorporating a limited number of solvent molecules explicitly. PMID:9138555
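    The quantity at the center of the derivation, the standard free energy of binding, is related to the binding constant by the familiar expression below. The notation (K_b for the binding constant, C° for the standard concentration of 1 mol/L) is assumed here rather than taken from the paper:

```latex
% Standard free energy of binding, referenced to the standard
% concentration C^{\circ} so that the argument of the logarithm
% is dimensionless:
\Delta G^{\circ}_{\mathrm{bind}} = -RT \,\ln\!\left( C^{\circ} K_b \right)
```

    The paper's point about the double-annihilation method can be read against this expression: a simulation that omits the C° reference does not yield ΔG° at the standard state and must be corrected accordingly.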

  6. Evaluating How the Computer-Supported Collaborative Learning Community Fosters Critical Reflective Practices

    ERIC Educational Resources Information Center

    Ma, Ada W.W.

    2013-01-01

    In recent research, little attention has been paid to issues of methodology and analysis methods to evaluate the quality of the collaborative learning community. To address such issues, an attempt is made to adopt the Activity System Model as an analytical framework to examine the relationship between computer supported collaborative learning…

  7. Resourcefulness training intervention: assessing critical parameters from relocated older adults' perspectives.

    PubMed

    Bekhet, Abir K; Zauszniewski, Jaclene A; Matel-Anderson, Denise M

    2012-07-01

    The population of American elders is increasing rapidly, and relocation to retirement communities has been found to adversely affect their adjustment. This pilot study of 38 relocated elders evaluated, from the elders' perspectives, six critical parameters of a resourcefulness training (RT) intervention designed to help elders adjust to relocation. Within the context of Zauszniewski's theory of resourcefulness, a pre-/post-test design with random assignment to RT or to diversionary activities (DA) was used. Objective questionnaires measured demographic and relocation factors. An intervention evaluation questionnaire was designed and given to the relocated elders in order to assess the six critical parameters: necessity, acceptability, feasibility, safety, fidelity, and effectiveness. Data concerning the critical parameters were collected during structured interviews within a week after the intervention. Seventy-six percent of the elders scored less than 120 on the resourcefulness scale, indicating a strong need for RT. While all non-white elders reported needing RT, 43% of white elders reported the same need. Elders indicated that learning about the experiences of others and taking part in discussions were the most interesting parts of the RT. Approximately 95% of participants reported that they learned all parts of the intervention; a few suggested having a stronger leader to keep the group on track. The qualitative findings from this pilot intervention study will inform future, larger clinical trials to help recently relocated elders adjust to relocation. PMID:22757595

  8. [Catchment scale risk assessment and critical source area identification of agricultural phosphorus loss].

    PubMed

    Li, Qi; Chen, Li-Ding; Qi, Xin; Zhang, Xin-Yu; Ma, Yan

    2007-09-01

    Agricultural non-point source phosphorus pollution is a severe problem for rural water bodies in China, but it is hard to control directly because of its special characteristics. In this paper, an approach to catchment-scale risk assessment and critical source area identification of agricultural phosphorus loss in northern China was developed, based on the catchment-scale phosphorus ranking scheme and the method proposed by Gburek et al. Eight factors were selected and weighted in the modified catchment-scale phosphorus ranking scheme, and the phosphorus-loss risk rating of each factor was adjusted based on current professional standards and the actual circumstances in China. The areas with a 'high' risk rating of phosphorus loss in a given catchment were the critical source areas for non-point source phosphorus pollution control in that catchment. The availability of the obtained data and the quantification of the assessment were taken into account in the new scheme, and GIS techniques and geostatistics were used to determine the factors. The new scheme therefore has good operability and practicability. PMID:18062300
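
    A factor-weighting scheme of this general kind can be sketched as follows; the factor names, weights, ratings, and risk thresholds below are hypothetical illustrations, not the values of the modified scheme described in the paper.

```python
# Hypothetical sketch of a weighted factor-ranking scheme for
# phosphorus-loss risk; all names, weights, and thresholds are
# illustrative, not taken from the cited scheme.

FACTOR_WEIGHTS = {
    "soil_test_P": 1.5,
    "fertilizer_rate": 1.0,
    "erosion": 1.5,
    "runoff_class": 1.0,
    "distance_to_stream": 2.0,
}

# Index thresholds mapped to qualitative risk ratings (checked high to low).
RISK_LABELS = [(8, "high"), (4, "medium"), (0, "low")]

def risk_index(ratings):
    """Weighted sum of per-factor risk ratings."""
    return sum(FACTOR_WEIGHTS[f] * r for f, r in ratings.items())

def risk_label(index):
    """Translate a numeric index into a qualitative rating."""
    for threshold, label in RISK_LABELS:
        if index >= threshold:
            return label

# One catchment cell with per-factor ratings on a 0-2 scale.
site = {"soil_test_P": 2, "fertilizer_rate": 1, "erosion": 0,
        "runoff_class": 1, "distance_to_stream": 2}
print(risk_index(site), risk_label(risk_index(site)))
```

    Cells rated 'high' by such a scheme would be flagged as candidate critical source areas for targeted control.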

  9. Critical Factors Affecting the Assessment of Student Learning Outcomes: A Delphi Study of the Opinions of Community College Personnel

    ERIC Educational Resources Information Center

    Somerville, Jerry

    2008-01-01

    The purpose of this qualitative study was to identify critically important factors that affect the meaningful assessment of student learning outcomes and study why these factors were critically important. A three-round Delphi process was used to solicit the opinions of individuals who were actively involved in student learning outcomes assessment…

  10. Ebola preparedness: a rapid needs assessment of critical care in a tertiary hospital

    PubMed Central

    Sutherland, Stephanie; Robillard, Nicholas; Kim, John; Dupuis, Kirsten; Thornton, Mary; Mansour, Marlene; Cardinal, Pierre

    2015-01-01

    Background: The current outbreak of Ebola has been declared a public health emergency of international concern. We performed a rigorous and rapid needs assessment to identify the desired results, the gaps in current practice, and the barriers and facilitators to the development of solutions in the provision of critical care to patients with suspected or confirmed Ebola. Methods: We conducted a qualitative study with an emergent design at a tertiary hospital in Ontario, Canada, recently designated as an Ebola centre, from Oct. 21 to Nov. 7, 2014. Participants included physicians, nurses, respiratory therapists, and staff from infection control, housekeeping, waste management, administration, facilities, and occupational health and safety. Data collection included document analysis, focus groups, interviews and walk-throughs of critical care areas with key stakeholders. Results: Fifteen themes and 73 desired results were identified, of which 55 had gaps. During the study period, solutions were implemented to fully address 8 gaps and partially address 18 gaps. Themes identified included the following: screening; response team activation; personal protective equipment; postexposure to virus; patient placement, room setup, logging and signage; intrahospital patient movement; interhospital patient movement; critical care management; Ebola-specific diagnosis and treatment; critical care staffing; visitation and contacts; waste management, environmental cleaning and management of linens; postmortem; conflict resolution; and communication. Interpretation: This investigation identified widespread gaps across numerous themes; as such, we have been able to develop a set of credible and measurable results. All hospitals need to be prepared for contact with a patient with Ebola, and the preparedness plan will need to vary based on local context, resources and site designation. PMID:26389098

  11. Selecting an Architecture for a Safety-Critical Distributed Computer System with Power, Weight and Cost Considerations

    NASA Technical Reports Server (NTRS)

    Torres-Pomales, Wilfredo

    2014-01-01

    This report presents an example of the application of multi-criteria decision analysis to the selection of an architecture for a safety-critical distributed computer system. The design problem includes constraints on minimum system availability and integrity, and the decision is based on the optimal balance of power, weight and cost. The analysis process includes the generation of alternative architectures, evaluation of individual decision criteria, and the selection of an alternative based on overall value. In the example presented here, iterative application of the quantitative evaluation process made it possible to deliberately generate an alternative architecture that is superior to all others regardless of the relative importance of cost.
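
    The overall-value step of such a multi-criteria analysis is often a weighted sum over normalized criterion scores; a minimal sketch follows, in which the criteria weights, candidate names, and scores are hypothetical, not values from the report.

```python
# Hypothetical weighted-sum multi-criteria ranking of candidate
# architectures; weights and scores are illustrative only.

def overall_value(scores, weights):
    """Weighted sum of per-criterion scores (higher is better)."""
    return sum(scores[c] * weights[c] for c in weights)

# Normalized weights for the three decision criteria (sum to 1).
weights = {"power": 0.4, "weight": 0.3, "cost": 0.3}

# Per-criterion scores on a 0-10 scale for two candidate architectures.
candidates = {
    "arch_A": {"power": 7, "weight": 6, "cost": 9},
    "arch_B": {"power": 9, "weight": 5, "cost": 6},
}

best = max(candidates, key=lambda a: overall_value(candidates[a], weights))
print(best)  # the architecture with the highest overall value
```

    Re-running the ranking under different weight vectors is one way to probe whether a preferred alternative stays dominant regardless of the relative importance of a criterion such as cost.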

  12. Computer Assisted Project-Based Instruction: The Effects on Science Achievement, Computer Achievement and Portfolio Assessment

    ERIC Educational Resources Information Center

    Erdogan, Yavuz; Dede, Dinçer

    2015-01-01

    The purpose of this study is to compare the effects of computer assisted project-based instruction on learners' achievement in a science and technology course, in a computer course and in portfolio development. With this aim in mind, a quasi-experimental design was used and a sample of 70 seventh grade secondary school students from Org. Esref…

  13. Development and assessment of a clinically viable system for breast ultrasound computer-aided diagnosis

    NASA Astrophysics Data System (ADS)

    Gruszauskas, Nicholas Peter

    The chances of surviving a breast cancer diagnosis as well as the effectiveness of any potential treatments increase significantly with early detection of the disease. As such, a considerable amount of research is being conducted to augment the breast cancer detection and diagnosis process. One such area of research involves the investigation and application of sophisticated computer algorithms to assist clinicians in detecting and diagnosing breast cancer on medical images (termed generally as "computer-aided diagnosis" or CAD). This study investigated a previously-developed breast ultrasound CAD system with the intent of translating it into a clinically-viable system. While past studies have demonstrated that breast ultrasound CAD may be a beneficial aid during the diagnosis of breast cancer on ultrasound, there are no investigations concerning its potential clinical translation and there are currently no commercially-available implementations of such systems. This study "bridges the gap" between the laboratory-developed system and the steps necessary for clinical implementation. A novel observer study was conducted that mimicked the clinical use of the breast ultrasound CAD system in order to assess the impact it had on the diagnostic performance of the user. Several robustness studies were also performed: the sonographic features used by the system were evaluated and the databases used for calibration and testing were characterized, the effect of the user's input was assessed by evaluating the performance of the system with variations in lesion identification and image selection, and the performance of the system on different patient populations was investigated by evaluating its performance on a database consisting solely of patients with Asian ethnicity. The analyses performed here indicate that the breast ultrasound CAD system under investigation is robust and demonstrates only minor variability when subjected to "real-world" use. All of these results are

  14. Computer Simulation as a Tool for Assessing Decision-Making in Pandemic Influenza Response Training

    PubMed Central

    Leaming, James M.; Adoff, Spencer; Terndrup, Thomas E.

    2013-01-01

    Introduction: We sought to develop and test a computer-based, interactive simulation of a hypothetical pandemic influenza outbreak. Fidelity was enhanced with integrated video and branching decision trees, built upon the 2007 federal planning assumptions. We conducted a before-and-after study of the simulation's effectiveness in assessing participants' beliefs regarding their own hospitals' mass casualty incident preparedness. Methods: Development: Using a Delphi process, we finalized a simulation that presents more than 50 key decisions to 6 role-players on networked laptops in a conference area. The simulation played out an 8-week scenario, beginning with pre-incident decisions. Testing: Role-players and trainees (N=155) were facilitated to make decisions during the pandemic. Because decision responses vary, the simulation plays out differently each time, and a casualty counter quantifies hypothetical losses. The facilitator reviews and critiques key factors for casualty control, including effective communications, working with external organizations, development of internal policies and procedures, maintaining supplies and services, technical infrastructure support, public relations and training. Pre- and post-survey data were compared for trainees. Results: Post-simulation, trainees indicated a greater likelihood of needing to improve their organization in terms of communications, mass casualty incident planning, public information and training. Participants also recognized which key factors required immediate attention at their own home facilities. Conclusion: The use of a computer simulation was effective in providing a facilitated environment for determining the perception of preparedness, evaluating general preparedness concepts and introducing participants to critical decisions involved in handling a regional pandemic influenza surge. PMID:23687542
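
    The branching decision-tree mechanic with a casualty counter can be sketched minimally as follows; the scenario nodes, choices, and penalty values are invented for illustration and are not taken from the simulation described above.

```python
# Hypothetical branching scenario: each node offers a choice, and each
# choice leads to a next node plus a casualty penalty. A full run
# accumulates penalties into a casualty counter.

TREE = {
    "start": {"yes": ("supplies", 0), "no": ("supplies", 50)},
    "supplies": {"yes": ("end", 0), "no": ("end", 120)},
}

def run_scenario(choices):
    """Walk the tree with a sequence of 'yes'/'no' choices;
    return the accumulated hypothetical casualty count."""
    node, casualties = "start", 0
    for choice in choices:
        next_node, penalty = TREE[node][choice]
        casualties += penalty
        node = next_node
    return casualties

print(run_scenario(["yes", "yes"]), run_scenario(["no", "no"]))
```

    Because different choice sequences accumulate different penalties, the same tree plays out differently for each group, which is what lets a facilitator compare decision paths afterward.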

  15. Ensuring critical event sequences in high consequence computer based systems as inspired by path expressions

    SciTech Connect

    Kidd, M.E.C.

    1997-02-01

    The goal of our work is to provide a high level of confidence that critical software driven event sequences are maintained in the face of hardware failures, malevolent attacks and harsh or unstable operating environments. This will be accomplished by providing dynamic fault management measures directly to the software developer and to their varied development environments. The methodology employed here is inspired by previous work in path expressions. This paper discusses the perceived problems, a brief overview of path expressions, the proposed methods, and a discussion of the differences between the proposed methods and traditional path expression usage and implementation.
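
    The core idea of enforcing a critical event sequence can be illustrated with a minimal guard object that rejects out-of-order events; the event names and error handling below are invented for illustration and are not the methods proposed in the paper.

```python
# Hypothetical sketch: enforce that critical events occur only in an
# allowed order (here: arm -> authorize -> fire), in the spirit of a
# path expression. An out-of-sequence event raises an error rather
# than silently proceeding.

class SequenceGuard:
    def __init__(self, required_order):
        self.required_order = list(required_order)
        self.position = 0

    def fire_event(self, event):
        """Accept the event only if it is the next one in the sequence."""
        expected = self.required_order[self.position]
        if event != expected:
            raise RuntimeError(
                f"event {event!r} out of sequence; expected {expected!r}")
        self.position += 1
        return event

guard = SequenceGuard(["arm", "authorize", "fire"])
guard.fire_event("arm")
guard.fire_event("authorize")
guard.fire_event("fire")
```

    A dynamic fault-management layer of this kind sits between the application and the critical actions, so that a hardware fault or malicious call that skips a step is detected at the point of violation.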

  16. Critical state solution and alternating current loss computation of polygonally arranged thin superconducting tapes

    NASA Astrophysics Data System (ADS)

    Brambilla, Roberto; Grilli, Francesco; Martini, Luciano

    2013-08-01

    The current density and field distributions in polygonally arranged thin superconducting tapes carrying AC current are derived under the assumption of the critical state model. Starting from the generic Biot-Savart law for a general polygonal geometry, we derive a suitable integral equation for the calculation of the current density and magnetic field in each tape. The model works for any transport current below Ic, which makes it attractive for studying cases of practical interest, particularly the dependence of AC losses on parameters such as the number of tapes, their distance from the center, and their separation.
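
    As a rough illustration of the geometry only (not the paper's integral-equation critical-state method), each tape can be approximated as an infinite straight line current and the magnetic field summed from the Biot-Savart result for a wire; by symmetry, equal currents on a regular polygon nearly cancel at the center. All parameter values below are illustrative.

```python
import math

MU0 = 4e-7 * math.pi  # vacuum permeability (T*m/A)

def field_at(point, wires):
    """2D field of infinite straight wires perpendicular to the plane.
    Each wire is ((x, y), I); returns (Bx, By) in tesla."""
    bx = by = 0.0
    px, py = point
    for (wx, wy), current in wires:
        rx, ry = px - wx, py - wy
        r2 = rx * rx + ry * ry
        # |B| = mu0 * I / (2 pi r), direction z_hat x r_hat
        coeff = MU0 * current / (2.0 * math.pi * r2)
        bx += -coeff * ry
        by += coeff * rx
    return bx, by

# N line currents on a regular polygon of radius R, each carrying I.
N, R, I = 8, 0.02, 100.0
wires = [((R * math.cos(2 * math.pi * k / N),
           R * math.sin(2 * math.pi * k / N)), I) for k in range(N)]
print(field_at((0.0, 0.0), wires))  # symmetry: near-zero at the center
```

    The tape model in the paper refines this picture by resolving the current density across each thin tape under the critical state constraint, which is what the integral equation is for.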

  17. Critical anatomic region of nasopalatine canal based on tridimensional analysis: cone beam computed tomography

    PubMed Central

    Fernández-Alonso, Ana; Antonio Suárez-Quintanilla, Juan; Muinelo-Lorenzo, Juan; Varela-Mallou, Jesús; Smyth Chamosa, Ernesto; Mercedes Suárez-Cunqueiro, María

    2015-01-01

    The aim of this study was to define the critical anatomic region of the premaxilla by evaluating the dimensions of the nasopalatine canal (NC), buccal bone plate (BBP) and palatal bone plate (PBP). 230 CBCTs were selected with both, one, or no upper central incisors present (+/+, −/+, −/−), and periodontal condition was evaluated. Student's t-test, ANOVA, Pearson's correlation and a multivariate linear regression model (MLRM) were used. Regarding gender, significant differences at level 1 (lower NC) were found for buccal-palatal, transversal and sagittal NC diameters, and NC length (NCL). Regarding dental status, significant differences were found for total BBP length (tBL) and PBP width (PW2) at level 2 (NCL midpoint). NCL was correlated with PW2, tBL, and PBP length at level 3 (foramina of Stenson level). An MLRM had a high prediction value for NCL (69.3%). Gender is related to NC dimensions. Dental status influences BBP dimensions, but does not influence NC and PBP. Periodontal condition should be evaluated for precise premaxilla analysis. NC diameters at the three anatomical planes are related to each other, while NCL is related to BBP and PBP lengths. A third of the premaxilla is taken up by the NC, thus establishing the critical anatomic region. PMID:26245884

  18. Critical anatomic region of nasopalatine canal based on tridimensional analysis: cone beam computed tomography.

    PubMed

    Fernández-Alonso, Ana; Suárez-Quintanilla, Juan Antonio; Muinelo-Lorenzo, Juan; Varela-Mallou, Jesús; Smyth Chamosa, Ernesto; Suárez-Cunqueiro, María Mercedes

    2015-01-01

    The aim of this study was to define the critical anatomic region of the premaxilla by evaluating the dimensions of the nasopalatine canal (NC), buccal bone plate (BBP) and palatal bone plate (PBP). 230 CBCTs were selected with both, one, or no upper central incisors present (+/+, -/+, -/-), and periodontal condition was evaluated. Student's t-test, ANOVA, Pearson's correlation and a multivariate linear regression model (MLRM) were used. Regarding gender, significant differences at level 1 (lower NC) were found for buccal-palatal, transversal and sagittal NC diameters, and NC length (NCL). Regarding dental status, significant differences were found for total BBP length (tBL) and PBP width (PW2) at level 2 (NCL midpoint). NCL was correlated with PW2, tBL, and PBP length at level 3 (foramina of Stenson level). An MLRM had a high prediction value for NCL (69.3%). Gender is related to NC dimensions. Dental status influences BBP dimensions, but does not influence NC and PBP. Periodontal condition should be evaluated for precise premaxilla analysis. NC diameters at the three anatomical planes are related to each other, while NCL is related to BBP and PBP lengths. A third of the premaxilla is taken up by the NC, thus establishing the critical anatomic region. PMID:26245884

  19. An assessment of future computer system needs for large-scale computation

    NASA Technical Reports Server (NTRS)

    Lykos, P.; White, J.

    1980-01-01

    Data ranging from specific computer capability requirements to opinions about the desirability of a national computer facility are summarized. It is concluded that considerable attention should be given to improving the user-machine interface. Otherwise, increased computer power may not improve the overall effectiveness of the machine user. Significant improvement in throughput requires highly concurrent systems plus the willingness of the user community to develop problem solutions for that kind of architecture. An unanticipated result was the expression of need for an on-going cross-disciplinary users group/forum in order to share experiences and to more effectively communicate needs to the manufacturers.

  20. Quantitative physical models of volcanic phenomena for hazards assessment of critical infrastructures

    NASA Astrophysics Data System (ADS)

    Costa, Antonio

    2016-04-01

    Volcanic hazards may have destructive effects on economy, transport, and natural environments at both local and regional scale. Hazardous phenomena include pyroclastic density currents, tephra fall, gas emissions, lava flows, debris flows and avalanches, and lahars. Volcanic hazards assessment is based on available information to characterize potential volcanic sources in the region of interest and to determine whether specific volcanic phenomena might reach a given site. Volcanic hazards assessment is focussed on estimating the distances that volcanic phenomena could travel from potential sources and their intensity at the considered site. Epistemic and aleatory uncertainties strongly affect the resulting hazards assessment. Within the context of critical infrastructures, volcanic eruptions are rare natural events that can create severe hazards. In addition to being rare events, evidence of many past volcanic eruptions is poorly preserved in the geologic record. The models used for describing the impact of volcanic phenomena generally represent a range of model complexities, from simplified physics based conceptual models to highly coupled thermo fluid dynamical approaches. Modelling approaches represent a hierarchy of complexity, which reflects increasing requirements for well characterized data in order to produce a broader range of output information. In selecting models for the hazard analysis related to a specific phenomenon, questions that need to be answered by the models must be carefully considered. Independently of the model, the final hazards assessment strongly depends on input derived from detailed volcanological investigations, such as mapping and stratigraphic correlations. For each phenomenon, an overview of currently available approaches for the evaluation of future hazards will be presented with the aim to provide a foundation for future work in developing an international consensus on volcanic hazards assessment methods.

  1. Cone beam computed tomography aided diagnosis and treatment of endodontic cases: Critical analysis.

    PubMed

    Yılmaz, Funda; Kamburoglu, Kıvanç; Yeta, Naz Yakar; Öztan, Meltem Dartar

    2016-07-28

    Although intraoral radiographs still remain the imaging method of choice for the evaluation of endodontic patients, the utilization of cone beam computed tomography (CBCT) in endodontics has increased markedly in recent years. This case series presentation shows the importance of CBCT-aided diagnosis and treatment of complex endodontic cases such as root resorption, missed extra canal, fusion, oblique root fracture, non-diagnosed periapical pathology and horizontal root fracture. CBCT may be a useful diagnostic method in several endodontic cases where intraoral radiography and clinical examination alone are unable to provide sufficient information. PMID:27551342

  2. Cone beam computed tomography aided diagnosis and treatment of endodontic cases: Critical analysis

    PubMed Central

    Yılmaz, Funda; Kamburoglu, Kıvanç; Yeta, Naz Yakar; Öztan, Meltem Dartar

    2016-01-01

    Although intraoral radiographs still remain the imaging method of choice for the evaluation of endodontic patients, the utilization of cone beam computed tomography (CBCT) in endodontics has increased markedly in recent years. This case series presentation shows the importance of CBCT-aided diagnosis and treatment of complex endodontic cases such as root resorption, missed extra canal, fusion, oblique root fracture, non-diagnosed periapical pathology and horizontal root fracture. CBCT may be a useful diagnostic method in several endodontic cases where intraoral radiography and clinical examination alone are unable to provide sufficient information. PMID:27551342

  3. Enabling Wide-Scale Computer Science Education through Improved Automated Assessment Tools

    NASA Astrophysics Data System (ADS)

    Boe, Bryce A.

    There is a proliferating demand for newly trained computer scientists as the number of computer science related jobs continues to increase. University programs will only be able to train enough new computer scientists to meet this demand when two things happen: when there are more primary and secondary school students interested in computer science, and when university departments have the resources to handle the resulting increase in enrollment. To meet these goals, significant effort is being made to both incorporate computational thinking into existing primary school education, and to support larger university computer science class sizes. We contribute to this effort through the creation and use of improved automated assessment tools. To enable wide-scale computer science education we do two things. First, we create a framework called Hairball to support the static analysis of Scratch programs targeted for fourth, fifth, and sixth grade students. Scratch is a popular building-block language utilized to pique interest in and teach the basics of computer science. We observe that Hairball allows for rapid curriculum alterations and thus contributes to wide-scale deployment of computer science curriculum. Second, we create a real-time feedback and assessment system utilized in university computer science classes to provide better feedback to students while reducing assessment time. Insights from our analysis of student submission data show that modifications to the system configuration support the way students learn and progress through course material, making it possible for instructors to tailor assignments to optimize learning in growing computer science classes.
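
    Block-level static analysis of the kind Hairball performs can be illustrated on a simplified project representation; the data structure and opcode names below are assumptions for illustration and are not Hairball's actual API, which parses real Scratch project files.

```python
# Hedged sketch of static analysis over a Scratch-like project,
# represented here as a list of scripts, each a list of block opcodes.
# The representation and opcode strings are hypothetical.

def count_block_types(scripts):
    """Tally block opcodes across all scripts in a project."""
    counts = {}
    for script in scripts:
        for opcode in script:
            counts[opcode] = counts.get(opcode, 0) + 1
    return counts

def uses_loops(scripts):
    """Curriculum-style check: does any script use an iteration block?"""
    loop_opcodes = {"control_repeat", "control_forever"}
    return any(op in loop_opcodes for script in scripts for op in script)

project = [
    ["event_whenflagclicked", "control_repeat", "motion_movesteps"],
    ["event_whenkeypressed", "looks_say"],
]
print(count_block_types(project), uses_loops(project))
```

    Checks like these, run automatically over many student submissions, are what make rapid curriculum alteration and large-class feedback feasible.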

  4. A critical review of environmental assessment tools for sustainable urban design

    SciTech Connect

    Ameen, Raed Fawzi Mohammed; Mourshed, Monjur; Li, Haijiang

    2015-11-15

    Cities are responsible for the depletion of natural resources and agricultural lands, and 70% of global CO2 emissions. There are significant risks to cities from the impacts of climate change in addition to existing vulnerabilities, primarily because of rapid urbanization. Urban design and development are generally considered the instruments that shape the future of a city, and they determine the pattern of a city's resource usage and its resilience to change, climatic or otherwise. Cities are inherently dynamic and require the participation and engagement of their diverse stakeholders for the effective management of change, which enables wider stakeholder involvement and buy-in at various stages of the development process. Sustainability assessment of urban design and development is increasingly seen as indispensable for informed decision-making. A sustainability assessment tool also acts as a driver for the uptake of sustainable pathways by recognizing excellence through its rating system and by creating a market demand for sustainable products and processes. This research reviews six widely used sustainability assessment tools for urban design and development: BREEAM Communities, LEED-ND, CASBEE-UD, SBToolPT-UP, Pearl Community Rating System (PCRS) and GSAS/QSAS, to identify, compare and contrast their aims, structures, assessment methodologies, scoring, weighting and suitability for application in different geographical contexts. Strengths and weaknesses of each tool are critically discussed. The study highlights the disparity in local and international contexts for global sustainability assessment tools. Despite their similar aims on environmental aspects, differences exist in the relative importance and share of mandatory vs. optional indicators in both environmental and social dimensions. PCRS and GSAS/QSAS are new incarnations, but have widely varying shares of mandatory indicators, at 45.4% and 11.36% respectively, compared to 30% in

  5. Implementing and Assessing Computational Modeling in Introductory Mechanics

    ERIC Educational Resources Information Center

    Caballero, Marcos D.; Kohlmyer, Matthew A.; Schatz, Michael F.

    2012-01-01

    Students taking introductory physics are rarely exposed to computational modeling. In a one-semester large lecture introductory calculus-based mechanics course at Georgia Tech, students learned to solve physics problems using the VPython programming environment. During the term, 1357 students in this course solved a suite of 14 computational…

  6. Evaluation and Assessment of a Biomechanics Computer-Aided Instruction.

    ERIC Educational Resources Information Center

    Washington, N.; Parnianpour, M.; Fraser, J. M.

    1999-01-01

    Describes the Biomechanics Tutorial, a computer-aided instructional tool that was developed at Ohio State University to expedite the transition from lecture to application for undergraduate students. Reports evaluation results that used statistical analyses and student questionnaires to show improved performance on posttests as well as positive…

  7. Use of computer-aided testing in the investigation of pilot response to critical in-flight events. Volume 2: Appendix

    NASA Technical Reports Server (NTRS)

    Rockwell, T. H.; Giffin, W. C.

    1982-01-01

    Computer displays using PLATO are illustrated. Diagnostic scenarios are described. A sample of subject data is presented. Destination diversion displays, a combined destination-diversion scenario, and a critical in-flight event (CIFE) data collection/subject testing system are presented.

  8. The Sixth Rhino: A Taxonomic Re-Assessment of the Critically Endangered Northern White Rhinoceros

    PubMed Central

    Groves, Colin P.; Fernando, Prithiviraj; Robovský, Jan

    2010-01-01

    Background The two forms of white rhinoceros, northern and southern, have had contrasting conservation histories. The northern form, once fairly numerous, is now critically endangered, while the southern form has recovered from a few individuals to a population of a few thousand. Since their last taxonomic assessment over three decades ago, new material and analytical techniques have become available, necessitating a review of available information and a re-assessment of the taxonomy. Results Dental morphology and cranial anatomy clearly diagnosed the southern and northern forms. The differentiation was well supported by dental metrics, cranial growth and craniometry, and corresponded with differences in the post-cranial skeleton, external measurements and external features. No distinctive differences were found in the limited descriptions of their behavior and ecology. Fossil history indicated the antiquity of the genus, dating back at least to the early Pliocene, and its evolution into a number of diagnosable forms. The fossil skulls examined fell outside the two extant forms in the craniometric analysis. Genetic divergence between the two forms was consistent across both nuclear and mitochondrial genomes, and indicated a separation of over a million years. Conclusions On re-assessing the taxonomy of the two forms we find them to be morphologically and genetically distinct, warranting the recognition of the taxa formerly designated as subspecies, Ceratotherium simum simum (the southern form) and Ceratotherium simum cottoni (the northern form), as two distinct species, Ceratotherium simum and Ceratotherium cottoni respectively. The recognition of the northern form as a distinct species has profound implications for its conservation. PMID:20383328

  9. The Application of Web-based Computer-assisted Instruction Courseware within Health Assessment

    NASA Astrophysics Data System (ADS)

    Xiuyan, Guo

    Health assessment is a clinical nursing course that places emphasis on clinical skills. The application of computer-assisted instruction in nursing education addresses the limitations of the traditional lecture class. This article describes teaching experience with web-based computer-assisted instruction, based upon a two-year study of courseware use within the health assessment course. The courseware can structure teaching, simulate clinical situations, create teaching contexts, and facilitate student study.

  10. Current Assessment and Classification of Suicidal Phenomena using the FDA 2012 Draft Guidance Document on Suicide Assessment: A Critical Review

    PubMed Central

    Giddens, Jennifer M.; Sheehan, Kathy Harnett

    2014-01-01

    Objective: Standard international classification criteria require that classification categories be comprehensive to avoid type II error. Categories should be mutually exclusive and definitions should be clear and unambiguous (to avoid type I and type II errors). In addition, the classification system should be robust enough to last over time and provide comparability between data collections. This article was designed to evaluate the extent to which the classification system contained in the United States Food and Drug Administration 2012 Draft Guidance for the prospective assessment and classification of suicidal ideation and behavior in clinical trials meets these criteria. Method: A critical review is used to assess the extent to which the proposed categories contained in the Food and Drug Administration 2012 Draft Guidance are comprehensive, unambiguous, and robust. Assumptions that underlie the classification system are also explored. Results: The Food and Drug Administration classification system contained in the 2012 Draft Guidance does not capture the full range of suicidal ideation and behavior (type II error). Definitions, moreover, are frequently ambiguous (susceptible to multiple interpretations), and the potential for misclassification (type I and type II errors) is compounded by frequent mismatches in category titles and definitions. These issues have the potential to compromise data comparability within clinical trial sites, across sites, and over time. Conclusion: These problems need to be remedied because of the potential for flawed data output and consequent threats to public health, to research on the safety of medications, and to the search for effective medication treatments for suicidality. PMID:25520889

  11. Computational fluid dynamics approaches in quality and hygienic production of semisolid low-moisture foods: a review of critical factors.

    PubMed

    Mondal, Arpita; Buchanan, Robert L; Lo, Y Martin

    2014-10-01

    Low-moisture foods have been responsible for a number of salmonellosis outbreaks worldwide over the last few decades, with cross contamination from contaminated equipment being the predominant source. To date, actions have been focused on stringent hygienic practices prior to production, namely periodical sanitization of the processing equipment and lines. Not only does optimum sanitization require in-depth knowledge of the type and source of contaminants, but the heat resistance of microorganisms is also unique and often dependent on the heat transfer characteristics of the low-moisture foods. Rheological properties, including viscosity, degree of turbulence, and flow characteristics (for example, Newtonian or non-Newtonian) of both liquid and semisolid foods are critical factors impacting the flow behavior that consequently interferes with heat transfer and related control elements. The demand for progressively more accurate prediction of complex fluid phenomena has called for the employment of computational fluid dynamics (CFD) to model mass and heat transfer during processing of various food products, ranging from drying to baking. With the aim of improving the quality and safety of low-moisture foods, this article critically reviews the published literature concerning microbial survival in semisolid low-moisture foods, including chocolate, honey, and peanut butter. Critical rheological properties and state-of-the-art CFD applications relevant to quality production of those products are also addressed. It is anticipated that adequate prediction of specific transport properties during optimum sanitization through CFD could be used to solve current and future food safety challenges. PMID:25224872
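
    The non-Newtonian behavior mentioned here is commonly modeled with the power-law (Ostwald-de Waele) relation, where apparent viscosity depends on shear rate; a minimal sketch follows, with illustrative parameter values that are not taken from the review.

```python
# Power-law (Ostwald-de Waele) apparent viscosity:
#   eta = K * gamma_dot ** (n - 1)
# Shear-thinning fluids (many semisolid foods) have n < 1, so
# viscosity falls as shear rate rises. K and n below are illustrative.

def apparent_viscosity(shear_rate, K, n):
    """Apparent viscosity (Pa*s) at a given shear rate (1/s)."""
    return K * shear_rate ** (n - 1)

# Illustrative shear-thinning parameters for a thick paste.
K, n = 50.0, 0.4  # consistency index (Pa*s^n), flow behavior index
for gamma in (0.1, 1.0, 10.0):
    print(gamma, apparent_viscosity(gamma, K, n))
```

    A CFD model for such a fluid supplies this viscosity law to the momentum equations, which is why flow and heat transfer predictions differ sharply from the Newtonian case.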

  12. A Tool for Music Preference Assessment in Critically Ill Patients Receiving Mechanical Ventilatory Support

    PubMed Central

    CHLAN, LINDA; HEIDERSCHEIT, ANNIE

    2010-01-01

    Music is an ideal intervention to reduce anxiety and promote relaxation in critically ill patients. This article reviews the research studies on music-listening interventions to manage distressful symptoms in this population, and describes the development and implementation of the Music Assessment Tool (MAT) to assist professionals in ascertaining patients’ music preferences in the challenging, dynamic clinical environment of the intensive care unit (ICU). The MAT is easy to use with these patients who experience profound communication challenges due to fatigue and inability to speak because of endotracheal tube placement. The music therapist and ICU nursing staff are encouraged to work collaboratively to implement music in a personalized manner to ensure the greatest benefit for mechanically ventilated patients. PMID:24489432

  13. Regulatory assessment of safety critical software used in upgrades to analog systems

    SciTech Connect

    Taylor, R.P.

    1994-12-31

    As a result of the difficulties encountered by both licensee and regulator during the licensing of the Darlington nuclear generating station software-based shutdown systems, Ontario Hydro was directed by the Atomic Energy Control Board (AECB) to produce improved company standards and procedures for safety-critical software development. In partnership with Atomic Energy of Canada Ltd. (AECL), a joint committee called OASES (Ontario Hydro/AECL Software Engineering Standards) has developed a suite of standards and procedures for software specification, design, implementation, verification, testing, and safety analysis. These standards are now being applied to new systems and are being adapted for use on upgrades to existing systems. Several digital protection systems have been installed recently in Canadian nuclear plants, such as a primary heat transport pump trip and an emergency powerhouse venting system. We have learned from the experience of assessing these systems and are now applying these lessons to systems developed under the new OASES standards.

  14. Assessment of chemical lumbar sympathectomy in critical limb ischaemia using thermal imaging.

    PubMed

    Greenstein, D; Brown, T F; Kester, R C

    1994-02-01

    Objective assessment of chemical lumbar sympathectomy (CLS) is lacking. Its success is usually judged in terms of the patient's clinical improvement. We have thermographically measured the immediate temperature changes of the lower limb following CLS using a thermal imager (SAN-EI Thermotracer 6T61). Seven patients with critical limb ischaemia and one patient with Raynaud's phenomenon underwent unilateral ablation of the lumbar sympathetic chain using 5% phenol. Four patients were diabetic, two of whom had undergone previous sympathectomy on the same side. Within fifteen minutes of injection, all patients showed a rise in skin temperature in parts of the sock distribution of between 0.8 degrees C and 8.5 degrees C. We conclude that the haemodynamic effects of CLS are immediate and can be objectively measured with thermal imaging. PMID:8195656

  15. A systematic review and critical assessment of incentive strategies for discovery and development of novel antibiotics

    PubMed Central

    Renwick, Matthew J; Brogan, David M; Mossialos, Elias

    2016-01-01

    Despite the growing threat of antimicrobial resistance, pharmaceutical and biotechnology firms are reluctant to develop novel antibiotics because of a host of market failures. This problem is complicated by public health goals that demand antibiotic conservation and equitable patient access. Thus, an innovative incentive strategy is needed to encourage sustainable investment in antibiotics. This systematic review consolidates, classifies and critically assesses a total of 47 proposed incentives. Given the large number of possible strategies, a decision framework is presented to assist with the selection of incentives. This framework focuses on addressing market failures that result in limited investment, public health priorities regarding antibiotic stewardship and patient access, and implementation constraints and operational realities. The flexible nature of this framework allows policy makers to tailor an antibiotic incentive package that suits a country's health system structure and needs. PMID:26464014

  16. A systematic review and critical assessment of incentive strategies for discovery and development of novel antibiotics.

    PubMed

    Renwick, Matthew J; Brogan, David M; Mossialos, Elias

    2016-02-01

    Despite the growing threat of antimicrobial resistance, pharmaceutical and biotechnology firms are reluctant to develop novel antibiotics because of a host of market failures. This problem is complicated by public health goals that demand antibiotic conservation and equitable patient access. Thus, an innovative incentive strategy is needed to encourage sustainable investment in antibiotics. This systematic review consolidates, classifies and critically assesses a total of 47 proposed incentives. Given the large number of possible strategies, a decision framework is presented to assist with the selection of incentives. This framework focuses on addressing market failures that result in limited investment, public health priorities regarding antibiotic stewardship and patient access, and implementation constraints and operational realities. The flexible nature of this framework allows policy makers to tailor an antibiotic incentive package that suits a country's health system structure and needs.

  17. PROBABILISTIC ASSESSMENT OF A CRITICALITY IN A WASTE CONTAINER AT SRS

    SciTech Connect

    Eghbali, D

    2006-12-26

    Transuranic solid waste that has been generated as a result of the production of nuclear material for the United States defense program at the Savannah River Site (SRS) has been stored in more than 30,000 55-gallon drums and various-size carbon steel boxes since 1953. Nearly two thirds of those containers have been processed and shipped to the Waste Isolation Pilot Plant. Among the containers assayed so far, the results indicate several drums with fissile inventories significantly higher (600-1000 grams 239Pu) than their originally assigned values. While part of this discrepancy can be attributed to the past limited assay capabilities, human errors are believed to be the primary contributor. This paper summarizes an assessment of the probability of occurrence of a criticality accident during handling of the remaining transuranic waste containers at SRS.

  18. Assessing Critical Source Areas in Watersheds for Conservation Buffer Planning and Riparian Restoration

    NASA Astrophysics Data System (ADS)

    Qiu, Zeyuan

    2009-11-01

    A science-based geographic information system (GIS) approach is presented to target critical source areas in watersheds for conservation buffer placement. Critical source areas are the intersection of hydrologically sensitive areas and pollutant source areas in watersheds. Hydrologically sensitive areas are areas that actively generate runoff in the watershed and are derived using a modified topographic index approach based on variable source area hydrology. Pollutant source areas are the areas in watersheds that are actively and intensively used for such activities as agricultural production. The method is applied to the Neshanic River watershed in Hunterdon County, New Jersey. The capacity of the topographic index in predicting the spatial pattern of runoff generation and the runoff contribution to stream flow in the watershed is evaluated. A simple cost-effectiveness assessment is conducted to compare the conservation buffer placement scenario based on this GIS method to conventional riparian buffer scenarios for placing conservation buffers in agricultural lands in the watershed. The results show that the topographic index reasonably predicts the runoff generation in the watershed. The GIS-based conservation buffer scenario appears to be more cost-effective than the conventional riparian buffer scenarios.
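    The targeting logic described above, intersecting hydrologically sensitive areas (high topographic wetness index, ln(a/tan beta)) with active pollutant source areas, can be sketched for a single grid cell. The cell values and threshold below are hypothetical, not data from the Neshanic River watershed study:

```python
import math

# Sketch of the topographic wetness index ln(a / tan(beta)) used to flag
# hydrologically sensitive areas, and the intersection with land use that
# defines a critical source area. All numbers are illustrative assumptions.

def topographic_index(upslope_area_per_width, slope_deg):
    """ln(a / tan(beta)): higher values mean greater runoff potential."""
    return math.log(upslope_area_per_width / math.tan(math.radians(slope_deg)))

def is_critical_source_area(ti, ti_threshold, is_agricultural):
    # Critical source area = hydrologically sensitive (high index)
    # AND an active pollutant source (e.g., intensive agricultural use).
    return ti >= ti_threshold and is_agricultural

flat_cell = topographic_index(500.0, 1.0)   # flat cell, large contributing area
steep_cell = topographic_index(50.0, 20.0)  # steep cell, small contributing area
# flat_cell > steep_cell: runoff (and buffer priority) concentrates on the flat cell
```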

  19. Temporal discounting in life cycle assessment: A critical review and theoretical framework

    SciTech Connect

    Yuan, Chris; Wang, Endong; Zhai, Qiang; Yang, Fan

    2015-02-15

    Temporal homogeneity of inventory data is one of the major problems in life cycle assessment (LCA). Addressing temporal homogeneity of life cycle inventory data is important in reducing the uncertainties and improving the reliability of LCA results. This paper attempts to present a critical review and discussion on the fundamental issues of temporal homogeneity in conventional LCA and propose a theoretical framework for temporal discounting in LCA. Theoretical perspectives for temporal discounting in life cycle inventory analysis are discussed first based on the key elements of a scientific mechanism for temporal discounting. Then generic procedures for performing temporal discounting in LCA are derived and proposed based on the nature of the LCA method and the identified key elements of a scientific temporal discounting method. A five-step framework is proposed and reported in detail based on the technical methods and procedures needed to perform a temporal discounting in life cycle inventory analysis. Challenges and possible solutions are also identified and discussed for the technical procedure and scientific accomplishment of each step within the framework. - Highlights: • A critical review for temporal homogeneity problem of life cycle inventory data • A theoretical framework for performing temporal discounting on inventory data • Methods provided to accomplish each step of the temporal discounting framework.
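    One simple form of the temporal discounting discussed above weights later emission flows down by a constant rate, analogous to financial discounting. The rate and flows below are illustrative assumptions, not values proposed by the review:

```python
# Hedged sketch of temporal discounting applied to a life cycle inventory:
# an emission flow at year t is weighted by 1/(1 + rate)**t, so releases far
# in the future contribute less to the aggregated inventory. The 3% rate and
# the flows are invented for illustration only.

def discounted_inventory(flows_by_year, rate):
    """Sum emission flows (kg), weighting each year-t flow by 1/(1+rate)**t."""
    return sum(kg / (1.0 + rate) ** year for year, kg in flows_by_year.items())

flows = {0: 100.0, 10: 100.0, 50: 100.0}  # kg CO2-eq released at years 0, 10, 50
undiscounted = sum(flows.values())        # 300.0 kg, the conventional LCA total
discounted = discounted_inventory(flows, rate=0.03)  # ~197 kg
```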

  20. Critical issues in the formation of quantum computer test structures by ion implantation

    SciTech Connect

    Schenkel, T.; Lo, C. C.; Weis, C. D.; Schuh, A.; Persaud, A.; Bokor, J.

    2009-04-06

    The formation of quantum computer test structures in silicon by ion implantation enables the characterization of spin readout mechanisms with ensembles of dopant atoms and the development of single atom devices. We briefly review recent results in the characterization of spin dependent transport and single ion doping and then discuss the diffusion and segregation behaviour of phosphorus, antimony and bismuth ions from low fluence, low energy implantations as characterized through depth profiling by secondary ion mass spectrometry (SIMS). Both phosphorus and bismuth are found to segregate to the SiO2/Si interface during activation anneals, while antimony diffusion is found to be minimal. An effect of the ion charge state on the range of antimony ions, 121Sb25+, in SiO2/Si is also discussed.

  1. Critical issues in the formation of quantum computer test structures by ion implantation

    NASA Astrophysics Data System (ADS)

    Schenkel, T.; Lo, C. C.; Weis, C. D.; Schuh, A.; Persaud, A.; Bokor, J.

    2009-08-01

    The formation of quantum computer test structures in silicon by ion implantation enables the characterization of spin readout mechanisms with ensembles of dopant atoms and the development of single atom devices. We briefly review recent results in the characterization of spin dependent transport and single ion doping and then discuss the diffusion and segregation behaviour of phosphorus, antimony and bismuth ions from low fluence, low energy implantations as characterized through depth profiling by secondary ion mass spectrometry (SIMS). Both phosphorus and bismuth are found to segregate to the SiO2/Si interface during activation anneals, while antimony diffusion is found to be minimal. An effect of the ion charge state on the range of antimony ions, 121Sb25+, in SiO2/Si is also discussed.

  2. Faculty Approaches to Assessing Critical Thinking in the Humanities and the Natural and Social Sciences: Implications for General Education

    ERIC Educational Resources Information Center

    Nicholas, Mark C.; Labig, Chalmer E., Jr.

    2013-01-01

    An analysis of interviews, focus-group discussions, assessment instruments, and assignment prompts revealed that within general education, faculty assessed critical thinking as faceted using methods and criteria that varied epistemically across disciplines. Faculty approaches were misaligned with discipline-general institutional approaches.…

  3. Assessing a Critical Aspect of Construct Continuity when Test Specifications Change or Test Forms Deviate from Specifications

    ERIC Educational Resources Information Center

    Liu, Jinghua; Dorans, Neil J.

    2013-01-01

    We make a distinction between two types of test changes: inevitable deviations from specifications versus planned modifications of specifications. We describe how score equity assessment (SEA) can be used as a tool to assess a critical aspect of construct continuity, the equivalence of scores, whenever planned changes are introduced to testing…

  4. Computer-aided diagnosis in lung nodule assessment.

    PubMed

    Goldin, Jonathan G; Brown, Matthew S; Petkovska, Iva

    2008-05-01

    Computed tomography (CT) imaging is playing an increasingly important role in cancer detection, diagnosis, and lesion characterization, and it is the most sensitive test for lung nodule detection. Interpretation of lung nodules involves characterization and integration of clinical and other imaging information. Advances in lung nodule management using CT require optimization of CT data acquisition, postprocessing tools, and computer-aided diagnosis (CAD). The goal of CAD systems being developed is to both assist radiologists in the more sensitive detection of nodules and noninvasively differentiate benign from malignant lesions; the latter is important given that malignant lesions account for between 1% and 11% of pulmonary nodules. The aim of this review is to summarize the current state of the art regarding CAD techniques for the detection and characterization of solitary pulmonary nodules and their potential applications in the clinical workup of these lesions.

  5. Long-Term Assessment of Critical Radionuclides and Associated Environmental Media at the Savannah River Site

    SciTech Connect

    Jannik, G. T.; Baker, R. A.; Lee, P. L.; Eddy, T. P.; Blount, G. C.; Whitney, G. R.

    2012-11-06

    During the operational history of the Savannah River Site (SRS), many different radionuclides have been released from site facilities. However, only a relatively small number of the released radionuclides have been significant contributors to doses and risks to the public. At SRS dose and risk assessments indicate tritium oxide in air and surface water, and Cs-137 in fish and deer have been, and continue to be, the critical radionuclides and pathways. In this assessment, in-depth statistical analyses of the long-term trends of tritium oxide in atmospheric and surface water releases and Cs-137 concentrations in fish and deer are provided. Correlations also are provided with 1) operational changes and improvements, 2) geopolitical events (Cold War cessation), and 3) recent environmental remediation projects and decommissioning of excess facilities. For example, environmental remediation of the F- and H-Area Seepage Basins and the Solid Waste Disposal Facility have resulted in a measurable impact on the tritium oxide flux to the onsite Fourmile Branch stream. Airborne releases of tritium oxide have been greatly affected by operational improvements and the end of the Cold War in 1991. However, the effects of SRS environmental remediation activities and ongoing tritium operations on tritium concentrations in the environment are measurable and documented in this assessment. Controlled hunts of deer and feral hogs are conducted at SRS for approximately six weeks each year. Before any harvested animal is released to a hunter, SRS personnel perform a field analysis for Cs-137 concentrations to ensure the hunter's dose does not exceed the SRS administrative game limit of 0.22 millisievert (22 mrem). However, most of the Cs-137 found in SRS onsite deer is not from site operations but is from nuclear weapons testing fallout from the 1950's and early 1960's. This legacy source term is trended in the SRS deer, and an assessment of the "effective" half-life of Cs-137 in deer
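    An "effective" half-life of the kind trended above combines radioactive decay with environmental (ecological) loss: the two removal rate constants add. A minimal sketch; the ecological half-life below is an assumed example, not an SRS result:

```python
# Effective half-life: radioactive decay and ecological removal act in
# parallel, so 1/T_eff = 1/T_phys + 1/T_eco. The 15-year ecological
# half-life is a hypothetical value for illustration only.

def effective_half_life(physical_t12, ecological_t12):
    """Combine physical and ecological half-lives (same units, e.g. years)."""
    return 1.0 / (1.0 / physical_t12 + 1.0 / ecological_t12)

# Cs-137 physical half-life is about 30.2 years.
t_eff = effective_half_life(30.2, 15.0)  # ~10 years, shorter than either input
```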

  6. Assessing the impact of ionizing radiation on aquatic invertebrates: a critical review.

    PubMed

    Dallas, Lorna J; Keith-Roach, Miranda; Lyons, Brett P; Jha, Awadhesh N

    2012-05-01

    There is growing scientific, regulatory and public concern over anthropogenic input of radionuclides to the aquatic environment, especially given the issues surrounding existing nuclear waste, future energy demand and past or potential nuclear accidents. A change in the approach to how we protect the environment from ionizing radiation has also underlined the importance of assessing its impact on nonhuman biota. This review presents a thorough and critical examination of the available information on the effects of ionizing radiation on aquatic invertebrates, which constitute approximately 90% of extant life on the planet and play vital roles in ecosystem functioning. The aim of the review was to assess the progress made so far, addressing any concerns and identifying the knowledge gaps in the field. The critical analysis of the available information included determining yearly publications in the field, qualities of radiation used, group(s) of animals studied, and levels of biological organization at which effects were examined. The overwhelming conclusion from analysis of the available information is that more data are needed in almost every area. However, in light of the current priorities in human and environmental health, and considering regulatory developments, the following are areas of particular interest for future research on the effects of ionizing radiation on nonhuman biota in general and aquatic invertebrates in particular: (1) studies that use end points across multiple levels of biological organization, including an ecosystem level approach where appropriate, (2) multiple species studies that produce comparable data across phylogenetic groups, and (3) determination of the modifying (i.e. antagonistic, additive or synergistic) effects of biotic and abiotic factors on the impact of ionizing radiation. It is essential that all of these issues are examined in the context of well-defined radiation exposure and total doses received and consider the life

  7. Computer-Aided Argument Mapping in an EFL Setting: Does Technology Precede Traditional Paper and Pencil Approach in Developing Critical Thinking?

    ERIC Educational Resources Information Center

    Eftekhari, Maryam; Sotoudehnama, Elaheh; Marandi, S. Susan

    2016-01-01

    Developing higher-order critical thinking skills as one of the central objectives of education has been recently facilitated via software packages. Whereas one such technology as computer-aided argument mapping is reported to enhance levels of critical thinking (van Gelder 2001), its application as a pedagogical tool in English as a Foreign…

  8. Regulating fatty acids in infant formula: critical assessment of U.S. policies and practices

    PubMed Central

    2014-01-01

    Background Fatty acids in breast-milk such as docosahexaenoic acid and arachidonic acid, commonly known as DHA and ARA, contribute to the healthy development of children in various ways. However, the manufactured versions that are added to infant formula might not have the same health benefits as those in breast-milk. There is evidence that the manufactured additives might cause harm to infants’ health, and they might lead to unwarranted increases in the cost of infant formula. The addition of such fatty acids to infant formula needs to be regulated. In the U.S., the Food and Drug Administration has primary responsibility for regulating the composition of infant formula. The central purpose of this study is to assess the FDA’s efforts with regard to the regulation of fatty acids in infant formula. Methods This study is based on critical analysis of policies and practices described in publicly available documents of the FDA, the manufacturers of fatty acids, and other relevant organizations. The broad framework for this work was set out by the author in his book on Regulating Infant Formula, published in 2011. Results The FDA does not assess the safety or the health impacts of fatty acid additives to infant formula before they are marketed, and there is no systematic assessment after marketing is underway. Rather than making its own independent assessments, the FDA accepts the manufacturers’ claims regarding their products’ safety and effectiveness. Conclusions The FDA is not adequately regulating the use of fatty acid additives to infant formula. This results in exposure of infants to potential risks. Adverse reactions are already on record. Also, the additives have led to increasing costs of infant formula despite the lack of proven benefits to normal, full term infants. There is a need for more effective regulation of DHA and ARA additives to infant formula. PMID:24433303

  9. Academic physicians' assessment of the effects of computers on health care.

    PubMed

    Detmer, W M; Friedman, C P

    1994-01-01

    We assessed the attitudes of academic physicians towards computers in health care at two academic medical centers that are in the early stages of clinical information-system deployment. We distributed a 4-page questionnaire to 470 subjects, and a total of 272 physicians (58%) responded. Our results show that respondents use computers frequently, primarily to perform academic-oriented tasks as opposed to clinical tasks. Overall, respondents viewed computers as being slightly beneficial to health care. They perceive self-education and access to up-to-date information as the most beneficial aspects of computers and are most concerned about privacy issues and the effect of computers on the doctor-patient relationship. Physicians with prior computer training and greater knowledge of informatics concepts had more favorable attitudes towards computers in health care. We suggest that negative attitudes towards computers can be addressed by careful system design as well as targeted educational activities.

  10. [Critical review of 222 cases of neoplastic pathology of the colon. Our experience using a computer].

    PubMed

    Parrella, R E; Astore, S; Brizi, M G; Natale, L; Pagano, A; Posi, G

    1987-11-01

    From August 1983 to December 1985, 2352 radiological examinations of the colon were performed in the Radiology Department of Università Cattolica del Sacro Cuore of Rome. From this group a sample of 222 patients was analyzed. They included 111 patients with colonic polyps and 111 with cancer. These cases were carefully examined, in terms of age, frequency of this pathology according to sex, symptom-illness rate, and the radiological data were compared with the endoscopic and histological findings. The data were processed using a computerized program. A critical correlation of the data obtained revealed that: 1) The surest symptom of colon carcinoma is blood in faeces with or without changes in defaecation frequency. Nor should isolated bowel disorders be ignored ("irritated" colon due to organic injuries). 2) The diagnostic accuracy of double contrast enema is very close to that of endoscopy, provided that intestinal cleaning is adequate (this in fact is an important aspect of the examination). 3) The mean age of patients in this group is high and cancer is more common than polyps. This seems to be due to the lack of a complete diagnostic sequence, in which radiology has a specific and important role.

  11. Online training course on critical appraisal for nurses: adaptation and assessment

    PubMed Central

    2014-01-01

    Background Research is an essential activity for improving quality and efficiency in healthcare. The objective of this study was to train nurses from the public Basque Health Service (Osakidetza) in critical appraisal, promoting continuous training and the use of research in clinical practice. Methods This was a prospective pre-post test study. The InfoCritique course on critical appraisal was translated and adapted. A sample of 50 nurses and 3 tutors was recruited. Educational strategies and assessment instruments were established for the course. A course website was created that contained contact details of the teaching team and coordinator, as well as a course handbook and videos introducing the course. Assessment comprised the administration of questionnaires before and after the course, in order to explore the main intervention outcomes: knowledge acquired and self-learning readiness. Satisfaction was also measured at the end of the course. Results Of the 50 health professionals recruited, 3 did not complete the course for personal or work-related reasons. The mean score on the pre-course knowledge questionnaire was 70.5 out of 100, with a standard deviation of 11.96. In general, participants’ performance on the knowledge questionnaire improved after the course, as reflected in the notable increase of the mean score, to 86.6, with a standard deviation of 10.00. Further, analyses confirmed statistically significant differences between pre- and post-course results (p < 0.001). With regard to self-learning readiness, after the course, participants reported a greater readiness and ability for self-directed learning. Lastly, in terms of level of satisfaction with the course, the mean score was 7 out of 10. Conclusions Participants significantly improved their knowledge score and self-directed learning readiness after the educational intervention, and they were overall satisfied with the course. For the health system and nursing professionals, this type of
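    The pre/post comparison reported above (mean score 70.5 rising to 86.6, p < 0.001) is the classic setting for a paired t-test on per-participant score differences. A sketch with invented scores; only the direction and rough size of the gain mirror the study:

```python
import math
import statistics

# Hypothetical sketch of a paired (pre/post) t-test. The six participants'
# scores are invented for illustration; they are not the course data.

def paired_t_statistic(pre, post):
    """t = mean(d) / (sd(d) / sqrt(n)) for paired differences d = post - pre."""
    diffs = [b - a for a, b in zip(pre, post)]
    n = len(diffs)
    return statistics.mean(diffs) / (statistics.stdev(diffs) / math.sqrt(n))

pre = [62.0, 71.0, 68.0, 75.0, 80.0, 67.0]
post = [80.0, 85.0, 84.0, 90.0, 95.0, 86.0]
t = paired_t_statistic(pre, post)  # large positive t: consistent gains
```

A large t with n - 1 degrees of freedom would be compared against the t-distribution to obtain the p-value.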

  12. Contemporary issues for experimental design in assessment of medical imaging and computer-assist systems

    NASA Astrophysics Data System (ADS)

    Wagner, Robert F.; Beiden, Sergey V.; Campbell, Gregory; Metz, Charles E.; Sacks, William M.

    2003-05-01

    The dialog among investigators in academia, industry, NIH, and the FDA has grown in recent years on topics of historic interest to attendees of these SPIE sub-conferences on Image Perception, Observer Performance, and Technology Assessment. Several of the most visible issues in this regard have been the emergence of digital mammography and modalities for computer-assisted detection and diagnosis in breast and lung imaging. These issues appear to be only the "tip of the iceberg" foreshadowing a number of emerging advances in imaging technology. So it is timely to make some general remarks looking back and looking ahead at the landscape (or seascape). The advances have been facilitated and documented in several forums. The major role of the SPIE Medical Imaging Conferences is well-known to all of us. Many of us were also present at the Medical Image Perception Society conference co-sponsored by CDRH and NCI in September of 2001 at Airlie House, VA. The workshops and discussions held at that conference addressed some critical contemporary issues related to how society - and in particular industry and FDA - approach the general assessment problem. A great deal of inspiration for these discussions was also drawn from several workshops in recent years sponsored by the Biomedical Imaging Program of the National Cancer Institute on these issues, in particular the problem of "The Moving Target" of imaging technology. Another critical phenomenon deserving our attention is that the Fourth National Forum on Biomedical Imaging in Oncology was recently held in Bethesda, MD., February 6-7, 2003. These forums are presented by the National Cancer Institute (NCI), the Food and Drug Administration (FDA), the Centers for Medicare and Medicaid Services (CMS), and the National Electrical Manufacturers Association (NEMA). They are sponsored by the National Institutes of Health/Foundation for Advanced Education in the Sciences (NIH/FAES). These forums led to the development of the NCI

  13. An Assessment of Nursing Attitudes toward Computers in Health Care.

    ERIC Educational Resources Information Center

    Carl, David L.; And Others

    The attitudes and perceptions of practicing nurses, student nurses, and nurse educators toward computerization of health care were assessed using questionnaires sent to two general hospitals and five nursing education programs. The sample consisted of 83 first-year nursing students, 84 second-year nursing students, 52 practicing nurses, and 26…

  14. Computational and experimental analysis of TMS-induced electric field vectors critical to neuronal activation

    NASA Astrophysics Data System (ADS)

    Krieg, Todd D.; Salinas, Felipe S.; Narayana, Shalini; Fox, Peter T.; Mogul, David J.

    2015-08-01

    Objective. Transcranial magnetic stimulation (TMS) represents a powerful technique to noninvasively modulate cortical neurophysiology in the brain. However, the relationship between the magnetic fields created by TMS coils and neuronal activation in the cortex is still not well-understood, making predictable cortical activation by TMS difficult to achieve. Our goal in this study was to investigate the relationship between induced electric fields and cortical activation measured by blood flow response. Particularly, we sought to discover the E-field characteristics that lead to cortical activation. Approach. Subject-specific finite element models (FEMs) of the head and brain were constructed for each of six subjects using magnetic resonance image scans. Positron emission tomography (PET) measured each subject’s cortical response to image-guided robotically-positioned TMS to the primary motor cortex. FEM models that employed the given coil position, orientation, and stimulus intensity in experimental applications of TMS were used to calculate the electric field (E-field) vectors within a region of interest for each subject. TMS-induced E-fields were analyzed to better understand what vector components led to regional cerebral blood flow (CBF) responses recorded by PET. Main results. This study found that decomposing the E-field into orthogonal vector components based on the cortical surface geometry (and hence, cortical neuron directions) led to significant differences between the regions of cortex that were active and nonactive. Specifically, active regions had significantly higher E-field components in the normal inward direction (i.e., parallel to pyramidal neurons in the dendrite-to-axon orientation) and in the tangential direction (i.e., parallel to interneurons) at high gradient. In contrast, nonactive regions had higher E-field vectors in the outward normal direction suggesting inhibitory responses. Significance. These results provide critical new
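    The key operation described above, splitting the TMS-induced E-field at a cortical surface point into components normal and tangential to the surface, is a standard vector decomposition against the local surface normal. A sketch with hypothetical vectors, not values from the study's finite element models:

```python
import math

# Decompose an E-field vector at a cortical surface point into a signed
# normal component and a tangential magnitude. The example vectors are
# invented; they are not results from the subject-specific FEMs.

def decompose(e_field, surface_normal):
    """Return (signed normal component, tangential magnitude) of e_field.

    surface_normal must be a unit vector pointing outward from the cortex,
    so a negative normal component is directed inward (dendrite-to-axon).
    """
    e_normal = sum(e * n for e, n in zip(e_field, surface_normal))
    tangential = [e - e_normal * n for e, n in zip(e_field, surface_normal)]
    e_tangential = math.sqrt(sum(t * t for t in tangential))
    return e_normal, e_tangential

# E-field of (30, 0, -40) V/m against an outward normal along +z:
normal_part, tangential_part = decompose((30.0, 0.0, -40.0), (0.0, 0.0, 1.0))
# normal_part = -40.0 (inward-directed), tangential_part = 30.0
```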

  15. Insights Into Microcirculation Underlying Critical Limb Ischemia by Single-Photon Emission Computed Tomography

    PubMed Central

    Liu, Jung-Tung; Chang, Cheng-Siu; Su, Chen-Hsing; Li, Cho-Shun

    2015-01-01

    Abstract Perfusion difference is used as a parameter to evaluate microcirculation. This study aims to differentiate lower-limb perfusion insufficiency from neuropathy to prevent possible occurrence of failed back surgery syndrome (FBSS). Patients were retrospectively gathered from 134 FBSS cases diagnosed in the past 7 years. A total of 82 cases in which neuralgia had been excluded by radiologic imaging, electrodiagnostic electromyography, and nerve conduction velocity studies were enrolled in this study. Perfusion difference was evaluated by single-photon emission computed tomography, and pain intensities were recorded via visual analog scale (VAS) score. Lower perfusion at the left leg comprised 51.2% (42 of 82) of the patients. The mean perfusion difference of the 82 patients was 0.86 ± 0.05 (range: 0.75–0.93). Patients with systemic vascular diseases exhibited significantly higher perfusion difference than that of patients without these related diseases (P < 0.05), except for renal insufficiency (P = 0.134). Significant correlation was observed between perfusion difference and VAS score (r = −0.78; P < 0.0001; n = 82). In this study, we presented perfusion difference as a parameter for evaluating microcirculation, which cannot be detected by ultrasonography or angiography. PMID:26166084
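    The association reported above (r = −0.78 between perfusion difference and VAS score) is a Pearson correlation. A minimal sketch with invented paired values; only the negative direction of the association is mirrored:

```python
import math

# Pearson correlation between a perfusion-difference ratio and a VAS pain
# score. The six paired values are hypothetical, not the study's data.

def pearson_r(xs, ys):
    """Pearson correlation coefficient of two equal-length samples."""
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    cov = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
    sx = math.sqrt(sum((x - mx) ** 2 for x in xs))
    sy = math.sqrt(sum((y - my) ** 2 for y in ys))
    return cov / (sx * sy)

perfusion_ratio = [0.93, 0.90, 0.86, 0.82, 0.78, 0.75]  # affected/unaffected
vas_score = [3.0, 4.0, 5.5, 6.5, 8.0, 9.0]              # higher = more pain
r = pearson_r(perfusion_ratio, vas_score)  # strongly negative, as in the study
```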

  16. Critical examination of the uniformity requirements for single-photon emission computed tomography.

    PubMed

    O'Connor, M K; Vermeersch, C

    1991-01-01

    It is generally recognized that single-photon emission computed tomography (SPECT) imposes very stringent requirements on gamma camera uniformity to prevent the occurrence of ring artifacts. The purpose of this study was to examine the relationship between nonuniformities in the planar data and the magnitude of the consequential ring artifacts in the transaxial data, and how the perception of these artifacts is influenced by factors such as reconstruction matrix size, reconstruction filter, and image noise. The study indicates that the relationship between ring artifact magnitude and image noise is essentially independent of the acquisition or reconstruction matrix sizes, but is strongly dependent upon the type of smoothing filter applied during the reconstruction process. Furthermore, the degree to which a ring artifact can be perceived above image noise is dependent on the size and location of the nonuniformity in the planar data, with small nonuniformities (1-2 pixels wide) close to the center of rotation being less perceptible than those further out (8-20 pixels). Small defects or nonuniformities close to the center of rotation are thought to cause the greatest potential corruption to tomographic data. The study indicates that such may not be the case. Hence the uniformity requirements for SPECT may be less demanding than was previously thought.

  17. Assessment of toxic metals in waste personal computers

    SciTech Connect

    Kolias, Konstantinos; Hahladakis, John N.; Gidarakos, Evangelos

    2014-08-15

    Highlights: • Waste personal computers were collected and dismantled into their main parts. • Motherboards, monitors and plastic housing were examined for their metal content. • Concentrations measured were compared to the RoHS Directive, 2002/95/EC. • Pb in motherboards and funnel glass of devices released <2006 was above the limit. • Waste personal computers need to be recycled and managed in an environmentally sound manner. - Abstract: Considering the enormous production of waste personal computers nowadays, it is obvious that the study of their composition is necessary in order to regulate their management and prevent any environmental contamination caused by their inappropriate disposal. This study aimed at determining the toxic metals content of motherboards (printed circuit boards), monitor glass and monitor plastic housing of two Cathode Ray Tube (CRT) monitors, three Liquid Crystal Display (LCD) monitors, one LCD touch screen monitor and six motherboards, all of which were discarded. In addition, concentrations of chromium (Cr), cadmium (Cd), lead (Pb) and mercury (Hg) were compared with the respective limits set by the RoHS 2002/95/EC Directive, which was recently renewed by the 2012/19/EU recast, in order to verify manufacturers’ compliance with the regulation. The research included disassembly, pulverization, digestion and chemical analyses of all the aforementioned devices. The toxic metals content of all samples was determined using Inductively Coupled Plasma-Mass Spectrometry (ICP-MS). The results demonstrated that concentrations of Pb in motherboards and funnel glass of devices with release dates before 2006, when the RoHS Directive came into force, exceeded the permissible limit. In general, except for Pb, higher metal concentrations were detected in motherboards in comparison with plastic housing and glass samples. Finally, the results of this work were encouraging, since concentrations of metals referred to in the RoHS Directive were found in

  18. The Identification, Implementation, and Evaluation of Critical User Interface Design Features of Computer-Assisted Instruction Programs in Mathematics for Students with Learning Disabilities

    ERIC Educational Resources Information Center

    Seo, You-Jin; Woo, Honguk

    2010-01-01

    Critical user interface design features of computer-assisted instruction programs in mathematics for students with learning disabilities and corresponding implementation guidelines were identified in this study. Based on the identified features and guidelines, a multimedia computer-assisted instruction program, "Math Explorer", which delivers…

  19. Assessment of cardiac function: magnetic resonance and computed tomography.

    PubMed

    Greenberg, S B

    2000-10-01

    A complete cardiac study requires both anatomic and physiologic evaluation. Cardiac function can be evaluated noninvasively by magnetic resonance imaging (MRI) or ultrafast computed tomography (CT). MRI allows for evaluation of cardiac function by cine gradient echo imaging of the ventricles and flow analysis across cardiac valves and the great vessels. Cine gradient echo imaging is useful for evaluation of cardiac wall motion, ventricular volumes, and ventricular mass. Flow analysis allows for measurement of velocity and flow during the cardiac cycle that reflect cardiac function. Ultrafast CT allows for measurement of cardiac indices similar to those provided by gradient echo imaging of the ventricles.
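
    As one concrete example of a cardiac index computable from the ventricular volumes that cine gradient echo MRI or ultrafast CT provide, the standard ejection-fraction calculation can be sketched as follows (the volumes are hypothetical):

```python
def ejection_fraction(edv_ml, esv_ml):
    """Ejection fraction from end-diastolic and end-systolic ventricular
    volumes, e.g. as measured by cine gradient echo MRI or ultrafast CT."""
    return (edv_ml - esv_ml) / edv_ml

ef = ejection_fraction(edv_ml=120.0, esv_ml=50.0)
# stroke volume = 70 ml; ef ~ 0.58 (58%)
```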

  20. Criticality Safety Assessment: Impact of Tank 40H Sludge Batch 2 Decant No. 2 on the Criticality Safety Assessment of the 242-25H Evaporator System (WSRC-TR-2000-00069)

    SciTech Connect

    Smiley, H.S.

    2001-07-30

    This assessment was done to evaluate the impact of the planned transfer of Decant No. 2 from Sludge Batch 2 in Tank 40H on the potential for solids accumulation in the 242-25H evaporator. It is a nuclear criticality safety (NCS) goal to demonstrate that the evaporator vessel cannot accumulate fissile material in a quantity and configuration that provides a pathway to criticality. The mechanism for accumulation of fissile material is through formation of aluminosilicate solids.

  1. Life Cycle Assessment of Pavements: A Critical Review of Existing Literature and Research

    SciTech Connect

    Santero, Nicholas; Masanet, Eric; Horvath, Arpad

    2010-04-20

    This report provides a critical review of existing literature and modeling tools related to life-cycle assessment (LCA) applied to pavements. The review finds that pavement LCA is an expanding but still limited research topic in the literature, and that the existing body of work exhibits methodological deficiencies and incompatibilities that serve as barriers to the widespread utilization of LCA by pavement engineers and policy makers. This review identifies five key issues in the current body of work: inconsistent functional units, improper system boundaries, imbalanced data for asphalt and cement, use of limited inventory and impact assessment categories, and poor overall utility. This review also identifies common data and modeling gaps in pavement LCAs that should be addressed in future work. These gaps include: the use phase (rolling resistance, albedo, carbonation, lighting, leachate, and tire wear and emissions), asphalt fumes, feedstock energy of bitumen, traffic delay, the maintenance phase, and the end-of-life phase. This review concludes with a comprehensive list of recommendations for future research, which shed light on where improvements in knowledge can be made that will benefit the accuracy and comprehensiveness of pavement LCAs moving forward.

  2. A Structural and Functional Assessment of the Lung via Multidetector-Row Computed Tomography

    PubMed Central

    Hoffman, Eric A.; Simon, Brett A.; McLennan, Geoffrey

    2006-01-01

    With advances in multidetector-row computed tomography (MDCT), it is now possible to image the lung in 10 s or less and accurately extract the lungs, lobes, and airway tree to the fifth- through seventh-generation bronchi and to regionally characterize lung density, texture, ventilation, and perfusion. These methods are now being used to phenotype the lung in health and disease and to gain insights into the etiology of pathologic processes. This article outlines the application of these methodologies with specific emphasis on chronic obstructive pulmonary disease. We demonstrate the use of our methods for assessing regional ventilation and perfusion and demonstrate early data that show, in a sheep model, a regionally intact hypoxic pulmonary vasoconstrictor (HPV) response with an apparent inhibition of HPV regionally in the presence of inflammation. We present the hypothesis that, in subjects with pulmonary emphysema, one major contributing factor leading to parenchymal destruction is the lack of a regional blunting of HPV when the regional hypoxia is related to regional inflammatory events (bronchiolitis or alveolar flooding). If maintaining adequate blood flow to inflamed lung regions is critical to the nondestructive resolution of inflammatory events, the pathologic condition whereby HPV is sustained in regions of inflammation would likely have its greatest effect in the lung apices where blood flow is already reduced in the upright body posture. PMID:16921136

  3. Benefits and Drawbacks of Computer-Based Assessment and Feedback Systems: Student and Educator Perspectives

    ERIC Educational Resources Information Center

    Debuse, Justin C. W.; Lawley, Meredith

    2016-01-01

    Providing students with high quality feedback is important and can be achieved using computer-based systems. While student and educator perspectives of such systems have been investigated, a comprehensive multidisciplinary study has not yet been undertaken. This study examines student and educator perspectives of a computer-based assessment and…

  4. Randomised Items in Computer-Based Tests: Russian Roulette in Assessment?

    ERIC Educational Resources Information Center

    Marks, Anthony M.; Cronje, Johannes C.

    2008-01-01

    Computer-based assessments are becoming more commonplace, perhaps as a necessity for faculty to cope with large class sizes. These tests often occur in large computer testing venues in which test security may be compromised. In an attempt to limit the likelihood of cheating in such venues, randomised presentation of items is automatically…

  5. The Role of Computer Conferencing in Delivery of a Short Course on Assessment of Learning Difficulties.

    ERIC Educational Resources Information Center

    Dwyer, Eamonn

    1991-01-01

    A pilot project at the University of Ulster (Northern Ireland) used the CAUCUS computer conferencing system on the CAMPUS 2000 education network to train teachers to assess young adults with severe learning difficulties. Despite problems with student attrition and system failure, computer conferencing was felt to be a useful medium for providing…

  6. Performance of a computer-based assessment of cognitive function measures in two cohorts of seniors

    Technology Transfer Automated Retrieval System (TEKTRAN)

    Computer-administered assessment of cognitive function is being increasingly incorporated in clinical trials; however, its performance in these settings has not been systematically evaluated. The Seniors Health and Activity Research Program (SHARP) pilot trial (N=73) developed a computer-based tool f...

  7. Assessing and Improving the Factorial Structures of the Computer Self-Efficacy Scale.

    ERIC Educational Resources Information Center

    Moroz, Pauline A.; Nash, John B.

    The Computer Self-Efficacy Scale (CSE) developed by C. A. Murphy, D. Coover, and S. V. Owen (1989) is an instrument purported to assess computer-related competencies. Previous research into the factor structure of the CSE has yielded conflicting results. In this study, the scale was used to collect data from 216 graduate education students. A…

  8. Impacts of Mobile Computing on Student Learning in the University: A Comparison of Course Assessment Data

    ERIC Educational Resources Information Center

    Hawkes, Mark; Hategekimana, Claver

    2010-01-01

    This study focuses on the impact of wireless, mobile computing tools on student assessment outcomes. In a campus-wide wireless, mobile computing environment at an upper Midwest university, an empirical analysis is applied to understand the relationship between student performance and Tablet PC use. An experimental/control group comparison of…

  9. Assessing Affordances of Selected Cloud Computing Tools for Language Teacher Education in Nigeria

    ERIC Educational Resources Information Center

    Ofemile, Abdulmalik Yusuf

    2015-01-01

    This paper reports part of a study that hoped to understand Teacher Educators' (TE) assessment of the affordances of selected cloud computing tools ranked among the top 100 for the year 2010. Research has shown that ICT and by extension cloud computing has positive impacts on daily life and this informed the Nigerian government's policy to…

  10. High Performance Computing Facility Operational Assessment, FY 2010 Oak Ridge Leadership Computing Facility

    SciTech Connect

    Bland, Arthur S Buddy; Hack, James J; Baker, Ann E; Barker, Ashley D; Boudwin, Kathlyn J.; Kendall, Ricky A; Messer, Bronson; Rogers, James H; Shipman, Galen M; White, Julia C

    2010-08-01

    Oak Ridge National Laboratory's (ORNL's) Cray XT5 supercomputer, Jaguar, kicked off the era of petascale scientific computing in 2008 with applications that sustained more than a thousand trillion floating point calculations per second - or 1 petaflop. Jaguar continues to grow even more powerful as it helps researchers broaden the boundaries of knowledge in virtually every domain of computational science, including weather and climate, nuclear energy, geosciences, combustion, bioenergy, fusion, and materials science. Their insights promise to broaden our knowledge in areas that are vitally important to the Department of Energy (DOE) and the nation as a whole, particularly energy assurance and climate change. The science of the 21st century, however, will demand further revolutions in computing, supercomputers capable of a million trillion calculations a second - 1 exaflop - and beyond. These systems will allow investigators to continue attacking global challenges through modeling and simulation and to unravel longstanding scientific questions. Creating such systems will also require new approaches to daunting challenges. High-performance systems of the future will need to be codesigned for scientific and engineering applications with best-in-class communications networks and data-management infrastructures and teams of skilled researchers able to take full advantage of these new resources. The Oak Ridge Leadership Computing Facility (OLCF) provides the nation's most powerful open resource for capability computing, with a sustainable path that will maintain and extend national leadership for DOE's Office of Science (SC). The OLCF has engaged a world-class team to support petascale science and to take a dramatic step forward, fielding new capabilities for high-end science. 
This report highlights the successful delivery and operation of a petascale system and shows how the OLCF fosters application development teams, developing cutting-edge tools and resources for next

  11. Cognitive Assessment of Movement-Based Computer Games

    NASA Technical Reports Server (NTRS)

    Kearney, Paul

    2008-01-01

    This paper examines the possibility that dance games such as Dance Dance Revolution or StepMania enhance the cognitive abilities that are critical to academic achievement. These games appear to place a high cognitive load on working memory requiring the player to convert a visual signal to a physical movement up to 7 times per second. Players see a pattern of directions displayed on the screen and they memorise these as a dance sequence. Other researchers have found that attention span and memory ability, both cognitive abilities required for academic achievement, are improved through the use of physical movement and exercise. This paper reviews these claims and documents tool development for on-going research by the author.

  12. Acute Perforated Diverticulitis: Assessment With Multidetector Computed Tomography.

    PubMed

    Sessa, Barbara; Galluzzo, Michele; Ianniello, Stefania; Pinto, Antonio; Trinci, Margherita; Miele, Vittorio

    2016-02-01

    Colonic diverticulitis is a common condition in the Western population. Complicated diverticulitis is defined as the presence of extraluminal air or abscess, peritonitis, colon occlusion, or fistulas. Multidetector row computed tomography (MDCT) is the modality of choice for the diagnosis and staging of diverticulitis and its complications, enabling an accurate differential diagnosis and guiding patients to appropriate management. MDCT is accurate in diagnosing the site of perforation in approximately 85% of cases, by the detection of direct signs (focal bowel wall discontinuity, extraluminal gas, and extraluminal enteric contrast) and indirect signs, which are represented by segmental bowel wall thickening, abnormal bowel wall enhancement, perivisceral fat stranding or fluid, and abscess. MDCT is also accurate in differentiating complicated colonic diverticulitis from colon cancer, which often have a similar imaging appearance. The computed tomography-guided classification is recommended to discriminate patients with mild diverticulitis, generally treated with antibiotics, from those with severe diverticulitis with a large abscess, which may be drained percutaneously.

  13. Assessment of metabolic bone diseases by quantitative computed tomography

    SciTech Connect

    Richardson, M.L.; Genant, H.K.; Cann, C.E.; Ettinger, B.; Gordan, G.S.; Kolb, F.O.; Reiser, U.J.

    1985-05-01

    Advances in the radiologic sciences have permitted the development of numerous noninvasive techniques for measuring the mineral content of bone, with varying degrees of precision, accuracy, and sensitivity. The techniques of standard radiography, radiogrammetry, photodensitometry, Compton scattering, neutron activation analysis, single and dual photon absorptiometry, and quantitative computed tomography (QCT) are described and reviewed in depth. Results from previous cross-sectional and longitudinal QCT investigations are given. The authors then describe a current investigation in which they studied 269 subjects, including 173 normal women, 34 patients with hyperparathyroidism, 24 patients with steroid-induced osteoporosis, and 38 men with idiopathic osteoporosis. Spinal quantitative computed tomography, radiogrammetry, and single photon absorptiometry were performed, and a spinal fracture index was calculated for all patients. The authors found a disproportionate loss of spinal trabecular mineral compared to appendicular mineral in the men with idiopathic osteoporosis and the patients with steroid-induced osteoporosis. They observed roughly equivalent mineral loss in both the appendicular and axial regions in the hyperparathyroid patients. The appendicular cortical measurements correlated moderately well with each other but less well with spinal trabecular QCT. The spinal fracture index correlated well with QCT and less well with the appendicular measurements.

  14. Assessing The Impact Of Computed Radiography And PACS

    NASA Astrophysics Data System (ADS)

    Hedgcock, Marcus W.; Kehr, Katherine

    1989-05-01

    Our institution (San Francisco VA Medical Center) is a VA pilot center for total digital imaging and PACS. Quantitative information about PACS impact on health care is limited, because no centers have done rigorous preimplementation studies. We are gathering quantitative service delivery and cost data before, during, and after stepwise implementation of computed radiography and PACS at our institution to define the impact on imaging service delivery. We designed a simple audit method using the x-ray request and time clocks to determine patient waiting time, imaging time, film use, image availability to the radiologist, matching of current with previous images, image availability to clinicians, and time to final interpretation. Our department model is a multichannel, multiserver patient queue. Our current radiograph file is space limited, containing only one year of images; older images are kept in a remote file area in another building. In addition, there are 16 subfile areas within the Radiology Service and the medical center. Our preimplementation audit showed some long waiting times (40 minutes, average 20) and immediate retrieval of prior films in only 42% of cases, with an average retrieval time of 22 hours. Computed radiography and the optical archive have the potential to improve these figures. The audit will be ongoing and automated as implementation of PACS progresses, to measure service improvement and the learning curve with the new equipment. We present the audit format and baseline preimplementation figures.

  15. 48 CFR 52.223-16 - IEEE 1680 Standard for the Environmental Assessment of Personal Computer Products.

    Code of Federal Regulations, 2013 CFR

    2013-10-01

    ... Environmental Assessment of Personal Computer Products. 52.223-16 Section 52.223-16 Federal Acquisition... Assessment of Personal Computer Products. As prescribed in 23.705(b)(1), insert the following clause: IEEE 1680 Standard for the Environmental Assessment of Personal Computer Products (DEC 2007) (a)...

  16. 48 CFR 52.223-16 - IEEE 1680 Standard for the Environmental Assessment of Personal Computer Products.

    Code of Federal Regulations, 2012 CFR

    2012-10-01

    ... Environmental Assessment of Personal Computer Products. 52.223-16 Section 52.223-16 Federal Acquisition... Assessment of Personal Computer Products. As prescribed in 23.705(b)(1), insert the following clause: IEEE 1680 Standard for the Environmental Assessment of Personal Computer Products (DEC 2007) (a)...

  17. 48 CFR 52.223-16 - IEEE 1680 Standard for the Environmental Assessment of Personal Computer Products.

    Code of Federal Regulations, 2011 CFR

    2011-10-01

    ... Environmental Assessment of Personal Computer Products. 52.223-16 Section 52.223-16 Federal Acquisition... Assessment of Personal Computer Products. As prescribed in 23.705(b)(1), insert the following clause: IEEE 1680 Standard for the Environmental Assessment of Personal Computer Products (DEC 2007) (a)...

  18. 48 CFR 52.223-16 - IEEE 1680 Standard for the Environmental Assessment of Personal Computer Products.

    Code of Federal Regulations, 2010 CFR

    2010-10-01

    ... Environmental Assessment of Personal Computer Products. 52.223-16 Section 52.223-16 Federal Acquisition... Assessment of Personal Computer Products. As prescribed in 23.706(b)(1), insert the following clause: IEEE 1680 Standard for the Environmental Assessment of Personal Computer Products (DEC 2007) (a)...

  19. The critical role of culture and environment as determinants of women's participation in computer science

    NASA Astrophysics Data System (ADS)

    Frieze, Carol

    This thesis proposes the need for, and illustrates, a new approach to how we think about, and act on, issues relating to women's participation, or lack of participation, in computer science (CS). This approach is based on a cultural perspective arguing that many of the reasons for women entering, or not entering, CS programs have little to do with gender and a lot to do with environment and culture. Evidence for this approach comes primarily from a qualitative research study, which shows the effects of changes in the micro-culture on CS undergraduates at Carnegie Mellon, and from studies of other cultural contexts that illustrate a "Women-CS fit". We also discuss the interventions that have been crucial to the evolution of this specific micro-culture. Our argument goes against the grain of many gender and CS studies which conclude that the reasons for women's low participation in CS are based in gender, and particularly in gender differences in how men and women relate to the field. Such studies tend to focus on gender differences and recommend accommodating (what are perceived to be) women's different ways of relating to CS. This is often interpreted as contextualizing the curriculum to make it "female-friendly". The CS curriculum at Carnegie Mellon was not contextualized to be "female-friendly". Nevertheless, over the past few years, the school has attracted and graduated well above the US national average for women in undergraduate CS programs. We argue that this is due in large part to changes in the culture and environment of the department. As the environment has shifted from an unbalanced to a more balanced environment (balanced in terms of gender, breadth of student personalities, and professional support for women) the way has been opened for a range of students, including a significant number of women, to participate, and be successful, in the CS major.
Our research shows that as men and women inhabit, and participate in, a more balanced environment

  20. Single-molecule protein sequencing through fingerprinting: computational assessment

    NASA Astrophysics Data System (ADS)

    Yao, Yao; Docter, Margreet; van Ginkel, Jetty; de Ridder, Dick; Joo, Chirlmin

    2015-10-01

    Proteins are vital in all biological systems as they constitute the main structural and functional components of cells. Recent advances in mass spectrometry have brought the promise of complete proteomics by helping draft the human proteome. Yet, this commonly used protein sequencing technique has fundamental limitations in sensitivity. Here we propose a method for single-molecule (SM) protein sequencing. A major challenge lies in the fact that proteins are composed of 20 different amino acids, which demands 20 molecular reporters. We computationally demonstrate that it suffices to measure only two types of amino acids to identify proteins and suggest an experimental scheme using SM fluorescence. When achieved, this highly sensitive approach will result in a paradigm shift in proteomics, with major impact in the biological and medical sciences.
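
    The paper's central claim, that reading out only two amino-acid types can suffice to identify a protein, can be illustrated with a toy fingerprint lookup. The sequences, protein names, and the choice of lysine (K) and cysteine (C) as the two reporters are assumptions made for this illustration:

```python
def fingerprint(seq, residues=("K", "C")):
    """Reduce a protein sequence to the ordered pattern of just two
    amino-acid types, discarding all other residues."""
    return "".join(aa for aa in seq if aa in residues)

# Toy reference database of hypothetical sequences
db = {
    "protA": "MKVLCGATKWC",
    "protB": "MGGACLLKKVA",
}
pattern = fingerprint("MKVLCGATKWC")      # observed single-molecule read
matches = [name for name, s in db.items() if fingerprint(s) == pattern]
# the pattern "KCKC" matches only protA in this toy database
```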

  1. The Interactive Media Package for Assessment of Communication and Critical Thinking (IMPACCT[c]): Testing a Programmatic Online Communication Competence Assessment System

    ERIC Educational Resources Information Center

    Spitzberg, Brian H.

    2011-01-01

    IMPACCT is an online survey covering over 40 self-report types of student communication competency, as well as a test of critical thinking based on cognitive problem-solving. The student nominates two peers who rate the student's interpersonal, computer-mediated, group and leadership, and public speaking communication competence. The student takes…

  2. A computer-based Safety Assessment for Flight Evacuation - SAFE

    NASA Technical Reports Server (NTRS)

    Shively, Robert J.

    1988-01-01

    The Safety Assessment for Flight Evacuation (SAFE) system has been developed for the computerized evaluation of safety in civil Emergency Medical Service (EMS) operations. The speed of the microprocessor used to analyze data allows many individual factors to be considered, as well as the interactions among those factors. SAFE's database is structured as if-then conditional statements. SAFE also allows the most important of the factors to be given greater weight in the final score. The questionnaire filled out by EMS crews encompassed mission-, crew-, organization-, environment-, and aircraft-related factors; each of these was subdivided into as many as eight variables affecting the EMS-mission risk of that factor.

  3. Quantitative computed tomography for spinal mineral assessment: current status

    NASA Technical Reports Server (NTRS)

    Genant, H. K.; Cann, C. E.; Ettinger, B.; Gordan, G. S.; Kolb, F. O.; Reiser, U.; Arnaud, C. D.

    1985-01-01

    Quantitative CT (QCT) is an established method for the noninvasive assessment of bone mineral content in the vertebral spongiosum and other anatomic locations. The potential strengths of QCT relative to dual photon absorptiometry (DPA) are its capability for precise three-dimensional anatomic localization providing a direct density measurement and its capability for spatial separation of highly responsive cancellous bone from less responsive cortical bone. The extraction of this quantitative information from the CT image, however, requires sophisticated calibration and positioning techniques and careful technical monitoring.

  4. Computational DNA hole spectroscopy: A new tool to predict mutation hotspots, critical base pairs, and disease ‘driver’ mutations

    PubMed Central

    Suárez Villagrán, Martha Y.; Miller, John H.

    2015-01-01

    We report on a new technique, computational DNA hole spectroscopy, which creates spectra of electron hole probabilities vs. nucleotide position. A hole is a site of positive charge created when an electron is removed. Peaks in the hole spectrum depict sites where holes tend to localize and potentially trigger a base pair mismatch during replication. Our studies of mitochondrial DNA reveal a correlation between L-strand hole spectrum peaks and spikes in the human mutation spectrum. Importantly, we also find that hole peak positions that do not coincide with large variant frequencies often coincide with disease-implicated mutations and/or (for coding DNA) encoded conserved amino acids. This enables combining hole spectra with variant data to identify critical base pairs and potential disease ‘driver’ mutations. Such integration of DNA hole and variance spectra could ultimately prove invaluable for pinpointing critical regions of the vast non-protein-coding genome. An observed asymmetry in the correlations between the spectrum of human mtDNA variations and the L- and H-strand hole spectra is attributed to asymmetric DNA replication processes for the leading and lagging strands. PMID:26310834
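
    The workflow implied above (locate peaks in the hole-probability spectrum, then compare them against the variant spectrum) can be sketched as follows. The spectra and threshold are invented, and the peak finder is a minimal stand-in for real peak detection:

```python
def peak_positions(spectrum, threshold):
    """Indices that exceed a threshold and are local maxima; a minimal
    stand-in for real peak detection in a hole-probability spectrum."""
    return [
        i for i in range(1, len(spectrum) - 1)
        if spectrum[i] > threshold
        and spectrum[i] >= spectrum[i - 1]
        and spectrum[i] >= spectrum[i + 1]
    ]

hole = [0.01, 0.02, 0.30, 0.02, 0.01, 0.25, 0.03, 0.01]  # hole probabilities
variants = [0, 0, 12, 1, 0, 0, 2, 0]                     # variant counts
hole_peaks = peak_positions(hole, threshold=0.1)
# The peak at index 2 coincides with a variant spike; the peak at index 5
# does not, and under the paper's logic would be flagged as a candidate
# disease 'driver' site.
```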

  5. A Comparative Assessment of Computer Literacy of Private and Public Secondary School Students in Lagos State, Nigeria

    ERIC Educational Resources Information Center

    Osunwusi, Adeyinka Olumuyiwa; Abifarin, Michael Segun

    2013-01-01

    The aim of this study was to conduct a comparative assessment of computer literacy of private and public secondary school students. Although the definition of computer literacy varies widely, this study treated computer literacy in terms of access to, and use of, computers and the internet, basic knowledge and skills required to use computers and…

  6. A Critical Assessment of Hygroscopic Seeding of Convective Clouds for Rainfall Enhancement.

    NASA Astrophysics Data System (ADS)

    Silverman, Bernard A.

    2003-09-01

    During the past decade, statistically positive results have been reported for four major, randomized hygroscopic seeding experiments, each in a different part of the world. Experiments on cold convective clouds using hygroscopic flares were carried out in South Africa and Mexico. Experiments on warm convective clouds using hygroscopic particles were carried out in Thailand and India. The scientific evidence for enhancing rainfall from convective clouds by hygroscopic seeding from these four randomized experiments is examined and critically assessed. The assessment uses, as a measure of proof of concept, the criteria for success of any cloud seeding activity that were recommended in the Scientific Background for the 1998 AMS Policy Statement on Planned and Inadvertent Weather Modifications, criteria that required both statistical and physical evidence.Based on a critical examination of the results of these four major, randomized hygroscopic seeding experiments, it has been concluded that they have not yet provided either the statistical or physical evidence required to establish that the effectiveness of hygroscopic seeding of convective clouds to increase precipitation is scientifically proven. The impressive statistical results from these experiments must be viewed with caution because, according to the proof-of-concept criteria, credibility of the results depends on the physical plausibility of the seeding conceptual model that forms the basis for anticipating seeding-induced increases in rainfall. The credibility of the hygroscopic seeding for microphysical effects hypothesis has been seriously undermined because it cannot explain the magnitude and timing of the statistically significant increases in precipitation that were observed. Theories suggesting that the microphysical effects of seeding-enhanced downdraft circulations to produce longer-lived clouds have been advanced; however, in the absence of any supporting physical or model evidence, they must be

  7. A Critical Assessment of Epidemiology Studies Regarding Dietary/Supplemental Zinc and Prostate Cancer Risk

    PubMed Central

    Costello, Leslie C.; Franklin, Renty B.; Tan, Ming T.

    2013-01-01

    understanding of these issues in assessing the validity and the conclusiveness of the outcomes from the epidemiology studies of purported associations of dietary and supplemental zinc on the risk of prostate cancer; particularly when the unsubstantiated conclusions are at odds with clinical and experimental evidence. It is in the interest of the medical, scientific and public communities that this critical review is undertaken. We hope that this review will generate an open, objective, scientific and medical discussion and assessment of this important issue. PMID:24204440

  8. Interactive Computer Based Assessment Tasks: How Problem-Solving Process Data Can Inform Instruction

    ERIC Educational Resources Information Center

    Zoanetti, Nathan

    2010-01-01

    This article presents key steps in the design and analysis of a computer based problem-solving assessment featuring interactive tasks. The purpose of the assessment is to support targeted instruction for students by diagnosing strengths and weaknesses at different stages of problem-solving. The first focus of this article is the task piloting…

  9. Supporting Student Learning: The Use of Computer-Based Formative Assessment Modules.

    ERIC Educational Resources Information Center

    Peat, Mary; Franklin, Sue

    2002-01-01

    Describes the development of a variety of computer-based assessment opportunities, both formative and summative, that are available to a large first-year biology class at the University of Sydney (Australia). Discusses online access to weekly quizzes, a mock exam, and special self-assessment modules that are beneficial to student learning.…

  10. Synchronous Computer-Mediated Dynamic Assessment: A Case Study of L2 Spanish Past Narration

    ERIC Educational Resources Information Center

    Darhower, Mark Anthony

    2014-01-01

    In this study, dynamic assessment is employed to help understand the developmental processes of two university Spanish learners as they produce a series of past narrations in a synchronous computer mediated environment. The assessments were conducted in six weekly one-hour chat sessions about various scenes of a Spanish language film. The analysis…

  11. Effects of Feedback in a Computer-Based Assessment for Learning

    ERIC Educational Resources Information Center

    van der Kleij, Fabienne M.; Eggen, Theo J. H. M.; Timmers, Caroline F.; Veldkamp, Bernard P.

    2012-01-01

    The effects of written feedback in a computer-based assessment for learning on students' learning outcomes were investigated in an experiment at a Higher Education institute in the Netherlands. Students were randomly assigned to three groups, and were subjected to an assessment for learning with different kinds of feedback. These are immediate…

  12. Staff and Student Perceptions of Computer-Assisted Assessment for Physiology Practical Classes

    ERIC Educational Resources Information Center

    Sheader, Elizabeth; Gouldsborough, Ingrid; Grady, Ruth

    2006-01-01

    Effective assessment of laboratory practicals is a challenge for large-size classes. To reduce the administrative burden of staff members without compromising the student learning experience, we utilized dedicated computer software for short-answer question assessment for nearly 300 students and compared it with the more traditional, paper-based…

  13. Computational Modeling and Assessment Of Nanocoatings for Ultra Supercritical Boilers

    SciTech Connect

    David W. Gandy; John P. Shingledecker

    2011-04-11

    Forced outages and boiler unavailability in conventional coal-fired fossil power plants are most often caused by fireside corrosion of boiler waterwalls. Industry-wide, the rate of wall thickness corrosion wastage of fireside waterwalls in fossil-fired boilers has been of concern for many years. It is significant that the introduction of nitrogen oxide (NOx) emission controls with staged burner systems has increased reported waterwall wastage rates to as much as 120 mils (3 mm) per year. Moreover, the reducing environment produced by the low-NOx combustion process is the primary cause of accelerated corrosion rates of waterwall tubes made of carbon and low alloy steels. Improved coatings, such as the MCrAl nanocoatings evaluated here (where M is Fe, Ni, and Co), are needed to reduce or eliminate waterwall damage in subcritical, supercritical, and ultra-supercritical (USC) boilers. The first two tasks of this six-task project, jointly sponsored by EPRI and the U.S. Department of Energy (DE-FC26-07NT43096), have focused on computational modeling of an advanced MCrAl nanocoating system and evaluation of two nanocrystalline (iron and nickel base) coatings, which will significantly improve the corrosion and erosion performance of tubing used in USC boilers. The computational model results showed that about 40 wt.% is required in Fe based nanocrystalline coatings for long-term durability, leading to a coating composition of Fe-25Cr-40Ni-10 wt.% Al. In addition, the long term thermal exposure test results further showed accelerated inward diffusion of Al from the nanocrystalline coatings into the substrate. In order to enhance the durability of these coatings, it is necessary to develop a diffusion barrier interlayer coating such as TiN and/or AlN. The third task, 'Process Advanced MCrAl Nanocoating Systems', of the six-task project jointly sponsored by the Electric Power Research Institute (EPRI) and the U.S. Department of Energy (DE-FC26-07NT43096), has focused on processing of

  14. Computational assessment of several hydrogen-free high energy compounds.

    PubMed

    Tan, Bisheng; Huang, Ming; Long, Xinping; Li, Jinshan; Fan, Guijuan

    2016-01-01

    Tetrazino-tetrazine-tetraoxide (TTTO) is an attractive high energy compound, but unfortunately it has not yet been synthesized experimentally. Isomerization of TTTO leads to its five isomers. Bond-separation energies were employed to compare the global stability of the six compounds; isomer 1 was found to have the highest bond-separation energy (1204.6 kJ/mol), compared with TTTO (1151.2 kJ/mol). Thermodynamic properties of the six compounds were calculated theoretically, including standard formation enthalpies (solid and gaseous), standard fusion, vaporization and sublimation enthalpies, lattice energies, and normal melting and boiling points. Their detonation performances were also computed, including detonation heat (Q), detonation velocity (D), detonation pressure (P) and impact sensitivity (h50). Compared with TTTO (Q=1311.01 J/g, D=9.228 km/s, P=40.556 GPa, h50=12.7 cm), isomer 5 exhibits better detonation performance (Q=1523.74 J/g, D=9.389 km/s, P=41.329 GPa, h50=28.4 cm).

  15. Assessment of toxic metals in waste personal computers.

    PubMed

    Kolias, Konstantinos; Hahladakis, John N; Gidarakos, Evangelos

    2014-08-01

    Considering the enormous production of waste personal computers nowadays, it is obvious that the study of their composition is necessary in order to regulate their management and prevent any environmental contamination caused by their inappropriate disposal. This study aimed at determining the toxic metals content of motherboards (printed circuit boards), monitor glass and monitor plastic housing of two Cathode Ray Tube (CRT) monitors, three Liquid Crystal Display (LCD) monitors, one LCD touch screen monitor and six motherboards, all of which were discarded. In addition, concentrations of chromium (Cr), cadmium (Cd), lead (Pb) and mercury (Hg) were compared with the respective limits set by the RoHS 2002/95/EC Directive, that was recently renewed by the 2012/19/EU recast, in order to verify manufacturers' compliance with the regulation. The research included disassembly, pulverization, digestion and chemical analyses of all the aforementioned devices. The toxic metals content of all samples was determined using Inductively Coupled Plasma-Mass Spectrometry (ICP-MS). The results demonstrated that concentrations of Pb in motherboards and funnel glass of devices with release dates before 2006, that is when the RoHS Directive came into force, exceeded the permissible limit. In general, except from Pb, higher metal concentrations were detected in motherboards in comparison with plastic housing and glass samples. Finally, the results of this work were encouraging, since concentrations of metals referred in the RoHS Directive were found in lower levels than the legislative limits.

  16. Ultrasound attenuation computed tomography assessment of PAGAT gel dose

    NASA Astrophysics Data System (ADS)

    Khoei, S.; Trapp, J. V.; Langton, C. M.

    2014-08-01

    Ultrasound has previously been investigated as an alternative readout method for irradiated polymer gel dosimeters, with authors reporting varying dose responses. We extend previous work utilizing a new computed tomography ultrasound scanner comprising two identical 5 MHz, 128-element linear-array ultrasound transducers, co-axially aligned and submerged in water as a coupling agent, with rotation of the gel dosimeter between the transducers facilitated by a robotic arm. We have investigated the dose-dependence of both ultrasound bulk attenuation and broadband ultrasound attenuation (BUA) for the PAGAT gel dosimeter. The ultrasound bulk attenuation dose sensitivity was found to be 1.46 ± 0.04 dB m⁻¹ Gy⁻¹, in agreement with previously published results for PAG and MAGIC gels. BUA was also found to be dose dependent and was measured to be 0.024 ± 0.003 dB MHz⁻¹ Gy⁻¹; the advantage of BUA is its insensitivity to frequency-independent attenuation mechanisms, including reflection and refraction, thereby minimizing image reconstruction artefacts.
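The reported bulk attenuation sensitivity implies a simple linear conversion from a measured attenuation change to absorbed dose. The sketch below illustrates that arithmetic; the function name and the assumption that the linear response holds over the measured range are ours, not the authors'.

```python
# Sketch: convert a measured change in ultrasound bulk attenuation to an
# absorbed-dose estimate using the dose sensitivity reported in the abstract
# (1.46 dB per metre per Gy). Illustrative only; assumes linearity holds.

BULK_SENSITIVITY_DB_PER_M_PER_GY = 1.46  # reported dose sensitivity

def dose_from_attenuation(delta_attenuation_db_per_m: float) -> float:
    """Estimate absorbed dose (Gy) from the attenuation change (dB/m)
    relative to an unirradiated reference gel."""
    return delta_attenuation_db_per_m / BULK_SENSITIVITY_DB_PER_M_PER_GY

# Example: a 14.6 dB/m attenuation increase corresponds to 10 Gy.
print(round(dose_from_attenuation(14.6), 2))  # 10.0
```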

  17. Cone Beam Computed Tomographic Assessment of Bifid Mandibular Condyle

    PubMed Central

    Khojastepour, Leila; Kolahi, Shirin; Panahi, Nazi

    2015-01-01

    Objectives: Differential diagnosis of bifid mandibular condyle (BMC) is important, since it may play a role in temporomandibular joint (TMJ) dysfunctions and joint symptoms. In addition, the radiographic appearance of BMC may mimic tumors and/or fractures. The aim of this study was to evaluate the prevalence and orientation of BMC based on cone beam computed tomography (CBCT) scans. Materials and Methods: This cross-sectional study was performed on CBCT scans of the paranasal sinuses of 425 patients. In a designated NNT station, all CBCT scans were evaluated in the axial, coronal and sagittal planes to find the frequency of BMC. The condylar head horizontal angulations were also determined in the transverse plane. The t-test was used to compare the frequency of BMC between the left and right sides and between males and females. Results: In total, 309 patients with acceptable visibility of the condyles on CBCT scans were included in the study, consisting of 170 (55%) females and 139 (45%) males with a mean age of 39.43±9.7 years. BMC was detected in 14 cases (4.53%). Differences between males and females, between sides, and in the horizontal angulations of the condyle between normal and BMC cases were not significant. Conclusion: The prevalence of BMC in the studied population was 4.53%. No significant difference was observed between males and females, sides or horizontal angulations of the involved and uninvolved condyles. PMID:27559345

  18. Computational assessment of several hydrogen-free high energy compounds.

    PubMed

    Tan, Bisheng; Huang, Ming; Long, Xinping; Li, Jinshan; Fan, Guijuan

    2016-01-01

    Tetrazino-tetrazine-tetraoxide (TTTO) is an attractive high energy compound, but unfortunately it has not yet been synthesized experimentally. Isomerization of TTTO leads to its five isomers. Bond-separation energies were employed to compare the global stability of the six compounds; isomer 1 was found to have the highest bond-separation energy (1204.6 kJ/mol), compared with TTTO (1151.2 kJ/mol). Thermodynamic properties of the six compounds were calculated theoretically, including standard formation enthalpies (solid and gaseous), standard fusion, vaporization and sublimation enthalpies, lattice energies, and normal melting and boiling points. Their detonation performances were also computed, including detonation heat (Q), detonation velocity (D), detonation pressure (P) and impact sensitivity (h50). Compared with TTTO (Q=1311.01 J/g, D=9.228 km/s, P=40.556 GPa, h50=12.7 cm), isomer 5 exhibits better detonation performance (Q=1523.74 J/g, D=9.389 km/s, P=41.329 GPa, h50=28.4 cm). PMID:26705845

  19. Computational intelligence for target assessment in Parkinson's disease

    NASA Astrophysics Data System (ADS)

    Micheli-Tzanakou, Evangelia; Hamilton, J. L.; Zheng, J.; Lehman, Richard M.

    2001-11-01

    Recent advances in image and signal processing have created a new challenging environment for biomedical engineers. Methods that were developed for different fields are now finding a fertile ground in biomedicine, especially in the analysis of bio-signals and in the understanding of images. More and more, these methods are used in the operating room, helping surgeons, and in the physician's office as aids for diagnostic purposes. Neural network (NN) research, on the other hand, has come a long way in the past decade. NNs now consist of many thousands of highly interconnected processing elements that can encode, store and recall relationships between different patterns by altering the weighting coefficients of inputs in a systematic way. Although they can generate reasonable outputs from unknown input patterns, and can tolerate a great deal of noise, they are very slow when run on a serial machine. We have used advanced signal processing and innovative image processing methods along with computational intelligence for diagnostic purposes and as visualization aids inside and outside the operating room. Applications discussed include EEGs and field potentials in Parkinson's disease, along with 3D reconstruction of MR or fMR brain images of Parkinson's patients, which are currently used in the operating room for pallidotomies and deep brain stimulation (DBS).

  20. High Performance Computing Facility Operational Assessment, CY 2011 Oak Ridge Leadership Computing Facility

    SciTech Connect

    Baker, Ann E; Barker, Ashley D; Bland, Arthur S Buddy; Boudwin, Kathlyn J.; Hack, James J; Kendall, Ricky A; Messer, Bronson; Rogers, James H; Shipman, Galen M; Wells, Jack C; White, Julia C; Hudson, Douglas L

    2012-02-01

    Oak Ridge National Laboratory's Leadership Computing Facility (OLCF) continues to deliver the most powerful resources in the U.S. for open science. At 2.33 petaflops peak performance, the Cray XT Jaguar delivered more than 1.4 billion core hours in calendar year (CY) 2011 to researchers around the world for computational simulations relevant to national and energy security; advancing the frontiers of knowledge in physical sciences and areas of biological, medical, environmental, and computer sciences; and providing world-class research facilities for the nation's science enterprise. Users reported more than 670 publications this year arising from their use of OLCF resources. Of these we report the 300 in this review that are consistent with guidance provided. Scientific achievements by OLCF users cut across all range scales from atomic to molecular to large-scale structures. At the atomic scale, researchers discovered that the anomalously long half-life of Carbon-14 can be explained by calculating, for the first time, the very complex three-body interactions between all the neutrons and protons in the nucleus. At the molecular scale, researchers combined experimental results from LBL's light source and simulations on Jaguar to discover how DNA replication continues past a damaged site so a mutation can be repaired later. Other researchers combined experimental results from ORNL's Spallation Neutron Source and simulations on Jaguar to reveal the molecular structure of ligno-cellulosic material used in bioethanol production. This year, Jaguar has been used to do billion-cell CFD calculations to develop shock wave compression turbo machinery as a means to meet DOE goals for reducing carbon sequestration costs. General Electric used Jaguar to calculate the unsteady flow through turbo machinery to learn what efficiencies the traditional steady flow assumption is hiding from designers. Even a 1% improvement in turbine design can save the nation billions of gallons of

  1. Teaching and Assessing Critical Thinking Skills for Argument Analysis in Psychology

    ERIC Educational Resources Information Center

    Bensley, D. Alan; Crowe, Deborah S.; Bernhardt, Paul; Buckner, Camille; Allman, Amanda L.

    2010-01-01

    Critical thinking is a valued educational outcome; however, little is known about whether psychology courses, especially ones such as research methods courses that might be expected to promote critical thinking skills, actually improve them. We compared the acquisition of critical thinking skills for analyzing psychological arguments in 3 groups…

  2. Assessment of Stirling Technology Has Provided Critical Data Leading Toward Flight Readiness of the Stirling Converter

    NASA Technical Reports Server (NTRS)

    Thieme, Lanny G.

    2001-01-01

    The NASA Glenn Research Center is supporting the development of a Stirling converter with the Department of Energy (DOE, Germantown, Maryland) for an advanced Stirling Radioisotope Power System (SRPS) to provide spacecraft onboard electric power for NASA space science missions. A key technology assessment completed by Glenn and DOE has led to the SRPS being identified as a high-efficiency power source for such deep space missions as the Europa Orbiter and the Solar Probe. In addition, the Stirling system is now being considered for unmanned Mars rovers, especially where mission profiles may exclude the use of photovoltaic power systems, such as exploration at high Martian latitudes or for missions of long duration. The SRPS efficiency of over 20 percent will reduce the required amount of radioisotope by more than a factor of 3 in comparison to current radioisotope thermoelectric generators. This significantly reduces radioisotope cost, radiological inventory, and system cost, and it provides efficient use of scarce radioisotope resources. In support of this technology assessment, Glenn conducted a series of independent evaluations and tests to determine the technology readiness of a 55-We Stirling converter developed by Stirling Technology Company (Kennewick, Washington) and DOE. Key areas evaluated by Glenn included: 1) radiation tolerance of materials; 2) random vibration testing of the Stirling converter in Glenn's Structural Dynamics Lab to simulate operation in the launch environment; 3) electromagnetic interference and compatibility (EMI/EMC) of the converter operating in Glenn's EMI lab; 4) independent failure modes, effects, and criticality analysis, and life and reliability assessment; and 5) an SRPS cost estimate. The data from these evaluations were presented to NASA Headquarters and the Jet Propulsion Laboratory mission office by a joint industry/Government team

  3. Computational Assessment of the Aerodynamic Performance of a Variable-Speed Power Turbine for Large Civil Tilt-Rotor Application

    NASA Technical Reports Server (NTRS)

    Welch, Gerard E.

    2011-01-01

    The main rotors of the NASA Large Civil Tilt-Rotor notional vehicle operate over a wide speed-range, from 100% at take-off to 54% at cruise. The variable-speed power turbine offers one approach by which to effect this speed variation. Key aero-challenges include high work factors at cruise and wide (40 to 60 deg.) incidence variations in blade and vane rows over the speed range. The turbine design approach must optimize cruise efficiency and minimize off-design penalties at take-off. The accuracy of the off-design incidence loss model is therefore critical to the turbine design. In this effort, 3-D computational analyses are used to assess the variation of turbine efficiency with speed change. The conceptual design of a 4-stage variable-speed power turbine for the Large Civil Tilt-Rotor application is first established at the meanline level. The design of 2-D airfoil sections and resulting 3-D blade and vane rows is documented. Three-dimensional Reynolds Averaged Navier-Stokes computations are used to assess the design and off-design performance of an embedded 1.5-stage portion-Rotor 1, Stator 2, and Rotor 2-of the turbine. The 3-D computational results yield the same efficiency versus speed trends predicted by meanline analyses, supporting the design choice to execute the turbine design at the cruise operating speed.

  4. Conceptualizing learning for sustainability through environmental assessment: critical reflections on 15 years of research

    SciTech Connect

    Sinclair, A. John Diduck, Alan Fitzpatrick, Patricia

    2008-10-15

    Numerous scholars are now directing their attention to the education and learning implications of participatory resource and environmental governance because of the potential of these for generating the social mobilization necessary to achieve sustainability trajectories. Our work, and that of other researchers, establishes that public participation in environmental assessment (EA) provides fertile ground for considering the intricacies of governance as they relate to participation, and for examining the education and learning implications of participation. Since EA law requires in many cases that public voices be part of the decision process, it has resulted in the creation of fascinating, state-sanctioned, deliberative spaces for civic interactions. Our purpose here is to share, and build upon, a framework that conceptualizes the relationships among participation, education, learning and sustainability in an EA context. We do so by considering findings from studies we have undertaken on participation in EA in Canada since the early 1990s. Our approach was interactive and collaborative. We each considered in detail the key results of our earlier work as they relate to education, learning and EA process design. The findings illuminate aspects of the conceptual framework for which there is considerable empirical evidence, such as the link between meaningful participation and critical education and the diversity of individual learning outcomes associated with public participation in EA. The findings also highlight those parts of the framework for which the empirical evidence is relatively sparse, such as the range of possible social learning outcomes, their congruence with sustainability criteria, and the roles of monitoring and cumulative and strategic assessments in shaping EA into an adaptive, learning system.

  5. Critical examination of assumptions used in risk assessments of dioxin contaminated soil

    SciTech Connect

    Paustenbach, D.J.; Shu, H.P.; Murray, F.J.

    1986-09-01

    This paper critically evaluates several aspects of previously proposed approaches to setting limits for 2,3,7,8-tetrachlorodibenzo-p-dioxin (dioxin, TCDD) in residential soil and soil within industrial sites. Factors and assumptions which significantly affect the predicted degree of hazard associated with exposure to soil contaminated with low levels of dioxin are discussed. This paper shows how different, more justifiable assumptions than those used by the Centers for Disease Control (CDC) regarding the quantities of soil typically consumed by children, TCDD's nongenotoxicity, dermal exposure to soil, the concentration of airborne soil particles, dioxin's bioavailability in soil, and extrapolation of the dose response curve can profoundly affect the results of the risk assessment and, subsequently, the magnitude of the recommended limits. Two case studies which quantitatively illustrate the effect of these assumptions on the risk estimates are presented. Non-U.S. regulatory agencies have considered TCDD's nongenotoxicity in estimating that the virtually safe dose (VSD) or acceptable daily dose for dioxin is approximately 10 pg/kg/day (10,000 fg/kg/day). These approaches are compared and contrasted with the method used by the United States EPA whose risk estimates are higher and whose VSD is approximately 1000-fold lower. Alternative approaches to interpreting the cancer data indicate that a VSD of 130 pg/kg/day is more scientifically justified than risks estimated using standard approaches. This assessment indicates that a soil concentration of TCDD considerably in excess of 1 ppb should be acceptable for residential and nonresidential areas.
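The exposure assumptions debated in this abstract (soil ingestion rate, bioavailability, dose-response extrapolation) feed into a simple average-daily-dose calculation. The sketch below shows the standard form of that arithmetic; all parameter values (soil intake, bioavailability, body weight) are illustrative assumptions, not the CDC's or the authors' figures.

```python
# Illustrative back-of-the-envelope calculation of the kind debated in the
# abstract: average daily TCDD dose from incidental soil ingestion.
# Unit note: 1 ppb TCDD in soil = 1 ng per g of soil = 1 pg per mg of soil.

def daily_dose_pg_per_kg(soil_conc_ppb: float,
                         soil_intake_mg_per_day: float,
                         bioavailability: float,
                         body_weight_kg: float) -> float:
    """Average daily dose in pg/kg/day from soil ingestion alone."""
    return (soil_conc_ppb * soil_intake_mg_per_day * bioavailability
            / body_weight_kg)

# Hypothetical child: 100 mg/day soil intake, 1 ppb soil, 30% bioavailability,
# 15 kg body weight.
dose = daily_dose_pg_per_kg(1.0, 100.0, 0.3, 15.0)
print(round(dose, 2))  # 2.0 pg/kg/day
```

Varying these assumptions by the factors discussed in the paper shifts the estimated dose proportionally, which is exactly why the choice of assumptions dominates the resulting soil limit.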

  6. An assessment of some methodological criticisms of studies of RNA efflux from isolated nuclei.

    PubMed

    Agutter, P S

    1983-09-15

    RNA efflux from isolated nuclei can be studied either as a means of elucidating the general mechanism of nucleo-cytoplasmic RNA transport, or as part of an investigation of the processing and utilization of particular gene transcripts. The present paper describes an assessment of three methodological criticisms of RNA-efflux measurements that are made for the former reason: for such measurements, it is sufficient to show that the post-incubation supernatant RNA is similar overall to homologous cytoplasmic mRNA, rather than to nuclear RNA, that is nevertheless of intranuclear origin, and that alterations to the medium during experiments do not markedly perturb this general nuclear restriction. The results seem to justify the following conclusions. (1) Although degradation of the nuclear RNA occurs during incubation in vitro, this process does not account for the appearance of RNA in the postnuclear supernatant. The degradation can be largely prevented by the addition of serine-proteinase inhibitors without altering the RNA efflux rate. (2) Some adsorption of labelled cytoplasmic RNA to the nuclear surface occurs during both isolation and incubation of the nuclei, and some desorption occurs during incubation. However, these effects introduce errors of less than 10% into the measurements of efflux rates. (3) Exogenous acidic polymers, including polyribonucleotides, disrupt nuclei and increase the apparent RNA efflux rate by causing leakage of nuclear contents. However, this effect can largely be overcome by including the nuclear stabilizers spermidine, Ca2+ and Mn2+ in the medium. In terms of this assessment, it appears that RNA efflux from isolated nuclei in media containing nuclear stabilizers serves as a reasonable model for transport in vivo.

  7. An assessment of some methodological criticisms of studies of RNA efflux from isolated nuclei.

    PubMed Central

    Agutter, P S

    1983-01-01

    RNA efflux from isolated nuclei can be studied either as a means of elucidating the general mechanism of nucleo-cytoplasmic RNA transport, or as part of an investigation of the processing and utilization of particular gene transcripts. The present paper describes an assessment of three methodological criticisms of RNA-efflux measurements that are made for the former reason: for such measurements, it is sufficient to show that the post-incubation supernatant RNA is similar overall to homologous cytoplasmic mRNA, rather than to nuclear RNA, that is nevertheless of intranuclear origin, and that alterations to the medium during experiments do not markedly perturb this general nuclear restriction. The results seem to justify the following conclusions. (1) Although degradation of the nuclear RNA occurs during incubation in vitro, this process does not account for the appearance of RNA in the postnuclear supernatant. The degradation can be largely prevented by the addition of serine-proteinase inhibitors without altering the RNA efflux rate. (2) Some adsorption of labelled cytoplasmic RNA to the nuclear surface occurs during both isolation and incubation of the nuclei, and some desorption occurs during incubation. However, these effects introduce errors of less than 10% into the measurements of efflux rates. (3) Exogenous acidic polymers, including polyribonucleotides, disrupt nuclei and increase the apparent RNA efflux rate by causing leakage of nuclear contents. However, this effect can largely be overcome by including the nuclear stabilizers spermidine, Ca2+ and Mn2+ in the medium. In terms of this assessment, it appears that RNA efflux from isolated nuclei in media containing nuclear stabilizers serves as a reasonable model for transport in vivo. PMID:6194787

  8. Computer-aided design of dry powder inhalers using computational fluid dynamics to assess performance.

    PubMed

    Suwandecha, Tan; Wongpoowarak, Wibul; Srichana, Teerapol

    2016-01-01

    Dry powder inhalers (DPIs) are gaining popularity for the delivery of drugs, and a cost-effective and efficient delivery device is necessary. Developing a new DPI by modifying an existing device may be the simplest way to improve the performance of these devices. The aim of this research was to produce a new DPI using computational fluid dynamics (CFD). The new DPI took advantage of the Cyclohaler® and the Rotahaler®: we chose a combination of the capsule chamber of the Cyclohaler® and the mouthpiece and grid of the Rotahaler®. Computer-aided design models of the devices were created and evaluated using CFD. Prototype models were created and tested in DPI dispersion experiments. The proposed model 3 device showed high turbulence and a good degree of deagglomeration in both the CFD and the experimental data. The fine particle fraction (FPF) was around 50% at 60 L/min. The mass median aerodynamic diameter was around 2.8-4 μm. The FPF was strongly correlated with the CFD-predicted turbulence and the mechanical impaction parameters. The drug retention in the capsule was only 5-7%. In summary, a simple modification of the Cyclohaler® and Rotahaler® could produce a better performing inhaler using CFD-assisted design.
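The fine particle fraction quoted in the abstract is conventionally the mass of drug in particles below a cutoff aerodynamic diameter (often around 5 μm, measured by cascade impaction) expressed as a percentage of a reference dose. A minimal sketch of that ratio, with invented masses; the exact reference dose used by the authors is not stated here.

```python
# Sketch: fine particle fraction (FPF) as a percentage of the emitted dose.
# The masses below are invented for illustration; in practice they come from
# chemical assay of cascade-impactor stages.

def fine_particle_fraction(fine_particle_mass_ug: float,
                           emitted_dose_ug: float) -> float:
    """FPF (%) = mass below the cutoff diameter / emitted dose x 100."""
    return 100.0 * fine_particle_mass_ug / emitted_dose_ug

# 250 ug of drug below the cutoff out of a 500 ug emitted dose:
print(round(fine_particle_fraction(250.0, 500.0), 1))  # 50.0
```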

  9. eLearning to facilitate the education and implementation of the Chelsea Critical Care Physical Assessment: a novel measure of function in critical illness

    PubMed Central

    Corner, Evelyn J; Handy, Jonathan M; Brett, Stephen J

    2016-01-01

    Objective To evaluate the efficacy of eLearning in the widespread standardised teaching, distribution and implementation of the Chelsea Critical Care Physical Assessment (CPAx) tool—a validated tool to assess physical function in critically ill patients. Design Prospective educational study. An eLearning module was developed through a conceptual framework, using the four-stage technique for skills teaching to teach clinicians how to use the CPAx. Example and test video case studies of CPAx assessments were embedded within the module. The CPAx scores for the test case studies and demographic data were recorded in a secure area of the website. Data were analysed for inter-rater reliability using intraclass correlation coefficients (ICCs) to see if an eLearning educational package facilitated consistent use of the tool. A utility and content validity questionnaire was distributed after 1 year to eLearning module registrants (n=971). This was to evaluate uptake of the CPAx in clinical practice and content validity of the CPAx from the perspective of clinical users. Setting The module was distributed for use via professional forums (n=2) and direct contacts (n=95). Participants Critical care clinicians. Primary outcome measure ICC of the test case studies. Results Between July and October 2014, 421 candidates from 15 countries registered for the eLearning module. The ICC for case one was 0.996 (95% CI 0.990 to 0.999; n=207). The ICC for case two was 0.988 (0.996 to 1.000; n=184). The CPAx has a strong total scale content validity index (s-CVI) of 0.94 and is well used. Conclusions eLearning is a useful and reliable way of teaching psychomotor skills, such as the CPAx. The CPAx is a well-used measure with high content validity rated by clinicians. PMID:27067895
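The abstract reports a scale content validity index (s-CVI) of 0.94. A common way to compute it (a sketch of the widely used approach, not necessarily the authors' exact procedure): each expert rates each item's relevance on a 4-point scale, an item's CVI is the proportion of experts rating it 3 or 4, and the s-CVI is the average across items.

```python
# Sketch of a content validity index calculation. The rating data below are
# invented for illustration; the CPAx study's actual ratings are not shown.

def item_cvi(ratings: list[int]) -> float:
    """Proportion of experts rating the item relevant (3 or 4 on a 1-4 scale)."""
    return sum(1 for r in ratings if r >= 3) / len(ratings)

def scale_cvi(items: list[list[int]]) -> float:
    """Average item-level CVI across all items on the scale."""
    return sum(item_cvi(r) for r in items) / len(items)

# Three items, each rated by four experts:
ratings = [[4, 4, 3, 4], [3, 4, 2, 4], [4, 3, 4, 4]]
print(round(scale_cvi(ratings), 2))  # 0.92
```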

  10. Assessment of critical closing pressure in the cerebral circulation as a measure of cerebrovascular tone.

    PubMed

    Richards, H K; Czosnyka, M; Pickard, J D

    1999-01-01

    Critical closing pressure (CCP) calculated from the blood flow velocity (FV) and arterial blood pressure (ABP) waveforms has previously been reported to be useful in assessing the dynamics of the cerebral circulation. We investigated the relationship between CCP, intracranial pressure (ICP) and cerebrovascular tone in a model of intracranial hypertension in 22 anaesthetised New Zealand White rabbits during manipulations of arterial CO2 and ABP and during vasodilatation caused by hypoxia. Recordings were made of FV in the basilar artery, ABP and ICP during subarachnoid infusion of saline. During infusion, ICP and CCP were significantly correlated (R=0.68; p<0.001), but the magnitudes of the increases in ICP and CCP during infusion were not correlated with each other. Linear regression between the difference CCP-ICP (representing a factor due to vasogenic tone) and cerebral perfusion pressure (CPP=ABP-ICP) was highly significant (R=-0.87; p<0.01). Generally, CCP decreased significantly (p<0.05) with hypercarbia, arterial hypotension and post-hypoxia, and the difference CCP-ICP decreased consistently after each vasodilatatory manoeuvre studied. Our data confirmed the linear relationship between CCP and ICP, and between the difference CCP-ICP and cerebrovascular tone. However, because the magnitude of the increase in ICP was not correlated with the magnitude of the change in CCP, CCP cannot be used to detect increases in ICP quantitatively. PMID:10592124

  11. Critical moments in preschool obesity: the call for nurses and communities to assess and intervene.

    PubMed

    Water, Tineke

    2011-12-01

    Thirty years ago obesity was rarely seen in children, but it is now described as a worldwide pandemic. Previous research has focused on school-age children; however, researchers have now identified critical moments of development during uterine life and early infancy when negative factors or insults can cause permanent changes in the structure and function of tissues and lead to epigenetic changes. Obesity in preschool children can cause premature and long-term chronic health problems; it has been associated with academic and social difficulties in kindergarten children, difficulty with social relationships, increased feelings of sadness, loneliness and anxiety, and negative self-image in children as young as 5 years of age. Identifying children under the age of five with obesity and its associated risks is important, yet fewer than half of health professionals intervene in cases of preschool obesity. This paper explores the concerns around antenatal and preschool obesity and the challenges for nurses and midwives in assessing and providing appropriate interventions for children and families in community settings.

  12. A critical assessment of the ecological assumptions underpinning compensatory mitigation of salmon-derived nutrients

    USGS Publications Warehouse

    Collins, Scott F.; Marcarelli, Amy M.; Baxter, Colden V.; Wipfli, Mark S.

    2015-01-01

    We critically evaluate some of the key ecological assumptions underpinning the use of nutrient replacement as a means of recovering salmon populations and a range of other organisms thought to be linked to productive salmon runs. These assumptions include: (1) nutrient mitigation mimics the ecological roles of salmon, (2) mitigation is needed to replace salmon-derived nutrients and stimulate primary and invertebrate production in streams, and (3) food resources in rearing habitats limit populations of salmon and resident fishes. First, we call into question assumption one because an array of evidence points to the multi-faceted role played by spawning salmon, including disturbance via redd-building, nutrient recycling by live fish, and consumption by terrestrial consumers. Second, we show that assumption two may require qualification based upon a more complete understanding of nutrient cycling and productivity in streams. Third, we evaluate the empirical evidence supporting food limitation of fish populations and conclude it has been only weakly tested. On the basis of this assessment, we urge caution in the application of nutrient mitigation as a management tool. Although applications of nutrients and other materials intended to mitigate for lost or diminished runs of Pacific salmon may trigger ecological responses within treated ecosystems, contributions of these activities toward actual mitigation may be limited.

  13. A Critical Assessment of the Effects of Bt Transgenic Plants on Parasitoids

    PubMed Central

    Chen, Mao; Zhao, Jian-Zhou; Collins, Hilda L.; Earle, Elizabeth D.; Cao, Jun; Shelton, Anthony M.

    2008-01-01

    The ecological safety of transgenic insecticidal plants expressing crystal proteins (Cry toxins) from the bacterium Bacillus thuringiensis (Bt) continues to be debated. Much of the debate has focused on nontarget organisms, especially predators and parasitoids that help control populations of pest insects in many crops. Although many studies have been conducted on predators, few have examined parasitoids, and some of those reported negative impacts. None of the previous reports was able to clearly characterize the cause of the negative impact. In order to provide a critical assessment, we used a novel paradigm consisting of a strain of the insect pest Plutella xylostella (herbivore) resistant to Cry1C, which we allowed to feed on Bt plants and then become parasitized by Diadegma insulare, an important endoparasitoid of P. xylostella. Our results indicated that the parasitoid was exposed to a biologically active form of the Cry1C protein while in the host but was not harmed by such exposure. Parallel studies conducted with several commonly used insecticides indicated that they significantly reduced parasitism rates on strains of P. xylostella resistant to these insecticides. These results provide the first clear evidence of the lack of hazard to a parasitoid by a Bt plant, compared to traditional insecticides, and describe a test to rigorously evaluate the risks Bt plants pose to predators and parasitoids. PMID:18523682

  14. Decolonizing personality assessment and honoring indigenous voices: a critical examination of the MMPI-2.

    PubMed

    Hill, Jill S; Pace, Terry M; Robbins, Rockey R

    2010-01-01

    Utilizing a mixed methods approach located between constructivist-interpretivist and critical-ideological research paradigms (Ponterotto, 2005), the current study builds upon previous research (Pace et al., 2006) that investigated the cultural validity of the Minnesota Multiphasic Personality Inventory (MMPI)-2 in its use with American Indians. Thirty items from MMPI-2 scales F, 1, 6, 8, and 9 were identified via item analysis as reflecting significant differences in endorsement rates between an American Indian sample and the MMPI-2 normative group. Semistructured interviews focused on these 30 items were conducted with 13 American Indian participants from an Eastern Woodlands Nation in Oklahoma. Interviews were audio recorded, transcribed, and then coded for themes using a qualitative coding analysis. Nine themes emerged: core belief system, experiences of racism and discrimination, conflicting epistemologies, living in two worlds, community connectedness, responsibility and accountability to the community, traditional knowledge, stories as traditional knowledge, and language and historic loss. Results of the current study demonstrate how the MMPI-2 may pathologize Indigenous worldviews, knowledge, beliefs, and behaviors rather than accurately assess psychopathology. Implications for practice and future research are addressed.

  15. Rights and wrongs of the Hipparcos data. A critical quality assessment of the Hipparcos catalogue

    NASA Astrophysics Data System (ADS)

    van Leeuwen, F.

    2005-08-01

    A critical assessment of the quality of the Hipparcos data, partly supported by a completely new analysis of the raw data, is presented with the aim of clarifying reliability issues that have surfaced since the publication of the Hipparcos catalogue in 1997. A number of defects in the data are identified, such as scan-phase discontinuities and effects of external hits. These defects can be repaired when re-reducing the raw data. Instabilities in the great-circle reduction process are recognised and identified in a number of data sets. These resulted mainly from the difficult observing conditions imposed by the anomalous orbit of the satellite. The stability of the basic angle over the mission is confirmed, but the connectivity between the two fields of view has been less than optimal for some parts of the sky. Both are fundamental conditions for producing absolute parallaxes. Although there is clear room for improvement of the Hipparcos data, the catalogue as published remains generally reliable within the quoted accuracies. Some of the findings presented here are also relevant for the forthcoming Gaia mission.

  16. A critical assessment of UH-60 main rotor blade airfoil data

    NASA Technical Reports Server (NTRS)

    Totah, Joseph

    1993-01-01

    Many current comprehensive rotorcraft analyses employ lifting-line methods that require main rotor blade airfoil data, typically obtained from wind tunnel tests. In order to effectively evaluate these lifting-line methods, it is of the utmost importance to ensure that the airfoil section data are free of inaccuracies. A critical assessment of the SC1095 and SC1094R8 airfoil data used on the UH-60 main rotor blade was performed for that reason. Nine sources of wind tunnel data were examined, all of which contain SC1095 data and four of which also contain SC1094R8 data. Findings indicate that the most accurate data were generated in 1982 at the 11-Foot Wind Tunnel Facility at NASA Ames Research Center and in 1985 at the 6-inch by 22-inch transonic wind tunnel facility at Ohio State University. It has not been determined if data from these two sources are sufficiently accurate for their use in comprehensive rotorcraft analytical models of the UH-60. It is recommended that new airfoil tables be created for both airfoils using the existing data. Additional wind tunnel experimentation is also recommended to provide high quality data for correlation with these new airfoil tables.

  17. A Critical Assessment of the Ecological Assumptions Underpinning Compensatory Mitigation of Salmon-Derived Nutrients.

    PubMed

    Collins, Scott F; Marcarelli, Amy M; Baxter, Colden V; Wipfli, Mark S

    2015-09-01

    We critically evaluate some of the key ecological assumptions underpinning the use of nutrient replacement as a means of recovering salmon populations and a range of other organisms thought to be linked to productive salmon runs. These assumptions include: (1) nutrient mitigation mimics the ecological roles of salmon, (2) mitigation is needed to replace salmon-derived nutrients and stimulate primary and invertebrate production in streams, and (3) food resources in rearing habitats limit populations of salmon and resident fishes. First, we call into question assumption one because an array of evidence points to the multi-faceted role played by spawning salmon, including disturbance via redd-building, nutrient recycling by live fish, and consumption by terrestrial consumers. Second, we show that assumption two may require qualification based upon a more complete understanding of nutrient cycling and productivity in streams. Third, we evaluate the empirical evidence supporting food limitation of fish populations and conclude it has been only weakly tested. On the basis of this assessment, we urge caution in the application of nutrient mitigation as a management tool. Although applications of nutrients and other materials intended to mitigate for lost or diminished runs of Pacific salmon may trigger ecological responses within treated ecosystems, contributions of these activities toward actual mitigation may be limited.

  18. A Critical Assessment of the Ecological Assumptions Underpinning Compensatory Mitigation of Salmon-Derived Nutrients

    NASA Astrophysics Data System (ADS)

    Collins, Scott F.; Marcarelli, Amy M.; Baxter, Colden V.; Wipfli, Mark S.

    2015-09-01

    We critically evaluate some of the key ecological assumptions underpinning the use of nutrient replacement as a means of recovering salmon populations and a range of other organisms thought to be linked to productive salmon runs. These assumptions include: (1) nutrient mitigation mimics the ecological roles of salmon, (2) mitigation is needed to replace salmon-derived nutrients and stimulate primary and invertebrate production in streams, and (3) food resources in rearing habitats limit populations of salmon and resident fishes. First, we call into question assumption one because an array of evidence points to the multi-faceted role played by spawning salmon, including disturbance via redd-building, nutrient recycling by live fish, and consumption by terrestrial consumers. Second, we show that assumption two may require qualification based upon a more complete understanding of nutrient cycling and productivity in streams. Third, we evaluate the empirical evidence supporting food limitation of fish populations and conclude it has been only weakly tested. On the basis of this assessment, we urge caution in the application of nutrient mitigation as a management tool. Although applications of nutrients and other materials intended to mitigate for lost or diminished runs of Pacific salmon may trigger ecological responses within treated ecosystems, contributions of these activities toward actual mitigation may be limited.

  19. Supplemental studies for cardiovascular risk assessment in safety pharmacology: a critical overview.

    PubMed

    Picard, Sandra; Goineau, Sonia; Guillaume, Philippe; Henry, Joël; Hanouz, Jean-Luc; Rouet, René

    2011-12-01

    Safety Pharmacology studies for cardiovascular risk assessment, as described in the ICH S7A and S7B guidelines, appear far from sufficient. The fact that almost all medicines withdrawn from the market because of life-threatening tachyarrhythmias (torsades de pointes) were shown to be hERG blockers and QT-interval prolongers led the authorities to focus mainly on these markers. However, other surrogate biomarkers, e.g., TRIaD (triangulation, reverse-use-dependence, instability and dispersion of ventricular repolarization), have been identified to more accurately estimate the drug-related torsadogenic risk. In addition, more attention should be paid to other arrhythmias, not related to long QT but nevertheless severe and/or not self-terminating, e.g., atrial or ventricular fibrillation, resulting from altered electrical conduction or heterogeneous shortening of cardiac repolarization. Moreover, despite numerous clinical cases of drug-induced pulmonary hypertension, orthostatic hypotension, or heart valvular failure, few safety investigations are conducted on drug interactions with cardiac and regional hemodynamics other than changes in aortic blood pressure evaluated in conscious large animals during the mandatory core-battery studies. This critical review aims to discuss the usefulness, relevance, advantages, and limitations of some preclinical in vivo, in vitro, and in silico models with high predictive value that are currently used in supplemental safety studies.

  20. Challenges of assessing critical thinking and clinical judgment in nurse practitioner students.

    PubMed

    Gorton, Karen L; Hayes, Janice

    2014-03-01

    The purpose of this study was to determine whether there was a relationship between critical thinking skills and clinical judgment in nurse practitioner students. The study used a convenience, nonprobability sampling technique, engaging participants from across the United States. Correlational analysis demonstrated no statistically significant relationship between critical thinking skills and examination-style questions, between critical thinking skills and scores on the evaluation and reevaluation of consequences subscale of the Clinical Decision Making in Nursing Scale, or between critical thinking skills and the preceptor evaluation tool. In short, the study found no statistically significant relationships between critical thinking skills and clinical judgment. Further research in these areas could give educators and practitioners insight into how critical thinking is, and could be, measured; into the clinical decision-making skills of nurse practitioner students; and into the development and measurement of critical thinking skills in advanced practice educational programs.

  1. Using Interactive Simulations in Assessment: The Use of Computer-Based Interactive Simulations in the Assessment of Statistical Concepts

    ERIC Educational Resources Information Center

    Neumann, David L.

    2010-01-01

    Interactive computer-based simulations have been applied in several contexts to teach statistical concepts in university level courses. In this report, the use of interactive simulations as part of summative assessment in a statistics course is described. Students accessed the simulations via the web and completed questions relating to the…

  2. Computer-based assessment of student-constructed responses.

    PubMed

    Magliano, Joseph P; Graesser, Arthur C

    2012-09-01

    Student-constructed responses, such as essays, short-answer questions, and think-aloud protocols, provide a valuable opportunity to gauge student learning outcomes and comprehension strategies. However, given the challenges of grading student-constructed responses, instructors may be hesitant to use them. There have been major advances in the application of natural language processing to student-constructed responses. This literature review focuses on two dimensions that need to be considered when developing new systems. The first is the type of response provided by the student: meaning-making responses (e.g., think-aloud protocols, tutorial dialogue) and products of comprehension (e.g., essays, open-ended questions). The second concerns the type of natural language processing system used and how it is applied to analyze the student responses. We argue that the appropriateness of the assessment protocols is, in part, constrained by the type of response, and that researchers should use hybrid systems that rely on multiple, convergent natural language algorithms. PMID:22581494

  3. Bayesian methods for assessing system reliability: models and computation.

    SciTech Connect

    Graves, T. L.; Hamada, Michael

    2004-01-01

    There are many challenges in assessing the reliability of a system today. These challenges arise because a system may be aging and full system tests may be too expensive or can no longer be performed. Without full system testing, one must integrate (1) all science and engineering knowledge, models and simulations, (2) information and data at various levels of the system, e.g., subsystems and components, and (3) information and data from similar systems, subsystems and components. The analyst must work with various data types, account for how the data are collected, account for measurement bias and uncertainty, deal with model and simulation uncertainty, and incorporate expert knowledge. Bayesian hierarchical modeling provides a rigorous way to combine information from multiple sources and of different types. However, an obstacle to applying Bayesian methods is the need to develop new software to analyze novel statistical models. We discuss a new statistical modeling environment, YADAS, that facilitates the development of Bayesian statistical analyses. It includes classes that help analysts specify new models, as well as classes that support the creation of new analysis algorithms. We illustrate these concepts using several examples.
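    To make the component-to-system roll-up described in this abstract concrete, here is a minimal sketch (not YADAS, and not the paper's actual models) that combines component-level pass/fail data via conjugate Beta-Binomial posteriors and Monte Carlo sampling to estimate the reliability of a series system. The data values, priors, and function name are all hypothetical:

    ```python
    import random

    # Hypothetical pass/fail test data: (successes, failures) per component.
    component_data = [(50, 0), (49, 1), (198, 2)]

    def series_reliability_posterior(data, draws=20000, seed=0):
        """Posterior mean of series-system reliability under Beta(1,1) priors.

        Each component reliability p_i gets a conjugate Beta(1+s, 1+f)
        posterior; a series system works only if every component works,
        so each Monte Carlo draw of system reliability is the product
        of the sampled p_i values.
        """
        rng = random.Random(seed)
        total = 0.0
        for _ in range(draws):
            p_system = 1.0
            for s, f in data:
                p_system *= rng.betavariate(1 + s, 1 + f)
            total += p_system
        return total / draws

    est = series_reliability_posterior(component_data)
    print(round(est, 3))
    ```

    A hierarchical treatment, as in the paper, would additionally share strength across similar components by giving the Beta parameters their own priors; the sketch above shows only the simplest roll-up.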

  4. Roughness Based Crossflow Transition Control: A Computational Assessment

    NASA Technical Reports Server (NTRS)

    Li, Fei; Choudhari, Meelan M.; Chang, Chau-Lyan; Streett, Craig L.; Carpenter, Mark H.

    2009-01-01

    A combination of parabolized stability equations and secondary instability theory has been applied to a low-speed swept airfoil model with a chord Reynolds number of 7.15 million, with the goals of (i) evaluating this methodology in the context of transition prediction for a known configuration for which roughness-based crossflow transition control has been demonstrated under flight conditions and (ii) analyzing the mechanism of transition delay via the introduction of discrete roughness elements (DRE). Roughness-based transition control involves controlled seeding of suitable, subdominant crossflow modes so as to weaken the growth of the naturally occurring, linearly more unstable crossflow modes. Therefore, a synthesis of receptivity, linear and nonlinear growth of stationary crossflow disturbances, and the ensuing development of high-frequency secondary instabilities is desirable to understand the experimentally observed transition behavior. With further validation, such higher-fidelity prediction methodology could be used to assess the potential for crossflow transition control at even higher Reynolds numbers, where experimental data are currently unavailable.

  5. Assessment of computer-related health problems among post-graduate nursing students.

    PubMed

    Khan, Shaheen Akhtar; Sharma, Veena

    2013-01-01

    The study was conducted to assess computer-related health problems among post-graduate nursing students and to develop a Self Instructional Module for the prevention of computer-related health problems at a selected university situated in Delhi. A descriptive survey with a correlational design was adopted. A total of 97 participants were selected from different faculties of Jamia Hamdard by multistage sampling with a systematic random sampling technique. Among the post-graduate students, the majority had average compliance with computer-related ergonomics principles. As regards computer-related health problems, the majority of post-graduate students had moderate computer-related health problems. The Self Instructional Module developed for the prevention of computer-related health problems was found to be acceptable by the post-graduate students.

  6. Swimming Training Assessment: The Critical Velocity and the 400-m Test for Age-Group Swimmers.

    PubMed

    Zacca, Rodrigo; Fernandes, Ricardo Jorge P; Pyne, David B; Castro, Flávio Antônio de S

    2016-05-01

    To verify the metabolic responses of oxygen consumption (V̇O2), heart rate (HR), blood lactate concentration [La], and rating of perceived exertion (RPE) when swimming at an intensity corresponding to the critical velocity (CV) assessed by a 4-parameter model (CV4par), and to check the reliability of using only a single 400-m maximal front crawl bout (T400) for CV4par assessment in age-group swimmers. Ten age-group swimmers (14-16 years old) performed 50-, 100-, 200-, 400- (T400), 800-, and 1,500-m maximal front crawl bouts to calculate CV4par. V̇O2, HR, [La], and RPE were measured immediately after the bouts. Swimmers then performed 3 × 10-minute front crawl (45 seconds rest) at CV4par. V̇O2, HR, [La], and RPE were measured after 10 minutes of rest (Rest), after warm-up (Pre), after each 10-minute repetition, and at the end of the test (Post). CV4par was 1.33 ± 0.08 m·s⁻¹. V̇O2, HR, [La], and RPE were similar between the first 10-minute and Post time points in the 3 × 10-minute protocol. CV4par was equivalent to 92 ± 2% of the mean swimming speed of T400 (v400) for these swimmers. CV4par calculated from a single T400 (92%v400) showed excellent agreement (r = 0.30; 95% CI: -0.04 to 0.05 m·s⁻¹, p = 0.39), a low coefficient of variation (2%), and a root mean square error of 0.02 ± 0.01 m·s⁻¹ when plotted against CV4par assessed through the 4-parameter model. These results generated the equation CV4par = 0.92 × v400. A single T400 can be used reliably to estimate the CV4par typically derived from 6 efforts in age-group swimmers.
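    The reported relationship CV4par = 0.92 × v400 reduces critical-velocity estimation to simple arithmetic on a single timed 400-m swim. A sketch of that calculation (function name hypothetical, formula taken from the abstract):

    ```python
    def cv4par_from_t400(t400_seconds):
        """Estimate critical velocity (m/s) from one maximal 400-m swim,
        using the reported relationship CV4par = 0.92 * v400."""
        v400 = 400.0 / t400_seconds  # mean speed of the 400-m bout, m/s
        return 0.92 * v400

    # A 5:00 (300 s) 400-m swim gives v400 = 1.333 m/s and CV ~ 1.23 m/s
    print(round(cv4par_from_t400(300), 2))  # -> 1.23
    ```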

  7. Evaluating social outcomes of HIV/AIDS interventions: a critical assessment of contemporary indicator frameworks

    PubMed Central

    Mannell, Jenevieve; Cornish, Flora; Russell, Jill

    2014-01-01

    Introduction Contemporary HIV-related theory and policy emphasize the importance of addressing the social drivers of HIV risk and vulnerability for a long-term response. Consequently, increasing attention is being given to social and structural interventions, and to social outcomes of HIV interventions. Appropriate indicators for social outcomes are needed in order to institutionalize the commitment to addressing social outcomes. This paper critically assesses the current state of social indicators within international HIV/AIDS monitoring and evaluation frameworks. Methods We analyzed the indicator frameworks of six international organizations involved in efforts to improve and synchronize the monitoring and evaluation of the HIV/AIDS response. Our analysis classifies the 328 unique indicators according to what they measure and assesses the degree to which they offer comprehensive measurement across three dimensions: domains of the social context, levels of change and organizational capacity. Results and discussion The majority of indicators focus on individual-level (clinical and behavioural) interventions and outcomes, neglecting structural interventions, community interventions and social outcomes (e.g. stigma reduction; community capacity building; policy-maker sensitization). The main tool used to address social aspects of HIV/AIDS is the disaggregation of data by social group. This raises three main limitations. Indicator frameworks do not provide comprehensive coverage of the diverse social drivers of the epidemic, particularly neglecting criminalization, stigma, discrimination and gender norms. There is a dearth of indicators for evaluating the social impacts of HIV interventions. Indicators of organizational capacity focus on capacity to effectively deliver and manage clinical services, neglecting capacity to respond appropriately and sustainably to complex social contexts. Conclusions Current indicator frameworks cannot adequately assess the social

  8. Radiological dose assessment for bounding accident scenarios at the Critical Experiment Facility, TA-18, Los Alamos National Laboratory

    SciTech Connect

    1991-09-01

    A computer modeling code, CRIT8, was written to allow prediction of the radiological doses to workers and members of the public resulting from these postulated maximum-effect accidents. The code accounts for the relationships of the initial parent radionuclide inventory at the time of the accident to the growth of radioactive daughter products, and considers the atmospheric conditions at time of release. The code then calculates a dose at chosen receptor locations for the sum of radionuclides produced as a result of the accident. Both criticality and non-criticality accidents are examined.
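    The parent-to-daughter ingrowth that CRIT8 accounts for follows the standard Bateman solution for a two-member decay chain. The sketch below illustrates that piece only; CRIT8 itself is not publicly documented here, and the decay constants and inventory are hypothetical:

    ```python
    import math

    def daughter_atoms(n_parent0, lam_p, lam_d, t):
        """Bateman solution: daughter atoms at time t from a pure parent
        inventory n_parent0, with decay constants lam_p (parent) and
        lam_d (daughter); requires lam_p != lam_d."""
        return (n_parent0 * lam_p / (lam_d - lam_p)
                * (math.exp(-lam_p * t) - math.exp(-lam_d * t)))

    def daughter_activity(n_parent0, lam_p, lam_d, t):
        """Daughter activity (decays per unit time) at time t."""
        return lam_d * daughter_atoms(n_parent0, lam_p, lam_d, t)

    # Hypothetical constants: parent half-life ~69 s, daughter ~6.9 s.
    LAM_P, LAM_D, N0 = 0.01, 0.1, 1.0e6
    # No daughter exists at t = 0; ingrowth peaks, then the chain dies away.
    print(daughter_activity(N0, LAM_P, LAM_D, 0.0))  # -> 0.0
    ```

    A dose code would multiply each nuclide's activity by a dose-conversion factor and an atmospheric dispersion factor at the receptor location; those inputs are site- and nuclide-specific and are omitted here.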

  9. Administration and environment considerations in computer-based sports-concussion assessment.

    PubMed

    Rahman-Filipiak, Annalise A M; Woodard, John L

    2013-12-01

    Computer-based testing has become a vital tool for the assessment of sport-related concussion (SRC). An increasing number of papers have been published on this topic, focusing on subjects such as the purpose and validity of baseline testing, the performance of special populations on computer-based tests, the psychometric properties of different computerized neurocognitive tools, and considerations for valid and reliable administration of these tools. The current paper describes several considerations regarding computerized test design, input and output devices, and testing environment that should be described explicitly when administering computer-based cognitive testing, regardless of whether the assessment is used for clinical or research purposes. The paper also reviews the conclusions of recent literature (2007-2013) using computer-based testing for the assessment of SRC, with special attention to the methods used in these studies. We also present an appendix checklist for clinicians and researchers that may be helpful in ensuring proper attention to factors that could influence the reliability and validity of computer-based cognitive testing. We believe that explicit attention to these technological factors may lead to the development of standards for the development and implementation of computer-based tests. Such standards have the potential to enhance the accuracy and utility of computer-based tests in SRC.

  10. Application of a screening method in assessing occupational safety and health of computer workstations.

    PubMed

    Niskanen, Toivo; Lehtelä, Jouni; Länsikallio, Riina

    2014-01-01

    Employers and workers need concrete guidance for planning and implementing changes in the ergonomics of computer workstations. The Näppärä method is a screening tool for identifying problems requiring further assessment and corrective actions. The aim of this study was to assess the work of occupational safety and health (OSH) government inspectors who used Näppärä as part of their OSH enforcement inspections related to computer work (430 assessments). The modifications in workstation ergonomics involved mainly adjustments to the screen, mouse, keyboard, forearm supports, and chair. One output of the assessment is an index indicating the percentage of compliant items. The method can be considered an exposure assessment and ergonomics intervention, used as a benchmark for the level of ergonomics. Future research could examine the effectiveness of participatory ergonomics interventions using Näppärä.

  11. Experiments on small-size fast critical fuel assemblies at the AKSAMIT facility and their use for development of computational models

    NASA Astrophysics Data System (ADS)

    Glushkov, E. S.; Glushkov, A. E.; Gomin, E. A.; Daneliya, S. B.; Zimin, A. A.; Kalugin, M. A.; Kapitonova, A. V.; Kompaniets, G. V.; Moroz, N. P.; Nosov, V. I.; Petrushenko, R. P.; Smirnov, O. N.

    2013-12-01

    Small-size fast critical assemblies with highly enriched fuel at the AKSAMIT facility are described in detail. Computational models of the critical assemblies at room temperature are given. The calculated critical parameters are compared with the experimental data, and good agreement between the calculations and the experiments is shown. The physical models developed for the critical assemblies, as well as the experimental results, can be applied to verify various codes intended for calculating the neutronic characteristics of small-size fast nuclear reactors. For these experiments, the results computed using the codes of the MCU family demonstrate the high quality of the neutron data and of the physical models used.

  12. Environmental assessment for consolidation of certain materials and machines for nuclear criticality experiments and training

    SciTech Connect

    1996-05-21

    In support of its assigned missions and because of the importance of avoiding nuclear criticality accidents, DOE has adopted a policy to reduce identifiable nuclear criticality safety risks and to protect the public, workers, government property, and essential operations from the effects of a criticality accident. In support of this policy, the Los Alamos Critical Experiments Facility (LACEF) at the Los Alamos National Laboratory (LANL) Technical Area (TA) 18 provides a program of general-purpose critical experiments. This program, the only remaining one of its kind in the United States, seeks to maintain a sound basis of information for criticality control in those physical situations that DOE will encounter in handling and storing fissionable material in the future, and to ensure the presence of a community of individuals competent in practicing this control.

  13. Target Highlights in CASP9: Experimental Target Structures for the Critical Assessment of Techniques for Protein Structure Prediction

    PubMed Central

    Kryshtafovych, Andriy; Moult, John; Bartual, Sergio G.; Bazan, J. Fernando; Berman, Helen; Casteel, Darren E.; Christodoulou, Evangelos; Everett, John K.; Hausmann, Jens; Heidebrecht, Tatjana; Hills, Tanya; Hui, Raymond; Hunt, John F.; Jayaraman, Seetharaman; Joachimiak, Andrzej; Kennedy, Michael A.; Kim, Choel; Lingel, Andreas; Michalska, Karolina; Montelione, Gaetano T.; Otero, José M.; Perrakis, Anastassis; Pizarro, Juan C.; van Raaij, Mark J.; Ramelot, Theresa A.; Rousseau, Francois; Tong, Liang; Wernimont, Amy K.; Young, Jasmine; Schwede, Torsten

    2011-01-01

    One goal of the CASP Community Wide Experiment on the Critical Assessment of Techniques for Protein Structure Prediction is to identify the current state of the art in protein structure prediction and modeling. A fundamental principle of CASP is blind prediction on a set of relevant protein targets, i.e., the participating computational methods are tested on a common set of experimental target proteins for which the experimental structures are not known at the time of modeling. Therefore, the CASP experiment would not have been possible without broad support of the experimental protein structural biology community. In this manuscript, several experimental groups discuss the structures of the proteins that they provided as prediction targets for CASP9, highlighting structural and functional peculiarities of these structures: the long tail fibre protein gp37 from bacteriophage T4, the cyclic GMP-dependent protein kinase Iβ (PKGIβ) dimerization/docking domain, the ectodomain of the JTB (Jumping Translocation Breakpoint) transmembrane receptor, Autotaxin (ATX) in complex with an inhibitor, the DNA-Binding J-Binding Protein 1 (JBP1) domain essential for biosynthesis and maintenance of DNA base-J (β-D-glucosyl-hydroxymethyluracil) in Trypanosoma and Leishmania, a so-far uncharacterized 73-residue domain from Ruminococcus gnavus with a fold typical of PDZ-like domains, a domain from the Phycobilisome (PBS) core-membrane linker (LCM) phycobiliprotein ApcE from Synechocystis, the Heat shock protein 90 (Hsp90) activators PFC0360w and PFC0270w from Plasmodium falciparum, and 2-oxo-3-deoxygalactonate kinase from Klebsiella pneumoniae. PMID:22020785

  14. Applying Tandem Mass Spectral Libraries for Solving the Critical Assessment of Small Molecule Identification (CASMI) LC/MS Challenge 2012.

    PubMed

    Oberacher, Herbert

    2013-01-01

    The "Critical Assessment of Small Molecule Identification" (CASMI) contest aimed to test strategies for small molecule identification currently available in the experimental and computational mass spectrometry community. We applied tandem mass spectral library search to solve Category 2 of the CASMI Challenge 2012 (best identification for high resolution LC/MS data). More than 230,000 tandem mass spectra from four well-established libraries (MassBank, the collection of tandem mass spectra of the "NIST/NIH/EPA Mass Spectral Library 2012", METLIN, and the 'Wiley Registry of Tandem Mass Spectral Data, MSforID') were searched. The sample spectra acquired in positive ion mode were processed. Seven out of 12 challenges did not produce putative positive matches, simply because reference spectra were not available for the compounds searched. This suggests that the limited coverage of chemical space with high-quality reference spectra is, to some extent, still a problem encountered in tandem mass spectral library search. Solutions were submitted for five challenges. Three compounds were correctly identified (kanamycin A, benzyldiphenylphosphine oxide, and 1-isopropyl-5-methyl-1H-indole-2,3-dione). In the absence of any reference spectrum, a false positive identification was obtained for 1-aminoanthraquinone by matching the corresponding sample spectrum to the structurally related compounds N-phenylphthalimide and 2-aminoanthraquinone. Another false positive result was submitted for 1H-benz[g]indole; for the 1H-benz[g]indole-specific sample spectra provided, carbazole was listed as the best matching compound. In this case, the quality of the available 1H-benz[g]indole-specific reference spectra was found to hamper unequivocal identification.
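    The core operation behind a tandem mass spectral library search is scoring a query spectrum against each reference spectrum. The sketch below uses a simple dot-product (cosine) similarity with peak pairing inside an m/z tolerance; the tolerance value and the unweighted scoring are illustrative assumptions, not the scoring actually used by MassBank, the NIST library, METLIN, or MSforID.

    ```python
    import math

    def match_score(query, ref, tol=0.01):
        """Cosine-style similarity between two centroided tandem mass spectra.

        Each spectrum is a list of (m/z, intensity) peaks. For every query
        peak, the most intense reference peak within the m/z tolerance is
        paired with it; the paired intensity products are then normalized by
        the two spectra's intensity norms. Real search engines use weighted
        variants of this score.
        """
        shared = 0.0
        for mz_q, i_q in query:
            best = max((i_r for mz_r, i_r in ref if abs(mz_q - mz_r) <= tol),
                       default=0.0)
            shared += i_q * best
        norm_q = math.sqrt(sum(i * i for _, i in query))
        norm_r = math.sqrt(sum(i * i for _, i in ref))
        return shared / (norm_q * norm_r) if norm_q and norm_r else 0.0
    ```

    A spectrum matched against itself scores 1.0, and spectra with no peaks in common score 0.0; candidates are ranked by this score and the top match is reported as the putative identification.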

  15. Critical Assessment of Object Segmentation in Aerial Image Using Geo-Hausdorff Distance

    NASA Astrophysics Data System (ADS)

    Sun, H.; Ding, Y.; Huang, Y.; Wang, G.

    2016-06-01

    Aerial imagery records large-range earth objects with ever-improving spatial and radiometric resolution. It has become a powerful tool for earth observation, land-coverage survey, geographical census, etc., and helps delineate the boundaries of different kinds of objects on the earth both manually and automatically. In light of the geo-spatial correspondence between the pixel locations of an aerial image and the spatial coordinates of ground objects, there is an increasing need for super-pixel segmentation and high-accuracy positioning of objects in aerial images. Besides the commercial software packages eCognition and ENVI, many algorithms have been developed in the literature to segment objects in aerial images. But how to evaluate the segmentation results remains a challenge, especially in the context of geo-spatial correspondence. The Geo-Hausdorff Distance (GHD) is proposed to measure the geo-spatial distance between the results of various object segmentations, whether produced manually as ground truth or by automatic algorithms. Based on an early-breaking and random-sampling design, the GHD calculates the geographical Hausdorff distance with nearly linear complexity. Segmentation results of several state-of-the-art algorithms, including those of the commercial packages, are evaluated on a diverse set of aerial images. These images have different signal-to-noise ratios around the object boundaries and are hard to trace correctly even for human operators. The GHD value is analyzed to comprehensively measure the suitability of different object segmentation methods for aerial images of different spatial resolution. By critically assessing the strengths and limitations of the existing algorithms, the paper provides valuable insight and guidelines for extensive research in automating object detection and classification of aerial imagery in the nation-wide geographic census. It is also promising for the optimal design of operational specification of remote
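    The Hausdorff distance with the early-breaking inner loop mentioned above can be sketched as follows. This is a generic Euclidean version on point lists, not the authors' GHD implementation: the random-sampling step and the handling of geographic coordinates are omitted.

    ```python
    import math

    def directed_hausdorff(a, b):
        """Directed Hausdorff distance h(A, B) = max over points in A of the
        distance to the nearest point in B. Points are (x, y) tuples."""
        cmax = 0.0
        for ax, ay in a:
            cmin = float("inf")
            for bx, by in b:
                d = math.hypot(ax - bx, ay - by)
                if d < cmax:
                    # Early break: this point's nearest neighbour is already
                    # closer than the running maximum, so it cannot raise
                    # h(A, B) -- skip the rest of the inner loop.
                    cmin = d
                    break
                if d < cmin:
                    cmin = d
            if cmin > cmax:
                cmax = cmin
        return cmax

    def hausdorff(a, b):
        """Symmetric Hausdorff distance: max of the two directed distances."""
        return max(directed_hausdorff(a, b), directed_hausdorff(b, a))
    ```

    For segmentation evaluation, `a` and `b` would be the boundary points of a predicted region and the ground-truth region; shuffling the points before the loop (the random-sampling idea) makes the early break fire sooner on average, which is what gives the nearly linear observed complexity.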

  16. A Modified Sequential Organ Failure Assessment (MSOFA) Score for Critical Care Triage

    PubMed Central

    Grissom, Colin K.; Brown, Samuel M.; Kuttler, Kathryn G.; Boltax, Jonathan P.; Jones, Jason; Jephson, Al R.; Orme, James F.

    2013-01-01

    Objective The Sequential Organ Failure Assessment (SOFA) score has been recommended for triage during a mass influx of critically ill patients, but requires laboratory measurement of four parameters, which may be impractical with constrained resources. We hypothesized that a modified SOFA (MSOFA) score requiring only one laboratory measurement would predict patient outcome as well as the SOFA score. Methods After a retrospective derivation, in a prospective observational study in a 24-bed medical, surgical, and trauma intensive care unit, we determined serial SOFA and MSOFA scores on all patients admitted during calendar year 2008 and compared their ability to predict mortality and need for mechanical ventilation. Results 1,770 patients (56% male) with a 30-day mortality of 10.5% were included in the study. Day 1 SOFA and MSOFA scores performed equally well at predicting mortality, with areas under the receiver operating characteristic curve (AUC) of 0.83 (95% CI 0.81-0.85) and 0.84 (95% CI 0.82-0.85) respectively (p=0.33 for the comparison). Day 3 SOFA and MSOFA predicted mortality for the 828 patients remaining in the ICU with AUCs of 0.78 and 0.79 respectively. Day 5 scores performed less well at predicting mortality. Day 1 SOFA and MSOFA predicted need for mechanical ventilation on Day 3 with AUCs of 0.83 and 0.82 respectively. Mortality for the highest category of SOFA and MSOFA score (>11 points) was 53% and 58% respectively. Conclusions The MSOFA predicts mortality as well as the SOFA and is easier to implement in resource-constrained settings, but using either score as a triage tool would exclude many patients who would otherwise survive. PMID:21149228
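    The AUC values compared above reduce to the Mann-Whitney statistic: the probability that a randomly chosen non-survivor received a higher score than a randomly chosen survivor. A minimal sketch, on hypothetical data rather than the study's:

    ```python
    def auc(scores, labels):
        """Area under the ROC curve via the Mann-Whitney U statistic.

        `scores` are triage scores (e.g. SOFA or MSOFA points); `labels` are
        outcomes (1 = died, 0 = survived). Counts the fraction of
        positive/negative pairs in which the positive case scores higher,
        with ties counted as one half.
        """
        pos = [s for s, y in zip(scores, labels) if y == 1]
        neg = [s for s, y in zip(scores, labels) if y == 0]
        wins = sum((p > n) + 0.5 * (p == n) for p in pos for n in neg)
        return wins / (len(pos) * len(neg))
    ```

    An AUC of 0.5 means the score is no better than chance at separating outcomes, and 1.0 means perfect separation; the exhaustive pair loop is quadratic, so production code typically uses a rank-based formulation instead.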

  17. Health impact of "reduced yield" cigarettes: a critical assessment of the epidemiological evidence

    PubMed Central

    Thun, M.; Burns, D.

    2001-01-01

    Cigarettes with lower machine-measured "tar" and nicotine yields have been marketed as "safer" than high tar products over the last four decades, but there is conflicting evidence about the impact of these products on the disease burden caused by smoking. This paper critically examines the epidemiological evidence relevant to the health consequences of "reduced yield" cigarettes. Some epidemiological studies have found attenuated risk of lung cancer, but not other diseases, among people who smoke "reduced yield" cigarettes compared to smokers of unfiltered, high yield products. These studies probably overestimate the magnitude of any association with lung cancer by over-adjusting for the number of cigarettes smoked per day (one aspect of compensatory smoking), and by not fully considering other differences between smokers of "high yield" and "low yield" cigarettes. Selected cohort studies in the USA and UK show that lung cancer risk continued to increase among older smokers from the 1950s to the 1980s, despite the widespread adoption of lower yield cigarettes. The change to filter tip products did not prevent a progressive increase in lung cancer risk among male smokers who began smoking during and after the second world war compared to first world war era smokers. National trends in vital statistics data show declining lung cancer death rates in young adults, especially males, in many countries, but the extent to which this is attributable to "reduced yield" cigarettes remains unclear. No studies have adequately assessed whether health claims used to market "reduced yield" cigarettes delay cessation among smokers who might otherwise quit, or increase initiation among non-smokers. There is no convincing evidence that past changes in cigarette design have resulted in an important health benefit to either smokers or the whole population. Tobacco control policies should not allow changes in cigarette design to subvert or distract from interventions proven to reduce

  18. Assess/Mitigate Risk through the Use of Computer-Aided Software Engineering (CASE) Tools

    NASA Technical Reports Server (NTRS)

    Aguilar, Michael L.

    2013-01-01

    The NASA Engineering and Safety Center (NESC) was requested to perform an independent assessment of the mitigation of the Constellation Program (CxP) Risk 4421 through the use of computer-aided software engineering (CASE) tools. With the cancellation of the CxP, the assessment goals were modified to capture lessons learned and best practices in the use of CASE tools. The assessment goal was to prepare the next program for the use of these CASE tools. The outcome of the assessment is contained in this document.

  19. Computer Based Assessment of Cervical Vertebral Maturation Stages Using Digital Lateral Cephalograms

    PubMed Central

    Dzemidzic, Vildana; Sokic, Emir; Tiro, Alisa; Nakas, Enita

    2015-01-01

    Objective: This study aimed to investigate the reliability of a computer application for assessing the stages of cervical vertebral maturation in order to determine the stage of skeletal maturity. Material and methods: For this study, digital lateral cephalograms of 99 subjects (52 females and 47 males) were examined. The following selection criteria were used during sample composition: age between 9 and 16 years, absence of anomalies of the vertebrae, good general health, and no history of trauma to the cervical region. Subjects with lateral cephalograms of low quality were excluded from the study. For the purpose of this study, a computer application, Cephalometar HF V1, was developed. This application was used to mark the contours of the second, third, and fourth cervical vertebrae on the digital lateral cephalograms, which enabled the computer to determine the stage of cervical vertebral maturation. The assessment of the stages of cervical vertebral maturation was also carried out by an experienced orthodontist, according to the principles of the method proposed by Hassel and Farman. The degree of agreement between the computer application and the researcher was analyzed using the Cohen kappa statistic. Results: The results of this study showed agreement between the computer assessment and the researcher's assessment of the cervical vertebral maturation stages, with a Cohen kappa coefficient of 0.985. Conclusion: The computer application Cephalometar HF V1 proved to be a reliable method for assessing the stages of cervical vertebral maturation. This program could help orthodontists identify the stage of cervical vertebral maturation when planning orthodontic treatment for patients with skeletal disharmonies. PMID:26862247
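    The Cohen kappa statistic used in the study above measures chance-corrected agreement between two raters assigning categorical labels (here, maturation stages) to the same cases. A minimal generic sketch, not the study's software:

    ```python
    from collections import Counter

    def cohen_kappa(rater_a, rater_b):
        """Cohen's kappa for two raters labelling the same cases.

        kappa = (p_observed - p_expected) / (1 - p_expected), where
        p_expected is the agreement expected by chance from each rater's
        marginal label frequencies. 1.0 = perfect agreement, 0.0 = chance.
        """
        assert len(rater_a) == len(rater_b)
        n = len(rater_a)
        observed = sum(a == b for a, b in zip(rater_a, rater_b)) / n
        freq_a = Counter(rater_a)
        freq_b = Counter(rater_b)
        expected = sum(freq_a[c] * freq_b.get(c, 0) for c in freq_a) / (n * n)
        if expected == 1:
            # Both raters used a single identical label throughout; kappa is
            # undefined (0/0) and conventionally reported as perfect agreement.
            return 1.0
        return (observed - expected) / (1 - expected)
    ```

    A kappa of 0.985, as reported above, indicates near-perfect agreement between the application and the human rater; values are conventionally read against benchmarks such as Landis and Koch's, where anything above 0.81 is "almost perfect".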

  20. Assessment of TRAC-PF1/MOD1 version 14.3 using separate effects critical flow and blowdown experiments

    SciTech Connect

    Spindler, B.; Pellissier, M.

    1990-01-01

    Independent assessment of the TRAC code was conducted at the Centre d'Études Nucléaires de Grenoble of the Commissariat à l'Énergie Atomique (France) within the framework of the ICAP. This report presents the results of the assessment of TRAC-PF1/MOD1 version 14.3 using steady-state critical flow tests (MOBY-DICK, SUPER-MOBY-DICK) and blowdown tests (CANON, SUPER-CANON, VERTICAL-CANON, MARVIKEN, OMEGA-TUBE, OMEGA-BUNDLE). This document, Volume 1, presents the text and tables from the assessment.