AN IMPROVEMENT TO THE MOUSE COMPUTERIZED UNCERTAINTY ANALYSIS SYSTEM
The original MOUSE (Modular Oriented Uncertainty System) was designed to deal with the problem of uncertainties in environmental engineering calculations, such as a set of engineering cost or risk analysis equations. It was especially intended for use by individuals with l...
Environmental engineering calculations involving uncertainties, either in the model itself or in the data, are far beyond the capabilities of conventional analysis for any but the simplest of models. There exist a number of general-purpose computer simulation languages, using Mon...
Reiner, Bruce I
2018-04-01
Uncertainty in text-based medical reports has long been recognized as problematic, frequently resulting in misunderstanding and miscommunication. One strategy for addressing the negative clinical ramifications of report uncertainty would be the creation of a standardized methodology for characterizing and quantifying uncertainty language, which could provide both the report author and reader with context related to the perceived level of diagnostic confidence and accuracy. A number of computerized strategies could be employed in the creation of this analysis including string search, natural language processing and understanding, histogram analysis, topic modeling, and machine learning. The derived uncertainty data offers the potential to objectively analyze report uncertainty in real time and correlate with outcomes analysis for the purpose of context and user-specific decision support at the point of care, where intervention would have the greatest clinical impact.
AUTOMOUSE: AN IMPROVEMENT TO THE MOUSE COMPUTERIZED UNCERTAINTY ANALYSIS SYSTEM OPERATIONAL MANUAL.
Under a mandate of national environmental laws, the agency strives to formulate and implement actions leading to a compatible balance between human activities and the ability of natural systems to support and nurture life. The Risk Reduction Engineering Laboratory is responsible ...
Gilshtein, Hayim; Mekel, Michal; Malkin, Leonid; Ben-Izhak, Ofer; Sabo, Edmond
2017-01-01
The cytologic diagnosis of indeterminate lesions of the thyroid involves much uncertainty, and the final diagnosis often requires operative resection. Computerized cytomorphometry and wavelets analysis were examined to evaluate their ability to better discriminate between benign and malignant lesions based on cytology slides. Cytologic reports from patients who underwent thyroid operation in a single, tertiary referral center were retrieved. Patients with Bethesda III and IV lesions were divided according to their final histopathology. Cytomorphometry and wavelet analysis were performed on the digitized images of the cytology slides. Cytology slides of 40 patients were analyzed. Seven patients had a histologic diagnosis of follicular malignancy, 13 had follicular adenomas, and 20 had a benign goiter. Computerized cytomorphometry with a combination of descriptors of nuclear size, shape, and texture was able to predict quantitatively adenoma versus malignancy within the indeterminate group with 95% accuracy. An automated wavelets analysis with a neural network algorithm reached an accuracy of 96% in correctly identifying malignant vs. benign lesions based on cytology. Computerized analysis of cytology slides seems to be more accurate in defining indeterminate thyroid lesions compared with conventional cytologic analysis, which is based on visual characteristics of the cytology as well as on the expertise of the cytologist. This pilot study needs to be validated with a greater number of samples. Provided validation is successful, we believe that such methods carry promise for better patient treatment. Copyright © 2016 Elsevier Inc. All rights reserved.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Chen, Zhenhua; Rose, Adam Z.; Prager, Fynnwin
The state of the art approach to economic consequence analysis (ECA) is computable general equilibrium (CGE) modeling. However, such models contain thousands of equations and cannot readily be incorporated into computerized systems used by policy analysts to yield estimates of economic impacts of various types of transportation system failures due to natural hazards, human related attacks or technological accidents. This paper presents a reduced-form approach to simplify the analytical content of CGE models to make them more transparent and enhance their utilization potential. The reduced-form CGE analysis is conducted by first running simulations one hundred times, varying key parameters, such as magnitude of the initial shock, duration, location, remediation, and resilience, according to a Latin Hypercube sampling procedure. Statistical analysis is then applied to the “synthetic data” results in the form of both ordinary least squares and quantile regression. The analysis yields linear equations that are incorporated into a computerized system and utilized along with Monte Carlo simulation methods for propagating uncertainties in economic consequences. Although our demonstration and discussion focuses on aviation system disruptions caused by terrorist attacks, the approach can be applied to a broad range of threat scenarios.
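A minimal sketch of the reduced-form workflow described in this abstract: Latin Hypercube sampling of key parameters, an ordinary-least-squares fit to the synthetic runs, and Monte Carlo propagation through the fitted equation. The stand-in CGE function, parameter ranges, and coefficients below are hypothetical, not the authors' model.

```python
import numpy as np
from scipy.stats import qmc

rng = np.random.default_rng(0)

def cge_black_box(shock, duration, resilience):
    """Stand-in for a full CGE run; returns a GDP loss (hypothetical form)."""
    return 100.0 * shock * duration * (1.0 - 0.6 * resilience) + rng.normal(0, 1)

# Step 1: Latin Hypercube sample of key parameters (100 runs, as in the paper).
sampler = qmc.LatinHypercube(d=3, seed=0)
X = qmc.scale(sampler.random(n=100), [0.1, 1.0, 0.0], [1.0, 12.0, 1.0])
y = np.array([cge_black_box(*row) for row in X])

# Step 2: fit a linear reduced form to the "synthetic data" by OLS.
A = np.column_stack([np.ones(len(X)), X])
beta, *_ = np.linalg.lstsq(A, y, rcond=None)

# Step 3: propagate input uncertainty through the reduced form by Monte Carlo.
draws = rng.uniform([0.1, 1.0, 0.0], [1.0, 12.0, 1.0], size=(10_000, 3))
losses = np.column_stack([np.ones(len(draws)), draws]) @ beta
print(f"mean loss {losses.mean():.1f}; 5th-95th percentile "
      f"[{np.percentile(losses, 5):.1f}, {np.percentile(losses, 95):.1f}]")
```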
Modeling uncertainty in computerized guidelines using fuzzy logic.
Jaulent, M. C.; Joyaux, C.; Colombet, I.; Gillois, P.; Degoulet, P.; Chatellier, G.
2001-01-01
Computerized Clinical Practice Guidelines (CPGs) improve quality of care by assisting physicians in their decision making. A number of problems emerge, however, since patients with close characteristics may be given contradictory recommendations. In this article, we propose to use fuzzy logic to model uncertainty due to the use of thresholds in CPGs. A fuzzy classification procedure has been developed that provides, for each message of the CPG, a strength of recommendation that rates the appropriateness of the recommendation for the patient under consideration. This work is done in the context of a CPG for the diagnosis and the management of hypertension, published in 1997 by the French agency ANAES. A population of 82 patients with mild to moderate hypertension was selected and the results of the classification system were compared to those given by a classical decision tree. Observed agreement is 86.6% and the variability of recommendations for patients with close characteristics is reduced. PMID:11825196
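A minimal sketch of the threshold-fuzzification idea, assuming a simplified hypertension rule ("treat if systolic >= 140 mmHg or diastolic >= 90 mmHg"); the membership shape and widths are illustrative assumptions, not those of the ANAES guideline study.

```python
def fuzzy_above(x, threshold, width):
    """Ramp membership: 0 well below the threshold, 1 well above it."""
    lo, hi = threshold - width, threshold + width
    if x <= lo:
        return 0.0
    if x >= hi:
        return 1.0
    return (x - lo) / (hi - lo)

def recommendation_strength(systolic, diastolic):
    # Fuzzy OR (max) over the two crisp guideline criteria.
    return max(fuzzy_above(systolic, 140, 5), fuzzy_above(diastolic, 90, 3))

# Patients with close characteristics now receive nearby strengths of
# recommendation instead of contradictory all-or-nothing messages.
print(recommendation_strength(139, 88))  # 0.4
print(recommendation_strength(141, 88))  # 0.6
```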
A Nondeterministic Resource Planning Model in Education
ERIC Educational Resources Information Center
Yoda, Koji
1977-01-01
Discusses a simple technique for stochastic resource planning that, when computerized, can assist educational managers in quantifying future uncertainty, thereby helping them make better decisions. The example used is a school lunch program. (Author/IRT)
Evaluating a Computerized Aid for Conducting a Cognitive Task Analysis
2000-01-01
in conducting a cognitive task analysis. The conduct of a cognitive task analysis is costly and labor intensive. As a result, a few computerized aids...evaluation of a computerized aid, specifically CAT-HCI (Cognitive Analysis Tool - Human Computer Interface), for the conduct of a detailed cognitive task analysis.
Making Materials Science and Engineering Data More Valuable Research Products (Postprint)
2014-09-12
uncertainties in the publishing marketplace. Also, there is a possibility that some for-profit publishers could try to restrict access to digital...Kaufman JG, Glatzman JS (eds) Computerization and networking of materials databases: Second Volume, ASTM STP 1106. American Society for Testing and
A Bayesian Tutoring System for Newtonian Mechanics: Can It Adapt to Different Learners?
ERIC Educational Resources Information Center
Pek, Peng-Kiat; Poh, Kim-Leng
2004-01-01
Newtonian mechanics is a core module in technology courses, but is difficult for many students to learn. Computerized tutoring can assist the teachers to provide individualized instruction. This article presents the application of decision theory to develop a tutoring system, "iTutor", to select optimal tutoring actions under uncertainty of…
DOE Office of Scientific and Technical Information (OSTI.GOV)
Bird, S.P.
1978-03-01
Biofouling and corrosion of heat exchanger surfaces in Ocean Thermal Energy Conversion (OTEC) systems may be controlling factors in the potential success of the OTEC concept. Very little is known about the nature and behavior of marine fouling films at sites potentially suitable for OTEC power plants. To facilitate the acquisition of needed data, a biofouling measurement device developed by Professor J. G. Fetkovich and his associates at Carnegie-Mellon University (CMU) has been mass produced for use by several organizations in experiments at a variety of ocean sites. The CMU device is designed to detect small changes in thermal resistance associated with the formation of marine microfouling films. An account of the work performed at the Pacific Northwest Laboratory (PNL) to develop a computerized uncertainty analysis for estimating experimental uncertainties of results obtained with the CMU biofouling measurement device and data reduction scheme is presented. The analysis program was written as a subroutine to the CMU data reduction code and provides an alternative to the CMU procedure for estimating experimental errors. The PNL code was used to analyze sample data sets taken at Keahole Point, Hawaii; St. Croix, the Virgin Islands; and at a site in the Gulf of Mexico. The uncertainties of the experimental results were found to vary considerably with the conditions under which the data were taken. For example, uncertainties of fouling factors (where fouling factor is defined as the thermal resistance of the biofouling layer) estimated from data taken on a submerged buoy at Keahole Point, Hawaii were found to be consistently within 0.00006 hr-ft²-°F/Btu, while corresponding values for data taken on a tugboat in the Gulf of Mexico ranged up to 0.0010 hr-ft²-°F/Btu. Reasons for these differences are discussed.
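The PNL code itself is not reproduced here, but the underlying idea — propagating instrument uncertainties into the fouling factor, taken as the change in thermal resistance R = ΔT/q between fouled and clean readings — can be sketched by Monte Carlo. The nominal values and 1-sigma errors below are assumed for illustration.

```python
import numpy as np

rng = np.random.default_rng(1)
N = 100_000

# Hypothetical measurements: temperature drop (°F) and heat flux (Btu/hr-ft²),
# each perturbed by an assumed 1-sigma instrument error.
dT_clean = rng.normal(2.00, 0.02, N)
dT_fouled = rng.normal(2.10, 0.02, N)
q = rng.normal(4000.0, 40.0, N)

fouling_factor = (dT_fouled - dT_clean) / q   # hr-ft²-°F/Btu
print(f"Rf = {fouling_factor.mean():.2e} +/- {fouling_factor.std():.1e} "
      f"hr-ft²-°F/Btu")
```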
A methodology for the evaluation of program cost and schedule risk for the SEASAT program
NASA Technical Reports Server (NTRS)
Abram, P.; Myers, D.
1976-01-01
An interactive computerized project management software package (RISKNET) is designed to analyze the effect of the risk involved in each specific activity on the results of the total SEASAT-A program. Both the time and the cost of each distinct activity can be modeled with an uncertainty interval so as to provide the project manager with not only the expected time and cost for the completion of the total program, but also with the expected range of costs corresponding to any desired level of significance. The nature of the SEASAT-A program is described. The capabilities of RISKNET and the implementation plan of a RISKNET analysis for the development of SEASAT-A are presented.
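A minimal sketch of the kind of analysis RISKNET performs: each activity's time and cost carry an uncertainty interval, and simulation yields the expected totals and the range at a chosen significance level. The triangular distributions and the three-activity serial network are illustrative assumptions, not the SEASAT-A network.

```python
import numpy as np

rng = np.random.default_rng(2)

# (min, most likely, max) duration in months and cost in $M for each activity.
activities = [((3, 4, 7), (1.0, 1.5, 2.5)),
              ((5, 6, 10), (2.0, 2.2, 3.0)),
              ((2, 3, 4), (0.5, 0.8, 1.2))]

N = 20_000
total_time = np.zeros(N)
total_cost = np.zeros(N)
for (t_lo, t_ml, t_hi), (c_lo, c_ml, c_hi) in activities:
    total_time += rng.triangular(t_lo, t_ml, t_hi, N)
    total_cost += rng.triangular(c_lo, c_ml, c_hi, N)

print(f"expected completion {total_time.mean():.1f} months; 90% interval "
      f"[{np.percentile(total_time, 5):.1f}, {np.percentile(total_time, 95):.1f}]")
print(f"expected cost ${total_cost.mean():.2f}M; 95th percentile "
      f"${np.percentile(total_cost, 95):.2f}M")
```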
Computerized PET/CT image analysis in the evaluation of tumour response to therapy
Wang, J; Zhang, H H
2015-01-01
Current cancer therapy strategy is mostly population based; however, there are large differences in tumour response among patients. It is therefore important for treating physicians to know individual tumour response. In recent years, many studies proposed the use of computerized positron emission tomography/CT image analysis in the evaluation of tumour response. Results showed that computerized analysis overcame some major limitations of current qualitative and semiquantitative analysis and led to improved accuracy. In this review, we summarize these studies in four steps of the analysis: image registration, tumour segmentation, image feature extraction and response evaluation. Future work is proposed and challenges are described. PMID:25723599
Validation of a computerized algorithm to quantify fetal heart rate deceleration area.
Gyllencreutz, Erika; Lu, Ke; Lindecrantz, Kaj; Lindqvist, Pelle G; Nordstrom, Lennart; Holzmann, Malin; Abtahi, Farhad
2018-05-16
Reliability in visual cardiotocography interpretation is unsatisfactory, which has led to the development of computerized cardiotocography. Computerized analysis is well established for antenatal fetal surveillance, but has not yet performed sufficiently well during labor. We aimed to investigate the capacity of a new computerized algorithm, compared to visual assessment, in identifying intrapartum fetal heart rate baseline and decelerations. Three hundred and twelve intrapartum cardiotocography tracings with variable decelerations were analysed by the computerized algorithm and visually examined by two observers, blinded to each other and the computer analysis. The width, depth and area of each deceleration were measured. Four cases (>100 variable decelerations) were subject to in-depth detailed analysis. The outcome measures were bias in seconds (width), beats per minute (depth), and beats (area) between computer and observers, using Bland-Altman analysis. Interobserver reliability was determined by calculating intraclass correlation and Spearman rank analysis. The analysis (312 cases) showed excellent intraclass correlation (0.89-0.95) and very strong Spearman correlation (0.82-0.91). The detailed analysis of >100 decelerations in 4 cases revealed low bias between the computer and the two observers: width 1.4 and 1.4 seconds, depth 5.1 and 0.7 beats per minute, and area 0.1 and -1.7 beats. This was comparable to the bias between the two observers: 0.3 seconds (width), 4.4 beats per minute (depth), and 1.7 beats (area). The intraclass correlation was excellent (0.90-0.98). A novel computerized algorithm for intrapartum cardiotocography analysis is as accurate as gold standard visual assessment, with high correlation and low bias. This article is protected by copyright. All rights reserved.
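A minimal sketch of the Bland-Altman comparison used above, with fabricated deceleration widths in place of the study data.

```python
import numpy as np

# Hypothetical deceleration widths (seconds) from the algorithm and one observer.
computer = np.array([38.0, 52.5, 41.0, 60.5, 47.0])
observer = np.array([36.5, 54.0, 40.0, 59.0, 46.0])

diff = computer - observer
bias = diff.mean()                  # systematic difference between methods
spread = 1.96 * diff.std(ddof=1)    # half-width of the 95% limits of agreement
print(f"bias {bias:.1f} s; limits of agreement "
      f"[{bias - spread:.1f}, {bias + spread:.1f}] s")
```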
The Deference Due the Oracle: Computerized Text Analysis in a Basic Writing Class.
ERIC Educational Resources Information Center
Otte, George
1989-01-01
Describes how a computerized text analysis program can help students discover error patterns in their writing, and notes how students' responses to analyses can reduce errors and improve their writing. (MM)
ERIC Educational Resources Information Center
Hsu, Chien-Ju; Thompson, Cynthia K.
2018-01-01
Purpose: The purpose of this study is to compare the outcomes of the manually coded Northwestern Narrative Language Analysis (NNLA) system, which was developed for characterizing agrammatic production patterns, and the automated Computerized Language Analysis (CLAN) system, which has recently been adopted to analyze speech samples of individuals…
Tzeng, Huey-Ming; Hu, Hsou Mei; Yin, Chang-Yi
2011-12-01
Medicare no longer reimburses acute care hospitals for the costs of additional care required due to hospital-acquired injuries. Consequently, this study explored which computerized systems effectively inform practice and support better interventions to reduce fall risk. It examined the correlation between type of computerized system and hospital-acquired injurious fall rates at acute care hospitals in California, Florida, and New York. It used multiple publicly available data sets, with the hospital as the unit of analysis. Descriptive and Pearson correlation analyses were used. The analysis included 462 hospitals. Significant correlations could be categorized into two groups: (1) meaningful computerized systems that were associated with lower injurious fall rates: the decision support systems for drug allergy alerts, drug-drug interaction alerts, and drug-laboratory interaction alerts; and (2) computerized systems that were associated with higher injurious fall rates: the decision support system for drug-drug interaction alerts and the computerized provider order entry system for radiology tests. Future research may include additional states, multiple years of data, and patient-level data to validate this study's findings. This effort may further inform policy makers and the public about effective clinical computerized systems provided to clinicians to improve their practice decisions and care outcomes.
Disclosure of sensitive behaviors across self-administered survey modes: a meta-analysis.
Gnambs, Timo; Kaspar, Kai
2015-12-01
In surveys, individuals tend to misreport behaviors that are in contrast to prevalent social norms or regulations. Several design features of the survey procedure have been suggested to counteract this problem; particularly, computerized surveys are supposed to elicit more truthful responding. This assumption was tested in a meta-analysis of survey experiments reporting 460 effect sizes (total N = 125,672). Self-reported prevalence rates of several sensitive behaviors for which motivated misreporting has been frequently observed were compared across self-administered paper-and-pencil versus computerized surveys. The results revealed that computerized surveys led to significantly more reporting of socially undesirable behaviors than comparable surveys administered on paper. This effect was strongest for highly sensitive behaviors and surveys administered individually to respondents. Moderator analyses did not identify interviewer effects or benefits of audio-enhanced computer surveys. The meta-analysis highlighted the advantages of computerized survey modes for the assessment of sensitive topics.
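The pooling step of such a meta-analysis can be sketched with inverse-variance weights under a DerSimonian-Laird random-effects model; the five effect sizes below are fabricated, not the study's 460.

```python
import numpy as np

d = np.array([0.30, 0.12, 0.45, 0.05, 0.25])   # per-study effect sizes
v = np.array([0.02, 0.05, 0.04, 0.01, 0.03])   # per-study sampling variances

w = 1.0 / v
Q = float(w @ (d - (w @ d) / w.sum()) ** 2)    # heterogeneity statistic
tau2 = max(0.0, (Q - (len(d) - 1)) / (w.sum() - (w**2).sum() / w.sum()))

w_re = 1.0 / (v + tau2)                        # random-effects weights
mean = (w_re @ d) / w_re.sum()
se = 1.0 / np.sqrt(w_re.sum())
print(f"pooled d = {mean:.2f} "
      f"(95% CI {mean - 1.96 * se:.2f} to {mean + 1.96 * se:.2f})")
```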
ERIC Educational Resources Information Center
Stansfield, Charles W., Ed.
This collection of essays on measurement theory and language testing includes: "Computerized Adaptive Testing: Implications for Language Test Developers" (Peter Tung); "The Promise and Threat of Computerized Adaptive Assessment of Reading Comprehension" (Michael Canale); "Computerized Rasch Analysis of Item Bias in ESL…
D'Orso, M I; Centemeri, R; Oggionni, P; Latocca, R; Crippa, M; Vercellino, R; Riva, M; Cesana, G
2011-01-01
Computerized analysis of upper limb movement is a valid support in the definition of residual functional capability and of specific work suitability in complex cases. This evaluation methodology can correctly and objectively define the three-dimensional ranges of motion of each patient's upper limb. This can be particularly useful for workers returning to work after a work-related or non-work-related accident, or for handicapped workers at the beginning of a new work activity. We report a research study carried out using computerized analysis of motion of the upper limbs in 20 engineering workers.
The effect of sleep deprivation on vocal expression of emotion in adolescents and adults.
McGlinchey, Eleanor L; Talbot, Lisa S; Chang, Keng-Hao; Kaplan, Katherine A; Dahl, Ronald E; Harvey, Allison G
2011-09-01
Investigate the impact of sleep deprivation on vocal expression of emotion. Within-group repeated measures analysis involving sleep deprivation and rested conditions. Experimental laboratory setting. Fifty-five healthy participants (24 females), including 38 adolescents aged 11-15 y and 17 adults aged 30-60 y. A multimethod approach was used to examine vocal expression of emotion in interviews conducted at 22:30 and 06:30. On that night, participants slept a maximum of 2 h. Interviews were analyzed for vocal expression of emotion via computerized text analysis, human rater judgments, and computerized acoustic properties. Computerized text analysis and human rater judgments indicated decreases in positive emotion in all participants at 06:30 relative to 22:30, and adolescents displayed a significantly greater decrease in positive emotion via computerized text analysis relative to adults. Increases in negative emotion were observed among all participants using human rater judgments. Results for the computerized acoustic properties indicated decreases in pitch, bark energy (intensity) in certain high frequency bands, and vocal sharpness (reduction in high frequency bands > 1000 Hz). These findings support the importance of sleep for healthy emotional functioning in adults, and further suggest that adolescents are differentially vulnerable to the emotional consequences of sleep deprivation.
Leung, Gabriel M.; Yu, Philip L. H.; Wong, Irene O. L.; Johnston, Janice M.; Tin, Keith Y. K.
2003-01-01
Objective: Given the slow adoption of medical informatics in Hong Kong and Asia, we sought to understand the contributory barriers and potential incentives associated with information technology implementation. Design and Measurements: A representative sample of 949 doctors (response rate = 77.0%) was asked through a postal survey to rank a list of nine barriers associated with clinical computerization according to self-perceived importance. They ranked seven incentives or catalysts that may influence computerization. We generated mean rank scores and used multidimensional preference analysis to explore key explanatory dimensions of these variables. A hierarchical cluster analysis was performed to identify homogeneous subgroups of respondents. We further determined the relationships between the sets of barriers and incentives/catalysts collectively using canonical correlation. Results: Time costs, lack of technical support and large capital investments were the biggest barriers to computerization, whereas improved office efficiency and better-quality care were ranked highest as potential incentives to computerize. Cost vs. noncost, physician-related vs. patient-related, and monetary vs. nonmonetary factors were the key dimensions explaining the barrier variables. Similarly, within-practice vs. external and “push” vs. “pull” factors accounted for the incentive variables. Four clusters were identified for barriers and three for incentives/catalysts. Canonical correlation revealed that respondents who were concerned with the costs of computerization also perceived financial incentives and government regulation to be important incentives/catalysts toward computerization. Those who found the potential interference with communication important also believed the promise of improved care from computerization to be a significant incentive. Conclusion: This study provided evidence regarding common barriers associated with clinical computerization. Our findings also identified possible incentive strategies that may be employed to accelerate uptake of computer systems. PMID:12595409
Gravitational starlight deflection measurements during the 21 August 2017 total solar eclipse
NASA Astrophysics Data System (ADS)
Bruns, Donald G.
2018-04-01
Precise star positions near the Sun were measured during the 21 August 2017 total solar eclipse in order to measure their gravitational deflections. The equipment, procedures, and analysis are described in detail. A portable refractor, a CCD camera, and a computerized mount were set up in Wyoming. Detailed calibrations were necessary to improve accuracy and precision. Nighttime measurements taken just before the eclipse provided cubic optical distortion corrections. Calibrations based on star field images 7.4° on both sides of the Sun taken during totality gave linear and quadratic plate constants. A total of 45 images of the sky surrounding the Sun were acquired during the middle part of totality, with an integrated exposure of 22 s. The deflection analysis depended on accurate star positions from the USNO’s UCAC5 star catalog. The final result was a deflection coefficient L = 1.7512 arcsec, in perfect agreement with the theoretical value, with an uncertainty of only 3%.
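The final fit can be sketched as a one-parameter least-squares problem: general relativity predicts a radial deflection of 1.751 arcsec at the solar limb falling off as 1/r, and the limb coefficient L is fit to the measured deflections. The star distances and noise level below are simulated, not the paper's data.

```python
import numpy as np

rng = np.random.default_rng(4)
r = np.linspace(1.5, 7.4, 20)                        # star distances, solar radii
measured = 1.751 / r + rng.normal(0, 0.02, r.size)   # simulated deflections, arcsec

basis = 1.0 / r                                # model: deflection = L / r
L = (basis @ measured) / (basis @ basis)       # least-squares limb coefficient
sigma = 0.02 / np.sqrt(basis @ basis)          # 1-sigma, assuming known noise
print(f"L = {L:.4f} +/- {sigma:.4f} arcsec (GR predicts 1.751)")
```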
Federal Register 2010, 2011, 2012, 2013, 2014
2011-04-28
... the Compatibility Between the Donor's Cell Type and the Recipient's Serum or Plasma Type... Crossmatch' (Computerized Analysis of the Compatibility between the Donor's Cell Type and the Recipient's... donor's cell type and the recipient's serum or plasma type. The guidance describes practices that we...
A Procedure for the Computerized Analysis of Cleft Palate Speech Transcription
ERIC Educational Resources Information Center
Fitzsimons, David A.; Jones, David L.; Barton, Belinda; North, Kathryn N.
2012-01-01
The phonetic symbols used by speech-language pathologists to transcribe speech contain underlying hexadecimal values used by computers to correctly display and process transcription data. This study aimed to develop a procedure to utilise these values as the basis for subsequent computerized analysis of cleft palate speech. A computer keyboard…
NASA Technical Reports Server (NTRS)
1976-01-01
The primary objective of this study was to develop an integrated approach for the development, implementation, and utilization of all software that is required to efficiently and cost-effectively support advanced technology laboratory flight and ground operations. It was recognized that certain aspects of the operations would be mandatory computerized services; computerization of other aspects would be optional. Thus, the analyses encompassed not only alternate computer utilization and implementations but trade studies of the programmatic effects of non-computerized versus computerized approaches to the operations. A general overview of the study is presented.
Helene, L M; Rocha, M T
1998-10-01
The purpose of this study was to identify leprosy patients' psychosocial problems experienced after they were informed about their diagnosis. We focused attention upon concerns and behavioral changes related to their families, friends, jobs and to themselves. Data were obtained through interviews with two open-ended questions and were analysed with the aid of artificial intelligence techniques. These intelligence tools were used to discover the most frequent words, phrases and concepts existing in the interview reports. The results showed that after being informed about their diagnosis, the majority of the patients reported some concerns and behavioral changes related to their families, friends, jobs and to themselves. The main concerns of the population were related to the disease (transmission, the duration of treatment, the possibility of hospitalization, the uncertainty about the cure). These facts induced some of the patients to avoid telling people about the disease they have.
Technology in the Assessment of Learning Disability.
ERIC Educational Resources Information Center
Bigler, Erin D.; Lajiness-O'Neill, Renee; Howes, Nancy-Louise
1998-01-01
Reviews recent neuroradiologic and brain imaging techniques in the assessment of learning disability. Technologies reviewed include computerized tomography; magnetic resonance imaging; electrophysiological and metabolic imaging; computerized electroencephalographic studies of evoked potentials, event-related potentials, spectral analysis, and…
Application of a computerized environmental information system to master and sector planning
NASA Technical Reports Server (NTRS)
Stewart, J. C.
1978-01-01
A computerized composite mapping system developed as an aid in the land use decision making process is described. Emphasis is placed on consideration of the environment in urban planning. The presence of alluvium, shallow bedrock, surface water, and vegetation growth are among the environmental factors considered. An analysis of the Shady Grove Sector planning is presented as an example of the use of computerized composite mapping for long range planning.
Computerized symbolic manipulation in structural mechanics: Progress and potential
NASA Technical Reports Server (NTRS)
Noor, A. K.; Andersen, C. M.
1978-01-01
Status and recent applications of computerized symbolic manipulation to structural mechanics problems are summarized. The applications discussed include: (1) generation of characteristic arrays of finite elements; (2) evaluation of effective stiffness and mass coefficients of continuum models for repetitive lattice structures; and (3) application of the Rayleigh-Ritz technique to free vibration analysis of laminated composite elliptic plates. The major advantages of using computerized symbolic manipulation in each of these applications are outlined. A number of problem areas which limit the realization of the full potential of computerized symbolic manipulation in structural mechanics are examined and some of the means of alleviating them are discussed.
An insight into morphometric descriptors of cell shape that pertain to regenerative medicine.
Lobo, Joana; See, Eugene Yong-Shun; Biggs, Manus; Pandit, Abhay
2016-07-01
Cellular morphology has recently been identified as a powerful indicator of cellular function. The analysis of cell shape has evolved from rudimentary forms of microscopic visual inspection to more advanced methodologies that utilize high-resolution microscopy coupled with sophisticated computer hardware and software for data analysis. Despite this progress, there is still a lack of standardization in quantification of morphometric parameters. In addition, uncertainty remains as to which methodologies and parameters of cell morphology will yield meaningful data, which methods should be utilized to categorize cell shape, and the extent of reliability of measurements and the interpretation of the resulting analysis. A large range of descriptors has been employed to objectively assess the cellular morphology in two-dimensional and three-dimensional domains. Intuitively, simple and applicable morphometric descriptors are preferable, and standardized protocols for cell shape analysis can be achieved with the help of computerized tools. In this review, cellular morphology is discussed as a descriptor of cellular function, and the current morphometric parameters that are used quantitatively in two- and three-dimensional environments are described. Furthermore, the current problems associated with these morphometric measurements are addressed. Copyright © 2015 John Wiley & Sons, Ltd.
An analysis of interplanetary space radiation exposure for various solar cycles
NASA Technical Reports Server (NTRS)
Badhwar, G. D.; Cucinotta, F. A.; O'Neill, P. M.; Wilson, J. W. (Principal Investigator)
1994-01-01
The radiation dose received by crew members in interplanetary space is influenced by the stage of the solar cycle. Using the recently developed models of the galactic cosmic radiation (GCR) environment and the energy-dependent radiation transport code, we have calculated the dose at 0 and 5 cm water depth; using a computerized anatomical man (CAM) model, we have calculated the skin, eye and blood-forming organ (BFO) doses as a function of aluminum shielding for various solar minima and maxima between 1954 and 1989. These results show that the equivalent dose is within about 15% of the mean for the various solar minima (maxima). The maximum variation between solar minimum and maximum equivalent dose is about a factor of three. We have extended these calculations for the 1976-1977 solar minimum to five practical shielding geometries: Apollo Command Module, the least and most heavily shielded locations in the U.S. space shuttle mid-deck, center of the proposed Space Station Freedom cluster and sleeping compartment of the Skylab. These calculations, using the quality factor of ICRP 60, show that the average CAM BFO equivalent dose is 0.46 Sv/year. Based on an approach that takes fragmentation into account, we estimate a calculation uncertainty of 15% if the uncertainty in the quality factor is neglected.
Computerized analysis of sonograms for the detection of breast lesions
NASA Astrophysics Data System (ADS)
Drukker, Karen; Giger, Maryellen L.; Horsch, Karla; Vyborny, Carl J.
2002-05-01
With a renewed interest in using non-ionizing radiation for the screening of high risk women, there is a clear role for a computerized detection aid in ultrasound. Thus, we are developing a computerized detection method for the localization of lesions on breast ultrasound images. The computerized detection scheme utilizes two methods. Firstly, a radial gradient index analysis is used to distinguish potential lesions from normal parenchyma. Secondly, an image skewness analysis is performed to identify posterior acoustic shadowing. We analyzed 400 cases (757 images) consisting of complex cysts, solid benign lesions, and malignant lesions. The detection method yielded an overall sensitivity of 95% by image, and 99% by case at a false-positive rate of 0.94 per image. In 51% of all images, only the lesion itself was detected, while in 5% of the images only the shadowing was identified. For malignant lesions these numbers were 37% and 9%, respectively. In summary, we have developed a computer detection method for lesions on ultrasound images of the breast, which may ultimately aid in breast cancer screening.
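The two cues named above can be sketched as follows: a radial gradient index measures how strongly edge gradients in a candidate window point along the radial direction from its center (near 1 for a dark, roughly circular lesion), and intensity skewness flags the darkening of posterior acoustic shadowing. The synthetic image and window placement are illustrative only.

```python
import numpy as np
from scipy import ndimage
from scipy.stats import skew

rng = np.random.default_rng(3)

def radial_gradient_index(patch):
    """Fraction of total gradient strength aligned with the outward radial direction."""
    gy, gx = np.gradient(patch.astype(float))
    h, w = patch.shape
    yy, xx = np.mgrid[0:h, 0:w]
    ry, rx = yy - (h - 1) / 2, xx - (w - 1) / 2      # outward radial vectors
    norm = np.hypot(ry, rx) + 1e-9
    proj = (gx * rx + gy * ry) / norm                # gradient along radial dir
    return proj.sum() / (np.hypot(gx, gy).sum() + 1e-9)

# Synthetic sonogram patch: a dark circular "lesion" on a brighter background.
yy, xx = np.mgrid[0:64, 0:64]
img = 100.0 - 40.0 * (((yy - 32) ** 2 + (xx - 32) ** 2) < 15 ** 2)
img = ndimage.gaussian_filter(img + rng.normal(0, 2, img.shape), 1)

print(f"RGI of lesion window: {radial_gradient_index(img[16:48, 16:48]):.2f}")
print(f"intensity skewness (shadow cue): {skew(img.ravel()):.2f}")
```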
An Analysis of Minimum System Requirements to Support Computerized Adaptive Testing.
1986-09-01
adaptive test (CAT); adaptive testing. This paper discusses the minimum system requirements needed to develop a computerized adaptive test (CAT). It lists some of the benefits of adaptive testing, establishes a set of...
Economics of infection control surveillance technology: cost-effective or just cost?
Furuno, Jon P; Schweizer, Marin L; McGregor, Jessina C; Perencevich, Eli N
2008-04-01
Previous studies have suggested that informatics tools, such as automated alert and decision support systems, may increase the efficiency and quality of infection control surveillance. However, little is known about the cost-effectiveness of these tools. We focus on 2 types of economic analyses that have utility in assessing infection control interventions (cost-effectiveness analysis and business-case analysis) and review the available literature on the economics of computerized infection control surveillance systems. Previous studies on the effectiveness of computerized infection control surveillance have been limited to assessments of whether these tools increase the sensitivity and specificity of surveillance over traditional methods. Furthermore, we identified only 2 studies that assessed the costs associated with computerized infection control surveillance. Thus, it remains unknown whether computerized infection control surveillance systems are cost-effective and whether use of these systems improves patient outcomes. The existing data are insufficient to allow for a summary conclusion on the cost-effectiveness of infection control surveillance technology. All future studies of computerized infection control surveillance systems should aim to collect outcomes and economic data to inform decision making and assist hospitals with completing business-case analyses.
A new color vision test to differentiate congenital and acquired color vision defects.
Shin, Young Joo; Park, Kyu Hyung; Hwang, Jeong-Min; Wee, Won Ryang; Lee, Jin Hak
2007-07-01
To investigate the efficacy of a novel computer-controlled color test for the differentiation of congenital and acquired color vision deficiency. Observational cross-sectional study. Thirty-one patients with congenital color vision deficiency and 134 patients with acquired color vision deficiency with a Snellen visual acuity better than 20/30 underwent an ophthalmologic examination including the Ishihara color test, Hardy-Rand-Rittler test, Nagel anomaloscopy, and the Seohan computerized hue test between June, 2003, and January, 2004. To investigate the type of color vision defect, a graph of the Seohan computerized hue test was divided into 4 quadrants and error scores in each quadrant were summated. The ratio between the sums of error scores of quadrants I and III (Q1+Q3) and those of quadrants II and IV (Q2+Q4) was calculated. Error scores and ratio in quadrant analysis of the Seohan computerized hue test. The Seohan computerized hue test showed that the sum of Q2+Q4 was significantly higher than the sum of Q1+Q3 in congenital color vision deficiency (P<0.01, paired t test) and that the sum of Q2+Q4 was significantly lower than the sum of Q1+Q3 in acquired color vision deficiency (P<0.01, paired t test). In terms of discriminating congenital and acquired color vision deficiency, the ratio in quadrant analysis had 93.3% sensitivity and 98.5% specificity with a reference value of 1.5 by the Seohan computerized hue test (95% confidence interval). The quadrant analysis and ratio of (Q2+Q4)/(Q1+Q3) using the Seohan computerized hue test effectively differentiated congenital and acquired color vision deficiency.
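A minimal sketch of the quadrant analysis described above: error scores are summed per quadrant and the (Q2+Q4)/(Q1+Q3) ratio is compared against the reported 1.5 reference value. The example scores are fabricated.

```python
def classify_color_defect(q1, q2, q3, q4, cutoff=1.5):
    """Quadrant-ratio rule from the abstract; inputs are per-quadrant error sums."""
    ratio = (q2 + q4) / (q1 + q3)
    return ratio, "congenital" if ratio > cutoff else "acquired"

print(classify_color_defect(q1=10, q2=45, q3=12, q4=40))  # (~3.9, 'congenital')
print(classify_color_defect(q1=38, q2=14, q3=35, q4=11))  # (~0.3, 'acquired')
```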
[The clinical economic analysis of the methods of ischemic heart disease diagnostics].
Kalashnikov, V Iu; Mitriagina, S N; Syrkin, A L; Poltavskaia, M G; Sorokina, E G
2007-01-01
Clinical economic analysis was applied to assess different techniques for the diagnosis of ischemic heart disease - electrocardiographic monitoring, treadmill testing, stress echocardiography with dobutamine, single-photon computerized axial tomography with exercise load, and multi-spiral computerized axial tomography with coronary artery contrast staining - in patients with different initial probabilities of disease. In all groups, the treadmill test had the best cost-effectiveness value. Patients at low risk required 17.4 rubles to refine the probability of ischemic heart disease by 1%; in the medium- and high-risk groups this indicator was 9.4 and 24.7 rubles, respectively. It is concluded that, to refine the probability of ischemic heart disease after a treadmill test, single-photon computerized axial tomography with exercise load is appropriate for patients with high probability, and multi-spiral computerized axial tomography with coronary artery contrast staining for patients with low probability.
An Analysis of Community Health Nurses Documentation: The Best Approach to Computerization
Chalmers, M.
1988-01-01
The study explored and analyzed the actual patient-related documentation performed by a sample of community health nurses working in voluntary home health agencies. The outcome of the study was a system flow chart of that documentation, including: common components of the documentation, where in the existing systems they are recorded, when they are recorded by the nurse, and why they are used by the nurses and administrative personnel in the agencies. The flow chart is suitable for use as a prototype for the development of a computer software package for the computerization of the patient-related documentation by community health nurses. General systems theory and communication theory were used as a framework for this study. A thorough analysis of the documentation resulted in a complete and exhaustive explication of the documentation by community health nurses, as well as the identification of what parts of that documentation lend themselves most readily to computerization and what areas, if any, may not readily adapt to computerization.
ERIC Educational Resources Information Center
Jones, Tom; Di Salvo, Vince
A computerized content analysis of the "theory input" for a basic speech course was conducted. The questions to be answered were (1) What does the inexperienced basic speech student hold as a conceptual perspective of the "speech to inform" prior to his being subjected to a college speech class? and (2) How does that inexperienced student's…
ERIC Educational Resources Information Center
Meijer, Rob R.; van Krimpen-Stoop, Edith M. L. A.
In this study a cumulative-sum (CUSUM) procedure from the theory of Statistical Process Control was modified and applied in the context of person-fit analysis in a computerized adaptive testing (CAT) environment. Six person-fit statistics were proposed using the CUSUM procedure, and three of them could be used to investigate the CAT in online test…
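A minimal sketch of a CUSUM person-fit check of this kind under a Rasch model: per-item residuals (observed minus expected score) are accumulated, and a drift past a threshold flags misfit. The threshold and item stream are illustrative, not the six statistics proposed in the study.

```python
import math

def rasch_p(theta, b):
    """Probability of a correct response under the Rasch model."""
    return 1.0 / (1.0 + math.exp(-(theta - b)))

def cusum_flags(responses, difficulties, theta, h=2.0):
    c_plus = c_minus = 0.0
    for x, b in zip(responses, difficulties):
        resid = x - rasch_p(theta, b)          # observed minus expected score
        c_plus = max(0.0, c_plus + resid)      # run of unexpected successes
        c_minus = min(0.0, c_minus + resid)    # run of unexpected failures
        if c_plus > h or c_minus < -h:
            return True                        # aberrant response pattern
    return False

# A respondent who misses easy items, then answers hard ones correctly.
resp = [0, 0, 0, 0, 1, 1, 1, 1, 1, 1]
diff = [-2.0, -2.0, -1.5, -1.0, 1.0, 1.5, 1.5, 2.0, 2.0, 2.5]
print(cusum_flags(resp, diff, theta=0.0))      # True
```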
Oak Ridge Computerized Hierarchical Information System (ORCHIS) status report, July 1973
DOE Office of Scientific and Technical Information (OSTI.GOV)
Brooks, A.A.
1974-01-01
This report summarizes the concepts, software, and contents of the Oak Ridge Computerized Hierarchical Information System. This data analysis and text processing system was developed as an integrated, comprehensive information processing capability to meet the needs of an on-going multidisciplinary research and development organization. (auth)
Dance Technology. Current Applications and Future Trends.
ERIC Educational Resources Information Center
Gray, Judith A., Ed.
Original research is reported on image digitizing, robot choreography, movement analysis, databases for dance, computerized dance notation, and computerized lightboards for dance performance. Articles in this publication are as follows: (1) "The Evolution of Dance Technology" (Judith A. Gray); (2) "Toward a Language for Human Movement" (Thomas W.…
Computer Review Can Cut HVAC Energy Use
ERIC Educational Resources Information Center
McClure, Charles J. R.
1974-01-01
A computerized review of construction bidding documents, usually done by a consulting engineer, can reveal how much money it will cost to operate various alternative types of HVAC equipment over a school's lifetime. The review should include a computerized load calculation, energy systems flow diagram, control system analysis, and a computerized…
Comparative study of smile analysis by subjective and computerized methods.
Basting, Roberta Tarkany; da Trindade, Rita de Cássia Silva; Flório, Flávia Martão
2006-01-01
This study compared: 1) the subjective analyses of a smile performed by specialists with advanced training and by general dentists; 2) the subjective analysis of the smile alone, or of the smile associated with the face, by specialists with advanced training and by general dentists; and 3) the subjective analyses by specialists with advanced training against a computerized analysis of the smile, verifying the midline, labial line, smile line, the line between commissures and the golden proportion. The sample consisted of 100 adults with natural dentition; 200 photographs were taken (100 of the smile and 100 of the entire face). Computerized analysis using AutoCAD software was performed, together with the subjective analyses of 2 groups of professionals (3 general dentists and 3 specialists with advanced training), using the following assessment factors: the midline, labial line, smile line, line between the commissures and the golden proportion. The smile itself and the smile associated with the entire face were recorded as agreeable or not agreeable by the professionals. The McNemar test showed a highly significant difference (p=0.0000) between the subjective analyses performed by specialists and those by general dentists. Between the 2 groups of dental professionals, highly significant differences (p=0.0000) were also found between the subjective analyses of the smile and those of the face. The McNemar test showed statistical differences in all factors assessed, with the exception of the midline (p=0.1951), when the computerized analysis and the subjective analysis of the specialists were compared. Thus, for establishing harmony of the smile, it was not possible to assign greater or lesser relevance to any of the factors analyzed.
Computerized Systems for Collecting Real-Time Observational Data.
ERIC Educational Resources Information Center
Kahng, SungWoo; Iwata, Brian
1998-01-01
A survey of 15 developers of computerized real-time observation systems found many systems have incorporated laptop or handheld computers as well as bar-code scanners. Most systems used IBM-compatible software, and ranged from free to complete systems costing more than $1,500. Data analysis programs were included with most programs. (Author/CR)
ERIC Educational Resources Information Center
Bantum, Erin O'Carroll; Owen, Jason E.
2009-01-01
Psychological interventions provide linguistic data that are particularly useful for testing mechanisms of action and improving intervention methodologies. For this study, emotional expression in an Internet-based intervention for women with breast cancer (n = 63) was analyzed via rater coding and 2 computerized coding methods (Linguistic Inquiry…
ERIC Educational Resources Information Center
Reychav, Iris; Raban, Daphne Ruth; McHaney, Roger
2018-01-01
The current empirical study examines relationships between network measures and learning performance from a social network analysis perspective. We collected computerized, networking data to analyze how 401 junior high students connected to classroom peers using text- and video-based material on iPads. Following a period of computerized…
The role of computerized symbolic manipulation in rotorcraft dynamics analysis
NASA Technical Reports Server (NTRS)
Crespo Da Silva, Marcelo R. M.; Hodges, Dewey H.
1986-01-01
The potential role of symbolic manipulation programs in development and solution of the governing equations for rotorcraft dynamics problems is discussed and illustrated. Nonlinear equations of motion for a helicopter rotor blade represented by a rotating beam are developed making use of the computerized symbolic manipulation program MACSYMA. The use of computerized symbolic manipulation allows the analyst to concentrate on more meaningful tasks, such as establishment of physical assumptions, without being sidetracked by the tedious and trivial details of the algebraic manipulations. Furthermore, the resulting equations can be produced, if necessary, in a format suitable for numerical solution. A perturbation-type solution for the resulting dynamical equations is shown to be possible with a combination of symbolic manipulation and standard numerical techniques. This should ultimately lead to a greater physical understanding of the behavior of the solution than is possible with purely numerical techniques. The perturbation analysis of the flapping motion of a rigid rotor blade in forward flight is presented, for illustrative purposes, via computerized symbolic manipulation with a method that bypasses Floquet theory.
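A taste of the same workflow with SymPy standing in for MACSYMA: derive the flapping equation of a centrally hinged rigid blade from a Lagrangian via the Euler-Lagrange equations, then linearize. The single-term Lagrangian is a textbook simplification, not the paper's full model.

```python
import sympy as sp
from sympy.calculus.euler import euler_equations

t = sp.symbols("t")
Omega, I_b = sp.symbols("Omega I_b", positive=True)   # rotor speed, flap inertia
beta = sp.Function("beta")(t)                          # blade flap angle

# Lagrangian keeping only the flapping kinetic energy and the centrifugal
# stiffening term (a textbook simplification).
L = sp.Rational(1, 2) * I_b * (beta.diff(t) ** 2 + Omega**2 * sp.cos(beta) ** 2)

eom = euler_equations(L, [beta], t)[0]   # symbolic equation of motion
print(eom)

# Small-angle substitution sin(beta)*cos(beta) -> beta recovers the classic
# result: the flap natural frequency equals the rotor speed (one per rev).
print(eom.lhs.subs(sp.sin(beta) * sp.cos(beta), beta))
```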
Kazandjian, Vahé A; Lipitz-Snyderman, Allison
2011-12-01
To discuss the usefulness of health care information technology (HIT) in assisting care providers to minimize uncertainty while simultaneously increasing the efficiency of the care provided. The design of this study is an ongoing examination of HIT, performance measurement (clinical and production efficiency), and their implications for the payment for care. Since 2006, all Maryland hospitals have embarked on a multi-faceted study of performance measures and HIT adoption surveys, which will shape the health care payment model in Maryland, the last of the all-payor states, in 2011. This paper focuses on the HIT component of the Maryland care payment initiative. While the payment model is still under review and discussion, 'appropriateness' of care has been discussed as an important dimension of measurement. Within this dimension, the concept of 'uncertainty' has been identified as associated with variation in care practices. Hence, the methods of this paper define how HIT can assist care providers in addressing uncertainty, and findings from the first HIT survey in Maryland are used to infer the readiness of Maryland hospitals to address uncertainty of care, in part through the use of HIT. Maryland hospitals show noteworthy variation in their adoption and use of HIT. While computerized electronic patient records are not commonly used among and across Maryland hospitals, many of the internal uses of HIT in each hospital could significantly assist in better communication about better practices to minimize uncertainty of care and enhance the efficiency of its production. © 2010 Blackwell Publishing Ltd.
Computerized image analysis for quantitative neuronal phenotyping in zebrafish.
Liu, Tianming; Lu, Jianfeng; Wang, Ye; Campbell, William A; Huang, Ling; Zhu, Jinmin; Xia, Weiming; Wong, Stephen T C
2006-06-15
An integrated microscope image analysis pipeline is developed for automatic analysis and quantification of phenotypes in zebrafish with altered expression of Alzheimer's disease (AD)-linked genes. We hypothesize that a slight impairment of neuronal integrity in a large number of zebrafish carrying the mutant genotype can be detected through the computerized image analysis method. Key functionalities of our zebrafish image processing pipeline include quantification of neuron loss in zebrafish embryos due to knockdown of AD-linked genes, automatic detection of defective somites, and quantitative measurement of gene expression levels in zebrafish with altered expression of AD-linked genes or treatment with a chemical compound. These quantitative measurements enable the archival of analyzed results and relevant meta-data. The structured database is organized for statistical analysis and data modeling to better understand neuronal integrity and phenotypic changes of zebrafish under different perturbations. Our results show that the computerized analysis is comparable to manual counting with equivalent accuracy and improved efficacy and consistency. Development of such an automated data analysis pipeline represents a significant step forward to achieve accurate and reproducible quantification of neuronal phenotypes in large scale or high-throughput zebrafish imaging studies.
An analysis of computerization in primary care practices.
Condon, James V; Smith, Sherry P
2002-12-01
To remain profitable, primary care practices, the front-line health care providers, must provide excellent patient care and reduce expenses while providing payers with accurate data. Many primary care practices have turned to computer technology to achieve these goals. This study examined the degree of computerization of primary care providers in the Augusta, Georgia, metropolitan area as well as the level of awareness of the Health Insurance Portability and Accountability Act (HIPAA) by primary care providers and its potential effect on their future computerization plans. The study's findings are presented and discussed as well as a number of recommendations for practice managers.
THE VALIDITY OF HUMAN AND COMPUTERIZED WRITING ASSESSMENT
DOE Office of Scientific and Technical Information (OSTI.GOV)
Ronald L. Boring
2005-09-01
This paper summarizes an experiment designed to assess the validity of essay grading between holistic and analytic human graders and a computerized grader based on latent semantic analysis. The validity of the grade was gauged by the extent to which the student’s knowledge of the topic correlated with the grader’s expert knowledge. To assess knowledge, Pathfinder networks were generated by the student essay writers, the holistic and analytic graders, and the computerized grader. It was found that the computer-generated grades more closely matched the definition of valid grading than did human-generated grades.
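A minimal sketch of LSA-style grading: project essays and an expert reference into a low-rank semantic space (TF-IDF plus truncated SVD) and score each essay by cosine similarity to the reference. The tiny corpus is illustrative; a real grader trains the space on a large corpus.

```python
from sklearn.feature_extraction.text import TfidfVectorizer
from sklearn.decomposition import TruncatedSVD
from sklearn.metrics.pairwise import cosine_similarity

reference = "uncertainty analysis propagates input errors through a model"
essays = [
    "propagating input error through the model quantifies output uncertainty",
    "my summer vacation was relaxing and the beach was warm",
]

docs = [reference] + essays
X = TfidfVectorizer().fit_transform(docs)                          # term-document matrix
Z = TruncatedSVD(n_components=2, random_state=0).fit_transform(X)  # LSA space

# Higher similarity to the expert reference stands in for a higher grade.
for essay, score in zip(essays, cosine_similarity(Z[:1], Z[1:])[0]):
    print(f"{score:+.2f}  {essay[:45]}")
```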
Surface mapping of spike potential fields: experienced EEGers vs. computerized analysis.
Koszer, S; Moshé, S L; Legatt, A D; Shinnar, S; Goldensohn, E S
1996-03-01
An EEG epileptiform spike focus recorded with scalp electrodes is clinically localized by visual estimation of the point of maximal voltage and the distribution of its surrounding voltages. We compared such estimated voltage maps, drawn by experienced electroencephalographers (EEGers), with a computerized spline interpolation technique employed in the commercially available software package FOCUS. Twenty-two spikes were recorded from 15 patients during long-term continuous EEG monitoring. Maps of voltage distribution from the 28 electrodes surrounding the points of maximum change in slope (the spike maximum) were constructed by the EEGer. The same points of maximum spike and voltage distributions at the 29 electrodes were mapped by computerized spline interpolation and a comparison between the two methods was made. The findings indicate that the computerized spline mapping techniques employed in FOCUS construct voltage maps with similar maxima and distributions as the maps created by experienced EEGers. The dynamics of spike activity, including correlations, are better visualized using the computerized technique than by manual interpretation alone. Its use as a technique for spike localization is accurate and adds information of potential clinical value.
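A minimal sketch of the interpolation step: scattered electrode voltages at the spike maximum are mapped onto a grid with a thin-plate spline, the same family of spline interpolation described above. The electrode layout and voltages are fabricated.

```python
import numpy as np
from scipy.interpolate import RBFInterpolator

# (x, y) scalp positions of a few electrodes and spike-peak voltages (uV).
pos = np.array([[0, 0], [1, 0], [-1, 0], [0, 1], [0, -1], [1, 1], [-1, -1]],
               dtype=float)
volts = np.array([-80.0, -35.0, -30.0, -40.0, -25.0, -10.0, -5.0])

interp = RBFInterpolator(pos, volts, kernel="thin_plate_spline")
gx, gy = np.meshgrid(np.linspace(-1, 1, 5), np.linspace(-1, 1, 5))
vmap = interp(np.column_stack([gx.ravel(), gy.ravel()])).reshape(5, 5)

print(np.round(vmap, 1))   # voltage topography around the spike maximum
```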
ERIC Educational Resources Information Center
Wang, Shudong; McCall, Marty; Jiao, Hong; Harris, Gregg
2012-01-01
The purposes of this study are twofold. First, to investigate the construct or factorial structure of a set of Reading and Mathematics computerized adaptive tests (CAT), "Measures of Academic Progress" (MAP), given in different states at different grades and academic terms. The second purpose is to investigate the invariance of test…
ERIC Educational Resources Information Center
Vasarhelyi, Paul
The new data retrieval system for the social sciences which has recently been installed in the UNESCO Secretariat in Paris is described in this comprehensive report. The computerized system is designed to facilitate the existing storage systems in the circulation of information, data retrieval, and indexing services. Basically, this report…
ERIC Educational Resources Information Center
Lutz, John E.; And Others
The degree of success of the computerized Child-Based Information System (CBIS) was analyzed in two areas--presenting, delivering, and managing a developmental curriculum; and recording, filing, and monitoring child tracking data, including requirements for Individualized Education Plans (IEP's). Preschool handicapped and high-risk children and…
ERIC Educational Resources Information Center
Sokolowski, Andrzej; Li, Yeping; Willson, Victor
2015-01-01
Background: The process of problem solving is difficult for students; thus, mathematics educators have made multiple attempts to seek ways of making this process more accessible to learners. The purpose of this study was to examine the effect size statistic of utilizing exploratory computerized environments (ECEs) to support the process of word…
SYN-OP-SYS™: A Computerized Management Information System for Quality Assurance and Risk Management
Thomas, David J.; Weiner, Jayne; Lippincott, Ronald C.
1985-01-01
SYN·OP·SYS™ is a computerized management information system for quality assurance and risk management. Computer software for the efficient collection and analysis of “occurrences” and the clinical data associated with these kinds of patient events is described. The system is evaluated according to certain computer design criteria, and the system's implementation is assessed.
ERIC Educational Resources Information Center
Brandhorst, W. T.
An analysis of existing computerized data banks in science and technology reveals that nearly half of them involve the storage and retrieval of bibliographic data. Activity in this area has been independent and autonomous. This situation is now giving way to a new environment which involves cooperation, standards, and a rigorous rational analysis…
Person Fit Analysis in Computerized Adaptive Testing Using Tests for a Change Point
ERIC Educational Resources Information Center
Sinharay, Sandip
2016-01-01
Meijer and van Krimpen-Stoop noted that the number of person-fit statistics (PFSs) that have been designed for computerized adaptive tests (CATs) is relatively modest. This article partially addresses that concern by suggesting three new PFSs for CATs. The statistics are based on tests for a change point and can be used to detect an abrupt change…
DOE Office of Scientific and Technical Information (OSTI.GOV)
Peters, Valerie A.; Ogilvie, Alistair B.
2012-01-01
This data collection recommendations report was written by Sandia National Laboratories to address the general data requirements for reliability analysis of fielded wind turbines and other wind plant equipment. The report is intended to help develop a basic understanding of the data needed for reliability analysis from a Computerized Maintenance Management System (CMMS) and other data systems. It provides a rationale for why this data should be collected, a list of the data needed to support reliability and availability analysis, and specific recommendations for a CMMS to support automated analysis. Though written for reliability analysis of wind turbines, much of the information is applicable to a wider variety of equipment and analysis and reporting needs. The 'Motivation' section of the report provides a rationale for collecting and analyzing field data for reliability analysis. The benefits of this type of effort can include increased energy delivered, decreased operating costs, enhanced preventive maintenance schedules, solutions to issues with the largest payback, and identification of early failure indicators.
A study of commuter airplane design optimization
NASA Technical Reports Server (NTRS)
Keppel, B. V.; Eysink, H.; Hammer, J.; Hawley, K.; Meredith, P.; Roskam, J.
1978-01-01
The usability of the general aviation synthesis program (GASP) was enhanced by the development of separate computer subroutines which can be added as a package to this assembly of computerized design methods or used as a separate subroutine program to compute the dynamic longitudinal, lateral-directional stability characteristics for a given airplane. Currently available analysis methods were evaluated to ascertain those most appropriate for the design functions which the GASP computerized design program performs. Methods for providing proper constraint and/or analysis functions for GASP were developed as well as the appropriate subroutines.
ERIC Educational Resources Information Center
Beaudrie, Sara M.; Ducar, Cynthia
2012-01-01
This paper outlines the design, implementation, and analysis of a computerized Spanish heritage language (SHL) placement exam. The exam created by the authors exemplifies how to design a simple yet effective placement exam with limited resources. It is suggested that an SHL placement exam should be developed in-house due not only to the diversity…
Marquié, J C; Thon, B; Baracat, B
1994-06-01
The study of Bue and Gollac (1988) provided evidence that a significantly lower proportion of workers aged 45 years and over make use of computer technology compared with younger ones. The aim of the present survey was to explain this fact by a more intensive analysis of the older workers' attitude with respect to the computerization of work situations in relation to other individual and organizational factors. Six hundred and twenty office workers from 18 to 70 years old, either users or non-users of computerized devices, were asked to complete a questionnaire. The questions allowed the assessment of various aspects of the workers' current situation, such as the computer training they had received, the degree of consultation they were subjected to during the computerization process, their representation of the effects of these new technologies on working conditions and employment, the rate of use of new technologies outside the work context, and the perceived usefulness of computers for their own work. The analysis of the questionnaire revealed that as long as the step towards using computer tools, even minimally, has not been taken, then attitudes with respect to computerization are on the whole not very positive and are a source of anxiety for many workers. Age, and even more, seniority in the department, increase such negative representations. The effects of age and seniority were also found among users, as well as the effects of other factors such as qualification, education level, type and rate of computer use, and size of the firm. For the older workers, the expectation of less positive consequences for their career, or even the fear that computerization might be accompanied by threats to their own employment and the less clear knowledge of how computers operate, appeared to account for a significant part of the observed age and seniority differences in attitudes. Although the difference in the amount of computer training between age groups was smaller than expected, the study revealed that one third of the users never received any specific training, and that many of those who benefited from it were trained for only a few days. Consultation of the staff during the computerization process also appeared to be poor, to apply mostly to the best trained and qualified workers, and to be more highly developed in small companies. The results are discussed in the light of more qualitative data recorded during the survey. They suggest the need to increase information, training and involvement of all personnel from the very first stages of computerization (or other technical changes) in order to lessen fears and the feeling of disruption, which are particularly obvious among the oldest workers.
Rossi, Michael R.; Tanaka, Daigo; Shimada, Kenji; Rabin, Yoed
2009-01-01
The current study focuses on experimentally validating a planning scheme based on the so-called bubble-packing method. This study is a part of an ongoing effort to develop computerized planning tools for cryosurgery, where bubble packing has been previously developed as a means to find an initial, uniform distribution of cryoprobes within a given domain; the so-called force-field analogy was then used to move cryoprobes to their optimum layout. However, due to the high quality of the cryoprobe distribution suggested by bubble packing and its low computational cost, it has been argued that a planning scheme based solely on bubble packing may be more clinically relevant. To test this argument, an experimental validation is performed on a simulated cross-section of the prostate, using gelatin solution as a phantom material, proprietary liquid-nitrogen based cryoprobes, and a cryoheater to simulate urethral warming. Experimental results are compared with numerically simulated temperature histories resulting from planning. Results indicate an average disagreement of 0.8 mm in identifying the freezing front location, which is an acceptable level of uncertainty in the context of prostate cryosurgery imaging. PMID:19885373
Computerized photogrammetry used to calculate the brow position index.
Naif-de-Andrade, Naif Thadeu; Hochman, Bernardo; Naif-de-Andrade, Camila Zirlis; Ferreira, Lydia Masako
2012-10-01
The orbital region is of vital importance to facial expression. Brow ptosis, besides having an impact on facial harmony, is a sign of aging. Various surgical techniques have been developed to increase the efficacy of brow-lift surgery. However, no consensus method exists for an objective measurement of the eyebrow position due to the curvature of the face. Therefore, this study aimed to establish a method for measuring the eyebrow position using computerized photogrammetry. For this study, 20 orbital regions of 10 volunteers were measured by direct anthropometry using a digital caliper and by indirect anthropometry (computerized photogrammetry) using standardized digital photographs. Lines, points, and distances were defined based on the position of the anthropometric landmarks endocanthion and exocanthion and then used to calculate the brow position index (BPI). Statistical analysis was performed using Student's t test with a significance level of 5 %. The BPI values obtained by computerized photogrammetric measurements did not differ significantly from those obtained by direct anthropometric measurements (p > 0.05). The mean BPI was 84.89 ± 10.30 for the computerized photogrammetric measurements and 85.27 ± 10.67 for the direct anthropometric measurements. The BPI defined in this study and obtained by computerized photogrammetry is a reproducible and efficient method for measuring the eyebrow position. This journal requires that authors assign a level of evidence to each article.
Prediction of ball and roller bearing thermal and kinematic performance by computer analysis
NASA Technical Reports Server (NTRS)
Pirvics, J.; Kleckner, R. J.
1983-01-01
Characteristics of good computerized analysis software are suggested. These general remarks and an overview of representative software precede a more detailed discussion of load support system analysis program structure. Particular attention is directed at a recent cylindrical roller bearing analysis as an example of the available design tools. Selected software modules are then examined to reveal the detail inherent in contemporary analysis. This leads to a brief section on current design computation which seeks to suggest when and why computerized analysis is warranted. An example concludes the argument offered for such design methodology. Finally, remarks are made concerning needs for model development to address effects which are now considered to be secondary but are anticipated to emerge to primary status in the near future.
Gomar, Jesús J; Valls, Elia; Radua, Joaquim; Mareca, Celia; Tristany, Josep; del Olmo, Francisco; Rebolleda-Gil, Carlos; Jañez-Álvarez, María; de Álvaro, Francisco J; Ovejero, María R; Llorente, Ana; Teixidó, Cristina; Donaire, Ana M; García-Laredo, Eduardo; Lazcanoiturburu, Andrea; Granell, Luis; Mozo, Cristina de Pablo; Pérez-Hernández, Mónica; Moreno-Alcázar, Ana; Pomarol-Clotet, Edith; McKenna, Peter J
2015-11-01
The effectiveness of cognitive remediation therapy (CRT) for the neuropsychological deficits seen in schizophrenia is supported by meta-analysis. However, a recent methodologically rigorous trial had negative findings. In this study, 130 chronic schizophrenic patients were randomly assigned to computerized CRT, an active computerized control condition (CC) or treatment as usual (TAU). Primary outcome measures were 2 ecologically valid batteries of executive function and memory, rated under blind conditions; other executive and memory tests and a measure of overall cognitive function were also employed. Carer ratings of executive and memory failures in daily life were obtained before and after treatment. Computerized CRT was found to produce improvement on the training tasks, but this did not transfer to gains on the primary outcome measures and most other neuropsychological tests in comparison to either CC or TAU conditions. Nor did the intervention result in benefits on carer ratings of daily life cognitive failures. According to this study, computerized CRT is not effective in schizophrenia. The use of both active and passive CCs suggests that the nature of the control group is not an important factor influencing results. © The Author 2015. Published by Oxford University Press on behalf of the Maryland Psychiatric Research Center.
Computerized series solution of relativistic equations of motion.
NASA Technical Reports Server (NTRS)
Broucke, R.
1971-01-01
A method of solution of the equations of planetary motion is described. It consists of the use of numerical general perturbations in orbital elements and in rectangular coordinates. The solution is expanded in Fourier series in the mean anomaly with the aid of harmonic analysis and computerized series manipulation techniques. A detailed application to the relativistic motion of the planet Mercury is described both for Schwarzschild and isotropic coordinates.
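The harmonic-analysis step described above can be illustrated with elementary numerics. The Python sketch below recovers the low-order Fourier coefficients of a signal sampled over one period of the mean anomaly; the signal is an arbitrary stand-in, not the actual perturbation series from the paper.

import numpy as np

# Sample a periodic "perturbation" over one full period of the mean anomaly M.
# The signal here is an arbitrary stand-in for a numerically integrated
# perturbation; only the harmonic-analysis step mirrors the described method.
N = 256
M = np.linspace(0.0, 2.0 * np.pi, N, endpoint=False)
signal = 0.3 * np.cos(M) + 0.05 * np.sin(2 * M) + 0.01 * np.cos(3 * M)

# Discrete Fourier analysis: a_k and b_k are the cosine/sine coefficients of
# the truncated series f(M) ~ a_0/2 + sum_k (a_k cos kM + b_k sin kM).
K = 4
a = [2.0 / N * np.sum(signal * np.cos(k * M)) for k in range(K)]
b = [2.0 / N * np.sum(signal * np.sin(k * M)) for k in range(K)]

for k in range(K):
    print(f"k={k}: a_k={a[k]:+.4f}, b_k={b[k]:+.4f}")
# Expect a_1 ~ 0.30, b_2 ~ 0.05, a_3 ~ 0.01, everything else ~ 0.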
Weir, Charlene R; Nebeker, Jonathan J R; Hicken, Bret L; Campo, Rebecca; Drews, Frank; Lebar, Beth
2007-01-01
Computerized Provider Order Entry (CPOE) with electronic documentation, and computerized decision support dramatically changes the information environment of the practicing clinician. Prior work patterns based on paper, verbal exchange, and manual methods are replaced with automated, computerized, and potentially less flexible systems. The objective of this study is to explore the information management strategies that clinicians use in the process of adapting to a CPOE system using cognitive task analysis techniques. Observation and semi-structured interviews were conducted with 88 primary-care clinicians at 10 Veterans Administration Medical Centers. Interviews were taped, transcribed, and extensively analyzed to identify key information management goals, strategies, and tasks. Tasks were aggregated into groups, common components across tasks were clarified, and underlying goals and strategies identified. Nearly half of the identified tasks were not fully supported by the available technology. Six core components of tasks were identified. Four meta-cognitive information management goals emerged: 1) Relevance Screening; 2) Ensuring Accuracy; 3) Minimizing memory load; and 4) Negotiating Responsibility. Strategies used to support these goals are presented. Users develop a wide array of information management strategies that allow them to successfully adapt to new technology. Supporting the ability of users to develop adaptive strategies to support meta-cognitive goals is a key component of a successful system.
Investigation of computer-aided colonic crypt pattern analysis
NASA Astrophysics Data System (ADS)
Qi, Xin; Pan, Yinsheng; Sivak, Michael V., Jr.; Olowe, Kayode; Rollins, Andrew M.
2007-02-01
Colorectal cancer is the second leading cause of cancer-related death in the United States. Approximately 50% of these deaths could be prevented by earlier detection through screening. Magnification chromoendoscopy is a technique which utilizes tissue stains applied to the gastrointestinal mucosa and high-magnification endoscopy to better visualize and characterize lesions. Prior studies have shown that shapes of colonic crypts change with disease and show characteristic patterns. Current methods for assessing colonic crypt patterns are somewhat subjective and not standardized. Computerized algorithms could be used to standardize colonic crypt pattern assessment. We have imaged resected colonic mucosa in vitro (N = 70) using methylene blue dye and a surgical microscope to approximately simulate in vivo imaging with magnification chromoendoscopy. We have developed a method of computerized processing to analyze the crypt patterns in the images. The quantitative image analysis consists of three steps. First, the crypts within the region of interest of colonic tissue are semi-automatically segmented using watershed morphological processing. Second, crypt size and shape parameters are extracted from the segmented crypts. Third, each sample is assigned to a category according to the Kudo criteria. The computerized classification is validated by comparison with human classification using the Kudo classification criteria. The computerized colonic crypt pattern analysis algorithm will enable a study of in vivo magnification chromoendoscopy of colonic crypt pattern correlated with risk of colorectal cancer. This study will assess the feasibility of screening and surveillance of the colon using magnification chromoendoscopy.
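The first two of the three analysis steps described above can be sketched with standard tools. The Python fragment below runs a marker-based watershed segmentation on a synthetic binary image and extracts size and shape parameters from each segment using scikit-image; it is a schematic stand-in for the authors' semi-automatic pipeline, and the final Kudo-category assignment is omitted.

import numpy as np
from scipy import ndimage as ndi
from skimage.segmentation import watershed
from skimage.feature import peak_local_max
from skimage.measure import regionprops

# Synthetic binary "crypt" image: two overlapping round pits.
img = np.zeros((80, 80), dtype=bool)
yy, xx = np.ogrid[:80, :80]
img |= (yy - 30) ** 2 + (xx - 30) ** 2 < 15 ** 2
img |= (yy - 45) ** 2 + (xx - 50) ** 2 < 15 ** 2

# Step 1: watershed segmentation seeded at distance-transform maxima,
# mirroring the morphological segmentation described in the abstract.
distance = ndi.distance_transform_edt(img)
peaks = peak_local_max(distance, labels=img, min_distance=10)
markers = np.zeros_like(distance, dtype=int)
markers[tuple(peaks.T)] = np.arange(1, len(peaks) + 1)
segments = watershed(-distance, markers, mask=img)

# Step 2: extract size and shape parameters from each segmented crypt.
for region in regionprops(segments):
    elongation = region.major_axis_length / max(region.minor_axis_length, 1e-9)
    print(f"crypt {region.label}: area={region.area}, elongation={elongation:.2f}")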
Economic Evaluation of Computerized Structural Analysis
NASA Technical Reports Server (NTRS)
Fortin, P. E.
1985-01-01
This completed effort involved a technical and economic study of the capabilities of computer programs in the area of structural analysis. The applicability of the programs to NASA projects and to other users was studied. Applications in other industries were explored, including both research and development and applied areas. The costs of several alternative analysis programs were compared. A literature search covered applicable technical literature including journals, trade publications and books. In addition to the literature search, several commercial companies that have developed computerized structural analysis programs were contacted and their technical brochures reviewed. These programs include SDRC I-DEAS, MSC/NASTRAN, SCADA, SUPERSAP, NISA/DISPLAY, STAAD-III, MICAS, GTSTRUDL, and STARS. These programs were briefly reviewed as applicable to NASA projects.
Kuusk, Teele; De Bruijn, Roderick; Brouwer, Oscar R; De Jong, Jeroen; Donswijk, Maarten; Grivas, Nikolaos; Hendricksen, Kees; Horenblas, Simon; Prevoo, Warner; Valdés Olmos, Renato A; Van Der Poel, Henk G; Van Rhijn, Bas W G; Wit, Esther M; Bex, Axel
2018-06-01
Lymphatic drainage from renal tumors is unpredictable. In vivo drainage studies of primary lymphatic landing sites may reveal the variability and dynamics of lymphatic connections. The purpose of this study was to investigate the lymphatic drainage pattern of renal tumors in vivo with single photon emission/computerized tomography after intratumor radiotracer injection. We performed a phase II, prospective, single arm study to investigate the distribution of sentinel nodes from renal tumors on single photon emission/computerized tomography. Patients with cT1-3 (less than 10 cm) cN0M0 renal tumors of any subtype were enrolled in analysis. After intratumor ultrasound guided injection of 0.4 ml 99mTc-nanocolloid we performed preoperative imaging of sentinel nodes with lymphoscintigraphy and single photon emission/computerized tomography. Sentinel and locoregional nonsentinel nodes were resected with a γ probe combined with a mobile γ camera. The primary study end point was the location of sentinel nodes outside the locoregional retroperitoneal templates on single photon emission/computerized tomography. Using a Simon minimax 2-stage design to detect a 25% extralocoregional retroperitoneal template location of sentinel nodes on imaging at α = 0.05 and 80% power, at least 40 patients with sentinel node imaging on single photon emission/computerized tomography were needed. Of the 68 patients 40 underwent preoperative single photon emission/computerized tomography of sentinel nodes and were included in primary end point analysis. Lymphatic drainage outside the locoregional retroperitoneal templates was observed in 14 patients (35%). Eight patients (20%) had supradiaphragmatic sentinel nodes. Sentinel nodes from renal tumors were mainly located in the respective locoregional retroperitoneal templates. Simultaneous sentinel nodes were located outside the suggested lymph node dissection templates, including supradiaphragmatic sentinel nodes, in more than a third of the patients. Copyright © 2018 American Urological Association Education and Research, Inc. Published by Elsevier Inc. All rights reserved.
Atlas of computerized blood flow analysis in bone disease.
Gandsman, E J; Deutsch, S D; Tyson, I B
1983-11-01
The role of computerized blood flow analysis in routine bone scanning is reviewed. Cases illustrating the technique include proven diagnoses of toxic synovitis, Legg-Perthes disease, arthritis, avascular necrosis of the hip, fractures, benign and malignant tumors, Paget's disease, cellulitis, osteomyelitis, and shin splints. Several examples also show the use of the technique in monitoring treatment. The use of quantitative data from the blood flow, bone uptake phase, and static images suggests specific diagnostic patterns for each of the diseases presented in this atlas. Thus, this technique enables increased accuracy in the interpretation of the radionuclide bone scan.
Beheshti, Iman; Olya, Hossain G T; Demirel, Hasan
2016-04-05
Recently, automatic risk assessment methods have become a target for detecting Alzheimer's disease (AD) risk. This study aims to develop an automatic computer-aided AD diagnosis technique for risk assessment of AD using information diffusion theory. Information diffusion is a set-valued fuzzy mathematics technique used for risk assessment of natural phenomena, which accommodates fuzziness (uncertainty) and incompleteness. Data were obtained from voxel-based morphometry analysis of structural magnetic resonance imaging. The information diffusion model results revealed that the risk of AD increases with a reduction of the normalized gray matter ratio (p > 0.5, normalized gray matter ratio <40%). The information diffusion model results were evaluated by calculation of the correlation of two traditional risk assessments of AD, the Mini-Mental State Examination and the Clinical Dementia Rating. The correlation results revealed that the information diffusion model findings were in line with Mini-Mental State Examination and Clinical Dementia Rating results. Application of the information diffusion model contributes to the computerization of AD risk assessment, which has practical implications for the early detection of AD.
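For orientation, the classical normal information diffusion estimator spreads each observation over a grid of monitoring points before aggregating. The Python sketch below applies it to made-up gray-matter-ratio values; the bandwidth and the data are illustrative assumptions, not the authors' model.

import numpy as np

# Made-up normalized gray matter ratios (%) for a small, incomplete sample.
x = np.array([38.0, 41.5, 43.2, 39.8, 45.0, 36.5])

# Monitoring points covering the range of interest.
u = np.linspace(30.0, 50.0, 41)

# Simple bandwidth; classical treatments derive h from the sample range and
# size, but any reasonable smoothing width illustrates the idea.
h = 1.5

# Normal information diffusion: each observation x_i contributes a Gaussian
# "information" distribution over the monitoring points; rows are normalized
# so every observation spreads exactly one unit of information.
kernel = np.exp(-((u[None, :] - x[:, None]) ** 2) / (2 * h ** 2))
kernel /= kernel.sum(axis=1, keepdims=True)

# Aggregated information at each monitoring point, normalized to a
# probability-like distribution over u.
q = kernel.sum(axis=0)
p = q / q.sum()

# Estimated probability that the ratio falls below 40%.
print(f"P(ratio < 40%) ~ {p[u < 40.0].sum():.2f}")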
Hazut, Koren; Romem, Pnina; Malkin, Smadar; Livshiz-Riven, Ilana
2016-12-01
The purpose of this study was to compare the predictive validity, economic efficiency, and faculty staff satisfaction of a computerized test versus a personal interview as admission methods for graduate nursing studies. A mixed method study was designed, including cross-sectional and retrospective cohorts, interviews, and cost analysis. One hundred and thirty-four students in the Master of Nursing program participated. The success of students in required core courses was similar in both admission method groups. The personal interview method was found to be a significant predictor of success, with cognitive variables the only significant contributors to the model. Higher satisfaction levels were reported with the computerized test compared with the personal interview method. The cost of the personal interview method, in annual work hours, was 2.28 times higher than that of the computerized test. These findings may promote discussion regarding the cost benefit of the personal interview as an admission method for advanced academic studies in healthcare professions. © 2016 John Wiley & Sons Australia, Ltd.
Computerized N-acetylcysteine physician order entry by template protocol for acetaminophen toxicity.
Thompson, Trevonne M; Lu, Jenny J; Blackwood, Louisa; Leikin, Jerrold B
2011-01-01
Some medication dosing protocols are logistically complex for traditional physician ordering. The use of computerized physician order entry (CPOE) with templates, or order sets, may be useful to reduce medication administration errors. This study evaluated the rate of medication administration errors using CPOE order sets for N-acetylcysteine (NAC) use in treating acetaminophen poisoning. An 18-month retrospective review of computerized inpatient pharmacy records for NAC use was performed. All patients who received NAC for the treatment of acetaminophen poisoning were included. Each record was analyzed to determine the form of NAC given and whether an administration error occurred. In the 82 cases of acetaminophen poisoning in which NAC was given, no medication administration errors were identified. Oral NAC was given in 31 (38%) cases; intravenous NAC was given in 51 (62%) cases. In this retrospective analysis of N-acetylcysteine administration using computerized physician order entry and order sets, no medication administration errors occurred. CPOE is an effective tool in safely executing complicated protocols in an inpatient setting.
Dahabreh, Issa J; Trikalinos, Thomas A; Lau, Joseph; Schmid, Christopher H
2017-03-01
To compare statistical methods for meta-analysis of sensitivity and specificity of medical tests (e.g., diagnostic or screening tests). We constructed a database of PubMed-indexed meta-analyses of test performance from which 2 × 2 tables for each included study could be extracted. We reanalyzed the data using univariate and bivariate random effects models fit with inverse variance and maximum likelihood methods. Analyses were performed using both normal and binomial likelihoods to describe within-study variability. The bivariate model using the binomial likelihood was also fit using a fully Bayesian approach. We use two worked examples, thoracic computerized tomography to detect aortic injury and rapid prescreening of Papanicolaou smears to detect cytological abnormalities, to highlight that different meta-analysis approaches can produce different results. We also present results from reanalysis of 308 meta-analyses of sensitivity and specificity. Models using the normal approximation produced sensitivity and specificity estimates closer to 50% and smaller standard errors compared to models using the binomial likelihood; absolute differences of 5% or greater were observed in 12% and 5% of meta-analyses for sensitivity and specificity, respectively. Results from univariate and bivariate random effects models were similar, regardless of estimation method. Maximum likelihood and Bayesian methods produced almost identical summary estimates under the bivariate model; however, Bayesian analyses indicated greater uncertainty around those estimates. Bivariate models produced imprecise estimates of the between-study correlation of sensitivity and specificity. Differences between methods were larger with increasing proportion of studies that were small or required a continuity correction. The binomial likelihood should be used to model within-study variability. Univariate and bivariate models give similar estimates of the marginal distributions for sensitivity and specificity. Bayesian methods fully quantify uncertainty and their ability to incorporate external evidence may be useful for imprecisely estimated parameters. Copyright © 2017 Elsevier Inc. All rights reserved.
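As a minimal illustration of the quantities being pooled, the Python sketch below computes per-study sensitivity and specificity (with a continuity correction for zero cells) and their logits from invented 2 × 2 tables; fitting the full bivariate binomial model requires specialized routines not shown here.

import math

# Invented 2x2 tables per study: (true pos, false neg, false pos, true neg).
studies = [(45, 5, 10, 90), (30, 2, 0, 58), (80, 20, 15, 135)]

for i, (tp, fn, fp, tn) in enumerate(studies, start=1):
    # Continuity correction: add 0.5 to every cell if any cell is zero,
    # the kind of adjustment the reanalysis describes for sparse tables.
    if 0 in (tp, fn, fp, tn):
        tp, fn, fp, tn = (c + 0.5 for c in (tp, fn, fp, tn))
    sens = tp / (tp + fn)
    spec = tn / (tn + fp)
    # Logits are the within-study summaries a bivariate model would pool.
    logit_sens = math.log(sens / (1 - sens))
    logit_spec = math.log(spec / (1 - spec))
    print(f"study {i}: sens={sens:.3f} (logit {logit_sens:+.2f}), "
          f"spec={spec:.3f} (logit {logit_spec:+.2f})")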
Shu, Ting; Zhang, Bob; Tang, Yuan Yan
2017-01-01
At present, heart disease is the number one cause of death worldwide. Traditionally, heart disease is commonly detected using blood tests, electrocardiogram, cardiac computerized tomography scan, cardiac magnetic resonance imaging, and so on. However, these traditional diagnostic methods are time consuming and/or invasive. In this paper, we propose an effective noninvasive computerized method based on facial images to quantitatively detect heart disease. Specifically, facial key block color features are extracted from facial images and analyzed using the Probabilistic Collaborative Representation Based Classifier. The idea of facial key block color analysis is founded in Traditional Chinese Medicine. A new dataset consisting of 581 heart disease and 581 healthy samples was used to evaluate the proposed method. In order to optimize the Probabilistic Collaborative Representation Based Classifier, an analysis of its parameters was performed. According to the experimental results, the proposed method obtains the highest accuracy compared with other classifiers and is proven to be effective at heart disease detection.
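The classifier builds on collaborative representation: a query is coded over all training samples with an l2 penalty and assigned to the class with the smallest class-wise reconstruction residual. The Python sketch below shows that plain (non-probabilistic) collaborative representation step on random stand-in feature vectors; the probabilistic refinement and the facial key-block color features are not reproduced here.

import numpy as np

rng = np.random.default_rng(0)

# Stand-in training dictionary: columns are feature vectors from two classes
# (e.g. "healthy" vs "heart disease"); real features would be key-block colors.
d, n_per_class = 12, 20
X0 = rng.normal(0.0, 1.0, (d, n_per_class))   # class 0 samples
X1 = rng.normal(0.5, 1.0, (d, n_per_class))   # class 1 samples
X = np.hstack([X0, X1])
labels = np.array([0] * n_per_class + [1] * n_per_class)

def crc_predict(y, X, labels, lam=0.1):
    """Collaborative representation: ridge-code y over ALL training samples,
    then pick the class whose samples best reconstruct y."""
    n = X.shape[1]
    alpha = np.linalg.solve(X.T @ X + lam * np.eye(n), X.T @ y)
    residuals = []
    for c in np.unique(labels):
        mask = labels == c
        residuals.append(np.linalg.norm(y - X[:, mask] @ alpha[mask]))
    return int(np.argmin(residuals))

query = rng.normal(0.5, 1.0, d)  # a query drawn like class 1
print("predicted class:", crc_predict(query, X, labels))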
Georgieva, Antoniya; Payne, Stephen J; Moulden, Mary; Redman, Christopher W G
2011-01-01
We applied computerized methods to assess the Electronic Fetal Monitoring (EFM) in labor. We analyzed retrospectively the last hour of EFM for 1,370 babies, delivered by emergency Cesarean sections before the onset of pushing (data collected at the John Radcliffe Hospital, Oxford, UK). There were two cohorts according to the reason for intervention: (a) fetal distress, n1 = 524 and (b) failure to progress and/or malpresentation, n2 = 846. The cohorts were compared in terms of classical EFM features (baseline, decelerations, variability and accelerations), computed by a dedicated Oxford system for automated analysis, OxSys. In addition, OxSys was employed to simulate current clinical guidelines for the classification of fetal monitoring, i.e. providing in real time a three-tier grading system of the EFM (normal, indeterminate, or abnormal). The computerized features and the simulated guidelines corresponded well to the clinical management and to the actual labor outcome (measured by umbilical arterial pH).
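The classical features listed above can be approximated crudely from a fetal heart rate trace. The Python sketch below estimates a baseline by heavy smoothing, a simple variability index, and candidate decelerations on a synthetic trace; the thresholds and definitions are illustrative, not OxSys's.

import numpy as np

# Synthetic FHR trace: samples every 4 s around 140 bpm with one dip,
# standing in for the last hour of an EFM recording.
rng = np.random.default_rng(1)
fhr = 140 + rng.normal(0, 3, 900)
fhr[400:430] -= 25  # an artificial deceleration

# Baseline: moving average over a ~10 min window (150 samples), a crude
# proxy for the clinical baseline definition.
win = 150
baseline = np.convolve(fhr, np.ones(win) / win, mode="same")

# Variability: mean absolute deviation from the baseline.
variability = np.mean(np.abs(fhr - baseline))

# Candidate decelerations: drops of >=15 bpm below baseline (the minimum
# duration required by clinical guidelines is ignored here for brevity).
decel_samples = int(np.sum(fhr < baseline - 15))

print(f"mean baseline {baseline.mean():.1f} bpm, "
      f"variability {variability:.1f} bpm, "
      f"samples in candidate decelerations: {decel_samples}")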
Computerized Design Synthesis (CDS), A database-driven multidisciplinary design tool
NASA Technical Reports Server (NTRS)
Anderson, D. M.; Bolukbasi, A. O.
1989-01-01
The Computerized Design Synthesis (CDS) system under development at McDonnell Douglas Helicopter Company (MDHC) is targeted to make revolutionary improvements in both response time and resource efficiency in the conceptual and preliminary design of rotorcraft systems. It makes the accumulated design database and supporting technology analysis results readily available to designers and analysts of technology, systems, and production, and makes powerful design synthesis software available in a user friendly format.
2013-01-01
Objective. This study compared the relationship between computer experience and performance on computerized cognitive tests and a traditional paper-and-pencil cognitive test in a sample of older adults (N = 634). Method. Participants completed computer experience and computer attitudes questionnaires, three computerized cognitive tests (Useful Field of View (UFOV) Test, Road Sign Test, and Stroop task) and a paper-and-pencil cognitive measure (Trail Making Test). Multivariate analysis of covariance was used to examine differences in cognitive performance across the four measures between those with and without computer experience after adjusting for confounding variables. Results. Although computer experience had a significant main effect across all cognitive measures, the effect sizes were similar. After controlling for computer attitudes, the relationship between computer experience and UFOV was fully attenuated. Discussion. Findings suggest that computer experience is not uniquely related to performance on computerized cognitive measures compared with paper-and-pencil measures. Because the relationship between computer experience and UFOV was fully attenuated by computer attitudes, this may imply that motivational factors are more influential to UFOV performance than computer experience. Our findings support the hypothesis that computer use is related to cognitive performance, and this relationship is not stronger for computerized cognitive measures. Implications and directions for future research are provided. PMID:22929395
Ganesan, Vishnu; De, Shubha; Shkumat, Nicholas; Marchini, Giovanni; Monga, Manoj
2018-02-01
Preoperative determination of uric acid stones from computerized tomography imaging would be of tremendous clinical use. We sought to design a software algorithm that could apply data from noncontrast computerized tomography to predict the presence of uric acid stones. Patients with pure uric acid and calcium oxalate stones were identified from our stone registry. Only stones greater than 4 mm which were clearly traceable from initial computerized tomography to final composition were included in analysis. A semiautomated computer algorithm was used to process image data. Average and maximum HU, eccentricity (deviation from a circle) and kurtosis (peakedness vs flatness) were automatically generated. These parameters were examined in several mathematical models to predict the presence of uric acid stones. A total of 100 patients, of whom 52 had calcium oxalate and 48 had uric acid stones, were included in the final analysis. Uric acid stones were significantly larger (12.2 vs 9.0 mm, p = 0.03) but calcium oxalate stones had higher mean attenuation (457 vs 315 HU, p = 0.001) and maximum attenuation (918 vs 553 HU, p <0.001). Kurtosis was significantly higher in each axis for calcium oxalate stones (each p <0.001). A composite algorithm using attenuation distribution pattern, average attenuation and stone size had overall 89% sensitivity, 91% specificity, 91% positive predictive value and 89% negative predictive value to predict uric acid stones. A combination of stone size, attenuation intensity and attenuation pattern from conventional computerized tomography can distinguish uric acid stones from calcium oxalate stones with high sensitivity and specificity. Copyright © 2018 American Urological Association Education and Research, Inc. Published by Elsevier Inc. All rights reserved.
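A toy version of such a composite rule is straightforward to express in code. In the Python sketch below, stone size, mean attenuation, and kurtosis vote on the prediction; the cutoffs are invented placeholders loosely motivated by the group differences reported above, since the abstract does not state the published algorithm's parameters.

def predict_uric_acid(size_mm: float, mean_hu: float, kurtosis: float) -> bool:
    """Toy composite rule: majority vote over three CT-derived features.

    Cutoffs are invented placeholders motivated by the reported group means
    (uric acid stones: larger, lower attenuation, lower kurtosis), not the
    published algorithm's parameters.
    """
    votes = 0
    votes += size_mm > 10.5      # uric acid stones were larger on average
    votes += mean_hu < 390       # and had lower mean attenuation
    votes += kurtosis < 3.0      # and flatter attenuation distributions
    return votes >= 2            # majority vote across the three features

# Example: a 13 mm stone at 320 HU with a flat attenuation histogram.
print(predict_uric_acid(13.0, 320.0, 2.1))  # -> True (predicted uric acid)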
Computerization and its contribution to care quality improvement: the nurses' perspective.
Kagan, Ilya; Fish, Miri; Farkash-Fink, Naomi; Barnoy, Sivia
2014-12-01
Despite the widely held belief that the computerization of hospital medical systems contributes to improved patient care management, especially in the context of ordering medications and record keeping, extensive study of the attitudes of medical staff to computerization has found them to be negative. The views of nursing staff have been barely studied and so are unclear. The study reported here investigated the association between nurses' current computer use and skills, the extent of their involvement in quality control and improvement activities on the ward and their perception of the contribution of computerization to improving nursing care. The study was conducted in the context of a Joint Commission International Accreditation (JCIA) in a large tertiary medical center in Israel. The perceived role of leadership commitment in the success of a quality initiative was also tested. Two convenience samples were drawn from 33 clinical wards and units of the medical center. They were questioned at two time points, one before the JCIA and a second after JCIA completion. Of all nurses (N=489), 89 were paired to allow analysis of the study data in a before-and-after design. Thus, this study built three data sets: a pre-JCIA set, a post-JCIA set and a paired sample who completed the questionnaire both before and after JCIA. Data were collected by a structured self-administered anonymous questionnaire. After the JCIA the participants ranked the role of leadership in quality improvement, the extent of their own quality control activity, and the contribution of computers to quality improvement higher than before the JCIA. Significant Pearson correlations were found showing that the higher the rating given to quality improvement leadership, the more quality improvement activities nurses reported undertaking and the higher nurses rated the impact of computerization on the quality of care. In a regression analysis, quality improvement leadership and computer use/skills accounted for 30% of the variance in the perceived contribution of computerization to quality improvement. (a) The present study is the first to show a relationship between organizational leadership and computer use by nurses for the purpose of improving clinical care. (b) The nurses' appreciation of the contributions computerization can make to data management and to clinical care quality improvement was increased by the JCI accreditation process. Copyright © 2014 Elsevier Ireland Ltd. All rights reserved.
Troncone, Alda; Cascella, Crescenzo; Chianese, Antonietta; Iafusco, Dario
2015-07-01
The purpose of this study was to assess messages posted by mothers of children with type 1 diabetes in the Italian Facebook group "Mamme e diabete" using computerized text analysis. The data suggest that these mothers use online discussion boards as a place to seek and provide information to better manage the disease's daily demands, especially those tasks linked to insulin correction and administration, control of food intake, and bureaucratic duties, as well as to seek and give encouragement and to share experiences regarding diabetes and its impact on their lives. The implications of these findings for the management of diabetes are discussed.
Computerized Production Process Planning. Volume 2. Benefit Analysis.
1976-11-01
…advantage; in the long term, Systems 2 and 3 will return greater economic benefits. Plots of the cumulative present value of the cash flow by year are… is economically viable for large parts manufacturers and does offer significant advantages over Systems 1 and 2 in terms of intangible benefits…
Automated Computerized Analysis of Speech in Psychiatric Disorders
Cohen, Alex S.; Elvevåg, Brita
2014-01-01
Purpose of Review Disturbances in communication are a hallmark of severe mental illness (SMI). Recent technological advances have paved the way for objectifying communication using automated computerized linguistic and acoustic analysis. We review recent studies applying various computer-based assessments to the natural language produced by adult patients with severe mental illness. Recent Findings Automated computerized methods afford tools with which it is possible to objectively evaluate patients in a reliable, valid and efficient manner that complements human ratings. Crucially, these measures correlate with important clinical measures. The clinical relevance of these novel metrics has been demonstrated by showing their relationship to functional outcome measures, their in vivo link to classic ‘language’ regions in the brain, and, in the case of linguistic analysis, their relationship to candidate genes for severe mental illness. Summary Computer-based assessments of natural language afford a framework with which to measure communication disturbances in adults with SMI. Emerging evidence suggests that they can be reliable and valid, and overcome many practical limitations of more traditional assessment methods. The advancement of these technologies offers unprecedented potential for measuring and understanding some of the most crippling symptoms of some of the most debilitating illnesses known to humankind. PMID:24613984
Computerized structural mechanics for 1990's: Advanced aircraft needs
NASA Technical Reports Server (NTRS)
Viswanathan, A. V.; Backman, B. F.
1989-01-01
The needs for computerized structural mechanics (CSM) as seen from the standpoint of the aircraft industry are discussed. These needs are projected into the 1990's with special focus on the new advanced materials. Preliminary design/analysis, research, and detail design/analysis are identified as major areas. The role of local/global analyses in these different areas is discussed. The lessons learned in the past are used as a basis for the design of a CSM framework that could modify and consolidate existing technology and include future developments in a rational and useful way. A philosophy is stated, and a set of analyses needs driven by the emerging advanced composites is enumerated. The roles of NASA, the universities, and the industry are identified. Finally, a set of rational research targets is recommended based on both the new types of computers and the increased complexity the industry faces. Computerized structural mechanics should be more than new methods in structural mechanics and numerical analyses. It should be a set of engineering applications software products that combines innovations in structural mechanics, numerical analysis, data processing, search and display features, and recent hardware advances and is organized in a framework that directly supports the design process.
Del Mazo-Barbara, Anna; Mirabel, Clémentine; Nieto, Valentín; Reyes, Blanca; García-López, Joan; Oliver-Vila, Irene; Vives, Joaquim
2016-09-01
Computerized systems (CS) are essential in the development and manufacture of cell-based medicines and must comply with good manufacturing practice, thus pushing academic developers to implement methods that are typically found within pharmaceutical industry environments. Qualitative and quantitative risk analyses were performed by Ishikawa and Failure Mode and Effects Analysis, respectively. A process for qualification of a CS that keeps track of environmental conditions was designed and executed. The simplicity of the Ishikawa analysis made it possible to identify critical parameters that were subsequently quantified by Failure Mode and Effects Analysis, resulting in a list of tests included in the qualification protocols. The approach presented here contributes to simplify and streamline the qualification of CS in compliance with pharmaceutical quality standards.
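Failure Mode and Effects Analysis conventionally scores each failure mode for severity, occurrence, and detectability and multiplies the scores into a risk priority number (RPN); modes above a threshold then receive dedicated tests. The Python sketch below shows that arithmetic on invented failure modes for an environmental-monitoring CS; the modes, scores, and threshold are assumptions, not the paper's actual analysis.

# Invented failure modes for a computerized environmental-monitoring system,
# scored 1-10 for severity (S), occurrence (O), and detectability (D).
failure_modes = [
    {"mode": "sensor drift not flagged",   "S": 8, "O": 4, "D": 6},
    {"mode": "data logger power loss",     "S": 7, "O": 3, "D": 2},
    {"mode": "alarm e-mail not delivered", "S": 6, "O": 2, "D": 7},
]

RPN_THRESHOLD = 100  # modes at or above this get a dedicated qualification test

for fm in failure_modes:
    rpn = fm["S"] * fm["O"] * fm["D"]  # risk priority number
    action = "add test" if rpn >= RPN_THRESHOLD else "accept"
    print(f'{fm["mode"]:30s} RPN={rpn:4d} -> {action}')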
Grid-Enabled Quantitative Analysis of Breast Cancer
2010-10-01
…large-scale, multi-modality computerized image analysis. The central hypothesis of this research is that large-scale image analysis for breast cancer… research, we designed a pilot study utilizing large-scale parallel Grid computing, harnessing nationwide infrastructure for medical image analysis. Also…
Using a virtual reality temporal bone simulator to assess otolaryngology trainees.
Zirkle, Molly; Roberson, David W; Leuwer, Rudolf; Dubrowski, Adam
2007-02-01
The objective of this study is to determine the feasibility of computerized evaluation of resident performance using hand motion analysis on a virtual reality temporal bone (VR TB) simulator. We hypothesized that both computerized analysis and expert ratings would discriminate the performance of novices from experienced trainees. We also hypothesized that performance on the virtual reality temporal bone simulator (VR TB) would differentiate based on previous drilling experience. The authors conducted a randomized, blind assessment study. Nineteen volunteers from the Otolaryngology-Head and Neck Surgery training program at the University of Toronto drilled both a cadaveric TB and a simulated VR TB. Expert reviewers were asked to assess operative readiness of the trainee based on a blind video review of their performance. Computerized hand motion analysis of each participant's performance was conducted. Expert raters were able to discriminate novices from experienced trainees (P < .05) on cadaveric temporal bones, and there was a trend toward discrimination on VR TB performance. Hand motion analysis showed that experienced trainees had better movement economy than novices (P < .05) on the VR TB. Performance, as measured by hand motion analysis on the VR TB simulator, reflects trainees' previous drilling experience. This study suggests that otolaryngology trainees could accomplish initial temporal bone training on a VR TB simulator, which can provide feedback to the trainee, and may reduce the need for constant faculty supervision and evaluation.
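Movement economy in hand motion analysis is commonly summarized by path length and the number of discrete movements. The Python sketch below computes both from a stream of stand-in 3D tracker positions; the sampling rate, velocity threshold, and data are illustrative assumptions rather than the study's actual metrics.

import numpy as np

rng = np.random.default_rng(2)

# Stand-in 3D hand positions (metres) sampled at 100 Hz from a tracker.
fs = 100.0
positions = np.cumsum(rng.normal(0, 0.001, (500, 3)), axis=0)

# Total path length: sum of distances between consecutive samples. A more
# economical (expert-like) performance yields a shorter path for the same task.
steps = np.diff(positions, axis=0)
path_length = np.linalg.norm(steps, axis=1).sum()

# Number of discrete movements: rest-to-motion velocity-threshold crossings,
# a common movement-segmentation heuristic.
speed = np.linalg.norm(steps, axis=1) * fs
moving = speed > 0.05  # threshold in m/s, an illustrative choice
n_movements = int(np.sum(~moving[:-1] & moving[1:]))

print(f"path length {path_length:.2f} m, discrete movements {n_movements}")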
Computerized physician order entry from a chief information officer perspective.
Cotter, Carole M
2004-12-01
Designing and implementing a computerized physician order entry system in the critical care units of a large urban hospital system is an enormous undertaking. Given their significant potential to improve health care and reduce errors, the time for computerized physician order entry and physician order management systems is past due. Careful integrated planning is the key to success, requiring multidisciplinary teams at all levels of clinical and administrative management to work together. Articulated from the viewpoint of the Chief Information Officer of Lifespan, a not-for-profit hospital system in Rhode Island, the vision and strategy preceding the information technology plan, understanding the system's current state, the gap analysis between current and future state, and finally, building and implementing the information technology plan are described.
Validity of a Manual Soft Tissue Profile Prediction Method Following Mandibular Setback Osteotomy
Kolokitha, Olga-Elpis
2007-01-01
Objectives The aim of this study was to determine the validity of a manual cephalometric method used for predicting the post-operative soft tissue profiles of patients who underwent mandibular setback surgery and compare it to a computerized cephalometric prediction method (Dentofacial Planner). Lateral cephalograms of 18 adults with mandibular prognathism taken at the end of pre-surgical orthodontics and approximately one year after surgery were used. Methods To test the validity of the manual method the prediction tracings were compared to the actual post-operative tracings. The Dentofacial Planner software was used to develop the computerized post-surgical prediction tracings. Both manual and computerized prediction printouts were analyzed by using the cephalometric system PORDIOS. Statistical analysis was performed by means of t-test. Results Comparison between manual prediction tracings and the actual post-operative profile showed that the manual method results in more convex soft tissue profiles; the upper lip was found in a more prominent position, upper lip thickness was increased and, the mandible and lower lip were found in a less posterior position than that of the actual profiles. Comparison between computerized and manual prediction methods showed that in the manual method upper lip thickness was increased, the upper lip was found in a more anterior position and the lower anterior facial height was increased as compared to the computerized prediction method. Conclusions Cephalometric simulation of post-operative soft tissue profile following orthodontic-surgical management of mandibular prognathism imposes certain limitations related to the methods implied. However, both manual and computerized prediction methods remain a useful tool for patient communication. PMID:19212468
Macniven, J A B; Davis, C; Ho, M-Y; Bradshaw, C M; Szabadi, E; Constantinescu, C S
2008-09-01
Cognitive impairments in information processing speed, attention and executive functioning are widely reported in patients with multiple sclerosis (MS). Several studies have identified impaired performance on the Stroop test in people with MS, yet uncertainty remains over the cause of this phenomenon. In this study, 25 patients with MS were assessed with a neuropsychological test battery including a computerized Stroop test and a computerized test of information processing speed, the Graded Conditional Discrimination Tasks (GCDT). The patient group was compared with an individually age, sex and estimated premorbid IQ-matched healthy control group. The patients' reaction times (RTs) were significantly longer than those of the controls on all Stroop test trials and there was a significantly enhanced absolute (RT_incongruent − RT_neutral) and relative (100 × [RT_incongruent − RT_neutral] / RT_neutral) Stroop interference effect for the MS group. The linear function relating RT to stimulus complexity in the GCDT was significantly steeper in the patient group, indicating slowed information processing. The results are discussed with reference to the difference engine model, a theory of diversity in speeded cognition. It is concluded that, in the assessment of people with MS, great caution must be used in the interpretation of performance on neuropsychological tests which rely on RT as the primary measure.
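The two interference indices quoted above are direct arithmetic on mean reaction times, as the minimal Python computation below shows (the RT values are made up).

def stroop_interference(rt_incongruent_ms: float, rt_neutral_ms: float):
    """Absolute and relative Stroop interference, as defined in the abstract:
    absolute = RT_incongruent - RT_neutral,
    relative = 100 * (RT_incongruent - RT_neutral) / RT_neutral."""
    absolute = rt_incongruent_ms - rt_neutral_ms
    relative = 100.0 * absolute / rt_neutral_ms
    return absolute, relative

# Made-up example values: 980 ms on incongruent trials, 760 ms on neutral.
abs_int, rel_int = stroop_interference(980.0, 760.0)
print(f"absolute interference {abs_int:.0f} ms, relative {rel_int:.1f}%")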
Barker, Jocelyn; Hoogi, Assaf; Depeursinge, Adrien; Rubin, Daniel L
2016-05-01
Computerized analysis of digital pathology images offers the potential of improving clinical care (e.g. automated diagnosis) and catalyzing research (e.g. discovering disease subtypes). There are two key challenges thwarting computerized analysis of digital pathology images: first, whole slide pathology images are massive, making computerized analysis inefficient, and second, diverse tissue regions in whole slide images that are not directly relevant to the disease may mislead computerized diagnosis algorithms. We propose a method to overcome both of these challenges that utilizes a coarse-to-fine analysis of the localized characteristics in pathology images. An initial surveying stage analyzes the diversity of coarse regions in the whole slide image. This includes extraction of spatially localized features of shape, color and texture from tiled regions covering the slide. Dimensionality reduction of the features assesses the image diversity in the tiled regions and clustering creates representative groups. A second stage provides a detailed analysis of a single representative tile from each group. An Elastic Net classifier produces a diagnostic decision value for each representative tile. A weighted voting scheme aggregates the decision values from these tiles to obtain a diagnosis at the whole slide level. We evaluated our method by automatically classifying 302 brain cancer cases into two possible diagnoses (glioblastoma multiforme (N = 182) versus lower grade glioma (N = 120)) with an accuracy of 93.1% (p < 0.001). We also evaluated our method in the dataset provided for the 2014 MICCAI Pathology Classification Challenge, in which our method, trained and tested using 5-fold cross validation, produced a classification accuracy of 100% (p < 0.001). Our method showed high stability and robustness to parameter variation, with accuracy varying between 95.5% and 100% when evaluated for a wide range of parameters. Our approach may be useful to automatically differentiate between the two cancer subtypes. Copyright © 2015 Elsevier B.V. All rights reserved.
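The control flow of the described pipeline (tile the slide, extract local features, cluster tiles into representative groups, classify one tile per group, aggregate by weighted voting) maps onto standard tooling. The Python sketch below mimics it on random stand-in features using scikit-learn, with an elastic-net-penalized linear classifier in place of the paper's Elastic Net; all features, labels, and parameters are placeholders.

import numpy as np
from sklearn.cluster import KMeans
from sklearn.linear_model import SGDClassifier

rng = np.random.default_rng(3)

# Stand-in features for 200 tiles of one whole slide image (real features
# would be localized shape, color and texture descriptors per tile).
tiles = rng.normal(size=(200, 16))

# Surveying stage: cluster tiles into representative groups.
kmeans = KMeans(n_clusters=5, n_init=10, random_state=0).fit(tiles)

# Detailed stage: a logistic model with elastic-net penalty stands in for the
# paper's Elastic Net classifier; its training set here is pure placeholder.
clf = SGDClassifier(loss="log_loss", penalty="elasticnet", alpha=1e-3,
                    l1_ratio=0.5, random_state=0)
train_X = rng.normal(size=(100, 16))   # placeholder training features
train_y = rng.integers(0, 2, 100)      # placeholder labels
clf.fit(train_X, train_y)

decision_values, weights = [], []
for c in range(kmeans.n_clusters):
    members = np.flatnonzero(kmeans.labels_ == c)
    rep = tiles[members[0]]            # one representative tile per group
    decision_values.append(clf.decision_function(rep[None, :])[0])
    weights.append(len(members))       # weight each vote by group size

# Aggregation: weighted vote of per-tile decision values gives the slide call.
slide_score = np.average(decision_values, weights=weights)
print("slide-level diagnosis:", "class 1" if slide_score > 0 else "class 0")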
Aguirre-Junco, Angel-Ricardo; Colombet, Isabelle; Zunino, Sylvain; Jaulent, Marie-Christine; Leneveut, Laurence; Chatellier, Gilles
2004-01-01
The initial step in the computerization of guidelines is knowledge specification from the prose text of the guidelines. We describe a method of knowledge specification based on a structured and systematic analysis of text allowing detailed specification of a decision tree. We use decision tables to validate the decision algorithm and decision trees to specify and represent this algorithm, along with elementary messages of recommendation. Editing tools are also necessary to facilitate the process of validation and the workflow between the expert physicians who validate the specified knowledge and the computer scientists who encode it in a guideline model. Applied to eleven different guidelines issued by an official agency, the method allowed quick and valid computerization and integration into a larger decision support system called EsPeR (Personalized Estimate of Risks). The quality of the source text guidelines, however, still needs further improvement. The method used for computerization could help to define a framework usable at the initial step of guideline development in order to produce guidelines ready for electronic implementation.
Definition of Tire Properties Required for Landing System Analysis
NASA Technical Reports Server (NTRS)
Clark, S. K.; Dodge, R. N.; Luchini, J. R.
1978-01-01
The data bank constructed provided two basic advantages for the user of aircraft tire information. First, computerization of the data bank allowed mechanical property data to be stored, corrected, updated, and revised quickly and easily as more reliable tests and measurements were carried out. Second, the format of the book that can be printed from the computerized data bank can be easily adjusted to suit the needs of the users without the great expense normally associated with reprinting and editing books set by ordinary typography.
Drainage identification analysis and mapping, phase 2.
DOT National Transportation Integrated Search
2017-01-01
Drainage Identification, Analysis and Mapping System (DIAMS) is a computerized database that captures and stores relevant information associated with all aboveground and underground hydraulic structures belonging to the New Jersey Department of T...
GIS-Based crash referencing and analysis system
DOT National Transportation Integrated Search
1999-02-01
One area where a Geographic Information System (GIS) has yet to be extensively applied is in the analysis of crash data. Computerized crash analysis systems in which crash data, roadway inventory data, and traffic operations data can be merged are us...
Gottschalk, Louis A; DeFrancisco, Don; Bechtel, Robert J
2002-08-01
The aim of this study was to test the validity of a computer software program previously demonstrated to be capable of making DSM-IV neuropsychiatric diagnoses from the content analysis of speech or verbal texts. In this report, the computer program was applied to three personal writings of Napoleon Bonaparte when he was 12 to 16 years of age. The accuracy of the neuropsychiatric evaluations derived from the computerized content analysis of these writings of Napoleon was independently corroborated by two biographers who have described pertinent details concerning his life situations, moods, and other emotional reactions during this adolescent period of his life. The relevance of this type of computer technology to psychohistorical research and clinical psychiatry is suggested.
The computerized OMAHA system in Microsoft Office Excel.
Lai, Xiaobin; Wong, Frances K Y; Zhang, Peiqiang; Leung, Carenx W Y; Lee, Lai H; Wong, Jessica S Y; Lo, Yim F; Ching, Shirley S Y
2014-01-01
The OMAHA System was adopted as the documentation system in an interventional study. To systematically record client care and facilitate data analysis, two Office Excel files were developed. The first Excel file (File A) was designed to record problems, care procedures, and outcomes for individual clients according to the OMAHA System. It was used by the intervention nurses in the study. The second Excel file (File B) was a summary of all clients, automatically extracted from File A. Data in File B can be analyzed directly in Excel or imported into PASW for further analysis. Both files have four parts to record basic information and the three parts of the OMAHA System. The computerized OMAHA System simplified the documentation procedure and facilitated the management and analysis of data.
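The two-file design (per-client workbooks feeding an automatically extracted summary) can be emulated with pandas. In the Python sketch below, hypothetical per-client workbooks are concatenated into a summary table ready for analysis or export; the directory, sheet name, and column names are invented for illustration.

from pathlib import Path
import pandas as pd

# Hypothetical per-client workbooks following the File A layout: one sheet of
# OMAHA problem records per client. Names and columns are invented.
client_files = sorted(Path("clients").glob("client_*.xlsx"))
if not client_files:
    raise SystemExit("no client workbooks found in ./clients")

frames = []
for path in client_files:
    df = pd.read_excel(path, sheet_name="problems")  # hypothetical sheet name
    df["client_id"] = path.stem  # remember which client each row came from
    frames.append(df)

# File B equivalent: one summary table across all clients, ready for direct
# analysis in pandas or export for SPSS/PASW.
summary = pd.concat(frames, ignore_index=True)
summary.to_excel("summary.xlsx", index=False)
print(summary.groupby("problem")["outcome_score"].mean())  # hypothetical columns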
DOE Office of Scientific and Technical Information (OSTI.GOV)
Martins de Oliveira, Jose Jr.; Germano Martins, Antonio Cesar
X-ray computed tomography (CT) refers to the cross-sectional imaging of an object by measuring the transmitted radiation at different directions. In this work, we describe a non-conventional application of computerized tomography: visualization and improvements in the understanding of some internal structural features of solid dosage forms. A micro-CT X-ray scanner, with a minimum resolution of 30 µm, was used to characterize some pharmaceutical tablets, granules, a controlled-release osmotic tablet and liquid-filled soft-gelatin capsules. The analyses presented in this work are essentially qualitative, but quantitative parameters, such as porosity, density distribution, tablet dimensions, etc., could also be obtained using the related CT techniques.
[Complex automatic data processing in multi-profile hospitals].
Dovzhenko, Iu M; Panov, G D
1990-01-01
The computerization of data processing in multi-disciplinary hospitals is the key factor in raising the quality of medical care provided to the population, intensifying the work of the personnel, improving the curative and diagnostic process and the use of resources. Even limited experience with complex computerization at the Botkin Hospital indicates that, owing to the automated system, the quality of data processing is being improved, a high level of patient examination is being provided, speedy training of young specialists is being achieved, and conditions are being created for the continuing education of physicians through the analysis of their own activity. At large hospitals, a complex solution of administrative and curative-diagnostic tasks on the basis of a hospital-wide display network and a hospital-wide data bank is the most promising form of computerization.
Effect of gender on computerized electrocardiogram measurements in college athletes.
Mandic, Sandra; Fonda, Holly; Dewey, Frederick; Le, Vy-van; Stein, Ricardo; Wheeler, Matt; Ashley, Euan A; Myers, Jonathan; Froelicher, Victor F
2010-06-01
Broad criteria for classifying an electrocardiogram (ECG) as abnormal and requiring additional testing prior to participating in competitive athletics have been recommended for the preparticipation examination (PPE) of athletes. Because these criteria have not considered gender differences, we examined the effect of gender on the computerized ECG measurements obtained on Stanford student athletes. Currently available computer programs require a basis for "normal" in athletes of both genders to provide reliable interpretation. During the 2007 PPE, computerized ECGs were recorded and analyzed on 658 athletes (54% male; mean age, 19 +/- 1 years) representing 22 sports. Electrocardiogram measurements included intervals and durations in all 12 leads to calculate 12-lead voltage sums, QRS amplitude and QRS area, spatial vector length (SVL), and the sum of the R wave in V5 and S wave in V2 (RSsum). By computer analysis, male athletes had significantly greater QRS duration, PR interval, Q-wave duration, J-point amplitude, and T-wave amplitude, and shorter QTc interval compared with female athletes (all P < 0.05). All ECG indicators of left ventricular electrical activity were significantly greater in males. Although gender was consistently associated with indices of atrial and ventricular electrical activity in multivariable analysis, ECG measurements correlated poorly with body dimensions. Significant gender differences exist in ECG measurements of college athletes that are not explained by differences in body size. Our tables of "normal" computerized gender-specific measurements can facilitate the development of automated ECG interpretation for screening young athletes.
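The voltage indices named in this abstract are simple arithmetic combinations of per-lead measurements; the abstract itself defines RSsum as the sum of the R wave in V5 and the S wave in V2. A minimal sketch with illustrative units and argument names (the study's exact measurement conventions are not given):

```python
# Sketch: derived ECG voltage indices of the kind used in the study, computed
# from per-lead R- and S-wave amplitudes (assumed here to be in mV).
import math

def rs_sum(r_v5_mv: float, s_v2_mv: float) -> float:
    """RSsum as defined in the abstract: R(V5) + S(V2)."""
    return r_v5_mv + abs(s_v2_mv)

def twelve_lead_voltage_sum(qrs_amplitudes_mv: dict) -> float:
    """Sum of absolute QRS amplitudes over all 12 leads."""
    return sum(abs(v) for v in qrs_amplitudes_mv.values())

def spatial_vector_length(x_mv: float, y_mv: float, z_mv: float) -> float:
    """Spatial vector length (SVL) from orthogonal components."""
    return math.sqrt(x_mv**2 + y_mv**2 + z_mv**2)

print(rs_sum(2.1, 1.4))  # 3.5 mV
```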
Patel, Minal R; Vichich, Jennifer; Lang, Ian; Lin, Jessica; Zheng, Kai
2017-04-01
The introduction of health information technology systems, electronic health records in particular, is changing the nature of how clinicians interact with patients. Lack of knowledge remains on how best to integrate such systems in the exam room. The purpose of this systematic review was to (1) distill "best" behavioral and communication practices recommended in the literature for clinicians when interacting with patients in the presence of computerized systems during a clinical encounter, (2) weigh the evidence of each recommendation, and (3) rank evidence-based recommendations for electronic health record communication training initiatives for clinicians. We conducted a literature search of 6 databases, resulting in 52 articles included in the analysis. We extracted information such as study setting, research design, sample, findings, and implications. Recommendations were distilled based on consistent support for behavioral and communication practices across studies. Eight behavioral and communication practices received strong support of evidence in the literature and included specific aspects of using computerized systems to facilitate conversation and transparency in the exam room, such as spatial (re)organization of the exam room, maintaining nonverbal communication, and specific techniques that integrate the computerized system into the visit and engage the patient. Four practices, although patient-centered, have received insufficient evidence to date. We developed an evidence base of best practices for clinicians to maintain patient-centered communications in the presence of computerized systems in the exam room. Further work includes development and empirical evaluation of evidence-based guidelines to better integrate computerized systems into clinical care. © The Author 2016. Published by Oxford University Press on behalf of the American Medical Informatics Association. All rights reserved. For Permissions, please email: journals.permissions@oup.com
Reliability, validity and sensitivity of a computerized visual analog scale measuring state anxiety.
Abend, Rany; Dan, Orrie; Maoz, Keren; Raz, Sivan; Bar-Haim, Yair
2014-12-01
Assessment of state anxiety is frequently required in clinical and research settings, but its measurement using standard multi-item inventories entails practical challenges. Such inventories are increasingly complemented by paper-and-pencil, single-item visual analog scales measuring state anxiety (VAS-A), which allow rapid assessment of current anxiety states. Computerized versions of VAS-A offer additional advantages, including facilitated and accurate data collection and analysis, and applicability to computer-based protocols. Here, we establish the psychometric properties of a computerized VAS-A. Experiment 1 assessed the reliability, convergent validity, and discriminant validity of the computerized VAS-A in a non-selected sample. Experiment 2 assessed its sensitivity to increases in state anxiety following social stress induction, in participants with high levels of social anxiety. Experiment 1 demonstrated the computerized VAS-A's test-retest reliability (r = .44, p < .001); convergent validity with the State-Trait Anxiety Inventory's state subscale (STAI-State; r = .60, p < .001); and discriminant validity as indicated by significantly lower correlations between VAS-A and different psychological measures relative to the correlation between VAS-A and STAI-State. Experiment 2 demonstrated the VAS-A's sensitivity to changes in state anxiety via a significant pre- to during-stressor rise in VAS-A scores (F(1,48) = 25.13, p < .001). Limitations include the set-order administration of measures, the absence of a clinically anxious population, and gender-unbalanced samples. The adequate psychometric characteristics, combined with simple and rapid administration, make the computerized VAS-A a valuable self-rating tool for state anxiety. It may prove particularly useful for clinical and research settings where multi-item inventories are less applicable, including computer-based treatment and assessment protocols. The VAS-A is freely available: http://people.socsci.tau.ac.il/mu/anxietytrauma/visual-analog-scale/. Copyright © 2014 Elsevier Ltd. All rights reserved.
The effectiveness of computerized drug-lab alerts: a systematic review and meta-analysis.
Bayoumi, Imaan; Al Balas, Mosab; Handler, Steven M; Dolovich, Lisa; Hutchison, Brian; Holbrook, Anne
2014-06-01
Inadequate lab monitoring of drugs is a potential, and remediable, cause of adverse drug events (ADEs). The objective was to determine the effectiveness of computerized drug-lab alerts in improving medication-related outcomes. Citations were drawn from the Computerized Clinical Decision Support System Systematic Review (CCDSSR) and MMIT (Medications Management through Health Information Technology) databases, which had searched MEDLINE, EMBASE, CINAHL, the Cochrane Database of Systematic Reviews, and International Pharmaceutical Abstracts from 1974 to March 27, 2013. Randomized controlled trials (RCTs) of clinician-targeted computerized drug-lab alerts conducted in any healthcare setting were eligible. Two reviewers performed full-text review to determine study eligibility. A single reviewer abstracted data and evaluated the validity of included studies using Cochrane handbook domains. Thirty-six studies met the inclusion criteria (25 single-drug studies with 22,504 participants, 14 targeting anticoagulation; 11 multi-drug studies with 56,769 participants). ADEs were reported as an outcome in only four trials, all targeting anticoagulants. Computerized drug-lab alerts did not reduce ADEs (OR 0.89, 95% CI 0.79-1.00, p=0.05), length of hospital stay (SMD 0.00, 95% CI -0.93 to 0.93, p=0.055, 1 study), likelihood of hypoglycemia (OR 1.29, 95% CI 0.31-5.37) or likelihood of bleeding, but were associated with increased likelihood of prescribing changes (OR 1.73, 95% CI 1.21-2.47) or lab monitoring (OR 1.47, 95% confidence interval 1.12-1.94) in accordance with the alert. There is no evidence that computerized drug-lab alerts are associated with important clinical benefits, but there is evidence of improvement in selected clinical surrogate outcomes (time in therapeutic range for vitamin K antagonists), and changes in process outcomes (lab monitoring and prescribing decisions). Copyright © 2014 Elsevier Ireland Ltd. All rights reserved.
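Summary estimates such as "OR 0.89, 95% CI 0.79-1.00" come from inverse-variance pooling of study effects on the log-odds scale. A minimal fixed-effect sketch with invented study data (the review's actual synthesis may well have used random-effects models):

```python
# Sketch: fixed-effect, inverse-variance pooling of study odds ratios.
import math

def pool_odds_ratios(ors, cis):
    """ors: list of study ORs; cis: list of (lower, upper) 95% CIs."""
    weights, log_ors = [], []
    for or_, (lo, hi) in zip(ors, cis):
        se = (math.log(hi) - math.log(lo)) / (2 * 1.96)  # SE from CI width
        weights.append(1.0 / se**2)                      # inverse-variance weight
        log_ors.append(math.log(or_))
    pooled = sum(w * y for w, y in zip(weights, log_ors)) / sum(weights)
    se_pooled = math.sqrt(1.0 / sum(weights))
    ci = (math.exp(pooled - 1.96 * se_pooled), math.exp(pooled + 1.96 * se_pooled))
    return math.exp(pooled), ci

# Invented two-study example:
print(pool_odds_ratios([0.85, 0.95], [(0.70, 1.03), (0.80, 1.13)]))
```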
DOE Office of Scientific and Technical Information (OSTI.GOV)
Lu, W; Wang, J; Zhang, H
Purpose: To review the literature on using computerized PET/CT image analysis for the evaluation of tumor response to therapy. Methods: We reviewed and summarized more than 100 papers that used computerized image analysis techniques for the evaluation of tumor response with PET/CT. This review mainly covered four aspects: image registration, tumor segmentation, image feature extraction, and response evaluation. Results: Although rigid image registration is straightforward, it has been shown to achieve good alignment between baseline and evaluation scans. Deformable image registration has been shown to improve the alignment when complex deformable distortions occur due to tumor shrinkage, weight loss or gain, and motion. Many semi-automatic tumor segmentation methods have been developed for PET. A comparative study revealed benefits of high levels of user interaction with simultaneous visualization of CT images and PET gradients. On CT, semi-automatic methods have been developed only for tumors that show a marked difference in CT attenuation between the tumor and the surrounding normal tissues. Quite a few multi-modality segmentation methods have been shown to improve accuracy compared to single-modality algorithms. Advanced PET image features considering spatial information, such as tumor volume, tumor shape, total glycolytic volume, histogram distance, and texture features, have been found more informative than the traditional SUVmax for the prediction of tumor response. Advanced CT features, including volumetric, attenuation, morphologic, structure, and texture descriptors, have also been found advantageous over the traditional RECIST and WHO criteria in certain tumor types. Predictive models based on machine learning techniques have been constructed for correlating selected image features to response. These models showed improved performance compared to current methods using a cutoff value of a single measurement for tumor response. Conclusion: This review showed that computerized PET/CT image analysis holds great potential to improve the accuracy of tumor response evaluation. This work was supported in part by the National Cancer Institute Grant R01CA172638.
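To make the feature-extraction step concrete, here is a minimal sketch of a few volume/histogram PET features of the kind the review contrasts with the traditional SUVmax, computed from a 3-D SUV array and a binary tumor mask. Feature names follow common usage; the data here are synthetic, not from any study in the review.

```python
# Sketch: first-order PET features from a 3-D SUV array and a tumor mask.
import numpy as np

def pet_features(suv: np.ndarray, mask: np.ndarray, voxel_volume_ml: float) -> dict:
    vals = suv[mask.astype(bool)]
    mtv_ml = vals.size * voxel_volume_ml              # metabolic tumor volume
    hist, _ = np.histogram(vals, bins=32)
    p = hist / hist.sum()
    entropy = float(-np.sum(p * np.log2(p + 1e-12)))  # histogram texture feature
    return {"SUVmax": float(vals.max()),
            "SUVmean": float(vals.mean()),
            "MTV_ml": float(mtv_ml),
            "TLG": float(vals.mean() * mtv_ml),       # total lesion glycolysis
            "entropy": entropy}

rng = np.random.default_rng(0)
suv = rng.gamma(2.0, 1.5, size=(20, 20, 20))          # synthetic SUV volume
mask = suv > 4.0                                      # toy "tumor" mask
print(pet_features(suv, mask, voxel_volume_ml=0.064))
```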
Howell, David R; Osternig, Louis R; Chou, Li-Shan
2018-02-16
To examine the acute (within 72h of injury) and long-term (2mo postinjury) independent associations between objective dual-task gait balance and neurocognitive measurements among adolescents and young adults with a concussion and matched controls. Longitudinal case-control. Motion analysis laboratory. A total of 95 participants completed the study: 51 who sustained a concussion (mean age, 17.5±3.3y; 71% men) and 44 controls (mean age, 17.7±2.9y; 72% men). Participants who sustained a concussion underwent a dual-task gait analysis and computerized neurocognitive testing within 72 hours of injury and again 2 months later. Uninjured controls completed the same test protocol at similar time intervals. Not applicable. We compared dual-task gait balance control and computerized neurocognitive test performance between groups using independent samples t tests. Multivariable binary logistic regression models were then constructed for each testing time to determine the association between group membership (concussion vs control), dual-task gait balance control, and neurocognitive function. Medial-lateral center-of-mass displacement during dual-task gait was independently associated with group membership at the initial test (adjusted odds ratio [aOR], 2.432; 95% confidence interval [CI], 1.269-4.661) and the 2-month follow-up test (aOR, 1.817; 95% CI, 1.014-3.256). Visual memory composite scores were significantly associated with group membership at the initial postinjury time point (aOR, .953; 95% CI, .833-.998). However, the combination of computerized neurocognitive test variables did not predict dual-task gait balance control for participants with concussion, and no single neurocognitive variable was associated with dual-task gait balance control at either testing time. Dual-task assessments concurrently evaluating gait and cognitive performance may allow for the detection of persistent deficits beyond those detected by computerized neurocognitive testing alone. Copyright © 2018 American Congress of Rehabilitation Medicine. Published by Elsevier Inc. All rights reserved.
A TIERED APPROACH TO PERFORMING UNCERTAINTY ANALYSIS IN CONDUCTING EXPOSURE ANALYSIS FOR CHEMICALS
The WHO/IPCS draft Guidance Document on Characterizing and Communicating Uncertainty in Exposure Assessment provides guidance on recommended strategies for conducting uncertainty analysis as part of human exposure analysis. Specifically, a tiered approach to uncertainty analysis ...
Jeffries, B F; Tarlton, M; De Smet, A A; Dwyer, S J; Brower, A C
1980-02-01
A computer program was created to identify and accept spatial data regarding the location of the thoracic and lumbar vertebral bodies on scoliosis films. With this information, the spine can be mathematically reconstructed and a scoliotic angle calculated. There was a 0.968 positive correlation between the computer and manual methods of measuring scoliosis. The computer method was more reproducible with a standard deviation of only 1.3 degrees. Computerized measurement of scoliosis also provides better evaluation of the true shape of the curve.
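The 1980 paper does not spell out its mathematical reconstruction, so the following is only one plausible sketch of computing a scoliotic angle from digitized vertebral body locations: estimate the tangent of the spinal curve at each vertebra and take the angle between the most-tilted tangents, Cobb-style.

```python
# Sketch: a Cobb-like scoliotic angle from ordered vertebral centroids.
# Illustrative method, not the original program's actual algorithm.
import numpy as np

def scoliotic_angle(centroids_xy: np.ndarray) -> float:
    """centroids_xy: (n, 2) array of vertebral centroids, ordered T1..L5."""
    d = np.gradient(centroids_xy, axis=0)              # finite-difference tangents
    angles = np.degrees(np.arctan2(d[:, 0], d[:, 1]))  # tilt vs cranio-caudal axis
    return float(angles.max() - angles.min())          # angle between extreme tangents

# Example: a mild synthetic lateral curve over 17 vertebrae
spine = np.column_stack([10 * np.sin(np.linspace(0, np.pi, 17)),
                         np.linspace(0, 400, 17)])
print(round(scoliotic_angle(spine), 1), "degrees")
```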
Gomes, Manuel; Aldridge, Robert W; Wylie, Peter; Bell, James; Epstein, Owen
2013-04-01
When symptomatic gastroenterology patients have an indication for colonic imaging, clinicians have a choice between optical colonoscopy (OC) and computerized tomography colonography with three-dimensional reconstruction (3-D CTC). 3-D CTC provides a minimally invasive and rapid evaluation of the entire colon, and it can be an efficient modality for diagnosing symptoms. It allows for a more targeted use of OC, which is associated with a higher risk of major adverse events and higher procedural costs. A case can be made for 3-D CTC as a primary test for colonic imaging followed if necessary by targeted therapeutic OC; however, the relative long-term costs and benefits of introducing 3-D CTC as a first-line investigation are unknown. The aim of this study was to assess the cost effectiveness of 3-D CTC versus OC for colonic imaging of symptomatic gastroenterology patients in the UK NHS. We used a Markov model to follow a cohort of 100,000 symptomatic gastroenterology patients, aged 50 years or older, and estimate the expected lifetime outcomes, life years (LYs) and quality-adjusted life years (QALYs), and costs (£, 2010-2011) associated with 3-D CTC and OC. Sensitivity analyses were performed to assess the robustness of the base-case cost-effectiveness results to variation in input parameters and methodological assumptions. 3D-CTC provided a similar number of LYs (7.737 vs 7.739) and QALYs (7.013 vs 7.018) per individual compared with OC, and it was associated with substantially lower mean costs per patient (£467 vs £583), leading to a positive incremental net benefit. After accounting for the overall uncertainty, the probability of 3-D CTC being cost effective was around 60 %, at typical willingness-to-pay values of £20,000-£30,000 per QALY gained. 3-D CTC is a cost-saving and cost-effective option for colonic imaging of symptomatic gastroenterology patients compared with OC.
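A Markov cohort model of the kind described advances the cohort through health states cycle by cycle, accumulating discounted costs and QALYs. A minimal sketch with invented states, transition probabilities, costs, and utilities (none of these are the study's inputs):

```python
# Sketch: a toy Markov cohort model projecting lifetime costs and QALYs.
import numpy as np

states = ["well", "disease", "dead"]
P = np.array([[0.96, 0.03, 0.01],       # annual transition probabilities
              [0.00, 0.90, 0.10],
              [0.00, 0.00, 1.00]])
cost = np.array([100.0, 2000.0, 0.0])   # £ per state-year (illustrative)
utility = np.array([0.90, 0.70, 0.0])   # QALY weight per state-year

cohort = np.array([100_000.0, 0.0, 0.0])
total_cost = total_qaly = 0.0
discount = 0.035
for year in range(50):                  # ~lifetime horizon
    df = 1.0 / (1.0 + discount) ** year
    total_cost += df * cohort @ cost
    total_qaly += df * cohort @ utility
    cohort = cohort @ P

print(dict(zip(states, cohort.round(0))))
print(f"mean cost/patient: £{total_cost/100_000:.0f}, "
      f"mean QALYs/patient: {total_qaly/100_000:.3f}")
```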
Duarte, A; Walker, S; Littlewood, E; Brabyn, S; Hewitt, C; Gilbody, S; Palmer, S
2017-07-01
Computerized cognitive-behavioural therapy (cCBT) forms a core component of stepped psychological care for depression. Existing evidence for cCBT has been informed by developer-led trials. This is the first study based on a large independent pragmatic trial to assess the cost-effectiveness of cCBT as an adjunct to usual general practitioner (GP) care compared with usual GP care alone and to establish the differential cost-effectiveness of a free-to-use cCBT programme (MoodGYM) in comparison with a commercial programme (Beating the Blues) in primary care. Costs were estimated from a healthcare perspective and outcomes measured using quality-adjusted life years (QALYs) over 2 years. The incremental cost-effectiveness of each cCBT programme was compared with usual GP care. Uncertainty was estimated using probabilistic sensitivity analysis and scenario analyses were performed to assess the robustness of results. Neither cCBT programme was found to be cost-effective compared with usual GP care alone. At a £20 000 per QALY threshold, usual GP care alone had the highest probability of being cost-effective (0.55) followed by MoodGYM (0.42) and Beating the Blues (0.04). Usual GP care alone was also the cost-effective intervention in the majority of scenario analyses. However, the magnitude of the differences in costs and QALYs between all groups appeared minor (and non-significant). Technically supported cCBT programmes do not appear any more cost-effective than usual GP care alone. No cost-effective advantage of the commercially developed cCBT programme was evident compared with the free-to-use cCBT programme. Current UK practice recommendations for cCBT may need to be reconsidered in the light of the results.
Computerized system for assessing heart rate variability.
Frigy, A; Incze, A; Brânzaniuc, E; Cotoi, S
1996-01-01
The principal theoretical, methodological and clinical aspects of heart rate variability (HRV) analysis are reviewed. This method has been developed over the last 10 years as a useful noninvasive means of measuring the activity of the autonomic nervous system. The main components and the functioning of the computerized rhythm-analyzer system developed by our team are presented. The system is able to perform short-term (maximum 20 minutes) time-domain HRV analysis and statistical analysis of the ventricular rate in any rhythm, particularly in atrial fibrillation. The performance of our system is demonstrated using graphics (RR histograms, delta-RR histograms, RR scattergrams) and statistical parameters resulting from the processing of three ECG recordings, obtained from a normal subject, from a patient with advanced heart failure, and from a patient with atrial fibrillation.
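Short-term time-domain HRV analysis reduces a sequence of RR intervals to a handful of statistics. A minimal sketch of the standard measures (the paper's own parameter set is not specified beyond histograms and scattergrams):

```python
# Sketch: standard time-domain HRV statistics from RR intervals (ms).
import numpy as np

def time_domain_hrv(rr_ms: np.ndarray) -> dict:
    diffs = np.diff(rr_ms)
    return {
        "mean_RR_ms": float(rr_ms.mean()),
        "SDNN_ms": float(rr_ms.std(ddof=1)),            # overall variability
        "RMSSD_ms": float(np.sqrt(np.mean(diffs**2))),  # beat-to-beat variability
        "pNN50_pct": float(100.0 * np.mean(np.abs(diffs) > 50)),
    }

rr = np.random.default_rng(0).normal(800, 40, 1200)  # ~16 min of synthetic beats
print(time_domain_hrv(rr))
```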
Richard's, María M; Introzzi, Isabel; Zamora, Eliana; Vernucci, Santiago
2017-01-01
Inhibition is one of the main executive functions, because of its fundamental role in cognitive and social development. Given the importance of reliable, computerized measures for assessing inhibitory performance, this research analyzes the internal and external validity of a computerized conjunction search task designed to evaluate the role of perceptual inhibition. A sample of 41 children (21 girls and 20 boys) aged 6 to 11 years (M = 8.49, SD = 1.47), intentionally selected from a privately managed school of middle socioeconomic level in Mar del Plata (Argentina), was assessed. The Conjunction Search Task from the TAC Battery and the Coding and Symbol Search tasks from the Wechsler Intelligence Scale for Children were used. Overall, the results confirm that the perceptual inhibition task from the TAC presents solid indices of internal and external validity, making it a valid instrument for measuring this process.
Computerization of Mental Health Integration Complexity Scores at Intermountain Healthcare
Oniki, Thomas A.; Rodrigues, Drayton; Rahman, Noman; Patur, Saritha; Briot, Pascal; Taylor, David P.; Wilcox, Adam B.; Reiss-Brennan, Brenda; Cannon, Wayne H.
2014-01-01
Intermountain Healthcare’s Mental Health Integration (MHI) Care Process Model (CPM) contains formal scoring criteria for assessing a patient’s mental health complexity as “mild,” “medium,” or “high” based on patient data. The complexity score attempts to assist Primary Care Physicians in assessing the mental health needs of their patients and what resources will need to be brought to bear. We describe an effort to computerize the scoring. Informatics and MHI personnel collaboratively and iteratively refined the criteria to make them adequately explicit and reflective of MHI objectives. When tested on retrospective data of 540 patients, the clinician agreed with the computer’s conclusion in 52.8% of the cases (285/540). We considered the analysis sufficiently successful to begin piloting the computerized score in prospective clinical care. So far in the pilot, clinicians have agreed with the computer in 70.6% of the cases (24/34). PMID:25954401
Using GIS in the Analysis of Truck Crashes, Summary Report
DOT National Transportation Integrated Search
1999-06-01
Computerized crash analysis systems in which crash data, roadway inventory data, and traffic operations data can be merged are used in many States and municipalities to identify problem locations and assess the effectiveness of implemented countermea...
Computerized bone analysis of hand radiographs
NASA Astrophysics Data System (ADS)
Pietka, Ewa; McNitt-Gray, Michael F.; Hall, Theodore R.; Huang, H. K.
1992-06-01
A computerized approach to the problem of skeletal maturity assessment is presented. The analysis of a computed radiography (CR) hand image yields features that can be used to assess the skeletal age of pediatric patients. It is performed on a standard left hand radiograph. First, epiphyseal regions of interest (EROI) are located. Then, within each EROI the distal, middle, and proximal phalanges are separated. This serves as a basis for locating the extremities of the epiphyses and metaphyses. Next, the diameters of the epiphyses and metaphyses are calculated. Finally, the ratio of epiphyseal diameter to metaphyseal diameter is calculated. A pilot study indicated that these features are sensitive to changes in the anatomical structure of a growing hand and can be used in skeletal age assessment.
Barlough, J E; Jacobson, R H; Downing, D R; Lynch, T J; Scott, F W
1987-01-01
The computer-assisted, kinetics-based enzyme-linked immunosorbent assay for coronavirus antibodies in cats was calibrated to the conventional indirect immunofluorescence assay by linear regression analysis and computerized interpolation (generation of "immunofluorescence assay-equivalent" titers). Procedures were developed for normalization and standardization of kinetics-based enzyme-linked immunosorbent assay results through incorporation of five different control sera of predetermined ("expected") titer in daily runs. When used with such sera and with computer assistance, the kinetics-based enzyme-linked immunosorbent assay minimized both within-run and between-run variability while allowing also for efficient data reduction and statistical analysis and reporting of results. PMID:3032390
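The calibration described, linear regression plus computerized interpolation to "immunofluorescence assay-equivalent" titers, can be sketched as follows. The control-serum data here are invented for illustration; the original assay's rates and titer series are not reproduced in the abstract.

```python
# Sketch: regressing log IFA titer on kinetic ELISA rate, then interpolating
# an "IFA-equivalent" titer for a new specimen.
import numpy as np

# Control sera: kinetic ELISA rates vs. known ("expected") IFA titers
elisa_rate = np.array([0.08, 0.15, 0.27, 0.41, 0.55])   # invented readings
ifa_titer = np.array([25, 100, 400, 800, 1600])         # invented titer series

slope, intercept = np.polyfit(elisa_rate, np.log10(ifa_titer), 1)

def ifa_equivalent_titer(rate: float) -> float:
    """Interpolate an IFA-equivalent titer for a new specimen's ELISA rate."""
    return float(10 ** (intercept + slope * rate))

print(round(ifa_equivalent_titer(0.30)))
```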
Item Analysis in Introductory Economics Testing.
ERIC Educational Resources Information Center
Tinari, Frank D.
1979-01-01
Computerized analysis of multiple choice test items is explained. Examples of item analysis applications in the introductory economics course are discussed with respect to three objectives: to evaluate learning; to improve test items; and to help improve classroom instruction. Problems, costs and benefits of the procedures are identified. (JMD)
Automation of scour analysis at Louisiana bridge sites : final report.
DOT National Transportation Integrated Search
1988-12-01
The computerized system for the organization, analysis, and display of field collected scour data is described. This system will enhance the current manual procedure of accomplishing these tasks. The system accepts input from the user, and based on u...
ERIC Educational Resources Information Center
Ma, T. S.; Gutterson, Milton
1980-01-01
Reviews general developments in computerization and data processing of organic elemental analyses; carbon, hydrogen, and nitrogen analyzers; and procedures for determining oxygen, sulfur, and halogens, as well as other nonmetallic elements and organometallics. Selected papers on trace analysis of nonmetals and determination of metallic elements are…
Uncertainty Budget Analysis for Dimensional Inspection Processes (U)
DOE Office of Scientific and Technical Information (OSTI.GOV)
Valdez, Lucas M.
2012-07-26
This paper is intended to provide guidance on preparing an uncertainty analysis of a dimensional inspection process through an uncertainty budget analysis. The uncertainty analysis follows the same methodology as the ISO GUM standard for calibration and testing. A specific distinction is drawn between how Type A and Type B uncertainty analyses are used in general and in specific processes. All theory and applications are utilized to represent both a generalized approach to estimating measurement uncertainty and how to report and present these estimates for dimensional measurements in a dimensional inspection process. The analysis of this uncertainty budget shows that a well-controlled dimensional inspection process produces a conservative process uncertainty, which can be attributed to the assumptions necessarily in place for best possible results.
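A GUM-style uncertainty budget combines Type A components (evaluated statistically from repeated observations) and Type B components (evaluated from assumed distributions) in root-sum-square fashion. A minimal sketch with illustrative values, not the report's actual budget:

```python
# Sketch: a toy GUM-style uncertainty budget for a dimensional measurement.
import math
import statistics

readings_mm = [10.0012, 10.0009, 10.0014, 10.0011, 10.0010]

# Type A: standard uncertainty of the mean from repeated observations
u_repeat = statistics.stdev(readings_mm) / math.sqrt(len(readings_mm))

# Type B: gauge resolution of 0.0005 mm, rectangular distribution (assumed)
u_resolution = 0.0005 / math.sqrt(3)
# Type B: calibration certificate quoting U = 0.0008 mm at k = 2 (assumed)
u_calibration = 0.0008 / 2

u_combined = math.sqrt(u_repeat**2 + u_resolution**2 + u_calibration**2)
U_expanded = 2 * u_combined  # coverage factor k = 2 (~95 % coverage)

print(f"u_c = {u_combined:.6f} mm, U (k=2) = {U_expanded:.6f} mm")
```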
NASA Technical Reports Server (NTRS)
1985-01-01
"Elizabeth," a computerized beauty analysis system marketed by Elizabeth Arden, is based upon an ARAC data base search of technologies measuring skin profiles. * "Elizabeth" is no longer commercially available.
Computerized analysis of fetal heart rate variability signal during the stages of labor.
Annunziata, Maria Laura; Tagliaferri, Salvatore; Esposito, Francesca Giovanna; Giuliano, Natascia; Mereghini, Flavia; Di Lieto, Andrea; Campanile, Marta
2016-03-01
To analyze computerized cardiotocographic (cCTG) parameters (baseline fetal heart rate, baseline FHR; short-term variability, STV; approximate entropy, ApEn; low frequency, LF; movement frequency, MF; high frequency, HF) in physiological pregnancy in order to correlate them with the stages of labor. This could provide more information for understanding the mechanisms of nervous system control of FHR during labor progression. A total of 534 pregnant women were monitored on cCTG from the 37th week, before the onset of spontaneous labor and during the first and second stages of labor. Statistical analysis was performed using the Kruskal-Wallis test and the Wilcoxon rank-sum test with the Bonferroni-adjusted α (< 0.05). Statistically significant differences across pre-labor and the first and second stages of labor were seen for baseline FHR, MF, and HF (P < 0.001): the first two decreased and the third increased as labor progressed. Differences between some of the stages were also found for ApEn, LF, and LF/(HF + MF): the first and third decreased and the second increased. cCTG modifications during labor may reflect the physiological increase in autonomic nervous system activation. Using computerized fetal heart rate analysis during labor, it may be possible to obtain more information from the fetal cardiac signal than with the traditional tracing. © 2016 Japan Society of Obstetrics and Gynecology.
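Approximate entropy, one of the cCTG parameters above, quantifies the regularity of the FHR series. A standard implementation sketch follows; the embedding dimension and tolerance are typical defaults, not necessarily the study's settings.

```python
# Sketch: approximate entropy (ApEn) of a heart rate series.
import numpy as np

def apen(x: np.ndarray, m: int = 2, r_frac: float = 0.2) -> float:
    x = np.asarray(x, dtype=float)
    r = r_frac * x.std()                       # tolerance as fraction of SD
    def phi(m):
        n = len(x) - m + 1
        emb = np.array([x[i:i + m] for i in range(n)])
        # fraction of templates within tolerance r (Chebyshev distance)
        c = [(np.max(np.abs(emb - row), axis=1) <= r).mean() for row in emb]
        return np.mean(np.log(c))
    return float(phi(m) - phi(m + 1))

fhr = 140 + np.cumsum(np.random.default_rng(1).normal(0, 0.5, 600))  # synthetic FHR
print(round(apen(fhr), 3))
```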
11 CFR 9033.12 - Production of computerized information.
Code of Federal Regulations, 2011 CFR
2011-01-01
... magnetic media, such as magnetic tapes or magnetic diskettes, containing the computerized information at.... The computerized magnetic media shall be prepared and delivered at the committee's expense and shall... Commission's Computerized Magnetic Media Requirements for title 26 Candidates/Committees Receiving Federal...
Representation of analysis results involving aleatory and epistemic uncertainty.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Johnson, Jay Dean; Helton, Jon Craig; Oberkampf, William Louis
2008-08-01
Procedures are described for the representation of results in analyses that involve both aleatory uncertainty and epistemic uncertainty, with aleatory uncertainty deriving from an inherent randomness in the behavior of the system under study and epistemic uncertainty deriving from a lack of knowledge about the appropriate values to use for quantities that are assumed to have fixed but poorly known values in the context of a specific study. Aleatory uncertainty is usually represented with probability and leads to cumulative distribution functions (CDFs) or complementary cumulative distribution functions (CCDFs) for analysis results of interest. Several mathematical structures are available for the representation of epistemic uncertainty, including interval analysis, possibility theory, evidence theory and probability theory. In the presence of epistemic uncertainty, there is not a single CDF or CCDF for a given analysis result. Rather, there is a family of CDFs and a corresponding family of CCDFs that derive from epistemic uncertainty and have an uncertainty structure that derives from the particular uncertainty structure (i.e., interval analysis, possibility theory, evidence theory, probability theory) used to represent epistemic uncertainty. Graphical formats for the representation of epistemic uncertainty in families of CDFs and CCDFs are investigated and presented for the indicated characterizations of epistemic uncertainty.
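When epistemic uncertainty is itself represented probabilistically, the separation described above is often implemented as a double-loop Monte Carlo: an outer loop over epistemic realizations and an inner loop over aleatory variability, yielding one CDF per epistemic sample. A toy sketch (the model y = a + b·ε and the parameter intervals are invented):

```python
# Sketch: a family of CDFs from nested epistemic/aleatory sampling.
import numpy as np

rng = np.random.default_rng(42)
n_epistemic, n_aleatory = 20, 1000

cdf_family = []
for _ in range(n_epistemic):
    # Epistemic: fixed-but-poorly-known parameters sampled from intervals
    a = rng.uniform(1.0, 2.0)
    b = rng.uniform(0.5, 1.5)
    # Aleatory: inherent randomness for this epistemic realization
    y = a + b * rng.normal(size=n_aleatory)
    cdf_family.append(np.sort(y))        # empirical CDF support points

# Pointwise envelope over the family (e.g., for a probability-box display)
qs = np.linspace(0.01, 0.99, 99)
lower = np.min([np.quantile(y, qs) for y in cdf_family], axis=0)
upper = np.max([np.quantile(y, qs) for y in cdf_family], axis=0)
print(lower[49], upper[49])              # spread of the median across the family
```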
Computer Surveillance of Hospital-Acquired Infections: A 25-Year Update
Evans, R. Scott; Abouzelof, Rouett H.; Taylor, Caroline W.; Anderson, Vickie; Sumner, Sharon; Soutter, Sharon; Kleckner, Ruth; Lloyd, James F.
2009-01-01
Hospital-acquired infections (HAIs) are a significant cause of patient harm and increased healthcare cost. Many states have instituted mandatory hospital-wide reporting of HAIs which will increase the workload of infection preventionists and the Center for Medicare and Medicaid Services is no longer paying hospitals to treat certain HAIs. These competing priorities for increased reporting and prevention have many hospitals worried. Manual surveillance of HAIs cannot provide the speed, accuracy and consistency of computerized surveillance. Computer tools can also improve the speed and accuracy of HAI analysis and reporting. Computerized surveillance for HAIs was implemented at LDS Hospital in 1984, but that system required manual entry of data for analysis and reporting. This paper reports on the current functionality and status of the updated computer system for HAI surveillance, analysis and reporting used at LDS Hospital and the 21 other Intermountain Healthcare hospitals. PMID:20351845
Non-standard analysis and embedded software
NASA Technical Reports Server (NTRS)
Platek, Richard
1995-01-01
One model for computing in the future is ubiquitous, embedded computational devices analogous to embedded electrical motors. Many of these computers will control physical objects and processes. Such hidden computerized environments introduce new safety and correctness concerns whose treatment goes beyond present Formal Methods. In particular, one has to begin to speak about Real Space software in analogy with Real Time software. By this we mean computerized systems which have to meet requirements expressed in the real geometry of space. How to translate such requirements into ordinary software specifications and how to carry out proofs is a major challenge. In this talk we propose a research program based on the use of non-standard analysis. Much detail remains to be worked out. The purpose of the talk is to inform the Formal Methods community that non-standard analysis provides a possible avenue of attack which we believe will be fruitful.
Measurement uncertainty analysis techniques applied to PV performance measurements
NASA Astrophysics Data System (ADS)
Wells, C.
1992-10-01
The purpose of this presentation is to provide a brief introduction to measurement uncertainty analysis, outline how it is done, and illustrate uncertainty analysis with examples drawn from the PV field, with particular emphasis on its use in PV performance measurements. The uncertainty information we know and state concerning a PV performance measurement or a module test result determines, to a significant extent, the value and quality of that result. What is measurement uncertainty analysis? It is an outgrowth of what has commonly been called error analysis. But uncertainty analysis, a more recent development, gives greater insight into measurement processes and into tests, experiments, and calibration results. Uncertainty analysis gives us an estimate of the interval about a measured value or an experiment's final result within which we believe the true value of that quantity will lie. Why should we take the time to perform an uncertainty analysis? A rigorous measurement uncertainty analysis: increases the credibility and value of research results; allows comparisons of results from different labs; helps improve experiment design and identifies where changes are needed to achieve stated objectives (through use of the pre-test analysis); plays a significant role in validating measurements and experimental results, and in demonstrating (through the post-test analysis) that valid data have been acquired; reduces the risk of making erroneous decisions; demonstrates that quality assurance and quality control measures have been accomplished; and defines valid data as data having known and documented paths of origin (including theory), measurements, traceability to measurement standards, computations, and uncertainty analysis of results.
[The role of multidetector computer tomography in diagnosis of acute pancreatitis].
Lohanikhina, K Iu; Hordiienko, K P; Kozarenko, T M
2014-10-01
With the objective of improving the diagnostic semiotics of acute pancreatitis (AP), 35 patients were examined using a 64-slice computed tomography scanner, the Lightspeed VCT (GE, USA), with intravenous contrast enhancement in the arterial and portal phases. Based on analysis of the investigations conducted using multidetector computed tomography (MDCT), the AP semiotics characteristic of the oedematous and destructive forms was systematized; these forms were diagnosed in 19 (54.2%) and 16 (45.8%) patients, respectively. A procedure for estimating the preservation of the organ's functional capacity in the presence of pancreatic necrosis was elaborated, raising the diagnostic efficacy of the method by 5.3-9.4%.
An Analysis of the Need for a Whole-Body CT Scanner at US Darnall Army Community Hospital
1980-05-01
...computerized axial tomography, or CT. Computerized tomography experiments were conducted by Godfrey Hounsfield at Central Research Laboratories, EMI, Ltd. in... remained the same, with clinical and nursing unit facilities to support a one-division post. Presently, Fort Hood is the home of the III US Army Corps, the
Visualization techniques for tongue analysis in traditional Chinese medicine
NASA Astrophysics Data System (ADS)
Pham, Binh L.; Cai, Yang
2004-05-01
Visual inspection of the tongue has been an important diagnostic method of Traditional Chinese Medicine (TCM). Clinical data have shown significant connections between various visceral cancers and abnormalities in the tongue and the tongue coating. Visual inspection of the tongue is simple and inexpensive, but the current practice in TCM is mainly experience-based and the quality of the visual inspection varies between individuals. The computerized inspection method provides quantitative models to evaluate color, texture and surface features on the tongue. In this paper, we investigate visualization techniques and processes to allow interactive data analysis, with the aim of merging computerized measurements with human experts' diagnostic variables based on five-scale diagnostic conditions: Healthy (H), History of Cancers (HC), History of Polyps (HP), Polyps (P) and Colon Cancer (C).
NASA Astrophysics Data System (ADS)
Nishino, Takayuki
The face hobbing process has been widely applied in the automotive industry, but so far few analytical tools have been developed for it, which makes it difficult to optimize gear design. To address this situation, this study aims at developing a computerized tool to predict running performances such as the loaded tooth contact pattern, static transmission error, and so on. First, based upon a kinematical analysis of the cutting machine, a mathematical description of tooth surface generation is given. Second, based upon the theory of gearing and differential geometry, conjugate tooth surfaces are studied and contact lines are generated. Third, the load distribution along contact lines is formulated. Last, the numerical model is validated by measuring loaded transmission error and the loaded tooth contact pattern.
Hsiao, Ju-Ling; Chen, Rai-Fu
2016-01-16
With the widespread use of information communication technologies, computerized clinical practice guidelines have been developed and are considered effective decision-support tools for assisting clinical activities. However, the development of computerized clinical practice guidelines in Taiwan is still at an early stage, and the acceptance level among their major users (physicians) is not satisfactory. This study aims to investigate critical factors influencing physicians' intention to use computerized clinical practice guidelines through an integrative model of activity theory and the technology acceptance model. The survey methodology was employed to collect data from physicians of the investigated hospitals that have implemented computerized clinical practice guidelines. A total of 505 questionnaires were sent out, with 238 completed copies returned, indicating a valid response rate of 47.1%. The collected data were then analyzed using the structural equation modeling technique. The results showed that attitudes toward using computerized clinical practice guidelines (γ = 0.451, p < 0.001), organizational support (γ = 0.285, p < 0.001), perceived usefulness of computerized clinical practice guidelines (γ = 0.219, p < 0.05), and social influence (γ = 0.213, p < 0.05) were critical factors influencing physicians' intention to use computerized clinical practice guidelines, and these factors can explain 68.6% of the variance in intention to use. This study confirmed that the subject (human) factors, environment (organization) factors, and tool (technology) factors identified in activity theory should be carefully considered when introducing computerized clinical practice guidelines. Managers should pay close attention to the identified factors and provide adequate resources and incentives to help the promotion and use of computerized clinical practice guidelines. Through the appropriate use of computerized clinical practice guidelines, the clinical benefits, particularly in improving quality of care and facilitating clinical processes, will be realized.
39 CFR 501.15 - Computerized Meter Resetting System.
Code of Federal Regulations, 2010 CFR
2010-07-01
... AND DISTRIBUTE POSTAGE EVIDENCING SYSTEMS § 501.15 Computerized Meter Resetting System. (a) Description. The Computerized Meter Resetting System (CMRS) permits customers to reset their postage meters at... 39 Postal Service 1 2010-07-01 2010-07-01 false Computerized Meter Resetting System. 501.15...
A novel computerized surgeon-machine interface for robot-assisted laser phonomicrosurgery.
Mattos, Leonardo S; Deshpande, Nikhil; Barresi, Giacinto; Guastini, Luca; Peretti, Giorgio
2014-08-01
To introduce a novel computerized surgical system for improved usability, intuitiveness, accuracy, and controllability in robot-assisted laser phonomicrosurgery. Pilot technology assessment. The novel system was developed around a newly designed motorized laser micromanipulator, a touch-screen display, and a graphics stylus. The system allows the control of a CO2 laser through interaction between the stylus and the live video of the surgical area, giving the stylus a direct effect on the surgical site. Surgical enhancements afforded by this system were established through a pilot technology assessment using randomized trials comparing its performance with a state-of-the-art laser microsurgery system. Resident surgeons and medical students were chosen as subjects in performing sets of trajectory-following exercises. Image processing-based techniques were used for an objective performance assessment. A System Usability Scale-based questionnaire was used for the qualitative assessment. The computerized interface demonstrated superiority in usability, accuracy, and controllability over the state-of-the-art system. Significant ease of use and learning experienced by the subjects was demonstrated by the usability scores assigned to the two compared interfaces: computerized interface = 83.96% versus state-of-the-art = 68.02%. The objective analysis showed a significant enhancement in accuracy and controllability: computerized interface = 90.02% versus state-of-the-art = 75.59%. The novel system significantly enhances the accuracy, usability, and controllability in laser phonomicrosurgery. The design provides an opportunity to improve the ergonomics and safety of current surgical setups. © 2014 The American Laryngological, Rhinological and Otological Society, Inc.
Yokokawa, Miki; Jung, Dae Yon; Joseph, Kim K; Hero, Alfred O; Morady, Fred; Bogun, Frank
2014-11-01
Twelve-lead electrocardiogram (ECG) criteria for epicardial ventricular tachycardia (VT) origins have been described. In patients with structural heart disease, the ability to predict an epicardial origin based on QRS morphology is limited and has been investigated only for limited regions in the heart. The purpose of this study was to determine whether a computerized algorithm is able to accurately differentiate epicardial vs endocardial origins of ventricular arrhythmias. Endocardial and epicardial pace-mapping were performed in 43 patients at 3277 sites. The 12-lead ECGs were digitized and analyzed using a mixture-of-Gaussians (MoG) model to assess whether the algorithm was able to identify an epicardial vs endocardial origin of the paced rhythm. The MoG computerized algorithm was compared to algorithms published in prior reports. The computerized algorithm correctly differentiated epicardial vs endocardial pacing sites for 80% of the sites, compared to an accuracy of 42% to 66% for other described criteria. The accuracy was higher in patients without structural heart disease than in those with structural heart disease (94% vs 80%, P = .0004) and for right bundle branch block (82%) compared to left bundle branch block morphologies (79%, P = .001). Validation studies showed the accuracy for VT exit sites to be 84%. A computerized algorithm was able to accurately differentiate the majority of epicardial vs endocardial pace-mapping sites. The algorithm is not region specific and performed best in patients without structural heart disease and with VTs having a right bundle branch block morphology. Copyright © 2014 Heart Rhythm Society. Published by Elsevier Inc. All rights reserved.
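The abstract does not give implementation details, but a class-conditional mixture-of-Gaussians classifier of the general kind described can be sketched as follows, with synthetic two-dimensional feature vectors standing in for the digitized ECG measurements:

```python
# Sketch: fit one Gaussian mixture per class and classify a paced site by
# comparing class-conditional log-likelihoods. Features and data are invented.
import numpy as np
from sklearn.mixture import GaussianMixture

rng = np.random.default_rng(0)
X_epi = rng.normal([1.2, 0.8], 0.3, size=(200, 2))    # toy "epicardial" features
X_endo = rng.normal([0.7, 0.4], 0.3, size=(200, 2))   # toy "endocardial" features

gmm_epi = GaussianMixture(n_components=3, random_state=0).fit(X_epi)
gmm_endo = GaussianMixture(n_components=3, random_state=0).fit(X_endo)

def classify(x: np.ndarray) -> str:
    """Compare class-conditional log-likelihoods (equal priors assumed)."""
    ll_epi = float(gmm_epi.score_samples([x])[0])
    ll_endo = float(gmm_endo.score_samples([x])[0])
    return "epicardial" if ll_epi > ll_endo else "endocardial"

print(classify(np.array([1.1, 0.75])))
```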
A Mobile Computing Solution for Collecting Functional Analysis Data on a Pocket PC
ERIC Educational Resources Information Center
Jackson, James; Dixon, Mark R.
2007-01-01
The present paper provides a task analysis for creating a computerized data system using a Pocket PC and Microsoft Visual Basic. With Visual Basic software and any handheld device running the Windows Mobile operating system, this task analysis will allow behavior analysts to program and customize their own functional analysis data-collection…
Microcomputer Analysis of Children's Language Samples.
ERIC Educational Resources Information Center
Rosenkoetter, Sharon E.; Rice, Mabel L.
The workshop paper examines the use of microcomputer packages to analyze spontaneous language samples of children with communication disorders. Advantages of computerized analysis are seen to include time saving, more efficient data management, and increased objectivity. To help consumers determine which programs to buy, four aspects are…
21 CFR 884.2800 - Computerized Labor Monitoring System.
Code of Federal Regulations, 2011 CFR
2011-04-01
... 21 Food and Drugs 8 2011-04-01 2011-04-01 false Computerized Labor Monitoring System. 884.2800... Devices § 884.2800 Computerized Labor Monitoring System. (a) Identification. A computerized labor monitoring system is a system intended to continuously measure cervical dilation and fetal head descent and...
45 CFR 307.15 - Approval of advance planning documents for computerized support enforcement systems.
Code of Federal Regulations, 2011 CFR
2011-10-01
... feasibility of the proposed effort and provide for the conduct of a requirements analysis study which address... indicate how the results of the requirements analysis study will be incorporated into the proposed system... address requirements analysis, program design, procurement and project management; and, a description of...
NASA Astrophysics Data System (ADS)
Dai, H.; Chen, X.; Ye, M.; Song, X.; Zachara, J. M.
2016-12-01
Sensitivity analysis has been an important tool in groundwater modeling for identifying influential parameters. Among the various sensitivity analysis methods, variance-based global sensitivity analysis has gained popularity for its model independence and its capability of providing accurate sensitivity measurements. However, the conventional variance-based method considers only the uncertainty contributions of single model parameters. In this research, we extended the variance-based method to consider more uncertainty sources and developed a new framework that allows flexible combinations of different uncertainty components. We decompose the uncertainty sources into a hierarchical three-layer structure: scenario, model, and parametric. Furthermore, each layer of uncertainty source can contain multiple components. An uncertainty and sensitivity analysis framework was then constructed following this three-layer structure using a Bayesian network, with different uncertainty components represented as uncertain nodes. Through the framework, variance-based sensitivity analysis can be implemented with great flexibility in how uncertainty components are grouped. The variance-based sensitivity analysis is thus improved to investigate the importance of an extended range of uncertainty sources: scenario, model, and different combinations of uncertainty components that can represent key model system processes (e.g., the groundwater recharge process, the reactive transport process). For testing and demonstration purposes, the developed methodology was applied to a real-world groundwater reactive transport model with various uncertainty sources. The results demonstrate that the new sensitivity analysis method is able to estimate accurate importance measures for any uncertainty source formed from different combinations of uncertainty components. The new methodology can provide useful information for environmental management and for decision-makers formulating policies and strategies.
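The "conventional variance-based method" for single parameters refers to estimators such as the Saltelli pick-and-freeze scheme for first-order Sobol' indices. A minimal sketch on a toy model (the model and parameter ranges are illustrative, not the groundwater application's):

```python
# Sketch: first-order Sobol' indices via the pick-and-freeze estimator.
import numpy as np

def model(x):                       # toy response with an interaction term
    return x[:, 0] + 2.0 * x[:, 1] ** 2 + 0.5 * x[:, 0] * x[:, 2]

rng = np.random.default_rng(7)
n, k = 100_000, 3
A = rng.uniform(0, 1, (n, k))       # two independent sample matrices
B = rng.uniform(0, 1, (n, k))
fA, fB = model(A), model(B)
var = np.var(np.concatenate([fA, fB]))

for i in range(k):
    ABi = A.copy()
    ABi[:, i] = B[:, i]             # vary parameter i, freeze the rest
    Si = np.mean(fB * (model(ABi) - fA)) / var
    print(f"S{i+1} = {Si:.3f}")
```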
ACCOUNTING FOR CALIBRATION UNCERTAINTIES IN X-RAY ANALYSIS: EFFECTIVE AREAS IN SPECTRAL FITTING
DOE Office of Scientific and Technical Information (OSTI.GOV)
Lee, Hyunsook; Kashyap, Vinay L.; Drake, Jeremy J.
2011-04-20
While considerable advances have been made to account for statistical uncertainties in astronomical analyses, systematic instrumental uncertainties have been generally ignored. This can be crucial to a proper interpretation of analysis results because instrumental calibration uncertainty is a form of systematic uncertainty. Ignoring it can underestimate error bars and introduce bias into the fitted values of model parameters. Accounting for such uncertainties currently requires extensive case-specific simulations if using existing analysis packages. Here, we present general statistical methods that incorporate calibration uncertainties into spectral analysis of high-energy data. We first present a method based on multiple imputation that can be applied with any fitting method, but is necessarily approximate. We then describe a more exact Bayesian approach that works in conjunction with a Markov chain Monte Carlo based fitting. We explore methods for improving computational efficiency, and in particular detail a method of summarizing calibration uncertainties with a principal component analysis of samples of plausible calibration files. This method is implemented using recently codified Chandra effective area uncertainties for low-resolution spectral analysis and is verified using both simulated and actual Chandra data. Our procedure for incorporating effective area uncertainty is easily generalized to other types of calibration uncertainties.
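In spirit, the PCA summarization decomposes an ensemble of plausible calibration curves into a mean plus a few leading components, from which new plausible curves can be drawn cheaply. A toy sketch with a synthetic effective-area-like ensemble (not the Chandra calibration products themselves):

```python
# Sketch: PCA summary of an ensemble of calibration curves, then sampling.
import numpy as np

rng = np.random.default_rng(3)
energy = np.linspace(0.3, 8.0, 200)                     # keV grid (illustrative)
ensemble = np.array([100 * np.exp(-energy / (2 + 0.2 * rng.normal()))
                     * (1 + 0.02 * rng.normal()) for _ in range(500)])

mean = ensemble.mean(axis=0)
U, s, Vt = np.linalg.svd(ensemble - mean, full_matrices=False)
n_pc = 5                                                # keep leading components

def sample_curve() -> np.ndarray:
    """Draw a plausible calibration curve from the PCA summary."""
    coeff = rng.normal(size=n_pc) * s[:n_pc] / np.sqrt(len(ensemble) - 1)
    return mean + coeff @ Vt[:n_pc]

print(sample_curve()[:3])
```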
45 CFR 307.15 - Approval of advance planning documents for computerized support enforcement systems.
Code of Federal Regulations, 2010 CFR
2010-10-01
... computerized support enforcement systems. 307.15 Section 307.15 Public Welfare Regulations Relating to Public... CHILDREN AND FAMILIES, DEPARTMENT OF HEALTH AND HUMAN SERVICES COMPUTERIZED SUPPORT ENFORCEMENT SYSTEMS § 307.15 Approval of advance planning documents for computerized support enforcement systems. (a...
Arkansas' Curriculum Guide. Competency Based Computerized Accounting.
ERIC Educational Resources Information Center
Arkansas State Dept. of Education, Little Rock. Div. of Vocational, Technical and Adult Education.
This guide contains the essential parts of a total curriculum for a one-year secondary-level course in computerized accounting. Addressed in the individual sections of the guide are the following topics: the complete accounting cycle, computer operations for accounting, computerized accounting and general ledgers, computerized accounts payable,…
Impact of uncertainty on modeling and testing
NASA Technical Reports Server (NTRS)
Coleman, Hugh W.; Brown, Kendall K.
1995-01-01
A thorough understanding of the uncertainties associated with the modeling and testing of the Space Shuttle Main Engine (SSME) Engine will greatly aid decisions concerning hardware performance and future development efforts. This report will describe the determination of the uncertainties in the modeling and testing of the Space Shuttle Main Engine test program at the Technology Test Bed facility at Marshall Space Flight Center. Section 2 will present a summary of the uncertainty analysis methodology used and discuss the specific applications to the TTB SSME test program. Section 3 will discuss the application of the uncertainty analysis to the test program and the results obtained. Section 4 presents the results of the analysis of the SSME modeling effort from an uncertainty analysis point of view. The appendices at the end of the report contain a significant amount of information relative to the analysis, including discussions of venturi flowmeter data reduction and uncertainty propagation, bias uncertainty documentations, technical papers published, the computer code generated to determine the venturi uncertainties, and the venturi data and results used in the analysis.
Determination of Uncertainties for the New SSME Model
NASA Technical Reports Server (NTRS)
Coleman, Hugh W.; Hawk, Clark W.
1996-01-01
This report discusses the uncertainty analysis performed in support of a new test analysis and performance prediction model for the Space Shuttle Main Engine. The new model utilizes uncertainty estimates for experimental data and for the analytical model to obtain the most plausible operating condition for the engine system. The report describes the development of the data sets and uncertainty estimates used in the new model, presents the application of uncertainty analysis to analytical models, including the conservation of mass and energy balance relations, and introduces a new methodology for assessing the uncertainty associated with linear regressions.
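For the regression part, the classical starting point is the confidence band of a fitted line. The following generic textbook sketch illustrates that baseline; it is not the report's specific new methodology, and the data are invented:

```python
# Sketch: 95% confidence interval on the mean line of a simple linear fit.
import numpy as np
from scipy import stats

x = np.array([1.0, 2.0, 3.0, 4.0, 5.0, 6.0])
y = np.array([2.1, 3.9, 6.2, 7.8, 10.1, 11.9])

n = len(x)
b, a = np.polyfit(x, y, 1)                    # slope, intercept
resid = y - (a + b * x)
s = np.sqrt(resid @ resid / (n - 2))          # residual standard deviation
sxx = np.sum((x - x.mean()) ** 2)
t = stats.t.ppf(0.975, n - 2)

x0 = 3.5
half_width = t * s * np.sqrt(1 / n + (x0 - x.mean()) ** 2 / sxx)
print(f"y({x0}) = {a + b * x0:.2f} ± {half_width:.2f} (95% CI on the mean line)")
```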
Michel, Pierre; Baumstarck, Karine; Lancon, Christophe; Ghattas, Badih; Loundou, Anderson; Auquier, Pascal; Boyer, Laurent
2018-04-01
Quality of life (QoL) is still assessed using paper-based and fixed-length questionnaires, which is one reason why QoL measurements have not been routinely implemented in clinical practice. Providing new QoL measures that combine computer technology with modern measurement theory may enhance their clinical use. The aim of this study was to develop a QoL multidimensional computerized adaptive test (MCAT), the SQoL-MCAT, from the fixed-length SQoL questionnaire for patients with schizophrenia. In this multicentre cross-sectional study, we collected sociodemographic information, clinical characteristics (i.e., duration of illness, the PANSS, and the Calgary Depression Scale), and quality of life (i.e., SQoL). The development of the SQoL-MCAT was divided into three stages: (1) multidimensional item response theory (MIRT) analysis, (2) multidimensional computerized adaptive test (MCAT) simulations with analyses of accuracy and precision, and (3) external validity. Five hundred and seventeen patients participated in this study. The MIRT analysis found that all items displayed good fit with the multidimensional graded response model, with satisfactory reliability for each dimension. The SQoL-MCAT was 39% shorter than the fixed-length SQoL questionnaire and had satisfactory accuracy (levels of correlation >0.9) and precision (standard error of measurement <0.55 and root mean square error <0.3). External validity was confirmed via correlations between the SQoL-MCAT dimension scores and symptomatology scores. The SQoL-MCAT is the first computerized adaptive QoL questionnaire for patients with schizophrenia. Tailored to patient characteristics and significantly shorter than the paper-based version, the SQoL-MCAT may improve the feasibility of assessing QoL in clinical practice.
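The adaptive mechanism that makes such a test shorter selects, at each step, the item most informative at the current ability estimate, then updates that estimate from the response. A deliberately simplified unidimensional 2PL sketch of this loop follows; the SQoL-MCAT itself is multidimensional and uses a graded response model, and all item parameters here are simulated.

```python
# Sketch: core CAT loop with maximum-information item selection and a
# grid-based EAP ability update (simulated 2PL item bank and respondent).
import numpy as np

rng = np.random.default_rng(5)
a = rng.uniform(0.8, 2.0, 30)          # item discriminations
b = rng.normal(0.0, 1.0, 30)           # item difficulties
theta_true, grid = 0.7, np.linspace(-4, 4, 161)
posterior = np.exp(-grid**2 / 2)       # N(0, 1) prior over ability
asked, theta_hat = [], 0.0

for _ in range(10):                    # stop after 10 items (or at a target SE)
    p = 1 / (1 + np.exp(-a * (theta_hat - b)))
    info = a**2 * p * (1 - p)          # Fisher information at current estimate
    info[asked] = -np.inf              # never re-administer an item
    j = int(np.argmax(info))           # most informative remaining item
    asked.append(j)
    p_true = 1 / (1 + np.exp(-a[j] * (theta_true - b[j])))
    u = rng.random() < p_true          # simulated correct/endorsed response
    p_grid = 1 / (1 + np.exp(-a[j] * (grid - b[j])))
    posterior *= p_grid if u else (1 - p_grid)
    theta_hat = float(np.sum(grid * posterior) / np.sum(posterior))  # EAP update

print(f"estimated theta = {theta_hat:.2f} after {len(asked)} items")
```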
Accounting Information Systems in Healthcare: A Review of the Literature.
Hammour, Hadal; Househ, Mowafa; Razzak, Hira Abdul
2017-01-01
As information technology progresses in Saudi Arabia, manual accounting systems have gradually become inadequate for decision needs. Subsequently, private and public healthcare divisions in Saudi Arabia perceive computerized accounting information systems (CAIS) as a vehicle to safeguard the efficient and effective flow of information during the analysis, processing, and recording of financial data. An efficient and effective flow of information improves the decision making of staff, thereby improving the capability of health care sectors to reduce the cost of medical services. In this paper, we define computerized accounting systems from the point of view of health informatics and discuss the challenges and benefits of supporting CAIS applications in hospitals in Saudi Arabia. With these elements, we conclude that CAIS in Saudi Arabia can serve as a valuable tool for evaluating and controlling the cost of medical services in healthcare sectors. Supplementary education on the significance of computerized accounting systems within hospitals for nurses, doctors, accountants, and other health care staff is warranted in the future.
Transfusion audit of fresh-frozen plasma in southern Taiwan.
Yeh, C-J; Wu, C-F; Hsu, W-T; Hsieh, L-L; Lin, S-F; Liu, T-C
2006-10-01
The demand for transfusions has increased rapidly in southern Taiwan. Between 1993 and 2003, requests for fresh-frozen plasma (FFP) in particular rose dramatically at Kaohsiung Medical University Hospital (KMUH). Transfusion orders were not tightly regulated, and inappropriate use of blood products was common. We carried out a prospective analysis of transfusion requests from October 2003 to January 2004 at KMUH, and then repeated the audit for another 3-month period after the clinical faculty had undergone five sessions of education on transfusion guidelines. Later, our consultant haematologist applied computerized guidelines to periodic audits. A 5.2% decrease in inappropriate FFP usage followed the educational programme and a further 30% reduction took place after the application of computerized transfusion guidelines. With the guidelines and periodic audits, FFP transfusions decreased by 74.6% and inappropriate requests from 65.2% to 30%. Hospital policy, computerized transfusion guidelines and periodic audits greatly reduced inappropriate FFP transfusions. An educational campaign had a more limited effect.
Changing National Forest Values: a content analysis.
David N. Bengston; Zhi Xu
1995-01-01
Empirically analyzes the evolution of national forest values in recent years. A computerized content analysis procedure was developed and used to analyze the forest value systems of forestry professionals, mainstream environmentalists, and the public. National forest values were found to have shifted significantly over the study period.
Buch, Jatin; Kothari, Nitin; Shah, Nishal
2016-01-01
Introduction: A prescription order is an important therapeutic transaction between physician and patient. A good-quality prescription is an extremely important factor for minimizing errors in dispensing medication, and it should adhere to guidelines for prescription writing for the benefit of the patient. Aim: To evaluate the frequency and type of prescription errors in outpatient prescriptions and to find whether prescription writing abides by WHO standards of prescription writing. Materials and Methods: A cross-sectional observational study was conducted at Anand city. Allopathic private practitioners of different specialities practising at Anand city were included in the study. Collection of prescriptions was started a month after consent was obtained, to minimize bias in prescription writing. The prescriptions were collected from local pharmacy stores of Anand city over a period of six months and analysed for errors in standard information, according to the WHO guide to good prescribing. Statistical Analysis: Descriptive analysis was performed to estimate the frequency of errors; data were expressed as numbers and percentages. Results: A total of 749 (549 handwritten and 200 computerized) prescriptions were collected. Abundant omission errors were identified in handwritten prescriptions; e.g., the OPD number was mentioned in only 6.19%, the patient's age in 25.50%, gender in 17.30%, address in 9.29%, and weight in 11.29%, while among drug items only 2.97% of drugs were prescribed by generic name. Route and dosage form were mentioned in 77.35%-78.15%, dose in 47.25%, unit in 13.91%, and regimens in 72.93%, while signa (directions for drug use) appeared in 62.35%. In clinician and patient details, a total of 4384 errors were found in the 549 handwritten prescriptions and 501 errors in the 200 computerized prescriptions, while in drug item details the total numbers of errors identified were 5015 and 621 in handwritten and computerized prescriptions, respectively. Conclusion: Compared with handwritten prescriptions, computerized prescriptions appeared to be associated with relatively lower rates of error. Since outpatient prescription errors are abundant and often occur in handwritten prescriptions, prescribers need to adapt to computerized prescription order entry in their daily practice. PMID:27504305
ERIC Educational Resources Information Center
Forbey, Johnathan D.; Ben-Porath, Yossef S.
2007-01-01
Computerized adaptive testing in personality assessment can improve efficiency by significantly reducing the number of items administered to answer an assessment question. Two approaches have been explored for adaptive testing in computerized personality assessment: item response theory and the countdown method. In this article, the authors…
A Randomized Controlled Trial of the "Cool Teens" CD-ROM Computerized Program for Adolescent Anxiety
ERIC Educational Resources Information Center
Wuthrich, Viviana M.; Rapee, Ronald M.; Cunningham, Michael J.; Lyneham, Heidi J.; Hudson, Jennifer L.; Schniering, Carolyn A.
2012-01-01
Objective: Computerized cognitive behavioral interventions for anxiety disorders in adults have been shown to be efficacious, but limited data are available on the use of computerized interventions with young persons. Adolescents in particular are difficult to engage in treatment and may be especially suited to computerized technologies. This…
Procedures to develop a computerized adaptive test to assess patient-reported physical functioning.
McCabe, Erin; Gross, Douglas P; Bulut, Okan
2018-06-07
The purpose of this paper is to demonstrate the procedures to develop and implement a computerized adaptive patient-reported outcome (PRO) measure using secondary analysis of a dataset and items from fixed-format legacy measures. We conducted secondary analysis of a dataset of responses from 1429 persons with work-related lower extremity impairment. We calibrated three measures of physical functioning on the same metric, based on item response theory (IRT). We evaluated efficiency and measurement precision of various computerized adaptive test (CAT) designs using computer simulations. IRT and confirmatory factor analyses support combining the items from the three scales for a CAT item bank of 31 items. The item parameters for IRT were calculated using the generalized partial credit model. CAT simulations show that reducing the test length from the full 31 items to a maximum test length of 8 items, or 20 items is possible without a significant loss of information (95, 99% correlation with legacy measure scores). We demonstrated feasibility and efficiency of using CAT for PRO measurement of physical functioning. The procedures we outlined are straightforward, and can be applied to other PRO measures. Additionally, we have included all the information necessary to implement the CAT of physical functioning in the electronic supplementary material of this paper.
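The paper calibrated its 31-item bank with the generalized partial credit model; the sketch below simplifies to dichotomous 2PL items but illustrates the same CAT machinery the paper describes: maximum-information item selection, posterior (EAP) scoring, and a maximum-length/precision stopping rule. All item parameters are invented.

```python
import numpy as np

rng = np.random.default_rng(1)
GRID = np.linspace(-4, 4, 121)                  # latent-trait grid for EAP scoring
PRIOR = np.exp(-0.5 * GRID**2); PRIOR /= PRIOR.sum()

# Hypothetical 31-item bank (2PL stand-in for the study's partial credit model).
a = rng.uniform(0.8, 2.0, 31)                   # discriminations
b = rng.normal(0.0, 1.0, 31)                    # difficulties

def p_correct(theta, i):
    return 1.0 / (1.0 + np.exp(-a[i] * (theta - b[i])))

def administer_cat(true_theta, max_items=8, target_se=0.3):
    posterior = PRIOR.copy()
    available = list(range(len(a)))
    for _ in range(max_items):
        theta_hat = np.sum(GRID * posterior)
        # Fisher information of a 2PL item: a^2 * P * (1 - P)
        info = [a[i]**2 * p_correct(theta_hat, i) * (1 - p_correct(theta_hat, i))
                for i in available]
        item = available.pop(int(np.argmax(info)))       # max-information selection
        resp = rng.random() < p_correct(true_theta, item)  # simulated response
        like = p_correct(GRID, item) if resp else 1 - p_correct(GRID, item)
        posterior *= like; posterior /= posterior.sum()    # Bayesian update
        se = np.sqrt(np.sum(GRID**2 * posterior) - np.sum(GRID * posterior)**2)
        if se < target_se:                                 # precision stopping rule
            break
    return np.sum(GRID * posterior), se

print(administer_cat(true_theta=0.7))
```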
Web-based automation of green building rating index and life cycle cost analysis
NASA Astrophysics Data System (ADS)
Shahzaib Khan, Jam; Zakaria, Rozana; Aminuddin, Eeydzah; IzieAdiana Abidin, Nur; Sahamir, Shaza Rina; Ahmad, Rosli; Nafis Abas, Darul
2018-04-01
A sudden decline in financial markets and the economic meltdown have slowed adoption and lowered investor interest in green-certified buildings because of their higher initial costs. It is therefore essential to attract investors to further development of green buildings through automated tools for construction projects. However, there is a historical dearth of automation of green building rating tools, an essential gap that motivates the development of an automated computerized programming tool. This paper presents proposed research that aims to develop an integrated, web-based automated program applying a green building rating assessment tool, green technology, and life cycle cost (LCC) analysis. It also identifies the variables of MyCrest and LCC to be integrated and developed in a framework, and then transformed into an automated computerized program. A mixed methodology of qualitative and quantitative surveys describes the plan to carry the MyCrest-LCC integration to an automated level. In this study, a preliminary literature review enriches the understanding of the integration of Green Building Rating Tools (GBRT) with LCC. The outcome of this research paves the way for future researchers to integrate other efficient tools and parameters that contribute towards green buildings.
Sacks, Stephanie; Fisher, Melissa; Garrett, Coleman; Alexander, Phillip; Holland, Christine; Rose, Demian; Hooker, Christine; Vinogradov, Sophia
2013-01-01
Social cognitive deficits are an important treatment target in schizophrenia, but it is unclear to what degree they require specialized interventions and which specific components of behavioral interventions are effective. In this pilot study, we explored the effects of a novel computerized neuroplasticity-based auditory training delivered in conjunction with computerized social cognition training (SCT) in patients with schizophrenia. Nineteen clinically stable schizophrenia subjects performed 50 hours of computerized exercises that place implicit, increasing demands on auditory perception, plus 12 hours of computerized training in emotion identification, social perception, and theory of mind tasks. All subjects were assessed with MATRICS-recommended measures of neurocognition and social cognition, plus a measure of self-referential source memory before and after the computerized training. Subjects showed significant improvements on multiple measures of neurocognition. Additionally, subjects showed significant gains on measures of social cognition, including the MSCEIT Perceiving Emotions, MSCEIT Managing Emotions, and self-referential source memory, plus a significant decrease in positive symptoms. Computerized training of auditory processing/verbal learning in schizophrenia results in significant basic neurocognitive gains. Further, addition of computerized social cognition training results in significant gains in several social cognitive outcome measures. Computerized cognitive training that directly targets social cognitive processes can drive improvements in these crucial functions.
Evaluation of retroreflectometers.
DOT National Transportation Integrated Search
2002-08-01
This project performed field-testing and analysis of two pavement marking retroreflectometers: the Laserlux and the LTL2000. The Laserlux is a vehicle-mounted device that takes readings at driving speed and produces computerized output. The LTL...
Grid-Enabled Quantitative Analysis of Breast Cancer
2009-10-01
large-scale, multi-modality computerized image analysis. The central hypothesis of this research is that large-scale image analysis for breast cancer... pilot study to utilize large-scale parallel Grid computing to harness the nationwide cluster infrastructure for optimization of medical image analysis parameters. Additionally, we investigated the use of cutting-edge data analysis/mining techniques as applied to Ultrasound, FFDM, and DCE-MRI Breast
Huang, Chien-Yu; Tung, Li-Chen; Chou, Yeh-Tai; Chou, Willy; Chen, Kuan-Lin; Hsieh, Ching-Lin
2017-07-27
This study aimed at improving the utility of the fine motor subscale of the comprehensive developmental inventory for infants and toddlers (CDIIT) by developing a computerized adaptive test of fine motor skills. We built an item bank for the computerized adaptive test of fine motor skills using the fine motor subscale of the CDIIT items fitting the Rasch model. We also examined the psychometric properties and efficiency of the computerized adaptive test of fine motor skills with simulated computerized adaptive tests. Data from 1742 children with suspected developmental delays were retrieved. The mean scores of the fine motor subscale of the CDIIT increased along with age groups (mean scores = 1.36-36.97). The computerized adaptive test of fine motor skills contains 31 items meeting the Rasch model's assumptions (infit mean square = 0.57-1.21, outfit mean square = 0.11-1.17). For children of 6-71 months, the computerized adaptive test of fine motor skills had high Rasch person reliability (average reliability >0.90), high concurrent validity (rs = 0.67-0.99), adequate to excellent diagnostic accuracy (area under receiver operating characteristic = 0.71-1.00), and large responsiveness (effect size = 1.05-3.93). The computerized adaptive test of fine motor skills used 48-84% fewer items than the fine motor subscale of the CDIIT. The computerized adaptive test of fine motor skills used fewer items for assessment but was as reliable and valid as the fine motor subscale of the CDIIT. Implications for Rehabilitation We developed a computerized adaptive test based on the comprehensive developmental inventory for infants and toddlers (CDIIT) for assessing fine motor skills. The computerized adaptive test has been shown to be efficient because it uses fewer items than the original measure and automatically presents the results right after the test is completed. The computerized adaptive test is as reliable and valid as the CDIIT.
NASA Astrophysics Data System (ADS)
Ye, M.; Chen, Z.; Shi, L.; Zhu, Y.; Yang, J.
2017-12-01
Nitrogen reactive transport modeling is subject to uncertainty in model parameters, structures, and scenarios. While global sensitivity analysis is a vital tool for identifying the parameters important to nitrogen reactive transport, conventional global sensitivity analysis only considers parametric uncertainty. This may result in inaccurate selection of important parameters, because parameter importance may vary under different models and modeling scenarios. By using a recently developed variance-based global sensitivity analysis method, this paper identifies important parameters with simultaneous consideration of parametric uncertainty, model uncertainty, and scenario uncertainty. In a numerical example of nitrogen reactive transport modeling, a combination of three scenarios of soil temperature and two scenarios of soil moisture leads to a total of six scenarios. Four alternative models are used to evaluate reduction functions used for calculating actual rates of nitrification and denitrification. The model uncertainty is tangled with scenario uncertainty, as the reduction functions depend on soil temperature and moisture content. The results of sensitivity analysis show that parameter importance varies substantially between different models and modeling scenarios, which may lead to inaccurate selection of important parameters if model and scenario uncertainties are not considered. This problem is avoided by using the new method of sensitivity analysis in the context of model averaging and scenario averaging. The new method of sensitivity analysis can be applied to other problems of contaminant transport modeling when model uncertainty and/or scenario uncertainty are present.
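A toy sketch of the averaging idea above: first-order variance-based sensitivity indices are estimated per model-scenario combination and then averaged. The reduction-function forms, parameters, and equal weights are all illustrative, not those of the paper.

```python
import numpy as np

rng = np.random.default_rng(2)
N = 20000

def first_order_index(x, y, bins=40):
    """Crude estimator of S_i = Var(E[Y|X_i]) / Var(Y), binning on X_i."""
    edges = np.quantile(x, np.linspace(0, 1, bins + 1))
    idx = np.clip(np.searchsorted(edges, x, side="right") - 1, 0, bins - 1)
    bin_means = np.array([y[idx == b].mean() for b in range(bins)])
    counts = np.bincount(idx, minlength=bins)
    return float(np.sum(counts * (bin_means - y.mean())**2) / (len(y) * y.var()))

k = rng.uniform(0.5, 1.5, N)              # uncertain rate constant
q = rng.uniform(0.5, 1.5, N)              # uncertain moisture exponent
scenarios = [(15.0, 0.2), (25.0, 0.7)]    # (soil temperature, moisture) scenarios
models = [                                 # alternative reduction-function forms
    lambda T, W: np.exp(0.07 * (T - 20.0)) * W**q,
    lambda T, W: (T / 20.0) * W**(q / 2.0),
]

indices = {"k": [], "q": []}
for T, W in scenarios:
    for f in models:
        y = k * f(T, W)                    # toy nitrification rate
        indices["k"].append(first_order_index(k, y))
        indices["q"].append(first_order_index(q, y))

# Equal model/scenario weights; posterior model weights could be used instead.
for name, vals in indices.items():
    print(f"S_{name}: per combination {np.round(vals, 2)}, averaged {np.mean(vals):.2f}")
```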
The NASA Langley Multidisciplinary Uncertainty Quantification Challenge
NASA Technical Reports Server (NTRS)
Crespo, Luis G.; Kenny, Sean P.; Giesy, Daniel P.
2014-01-01
This paper presents the formulation of an uncertainty quantification challenge problem consisting of five subproblems. These problems focus on key aspects of uncertainty characterization, sensitivity analysis, uncertainty propagation, extreme-case analysis, and robust design.
Code of Federal Regulations, 2010 CFR
2010-10-01
... records for monitoring Computerized Tribal IV-D Systems and Office Automation? 310.40 Section 310.40... COMPUTERIZED TRIBAL IV-D SYSTEMS AND OFFICE AUTOMATION Accountability and Monitoring Procedures for... monitoring Computerized Tribal IV-D Systems and Office Automation? In accordance with Part 95 of this title...
Computerized Doppler Tomography and Spectrum Analysis of Carotid Artery Flow
Morton, Paul; Goldman, Dave; Nichols, W. Kirt
1981-01-01
Contrast angiography remains the definitive study in the evaluation of atherosclerotic occlusive vascular disease. However, a safer technique for serial screening of symptomatic patients and for routine follow up is necessary. Computerized pulsed Doppler ultrasonic arteriography is a noninvasive technique developed by Miles [6] for imaging lateral, antero-posterior and transverse sections of the carotid artery. We augmented this system with new software and hardware to analyze the three-dimensional blood flow data. The system now provides information about the location of the occlusive process in the artery and a semi-quantitative evaluation of the degree of obstruction. In addition, we interfaced a digital signal analyzer to the system which permits spectrum analysis of the pulsed Doppler signal. This addition has allowed us to identify lesions which are not yet hemodynamically significant.
Free lipid and computerized determination of adipocyte size.
Svensson, Henrik; Olausson, Daniel; Holmäng, Agneta; Jennische, Eva; Edén, Staffan; Lönn, Malin
2018-06-21
The size distribution of adipocytes in a suspension, after collagenase digestion of adipose tissue, can be determined by computerized image analysis. Free lipid, forming droplets in such suspensions, introduces a bias, since droplets present in the images may be identified as adipocytes. This problem is not always adjusted for, and some reports state that distinguishing droplets and cells is a considerable problem. In addition, if the droplets originate mainly from rupture of large adipocytes, as often described, this will also bias size analysis. We here confirm that our ordinary manual means of distinguishing droplets and adipocytes in the images ensure correct and rapid identification before exclusion of the droplets. Further, in our suspensions, prepared with focus on gentle handling of tissue and cells, we find no association between the amount of free lipid and mean adipocyte size or proportion of large adipocytes.
Bligh, Michelle C; Kohles, Jeffrey C; Meindl, James R
2004-06-01
In many ways, leadership is a phenomenon that is ideally suited for new and inventive research methods. For researchers who seek to reliably study and systematically compare linguistically based elements of the leadership relationship, computerized content analysis has the potential to supplement, extend, and qualify existing leadership theory and practice. Through an examination of President Bush's rhetoric and the media coverage before and after the crisis of 9/11, the authors explore how elements of the President's speeches changed in response to the post-crisis environment. Using this example, the authors illustrate the process of computerized content analysis and many of its strengths and limitations, with the hope of facilitating future leadership research that uses this approach to explore important contextual and symbolic elements of the leadership relationship. (c) 2004 APA
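For readers unfamiliar with the mechanics, a dictionary-based word-count scheme of the kind used in computerized content analysis can be sketched in a few lines; the category word lists below are invented, whereas real analyses use validated dictionaries.

```python
import re
from collections import Counter

# Hypothetical category dictionaries; real analyses use validated word lists.
CATEGORIES = {
    "collective_focus": {"we", "us", "our", "together", "nation"},
    "uncertainty":      {"may", "might", "perhaps", "unclear", "uncertain"},
}

def category_rates(text):
    """Return each category's share of total words in a speech or article."""
    words = re.findall(r"[a-z']+", text.lower())
    counts = Counter(words)
    total = sum(counts.values())
    return {cat: sum(counts[w] for w in vocab) / total
            for cat, vocab in CATEGORIES.items()}

print(category_rates("Together we face an uncertain hour; our resolve may be tested."))
```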
Modified Involute Helical Gears: Computerized Design, Simulation of Meshing, and Stress Analysis
NASA Technical Reports Server (NTRS)
Handschuh, Robert (Technical Monitor); Litvin, Faydor L.; Gonzalez-Perez, Ignacio; Carnevali, Luca; Kawasaki, Kazumasa; Fuentes-Aznar, Alfonso
2003-01-01
The computerized design, methods for generation, simulation of meshing, and enhanced stress analysis of modified involute helical gears is presented. The approaches proposed for modification of conventional involute helical gears are based on conjugation of double-crowned pinion with a conventional helical involute gear. Double-crowning of the pinion means deviation of cross-profile from an involute one and deviation in longitudinal direction from a helicoid surface. Using the method developed, the pinion-gear tooth surfaces are in point-contact, the bearing contact is localized and oriented longitudinally, and edge contact is avoided. Also, the influence of errors of alignment on the shift of bearing contact, vibration, and noise are reduced substantially. The theory developed is illustrated with numerical examples that confirm the advantages of the gear drives of the modified geometry in comparison with conventional helical involute gears.
Chou, Ting-Chao
2011-01-01
The mass-action law based system analysis, via mathematical induction and deduction, leads to a generalized theory and algorithm that allows computerized simulation of dose-effect dynamics with small-size experiments using a small number of data points in vitro, in animals, and in humans. The median-effect equation of the mass-action law, deduced from over 300 mechanism-specific equations, has been shown to be the unified theory that serves as the common link for complicated biomedical systems. After using the median-effect principle as the common denominator, its applications are mechanism-independent, drug unit-independent, and dynamic order-independent, and can be used generally for single-drug analysis or for multiple drug combinations in constant or non-constant ratios. Since the "median" is the common link and universal reference point in biological systems, this general framework leads to computerized quantitative bio-informatics for econo-green bio-research in broad disciplines. Specific applications of the theory, especially relevant to drug discovery, drug combination, and clinical trials, have been cited or illustrated in terms of algorithms, experimental design, and computerized simulation for data analysis. Lessons learned from cancer research during the past fifty years provide a valuable opportunity to reflect, to improve the conventional divergent approach, and to introduce a new convergent avenue, based on the mass-action law principle, for efficient cancer drug discovery and low-cost drug development. PMID:22016837
ERIC Educational Resources Information Center
Aagard, James A.; Ansbro, Thomas M.
The Naval Enlisted Professional Development Information System (NEPDIS) was designed to function as a fully computerized information assembly and analysis system to support labor force, personnel, and training management. The NEPDIS comprises separate training development, instructional, training record and evaluation, career development, and…
ERIC Educational Resources Information Center
Bailey, Kathleen M., Ed.; And Others
This collection of 10 selected conference papers report the results of language testing research. Titles and authors are: "Computerized Adaptive Language Testing: A Spanish Placement Exam" (Jerry W. Larson); "Utilizing Rasch Analysis to Detect Cheating on Language Examinations" (Harold S. Madsen); "Scalar Analysis of…
An Empirical Analysis of Negotiation Teaching Methodologies Using a Negotiation Support System
ERIC Educational Resources Information Center
Jones, Beth H.; Jones, Gary H.; Banerjee, Debasish
2005-01-01
This article describes an experiment that compared different methods of teaching undergraduates the fundamentals of negotiation analysis. Using student subjects, we compared three conditions: reading, lecture-only, and lecture accompanied by student use of a computerized negotiation support system (NSS). The authors examined two facets of…
Digital Data Collection and Analysis: Application for Clinical Practice
ERIC Educational Resources Information Center
Ingram, Kelly; Bunta, Ferenc; Ingram, David
2004-01-01
Technology for digital speech recording and speech analysis is now readily available for all clinicians who use a computer. This article discusses some advantages of moving from analog to digital recordings and outlines basic recording procedures. The purpose of this article is to familiarize speech-language pathologists with computerized audio…
Simplified bridge load rating methodology using the national bridge inventory file : user manual
DOT National Transportation Integrated Search
1988-08-01
The purpose of this research was to develop a computerized system to determine the adequacy of a bridge or group of bridges to carry specified overload vehicles. The system utilizes two levels of analysis. The Level 1 analysis is the basic rating sys...
Simplified bridge load rating methodology using the national bridge inventory file : program listing
DOT National Transportation Integrated Search
1987-08-01
The purpose of this research was to develop a computerized system to determine the adequacy of a bridge or group of bridges to carry specified overload vehicles. The system utilizes two levels of analysis. The Level 1 analysis is the basic rating sys...
Optical analysis of crystal growth
NASA Technical Reports Server (NTRS)
Workman, Gary L.; Passeur, Andrea; Harper, Sabrina
1994-01-01
Processing and data reduction of holographic images from Spacelab presents some interesting challenges in determining the effects of microgravity on crystal growth processes. Evaluation of several processing techniques, including the Computerized Holographic Image Processing System and the image processing software ITEX150, will provide fundamental information for holographic analysis of the space flight data.
Seniors' Online Communities: A Quantitative Content Analysis
ERIC Educational Resources Information Center
Nimrod, Galit
2010-01-01
Purpose: To examine the contents and characteristics of seniors' online communities and to explore their potential benefits to older adults. Design and Methods: Quantitative content analysis of a full year's data from 14 leading online communities using a novel computerized system. The overall database included 686,283 messages. Results: There was…
Forward and backward uncertainty propagation: an oxidation ditch modelling example.
Abusam, A; Keesman, K J; van Straten, G
2003-01-01
In the field of water technology, forward uncertainty propagation is frequently used, whereas backward uncertainty propagation is rarely used. In forward uncertainty analysis, one moves from a given (or assumed) parameter subspace towards the corresponding distribution of the output or objective function. In backward uncertainty propagation, one moves in the reverse direction, from the distribution function towards the parameter subspace. Backward uncertainty propagation, which is a generalisation of parameter estimation error analysis, gives information essential for designing experimental or monitoring programmes, and for tighter bounding of parameter uncertainty intervals. The procedure of carrying out backward uncertainty propagation is illustrated in this technical note by a working example of an oxidation ditch wastewater treatment plant. The results obtained demonstrate that essential information can be gained by carrying out backward uncertainty propagation analysis.
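A toy Monte Carlo sketch of the two directions: forward propagation maps an assumed parameter subspace to an output distribution, while a simple rejection-style backward step recovers the parameter subspace consistent with a target output band. The first-order-decay model and all ranges are invented, not the note's oxidation-ditch model.

```python
import numpy as np

rng = np.random.default_rng(3)

def effluent(k_decay, hrt):
    """Toy first-order-decay treatment model (influent fixed at 100 mg/L)."""
    return 100.0 * np.exp(-k_decay * hrt)

# Forward propagation: assumed parameter subspace -> output distribution.
k = rng.uniform(0.10, 0.30, 50_000)    # decay rate (1/h), assumed range
hrt = rng.uniform(8.0, 16.0, 50_000)   # hydraulic retention time (h)
out = effluent(k, hrt)
print("forward: effluent 5-95% range (mg/L):",
      np.round(np.percentile(out, [5, 95]), 1))

# Backward propagation: desired output band -> admissible parameter subspace.
keep = (out > 5.0) & (out < 15.0)
print("backward: k in", np.round([k[keep].min(), k[keep].max()], 3),
      ", HRT in", np.round([hrt[keep].min(), hrt[keep].max()], 1))
```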
Pretest uncertainty analysis for chemical rocket engine tests
NASA Technical Reports Server (NTRS)
Davidian, Kenneth J.
1987-01-01
A parametric pretest uncertainty analysis has been performed for a chemical rocket engine test at a unique 1000:1 area ratio altitude test facility. Results from the parametric study provide the error limits required in order to maintain a maximum uncertainty of 1 percent on specific impulse. Equations used in the uncertainty analysis are presented.
Acute asthma severity identification of expert system flow in emergency department
NASA Astrophysics Data System (ADS)
Sharif, Nurul Atikah Mohd; Ahmad, Norazura; Ahmad, Nazihah; Desa, Wan Laailatul Hanim Mat
2017-11-01
The integration of computerized systems into healthcare management helps smooth the documentation of patient records, provides ready access to knowledge and clinical practice guidelines, and offers advice for decision making. Exploiting advances in artificial intelligence, such as fuzzy logic and rule-based reasoning, may improve the management of the emergency department under conditions of uncertainty and strengthen the adherence of medical practice to clinical guidelines. This paper presents details of the emergency department flow for acute asthma severity identification, with the embedding of the acute asthma severity identification expert system (AASIES). Currently, AASIES is still in a preliminary stage of system validation. However, it is hoped that the implementation of AASIES in asthma bay management can reduce the use of paper for manual documentation and be a pioneer for the development of a more complex decision support system to smooth ED management and make it more systematic.
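To illustrate the kind of fuzzy-logic severity grading such a system might embed (AASIES's actual rules are not published here), the sketch below classifies severity from peak expiratory flow with triangular memberships; all breakpoints are purely illustrative and are not clinical guidance.

```python
def tri(x, a, b, c):
    """Triangular fuzzy membership with support [a, c] and peak at b."""
    if x <= a or x >= c:
        return 0.0
    return (x - a) / (b - a) if x <= b else (c - x) / (c - b)

def asthma_severity(pef_percent):
    """Fuzzy severity grades from peak expiratory flow (% of personal best).
    Breakpoints are purely illustrative, not clinical guidance."""
    grades = {
        "severe":   tri(pef_percent, -1, 30, 55),
        "moderate": tri(pef_percent, 40, 60, 80),
        "mild":     tri(pef_percent, 70, 90, 121),
    }
    return max(grades, key=grades.get), grades

label, memberships = asthma_severity(65.0)
print(label, {g: round(m, 2) for g, m in memberships.items()})
```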
Uncertainties in stormwater runoff data collection from a small urban catchment, Southeast China.
Huang, Jinliang; Tu, Zhenshun; Du, Pengfei; Lin, Jie; Li, Qingsheng
2010-01-01
Monitoring data are often used to identify stormwater runoff characteristics and in stormwater runoff modelling without consideration of their inherent uncertainties. Integrated with discrete sample analysis and error propagation analysis, this study attempted to quantify the uncertainties of discrete chemical oxygen demand (COD), total suspended solids (TSS) concentration, stormwater flowrate, stormwater event volumes, COD event mean concentration (EMC), and COD event loads in terms of flow measurement, sample collection, storage and laboratory analysis. The results showed that the uncertainties due to sample collection, storage and laboratory analysis of COD from stormwater runoff are 13.99%, 19.48% and 12.28%. Meanwhile, flow measurement uncertainty was 12.82%, and the sample collection uncertainty of TSS from stormwater runoff was 31.63%. Based on the law of propagation of uncertainties, the uncertainties regarding event flow volume, COD EMC and COD event loads were quantified as 7.03%, 10.26% and 18.47%.
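The law of propagation of uncertainties used in the study reduces, for independent relative errors in a single measurement chain or a product, to a root-sum-of-squares combination, as in this sketch. It reuses the paper's component percentages but is a simplified combination, so it will not reproduce the paper's event-load figure of 18.47%, which comes from the fuller derivation.

```python
import math

def combined_relative_uncertainty(*components_percent):
    """Combine independent relative (%) uncertainties in quadrature."""
    return math.sqrt(sum(c**2 for c in components_percent))

# One discrete COD sample: collection, storage, and laboratory analysis errors.
u_cod_sample = combined_relative_uncertainty(13.99, 19.48, 12.28)

# Event load L = EMC * V is a product, so relative uncertainties combine in
# quadrature. This simplified rule gives ~12.4% from the paper's EMC (10.26%)
# and volume (7.03%) figures; the paper's 18.47% reflects its fuller model.
u_load = combined_relative_uncertainty(10.26, 7.03)

print(f"single COD sample: {u_cod_sample:.1f}%, event load (simplified): {u_load:.1f}%")
```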
Woodgate, Roberta L; West, Christina H; Tailor, Ketan
2014-01-01
Until now, most existentially focused cancer research has been conducted within adult populations. Only a handful of qualitative investigations have captured the experiences of children with cancer relative to themes such as existential fear and finitude, meaning/meaninglessness, uncertainty, authenticity, and inauthenticity. This article aimed to provide a deeper understanding of the existential challenges faced by children living with cancer. An interpretive, descriptive qualitative research approach was used. Thirteen children (8-17 years) undergoing treatment for cancer participated. Children participated in individual open-ended interviews and also had the opportunity to journal their experiences in a computerized drawing tool. The 4 main themes that emerged in relation to the existential challenges experienced by children with cancer included (1) existential worry, (2) existential vacuum, (3) existential longing, and (4) existential growth. The drawing tool within the computer diary was found to be particularly beneficial in assisting children to express the existential challenges that they had previously been unable to articulate in words. Children moved between existential anxiety and existential growth within the cancer world. The expressive means of drawing pictures gave children a therapeutic space to explore and work at understanding the existential challenges experienced. This research provides evidence that the active engagement of children's imaginations through the use of a computer-drawing tool may have significant therapeutic value for children with cancer. As well, the findings support the importance of nurses "being there" for young patients with cancer in their time of despair.
Detailed Uncertainty Analysis of the ZEM-3 Measurement System
NASA Technical Reports Server (NTRS)
Mackey, Jon; Sehirlioglu, Alp; Dynys, Fred
2014-01-01
The measurement of Seebeck coefficient and electrical resistivity is critical to the investigation of all thermoelectric systems. It follows that the measurement uncertainty must be well understood to report ZT values which are accurate and trustworthy. A detailed uncertainty analysis of the ZEM-3 measurement system has been performed. The uncertainty analysis calculates error in the electrical resistivity measurement as a result of sample geometry tolerance, probe geometry tolerance, statistical error, and multi-meter uncertainty. The uncertainty on Seebeck coefficient includes probe wire correction factors, statistical error, multi-meter uncertainty, and most importantly the cold-finger effect. The cold-finger effect plagues all potentiometric (four-probe) Seebeck measurement systems, as heat parasitically transfers through thermocouple probes. The effect leads to an asymmetric over-estimation of the Seebeck coefficient. A thermal finite element analysis allows for quantification of the phenomenon, and provides an estimate on the uncertainty of the Seebeck coefficient. The thermoelectric power factor has been found to have an uncertainty of 9-14% at high temperature and 9% near room temperature.
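Since the power factor is PF = S²/ρ, its relative uncertainty follows from the standard propagation rule, as in this small sketch with illustrative input uncertainties (not the paper's values).

```python
import math

def power_factor_uncertainty(rel_u_seebeck, rel_u_resistivity):
    """Relative uncertainty of PF = S^2 / rho for independent errors:
    u_PF/PF = sqrt((2 * u_S/S)^2 + (u_rho/rho)^2)."""
    return math.sqrt((2 * rel_u_seebeck)**2 + rel_u_resistivity**2)

# Illustrative inputs (fractions, not from the paper): 4% on Seebeck, 3% on rho.
print(f"{100 * power_factor_uncertainty(0.04, 0.03):.1f}% on the power factor")
```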
Wright-Berryman, Jennifer L; Salyers, Michelle P; O'Halloran, James P; Kemp, Aaron S; Mueser, Kim T; Diazoni, Amanda J
2013-12-01
To explore mental health consumer and provider responses to a computerized version of the Illness Management and Recovery (IMR) program. Semistructured interviews were conducted to gather data from 6 providers and 12 consumers who participated in a computerized prototype of the IMR program. An inductive-consensus-based approach was used to analyze the interview responses. Qualitative analysis revealed consumers perceived various personal benefits and ease of use afforded by the new technology platform. Consumers also highly valued provider assistance and offered several suggestions to improve the program. The largest perceived barriers to future implementation were lack of computer skills and access to computers. Similarly, IMR providers commented on its ease and convenience, and the reduction of time intensive material preparation. Providers also expressed that the use of technology creates more options for the consumer to access treatment. The technology was acceptable, easy to use, and well-liked by consumers and providers. Clinician assistance with technology was viewed as helpful to get clients started with the program, as lack of computer skills and access to computers was a concern. Access to materials between sessions appears to be desired; however, given perceived barriers of computer skills and computer access, additional supports may be needed for consumers to achieve full benefits of a computerized version of IMR. PsycINFO Database Record (c) 2013 APA, all rights reserved.
Jousimaa, Jukkapekka; Mäkelä, Marjukka; Kunnamo, Ilkka; MacLennan, Graeme; Grimshaw, Jeremy M
2002-01-01
To compare the effects of computerized and paper-based versions of guidelines on recently qualified physicians' consultation practices. Two-arm cluster randomized controlled trial. Physicians were randomized to receive computerized or textbook-based versions of the same guidelines for a 4-week study period. Physicians' compliance with guideline recommendations about laboratory, radiological, physical and other examinations, procedures, nonpharmacologic and pharmacologic treatments, physiotherapy, and referrals was measured by case note review. There were 139 recently qualified physicians working in 96 primary healthcare centers in Finland who participated in the study. Data on 4,633 patient encounters were abstracted, of which 3,484 were suitable for further analysis. Physicians' compliance with guidelines was high (over 80% for use of laboratory, radiology, physical examinations, and referrals). There were no significant differences in physicians' consultation practices in any of the measured outcomes between the computerized and textbook groups. Guidelines are a useful source of information for recently qualified physicians working in primary care. However, the method of presentation of the guidelines (electronic or paper) does not have an effect on guideline use or their impact on decisions. Other factors should be considered when choosing the method of presentation of guidelines, such as information-seeking time, ease of use during the consultation, ability to update, production costs, and the physician's own preferences.
NASA Astrophysics Data System (ADS)
Tamez-Peña, José G.; Barbu-McInnis, Monica; Totterman, Saara
2006-03-01
Abnormal MR findings, including cartilage defects, cartilage denuded areas, osteophytes, and bone marrow edema (BME), are used in staging and evaluating the degree of osteoarthritis (OA) in the knee. The locations of the abnormal findings have been correlated to the degree of pain and stiffness of the joint in the same location. The definition of the anatomic region in MR images is not always an objective task, due to the lack of clear anatomical features. This uncertainty causes variance in the location of the abnormality between readers and time points. Therefore, it is important to have a reproducible system to define the anatomic regions. This work presents a computerized approach to define the different anatomic knee regions. The approach is based on an algorithm that uses unique features of the femur and its spatial relation in the extended knee. The femur features are found from three-dimensional segmentation maps of the knee. From the segmentation maps, the algorithm automatically divides the femur cartilage into five anatomic regions: trochlea, medial weight bearing area, lateral weight bearing area, posterior medial femoral condyle, and posterior lateral femoral condyle. Furthermore, the algorithm automatically labels the medial and lateral tibia cartilage. The unsupervised definition of the knee regions allows a reproducible way to evaluate regional OA changes. This work also presents the application of this automated algorithm for the regional analysis of the cartilage tissue.
NASA Astrophysics Data System (ADS)
1983-01-01
FMC Corporation conducts extensive proof lift tests and computerized analysis to ensure that its cranes can lift rated-capacity loads of up to one million pounds in a wide range of applications. In their analysis work, engineers make use of a computer program supplied by COSMIC. Called Analysis of Beam Columns, the program is used as part of the required analysis for determining bending moments, deflections and critical load for latticed crane booms.
Computerized adaptive control weld skate with CCTV weld guidance project
NASA Technical Reports Server (NTRS)
Wall, W. A.
1976-01-01
This report summarizes progress of the automatic computerized weld skate development portion of the Computerized Weld Skate with Closed Circuit Television (CCTV) Arc Guidance Project. The main goal of the project is to develop an automatic welding skate demonstration model equipped with CCTV weld guidance. The three main goals of the overall project are to: (1) develop a demonstration model computerized weld skate system, (2) develop a demonstration model automatic CCTV guidance system, and (3) integrate the two systems into a demonstration model of computerized weld skate with CCTV weld guidance for welding contoured parts.
An Application of Computerized Axial Tomography (CAT) Technology to Mass Raid Tracking
1989-08-01
ESD-TR-89-305, MTR-10542. An Application of Computerized Axial Tomography (CAT) Technology to Mass Raid Tracking, by John K. Barr, August 1989. Keywords: Computerized Axial Tomography (CAT) scanner; Electronic Support Measures (ESM); fusion.
Increasing profitability through computerization.
Sokol, D J
1988-01-01
The author explores the pragmatic or financial justification for computerizing a dental practice and discusses a computerized approach to precollection and collection for the dental office. The article also deals with the use of computerized correspondence to augment the recall policy of the office and to help generate new patient referrals and discusses the pros and cons of utilizing a dental computer service bureau in implementing these policies.
NASA Technical Reports Server (NTRS)
Davidian, Kenneth J.; Dieck, Ronald H.; Chuang, Isaac
1987-01-01
A preliminary uncertainty analysis was performed for the High Area Ratio Rocket Nozzle test program which took place at the altitude test capsule of the Rocket Engine Test Facility at the NASA Lewis Research Center. Results from the study establish the uncertainty of measured and calculated parameters required for the calculation of rocket engine specific impulse. A generalized description of the uncertainty methodology used is provided. Specific equations and a detailed description of the analysis is presented. Verification of the uncertainty analysis model was performed by comparison with results from the experimental program's data reduction code. Final results include an uncertainty for specific impulse of 1.30 percent. The largest contributors to this uncertainty were calibration errors from the test capsule pressure and thrust measurement devices.
Uncertainty Analysis of NASA Glenn's 8- by 6-Foot Supersonic Wind Tunnel
NASA Technical Reports Server (NTRS)
Stephens, Julia E.; Hubbard, Erin P.; Walter, Joel A.; McElroy, Tyler
2016-01-01
An analysis was performed to determine the measurement uncertainty of the Mach number of the 8- by 6-foot Supersonic Wind Tunnel at the NASA Glenn Research Center. This paper details the analysis process used, including methods for handling limited data and complicated data correlations. Due to the complexity of the equations used, a Monte Carlo method was utilized for this uncertainty analysis. A summary of the findings is presented as pertains to understanding what the uncertainties are, how they impact various research tests in the facility, and methods of reducing the uncertainties in the future.
Yoo, Kyung Hee
2007-06-01
This study was conducted to investigate the correlations among uncertainty, mastery, and appraisal of uncertainty in mothers of hospitalized children. Self-report questionnaires were used to measure the variables: uncertainty, mastery, and appraisal of uncertainty. In the data analysis, the SPSSWIN 12.0 program was utilized for descriptive statistics, Pearson's correlation coefficients, and regression analysis. Reliability of the instruments was Cronbach's alpha = .84-.94. Mastery correlated negatively with uncertainty (r = -.444, p < .001) and with danger appraisal of uncertainty (r = -.514, p < .001). In the regression on danger appraisal of uncertainty, uncertainty and mastery were significant predictors, explaining 39.9% of the variance. Mastery was a significant mediating factor between uncertainty and danger appraisal of uncertainty in mothers of hospitalized children. Therefore, nursing interventions which improve mastery must be developed for mothers of hospitalized children.
McCarthy, Jillian H; Hogan, Tiffany P; Beukelman, David R; Schwarz, Ilsa E
2015-05-01
Spelling is an important skill for individuals who rely on augmentative and alternative communication (AAC). The purpose of this study was to investigate how computerized sounding out influenced spelling accuracy of pseudo-words. Computerized sounding out was defined as a word elongated, thus providing an opportunity for a child to hear all the sounds in the word at a slower rate. Seven children with cerebral palsy, four who use AAC and three who do not, participated in a single-subject AB design. The results of the study indicated that the use of computerized sounding out increased the phonologic accuracy of the pseudo-words produced by participants. The study provides preliminary evidence for the use of computerized sounding out during spelling tasks for children with cerebral palsy who do and do not use AAC. Future directions and clinical implications are discussed. We investigated how computerized sounding out influenced spelling accuracy of pseudo-words for children with complex communication needs who did and did not use augmentative and alternative communication (AAC). Results indicated that the use of computerized sounding out increased the phonologic accuracy of the pseudo-words by participants, suggesting that computerized sounding out might assist in more accurate spelling for children who use AAC. Future research is needed to determine how language and reading abilities influence the use of computerized sounding out with children who have a range of speech intelligibility abilities and do and do not use AAC.
Conjoint analysis of nature tourism values in Bahia, Brazil
Thomas Holmes; Chris Zinkhan; Keith Alger; D. Evan Mercer
1996-01-01
This paper uses conjoint analysis to estimate the value of nature tourism attributes in a threatened forest ecosystem in northeastern Brazil. Computerized interviews were conducted using a paired comparison design. An ordinal interpretation of the rating scale was used and marginal utilities were estimated using ordered probit. The empirical results showed that the...
Purdue Plane Structures Analyzer II : a computerized wood engineering system
S. K. Suddarth; R. W. Wolfe
1984-01-01
The Purdue Plane Structures Analyzer (PPSA) is a computer program developed specifically for the analysis of wood structures. It uses recognized analysis procedures, in conjunction with recommendations of the 1982 National Design Specification for Wood Construction, to determine stresses and deflections of wood trusses and frames. The program offers several options for...
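The core of such a program is the direct stiffness method; a minimal sketch for a two-bar plane truss is shown below, with invented geometry and wood-like properties, far short of PPSA's design-specification checks.

```python
import numpy as np

def truss_stiffness(xy_i, xy_j, E, A):
    """4x4 global stiffness matrix of a 2D pin-jointed bar element."""
    dx, dy = np.subtract(xy_j, xy_i)
    L = np.hypot(dx, dy)
    c, s = dx / L, dy / L
    return E * A / L * np.outer([-c, -s, c, s], [-c, -s, c, s])

# Two-bar truss: nodes 0 and 1 pinned, node 2 loaded at the apex; units N, mm.
nodes = np.array([[0.0, 0.0], [2000.0, 0.0], [1000.0, 1500.0]])
elements = [(0, 2), (1, 2)]
E, A = 11000.0, 5000.0          # wood-like: E ~ 11 GPa in N/mm^2, A in mm^2

K = np.zeros((6, 6))
for i, j in elements:            # assemble global stiffness matrix
    dofs = [2*i, 2*i + 1, 2*j, 2*j + 1]
    K[np.ix_(dofs, dofs)] += truss_stiffness(nodes[i], nodes[j], E, A)

F = np.zeros(6); F[5] = -10000.0   # 10 kN downward at node 2
free = [4, 5]                      # nodes 0 and 1 fully pinned
u = np.zeros(6)
u[free] = np.linalg.solve(K[np.ix_(free, free)], F[free])
print("apex deflection (mm):", np.round(u[4:], 3))
```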
ERIC Educational Resources Information Center
Duffy, Larry B.; And Others
The Educational Technology Assessment Model (ETAM) is a set of comprehensive procedures and variables for the analysis, synthesis, and decision making, in regard to the benefits, costs, and risks associated with introducing technical innovations in education and training. This final report summarizes the analysis, design, and development…
Uncertainty Analysis of the NASA Glenn 8x6 Supersonic Wind Tunnel
NASA Technical Reports Server (NTRS)
Stephens, Julia; Hubbard, Erin; Walter, Joel; McElroy, Tyler
2016-01-01
This paper presents methods and results of a detailed measurement uncertainty analysis that was performed for the 8- by 6-foot Supersonic Wind Tunnel located at the NASA Glenn Research Center. The statistical methods and engineering judgments used to estimate elemental uncertainties are described. The Monte Carlo method of propagating uncertainty was selected to determine the uncertainty of calculated variables of interest. A detailed description of the Monte Carlo method as applied for this analysis is provided. Detailed uncertainty results for the uncertainty in average free stream Mach number as well as other variables of interest are provided. All results are presented as random (variation in observed values about a true value), systematic (potential offset between observed and true value), and total (random and systematic combined) uncertainty. The largest sources contributing to uncertainty are determined and potential improvement opportunities for the facility are investigated.
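A stripped-down sketch of Monte Carlo propagation of this kind: random and systematic pressure-measurement errors are sampled and pushed through the isentropic Mach relation M = sqrt(5((p0/p)^(2/7) - 1)) for gamma = 1.4. The nominal values and error magnitudes are invented, and the facility's actual data reduction is more complex.

```python
import numpy as np

rng = np.random.default_rng(4)
N = 100_000

def mach(p_total, p_static, gamma=1.4):
    """Isentropic Mach number from total and static pressure."""
    g = (gamma - 1.0) / gamma
    return np.sqrt(2.0 / (gamma - 1.0) * ((p_total / p_static)**g - 1.0))

# Nominal readings (kPa, roughly Mach 2) with invented error magnitudes.
p0 = 170.0 + rng.normal(0, 0.3, N) + rng.normal(0, 0.5, N)   # random + systematic
ps = 21.7 + rng.normal(0, 0.10, N) + rng.normal(0, 0.15, N)  # random + systematic

m = mach(p0, ps)
print(f"Mach = {m.mean():.3f} +/- {2 * m.std():.3f} (95% coverage, combined)")
```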
Evaluation of roadway sites for queue management.
DOT National Transportation Integrated Search
1991-01-01
This study addresses the problem of queueing on highway facilities, for which a large number of computerized methods for the analysis of different queueing situations are available. A three-tier classification system of the methodologies was used with ...
National Voice Response System (VRS) Implementation Plan Alternatives Study
DOT National Transportation Integrated Search
1979-07-01
This study examines the alternatives available to implement a national Voice Response System (VRS) for automated preflight weather briefings and flight plan filing. Four major hardware configurations are discussed. A computerized analysis model was d...
Analysis of computer capabilities of Pacific Northwest paratransit providers
DOT National Transportation Integrated Search
1996-07-01
The major project objectives are to quantify the computer capabilities and to determine the computerization needs of paratransit operators in the Northwest, and to create a training program to assist paratransit operators in developing realistic spec...
ERIC Educational Resources Information Center
Moore, John W., Ed.
1981-01-01
Provides short descriptions of chemists' applications of computers in instruction: an interactive instructional program for Instrumental-Qualitative Organic Analysis; question-and-answer exercises in organic chemistry; computerized organic nomenclature drills; integration of theoretical and descriptive materials; acid-base titration simulation;…
Computerized traffic data analysis system.
DOT National Transportation Integrated Search
1975-01-01
The techniques of collecting detailed traffic data for a given site are well known. A popular method uses chart recorders in combination with various vehicle sensing devices, such as tape switches, to provide an accurate pictorial display of the traff...
Blood platelets: computerized morphometry applied on optical images
NASA Astrophysics Data System (ADS)
Korobova, Farida V.; Ivanova, Tatyana V.; Gusev, Alexander A.; Shmarov, Dmitry A.; Kozinets, Gennady I.
2000-11-01
A new technology for computerized morphometric image analysis of platelets on blood smears was developed. The basis of the device is the analysis of cytophotometric and morphometric parameters of platelets. Geometric and optical parameters of platelets from 35 donors, from platelet concentrates, and from 15 patients with haemorrhagic thrombocythaemia were investigated, and average values for the area, diameter, their logarithms, and the optical density of platelets in normal samples were obtained. The distributions of the areas, diameters, and optical densities of platelets of patients with haemorrhagic thrombocythaemia differed from those of healthy people. After a course of treatment these values approached normal. The important characteristics of platelets in platelet concentrates after three days of storage were within normal limits, but differed from those of whole-blood platelets. The data obtained allow quantitative standards to be introduced into the investigation of platelets of healthy people and in various alterations of thrombocytopoiesis.
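A minimal sketch of the measurement step in such a system: threshold the smear image, label connected components, and report per-object area, equivalent diameter, and mean optical density. The threshold, pixel size, and toy image are assumptions, and scipy.ndimage stands in for the authors' software.

```python
import numpy as np
from scipy import ndimage

def platelet_morphometry(image, threshold, pixel_um=0.2):
    """Per-object area (um^2), equivalent circular diameter (um), and mean
    optical density from a grayscale smear image (background near zero)."""
    mask = image > threshold                      # segment stained objects
    labels, n = ndimage.label(mask)               # connected components
    ids = list(range(1, n + 1))
    areas_um2 = np.asarray(ndimage.sum(mask, labels, index=ids)) * pixel_um**2
    diam_um = 2.0 * np.sqrt(areas_um2 / np.pi)    # equivalent circular diameter
    density = np.asarray(ndimage.mean(image, labels, index=ids))
    return areas_um2, diam_um, density

# Toy image: two "platelets" of different size and staining intensity.
img = np.zeros((50, 50))
img[5:12, 5:12] = 0.8
img[30:38, 30:40] = 0.6
print(platelet_morphometry(img, threshold=0.3))
```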
NASA Technical Reports Server (NTRS)
1982-01-01
Farmers are increasingly turning to aerial applications of pesticides, fertilizers and other materials. Uneven distribution of the chemicals is sometimes caused by worn nozzles, improper alignment of spray nozzles or system leaks. If this happens, the job must be redone, with added expense to both the pilot and customer. Traditional pattern analysis techniques take days or weeks. Utilizing NASA's wind tunnel and computer validation technology, Dr. Roth of Oklahoma State University (OSU) developed a system for providing answers within minutes. Called the Rapid Distribution Pattern Evaluation System, the OSU system consists of a 100-foot measurement frame tied in to computerized analysis and readout equipment. The system is mobile, delivered by trailer to airfields in agricultural areas where OSU conducts educational "fly-ins." A fly-in typically draws 50 to 100 aerial applicators, researchers, chemical suppliers and regulatory officials. An applicator can have his spray pattern checked, and a computerized readout, available in five to 12 minutes, provides information for correcting shortcomings in the distribution pattern.
Study of metallic structural design concepts for an arrow wing supersonic cruise configuration
NASA Technical Reports Server (NTRS)
Turner, M. J.; Grande, D. L.
1977-01-01
A structural design study was made, to assess the relative merits of various metallic structural concepts and materials for an advanced supersonic aircraft cruising at Mach 2.7. Preliminary studies were made to ensure compliance of the configuration with general design criteria, integrate the propulsion system with the airframe, select structural concepts and materials, and define an efficient structural arrangement. An advanced computerized structural design system was used, in conjunction with a relatively large, complex finite element model, for detailed analysis and sizing of structural members to satisfy strength and flutter criteria. A baseline aircraft design was developed for assessment of current technology. Criteria, analysis methods, and results are presented. The effect on design methods of using the computerized structural design system was appraised, and recommendations are presented concerning further development of design tools, development of materials and structural concepts, and research on basic technology.
The development of the Medical Literature Analysis and Retrieval System (MEDLARS)*
Dee, Cheryl Rae
2007-01-01
Objective: The research provides a chronology of the US National Library of Medicine's (NLM's) contribution to access to the world's biomedical literature through its computerization of biomedical indexes, particularly the Medical Literature Analysis and Retrieval System (MEDLARS). Method: Using material gathered from NLM's archives and from personal interviews with people associated with developing MEDLARS and its associated systems, the author discusses key events in the history of MEDLARS. Discussion: From the development of the early mechanized bibliographic retrieval systems of the 1940s to the beginnings of online, interactive computerized bibliographic search systems of the early 1970s chronicled here, NLM's contributions to automation and bibliographic retrieval have been extensive. Conclusion: As NLM's technological experience and expertise grew, innovative bibliographic storage and retrieval systems emerged. NLM's accomplishments regarding MEDLARS were cutting edge, placing the library at the forefront of incorporating mechanization and technologies into medical information systems. PMID:17971889
NASA Technical Reports Server (NTRS)
Lundqvist, S.; Margolis, J.; Reid, J.
1982-01-01
Foreign-gas broadening coefficients have been measured for selected lines of ozone in the 9.2 micron region and for several R-branch lines of nitric oxide in the 5.4 micron region using a computerized tunable diode laser spectrometer. The data analysis showed the importance of fitting a Lorentzian line shape out to several times the halfwidth to obtain a correct value of the broadening coefficient. The measured broadening coefficients of nitric oxide were in good agreement with those obtained by Abels and Shaw (1966). The results of the analysis of eleven lines in the ν1 band and five lines in the ν3 band of ozone show a transition-dependent broadening coefficient. The average values of the foreign-gas broadening coefficients for the measured ν1 and ν3 lines are 0.075 and 0.073 cm⁻¹ atm⁻¹, respectively.
NASA Technical Reports Server (NTRS)
Wang, T.; Simon, T. W.
1988-01-01
Development of a recent experimental program to investigate the effects of streamwise curvature on boundary layer transition required making a bendable, heated and instrumented test wall, a rather nonconventional surface. The present paper describes this surface, the design choices made in its development and how uncertainty analysis was used, beginning early in the test program, to make such design choices. Published uncertainty analysis techniques were found to be of great value, but it became clear that another step, one herein called the pre-test analysis, would aid program development. Finally, it is shown how the uncertainty analysis was used to determine whether the test surface was qualified for service.
Uncertainty analysis of hydrological modeling in a tropical area using different algorithms
NASA Astrophysics Data System (ADS)
Rafiei Emam, Ammar; Kappas, Martin; Fassnacht, Steven; Linh, Nguyen Hoang Khanh
2018-01-01
Hydrological modeling outputs are subject to uncertainty resulting from different sources of error (e.g., error in input data, model structure, and model parameters), making quantification of uncertainty in hydrological modeling imperative in order to improve the reliability of modeling results. Uncertainty analysis must also contend with difficulties in the calibration of hydrological models, which increase further in areas with data scarcity. The purpose of this study is to apply four uncertainty analysis algorithms to a semi-distributed hydrological model, quantify different sources of uncertainty (especially parameter uncertainty) and evaluate their performance. In this study, the Soil and Water Assessment Tool (SWAT) eco-hydrological model was implemented for a watershed in central Vietnam. The sensitivity of parameters was analyzed, and the model was calibrated. The uncertainty analysis for the hydrological model was conducted based on four algorithms: Generalized Likelihood Uncertainty Estimation (GLUE), Sequential Uncertainty Fitting (SUFI), the Parameter Solution method (ParaSol) and Particle Swarm Optimization (PSO). The performance of the algorithms was compared using the P-factor and R-factor, the coefficient of determination (R²), the Nash-Sutcliffe efficiency (NSE) and percent bias (PBIAS). The results showed the high performance of SUFI and PSO, with P-factor > 0.83, R-factor < 0.56, R² > 0.91, NSE > 0.89, and 0.18
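The comparison statistics named above can be computed directly from an ensemble of behavioural simulations. The sketch below is a minimal, hypothetical implementation (the 95PPU band is taken between the 2.5th and 97.5th percentiles, and the ensemble mean stands in for the best simulation); it is an illustration of the metrics, not the SWAT-CUP code.

```python
import numpy as np

def performance_metrics(obs, sim_ensemble):
    """Metrics used to compare uncertainty algorithms.

    obs          : (n,) observed discharges
    sim_ensemble : (m, n) behavioural simulations (e.g., from GLUE/SUFI/PSO)
    """
    lo = np.percentile(sim_ensemble, 2.5, axis=0)    # 95PPU lower bound
    hi = np.percentile(sim_ensemble, 97.5, axis=0)   # 95PPU upper bound
    p_factor = np.mean((obs >= lo) & (obs <= hi))    # share of obs inside the band
    r_factor = np.mean(hi - lo) / np.std(obs)        # band width vs. obs spread
    best = sim_ensemble.mean(axis=0)                 # one representative simulation
    nse = 1 - np.sum((obs - best) ** 2) / np.sum((obs - obs.mean()) ** 2)
    pbias = 100 * np.sum(obs - best) / np.sum(obs)
    r2 = np.corrcoef(obs, best)[0, 1] ** 2
    return p_factor, r_factor, nse, pbias, r2

rng = np.random.default_rng(0)
obs = rng.gamma(2.0, 5.0, 365)                       # hypothetical daily discharge
ens = obs * rng.uniform(0.7, 1.3, (500, 365))        # fake behavioural ensemble
print(performance_metrics(obs, ens))
```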
NASA Astrophysics Data System (ADS)
Feizizadeh, Bakhtiar; Jankowski, Piotr; Blaschke, Thomas
2014-03-01
GIS multicriteria decision analysis (MCDA) techniques are increasingly used in landslide susceptibility mapping for the prediction of future hazards, land use planning, and hazard preparedness. However, the uncertainties associated with MCDA techniques are inevitable, and model outcomes are open to multiple types of uncertainty. In this paper, we present a systematic approach to uncertainty and sensitivity analysis, assessing the uncertainty of landslide susceptibility maps produced with GIS-MCDA techniques. A new spatially-explicit approach and Dempster-Shafer Theory (DST) are employed to assess the uncertainties associated with two MCDA techniques, namely Analytical Hierarchy Process (AHP) and Ordered Weighted Averaging (OWA), implemented in GIS. The methodology is composed of three different phases. First, weights are computed to express the relative importance of factors (criteria) for landslide susceptibility. Next, the uncertainty and sensitivity of landslide susceptibility is analyzed as a function of weights using Monte Carlo Simulation and Global Sensitivity Analysis. Finally, the results are validated using a landslide inventory database and by applying DST. The comparison of the obtained landslide susceptibility maps of both MCDA techniques with known landslides shows that AHP outperforms OWA. However, the OWA-generated landslide susceptibility map shows lower uncertainty than the AHP-generated map. The results demonstrate that further improvement in the accuracy of GIS-based MCDA can be achieved by employing an integrated uncertainty-sensitivity analysis approach, in which the uncertainty of the landslide susceptibility model is decomposed and attributed to the model's criteria weights.
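A minimal sketch of the Monte Carlo step described above: susceptibility is recomputed under randomly perturbed criteria weights, and the per-cell spread of scores serves as an uncertainty measure. The layers, base weights, and the ±10% perturbation are hypothetical placeholders, not values from the study.

```python
import numpy as np

rng = np.random.default_rng(42)
n_cells, n_criteria, n_runs = 10_000, 5, 1_000

# Standardized criteria layers (flattened raster cells), hypothetical values
criteria = rng.random((n_cells, n_criteria))
base_weights = np.array([0.35, 0.25, 0.20, 0.12, 0.08])   # e.g., from AHP

scores = np.empty((n_runs, n_cells))
for k in range(n_runs):
    # Perturb weights (+/-10% uniform noise) and renormalize to sum to 1
    w = base_weights * rng.uniform(0.9, 1.1, n_criteria)
    w /= w.sum()
    scores[k] = criteria @ w                   # weighted linear combination

susceptibility = scores.mean(axis=0)           # mean susceptibility per cell
uncertainty = scores.std(axis=0)               # per-cell sensitivity to the weights
print(uncertainty.mean())
```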
Kettler, Susanne; Kennedy, Marc; McNamara, Cronan; Oberdörfer, Regina; O'Mahony, Cian; Schnabel, Jürgen; Smith, Benjamin; Sprong, Corinne; Faludi, Roland; Tennant, David
2015-08-01
Uncertainty analysis is an important component of dietary exposure assessments, needed to understand correctly the strengths and limits of their results. Often, standard screening procedures are applied in a first step, which results in conservative estimates. If those screening procedures indicate a potential exceedance of health-based guidance values, more refined models are applied within the tiered approach. However, the sources and types of uncertainty in deterministic and probabilistic models can vary or differ. A key objective of this work has been the mapping of different sources and types of uncertainties to better understand how best to use uncertainty analysis to generate a more realistic comprehension of dietary exposure. In dietary exposure assessments, uncertainties can be introduced by knowledge gaps about the exposure scenario, the parameters and the model itself. With this mapping, general and model-independent uncertainties have been identified and described, as well as those which can be introduced and influenced by the specific model during the tiered approach. This analysis identifies general uncertainties that are common to point estimates (screening or deterministic methods) and probabilistic exposure assessment methods. To provide further clarity, general sources of uncertainty affecting many dietary exposure assessments should be separated from model-specific uncertainties. Copyright © 2015 The Authors. Published by Elsevier Ltd. All rights reserved.
The Application Programming Interface (API) for Uncertainty Analysis, Sensitivity Analysis, and Parameter Estimation (UA/SA/PE API) (also known as Calibration, Optimization and Sensitivity and Uncertainty (CUSO)) was developed in a joint effort between several members of both ...
Design of aerosol face masks for children using computerized 3D face analysis.
Amirav, Israel; Luder, Anthony S; Halamish, Asaf; Raviv, Dan; Kimmel, Ron; Waisman, Dan; Newhouse, Michael T
2014-08-01
Aerosol masks were originally developed for adults and downsized for children. Overall fit to minimize dead space and a tight seal are problematic, because children's faces undergo rapid and marked topographic and internal anthropometric changes in their first few months/years of life. Facial three-dimensional (3D) anthropometric data were used to design an optimized pediatric mask. Children's faces (n=271, aged 1 month to 4 years) were scanned with 3D technology. Data for the distance from the bridge of the nose to the tip of the chin (H) and the width of the mouth opening (W) were used to categorize the scans into "small," "medium," and "large" "clusters." "Average" masks were developed from each cluster to provide an optimal seal with minimal dead space. The resulting computerized contour, W and H, were used to develop the SootherMask® that enables children, "suckling" on their own pacifier, to keep the mask on their face, mainly by means of subatmospheric pressure. The relatively wide and flexible rim of the mask accommodates variations in facial size within and between clusters. Unique pediatric face masks were developed based on anthropometric data obtained through computerized 3D face analysis. These masks follow facial contours and gently seal to the child's face, and thus may minimize aerosol leakage and dead space.
Dalino Ciaramella, Paolo; Vertemati, Maurizio; Petrella, Duccio; Bonacina, Edgardo; Grossrubatscher, Erika; Duregon, Eleonora; Volante, Marco; Papotti, Mauro; Loli, Paola
2017-07-01
Diagnosis of benign and purely localized malignant adrenocortical lesions is still a complex issue. Moreover, histology-based diagnosis may suffer from subjectivity due to inter- and intra-observer variation. The aim of the present study was to assess, by computerized morphometry, the morphological features of benign and malignant adrenocortical neoplasms. Eleven adrenocortical adenomas (ACA) were compared with 18 adrenocortical cancers (ACC). All specimens were stained with H&E, the cellular proliferation marker Ki-67 and reticulin. We generated a morphometric model based on the analysis of the volume fractions occupied by Ki-67 positive and negative cells (nuclei and cytoplasm) and the vascular and inflammatory compartments; we also analyzed the surface fraction occupied by reticulin. We compared the quantitative data on Ki-67 obtained by morphometry with the quantification resulting from the pathologist's visual reading. The volume fraction of Ki-67 positive cells in ACCs was higher than in ACAs. The volume fraction of nuclei in unit volume and the nuclear/cytoplasmic ratio in both Ki-67 negative and Ki-67 positive cells were prominent in ACCs. The surface fraction of reticulin was considerably lower in ACCs. Our computerized morphometric model is simple, reproducible and can be used by the pathologist in the histological workup of adrenocortical tumors to achieve precise and reader-independent quantification of several morphological characteristics of adrenocortical tumors. Copyright © 2017 Elsevier GmbH. All rights reserved.
The Computerized Laboratory Notebook concept for genetic toxicology experimentation and testing.
Strauss, G H; Stanford, W L; Berkowitz, S J
1989-03-01
We describe a microcomputer system utilizing the Computerized Laboratory Notebook (CLN) concept developed in our laboratory for the purpose of automating the Battery of Leukocyte Tests (BLT). The BLT was designed to evaluate blood specimens for toxic, immunotoxic, and genotoxic effects after in vivo exposure to putative mutagens. A system was developed with the advantages of low cost, limited spatial requirements, ease of use for personnel inexperienced with computers, and applicability to specific testing yet flexibility for experimentation. This system eliminates cumbersome record keeping and repetitive analysis inherent in genetic toxicology bioassays. Statistical analysis of the vast quantity of data produced by the BLT would not be feasible without a central database. Our central database is maintained by an integrated package which we have adapted to develop the CLN. The clonal assay of lymphocyte mutagenesis (CALM) section of the CLN is demonstrated. PC-Slaves expand the microcomputer to multiple workstations so that our computerized notebook can be used next to a hood while other work is done in an office and instrument room simultaneously. Communication with peripheral instruments is an indispensable part of many laboratory operations, and we present a representative program, written to acquire and analyze CALM data, for communicating with both a liquid scintillation counter and an ELISA plate reader. In conclusion we discuss how our computer system could easily be adapted to the needs of other laboratories.
Lichtner, Valentina; Dowding, Dawn; Closs, S José
2015-12-24
Assessment and management of pain in patients with dementia is known to be challenging, due to patients' cognitive and/or communication difficulties. In the UK, pain in hospital is managed through regular assessments, with the use of pain intensity scores as triggers for action. The aim of this study was to understand current pain assessment practices, in order to later inform the development of a decision support tool designed to improve the management of pain for people with dementia in hospital. An exploratory study was conducted in four hospitals in the UK (11 wards), with observations of patients with dementia (n = 31), interviews of staff (n = 52) and patients' family members (n = 4) and documentary analysis. A thematic analysis was carried out, structured along dimensions of decision making. This paper focuses on the emergent themes related to the use of assessment tools and pain intensity scores. A variety of tools were used to record pain intensity, usually with numerical scales. None of the tools in actual use had been specifically designed for patients with cognitive impairment. With patients with more severe dementia, the patient's body language and other cues were studied to infer pain intensity and then a score entered on behalf of the patient. Information regarding the temporality of pain and changes in pain experience (rather than a score at a single point in time) seemed to be most useful to the assessment of pain. Given the inherent uncertainty of the meaning of pain scores for patients with dementia, numerical scales were used with caution. Numerical scores triggered action but their meaning was relative - to the patient, to the clinician, to the time of recording and to the purpose of documenting. There are implications for the use of such data and for the design of computerized decision support systems. Decision support interventions should include personalized alerting cut-off scores for individual patients, display pain scores over time and integrate professional narratives, mitigating uncertainties around single pain scores for patients with dementia.
Durability reliability analysis for corroding concrete structures under uncertainty
NASA Astrophysics Data System (ADS)
Zhang, Hao
2018-02-01
This paper presents a durability reliability analysis of reinforced concrete structures subject to the action of marine chloride. The focus is to provide insight into the role of epistemic uncertainties on durability reliability. The corrosion model involves a number of variables whose probabilistic characteristics cannot be fully determined due to the limited availability of supporting data. All sources of uncertainty, both aleatory and epistemic, should be included in the reliability analysis. Two methods are available to formulate the epistemic uncertainty: the imprecise probability-based method and the purely probabilistic method in which the epistemic uncertainties are modeled as random variables. The paper illustrates how the epistemic uncertainties are modeled and propagated in the two methods, and shows how epistemic uncertainties govern the durability reliability.
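The purely probabilistic treatment of epistemic uncertainty mentioned above is often implemented as a nested (double-loop) Monte Carlo: epistemic variables are sampled in an outer loop and aleatory variability in an inner loop, yielding a distribution of failure probabilities rather than a single value. The sketch below illustrates the idea with entirely hypothetical chloride-content numbers; it is not the paper's corrosion model.

```python
import numpy as np

rng = np.random.default_rng(1)
n_epistemic, n_aleatory = 200, 5_000
threshold = 0.6   # hypothetical critical chloride content, % weight of cement

pf_samples = np.empty(n_epistemic)
for i in range(n_epistemic):
    # Outer loop: epistemic draw of poorly known distribution parameters
    mu = rng.normal(0.35, 0.05)       # mean chloride content, uncertain from sparse data
    sigma = rng.uniform(0.05, 0.15)   # its spread, also uncertain
    # Inner loop: aleatory variability of chloride content given those parameters
    chloride = rng.normal(mu, sigma, n_aleatory)
    pf_samples[i] = np.mean(chloride > threshold)

# Epistemic uncertainty shows up as a distribution of failure probabilities
print(f"P_f median {np.median(pf_samples):.3f}, "
      f"90% epistemic interval [{np.percentile(pf_samples, 5):.3f}, "
      f"{np.percentile(pf_samples, 95):.3f}]")
```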
Tian, Yuan; Hassmiller Lich, Kristen; Osgood, Nathaniel D; Eom, Kirsten; Matchar, David B
2016-11-01
As health services researchers and decision makers tackle more difficult problems using simulation models, the number of parameters and the corresponding degree of uncertainty have increased. This often results in reduced confidence in such complex models to guide decision making. To demonstrate a systematic approach of linked sensitivity analysis, calibration, and uncertainty analysis to improve confidence in complex models. Four techniques were integrated and applied to a System Dynamics stroke model of US veterans, which was developed to inform systemwide intervention and research planning: Morris method (sensitivity analysis), multistart Powell hill-climbing algorithm and generalized likelihood uncertainty estimation (calibration), and Monte Carlo simulation (uncertainty analysis). Of 60 uncertain parameters, sensitivity analysis identified 29 needing calibration, 7 that did not need calibration but significantly influenced key stroke outcomes, and 24 not influential to calibration or stroke outcomes that were fixed at their best guess values. One thousand alternative well-calibrated baselines were obtained to reflect calibration uncertainty and brought into uncertainty analysis. The initial stroke incidence rate among veterans was identified as the most influential uncertain parameter, for which further data should be collected. That said, accounting for current uncertainty, the analysis of 15 distinct prevention and treatment interventions provided a robust conclusion that hypertension control for all veterans would yield the largest gain in quality-adjusted life years. For complex health care models, a mixed approach was applied to examine the uncertainty surrounding key stroke outcomes and the robustness of conclusions. We demonstrate that this rigorous approach can be practical and advocate for such analysis to promote understanding of the limits of certainty in applying models to current decisions and to guide future data collection. © The Author(s) 2016.
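For readers unfamiliar with the Morris method used in the screening step, the sketch below implements a bare-bones elementary-effects calculation on the unit hypercube: μ* (mean absolute elementary effect) ranks parameter influence, while σ flags nonlinearity or interactions. The toy three-parameter model is a hypothetical stand-in for the stroke model, and the trajectory scheme is a simple variant rather than the authors' exact design.

```python
import numpy as np

def morris_screening(model, n_params, n_trajectories=50, delta=0.1, seed=0):
    """Crude Morris elementary-effects screening on the unit hypercube."""
    rng = np.random.default_rng(seed)
    effects = np.zeros((n_trajectories, n_params))
    for t in range(n_trajectories):
        x = rng.random(n_params)
        y0 = model(x)
        for j in rng.permutation(n_params):      # one-at-a-time perturbations
            step = delta if x[j] + delta <= 1 else -delta
            x[j] += step
            y1 = model(x)
            effects[t, j] = (y1 - y0) / step
            y0 = y1
    mu_star = np.abs(effects).mean(axis=0)       # mean |elementary effect|
    sigma = effects.std(axis=0)                  # nonlinearity / interactions
    return mu_star, sigma

# Toy stand-in: parameters 0 and 2 matter, parameter 1 barely does
mu_star, sigma = morris_screening(lambda x: x[0] ** 2 + 0.01 * x[1] + 3 * x[2], 3)
print(mu_star, sigma)
```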
NASA Astrophysics Data System (ADS)
Uchiyama, Yoshikazu; Gao, Xin; Hara, Takeshi; Fujita, Hiroshi; Ando, Hiromichi; Yamakawa, Hiroyasu; Asano, Takahiko; Kato, Hiroki; Iwama, Toru; Kanematsu, Masayuki; Hoshi, Hiroaki
2008-03-01
The detection of unruptured aneurysms is a major subject in magnetic resonance angiography (MRA). However, their accurate detection is often difficult because of the overlapping between the aneurysm and the adjacent vessels on maximum intensity projection images. The purpose of this study is to develop a computerized method for the detection of unruptured aneurysms in order to assist radiologists in image interpretation. The vessel regions were first segmented using gray-level thresholding and a region growing technique. The gradient concentration (GC) filter was then employed for the enhancement of the aneurysms. The initial candidates were identified in the GC image using a gray-level threshold. For the elimination of false positives (FPs), we determined shape features and an anatomical location feature. Finally, rule-based schemes and quadratic discriminant analysis were employed along with these features for distinguishing between the aneurysms and the FPs. The sensitivity for the detection of unruptured aneurysms was 90.0% with 1.52 FPs per patient. Our computerized scheme can be useful in assisting the radiologists in the detection of unruptured aneurysms in MRA images.
Computerized Measurement of Negative Symptoms in Schizophrenia
Cohen, Alex S.; Alpert, Murray; Nienow, Tasha M.; Dinzeo, Thomas J.; Docherty, Nancy M.
2008-01-01
Accurate measurement of negative symptoms is crucial for understanding and treating schizophrenia. However, current measurement strategies are reliant on subjective symptom rating scales which often have psychometric and practical limitations. Computerized analysis of patients’ speech offers a sophisticated and objective means of evaluating negative symptoms. The present study examined the feasibility and validity of using widely-available acoustic and lexical-analytic software to measure flat affect, alogia and anhedonia (via positive emotion). These measures were examined in their relationships to clinically-rated negative symptoms and social functioning. Natural speech samples were collected and analyzed for 14 patients with clinically-rated flat affect, 46 patients without flat affect and 19 healthy controls. The computer-based inflection and speech rate measures significantly discriminated patients with flat affect from controls, and the computer-based measure of alogia and negative emotion significantly discriminated the flat and non-flat patients. Both the computer and clinical measures of positive emotion/anhedonia corresponded to functioning impairments. The computerized method of assessing negative symptoms offered a number of advantages over the symptom scale-based approach. PMID:17920078
Infrared spectroscopy for the determination of hydrocarbon types in jet fuels
NASA Technical Reports Server (NTRS)
Buchar, C. S.
1981-01-01
The concentration of hydrocarbon types in conventional jet fuels and synfuels can be measured using a computerized infrared spectrophotometer. The computerized spectrophotometer is calibrated using a fuel of known aromatic and olefinic content. Once calibration is completed, other fuels can be rapidly analyzed using an analytical program built into the computer. The concentration of saturates can be calculated as 100 percent minus the sum of the aromatic and olefinic concentrations. The analysis of a number of jet fuels produced an average standard deviation of 1.76 percent for aromatic types and one of 3.99 percent for olefinic types. Other substances such as oils and organic mixtures can be analyzed for their hydrocarbon content.
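The saturates-by-difference calculation, together with a simple propagation of the reported standard deviations, might look as follows. The concentrations are hypothetical, and the quadrature combination assumes the aromatic and olefinic measurement errors are independent, which the abstract does not state.

```python
import math

aromatics, olefins = 18.3, 2.1           # hypothetical measured concentrations, vol %
sd_aromatics, sd_olefins = 1.76, 3.99    # reported average standard deviations

saturates = 100.0 - aromatics - olefins  # saturates computed by difference
# If the two measurement errors are independent, they add in quadrature
sd_saturates = math.sqrt(sd_aromatics ** 2 + sd_olefins ** 2)
print(f"saturates = {saturates:.1f} +/- {sd_saturates:.2f} %")
```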
11 CFR 9033.12 - Production of computerized information.
Code of Federal Regulations, 2012 CFR
2012-01-01
... conform to the technical specifications, including file requirements, described in the Federal Election Commission's Computerized Magnetic Media Requirements for title 26 Candidates/Committees Receiving Federal... outstanding campaign obligations. (b) Organization of computerized information and technical specifications...
11 CFR 9033.12 - Production of computerized information.
Code of Federal Regulations, 2014 CFR
2014-01-01
... conform to the technical specifications, including file requirements, described in the Federal Election Commission's Computerized Magnetic Media Requirements for title 26 Candidates/Committees Receiving Federal... outstanding campaign obligations. (b) Organization of computerized information and technical specifications...
11 CFR 9033.12 - Production of computerized information.
Code of Federal Regulations, 2013 CFR
2013-01-01
... conform to the technical specifications, including file requirements, described in the Federal Election Commission's Computerized Magnetic Media Requirements for title 26 Candidates/Committees Receiving Federal... outstanding campaign obligations. (b) Organization of computerized information and technical specifications...
Microcomputer Network for Computerized Adaptive Testing (CAT)
1984-03-01
NPRDC TR 84-33. Microcomputer Network for Computerized Adaptive Testing (CAT). Baldwin Quan, Thomas A. Park, Gary Sandahl, John H... Keywords: computerized adaptive testing (CAT); Bayesian sequential testing.
Computerized Adaptive Testing (CAT): A User Manual
1984-03-12
NPRDC TR 84-32. Computerized Adaptive Testing (CAT): A User Manual. Susan Hardwick, Lawrence Eastman, Ross Cooper; Rehab Group, Incorporated, San... Final report, Aug 1981-June 1982. A joint-service effort is underway to develop a computerized adaptive testing (CAT) system and to...
Uncertainty in monitoring E. coli concentrations in streams and stormwater runoff
NASA Astrophysics Data System (ADS)
Harmel, R. D.; Hathaway, J. M.; Wagner, K. L.; Wolfe, J. E.; Karthikeyan, R.; Francesconi, W.; McCarthy, D. T.
2016-03-01
Microbial contamination of surface waters, a substantial public health concern throughout the world, is typically identified by fecal indicator bacteria such as Escherichia coli. Thus, monitoring E. coli concentrations is critical to evaluate current conditions, determine restoration effectiveness, and inform model development and calibration. An often overlooked component of these monitoring and modeling activities is understanding the inherent random and systematic uncertainty present in measured data. In this research, a review and subsequent analysis were performed to identify, document, and analyze measurement uncertainty of E. coli data collected in stream flow and stormwater runoff as individual discrete samples or throughout a single runoff event. Data on the uncertainty contributed by sample collection, sample preservation/storage, and laboratory analysis in measured E. coli concentrations were compiled and analyzed, and differences in sampling method and data quality scenarios were compared. The analysis showed that: (1) manual integrated sampling produced the lowest random and systematic uncertainty in individual samples, but automated sampling typically produced the lowest uncertainty when sampling throughout runoff events; (2) sample collection procedures often contributed the highest amount of uncertainty, although laboratory analysis introduced substantial random uncertainty and preservation/storage introduced substantial systematic uncertainty under some scenarios; and (3) the uncertainty in measured E. coli concentrations was greater than that of sediment and nutrients, but the difference was not as great as may be assumed. This comprehensive analysis of uncertainty in E. coli concentrations measured in streamflow and runoff should provide valuable insight for designing E. coli monitoring projects, reducing uncertainty in quality assurance efforts, regulatory and policy decision making, and fate and transport modeling.
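A common convention in this line of work, and a hedged reading of the analysis above, is to combine independent random uncertainty components in quadrature while summing signed systematic biases. The component values below are hypothetical placeholders, not the paper's compiled data.

```python
import math

# Hypothetical relative uncertainties (+/- %) for one sampling scenario,
# loosely following the procedures named in the abstract
random_components = {"collection": 25.0, "storage": 10.0, "lab_analysis": 30.0}
systematic_components = {"collection": 15.0, "storage": -20.0}   # signed biases

# Independent random errors combine in quadrature; biases add directly
random_total = math.sqrt(sum(u ** 2 for u in random_components.values()))
systematic_total = sum(systematic_components.values())
print(f"random: +/-{random_total:.1f} %, net systematic bias: {systematic_total:+.1f} %")
```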
Computer Analysis of Eye Blood-Vessel Images
NASA Technical Reports Server (NTRS)
Wall, R. J.; White, B. S.
1984-01-01
Technique rapidly diagnoses diabetes mellitus. Photographs of "whites" of patients' eyes scanned by computerized image analyzer programmed to quantify density of small blood vessels in conjunctiva. Comparison with data base of known normal and diabetic patients facilitates rapid diagnosis.
Why Data Linkage? The Importance of CODES (Crash Outcome Data Evaluation System)
DOT National Transportation Integrated Search
1996-06-01
This report briefly explains the computerized linked data system, Crash Outcome Data Evaluation System (CODES), which provides more in-depth accident data analysis. The linking of data helps researchers to understand the nature of traffic acciden...
Resources for Improving Computerized Learning Environments.
ERIC Educational Resources Information Center
Yeaman, Andrew R. J.
1989-01-01
Presents an annotated review of human factors literature that discusses computerized environments. Topics discussed include the application of office automation practices to educational environments; video display terminal (VDT) workstations; health and safety hazards; planning educational facilities; ergonomics in computerized offices; and…
An uncertainty analysis of wildfire modeling [Chapter 13
Karin Riley; Matthew Thompson
2017-01-01
Before fire models can be understood, evaluated, and effectively applied to support decision making, model-based uncertainties must be analyzed. In this chapter, we identify and classify sources of uncertainty using an established analytical framework, and summarize results graphically in an uncertainty matrix. Our analysis facilitates characterization of the...
Analytic uncertainty and sensitivity analysis of models with input correlations
NASA Astrophysics Data System (ADS)
Zhu, Yueying; Wang, Qiuping A.; Li, Wei; Cai, Xu
2018-03-01
Probabilistic uncertainty analysis is a common means of evaluating mathematical models. In mathematical modeling, the uncertainty in input variables is specified through distribution laws. Its contribution to the uncertainty in the model response is usually analyzed by assuming that input variables are independent of each other. However, correlated parameters often arise in practical applications. In the present paper, an analytic method is built for the uncertainty and sensitivity analysis of models in the presence of input correlations. With the method, it is straightforward to identify the importance of the independence and correlations of input variables in determining the model response. This allows one to decide whether or not the input correlations should be considered in practice. Numerical examples demonstrate the effectiveness and validity of our analytic method in the analysis of general models. A practical application of the method to the uncertainty and sensitivity analysis of a deterministic HIV model is also presented.
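The standard analytic device for this problem is the first-order (Taylor) approximation Var(y) ≈ gᵀ Σ g, where g is the gradient of the model at the nominal point and Σ the input covariance matrix; it reduces to the familiar independent-inputs formula when Σ is diagonal. The sketch below, with hypothetical gradients and variances, shows how a positive input correlation can substantially change the propagated variance; it illustrates the general idea, not the authors' specific method.

```python
import numpy as np

def first_order_variance(grad, cov):
    """First-order (Taylor) variance of y = f(x): grad^T Cov grad."""
    return grad @ cov @ grad

grad = np.array([2.0, -1.0])                     # df/dx1, df/dx2 at the nominal point
var_x = np.array([0.04, 0.09])                   # input variances
for rho in (0.0, 0.8):                           # independent vs. correlated inputs
    off = rho * np.sqrt(var_x.prod())
    cov = np.array([[var_x[0], off],
                    [off, var_x[1]]])
    print(f"rho={rho}: Var(y) = {first_order_variance(grad, cov):.4f}")
```

With opposite-signed sensitivities, the positive correlation here cancels part of the variance (0.25 at rho=0 versus 0.058 at rho=0.8), which is exactly the effect that independence assumptions hide.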
An overview of selected information storage and retrieval issues in computerized document processing
NASA Technical Reports Server (NTRS)
Dominick, Wayne D. (Editor); Ihebuzor, Valentine U.
1984-01-01
The rapid development of computerized information storage and retrieval techniques has introduced the possibility of extending the word processing concept to document processing. A major advantage of computerized document processing is that the immense speed and storage capacity of computers relieve publishers of the tedious manual editing and composition tasks of traditional publishing. Furthermore, computerized document processing provides an author with centralized control, the lack of which is a handicap of the traditional publishing operation. A survey of some computerized document processing techniques is presented with emphasis on related information storage and retrieval issues. String matching algorithms are considered central to document information storage and retrieval and are also discussed.
Model parameter uncertainty analysis for an annual field-scale P loss model
NASA Astrophysics Data System (ADS)
Bolster, Carl H.; Vadas, Peter A.; Boykin, Debbie
2016-08-01
Phosphorus (P) fate and transport models are important tools for developing and evaluating conservation practices aimed at reducing P losses from agricultural fields. Because all models are simplifications of complex systems, there will exist an inherent amount of uncertainty associated with their predictions. It is therefore important that efforts be directed at identifying, quantifying, and communicating the different sources of model uncertainties. In this study, we conducted an uncertainty analysis with the Annual P Loss Estimator (APLE) model. Our analysis included calculating parameter uncertainties and confidence and prediction intervals for five internal regression equations in APLE. We also estimated uncertainties of the model input variables based on values reported in the literature. We then predicted P loss for a suite of fields under different management and climatic conditions while accounting for uncertainties in the model parameters and inputs, and compared the relative contributions of these two sources of uncertainty to the overall uncertainty associated with predictions of P loss. Both the overall magnitude of the prediction uncertainties and the relative contributions of the two sources of uncertainty varied depending on management practices and field characteristics. This was due to differences in the number of model input variables and the uncertainties in the regression equations associated with each P loss pathway. Inspection of the uncertainties in the five regression equations brought attention to a previously unrecognized limitation with the equation used to partition surface-applied fertilizer P between leaching and runoff losses. As a result, an alternate equation was identified that provided similar predictions with much less uncertainty. Our results demonstrate how a thorough uncertainty and model residual analysis can be used to identify limitations with a model. Such insight can then be used to guide future data collection and model development and evaluation efforts.
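The relative contributions of parameter versus input uncertainty can be compared by Monte Carlo: sample each source alone, then both together, and compare the resulting prediction spreads. The sketch below does this for a deliberately trivial one-pathway stand-in model with hypothetical numbers; APLE itself combines several regression equations.

```python
import numpy as np

rng = np.random.default_rng(5)
n = 50_000

def model(beta, x):
    """Trivial stand-in for one P loss pathway: loss = coefficient * input."""
    return beta * x

beta_hat, beta_se = 0.30, 0.05          # fitted coefficient and its standard error
x_hat, x_se = 12.0, 3.0                 # field input (e.g., soil P) and its spread

full = model(rng.normal(beta_hat, beta_se, n), rng.normal(x_hat, x_se, n))
param_only = model(rng.normal(beta_hat, beta_se, n), x_hat)    # inputs fixed
input_only = model(beta_hat, rng.normal(x_hat, x_se, n))       # parameters fixed

for name, y in [("total", full), ("parameter", param_only), ("input", input_only)]:
    print(f"{name:9s} prediction sd: {y.std():.3f}")
```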
A structured analysis of uncertainty surrounding modeled impacts of groundwater-extraction rules
NASA Astrophysics Data System (ADS)
Guillaume, Joseph H. A.; Qureshi, M. Ejaz; Jakeman, Anthony J.
2012-08-01
Integrating economic and groundwater models for groundwater management can help improve understanding of the trade-offs involved between conflicting socioeconomic and biophysical objectives. However, there is significant uncertainty in most strategic decision-making situations, including in the models constructed to represent them. If not addressed, this uncertainty may be used to challenge the legitimacy of the models and decisions made using them. In this context, a preliminary uncertainty analysis was conducted of a dynamic coupled economic-groundwater model aimed at assessing groundwater extraction rules. The analysis demonstrates how a variety of uncertainties in such a model can be addressed. A number of methods are used, including propagation of scenarios and bounds on parameters, multiple models, block bootstrap time-series sampling and robust linear regression for model calibration. These methods are described within the context of a theoretical uncertainty management framework, using a set of fundamental uncertainty management tasks and an uncertainty typology.
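Of the methods listed, block bootstrap time-series sampling is perhaps the least familiar: it resamples contiguous blocks of a series so that short-range autocorrelation survives into each replicate. A minimal moving-block implementation, with a hypothetical recharge series standing in for the study's data, might look like this:

```python
import numpy as np

def moving_block_bootstrap(series, block_len, n_boot, seed=0):
    """Resample a time series in overlapping blocks to preserve autocorrelation."""
    rng = np.random.default_rng(seed)
    n = len(series)
    blocks = np.lib.stride_tricks.sliding_window_view(series, block_len)
    n_blocks = int(np.ceil(n / block_len))
    out = np.empty((n_boot, n_blocks * block_len))
    for b in range(n_boot):
        idx = rng.integers(0, len(blocks), n_blocks)  # pick random block starts
        out[b] = blocks[idx].ravel()                  # concatenate the blocks
    return out[:, :n]                                 # trim to original length

# Hypothetical annual recharge series; replicates would feed the coupled model
recharge = np.array([120, 95, 140, 80, 60, 155, 130, 70, 90, 110], dtype=float)
replicates = moving_block_bootstrap(recharge, block_len=3, n_boot=500)
print(replicates.mean(), replicates.std())
```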
DOE Office of Scientific and Technical Information (OSTI.GOV)
Lee, D.W.; Yambert, M.W.; Kocher, D.C.
1994-12-31
A performance assessment of the operating Solid Waste Storage Area 6 (SWSA 6) facility for the disposal of low-level radioactive waste at the Oak Ridge National Laboratory has been prepared to provide the technical basis for demonstrating compliance with the performance objectives of DOE Order 5820.2A, Chapter III. An analysis of the uncertainty incorporated into the assessment was performed which addressed the quantitative uncertainty in the data used by the models, the subjective uncertainty associated with the models used for assessing performance of the disposal facility and site, and the uncertainty in the models used for estimating dose and human exposure. The results of the uncertainty analysis were used to interpret results and to formulate conclusions about the performance assessment. This paper discusses the approach taken in analyzing the uncertainty in the performance assessment and the role of uncertainty in performance assessment.
Normal computerized Q wave measurements in healthy young athletes.
Saini, Divakar; Grober, Aaron F; Hadley, David; Froelicher, Victor
Recent expert consensus statements have sought to decrease false positive rates of electrocardiographic abnormalities requiring further evaluation when screening young athletes. These statements are largely based on traditional ECG patterns and have not considered computerized measurements. Our objective was to define the normal limits for Q wave measurements from the digitally recorded ECGs of healthy young athletes. All athletes were categorized by sex and level of participation (high school, college, and professional), and underwent screening ECGs with routine pre-participation physicals, which were electronically captured and analyzed. Q wave amplitude, area and duration were recorded for athletes with Q wave amplitudes greater than 0.5 mm at standard paper amplitude display (1 mV/10 mm). ANOVA analyses were performed to determine differences in these parameters among all groups. A positive ECG was defined by our Stanford Computerized Criteria as exceeding the 99th percentile for Q wave area in 2 or more leads. Proportions testing was used to compare the Seattle Conference Q wave criteria with our data-driven criteria. In total, 2073 athletes were screened. Significant differences in Q wave amplitude, duration and area were identified both by sex and level of participation. When applying our Stanford Computerized Criteria and the Seattle criteria to our cohort, two largely different groups of athletes are identified as having abnormal Q waves. Computer analysis of athletes' ECGs should be included in future studies that have greater numbers, more diversity and adequate end points. Copyright © 2017 Elsevier Inc. All rights reserved.
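The two-lead 99th percentile rule described above is straightforward to apply to a matrix of computer-measured Q wave values. The sketch below uses synthetic gamma-distributed areas merely to exercise the logic; it is a hypothetical illustration, not the Stanford Computerized Criteria implementation.

```python
import numpy as np

def flag_abnormal_q(q_area, pct=99):
    """Flag athletes whose Q wave area exceeds the cohort's 99th percentile
    in two or more leads (mimicking the two-lead rule described above).

    q_area : (n_athletes, n_leads) computer-measured Q wave areas
    """
    cutoff = np.percentile(q_area, pct, axis=0)   # per-lead 99th percentile
    exceed = q_area > cutoff                      # boolean exceedance matrix
    return exceed.sum(axis=1) >= 2                # >= 2 leads -> positive ECG

rng = np.random.default_rng(7)
q_area = rng.gamma(shape=2.0, scale=0.5, size=(2073, 12))   # synthetic cohort
print(f"{flag_abnormal_q(q_area).mean():.1%} screened positive")
```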
Towards a Framework for Developing Semantic Relatedness Reference Standards
Pakhomov, Serguei V.S.; Pedersen, Ted; McInnes, Bridget; Melton, Genevieve B.; Ruggieri, Alexander; Chute, Christopher G.
2010-01-01
Our objective is to develop a framework for creating reference standards for functional testing of computerized measures of semantic relatedness. Currently, research on computerized approaches to semantic relatedness between biomedical concepts relies on reference standards created for specific purposes using a variety of methods for their analysis. In most cases, these reference standards are not publicly available and the published information provided in manuscripts that evaluate computerized semantic relatedness measurement approaches is not sufficient to reproduce the results. Our proposed framework is based on the experiences of medical informatics and computational linguistics communities and addresses practical and theoretical issues with creating reference standards for semantic relatedness. We demonstrate the use of the framework on a pilot set of 101 medical term pairs rated for semantic relatedness by 13 medical coding experts. While the reliability of this particular reference standard is in the “moderate” range; we show that using clustering and factor analyses offers a data-driven approach to finding systematic differences among raters and identifying groups of potential outliers. We test two ontology-based measures of relatedness and provide both the reference standard containing individual ratings and the R program used to analyze the ratings as open-source. Currently, these resources are intended to be used to reproduce and compare results of studies involving computerized measures of semantic relatedness. Our framework may be extended to the development of reference standards in other research areas in medical informatics including automatic classification, information retrieval from medical records and vocabulary/ontology development. PMID:21044697
Computerized Sociometric Assessment for Preschool Children
ERIC Educational Resources Information Center
Endedijk, Hinke M.; Cillessen, Antonius H. N.
2015-01-01
In preschool classes, sociometric peer ratings are used to measure children's peer relationships. The current study examined a computerized version of preschool sociometric ratings. The psychometric properties were compared of computerized sociometric ratings and traditional peer ratings for preschoolers. The distributions, inter-item…
ERIC Educational Resources Information Center
Czuchry, Andrew J.; And Others
This user's guide describes the functions, logical operations and subroutines, input data requirements, and available outputs of the Training Requirements Analysis Model (TRAMOD), a computerized analytical life cycle cost modeling system for use in the early stages of system design. Operable in a stand-alone mode, TRAMOD can be used for the…
ERIC Educational Resources Information Center
Lan, Yi-Chin; Lo, Yu-Ling; Hsu, Ying-Shao
2014-01-01
Comprehension is the essence of reading. Finding appropriate and effective reading strategies to support students' reading comprehension has always been a critical issue for educators. This article presents findings from a meta-analysis of 17 studies of metacognitive strategy instruction on students' reading comprehension in computerized…
Numerical Uncertainty Quantification for Radiation Analysis Tools
NASA Technical Reports Server (NTRS)
Anderson, Brooke; Blattnig, Steve; Clowdsley, Martha
2007-01-01
Recently a new emphasis has been placed on engineering applications of space radiation analyses, and thus a systematic effort of Verification, Validation and Uncertainty Quantification (VV&UQ) of the tools commonly used for radiation analysis for vehicle design and mission planning has begun. There are two sources of uncertainty in geometric discretization addressed in this paper that need to be quantified in order to understand the total uncertainty in estimating space radiation exposures. One source of uncertainty is in ray tracing: as the number of rays increases, the associated uncertainty decreases, but the computational expense increases. Thus, a cost-benefit analysis optimizing computational time versus uncertainty is needed and is addressed in this paper. The second source of uncertainty results from the interpolation over the dose vs. depth curves that is needed to determine the radiation exposure. The question, then, is how many thicknesses are needed to obtain an accurate result. Thus, convergence testing is performed to quantify the uncertainty associated with interpolating over different shield thickness spatial grids.
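The second uncertainty source lends itself to a simple numerical experiment: tabulate dose at progressively finer sets of shield thicknesses, interpolate, and watch the interpolation error converge. The exponential dose-depth curve below is a hypothetical stand-in for real transport-code output, and linear interpolation stands in for whatever scheme the tools actually use.

```python
import numpy as np

def dose(t):
    """Hypothetical dose vs. shield depth (cGy vs g/cm^2), smooth and decreasing."""
    return 50.0 * np.exp(-t / 8.0) + 2.0

depth_eval = np.linspace(0.5, 29.5, 200)              # depths encountered along rays
exact = dose(depth_eval)

# How many tabulated thicknesses are needed before interpolation error converges?
for n_pts in (4, 8, 16, 32, 64):
    grid = np.linspace(0, 30, n_pts)
    approx = np.interp(depth_eval, grid, dose(grid))  # linear interpolation
    err = np.max(np.abs(approx - exact) / exact)
    print(f"{n_pts:3d} thicknesses -> max relative error {err:.2%}")
```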
NASA Astrophysics Data System (ADS)
Udupa, Jayaram K.; Odhner, Dewey; Falcao, Alexandre X.; Ciesielski, Krzysztof C.; Miranda, Paulo A. V.; Vaideeswaran, Pavithra; Mishra, Shipra; Grevera, George J.; Saboury, Babak; Torigian, Drew A.
2011-03-01
To make Quantitative Radiology (QR) a reality in routine clinical practice, computerized automatic anatomy recognition (AAR) becomes essential. As part of this larger goal, we present in this paper a novel fuzzy strategy for building bodywide group-wise anatomic models. They have the potential to handle uncertainties and variability in anatomy naturally and to be integrated with the fuzzy connectedness framework for image segmentation. Our approach is to build a family of models, called the Virtual Quantitative Human, representing normal adult subjects at a chosen resolution of the population variables (gender, age). Models are represented hierarchically, the descendants representing organs contained in parent organs. Based on an index of fuzziness of the models, 32 thorax data sets, and 10 organs defined in them, we found that the hierarchical approach to modeling can effectively handle the non-linear relationships in position, scale, and orientation that exist among organs in different patients.
2010-01-01
Background The Comprehensive Rural Health Services Project Ballabgarh, run by the All India Institute of Medical Sciences (AIIMS), New Delhi, has had a computerized Health Management Information System (HMIS) since 1988. The HMIS at Ballabgarh has undergone evolution and is currently in its third version, which uses generic and open source software. This study was conducted to evaluate the effectiveness of a computerized Health Management Information System in a rural health system in India. Methods The data for evaluation were collected by in-depth interviews of the stakeholders, i.e. program managers (authors) and health workers. Health workers from AIIMS and non-AIIMS Primary Health Centers were interviewed to compare the manual with the computerized HMIS. A cost comparison between the two methods was carried out based on market costs. The resource utilization for both manual and computerized HMIS was identified based on workers' interviews. Results There have been no major hardware problems in use of the computerized HMIS. More than 95% of data was found to be accurate. Health workers acknowledge the usefulness of HMIS in service delivery, data storage, generation of workplans and reports. For program managers, it provides a better tool for monitoring and supervision and data management. The initial cost incurred in computerization of two Primary Health Centers was estimated to be Indian Rupees (INR) 1,674,217 (USD 35,622). The equivalent annual incremental cost of capital items was estimated as INR 198,017 (USD 4,213). The annual savings are around INR 894,283 (USD 11,924). Conclusion The major advantage of computerization has been in saving of time of health workers in record keeping and report generation. The initial capital costs of computerization can be recovered within two years of implementation if the system is fully operational. Computerization has enabled implementation of a good system for service delivery, monitoring and supervision. PMID:21078203
Uncertainty as Knowledge: Constraints on Policy Choices Provided by Analysis of Uncertainty
NASA Astrophysics Data System (ADS)
Lewandowsky, S.; Risbey, J.; Smithson, M.; Newell, B. R.
2012-12-01
Uncertainty forms an integral part of climate science, and it is often cited in connection with arguments against mitigative action. We argue that an analysis of uncertainty must consider existing knowledge as well as uncertainty, and the two must be evaluated with respect to the outcomes and risks associated with possible policy options. Although risk judgments are inherently subjective, an analysis of the role of uncertainty within the climate system yields two constraints that are robust to a broad range of assumptions. Those constraints are that (a) greater uncertainty about the climate system is necessarily associated with greater expected damages from warming, and (b) greater uncertainty translates into a greater risk of the failure of mitigation efforts. These ordinal constraints are unaffected by subjective or cultural risk-perception factors, they are independent of the discount rate, and they are independent of the magnitude of the estimate for climate sensitivity. The constraints mean that any appeal to uncertainty must imply a stronger, rather than weaker, need to cut greenhouse gas emissions than in the absence of uncertainty.
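Constraint (a) follows from Jensen's inequality whenever damages are convex in warming: widening the spread of the sensitivity distribution while holding its mean fixed raises expected damages. The toy Monte Carlo below makes the point numerically; the lognormal warming distribution and quadratic damage function are hypothetical illustrations, not the authors' calculations.

```python
import numpy as np

rng = np.random.default_rng(3)

def damage(dT):
    """Convex (here quadratic) damage as a function of warming, illustrative only."""
    return dT ** 2

# Two warming distributions with the same mean (3 K) but different uncertainty
for sigma in (0.2, 0.6):
    mu = np.log(3.0) - sigma ** 2 / 2          # keeps the lognormal mean fixed at 3 K
    dT = rng.lognormal(mu, sigma, 100_000)
    print(f"sigma={sigma}: mean warming {dT.mean():.2f} K, "
          f"expected damage {damage(dT).mean():.2f}")
```

The wider distribution yields strictly larger expected damages even though the mean warming is unchanged, which is the ordinal constraint the abstract describes.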
NASA Astrophysics Data System (ADS)
Han, Feng; Zheng, Yi
2018-06-01
Significant input uncertainty is a major source of error in watershed water quality (WWQ) modeling. It remains challenging to address the input uncertainty in a rigorous Bayesian framework. This study develops the Bayesian Analysis of Input and Parametric Uncertainties (BAIPU), an approach for the joint analysis of input and parametric uncertainties through a tight coupling of Markov Chain Monte Carlo (MCMC) analysis and Bayesian Model Averaging (BMA). The formal likelihood function for this approach is derived considering a lag-1 autocorrelated, heteroscedastic, and Skew Exponential Power (SEP) distributed error model. A series of numerical experiments were performed based on a synthetic nitrate pollution case and on a real study case in the Newport Bay Watershed, California. The Soil and Water Assessment Tool (SWAT) and Differential Evolution Adaptive Metropolis (DREAM(ZS)) were used as the representative WWQ model and MCMC algorithm, respectively. The major findings include the following: (1) the BAIPU can be implemented and used to appropriately identify the uncertain parameters and characterize the predictive uncertainty; (2) the compensation effect between the input and parametric uncertainties can seriously mislead the modeling based management decisions, if the input uncertainty is not explicitly accounted for; (3) the BAIPU accounts for the interaction between the input and parametric uncertainties and therefore provides more accurate calibration and uncertainty results than a sequential analysis of the uncertainties; and (4) the BAIPU quantifies the credibility of different input assumptions on a statistical basis and can be implemented as an effective inverse modeling approach to the joint inference of parameters and inputs.
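To give a flavour of the formal likelihood described above, the sketch below evaluates a simplified version: residuals standardized by a heteroscedastic scale sigma = a + b*sim and decorrelated with a lag-1 (AR(1)) coefficient rho. A Gaussian density replaces the Skew Exponential Power distribution for brevity, so this is an assumption-laden simplification of the BAIPU likelihood, not its implementation.

```python
import numpy as np

def log_likelihood(obs, sim, rho, a, b):
    """Gaussian stand-in for the error model: lag-1 autocorrelated,
    heteroscedastic residuals (SEP skewness/kurtosis omitted for brevity)."""
    resid = obs - sim
    sigma = a + b * sim                        # error spread grows with simulated flow
    eta = resid / sigma                        # standardized residuals
    innov = eta[1:] - rho * eta[:-1]           # AR(1)-decorrelated innovations
    s = np.sqrt(1.0 - rho ** 2)                # innovation standard deviation
    ll = -0.5 * eta[0] ** 2 - np.log(sigma[0])               # stationary first point
    ll += np.sum(-0.5 * (innov / s) ** 2 - np.log(s * sigma[1:]))
    ll -= 0.5 * len(obs) * np.log(2.0 * np.pi)
    return ll

# Example with synthetic flows (obs/sim would come from the watershed model)
rng = np.random.default_rng(0)
sim = rng.gamma(3.0, 2.0, 200)
obs = sim * (1 + 0.1 * rng.standard_normal(200))
print(log_likelihood(obs, sim, rho=0.5, a=0.1, b=0.1))
```

In an MCMC setting such as DREAM(ZS), a function of this shape would be evaluated at every proposed parameter set.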
Federal Register 2010, 2011, 2012, 2013, 2014
2013-03-25
..., Software, Implants, and Components Thereof; Notice of Receipt of Complaint; Solicitation of Comments... Certain Computerized Orthopedic Surgical Devices, Software, Implants, and Components Thereof, DN 2945; the... importation of certain computerized orthopedic surgical devices, software, implants, and components thereof...
Designing a Computerized Presentation Center.
ERIC Educational Resources Information Center
Christopher, Doris A.
1995-01-01
The Office Systems and Business Education Department at California State University (Los Angeles) developed a computerized presentation center, with multimedia classrooms and a multipurpose room, where students learn computerized presentation design skills, faculty can develop materials for class, and local business can do videoconferencing and…
DOT National Transportation Integrated Search
2013-08-01
This research aimed to evaluate the data requirements for computer-assisted construction planning and staging methods that can be implemented in pavement rehabilitation projects in the state of Georgia. Results showed that two main issues for the...
ERIC Educational Resources Information Center
Druyan, Mary Ellen
1990-01-01
Viewpoints of speakers and participants at a symposium on nutrition education in dental curricula are reported, and recommendations for action are outlined, including recruitment of nutritionists for dental faculties, screening and counseling exercises in clinical training, instruction in interpretation of computerized dietary analysis, and…
49 CFR 1244.5 - Date of filing.
Code of Federal Regulations, 2011 CFR
2011-10-01
... railroads using the computerized system may submit waybill sample information quarterly as specified in Statement 81-1. (2) Subject railroad using the manual system may submit waybill sample information quarterly... OF TRANSPORTATION (CONTINUED) ACCOUNTS, RECORDS AND REPORTS WAYBILL ANALYSIS OF TRANSPORTATION OF...
Can computerized tomography accurately stage childhood renal tumors?
Abdelhalim, Ahmed; Helmy, Tamer E; Harraz, Ahmed M; Abou-El-Ghar, Mohamed E; Dawaba, Mohamed E; Hafez, Ashraf T
2014-07-01
Staging of childhood renal tumors is crucial for treatment planning and outcome prediction. We sought to identify whether computerized tomography could accurately predict the local stage of childhood renal tumors. We retrospectively reviewed our database for patients diagnosed with childhood renal tumors and treated surgically between 1990 and 2013. Inability to retrieve preoperative computerized tomography, intraoperative tumor spillage and non-Wilms childhood renal tumors were exclusion criteria. Local computerized tomography stage was assigned by a single experienced pediatric radiologist blinded to the pathological stage, using a consensus similar to the Children's Oncology Group Wilms tumor staging system. Tumors were stratified into up-front surgery and preoperative chemotherapy groups. The radiological stage of each tumor was compared to the pathological stage. A total of 189 tumors in 179 patients met inclusion criteria. Computerized tomography staging matched pathological staging in 68% of up-front surgery (70 of 103), 31.8% of pre-chemotherapy (21 of 66) and 48.8% of post-chemotherapy scans (42 of 86). Computerized tomography overstaged 21.4%, 65.2% and 46.5% of tumors in the up-front surgery, pre-chemotherapy and post-chemotherapy scans, respectively, and understaged 10.7%, 3% and 4.7%. Computerized tomography staging was more accurate in tumors managed by up-front surgery (p <0.001) and those without extracapsular extension (p <0.001). The validity of computerized tomography staging of childhood renal tumors remains doubtful. This staging is more accurate for tumors treated with up-front surgery and those without extracapsular extension. Preoperative computerized tomography can help to exclude capsular breach. Treatment strategy should be based on surgical and pathological staging to avoid the hazards of inaccurate staging. Copyright © 2014 American Urological Association Education and Research, Inc. Published by Elsevier Inc. All rights reserved.
Hunt, Randall J.
2012-01-01
Management decisions will often be directly informed by model predictions. However, we now know there can be no expectation of a single ‘true’ model; thus, model results are uncertain. Understandable reporting of underlying uncertainty provides necessary context to decision-makers, as model results are used for management decisions. This, in turn, forms a mechanism by which groundwater models inform a risk-management framework because uncertainty around a prediction provides the basis for estimating the probability or likelihood of some event occurring. Given that the consequences of management decisions vary, it follows that the extent of and resources devoted to an uncertainty analysis may depend on the consequences. For events with low impact, a qualitative, limited uncertainty analysis may be sufficient for informing a decision. For events with a high impact, on the other hand, the risks might be better assessed and associated decisions made using a more robust and comprehensive uncertainty analysis. The purpose of this chapter is to provide guidance on uncertainty analysis through discussion of concepts and approaches, which can vary from heuristic (i.e. the modeller’s assessment of prediction uncertainty based on trial and error and experience) to a comprehensive, sophisticated, statistics-based uncertainty analysis. Most of the material presented here is taken from Doherty et al. (2010) if not otherwise cited. Although the treatment here is necessarily brief, the reader can find citations for the source material and additional references within this chapter.
15 CFR 950.9 - Computerized Environmental Data and Information Retrieval Service.
Code of Federal Regulations, 2012 CFR
2012-01-01
... Information Retrieval Service. 950.9 Section 950.9 Commerce and Foreign Trade Regulations Relating to Commerce... Computerized Environmental Data and Information Retrieval Service. The Environmental Data Index (ENDEX... computerized, information retrieval service provides a parallel subject-author-abstract referral service. A...
15 CFR 950.9 - Computerized Environmental Data and Information Retrieval Service.
Code of Federal Regulations, 2013 CFR
2013-01-01
... Information Retrieval Service. 950.9 Section 950.9 Commerce and Foreign Trade Regulations Relating to Commerce... Computerized Environmental Data and Information Retrieval Service. The Environmental Data Index (ENDEX... computerized, information retrieval service provides a parallel subject-author-abstract referral service. A...
15 CFR 950.9 - Computerized Environmental Data and Information Retrieval Service.
Code of Federal Regulations, 2010 CFR
2010-01-01
... Information Retrieval Service. 950.9 Section 950.9 Commerce and Foreign Trade Regulations Relating to Commerce... Computerized Environmental Data and Information Retrieval Service. The Environmental Data Index (ENDEX... computerized, information retrieval service provides a parallel subject-author-abstract referral service. A...
15 CFR 950.9 - Computerized Environmental Data and Information Retrieval Service.
Code of Federal Regulations, 2011 CFR
2011-01-01
... Information Retrieval Service. 950.9 Section 950.9 Commerce and Foreign Trade Regulations Relating to Commerce... Computerized Environmental Data and Information Retrieval Service. The Environmental Data Index (ENDEX... computerized, information retrieval service provides a parallel subject-author-abstract referral service. A...
15 CFR 950.9 - Computerized Environmental Data and Information Retrieval Service.
Code of Federal Regulations, 2014 CFR
2014-01-01
... Information Retrieval Service. 950.9 Section 950.9 Commerce and Foreign Trade Regulations Relating to Commerce... Computerized Environmental Data and Information Retrieval Service. The Environmental Data Index (ENDEX... computerized, information retrieval service provides a parallel subject-author-abstract referral service. A...
Innovations in Computerized Assessment.
ERIC Educational Resources Information Center
Drasgow, Fritz, Ed.; Olson-Buchanan, Julie B., Ed.
Chapters in this book present the challenges and dilemmas faced by researchers as they created new computerized assessments, focusing on issues addressed in developing, scoring, and administering the assessments. Chapters are: (1) "Beyond Bells and Whistles; An Introduction to Computerized Assessment" (Julie B. Olson-Buchanan and Fritz Drasgow);…
Advanced Composition and the Computerized Library.
ERIC Educational Resources Information Center
Hult, Christine
1989-01-01
Discusses four kinds of computerized access tools: online catalogs; computerized reference; online database searching; and compact disks and read only memory (CD-ROM). Examines how these technologies are changing research. Suggests how research instruction in advanced writing courses can be refocused to include the new technologies. (RS)
NASA Astrophysics Data System (ADS)
Rohmer, Jeremy; Verdel, Thierry
2017-04-01
Uncertainty analysis is an unavoidable task in the stability analysis of any geotechnical system. Such analysis usually relies on the safety factor SF (if SF is below some specified threshold, failure is possible). The objective of the stability analysis is then to estimate the failure probability P, i.e. the probability that SF falls below the specified threshold. When dealing with uncertainties, two facets should be considered, as outlined by several authors in the domain of geotechnics: "aleatoric uncertainty" (also named "randomness" or "intrinsic variability") and "epistemic uncertainty" (i.e. when facing "vague, incomplete or imprecise information", such as limited databases and observations or "imperfect" modelling). The benefits of separating the two facets of uncertainty can be seen from a risk-management perspective because: - Aleatoric uncertainty, being a property of the system under study, cannot be reduced. However, practical actions can be taken to circumvent the potentially dangerous effects of such variability; - Epistemic uncertainty, being due to the incomplete/imprecise nature of the available information, can be reduced by, e.g., increasing the number of tests (laboratory or in-situ surveys), improving the measurement methods or evaluating calculation procedures with model tests, and confronting more information sources (expert opinions, data from the literature, etc.). Uncertainty treatment in stability analysis is usually restricted to the probabilistic framework to represent both facets of uncertainty. Yet, in the domain of geo-hazard assessments (like landslides, mine pillar collapse, rockfalls, etc.), the validity of this approach can be debatable. In the present communication, we review the major criticisms available in the literature against the systematic use of probability in situations with a high degree of uncertainty. On this basis, we investigate the feasibility of a more flexible uncertainty representation tool, namely possibility distributions (e.g., Baudrit et al., 2007), for geo-hazard assessments. A graphical tool is then developed to explore: 1. the contribution of both types of uncertainty, aleatoric and epistemic; 2. the regions of the imprecise or random parameters which contribute the most to the imprecision on the failure probability P. The method is applied to two case studies (a mine pillar and a steep slope stability analysis; Rohmer and Verdel, 2014) to investigate the necessity for extra data acquisition on parameters whose imprecision can hardly be modelled by probabilities due to the scarcity of the available information (respectively, the extraction ratio and the cliff geometry). References: Baudrit, C., Couso, I., & Dubois, D. (2007). Joint propagation of probability and possibility in risk analysis: Towards a formal framework. International Journal of Approximate Reasoning, 45(1), 82-105. Rohmer, J., & Verdel, T. (2014). Joint exploration of regional importance of possibilistic and probabilistic uncertainty in stability analysis. Computers and Geotechnics, 61, 308-315.
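As a concrete illustration of the two facets, the sketch below (Python; the stability model and all numbers are hypothetical, not taken from the cited studies) propagates aleatoric variability by Monte Carlo sampling while treating an imprecisely known load as an epistemic interval, yielding bounds on the failure probability P rather than a single value. Evaluating only the interval endpoints assumes P varies monotonically with the imprecise parameter, a shortcut that a full possibilistic treatment would not need.

```python
import numpy as np

rng = np.random.default_rng(0)

def safety_factor(cohesion, friction_deg, load):
    # Toy stability model (illustrative only): resistance over demand.
    resistance = cohesion + load * np.tan(np.radians(friction_deg))
    return resistance / load

n = 100_000
# Aleatoric variability: natural scatter of soil strength parameters.
cohesion = rng.normal(25.0, 5.0, n)   # kPa
friction = rng.normal(30.0, 2.0, n)   # degrees

# Epistemic imprecision: the load is only known to lie in an interval,
# so we bound P(SF < 1) by evaluating it at the interval endpoints.
p_bounds = []
for load in (40.0, 55.0):             # kPa, lower/upper bound
    sf = safety_factor(cohesion, friction, load)
    p_bounds.append(np.mean(sf < 1.0))

print(f"P(failure) in [{min(p_bounds):.4f}, {max(p_bounds):.4f}]")
```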
Validation of a Self-Administered Computerized System to Detect Cognitive Impairment in Older Adults
Brinkman, Samuel D.; Reese, Robert J.; Norsworthy, Larry A.; Dellaria, Donna K.; Kinkade, Jacob W.; Benge, Jared; Brown, Kimberly; Ratka, Anna; Simpkins, James W.
2015-01-01
There is increasing interest in the development of economical and accurate approaches to identifying persons in the community who have mild, undetected cognitive impairments. Computerized assessment systems have been suggested as a viable approach to identifying these persons. The validity of a computerized assessment system for identification of memory and executive deficits in older individuals was evaluated in the current study. Volunteers (N = 235) completed a 3-hr battery of neuropsychological tests and a computerized cognitive assessment system. Participants were classified as impaired (n = 78) or unimpaired (n = 157) on the basis of the Mini Mental State Exam, Wechsler Memory Scale-III and the Trail Making Test (TMT), Part B. All six variables (three memory variables and three executive variables) derived from the computerized assessment differed significantly between groups in the expected direction. There was also evidence of temporal stability and concurrent validity. Application of computerized assessment systems for clinical practice and for identification of research participants is discussed in this article. PMID:25332303
Computerized neurocognitive testing in the management of sport-related concussion: an update.
Resch, Jacob E; McCrea, Michael A; Cullum, C Munro
2013-12-01
Since the late 1990s, computerized neurocognitive testing has become a central component of sport-related concussion (SRC) management at all levels of sport. In 2005, a review of the available evidence on the psychometric properties of four computerized neuropsychological test batteries concluded that the tests did not possess the necessary criteria to warrant clinical application. Since the publication of that review, several more computerized neurocognitive tests have entered the marketplace. The purpose of this review is to summarize the body of published studies on the psychometric properties and clinical utility of computerized neurocognitive tests available for use in the assessment of SRC. A review of the literature from 2005 to 2013 was conducted to gather evidence of the test-retest reliability and clinical validity of these instruments. Reviewed articles included both prospective and retrospective studies of primarily sport-based adult and pediatric samples. Summaries are provided regarding the available evidence of reliability and validity for the most commonly used computerized neurocognitive tests in sports settings.
NASA Technical Reports Server (NTRS)
Groves, Curtis E.; LLie, Marcel; Shallhorn, Paul A.
2012-01-01
There are inherent uncertainties and errors associated with using Computational Fluid Dynamics (CFD) to predict the flow field, and there is no standard method for evaluating uncertainty in the CFD community. This paper describes an approach to validate the uncertainty in using CFD. The method uses state-of-the-art uncertainty analysis, applying different turbulence models, and draws conclusions on which models provide the least uncertainty and which models most accurately predict the flow over a backward-facing step.
Facility Measurement Uncertainty Analysis at NASA GRC
NASA Technical Reports Server (NTRS)
Stephens, Julia; Hubbard, Erin
2016-01-01
This presentation provides an overview of the measurement uncertainty analysis currently being implemented in various facilities at NASA GRC. It includes examples pertinent to the turbine engine community (mass flow and fan efficiency calculation uncertainties).
Hajizadeh, Negin; Perez Figueroa, Rafael E; Uhler, Lauren M; Chiou, Erin; Perchonok, Jennifer E; Montague, Enid
2013-03-06
Computerized decision aids could facilitate shared decision-making at the point of outpatient clinical care. The objective of this study was to investigate whether a computerized shared decision aid would be feasible to implement in an inner-city clinic by evaluating the current practices in shared decision-making, clinicians' use of computers, patient and clinicians' attitudes and beliefs toward computerized decision aids, and the influence of time on shared decision-making. Qualitative data analysis of observations and semi-structured interviews with patients and clinicians at an inner-city outpatient clinic. The findings provided an exploratory look at the prevalence of shared decision-making and attitudes about health information technology and decision aids. A prominent barrier to clinicians engaging in shared decision-making was a lack of perceived patient understanding of medical information. Some patients preferred their clinicians make recommendations for them rather than engage in formal shared decision-making. Health information technology was an integral part of the clinic visit and welcomed by most clinicians and patients. Some patients expressed the desire to engage with health information technology such as viewing their medical information on the computer screen with their clinicians. All participants were receptive to the idea of a decision aid integrated within the clinic visit although some clinicians were concerned about the accuracy of prognostic estimates for complex medical problems. We identified several important considerations for the design and implementation of a computerized decision aid including opportunities to: bridge clinician-patient communication about medical information while taking into account individual patients' decision-making preferences, complement expert clinician judgment with prognostic estimates, take advantage of patient waiting times, and make tasks involved during the clinic visit more efficient. These findings should be incorporated into the design and implementation of a computerized shared decision aid at an inner-city hospital.
Use of the Dynamic Visual Acuity Test as a screener for community-dwelling older adults who fall.
Honaker, Julie A; Shepard, Neil T
2011-01-01
Adequate function of the peripheral vestibular system, specifically the vestibulo-ocular reflex (VOR; a network of neural connections between the peripheral vestibular system and the extraocular muscles), is essential for maintaining stable vision during head movements. Decreased visual acuity resulting from an impaired peripheral vestibular system may impede balance and postural control and place an individual at risk of falling. Therefore, sensitive measures of the vestibular system are warranted to screen for the tendency to fall, alerting clinicians to recommend further falling-risk assessment and referral to a falling-risk reduction program. Dynamic Visual Acuity (DVA) testing is a computerized VOR assessment method used to evaluate the peripheral vestibular system during head movements; reduced visual acuity as documented with DVA testing may be sensitive enough to screen for falling risk. This study examined the sensitivity and specificity of the computerized DVA test with yaw-plane head movements for identifying community-dwelling adults (58-78 years) who are prone to falling. A total of 16 older adults with a history of two or more unexplained falls in the previous twelve months and 16 age- and gender-matched controls without a history of falls in the previous twelve months participated. Computerized DVA with horizontal head movements at a fixed velocity of 120 deg/sec was measured and compared with the Dynamic Gait Index (DGI), a gold-standard gait assessment measure for identifying falling risk. Receiver operating characteristic (ROC) curve analysis and area under the ROC curve (AUC) were used to assess the sensitivity and specificity of computerized DVA as a screening measure for falling risk as determined by the DGI. Results suggested a link between computerized DVA and the propensity to fall; DVA in the yaw plane was found to be a sensitive (92%) and accurate screening measure when using a cutoff logMAR value of >0.25.
Dodani, Sunjay S; Lu, Charles W; Aldridge, J Wayne; Chou, Kelvin L; Patil, Parag G
2018-06-01
Accurate electrode placement is critical to the success of deep brain stimulation (DBS) surgery. Suboptimal targeting may arise from poor initial target localization, frame-based targeting error, or intraoperative brain shift. These uncertainties can make DBS surgery challenging. To develop a computerized system to guide subthalamic nucleus (STN) DBS electrode localization and to estimate the trajectory of intraoperative microelectrode recording (MER) on magnetic resonance (MR) images algorithmically during DBS surgery. Our method is based upon the relationship between the high-frequency band (HFB; 500-2000 Hz) signal from MER and voxel intensity on MR images. The HFB profile along an MER trajectory recorded during surgery is compared to voxel intensity profiles along many potential trajectories in the region of the surgically planned trajectory. From these comparisons of HFB recordings and potential trajectories, an estimate of the MER trajectory is calculated. This calculated trajectory is then compared to actual trajectory, as estimated by postoperative high-resolution computed tomography. We compared 20 planned, calculated, and actual trajectories in 13 patients who underwent STN DBS surgery. Targeting errors for our calculated trajectories (2.33 mm ± 0.2 mm) were significantly less than errors for surgically planned trajectories (2.83 mm ± 0.2 mm; P = .01), improving targeting prediction in 70% of individual cases (14/20). Moreover, in 4 of 4 initial MER trajectories that missed the STN, our method correctly indicated the required direction of targeting adjustment for the DBS lead to intersect the STN. A computer-based algorithm simultaneously utilizing MER and MR information potentially eases electrode localization during STN DBS surgery.
NASA Astrophysics Data System (ADS)
Saxena, Nishank; Hofmann, Ronny; Alpak, Faruk O.; Berg, Steffen; Dietderich, Jesse; Agarwal, Umang; Tandon, Kunj; Hunter, Sander; Freeman, Justin; Wilson, Ove Bjorn
2017-11-01
We generate a novel reference dataset to quantify the impact of numerical solvers, boundary conditions, and simulation platforms. We consider a variety of microstructures ranging from idealized pipes to digital rocks. Pore throats of the digital rocks considered are large enough to be well resolved with state-of-the-art micro-computerized tomography technology. Permeability is computed using multiple numerical engines, 12 in total, including Lattice-Boltzmann, computational fluid dynamics, voxel-based, fast semi-analytical, and known empirical models. Thus, we provide a measure of the uncertainty associated with flow computations of digital media. Moreover, the reference and standards dataset generated is the first of its kind and can be used to test and improve new fluid flow algorithms. We find that there is overall good agreement between solvers for pipes of idealized cross-section shape. As expected, the disagreement increases with increasing complexity of the pore space. Numerical solutions for pipes with sinusoidal variation of cross-section show larger variability compared to pipes of constant cross-section. We notice relatively larger variability in the computed permeability of digital rocks, with a coefficient of variation of up to 25% in computed values between the various solvers. Still, these differences are small given other subsurface uncertainties. The observed differences between solvers can be attributed to several causes, including differences in boundary conditions, numerical convergence criteria, and parameterization of fundamental physics equations. Solvers that perform additional meshing of irregular pore shapes require an additional step in practical workflows which involves skill and can introduce further uncertainty. Computation times for digital rocks vary from minutes to several days depending on the algorithm and available computational resources. We find that more stringent convergence criteria can improve solver accuracy, but at the expense of longer computation time.
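The reported solver-to-solver spread can be summarized with the coefficient of variation. A small Python sketch follows, using made-up permeability values rather than the study's data:

```python
import numpy as np

# Hypothetical permeability values (mD) for one digital rock, computed by
# several solvers (Lattice-Boltzmann, CFD, voxel-based, semi-analytical, ...).
perm = np.array([410.0, 385.0, 455.0, 520.0, 390.0, 470.0])

# Coefficient of variation: sample standard deviation over the mean.
cv = perm.std(ddof=1) / perm.mean()
print(f"mean = {perm.mean():.0f} mD, CV = {100 * cv:.1f}%")
```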
Uncertainty Estimation Cheat Sheet for Probabilistic Risk Assessment
NASA Technical Reports Server (NTRS)
Britton, Paul T.; Al Hassan, Mohammad; Ring, Robert W.
2017-01-01
"Uncertainty analysis itself is uncertain, therefore, you cannot evaluate it exactly," Source Uncertain Quantitative results for aerospace engineering problems are influenced by many sources of uncertainty. Uncertainty analysis aims to make a technical contribution to decision-making through the quantification of uncertainties in the relevant variables as well as through the propagation of these uncertainties up to the result. Uncertainty can be thought of as a measure of the 'goodness' of a result and is typically represented as statistical dispersion. This paper will explain common measures of centrality and dispersion; and-with examples-will provide guidelines for how they may be estimated to ensure effective technical contributions to decision-making.
Lognormal Uncertainty Estimation for Failure Rates
NASA Technical Reports Server (NTRS)
Britton, Paul T.; Al Hassan, Mohammad; Ring, Robert W.
2017-01-01
"Uncertainty analysis itself is uncertain, therefore, you cannot evaluate it exactly," Source Uncertain. Quantitative results for aerospace engineering problems are influenced by many sources of uncertainty. Uncertainty analysis aims to make a technical contribution to decision-making through the quantification of uncertainties in the relevant variables as well as through the propagation of these uncertainties up to the result. Uncertainty can be thought of as a measure of the 'goodness' of a result and is typically represented as statistical dispersion. This presentation will explain common measures of centrality and dispersion; and-with examples-will provide guidelines for how they may be estimated to ensure effective technical contributions to decision-making.
Computerized Numerical Control Curriculum Guide.
ERIC Educational Resources Information Center
Reneau, Fred; And Others
This guide is intended for use in a course in programming and operating a computerized numerical control system. Addressed in the course are various aspects of programming and planning, setting up, and operating machines with computerized numerical control, including selecting manual or computer-assigned programs and matching them with…
Motion Estimation and Compensation Strategies in Dynamic Computerized Tomography
NASA Astrophysics Data System (ADS)
Hahn, Bernadette N.
2017-12-01
A main challenge in computerized tomography is imaging moving objects. Temporal changes during the measuring process lead to inconsistent data sets, and applying standard reconstruction techniques causes motion artefacts which can severely impede reliable diagnosis. Therefore, novel reconstruction techniques are required which compensate for the dynamic behavior. This article builds on recent results from a microlocal analysis of the dynamic setting, which enable us to formulate efficient analytic motion-compensation algorithms for contour extraction. Since these methods require information about the dynamic behavior, we further introduce a motion estimation approach which determines parameters of affine and certain non-affine deformations directly from measured motion-corrupted Radon data. Our methods are illustrated with numerical examples for both types of motion.
Rahman, A.; Tsai, F.T.-C.; White, C.D.; Willson, C.S.
2008-01-01
This study investigates capture zone uncertainty that relates to the coupled semivariogram uncertainty of hydrogeological and geophysical data. Semivariogram uncertainty is represented by the uncertainty in structural parameters (range, sill, and nugget). We used the beta distribution function to derive the prior distributions of structural parameters. The probability distributions of structural parameters were further updated through the Bayesian approach with Gaussian likelihood functions. Cokriging of noncollocated pumping test data and electrical resistivity data was conducted to better estimate hydraulic conductivity through autosemivariograms and a pseudo-cross-semivariogram. Sensitivities of capture zone variability with respect to the spatial variability of hydraulic conductivity, porosity, and aquifer thickness were analyzed using ANOVA. The proposed methodology was applied to the analysis of capture zone uncertainty at the Chicot aquifer in Southwestern Louisiana, where a regional groundwater flow model was developed. MODFLOW-MODPATH was adopted to delineate the capture zone. The ANOVA results showed that both capture zone area and compactness were sensitive to hydraulic conductivity variation. We concluded that the capture zone uncertainty due to the semivariogram uncertainty is much higher than that due to the kriging uncertainty for given semivariograms. In other words, the sole use of conditional variances of kriging may greatly underestimate the flow response uncertainty. Semivariogram uncertainty should also be taken into account in the uncertainty analysis. © 2008 ASCE.
Alderman, Phillip D.; Stanfill, Bryan
2016-10-06
Recent international efforts have brought renewed emphasis on the comparison of different agricultural systems models. Thus far, analysis of model-ensemble simulated results has not clearly differentiated between ensemble prediction uncertainties due to model structural differences per se and those due to parameter value uncertainties. Additionally, despite increasing use of Bayesian parameter estimation approaches with field-scale crop models, inadequate attention has been given to the full posterior distributions for estimated parameters. The objectives of this study were to quantify the impact of parameter value uncertainty on prediction uncertainty for modeling spring wheat phenology using Bayesian analysis and to assess the relative contributions of model-structure-driven and parameter-value-driven uncertainty to overall prediction uncertainty. This study used a random walk Metropolis algorithm to estimate parameters for 30 spring wheat genotypes using nine phenology models based on multi-location trial data for days to heading and days to maturity. Across all cases, parameter-driven uncertainty accounted for between 19 and 52% of predictive uncertainty, while model-structure-driven uncertainty accounted for between 12 and 64%. This study demonstrated the importance of quantifying both model-structure- and parameter-value-driven uncertainty when assessing overall prediction uncertainty in modeling spring wheat phenology. More generally, Bayesian parameter estimation provided a useful framework for quantifying and analyzing sources of prediction uncertainty.
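The random walk Metropolis algorithm referenced here is straightforward to sketch. In the Python example below, the toy model, data, proposal scale, and prior are hypothetical stand-ins for the paper's phenology models, included only to show the accept/reject mechanics that produce a full posterior distribution for a parameter.

```python
import numpy as np

rng = np.random.default_rng(1)

# Hypothetical data: observed days to heading for one genotype.
obs = np.array([58.0, 61.0, 59.5, 63.0, 60.0])

def log_posterior(theta):
    # Toy model: a single predicted days-to-heading value `theta`,
    # Gaussian likelihood (sd = 2 days), flat prior on [30, 120].
    if not (30.0 < theta < 120.0):
        return -np.inf
    return -0.5 * np.sum(((obs - theta) / 2.0) ** 2)

n_iter, step = 20_000, 1.0
chain = np.empty(n_iter)
theta = 60.0
lp = log_posterior(theta)
for i in range(n_iter):
    prop = theta + step * rng.normal()        # random-walk proposal
    lp_prop = log_posterior(prop)
    if np.log(rng.uniform()) < lp_prop - lp:  # Metropolis accept/reject
        theta, lp = prop, lp_prop
    chain[i] = theta

posterior = chain[5000:]                      # discard burn-in
print(f"posterior mean = {posterior.mean():.2f}, "
      f"95% CI = {np.percentile(posterior, [2.5, 97.5])}")
```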
LSU: The Library Space Utilization Methodology.
ERIC Educational Resources Information Center
Hall, Richard B.
A computerized research technique for measuring the space utilization of public library facilities provides a behavioral activity and occupancy analysis for library planning purposes. The library space utilization (LSU) methodology demonstrates that significant information about the functional requirements of a library can be measured and…
A/C Interface: Expert Systems: Part II.
ERIC Educational Resources Information Center
Dessy, Raymond E., Ed.
1984-01-01
Discusses working implementations of artificial intelligence systems for chemical laboratory applications. They include expert systems for liquid chromatography, spectral analysis, instrument control of a totally computerized triple-quadrupole mass spectrometer, and the determination of the mineral constituents of a rock sample given the powder…
Recipe for Burnout: The Special Education Teachers' Diet.
ERIC Educational Resources Information Center
Bradfield, Robert H.; Fones, Donald M.
1984-01-01
Computerized diet analysis of 41 teachers of learning disabled students revealed deficiencies in carbohydrate, fiber, and micronutrient intake and excessive fat and protein intake. Findings suggested that poor dietary habits may make Ss more susceptible to emotional stress and physical illness. (CL)
NASA Astrophysics Data System (ADS)
Lü, Hui; Shangguan, Wen-Bin; Yu, Dejie
2017-09-01
Automotive brake systems are always subjected to various types of uncertainties and two types of random-fuzzy uncertainties may exist in the brakes. In this paper, a unified approach is proposed for squeal instability analysis of disc brakes with two types of random-fuzzy uncertainties. In the proposed approach, two uncertainty analysis models with mixed variables are introduced to model the random-fuzzy uncertainties. The first one is the random and fuzzy model, in which random variables and fuzzy variables exist simultaneously and independently. The second one is the fuzzy random model, in which uncertain parameters are all treated as random variables while their distribution parameters are expressed as fuzzy numbers. Firstly, the fuzziness is discretized by using α-cut technique and the two uncertainty analysis models are simplified into random-interval models. Afterwards, by temporarily neglecting interval uncertainties, the random-interval models are degraded into random models, in which the expectations, variances, reliability indexes and reliability probabilities of system stability functions are calculated. And then, by reconsidering the interval uncertainties, the bounds of the expectations, variances, reliability indexes and reliability probabilities are computed based on Taylor series expansion. Finally, by recomposing the analysis results at each α-cut level, the fuzzy reliability indexes and probabilities can be obtained, by which the brake squeal instability can be evaluated. The proposed approach gives a general framework to deal with both types of random-fuzzy uncertainties that may exist in the brakes and its effectiveness is demonstrated by numerical examples. It will be a valuable supplement to the systematic study of brake squeal considering uncertainty.
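The α-cut discretization at the heart of the proposed approach converts each fuzzy number into a nested family of intervals, inside which ordinary random sampling can proceed. A schematic Python sketch follows; the triangular fuzzy friction coefficient, the stability function, and the monotonicity assumption at each cut are all illustrative simplifications, not the paper's brake model.

```python
import numpy as np

def alpha_cut(tri, alpha):
    # Interval of a triangular fuzzy number (a, m, b) at a given alpha level.
    a, m, b = tri
    return a + alpha * (m - a), b - alpha * (b - m)

rng = np.random.default_rng(2)

# Hypothetical fuzzy friction coefficient and random contact stiffness.
fuzzy_mu = (0.3, 0.4, 0.5)                   # triangular fuzzy number
stiffness = rng.normal(1.0e6, 5e4, 10_000)   # N/m, aleatory variable

def instability_measure(mu, k):
    # Toy stability function: positive values flag potential squeal.
    return mu * np.sqrt(k) - 400.0

for alpha in (0.0, 0.5, 1.0):
    lo, hi = alpha_cut(fuzzy_mu, alpha)
    # Bounds on the instability probability at this alpha level,
    # assuming monotonicity in mu over the cut interval.
    p_lo = np.mean(instability_measure(lo, stiffness) > 0)
    p_hi = np.mean(instability_measure(hi, stiffness) > 0)
    print(f"alpha = {alpha:.1f}: P(instability) in "
          f"[{min(p_lo, p_hi):.3f}, {max(p_lo, p_hi):.3f}]")
```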
Doppler Global Velocimeter Development for the Large Wind Tunnels at Ames Research Center
NASA Technical Reports Server (NTRS)
Reinath, Michael S.
1997-01-01
Development of an optical, laser-based flow-field measurement technique for large wind tunnels is described. The technique uses laser sheet illumination and charge-coupled device detectors to rapidly measure flow-field velocity distributions over large planar regions of the flow. Sample measurements are presented that illustrate the capability of the technique. An analysis of measurement uncertainty, which focuses on the random component of uncertainty, shows that precision uncertainty is not dependent on the measured velocity magnitude. For a single-image measurement, the analysis predicts a precision uncertainty of +/-5 m/s. When multiple images are averaged, this uncertainty is shown to decrease. For an average of 100 images, for example, the analysis shows that a precision uncertainty of +/-0.5 m/s can be expected. Sample applications show that vectors aligned with an orthogonal coordinate system are difficult to measure directly. An algebraic transformation is presented which converts measured vectors to the desired orthogonal components. Uncertainty propagation is then used to show how the uncertainty propagates from the direct measurements to the orthogonal components. For a typical forward-scatter viewing geometry, the propagation analysis predicts precision uncertainties of +/-4, +/-7, and +/-6 m/s, respectively, for the U, V, and W components at 68% confidence.
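The quoted reduction from ±5 m/s to ±0.5 m/s follows from the usual 1/√N scaling of the random (precision) component when N statistically independent images are averaged. A one-line check in Python:

```python
import numpy as np

# Precision (random) uncertainty of a single-image velocity measurement,
# from the uncertainty analysis described above.
sigma_single = 5.0   # m/s

for n_images in (1, 25, 100):
    # Averaging N independent images reduces the random component by sqrt(N).
    sigma_avg = sigma_single / np.sqrt(n_images)
    print(f"N = {n_images:4d}: precision uncertainty = +/-{sigma_avg:.2f} m/s")
```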
Variance-Based Sensitivity Analysis to Support Simulation-Based Design Under Uncertainty
Opgenoord, Max M. J.; Allaire, Douglas L.; Willcox, Karen E.
2016-09-12
Sensitivity analysis plays a critical role in quantifying uncertainty in the design of engineering systems. A variance-based global sensitivity analysis is often used to rank the importance of input factors, based on their contribution to the variance of the output quantity of interest. However, this analysis assumes that all input variability can be reduced to zero, which is typically not the case in a design setting. Distributional sensitivity analysis (DSA) instead treats the uncertainty reduction in the inputs as a random variable, and defines a variance-based sensitivity index function that characterizes the relative contribution to the output variance as a function of the amount of uncertainty reduction. This paper develops a computationally efficient implementation for the DSA formulation and extends it to include distributions commonly used in engineering design under uncertainty. Application of the DSA method to the conceptual design of a commercial jetliner demonstrates how the sensitivity analysis provides valuable information to designers and decision-makers on where and how to target uncertainty reduction efforts.
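For context, a standard variance-based first-order sensitivity index can be estimated with a "pick-freeze" sampling scheme, as sketched below in Python on a toy three-input model; this illustrates the ordinary Sobol analysis that DSA generalizes, not the DSA formulation itself.

```python
import numpy as np

rng = np.random.default_rng(7)

def model(x):
    # Toy performance model of three design inputs (illustrative).
    return x[:, 0] + 2.0 * x[:, 1] ** 2 + 0.5 * x[:, 2]

n, d = 100_000, 3
A = rng.uniform(-1, 1, (n, d))
B = rng.uniform(-1, 1, (n, d))
yA = model(A)
yB = model(B)
var_y = yA.var()

# First-order Sobol indices via the pick-freeze estimator:
# replace column i of B with column i of A and correlate outputs.
for i in range(d):
    ABi = B.copy()
    ABi[:, i] = A[:, i]
    S_i = np.mean(yA * (model(ABi) - yB)) / var_y
    print(f"S_{i + 1} ~ {S_i:.3f}")
```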
An Application of the Rasch Model to Computerized Adaptive Testing.
ERIC Educational Resources Information Center
Wisniewski, Dennis R.
Three questions concerning the Binary Search Method (BSM) of computerized adaptive testing were studied: (1) whether it provided a reliable and valid estimation of examinee ability; (2) its effect on examinee attitudes toward computerized adaptive testing and conventional paper-and-pencil testing; and (3) the relationship between item response…
The Reality, Direction, and Future of Computerized Publications
ERIC Educational Resources Information Center
Levenstein, Nicholas
2012-01-01
Sharing information in digital form by using a computer is a growing phenomenon. Many universities are making their applications available on computer. More than one hundred and thirty-six universities have developed computerized applications on their own or through a commercial vendor. Universities developed computerized applications in order to…
Computerized Classification Testing with the Rasch Model
ERIC Educational Resources Information Center
Eggen, Theo J. H. M.
2011-01-01
If classification in a limited number of categories is the purpose of testing, computerized adaptive tests (CATs) with algorithms based on sequential statistical testing perform better than estimation-based CATs (e.g., Eggen & Straetmans, 2000). In these computerized classification tests (CCTs), the Sequential Probability Ratio Test (SPRT) (Wald,…
Severity of Organized Item Theft in Computerized Adaptive Testing: A Simulation Study
ERIC Educational Resources Information Center
Yi, Qing; Zhang, Jinming; Chang, Hua-Hua
2008-01-01
Criteria had been proposed for assessing the severity of possible test security violations for computerized tests with high-stakes outcomes. However, these criteria resulted from theoretical derivations that assumed uniformly randomized item selection. This study investigated potential damage caused by organized item theft in computerized adaptive…
Computerized Adaptive Assessment of Cognitive Abilities among Disabled Adults.
ERIC Educational Resources Information Center
Engdahl, Brian
This study examined computerized adaptive testing and cognitive ability testing of adults with cognitive disabilities. Adult subjects (N=250) were given computerized tests on language usage and space relations in one of three administration conditions: paper and pencil, fixed length computer adaptive, and variable length computer adaptive.…
Development and Evaluation of a Confidence-Weighting Computerized Adaptive Testing
ERIC Educational Resources Information Center
Yen, Yung-Chin; Ho, Rong-Guey; Chen, Li-Ju; Chou, Kun-Yi; Chen, Yan-Lin
2010-01-01
The purpose of this study was to examine whether the efficiency, precision, and validity of computerized adaptive testing (CAT) could be improved by assessing confidence differences in knowledge that examinees possessed. We proposed a novel polytomous CAT model called the confidence-weighting computerized adaptive testing (CWCAT), which combined a…
Year 2000 Computerized Farm Project. Final Report.
ERIC Educational Resources Information Center
McGrann, James M.; Lippke, Lawrence A.
An ongoing project was funded to develop and demonstrate a computerized approach to operation and management of a commercial-sized farm. Other project objectives were to facilitate the demonstration of the computerized farm to the public and to develop individual software packages and make them available to the public. Project accomplishments…
10 CFR 719.44 - What categories of costs require advance approval?
Code of Federal Regulations, 2014 CFR
2014-01-01
... application software, or non-routine computerized databases, if they are specifically created for a particular matter. For costs associated with the creation and use of computerized databases, contractors and retained legal counsel must ensure that the creation and use of computerized databases is necessary and...
Computerized Diagnostic Testing: Problems and Possibilities.
ERIC Educational Resources Information Center
McArthur, David L.
The use of computers to build diagnostic inferences is explored in two contexts. In computerized monitoring of liquid oxygen systems for the space shuttle, diagnoses are exact because they can be derived within a world which is closed. In computerized classroom testing of reading comprehension, programs deliver a constrained form of adaptive…
Code of Federal Regulations, 2010 CFR
2010-10-01
... ENFORCEMENT SYSTEMS § 307.13 Security and confidentiality for computerized support enforcement systems in... systems in operation after October 1, 1997. (a) Information integrity and security. Have safeguards... 45 Public Welfare 2 2010-10-01 2010-10-01 false Security and confidentiality for computerized...
A First Life with Computerized Business Simulations
ERIC Educational Resources Information Center
Thavikulwat, Precha
2011-01-01
The author discusses the theoretical lens, origins, and environment of his work on computerized business simulations. Key ideas that inform his work include the two dimensions (control and interaction) of computerized simulation, the two ways of representing a natural process (phenotypical and genotypical) in a simulation, which he defines as a…
45 CFR 307.15 - Approval of advance planning documents for computerized support enforcement systems.
Code of Federal Regulations, 2014 CFR
2014-10-01
... CHILDREN AND FAMILIES, DEPARTMENT OF HEALTH AND HUMAN SERVICES COMPUTERIZED SUPPORT ENFORCEMENT SYSTEMS..., organization, services and constraints related to the computerized support enforcement system; (4) The APD must... design, development, installation or enhancement; (5) The APD must contain a description of each...
45 CFR 307.15 - Approval of advance planning documents for computerized support enforcement systems.
Code of Federal Regulations, 2013 CFR
2013-10-01
... CHILDREN AND FAMILIES, DEPARTMENT OF HEALTH AND HUMAN SERVICES COMPUTERIZED SUPPORT ENFORCEMENT SYSTEMS..., organization, services and constraints related to the computerized support enforcement system; (4) The APD must... design, development, installation or enhancement; (5) The APD must contain a description of each...
45 CFR 307.15 - Approval of advance planning documents for computerized support enforcement systems.
Code of Federal Regulations, 2012 CFR
2012-10-01
... CHILDREN AND FAMILIES, DEPARTMENT OF HEALTH AND HUMAN SERVICES COMPUTERIZED SUPPORT ENFORCEMENT SYSTEMS..., organization, services and constraints related to the computerized support enforcement system; (4) The APD must... design, development, installation or enhancement; (5) The APD must contain a description of each...
Protecting Privacy in Computerized Medical Information.
ERIC Educational Resources Information Center
Congress of the U.S., Washington, DC. Office of Technology Assessment.
This report analyzes the implications of computerized medical information and the challenges it brings to individual privacy. The report examines the nature of the privacy interest in health care information and the current state of the law protecting that information; the nature of proposals to computerize health care information and the…
45 CFR 310.25 - What conditions apply to acquisitions of Computerized Tribal IV-D Systems?
Code of Federal Regulations, 2010 CFR
2010-10-01
... FAMILIES, DEPARTMENT OF HEALTH AND HUMAN SERVICES COMPUTERIZED TRIBAL IV-D SYSTEMS AND OFFICE AUTOMATION... Acquisition Threshold; (c) Software and ownership rights. (1) All procurement and contract instruments must... Computerized Tribal IV-D System software or enhancements thereof and all associated documentation designed...
Computerization of the Newspaper in the 1980s.
ERIC Educational Resources Information Center
Garrison, Bruce
A review of the literature on the computerization of newspaper newsrooms shows that since 1960, computers have assumed an increasingly important role in information collection, news writing and editing, pagination, and news transmission. When newspaper libraries are computerized, reporters are able to find information more quickly and to use…
[Computerized medical record: deontology and legislation].
Allaert, F A; Dusserre, L
1996-02-01
Computerization of medical records is making headway for patients' follow-up, scientific research, and health expenditure control, but it must not alter the guarantees provided to patients by the medical code of ethics and the law of January 6, 1978. This law, modified on July 1, 1994, requires the registration of all computerized records of personal data and establishes rights to protect privacy against computer misuse. All medical practitioners using computerized medical records must be aware that infringement of this law may result in proceedings before professional, civil, or criminal courts.
L.R. Grosenbaugh
1967-01-01
Describes an expansible computerized system that provides data needed in regression or covariance analysis of as many as 50 variables, 8 of which may be dependent. Alternatively, it can screen variously generated combinations of independent variables to find the regression with the smallest mean-squared-residual, which will be fitted if desired. The user can easily...
Gamma scintigraphic analysis of albumin flux in patients with acute respiratory distress syndrome
DOE Office of Scientific and Technical Information (OSTI.GOV)
Sugerman, H.J.; Tatum, J.L.; Burke, T.S.
1984-06-01
Computerized gamma-scintigraphy provides a new method for the analysis of albumin flux in patients with pulmonary permeability edema. In this technique, 10 mCi of 99mTc-tagged human serum albumin is administered and lung:heart radioactivity ratios are determined. This ratio remains constant unless there is a leak of albumin, in which case a ratio rising with time, called the "slope index" (SI), is seen. Thirty-five scintigraphic studies were obtained in 28 patients by means of a portable computerized gamma-camera. Thirteen of these patients had clinical evidence of the acute respiratory distress syndrome (ARDS) and six had or were recovering from left ventricular-induced congestive heart failure (CHF). Five of the patients with CHF and pulmonary capillary wedge pressure (PCWP) below 30 mm Hg had normal scintigraphic studies. The patients with ARDS were found to have significantly higher SIs than patients who did not have, or had recovered from, ARDS. Positive SIs were present from 1 to 8 days following the apparent onset of ARDS in seven studies in five patients. Recovery of gas exchange was associated with a return to a normal SI in four patients. In conclusion, computerized gamma-scintigraphy was a sensitive, noninvasive tool for the detection of a pathologic increase in pulmonary protein flux. Positive scintigraphic findings were associated with significantly impaired gas exchange. The method documented that the leak of albumin in patients with ARDS may last for days but resolves with recovery.
FFDM image quality assessment using computerized image texture analysis
NASA Astrophysics Data System (ADS)
Berger, Rachelle; Carton, Ann-Katherine; Maidment, Andrew D. A.; Kontos, Despina
2010-04-01
Quantitative measures of image quality (IQ) are routinely obtained during the evaluation of imaging systems. These measures, however, do not necessarily correlate with the IQ of the actual clinical images, which can also be affected by factors such as patient positioning. No quantitative method currently exists to evaluate clinical IQ. Therefore, we investigated the potential of using computerized image texture analysis to quantitatively assess IQ. Our hypothesis is that image texture features can be used to assess IQ as a measure of the image signal-to-noise ratio (SNR). To test feasibility, the "Rachel" anthropomorphic breast phantom (Model 169, Gammex RMI) was imaged with a Senographe 2000D FFDM system (GE Healthcare) using 220 unique exposure settings (target/filter, kV, and mAs combinations). The mAs were varied from 10%-300% of that required for an average glandular dose (AGD) of 1.8 mGy. A 2.5 cm² retroareolar region of interest (ROI) was segmented from each image. The SNR was computed from the ROIs segmented from images linear with dose (i.e., raw images) after flat-field and offset correction. Image texture features of skewness, coarseness, contrast, energy, homogeneity, and fractal dimension were computed from the Premium View™ postprocessed image ROIs. Multiple linear regression demonstrated a strong association between the computed image texture features and SNR (R² = 0.92, p ≤ 0.001). When including kV, target, and filter as additional predictor variables, a stronger association with SNR was observed (R² = 0.95, p ≤ 0.001). The strong associations indicate that computerized image texture analysis can be used to measure image SNR and potentially aid in automating IQ assessment as a component of the clinical workflow. Further work is underway to validate our findings in larger clinical datasets.
Interrupted Time Series Versus Statistical Process Control in Quality Improvement Projects.
Andersson Hagiwara, Magnus; Andersson Gäre, Boel; Elg, Mattias
2016-01-01
To measure the effect of quality improvement interventions, it is appropriate to use analysis methods that measure data over time. Examples of such methods include statistical process control analysis and interrupted time series with segmented regression analysis. This article compares the use of statistical process control analysis and interrupted time series with segmented regression analysis for evaluating the longitudinal effects of quality improvement interventions, using an example study on an evaluation of a computerized decision support system.
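Segmented regression for an interrupted time series can be set up as an ordinary least-squares fit with a level-change indicator and a post-intervention slope term. A minimal Python sketch with simulated monthly data (all values hypothetical, standing in for an outcome before and after a decision-support rollout) follows:

```python
import numpy as np
import statsmodels.api as sm

rng = np.random.default_rng(4)

# Hypothetical monthly outcome: 24 months before / 24 after an intervention.
n_pre, n_post = 24, 24
t = np.arange(n_pre + n_post)
post = (t >= n_pre).astype(float)           # level-change indicator
t_post = np.where(post == 1, t - n_pre, 0)  # post-intervention slope term

# Simulated data with a baseline trend plus a drop and slope change.
y = 50 + 0.2 * t - 5 * post - 0.3 * t_post + rng.normal(0, 2, t.size)

# Segmented regression: baseline level/trend plus post-intervention
# changes in level and trend.
X = sm.add_constant(np.column_stack([t, post, t_post]))
fit = sm.OLS(y, X).fit()
print(fit.params)  # [baseline level, trend, level change, trend change]
```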
A Two-Step Approach to Uncertainty Quantification of Core Simulators
Yankov, Artem; Collins, Benjamin; Klein, Markus; ...
2012-01-01
For the multiple sources of error introduced into the standard computational regime for simulating reactor cores, rigorous uncertainty analysis methods are available primarily to quantify the effects of cross section uncertainties. Two methods for propagating cross section uncertainties through core simulators are the XSUSA statistical approach and the "two-step" method. The XSUSA approach, which is based on the SUSA code package, is fundamentally a stochastic sampling method. Alternatively, the two-step method utilizes generalized perturbation theory in the first step and stochastic sampling in the second step. The consistency of these two methods in quantifying uncertainties in the multiplication factor and in the core power distribution was examined in the framework of phase I-3 of the OECD Uncertainty Analysis in Modeling benchmark. With the Three Mile Island Unit 1 core as a base model for analysis, the XSUSA and two-step methods were applied with certain limitations, and the results were compared to those produced by other stochastic sampling-based codes. Based on the uncertainty analysis results, conclusions were drawn as to the method that is currently more viable for computing uncertainties in burnup and transient calculations.
Handbook for Construction of Task Inventories for Navy Enlisted Ratings
1984-01-01
1 March 1977. Cambardella, J. J., & Alvord, V. O. TI-CODAP: A computerized method of job analysis for personnel management. Prince George's County... specificity needed in the occupational analysis and influences the choice of an analysis method. The primary source of job data usually is the job holder at... analyzed, various methods of collecting and processing data were considered, and an introductory approach to the collection and analysis of Navy...
NASA Astrophysics Data System (ADS)
Peng, Yahui; Jiang, Yulei; Liarski, Vladimir M.; Kaverina, Natalya; Clark, Marcus R.; Giger, Maryellen L.
2012-03-01
Analysis of interactions between B and T cells in tubulointerstitial inflammation is important for understanding human lupus nephritis. We developed a computer technique to perform this analysis and compared it with manual analysis. Multi-channel immunofluorescent-microscopy images were acquired from 207 regions of interest in 40 renal tissue sections from 19 patients diagnosed with lupus nephritis. Fresh-frozen renal tissue sections were stained with combinations of immunofluorescent antibodies to membrane proteins and counter-stained with a cell nuclear marker. Manual delineation of the antibodies was considered the reference standard. We first segmented cell nuclei and cell membrane markers, and then determined corresponding cell types based on the distances between cell nuclei and specific cell-membrane marker combinations. Subsequently, the distribution of the shortest distance from T cell nuclei to B cell nuclei was obtained and used as a surrogate indicator of cell-cell interactions. The computer and manual analysis results were concordant. The average absolute difference between the computer and manual results was 1.1+/-1.2% in the number of cell-cell distances of 3 μm or less as a percentage of the total number of cell-cell distances. Our computerized analysis of cell-cell distances could be used as a surrogate for quantifying cell-cell interactions, either as an automated and quantitative analysis or as independent confirmation of manual analysis.
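The core computation, the shortest distance from each T cell nucleus to any B cell nucleus, is a nearest-neighbor query. A Python sketch with randomly placed hypothetical centroids (not the study's data) is shown below; the 3 μm cutoff mirrors the surrogate interaction criterion described above.

```python
import numpy as np
from scipy.spatial import cKDTree

rng = np.random.default_rng(3)

# Hypothetical nucleus centroids (micrometres) from a segmented image.
t_cells = rng.uniform(0, 200, (150, 2))
b_cells = rng.uniform(0, 200, (60, 2))

# For each T cell nucleus, the distance to the nearest B cell nucleus.
tree = cKDTree(b_cells)
nearest_dist, _ = tree.query(t_cells, k=1)

# Fraction of shortest T-B distances of 3 um or less, used as a
# surrogate indicator of cell-cell interaction.
frac_close = np.mean(nearest_dist <= 3.0)
print(f"{100 * frac_close:.1f}% of T cells within 3 um of a B cell")
```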
NASA Astrophysics Data System (ADS)
Zhu, Q.; Xu, Y. P.; Gu, H.
2014-12-01
Traditionally, regional frequency analysis methods were developed for stationary environmental conditions. Nevertheless, recent studies have identified significant changes in hydrological records, leading to the 'death' of stationarity. Moreover, uncertainty in hydrological frequency analysis is persistent. This study aims to investigate the impact of one of the most important uncertainty sources, parameter uncertainty, together with non-stationarity, on design rainfall depth in the Qu River Basin, East China. A spatial bootstrap is first proposed to analyze the uncertainty of design rainfall depth estimated by regional frequency analysis based on L-moments, and the corresponding uncertainty is also estimated at the at-site scale. Meanwhile, a method combining generalized additive models with a 30-year moving window is employed to analyze the non-stationarity present in the extreme rainfall regime. The results show that the uncertainties of design rainfall depth with a 100-year return period under stationary conditions estimated by the regional spatial bootstrap can reach 15.07% and 12.22% with GEV and PE3, respectively. At the at-site scale, the uncertainties can reach 17.18% and 15.44% with GEV and PE3, respectively. Under non-stationary conditions, the uncertainties of maximum rainfall depth (corresponding to design rainfall depth) with 0.01 annual exceedance probability (corresponding to a 100-year return period) are 23.09% and 13.83% with GEV and PE3, respectively. Comparing the 90% confidence intervals, the uncertainty of design rainfall depth resulting from parameter uncertainty is less than that from non-stationary frequency analysis with GEV, but slightly larger with PE3. This study indicates that the spatial bootstrap can be successfully applied to analyze the uncertainty of design rainfall depth at both regional and at-site scales. The non-stationary analysis shows that the differences between non-stationary quantiles and their stationary equivalents are important for decision-makers in water resources management and risk management.
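A simplified, non-spatial version of the bootstrap idea can be sketched in a few lines of Python: resample the annual-maximum series, refit a GEV each time, and read off the spread of the 100-year quantile. The data here are synthetic, and the resampling ignores the spatial structure the paper's method is designed to preserve.

```python
import numpy as np
from scipy import stats

rng = np.random.default_rng(5)

# Hypothetical annual-maximum daily rainfall series (mm), 50 years.
annual_max = stats.genextreme.rvs(c=-0.1, loc=80, scale=20,
                                  size=50, random_state=rng)

def return_level(sample, T=100):
    # Fit a GEV and read off the 1-in-T-year quantile.
    c, loc, scale = stats.genextreme.fit(sample)
    return stats.genextreme.ppf(1 - 1 / T, c, loc=loc, scale=scale)

# Ordinary (non-spatial) bootstrap of the 100-year design depth.
levels = np.array([
    return_level(rng.choice(annual_max, size=annual_max.size, replace=True))
    for _ in range(500)
])

est = return_level(annual_max)
lo, hi = np.percentile(levels, [5, 95])
print(f"100-yr depth = {est:.1f} mm, "
      f"90% bootstrap interval = [{lo:.1f}, {hi:.1f}] mm")
```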
CASMO5/TSUNAMI-3D spent nuclear fuel reactivity uncertainty analysis
DOE Office of Scientific and Technical Information (OSTI.GOV)
Ferrer, R.; Rhodes, J.; Smith, K.
2012-07-01
The CASMO5 lattice physics code is used in conjunction with the TSUNAMI-3D sequence in ORNL's SCALE 6 code system to estimate the uncertainties in hot-to-cold reactivity changes due to cross-section uncertainty for PWR assemblies at various burnup points. The goal of the analysis is to establish the multiplication factor uncertainty similarity between various fuel assemblies at different conditions in a quantifiable manner and to obtain a bound on the hot-to-cold reactivity uncertainty over the various assembly types and burnup attributed to fundamental cross-section data uncertainty. (authors)
Feizizadeh, Bakhtiar; Jankowski, Piotr; Blaschke, Thomas
2014-01-01
GIS multicriteria decision analysis (MCDA) techniques are increasingly used in landslide susceptibility mapping for the prediction of future hazards, land use planning, as well as for hazard preparedness. However, the uncertainties associated with MCDA techniques are inevitable, and model outcomes are open to multiple types of uncertainty. In this paper, we present a systematic approach to uncertainty and sensitivity analysis. We assess the uncertainty of landslide susceptibility maps produced with GIS-MCDA techniques. A new spatially-explicit approach and Dempster–Shafer Theory (DST) are employed to assess the uncertainties associated with two MCDA techniques, namely Analytical Hierarchical Process (AHP) and Ordered Weighted Averaging (OWA), implemented in GIS. The methodology is composed of three phases. First, weights are computed to express the relative importance of factors (criteria) for landslide susceptibility. Next, the uncertainty and sensitivity of landslide susceptibility are analyzed as a function of weights using Monte Carlo simulation and global sensitivity analysis. Finally, the results are validated using a landslide inventory database and by applying DST. The comparisons of the obtained landslide susceptibility maps of both MCDA techniques with known landslides show that the AHP outperforms OWA. However, the OWA-generated landslide susceptibility map shows lower uncertainty than the AHP-generated map. The results demonstrate that further improvement in the accuracy of GIS-based MCDA can be achieved by employing an integrated uncertainty–sensitivity analysis approach, in which the uncertainty of the landslide susceptibility model is decomposed and attributed to the model's criteria weights. PMID:25843987
DOE Office of Scientific and Technical Information (OSTI.GOV)
Emery, Keith
The measurement of photovoltaic (PV) performance with respect to reference conditions requires measuring current versus voltage for a given tabular reference spectrum, junction temperature, and total irradiance. This report presents the procedures implemented by the PV Cell and Module Performance Characterization Group at the National Renewable Energy Laboratory (NREL) to achieve the lowest practical uncertainty. A rigorous uncertainty analysis of these procedures is presented, which follows the International Organization for Standardization (ISO) Guide to the Expression of Uncertainty in Measurement. This uncertainty analysis is required for the team's laboratory accreditation under ISO standard 17025, "General Requirements for the Competence of Testing and Calibration Laboratories." The report also discusses additional areas where the uncertainty can be reduced.
Computerized quantitative evaluation of mammographic accreditation phantom images
DOE Office of Scientific and Technical Information (OSTI.GOV)
Lee, Yongbum; Tsai, Du-Yih; Shinohara, Norimitsu
2010-12-15
Purpose: The objective was to develop and investigate an automated scoring scheme for American College of Radiology (ACR) mammographic accreditation phantom (RMI 156, Middleton, WI) images. Methods: The developed method consisted of background subtraction, determination of regions of interest, classification of fiber and mass objects by Mahalanobis distance, detection of specks by template matching, and rule-based scoring. Fifty-one phantom images were collected from 51 facilities for this study (one facility provided one image). A medical physicist and two radiologic technologists also scored the images. The human and computerized scores were compared. Results: In terms of meeting the ACR's criteria, the accuracies of the developed method for computerized evaluation of fiber, mass, and speck objects were 90%, 80%, and 98%, respectively. Contingency table analysis revealed significant association between observer and computer scores for microcalcifications (p<5%) but not for masses and fibers. Conclusions: The developed method may achieve a stable assessment of the visibility of test objects in mammographic accreditation phantom images, and of whether a phantom image meets the ACR's criteria, although there is room for improvement in the approach for fiber and mass objects.
Dennett, Amy M; Taylor, Nicholas F
2015-01-01
To determine the effectiveness of computer-based electronic devices that provide feedback in improving mobility and balance and reducing falls. Randomized controlled trials were searched from the earliest available date to August 2013. Standardized mean differences were used to complete meta-analyses, with statistical heterogeneity described by the I-squared statistic. The GRADE approach was used to summarize the level of evidence for each completed meta-analysis. Risk of bias for individual trials was assessed with the Physiotherapy Evidence Database (PEDro) scale. Thirty trials were included. There was high-quality evidence that computerized devices can improve dynamic balance in people with a neurological condition compared with no therapy. There was low-to-moderate-quality evidence that computerized devices have no significant effect on mobility, falls efficacy, and falls risk in community-dwelling older adults and people with a neurological condition compared with physiotherapy. There is high-quality evidence that computerized devices that provide feedback may be useful in improving balance in people with neurological conditions compared with no therapy, but there is a lack of evidence supporting more meaningful changes in mobility and falls risk.
A computerized clinical decision support system as a means of implementing depression guidelines.
Trivedi, Madhukar H; Kern, Janet K; Grannemann, Bruce D; Altshuler, Kenneth Z; Sunderajan, Prabha
2004-08-01
The authors describe the history and current use of computerized systems for implementing treatment guidelines in general medicine, as well as the development, testing, and early use of a computerized decision support system for depression treatment in "real-world" clinical settings in Texas. In 1999, health care experts from Europe and the United States met to confront the well-documented challenges of implementing treatment guidelines and to identify strategies for improvement. They suggested integrating guidelines into computer systems incorporated into the clinical workflow. Several studies have demonstrated improvements in physicians' adherence to guidelines when such guidelines are provided in a computerized format. Although computerized decision support systems are being used in many areas of medicine and have demonstrated improved patient outcomes, their use in psychiatric illness is limited. The authors designed and developed a computerized decision support system for the treatment of major depressive disorder by using evidence-based guidelines, transferring the knowledge gained from the Texas Medication Algorithm Project (TMAP). This computerized decision support system (CompTMAP) provides support in diagnosis, treatment, follow-up, and preventive care, and can be incorporated into the clinical setting. CompTMAP has gone through extensive testing to ensure accuracy and reliability. Physician surveys have indicated a positive response to CompTMAP, although the sample was insufficient for statistical testing. CompTMAP is part of a new era of comprehensive computerized decision support systems that take advantage of advances in automation and provide more complete clinical support to physicians in clinical practice.
Diverticular Disease in the Primary Care Setting.
Wensaas, Knut-Arne; Hungin, Amrit Pali
2016-10-01
Diverticular disease is a chronic and common condition, and yet the impact of diverticular disease in primary care is largely unknown. The diagnosis of diverticular disease relies on the demonstration of diverticula in the colon, and the necessary investigations are often not available in primary care. The specificity and sensitivity of symptoms, clinical signs and laboratory tests alone are generally low, and consequently the diagnostic process will be characterized by uncertainty. Also, the criteria for symptomatic uncomplicated diverticular disease in the absence of macroscopic inflammation are not clearly defined. Therefore both the prevalence of diverticular disease and the incidence of diverticulitis in primary care are unknown. Current recommendations for treatment and follow-up of patients with acute diverticulitis are based on studies where the diagnosis has been verified by computerized tomography. The results cannot be directly transferred to primary care, where the diagnosis has to rely on the interpretation of symptoms and signs. Therefore, one must allow for greater diagnostic uncertainty, and safety netting in the event of unexpected development of the condition is an important aspect of the management of diverticulitis in primary care. The highest prevalence of diverticular disease is found among older patients, where multimorbidity and polypharmacy are common. The challenge is to remember the possible contribution of diverticular disease to the patient's overall condition and to foresee its implications in terms of advice and treatment in relation to other diseases.
Assessment of Radiative Heating Uncertainty for Hyperbolic Earth Entry
NASA Technical Reports Server (NTRS)
Johnston, Christopher O.; Mazaheri, Alireza; Gnoffo, Peter A.; Kleb, W. L.; Sutton, Kenneth; Prabhu, Dinesh K.; Brandis, Aaron M.; Bose, Deepak
2011-01-01
This paper investigates the shock-layer radiative heating uncertainty for hyperbolic Earth entry, with the main focus being a Mars return. In Part I of this work, a baseline simulation approach involving the LAURA Navier-Stokes code with coupled ablation and radiation is presented, with the HARA radiation code being used for the radiation predictions. Flight cases representative of peak-heating Mars or asteroid return are defined, and the strong influence of coupled ablation and radiation on their aerothermodynamic environments is shown. Structural uncertainties inherent in the baseline simulations are identified, with turbulence modeling, precursor absorption, grid convergence, and radiation transport uncertainties combining for a +34% and -24% structural uncertainty on the radiative heating. A parametric uncertainty analysis, which assumes interval uncertainties, is presented. This analysis accounts for uncertainties in the radiation models as well as heat of formation uncertainties in the flow field model. Discussions and references are provided to support the uncertainty range chosen for each parameter. A parametric uncertainty of +47.3% and -28.3% is computed for the stagnation-point radiative heating for the 15 km/s Mars-return case. A breakdown of the largest individual uncertainty contributors is presented, which includes C3 Swings cross-section, photoionization edge shift, and Opacity Project atomic lines. Combining the structural and parametric uncertainty components results in a total uncertainty of +81.3% and -52.3% for the Mars-return case. In Part II, the computational technique and uncertainty analysis presented in Part I are applied to 1960s-era shock-tube and constricted-arc experimental cases. It is shown that the experiments contain shock-layer temperatures and radiative flux values relevant to the Mars-return cases of present interest. Comparisons between the predictions and measurements, accounting for the uncertainty in both, are made for a range of experiments. A measure of comparison quality is defined, which consists of the percent overlap of the predicted uncertainty bar with the corresponding measurement uncertainty bar. For nearly all cases, this percent overlap is greater than zero, and for most of the higher temperature cases (T > 13,000 K) it is greater than 50%. These favorable comparisons provide evidence that the baseline computational technique and uncertainty analysis presented in Part I are adequate for Mars-return simulations. In Part III, the computational technique and uncertainty analysis presented in Part I are applied to EAST shock-tube cases. These experimental cases contain wavelength-dependent intensity measurements in a wavelength range that covers 60% of the radiative intensity for the 11 km/s, 5 m radius flight case studied in Part I. Comparisons between the predictions and EAST measurements are made for a range of experiments. The uncertainty analysis presented in Part I is applied to each prediction, and comparisons are made using the metrics defined in Part II. The agreement between predictions and measurements is excellent for velocities greater than 10.5 km/s. Both the wavelength-dependent and wavelength-integrated intensities agree within 30% for nearly all cases considered. This agreement provides confidence in the computational technique and uncertainty analysis presented in Part I, and provides further evidence that this approach is adequate for Mars-return simulations.
Part IV of this paper reviews existing experimental data that include the influence of massive ablation on radiative heating. It is concluded that this existing data is not sufficient for the present uncertainty analysis. Experiments to capture the influence of massive ablation on radiation are suggested as future work, along with further studies of the radiative precursor and improvements in the radiation properties of ablation products.
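The quoted totals are consistent with linear addition of the asymmetric interval bounds (+34% with +47.3% gives +81.3%; -24% with -28.3% gives -52.3%). A minimal sketch of that roll-up, assuming the linear-addition reading is correct:

```python
def combine_intervals(components):
    """Linearly add asymmetric interval uncertainties (plus, minus),
    matching the Part I roll-up of structural and parametric terms.
    The minus bound is carried as a positive magnitude."""
    plus = sum(p for p, _ in components)
    minus = sum(m for _, m in components)
    return plus, minus

structural = (34.0, 24.0)    # percent: turbulence, precursor, grid, transport
parametric = (47.3, 28.3)    # percent: radiation and flow-field parameters
print(combine_intervals([structural, parametric]))  # -> (81.3, 52.3)
```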
ERIC Educational Resources Information Center
1998
This document contains four papers from a symposium on technology in human resource development (HRD). "COBRA, an Electronic Performance Support System for the Analysis of Jobs and Tasks" (Theo J. Bastiaens) is described as an integrated computerized environment that provides tools, information, advice, and training to help employees do…
But Is It Nutritious? Computer Analysis Creates Healthier Meals.
ERIC Educational Resources Information Center
Corrigan, Kathleen A.; Aumann, Margaret B.
1993-01-01
A computerized menu-planning method, "Nutrient Standard Menu Planning" (NSMP), uses today's technology to create healthier menus. Field tested in 20 California school districts, the advantages of NSMP are cost effectiveness, increased flexibility, greater productivity, improved public relations, improved finances, and improved student…
Robustness Analysis and Optimally Robust Control Design via Sum-of-Squares
NASA Technical Reports Server (NTRS)
Dorobantu, Andrei; Crespo, Luis G.; Seiler, Peter J.
2012-01-01
A control analysis and design framework is proposed for systems subject to parametric uncertainty. The underlying strategies are based on sum-of-squares (SOS) polynomial analysis and nonlinear optimization to design an optimally robust controller. The approach determines a maximum uncertainty range for which the closed-loop system satisfies a set of stability and performance requirements. These requirements, defined as inequality constraints on several metrics, are restricted to polynomial functions of the uncertainty. To quantify robustness, SOS analysis is used to prove that the closed-loop system complies with the requirements for a given uncertainty range. The maximum uncertainty range, calculated by assessing a sequence of increasingly larger ranges, serves as a robustness metric for the closed-loop system. To optimize the control design, nonlinear optimization is used to enlarge the maximum uncertainty range by tuning the controller gains. Hence, the resulting controller is optimally robust to parametric uncertainty. This approach balances the robustness margins corresponding to each requirement in order to maximize the aggregate system robustness. The proposed framework is applied to a simple linear short-period aircraft model with uncertain aerodynamic coefficients.
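The "sequence of increasingly larger ranges" can be organized as an expand-then-bisect search around a feasibility check. The sketch below is a loose illustration only: a sampling-based stability test stands in for the SOS certificate (which in practice requires a semidefinite-programming solver), and the closed-loop polynomial is invented, not the paper's aircraft model.

```python
import numpy as np

def max_certified_range(certify, lo=0.0, hi=1.0, tol=1e-3):
    """Grow, then bisect, the uncertainty range delta for which
    certify(delta) holds -- the outer loop of the range search.
    Here certify is a stand-in for an SOS feasibility check."""
    while certify(hi):          # expand until the certificate fails
        lo, hi = hi, 2.0 * hi
    while hi - lo > tol:        # bisect to the largest certified range
        mid = 0.5 * (lo + hi)
        lo, hi = (mid, hi) if certify(mid) else (lo, mid)
    return lo

def sampled_certificate(delta, n=200):
    """Sampling stand-in: check stability of a hypothetical closed-loop
    polynomial whose coefficients depend on a parameter q in [-delta, delta]."""
    for q in np.linspace(-delta, delta, n):
        roots = np.roots([1.0, 2.0 - q, 1.5 + 0.5 * q])
        if np.any(roots.real >= 0):
            return False
    return True

print(max_certified_range(sampled_certificate))
```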
Samad, Noor Asma Fazli Abdul; Sin, Gürkan; Gernaey, Krist V; Gani, Rafiqul
2013-11-01
This paper presents the application of uncertainty and sensitivity analysis as part of a systematic model-based process monitoring and control (PAT) system design framework for crystallization processes. For the uncertainty analysis, the Monte Carlo procedure is used to propagate input uncertainty, while for sensitivity analysis, global methods including the standardized regression coefficients (SRC) and Morris screening are used to identify the most significant parameters. The potassium dihydrogen phosphate (KDP) crystallization process is used as a case study, both in open-loop and closed-loop operation. In the uncertainty analysis, the impact on the predicted output of uncertain parameters related to the nucleation and the crystal growth model has been investigated for both a one- and two-dimensional crystal size distribution (CSD). The open-loop results show that the input uncertainties lead to significant uncertainties on the CSD, with appearance of a secondary peak due to secondary nucleation for both cases. The sensitivity analysis indicated that the most important parameters affecting the CSDs are nucleation order and growth order constants. In the proposed PAT system design (closed-loop), the target CSD variability was successfully reduced compared to the open-loop case, also when considering uncertainty in nucleation and crystal growth model parameters. The latter forms a strong indication of the robustness of the proposed PAT system design in achieving the target CSD and encourages its transfer to full-scale implementation. Copyright © 2013 Elsevier B.V. All rights reserved.
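The Monte Carlo propagation plus standardized regression coefficients (SRC) workflow can be sketched generically; the toy model and parameter distributions below are assumptions, not the KDP crystallization model:

```python
import numpy as np

rng = np.random.default_rng(1)

def crystal_model(kb, g):
    """Toy stand-in for the nucleation/growth model: returns a scalar
    output (e.g., mean crystal size) from nucleation and growth orders."""
    return 50.0 + 8.0 * g - 3.0 * kb + rng.normal(0.0, 0.5)

# Monte Carlo propagation of input uncertainty (hypothetical ranges).
n = 2000
kb = rng.normal(2.0, 0.2, n)    # nucleation order
g = rng.normal(1.5, 0.15, n)    # growth order
y = np.array([crystal_model(k, gg) for k, gg in zip(kb, g)])
print("output mean/std:", y.mean(), y.std())

# Standardized regression coefficients: regress standardized output on
# standardized inputs; the magnitude |SRC| ranks parameter importance.
X = np.column_stack([(kb - kb.mean()) / kb.std(), (g - g.mean()) / g.std()])
src, *_ = np.linalg.lstsq(X, (y - y.mean()) / y.std(), rcond=None)
print("SRC (kb, g):", src)
```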
Towards a framework for developing semantic relatedness reference standards.
Pakhomov, Serguei V S; Pedersen, Ted; McInnes, Bridget; Melton, Genevieve B; Ruggieri, Alexander; Chute, Christopher G
2011-04-01
Our objective is to develop a framework for creating reference standards for functional testing of computerized measures of semantic relatedness. Currently, research on computerized approaches to semantic relatedness between biomedical concepts relies on reference standards created for specific purposes using a variety of methods for their analysis. In most cases, these reference standards are not publicly available, and the published information provided in manuscripts that evaluate computerized semantic relatedness measurement approaches is not sufficient to reproduce the results. Our proposed framework is based on the experiences of the medical informatics and computational linguistics communities and addresses practical and theoretical issues with creating reference standards for semantic relatedness. We demonstrate the use of the framework on a pilot set of 101 medical term pairs rated for semantic relatedness by 13 medical coding experts. While the reliability of this particular reference standard is in the "moderate" range, we show that using clustering and factor analyses offers a data-driven approach to finding systematic differences among raters and identifying groups of potential outliers. We test two ontology-based measures of relatedness and provide both the reference standard containing individual ratings and the R program used to analyze the ratings as open source. Currently, these resources are intended to be used to reproduce and compare results of studies involving computerized measures of semantic relatedness. Our framework may be extended to the development of reference standards in other research areas in medical informatics, including automatic classification, information retrieval from medical records, and vocabulary/ontology development. Copyright © 2010 Elsevier Inc. All rights reserved.
Van de Weijer-Bergsma, Eva; Kroesbergen, Evelyn H; Jolani, Shahab; Van Luit, Johannes E H
2016-06-01
In two studies, the psychometric properties of an online self-reliant verbal working memory task (the Monkey game) for primary school children (6-12 years of age) were examined. In Study 1, children (n = 5,203) from 31 primary schools participated. The participants completed computerized verbal and visual-spatial working memory tasks (i.e., the Monkey game and the Lion game) and a paper-and-pencil version of Raven's Standard Progressive Matrices. Reading comprehension and math achievement test scores were obtained from the schools. First, the internal consistency of the Monkey game was examined. Second, multilevel modeling was used to examine the effects of classroom membership. Multilevel multivariate regression analysis was used to examine the Monkey game's concurrent relationship with the Lion game and its predictive relationships with reading comprehension and math achievement. Also, age-related differences in performance were examined. In Study 2, the concurrent relationships between the Monkey game and two tester-led computerized working memory tasks were further examined (n = 140). Also, the 1- and 2-year stability of the Monkey game was investigated. The Monkey game showed excellent internal consistency, good concurrent relationships with the other working memory measures, and significant age differences in performance. Performance on the Monkey game was also predictive of subsequent reading comprehension and mathematics performance, even after controlling for individual differences in intelligence. Performance on the Monkey game was influenced by classroom membership. The Monkey game is a reliable and suitable instrument for the online computerized and self-reliant assessment of verbal working memory in primary school children.
Computerized database management system for breast cancer patients.
Sim, Kok Swee; Chong, Sze Siang; Tso, Chih Ping; Nia, Mohsen Esmaeili; Chong, Aun Kee; Abbas, Siti Fathimah
2014-01-01
Data analysis based on breast cancer risk factors such as age, race, breastfeeding, hormone replacement therapy, family history, and obesity was conducted on breast cancer patients using a new enhanced computerized database management system. MySQL, a Structured Query Language (SQL) database system, is selected as the application for the database management system to store the patient data collected from hospitals in Malaysia. An automatic calculation tool is embedded in this system to assist the data analysis. The results are plotted automatically, and a user-friendly graphical user interface that can control the MySQL database is developed. Case studies show that the breast cancer incidence rate is highest among Malay women, followed by Chinese and Indian women. The peak age for breast cancer incidence is from 50 to 59 years old. Results suggest that the chance of developing breast cancer is increased in older women and reduced with breastfeeding practice. Weight status might affect breast cancer risk differently. Additional studies are needed to confirm these findings.
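The grouping behind findings like the reported age and ethnicity peaks is ordinary SQL aggregation. A minimal sketch, using Python's built-in sqlite3 in place of MySQL and an invented schema:

```python
import sqlite3

# In-memory stand-in for the MySQL patient table (schema is hypothetical).
con = sqlite3.connect(":memory:")
con.execute("""CREATE TABLE patients (
    id INTEGER PRIMARY KEY, age INTEGER, race TEXT, breastfeeding INTEGER)""")
con.executemany(
    "INSERT INTO patients (age, race, breastfeeding) VALUES (?, ?, ?)",
    [(52, "Malay", 1), (57, "Chinese", 0), (63, "Indian", 1), (55, "Malay", 0)])

# Incidence counts by race and decade of age, the kind of grouping
# behind the reported peaks.
for row in con.execute("""
        SELECT race, (age / 10) * 10 AS decade, COUNT(*) AS n
        FROM patients GROUP BY race, decade ORDER BY n DESC"""):
    print(row)
```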
Computerized Adaptive Test (CAT) Applications and Item Response Theory Models for Polytomous Items
ERIC Educational Resources Information Center
Aybek, Eren Can; Demirtasli, R. Nukhet
2017-01-01
This article aims to provide a theoretical framework for computerized adaptive tests (CAT) and item response theory models for polytomous items. It also aims to introduce simulation and live CAT software to interested researchers. Computerized adaptive test algorithm, assumptions of item response theory models, nominal response…
ERIC Educational Resources Information Center
Gullo, Shirna R.
2014-01-01
Computerized testing may be one solution to enhance performance on the curricular Health Education Systems Inc. (HESI) exam and the National Council Licensure Exam for Registered Nurses (NCLEX-RN). Due to the integration of improved technological processes and procedures in healthcare for computerized documentation and electronic medical records,…
ERIC Educational Resources Information Center
Wu, Huey-Min; Kuo, Bor-Chen; Wang, Su-Chen
2017-01-01
In this study, a computerized dynamic assessment test with both immediate individualized feedback and adaptive properties was applied to mathematics learning in primary school. For evaluating the effectiveness of the computerized dynamic adaptive test, the performances of three types of remedial instruction were compared by a pre-test/post-test…
Computerized Management of Physical Plant Services.
ERIC Educational Resources Information Center
Hawkey, Earl W.; Kleinpeter, Joseph
Outlining the major areas to be considered when deciding whether or not to computerize physical plant services in higher education institutions, the author points out the shortcomings of manual record keeping systems. He gives five factors to consider when deciding to computerize: (1) time and money, (2) extent of operation, (3) current and future…
The Evaluation of SISMAKOM (Computerized SDI Project).
ERIC Educational Resources Information Center
University of Science, Penang (Malaysia).
A survey of 88 users of SISMAKOM, a computerized selective dissemination of information (SDI) and document delivery service provided by the Universiti Sains Malaysia and four other Malaysian universities, was conducted in August 1982 in order to collect data about SISMAKOM and to assess the value of a computerized SDI service in a developing…
ERIC Educational Resources Information Center
Klemes, Joel; Epstein, Alit; Zuker, Michal; Grinberg, Nira; Ilovitch, Tamar
2006-01-01
The current study examines how a computerized learning environment assists students with learning disabilities (LD) enrolled in a distance learning course at the Open University of Israel. The technology provides computer display of the text, synchronized with auditory output and accompanied by additional computerized study skill tools which…
ERIC Educational Resources Information Center
Sabbah, Sabah Salman
2015-01-01
This study explored the potential effect of college students' self-generated computerized mind maps on their reading comprehension. It also investigated the subjects' attitudes toward generating computerized mind maps for reading comprehension. The study was conducted in response to the inability of the foundation-level students, who were learning…
The Impact of Computerization on Archival Finding Aids: A RAMP Study.
ERIC Educational Resources Information Center
Kitching, Christopher
This report is based on a questionnaire sent to 32 selected National Archives and on interviews with archivists from eight countries. Geared to the needs of developing countries, the report covers: (1) the impact of computerization on finding aids; (2) advantages and problems of computerization, including enhanced archival control, integration of…
Uhm, Yo-Han; Yang, Dae-Jung
2018-02-01
[Purpose] The purpose of this study was to examine the effect of computerized postural control training using whole body vibration on lower limb muscle activity and cerebral cortical activation in acute stroke patients. [Subjects and Methods] Thirty stroke patients participated and were divided into three groups of 10: computerized postural control training using whole body vibration (Group I), computerized postural control training combined with aero step (Group II), and computerized postural control training alone (Group III). MP100 was used to measure lower limb muscle activity, and QEEG-8 was used to measure cerebral cortical activation. [Results] Comparison of muscle activity and cerebral cortical activation before and after intervention between groups showed that Group I had significant differences in lower limb muscle activity and cerebral cortical activation compared to Groups II and III. [Conclusion] This study showed that computerized postural control training combined with whole body vibration is effective for improving muscle activity and cerebral cortex activity in stroke patients.
Uncertainty in Operational Atmospheric Analyses and Re-Analyses
NASA Astrophysics Data System (ADS)
Langland, R.; Maue, R. N.
2016-12-01
This talk will describe uncertainty in atmospheric analyses of wind and temperature produced by operational forecast models and in re-analysis products. Because the "true" atmospheric state cannot be precisely quantified, there is necessarily error in every atmospheric analysis, and this error can be estimated by computing differences (variance and bias) between analysis products produced at various centers (e.g., ECMWF, NCEP, U.S. Navy, etc.) that use independent data assimilation procedures, somewhat different sets of atmospheric observations, and forecast models with different resolutions, dynamical equations, and physical parameterizations. These estimates of analysis uncertainty provide a useful proxy for actual analysis error. For this study, we use a unique multi-year and multi-model data archive developed at NRL-Monterey. It will be shown that current uncertainty in atmospheric analyses is closely correlated with the geographic distribution of assimilated in-situ atmospheric observations, especially those provided by high-accuracy radiosonde and commercial aircraft observations. The lowest atmospheric analysis uncertainty is found over North America, Europe and Eastern Asia, which have the largest numbers of radiosonde and commercial aircraft observations. Analysis uncertainty is substantially larger (by factors of two to three) in most of the Southern Hemisphere, the North Pacific Ocean, and under-developed nations of Africa and South America where there are few radiosonde or commercial aircraft data. It appears that in regions where atmospheric analyses depend primarily on satellite radiance observations, analysis uncertainty of both temperature and wind remains relatively high compared to values found over North America and Europe.
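The core computation, estimating bias and difference variance between two centers' analyses on a shared grid, is simple. A sketch with synthetic fields standing in for real analyses:

```python
import numpy as np

def analysis_uncertainty(field_a, field_b):
    """Proxy analysis uncertainty from two independent centers' analyses:
    mean difference (bias) and difference variance on the shared grid."""
    diff = field_a - field_b
    return diff.mean(), diff.var(ddof=1)

rng = np.random.default_rng(0)
truth = rng.normal(280.0, 5.0, (90, 180))             # hypothetical temperature grid
center1 = truth + rng.normal(0.1, 0.4, truth.shape)   # well-observed-region errors
center2 = truth + rng.normal(-0.2, 0.9, truth.shape)  # sparser-data errors
bias, var = analysis_uncertainty(center1, center2)
print(f"bias={bias:.2f} K, difference variance={var:.2f} K^2")
```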
Framing of Uncertainty in Scientific Publications: Towards Recommendations for Decision Support
NASA Astrophysics Data System (ADS)
Guillaume, J. H. A.; Helgeson, C.; Elsawah, S.; Jakeman, A. J.; Kummu, M.
2016-12-01
Uncertainty is recognised as an essential issue in environmental decision making and decision support. As modellers, we notably use a variety of tools and techniques within an analysis, for example related to uncertainty quantification and model validation. We also address uncertainty by how we present results. For example, experienced modellers are careful to distinguish robust conclusions from those that need further work, and the precision of quantitative results is tailored to their accuracy. In doing so, the modeller frames how uncertainty should be interpreted by their audience. This is an area which extends beyond modelling to fields such as philosophy of science, semantics, discourse analysis, intercultural communication and rhetoric. We propose that framing of uncertainty deserves greater attention in the context of decision support, and that there are opportunities in this area for fundamental research, synthesis and knowledge transfer, development of teaching curricula, and significant advances in managing uncertainty in decision making. This presentation reports preliminary results of a study of framing practices. Specifically, we analyse the framing of uncertainty that is visible in the abstracts from a corpus of scientific articles. We do this through textual analysis of the content and structure of those abstracts. Each finding that appears in an abstract is classified according to the uncertainty framing approach used, using a classification scheme that was iteratively revised based on reflection and comparison amongst three coders. This analysis indicates how frequently the different framing approaches are used, and provides initial insights into relationships between frames, how the frames relate to interpretation of uncertainty, and how rhetorical devices are used by modellers to communicate uncertainty in their work. We propose initial hypotheses for how the resulting insights might influence decision support, and help advance decision making to better address uncertainty.
NASA Astrophysics Data System (ADS)
Zhou, Rurui; Li, Yu; Lu, Di; Liu, Haixing; Zhou, Huicheng
2016-09-01
This paper investigates the use of an epsilon-dominance non-dominated sorted genetic algorithm II (ɛ-NSGAII) as a sampling approach with the aim of improving sampling efficiency for multiple-metrics uncertainty analysis using Generalized Likelihood Uncertainty Estimation (GLUE). The effectiveness of ɛ-NSGAII based sampling is demonstrated in comparison with Latin hypercube sampling (LHS) through analyzing sampling efficiency, multiple-metrics performance, parameter uncertainty and flood forecasting uncertainty, with a case study of flood forecasting uncertainty evaluation based on the Xinanjiang model (XAJ) for Qing River reservoir, China. Results obtained demonstrate the following advantages of the ɛ-NSGAII based sampling approach in comparison to LHS: (1) The former performs more effectively and efficiently than LHS; for example, the simulation time required to generate 1000 behavioral parameter sets is nine times shorter; (2) The Pareto tradeoffs between metrics are demonstrated clearly by the solutions from ɛ-NSGAII based sampling, and their Pareto optimal values are better than those of LHS, which means better forecasting accuracy for the ɛ-NSGAII parameter sets; (3) The parameter posterior distributions from ɛ-NSGAII based sampling are concentrated in appropriate ranges rather than uniform, which accords with their physical significance, and parameter uncertainties are reduced significantly; (4) The forecasted floods are close to the observations as evaluated by three measures: the normalized total flow outside the uncertainty intervals (FOUI), average relative band-width (RB) and average deviation amplitude (D). Flood forecasting uncertainty is also reduced considerably with ɛ-NSGAII based sampling. This study provides a new sampling approach to improve multiple-metrics uncertainty analysis under the framework of GLUE, and could be used to reveal the underlying mechanisms of parameter sets under multiple conflicting metrics in the uncertainty analysis process.
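Whatever sampler generates the parameter sets, the GLUE step itself is a behavioral-threshold filter followed by likelihood-weighted prediction bands. A generic sketch (the ensemble, likelihoods, and threshold are invented):

```python
import numpy as np

def glue_bounds(likelihoods, simulations, threshold, qs=(5, 95), seed=0):
    """Keep behavioral parameter sets (likelihood above the threshold) and
    form likelihood-weighted percentile bands for the simulated series."""
    L = np.asarray(likelihoods)
    sims = np.asarray(simulations)[L > threshold]
    w = L[L > threshold]
    w = w / w.sum()
    # Weighted percentiles via likelihood-proportional resampling.
    idx = np.random.default_rng(seed).choice(len(sims), size=5000, p=w)
    return np.percentile(sims[idx], qs, axis=0)

# Hypothetical ensemble: 300 parameter sets, 50-step flood hydrographs.
rng = np.random.default_rng(2)
sims = rng.gamma(2.0, 10.0, (300, 50))
likelihoods = rng.uniform(0.0, 1.0, 300)     # e.g., Nash-Sutcliffe-based
lower, upper = glue_bounds(likelihoods, sims, threshold=0.6)
print(lower[:3], upper[:3])
```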
Fumis, Renata Rego Lins; Costa, Eduardo Leite Vieira; Martins, Paulo Sergio; Pizzo, Vladimir; Souza, Ivens Augusto; Schettino, Guilherme de Paula Pinto
2014-01-01
To evaluate the satisfaction of the intensive care unit staff with a computerized physician order entry and to compare the concept of the computerized physician order entry relevance among intensive care unit healthcare workers. We performed a cross-sectional survey to assess the satisfaction of the intensive care unit staff with the computerized physician order entry in a 30-bed medical/surgical adult intensive care unit using a self-administered questionnaire. The questions used for grading satisfaction levels were answered according to a numerical scale that ranged from 1 point (low satisfaction) to 10 points (high satisfaction). The majority of the 250 respondents were female (66%) and between 30 and 35 years of age (69%). The overall satisfaction with the computerized physician order entry scored 5.74±2.14 points. Satisfaction was lower among physicians (n=42) than among nurses, nurse technicians, respiratory therapists, clinical pharmacists and diet specialists (4.62±1.79 versus 5.97±2.14, p<0.001); satisfaction decreased with age (p<0.001). Physicians scored lower concerning the potential of the computerized physician order entry for improving patient safety (5.45±2.20 versus 8.09±2.21, p<0.001) and the ease of using the computerized physician order entry (3.83±1.88 versus 6.44±2.31, p<0.001). The characteristics independently associated with satisfaction were the system's user-friendliness, accuracy, capacity to provide clear information, and fast response time. Six months after its implementation, healthcare workers were satisfied, albeit not entirely, with the computerized physician order entry. The overall users' satisfaction with the computerized physician order entry was lower among physicians compared to other healthcare professionals. The factors associated with satisfaction included the belief that digitalization decreased the workload and contributed to intensive care unit quality with a user-friendly and accurate system, and that digitalization provided concise information within a reasonable time frame.
The Application Programming Interface (API) for Uncertainty Analysis, Sensitivity Analysis, and Parameter Estimation (UA/SA/PE API) tool development, hereafter referred to as the Calibration, Optimization, and Sensitivity and Uncertainty Algorithms API (COSU-API), was initially d...
Uncertainty Analysis of Seebeck Coefficient and Electrical Resistivity Characterization
NASA Technical Reports Server (NTRS)
Mackey, Jon; Sehirlioglu, Alp; Dynys, Fred
2014-01-01
In order to provide a complete description of a material's thermoelectric power factor, an uncertainty interval is required in addition to the measured nominal value. The uncertainty may contain sources of measurement error including systematic bias error and precision error of a statistical nature. The work focuses specifically on the popular ZEM-3 (Ulvac Technologies) measurement system, but the methods apply to any measurement system. The analysis accounts for sources of systematic error including sample preparation tolerance, measurement probe placement, thermocouple cold-finger effect, and measurement parameters, in addition to uncertainty of a statistical nature. Complete uncertainty analysis of a measurement system allows for more reliable comparison of measurement data between laboratories.
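Once the component uncertainties are estimated, they propagate into the power factor PF = S^2/rho by first-order combination in quadrature. A sketch with invented magnitudes:

```python
import numpy as np

def power_factor_uncertainty(S, u_S, rho, u_rho):
    """Propagate Seebeck (S) and resistivity (rho) uncertainties into the
    power factor PF = S**2 / rho via first-order quadrature combination."""
    pf = S**2 / rho
    rel = np.sqrt((2 * u_S / S) ** 2 + (u_rho / rho) ** 2)
    return pf, pf * rel

# Hypothetical values: S in V/K, rho in ohm*m; u_S combines bias terms
# (e.g., probe placement, cold-finger effect) with precision in quadrature.
u_S = np.sqrt(3e-6**2 + 2e-6**2)
pf, u_pf = power_factor_uncertainty(2.0e-4, u_S, 1.0e-5, 4e-7)
print(f"PF = {pf:.3e} +/- {u_pf:.1e} W/(m*K^2)")
```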
NASA Astrophysics Data System (ADS)
Shafii, M.; Tolson, B.; Matott, L. S.
2012-04-01
Hydrologic modeling has benefited from significant developments over the past two decades. This has resulted in the building of higher levels of complexity into hydrologic models, which eventually makes the model evaluation process (parameter estimation via calibration and uncertainty analysis) more challenging. In order to avoid unreasonable parameter estimates, many researchers have suggested implementation of multi-criteria calibration schemes. Furthermore, for predictive hydrologic models to be useful, proper consideration of uncertainty is essential. Consequently, recent research has emphasized comprehensive model assessment procedures in which multi-criteria parameter estimation is combined with statistically based uncertainty analysis routines such as Bayesian inference using Markov Chain Monte Carlo (MCMC) sampling. Such a procedure relies on the use of formal likelihood functions based on statistical assumptions, and moreover, Bayesian inference structured on MCMC samplers requires a considerable number of simulations. Due to these issues, especially in complex non-linear hydrological models, a variety of alternative informal approaches have been proposed for uncertainty analysis in the multi-criteria context. This study aims to explore a number of such informal uncertainty analysis techniques in multi-criteria calibration of hydrological models. The informal methods addressed in this study are (i) Pareto optimality, which quantifies the parameter uncertainty using the Pareto solutions, (ii) DDS-AU, which uses the weighted sum of objective functions to derive the prediction limits, and (iii) GLUE, which describes the total uncertainty through identification of behavioral solutions. The main objective is to compare such methods with MCMC-based Bayesian inference with respect to factors such as computational burden and predictive capacity, which are evaluated based on multiple comparative measures. The measures for comparison are calculated both for calibration and evaluation periods. The uncertainty analysis methodologies are applied to a simple 5-parameter rainfall-runoff model, called HYMOD.
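Of the informal methods listed, Pareto optimality is the simplest to state in code: the parameter uncertainty is spanned by the non-dominated solutions. A sketch with invented error metrics:

```python
import numpy as np

def pareto_front(objectives):
    """Return the indices of non-dominated rows of an
    (n_sets, n_objectives) array, all objectives minimized."""
    obj = np.asarray(objectives)
    keep = []
    for i, row in enumerate(obj):
        dominated = np.any(np.all(obj <= row, axis=1) &
                           np.any(obj < row, axis=1))
        if not dominated:
            keep.append(i)
    return np.array(keep)

# Hypothetical HYMOD-style calibration: two error metrics per parameter set.
rng = np.random.default_rng(3)
errors = rng.uniform(0.0, 1.0, (200, 2))
front = pareto_front(errors)
print(f"{front.size} non-dominated sets span the parameter uncertainty")
```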
Truyers, Carla; Lesaffre, Emmanuel; Bartholomeeusen, Stefaan; Aertgeerts, Bert; Snacken, René; Brochier, Bernard; Yane, Fernande; Buntinx, Frank
2010-03-22
Computerized morbidity registration networks might serve as early warning systems in a time when natural epidemics such as the H1N1 flu can easily spread from one region to another. In this contribution we examine whether general practice based broad-spectrum computerized morbidity registration networks have the potential to act as a valid surveillance instrument for frequently occurring diseases. We compare general practice based computerized data assessing the frequency of influenza-like illness (ILI) and acute respiratory infections (ARI) with data from a well-established case-specific sentinel network, the European Influenza Surveillance Scheme (EISS). The overall frequency and trends of weekly ILI and ARI data are compared using both networks. Detection of influenza-like illness and acute respiratory illness occurs equally fast in EISS and the computerized network. The overall frequency data for ARI are the same for both networks and the overall trends are similar, but the increases and decreases in frequency do not occur in exactly the same weeks. For ILI, the overall rate was slightly higher for the computerized network population, especially before the increase of ILI; the overall trend was almost identical, and the increases and decreases occur in the same weeks for both networks. Computerized morbidity registration networks are a valid tool for monitoring frequently occurring respiratory diseases and detecting sudden outbreaks.
Uncertainty analysis of diffuse-gray radiation enclosure problems: A hypersensitive case study
NASA Technical Reports Server (NTRS)
Taylor, Robert P.; Luck, Rogelio; Hodge, B. K.; Steele, W. Glenn
1993-01-01
An uncertainty analysis of diffuse-gray enclosure problems is presented. The genesis was a diffuse-gray enclosure problem which proved to be hypersensitive to the specification of view factors. This genesis is discussed in some detail. The uncertainty analysis is presented for the general diffuse-gray enclosure problem and applied to the hypersensitive case study. It was found that the hypersensitivity could be greatly reduced by enforcing both closure and reciprocity for the view factors. The effects of uncertainties in the surface emissivities and temperatures are also investigated.
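One simple way to enforce closure and reciprocity on an inconsistent view-factor matrix (a plausible repair scheme, not necessarily the authors' procedure) is to alternate symmetrization of A_i F_ij with row normalization:

```python
import numpy as np

def enforce_reciprocity_closure(F, areas, n_iter=50):
    """Repair a measured view-factor matrix: enforce reciprocity
    A_i F_ij = A_j F_ji by symmetrizing A_i F_ij, then enforce closure
    (rows sum to 1) by normalization, iterating to approximate both."""
    A = np.asarray(areas, dtype=float)
    F = np.array(F, dtype=float)
    for _ in range(n_iter):
        AF = A[:, None] * F
        AF = 0.5 * (AF + AF.T)             # reciprocity
        F = AF / A[:, None]
        F /= F.sum(axis=1, keepdims=True)  # closure
    return F

# Hypothetical 3-surface enclosure with slightly inconsistent inputs.
F = [[0.00, 0.52, 0.46],
     [0.26, 0.00, 0.76],
     [0.23, 0.74, 0.05]]
F_fixed = enforce_reciprocity_closure(F, areas=[2.0, 1.0, 1.0])
print(np.round(F_fixed, 3), F_fixed.sum(axis=1))
```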
Performance Assessment Uncertainty Analysis for Japan's HLW Program Feasibility Study (H12)
DOE Office of Scientific and Technical Information (OSTI.GOV)
BABA,T.; ISHIGURO,K.; ISHIHARA,Y.
1999-08-30
Most HLW programs in the world recognize that any estimate of long-term radiological performance must be couched in terms of the uncertainties derived from natural variation, changes through time and lack of knowledge about the essential processes. The Japan Nuclear Cycle Development Institute followed a relatively standard procedure to address two major categories of uncertainty. First, a FEatures, Events and Processes (FEPs) listing, screening and grouping activity was pursued in order to define the range of uncertainty in system processes as well as possible variations in engineering design. A reference case and many alternative cases representing various groups of FEPs were defined, and individual numerical simulations were performed for each to quantify the range of conceptual uncertainty. Second, parameter distributions were developed for the reference case to represent the uncertainty in the strength of these processes, the sequencing of activities and geometric variations. Both point estimates using high and low values for individual parameters and a probabilistic analysis were performed to estimate parameter uncertainty. A brief description of the conceptual model uncertainty analysis is presented. This paper focuses on presenting the details of the probabilistic parameter uncertainty assessment.
Methods for Estimating the Uncertainty in Emergy Table-Form Models
Emergy studies have suffered criticism due to the lack of uncertainty analysis and this shortcoming may have directly hindered the wider application and acceptance of this methodology. Recently, to fill this gap, the sources of uncertainty in emergy analysis were described and an...
Code of Federal Regulations, 2010 CFR
2010-10-01
... installation, operation, maintenance and enhancement of Computerized Tribal IV-D Systems and Office Automation... HEALTH AND HUMAN SERVICES COMPUTERIZED TRIBAL IV-D SYSTEMS AND OFFICE AUTOMATION Funding for Computerized Tribal IV-D Systems and Office Automation § 310.20 What are the conditions for funding the installation...
Code of Federal Regulations, 2010 CFR
2010-10-01
... 45 Public Welfare 2 2010-10-01 2010-10-01 false Under what circumstances would emergency FFP be... AND OFFICE AUTOMATION Funding for Computerized Tribal IV-D Systems and Office Automation § 310.35 Under what circumstances would emergency FFP be available for Computerized Tribal IV-D Systems? (a...
ERIC Educational Resources Information Center
Lavy, Ilana
2006-01-01
This paper presents a description of the different types of arguments that emerged as two students, working in a computerized environment, engaged in an investigation of several number theory concepts. The emerging arguments are seen as a result of the influence of the computerized environment together with collaborative learning. Using…
Development of a Computerized In-Basket Exercise for the Classroom: A Sales Management Example
ERIC Educational Resources Information Center
Pearson, Michael M.; Barnes, John W.; Onken, Marina H.
2006-01-01
This article follows the development of a sales management in-basket exercise for use in the classroom. The authors have computerized the exercise and added features to allow for additional and more quantitative input from the students. The exercise has evolved and been tested in numerous classroom situations. The computerized in-basket exercise…
ERIC Educational Resources Information Center
Chang, Frank Tien-Jin
Computerized school administration has become one of the most crucial innovations in vocational education in Taiwan, Republic of China. As these educators begin to design or purchase computerized information systems for their own schools, they must first define their specific information needs. Next, they should pay attention to…
Assessment Outcomes: Computerized Instruction in a Human Gross Anatomy Course.
ERIC Educational Resources Information Center
Bukowski, Elaine L.
2002-01-01
The first of three successive classes of beginning physical therapy students (n=17) completed a traditional cadaver anatomy lecture/lab; the next 17 completed a self-study computerized anatomy lab; and the next 20 completed both lectures and the computer lab. No differences in study times or course or licensure exam performance appeared. Computerized self-study is a…
Preliminary GIS analysis of the agricultural landscape of Cuyo Cuyo, Department of Puno, Peru
NASA Technical Reports Server (NTRS)
Winterhalder, Bruce; Evans, Tom
1991-01-01
Computerized analysis of a geographic database (GIS) for Cuyo Cuyo, (Dept. Puno, Peru) is used to correlate the agricultural production zones of two adjacent communities to altitude, slope, aspect, and other geomorphological features of the high-altitude eastern escarpment landscape. The techniques exemplified will allow ecological anthropologists to analyze spatial patterns at regional scales with much greater control over the data.
ERIC Educational Resources Information Center
Stich, Judith, Ed.
Proceedings of the 1980 Financial Measures Conference are presented. Papers and authors are as follows: "Ratio Analysis in Higher Education" (John Minter); "Computerized Application of Financial Assessment Technology" (Daniel Updegrove and Stephen D. Campbell); "The Uses and Utility of HEGIS Financial Data" (Loyd…
Spring Small Grains Area Estimation
NASA Technical Reports Server (NTRS)
Palmer, W. F.; Mohler, R. J.
1986-01-01
SSG3 automatically estimates acreage of spring small grains from Landsat data. The report describes development and testing of a computerized technique for using Landsat multispectral scanner (MSS) data to estimate acreage of spring small grains (wheat, barley, and oats). Application of the technique to analysis of four years of data from the United States and Canada yielded estimates of accuracy comparable to those obtained through procedures that rely on trained analysts.
A computer program for cyclic plasticity and structural fatigue analysis
NASA Technical Reports Server (NTRS)
Kalev, I.
1980-01-01
A computerized tool for the analysis of time independent cyclic plasticity structural response, life to crack initiation prediction, and crack growth rate prediction for metallic materials is described. Three analytical items are combined: the finite element method with its associated numerical techniques for idealization of the structural component, cyclic plasticity models for idealization of the material behavior, and damage accumulation criteria for the fatigue failure.
Preliminary Analysis of LORAN-C System Reliability for Civil Aviation.
1981-09-01
overview of the analysis technique. Section 3 describes the computerized LORAN-C coverage model which is used extensively in the reliability analysis...Xth Plenary Assembly, Geneva, 1963, published by International Telecommunications Union. Braff, R., Computer program to calculate a Markov Chain Reliability Model, unpublished work, MITRE Corporation.
ERIC Educational Resources Information Center
Mueller, Richard J.
Current computerized electronic technology is making possible, not only the broad and rapid distribution of information, but also its manipulation, analysis, synthesis, and recombination. The shift from print to a combination of visual and oral expression is being propelled by the mass media, and visual literacy is both a concept and an…
21 CFR 211.68 - Automatic, mechanical, and electronic equipment.
Code of Federal Regulations, 2013 CFR
2013-04-01
... satisfactorily, may be used in the manufacture, processing, packing, and holding of a drug product. If such... performed in connection with laboratory analysis, are eliminated by computerization or other automated... erasures, or loss shall be maintained. (c) Such automated equipment used for performance of operations...
21 CFR 211.68 - Automatic, mechanical, and electronic equipment.
Code of Federal Regulations, 2014 CFR
2014-04-01
... satisfactorily, may be used in the manufacture, processing, packing, and holding of a drug product. If such... performed in connection with laboratory analysis, are eliminated by computerization or other automated... erasures, or loss shall be maintained. (c) Such automated equipment used for performance of operations...
Microcomputers: Software Evaluation. Evaluation Guides. Guide Number 17.
ERIC Educational Resources Information Center
Gray, Peter J.
This guide discusses three critical steps in selecting microcomputer software and hardware: setting the context, software evaluation, and managing microcomputer use. Specific topics addressed include: (1) conducting an informal task analysis to determine how the potential user's time is spent; (2) identifying tasks amenable to computerization and…
Computerized Analysis of MR and Ultrasound Images of Breast Lesions
2001-07-01
Although general rules for the differentiation between benign and malignant mammographically identified breast lesions exist, considerable...round-robin runs yielded Az values of 0.94 and 0.87 in the task of distinguishing between benign and malignant lesions in the entire database
Computerized Analysis of MR and Ultrasound Images of Breast Lesions
2000-07-01
Although general rules for the differentiation between benign and malignant mammographically identified breast lesions exist, considerable...round-robin runs yielded Az values of 0.94 and 0.87 in the task of distinguishing between benign and malignant lesions in the entire database and the
Contemporary computerized methods for logging planning
Chris B. LeDoux
1988-01-01
Contemporary harvest planning graphic software is highlighted with practical applications. Planning results from a production study of the Clearwater Cable Yarder are summarized. Application of the planning methods to evaluation of proposed silvicultural treatments is included. Results show that 3-dimensional graphic analysis of proposed harvesting or silvicultural...
Irreducible Uncertainty in Terrestrial Carbon Projections
NASA Astrophysics Data System (ADS)
Lovenduski, N. S.; Bonan, G. B.
2016-12-01
We quantify and isolate the sources of uncertainty in projections of carbon accumulation by the ocean and terrestrial biosphere over 2006-2100 using output from Earth System Models participating in the 5th Coupled Model Intercomparison Project. We consider three independent sources of uncertainty in our analysis of variance: (1) internal variability, driven by random, internal variations in the climate system, (2) emission scenario, driven by uncertainty in future radiative forcing, and (3) model structure, wherein different models produce different projections given the same emission scenario. Whereas uncertainty in projections of ocean carbon accumulation by 2100 is 100 Pg C and driven primarily by emission scenario, uncertainty in projections of terrestrial carbon accumulation by 2100 is 50% larger than that of the ocean, and driven primarily by model structure. This structural uncertainty is correlated with emission scenario: the variance associated with model structure is an order of magnitude larger under a business-as-usual scenario (RCP8.5) than a mitigation scenario (RCP2.6). In an effort to reduce this structural uncertainty, we apply various model weighting schemes to our analysis of variance in terrestrial carbon accumulation projections. The largest reductions in uncertainty are achieved when giving all the weight to a single model; here the uncertainty is of a similar magnitude to the ocean projections. Such an analysis suggests that this structural uncertainty is irreducible given current terrestrial model development efforts.
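The analysis-of-variance partition described above can be sketched on a (scenario, model, member) ensemble array; the ensemble sizes and magnitudes below are invented, not the CMIP5 data:

```python
import numpy as np

def variance_components(proj):
    """Approximately partition ensemble variance for a
    (scenario, model, member) array of 2100 carbon-accumulation
    projections into scenario, model-structure, and internal terms."""
    scen_mean = proj.mean(axis=(1, 2))           # average over models, members
    model_mean = proj.mean(axis=2)               # average over members
    v_scen = scen_mean.var()
    v_model = (model_mean - scen_mean[:, None]).var()
    v_internal = (proj - model_mean[..., None]).var()
    return v_scen, v_model, v_internal

# Hypothetical ensemble: 2 scenarios x 6 models x 5 members (Pg C).
rng = np.random.default_rng(4)
proj = (rng.normal(150, 30, (2, 1, 1))      # scenario signal
        + rng.normal(0, 60, (2, 6, 1))      # structural spread
        + rng.normal(0, 10, (2, 6, 5)))     # internal variability
print([round(v, 1) for v in variance_components(proj)])
```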
NASA Astrophysics Data System (ADS)
Ciurean, R. L.; Glade, T.
2012-04-01
Decision under uncertainty is a constant of everyday life and an important component of risk management and governance. Recently, experts have emphasized the importance of quantifying uncertainty in all phases of landslide risk analysis. Due to its multi-dimensional and dynamic nature, (physical) vulnerability is inherently complex, and the "degree of loss" estimates are imprecise and to some extent even subjective. Uncertainty analysis introduces quantitative modeling approaches that allow for a more explicitly objective output, improving the risk management process as well as enhancing communication between various stakeholders for better risk governance. This study presents a review of concepts for uncertainty analysis in the vulnerability of elements at risk to landslides. Different semi-quantitative and quantitative methods are compared based on their feasibility in real-world situations, hazard dependency, process stage in vulnerability assessment (i.e. input data, model, output), and applicability within an integrated landslide hazard and risk framework. The resulting observations will help to identify current gaps and future needs in vulnerability assessment, including estimation of uncertainty propagation, transferability of the methods, and development of visualization tools, but also address basic questions like what uncertainty is and how uncertainty can be quantified or treated in a reliable and reproducible way.
Uncertainty Analysis of Consequence Management (CM) Data Products.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Hunt, Brian D.; Eckert-Gallup, Aubrey Celia; Cochran, Lainy Dromgoole
The goal of this project is to develop and execute methods for characterizing uncertainty in data products that are developed and distributed by the DOE Consequence Management (CM) Program. A global approach to this problem is necessary because multiple sources of error and uncertainty from across the CM skill sets contribute to the ultimate production of CM data products. This report presents the methods used to develop a probabilistic framework to characterize this uncertainty and provides results for an uncertainty analysis for a study scenario analyzed using this framework.
Influences of system uncertainties on the numerical transfer path analysis of engine systems
NASA Astrophysics Data System (ADS)
Acri, A.; Nijman, E.; Acri, A.; Offner, G.
2017-10-01
Practical mechanical systems operate with some degree of uncertainty. In numerical models, uncertainties can result from poorly known or variable parameters, from geometrical approximation, from discretization or numerical errors, from uncertain inputs, or from rapidly changing forcing that can be best described in a stochastic framework. Recently, random matrix theory was introduced to take parameter uncertainties into account in numerical modeling problems. In particular, in this paper Wishart random matrix theory is applied to a multi-body dynamic system to generate random variations of the properties of system components. Multi-body dynamics is a powerful numerical tool largely implemented during the design of new engines. In this paper the influence of model parameter variability on the results obtained from the multi-body simulation of engine dynamics is investigated. The aim is to define a methodology to properly assess and rank system sources when dealing with uncertainties. Particular attention is paid to the influence of these uncertainties on the analysis and the assessment of the different engine vibration sources. Examples of the effects of different levels of uncertainties are illustrated by means of a representative numerical powertrain model. A numerical transfer path analysis, based on system dynamic substructuring, is used to derive and assess the internal engine vibration sources. The results obtained from this analysis are used to derive correlations between parameter uncertainties and the statistical distribution of results. The derived statistical information can be used to advance the knowledge of the multi-body analysis and the assessment of system sources when uncertainties in model parameters are considered.
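The Wishart construction can be illustrated with scipy: random symmetric positive-definite matrices are drawn whose mean equals the nominal system matrix, with the degrees-of-freedom parameter controlling the dispersion. The 3-DOF stiffness matrix below is hypothetical, not the paper's powertrain model:

```python
import numpy as np
from scipy.stats import wishart

def wishart_perturbed(nominal, dispersion_df, rng):
    """Draw a random SPD matrix whose mean equals the nominal matrix;
    larger dispersion_df means smaller parameter uncertainty."""
    return wishart(df=dispersion_df,
                   scale=nominal / dispersion_df).rvs(random_state=rng)

# Hypothetical 3-DOF stiffness matrix of an engine sub-component (N/m).
K = np.array([[ 2.0, -1.0,  0.0],
              [-1.0,  2.0, -1.0],
              [ 0.0, -1.0,  1.5]]) * 1e6
rng = np.random.default_rng(5)
samples = [wishart_perturbed(K, dispersion_df=100, rng=rng)
           for _ in range(500)]
# Spread of a derived quantity (mean sqrt-eigenvalue) across realizations.
freqs = [np.sqrt(np.linalg.eigvalsh(s)).mean() for s in samples]
print(f"spread of derived response measure: {np.std(freqs):.1f}")
```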
Gerencser, Kristina R; Higbee, Thomas S; Akers, Jessica S; Contreras, Bethany P
2017-07-01
Training parents of children with autism spectrum disorder can be a challenge due to limited resources, time, and money. Interactive computerized training (ICT)-a self-paced program that incorporates instructions, videos, and interactive questions-is one method professionals can use to disseminate trainings to broader populations. This study extends previous research on ICT by assessing the effect of ICT to teach three parents how to implement a photographic activity schedule using a systematic prompting procedure with their child. Following ICT, all parents increased their fidelity to implementation of an activity schedule during role-play sessions with an adult. Fidelity remained high during implementation with their child and maintained during a 2-week follow-up. © 2017 Society for the Experimental Analysis of Behavior.
A randomized controlled trial of the Cool Teens CD-ROM computerized program for adolescent anxiety.
Wuthrich, Viviana M; Rapee, Ronald M; Cunningham, Michael J; Lyneham, Heidi J; Hudson, Jennifer L; Schniering, Carolyn A
2012-03-01
Computerized cognitive behavioral interventions for anxiety disorders in adults have been shown to be efficacious, but limited data are available on the use of computerized interventions with young persons. Adolescents in particular are difficult to engage in treatment and may be especially suited to computerized technologies. This paper describes the results of a small randomized controlled trial of the Cool Teens program for adolescent anxiety and examines potential barriers to treatment and user preferences for computerized technology in this population. Forty-three adolescents with a primary diagnosis of anxiety were randomly allocated to the Cool Teens program, a 12-week computerized cognitive-behavioral therapy program for anxiety management, or a 12-week wait list. Effects on symptoms, negative thoughts, and life interference were assessed at post-treatment and 3-month follow-up, based on diagnosis as well as self and maternal report. Using mixed-model analyses, at post-treatment and follow-up assessments, adolescents in the Cool Teens condition, compared with those on the wait list, were found to have significant reductions in the total number of anxiety disorders, the severity of the primary anxiety disorder, and the average severity for all disorders. These results were matched by significant reductions in mother and child questionnaire reports of anxiety, internalizing symptoms, automatic thoughts, and life interference. Further, few barriers to treatment were found, and user preferences indicated that the computerized treatment was well suited to adolescents with anxiety. The Cool Teens program is efficacious for treatment of adolescent anxiety. Clinical trial registration information-A randomized controlled trial of the Cool Teens computerized program for anxious adolescents compared with wait list; http://www.anzctr.org.au; ACTRN12611000508976. Copyright © 2012 American Academy of Child and Adolescent Psychiatry. Published by Elsevier Inc. All rights reserved.
Measurement uncertainty of liquid chromatographic analyses visualized by Ishikawa diagrams.
Meyer, Veronika R
2003-09-01
Ishikawa, or cause-and-effect, diagrams help to visualize the parameters that influence a chromatographic analysis. Therefore, they facilitate the setup of the uncertainty budget of the analysis, which can then be expressed in mathematical form. If the uncertainty is calculated as the Gaussian sum of all uncertainty parameters, it is necessary to quantitate them all, a task that is usually not practical. The other possible approach is to use the intermediate precision as a base for the uncertainty calculation. In this case, it is at least necessary to consider the uncertainty of the purity of the reference material in addition to the precision data. The Ishikawa diagram is then very simple, and so is the uncertainty calculation. This advantage comes at the cost of losing information about the parameters that influence the measurement uncertainty.
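In the simplified approach, the combined uncertainty reduces to a root-sum-square of a few relative contributions, chiefly the intermediate precision and the reference-material purity. A minimal sketch with invented values:

```python
import numpy as np

def combined_relative_uncertainty(components):
    """Root-sum-square of relative uncertainty contributions, i.e. the
    'Gaussian sum' over the branches of the Ishikawa diagram."""
    return np.sqrt(np.sum(np.square(list(components.values()))))

# Simplified budget: intermediate precision plus reference-material purity.
budget = {"intermediate_precision": 0.012,   # relative, from repeated runs
          "reference_purity":       0.005}   # from the certificate
u_rel = combined_relative_uncertainty(budget)
print(f"relative combined uncertainty: {u_rel:.3%}")   # ~1.3%
```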
NASA Astrophysics Data System (ADS)
Milne, Alice E.; Glendining, Margaret J.; Bellamy, Pat; Misselbrook, Tom; Gilhespy, Sarah; Rivas Casado, Monica; Hulin, Adele; van Oijen, Marcel; Whitmore, Andrew P.
2014-01-01
The UK's greenhouse gas inventory for agriculture uses a model based on the IPCC Tier 1 and Tier 2 methods to estimate the emissions of methane and nitrous oxide from agriculture. The inventory calculations are disaggregated at country level (England, Wales, Scotland and Northern Ireland). Until now, no detailed assessment of the uncertainties in the estimates of emissions had been made. We used Monte Carlo simulation to do such an analysis. We collated information on the uncertainties of each of the model inputs. The uncertainties propagate through the model and result in uncertainties in the estimated emissions. Using a sensitivity analysis, we found that in England and Scotland the uncertainty in the emission factor for emissions from N inputs (EF1) affected uncertainty the most, but that in Wales and Northern Ireland the emission factor for N leaching and runoff (EF5) had greater influence. We showed that if the uncertainty in any one of these emission factors is reduced by 50%, the uncertainty in emissions of nitrous oxide reduces by 10%. The uncertainty in the emission factors for enteric fermentation in cows and sheep most affected the uncertainty in methane emissions. When inventories are disaggregated (as that for the UK is), correlation between separate instances of each emission factor will affect the uncertainty in emissions. As more countries move towards inventory models with disaggregation, it is important that the IPCC give firm guidance on this topic.
Uncertainties in internal gas counting
NASA Astrophysics Data System (ADS)
Unterweger, M.; Johansson, L.; Karam, L.; Rodrigues, M.; Yunoki, A.
2015-06-01
The uncertainties in internal gas counting will be broken down into counting uncertainties and gas handling uncertainties. Counting statistics, spectrum analysis, and electronic uncertainties will be discussed with respect to the actual counting of the activity. The effects of the gas handling and quantities of counting and sample gases on the uncertainty in the determination of the activity will be included when describing the uncertainties arising in the sample preparation.
McCarthy, Jillian H.; Hogan, Tiffany P.; Beukelman, David R.; Schwarz, Ilsa E.
2015-01-01
Purpose: Spelling is an important skill for individuals who rely on augmentative and alternative communication (AAC). The purpose of this study was to investigate how computerized sounding out influenced the spelling accuracy of pseudo-words. Computerized sounding out was defined as a word elongated, providing an opportunity for a child to hear all the sounds in the word at a slower rate. Methods: Seven children with cerebral palsy, four who use AAC and three who do not, participated in a single-subject AB design. Results: The results indicated that the use of computerized sounding out increased the phonologic accuracy of the pseudo-words produced by participants. Conclusion: The study provides preliminary evidence for the use of computerized sounding out during spelling tasks for children with cerebral palsy who do and do not use AAC. Future directions and clinical implications are discussed. PMID:24512195
10 CFR 436.24 - Uncertainty analyses.
Code of Federal Regulations, 2011 CFR
2011-01-01
... Procedures for Life Cycle Cost Analyses § 436.24 Uncertainty analyses. If particular items of cost data or... impact of uncertainty on the calculation of life cycle cost effectiveness or the assignment of rank order... and probabilistic analysis. If additional analysis casts substantial doubt on the life cycle cost...
10 CFR 436.24 - Uncertainty analyses.
Code of Federal Regulations, 2013 CFR
2013-01-01
... Procedures for Life Cycle Cost Analyses § 436.24 Uncertainty analyses. If particular items of cost data or... impact of uncertainty on the calculation of life cycle cost effectiveness or the assignment of rank order... and probabilistic analysis. If additional analysis casts substantial doubt on the life cycle cost...
10 CFR 436.24 - Uncertainty analyses.
Code of Federal Regulations, 2014 CFR
2014-01-01
... Procedures for Life Cycle Cost Analyses § 436.24 Uncertainty analyses. If particular items of cost data or... impact of uncertainty on the calculation of life cycle cost effectiveness or the assignment of rank order... and probabilistic analysis. If additional analysis casts substantial doubt on the life cycle cost...
10 CFR 436.24 - Uncertainty analyses.
Code of Federal Regulations, 2012 CFR
2012-01-01
... Procedures for Life Cycle Cost Analyses § 436.24 Uncertainty analyses. If particular items of cost data or... impact of uncertainty on the calculation of life cycle cost effectiveness or the assignment of rank order... and probabilistic analysis. If additional analysis casts substantial doubt on the life cycle cost...
Estimation Of TMDLs And Margin Of Safety Under Conditions Of Uncertainty
In TMDL development, an adequate margin of safety (MOS) is required in the calculation process to provide a cushion for uncertainties in the data and analysis. Current practices, however, rarely factor analysis uncertainty into TMDL development, and the MOS is largel...
To address uncertainty associated with the evaluation of vapor intrusion problems, we are working on a three-part strategy that includes: evaluation of uncertainty in model-based assessments; collection of field data; and assessment of sites using EPA and state protocols.
NASA Astrophysics Data System (ADS)
Sawicka, K.; Breuer, L.; Houska, T.; Santabarbara Ruiz, I.; Heuvelink, G. B. M.
2016-12-01
Computer models have become a crucial tool in engineering and environmental sciences for simulating the behaviour of complex static and dynamic systems. However, while many models are deterministic, the uncertainty in their predictions needs to be estimated before they are used for decision support. Advances in uncertainty propagation and assessment have been paralleled by a growing number of software tools for uncertainty analysis, but none has gained recognition for universal applicability, including case studies with spatial models and spatial model inputs. Given the growing popularity and applicability of the open-source R programming language, we undertook a project to develop an R package that facilitates uncertainty propagation analysis in spatial environmental modelling. In particular, the 'spup' package provides functions for examining uncertainty propagation from input data and model parameters, via the environmental model, to model predictions. The functions include uncertainty model specification, stochastic simulation and propagation of uncertainty using Monte Carlo techniques, as well as several uncertainty visualization functions. Here we demonstrate that the 'spup' package is an effective and easy-to-use tool even in a very complex case study, and that it can be used in multi-disciplinary research and model-based decision support. As an example, we use the ecological LandscapeDNDC model to analyse the propagation of uncertainties associated with spatial variability of the model driving forces, such as rainfall, nitrogen deposition and fertilizer inputs. The uncertainty propagation is analysed for predicted emissions of N2O and CO2 in a German low-mountain, agriculturally developed catchment. The study tests the effect of spatial correlations on spatially aggregated model outputs, and could inform best management practices and model improvement strategies.
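The sketch below mirrors the spup workflow (uncertainty model specification, stochastic simulation, propagation, summary) in plain Python with NumPy rather than R, so it is not the spup API; the grid, covariance model and emission response are invented stand-ins for spatially correlated inputs like rainfall.

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy 1-D "spatial" grid of a model input with exponential covariance,
# mimicking the uncertainty-model step. All values are illustrative.
n, n_sim = 50, 2000
x = np.arange(n, dtype=float)
mean = np.full(n, 800.0)                 # mean rainfall, mm (fictitious)
sd = np.full(n, 80.0)                    # standard deviation, mm
corr_range = 10.0                        # correlation length, grid cells
corr = np.exp(-np.abs(x[:, None] - x[None, :]) / corr_range)
cov = corr * np.outer(sd, sd)

# Step 1: stochastic simulation of correlated input realizations
rain = rng.multivariate_normal(mean, cov, size=n_sim)

# Step 2: run the (toy) environmental model on each realization
def model(r):                            # placeholder, not LandscapeDNDC
    return 0.002 * r ** 1.3              # fictitious emission response

pred = model(rain)

# Step 3: summarize propagated uncertainty in an aggregated output
total = pred.sum(axis=1)
print("mean:", total.mean(), "95% interval:",
      np.percentile(total, [2.5, 97.5]))
```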
NASA Astrophysics Data System (ADS)
Pu, Zhiqiang; Tan, Xiangmin; Fan, Guoliang; Yi, Jianqiang
2014-08-01
Flexible air-breathing hypersonic vehicles feature significant uncertainties that pose major challenges to robust controller design. In this paper, four major categories of uncertainty are analyzed: uncertainties associated with flexible effects, aerodynamic parameter variations, external environmental disturbances, and control-oriented modeling errors. A uniform nonlinear uncertainty model is explored for the first three, which lumps all uncertainties together and is consequently beneficial for controller synthesis. The fourth is additionally considered in stability analysis. Based on these analyses, the control design starts by decomposing the vehicle dynamics into five functional subsystems. A robust trajectory linearization control (TLC) scheme consisting of five robust subsystem controllers is then proposed. In each subsystem controller, TLC is combined with the extended state observer (ESO) technique for uncertainty compensation. The stability of the overall closed-loop system with the four aforementioned uncertainties and additional singular perturbations is analyzed. In particular, the stability of the nonlinear ESO is also discussed from a Liénard system perspective. Finally, simulations demonstrate the strong control performance and uncertainty rejection ability of the robust scheme.
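As a hedged illustration of the uncertainty-compensation idea, here is a minimal discrete-time linear extended state observer estimating a lumped disturbance on a double integrator. This is a textbook ESO sketch, not the paper's nonlinear observer or vehicle model, and every constant is an assumption.

```python
import numpy as np

# Minimal linear ESO: plant x1' = x2, x2' = f(t) + b0*u, with f treated
# as an extended state to be estimated. Gains from pole placement at -wo.
dt, wo, b0 = 0.001, 50.0, 1.0          # step, observer bandwidth, input gain
l1, l2, l3 = 3 * wo, 3 * wo**2, wo**3

z = np.zeros(3)                        # estimates of [x1, x2, f]
x = np.array([0.0, 0.0])               # true plant state
log = []
for k in range(5000):
    t = k * dt
    u = 0.0                            # open loop, just observing
    f = np.sin(2 * t)                  # "unknown" lumped disturbance
    # true plant (forward Euler)
    x = x + dt * np.array([x[1], f + b0 * u])
    # ESO update driven by the output estimation error
    e = x[0] - z[0]
    z = z + dt * np.array([z[1] + l1 * e,
                           z[2] + b0 * u + l2 * e,
                           l3 * e])
    log.append((f, z[2]))

f_true, f_hat = np.array(log).T
print("final disturbance estimate error:", abs(f_true[-1] - f_hat[-1]))
```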
Uncertainty Analysis of Instrument Calibration and Application
NASA Technical Reports Server (NTRS)
Tripp, John S.; Tcheng, Ping
1999-01-01
Experimental aerodynamic researchers require estimated precision and bias uncertainties of measured physical quantities, typically at 95 percent confidence levels. Uncertainties of final computed aerodynamic parameters are obtained by propagation of individual measurement uncertainties through the defining functional expressions. In this paper, rigorous mathematical techniques are extended to determine precision and bias uncertainties of any instrument-sensor system. Through this analysis, instrument uncertainties determined through calibration are now expressed as functions of the corresponding measurement for linear and nonlinear univariate and multivariate processes. Treatment of correlated measurement precision error is developed. During laboratory calibration, calibration standard uncertainties are assumed to be an order of magnitude less than those of the instrument being calibrated. Often calibration standards do not satisfy this assumption. This paper applies rigorous statistical methods for inclusion of calibration standard uncertainty and covariance due to the order of their application. The effects of mathematical modeling error on calibration bias uncertainty are quantified. The effects of experimental design on uncertainty are analyzed. The importance of replication is emphasized, and techniques for estimating both bias and precision uncertainties using replication are developed. Statistical tests for stationarity of calibration parameters over time are obtained.
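A small sketch of first-order (Taylor series) propagation of correlated measurement uncertainties through a defining functional expression, in the spirit described above; the coefficient formula, nominal values and covariance entries are invented for illustration.

```python
import numpy as np

# Propagate measurement covariance through a fictitious aerodynamic
# coefficient C = F / (q * S) via a numerical gradient (delta method).
def coeff(v):
    F, q, S = v
    return F / (q * S)

v0 = np.array([1200.0, 5000.0, 0.5])   # nominal force, dyn. pressure, area
cov = np.diag([4.0, 100.0, 1e-6])      # measurement covariance (illustrative)
cov[0, 1] = cov[1, 0] = 5.0            # correlated force/pressure errors

# central-difference gradient of the defining expression
eps = 1e-6 * np.maximum(np.abs(v0), 1.0)
grad = np.array([
    (coeff(v0 + dv) - coeff(v0 - dv)) / (2 * e)
    for e, dv in zip(eps, np.diag(eps))
])

u_c = np.sqrt(grad @ cov @ grad)       # combined standard uncertainty
print("C =", coeff(v0), "+/-", 2 * u_c, "(~95% coverage, k = 2)")
```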
Code of Federal Regulations, 2014 CFR
2014-10-01
... comprehensive Tribal IV-D agencies must have in place to ensure the security and privacy of Computerized Tribal... ensure the security and privacy of Computerized Tribal IV-D Systems and Office Automation? (a..., accuracy, completeness, access to, and use of data in the Computerized Tribal IV-D System and Office...
ERIC Educational Resources Information Center
Hol, A. Michiel; Vorst, Harrie C. M.; Mellenbergh, Gideon J.
2007-01-01
In a randomized experiment (n = 515), a computerized test and a computerized adaptive test (CAT) are compared. The item pool consists of 24 polytomous motivation items. Although items are carefully selected, calibration data show that Samejima's graded response model did not fit the data optimally. A simulation study is done to assess possible…
ERIC Educational Resources Information Center
Skinner, Harvey A.; Allen, Barbara A.
1983-01-01
Compared histories of alcohol, drug, and tobacco use obtained by computerized interview, face-to-face interview, and self-report in clients (N=150) from an addiction treatment center. Multivariate analyses revealed no important differences. The computerized interview was rated less friendly but faster and more interesting. (Author/JAC)
ERIC Educational Resources Information Center
Vispoel, Walter P.; Boo, Jaeyool; Bleiler, Timothy
2001-01-01
Evaluated the characteristics of computerized and paper-and-pencil versions of the Rosenberg Self-Esteem Scale (SES) using scores for 224 college students. Results show that mode of administration has little effect on the psychometric properties of the SES although the computerized version took longer and was preferred by examinees. (SLD)
ERIC Educational Resources Information Center
Dori, Yehudit J.; Sasson, Irit
2008-01-01
The case-based computerized laboratory (CCL) is a chemistry learning environment that integrates computerized experiments with emphasis on scientific inquiry and comprehension of case studies. The research objective was to investigate chemical understanding and graphing skills of high school honors students via bidirectional visual and textual…
Overton, Edgar Turner; Kauwe, John S.K.; Paul, Rob; Tashima, Karen; Tate, David F.; Patel, Pragna; Carpenter, Chuck; Patty, David; Brooks, John T.; Clifford, David B
2013-01-01
HIV-associated neurocognitive disorders (HAND) remain prevalent but challenging to diagnose, particularly among non-demented individuals. To determine whether a brief computerized battery correlates with formal neurocognitive testing, we identified 46 HIV-infected persons who had undergone both formal neurocognitive testing and a brief computerized battery. Simple detection tests correlated best with formal neuropsychological testing. In a multivariable regression model, 53% of the variance in the composite Global Deficit Score was accounted for by elements of the brief computerized tool (p<0.01). These data confirm previous correlation data with the computerized battery, yet illustrate remaining challenges for neurocognitive screening. PMID:21877204
Schulenberg, S E; Yutrzenka, B A
1999-05-01
The use of computerized psychological assessment is a growing practice among contemporary mental health professionals. Many popular and frequently used paper-and-pencil instruments have been adapted into computerized versions. Although equivalence for many instruments has been evaluated and supported, this issue is far from resolved. This literature review deals with recent research findings that suggest that computer aversion negatively impacts computerized assessment, particularly as it relates to measures of negative affect. There is a dearth of equivalence studies that take into account computer aversion's potential impact on the measurement of negative affect. Recommendations are offered for future research in this area.
NASA Technical Reports Server (NTRS)
Unal, Resit
1999-01-01
Multidisciplinary design optimization (MDO) is an important step in the design and evaluation of launch vehicles, since it has a significant impact on performance and lifecycle cost. The objective in MDO is to search the design space to determine the values of design parameters that optimize the performance characteristics subject to system constraints. The Vehicle Analysis Branch (VAB) at NASA Langley Research Center has computerized analysis tools in many of the disciplines required for the design and analysis of launch vehicles. Vehicle performance characteristics can be determined by the use of these computerized analysis tools. The next step is to optimize the system performance characteristics subject to multidisciplinary constraints. However, most of the complex sizing and performance evaluation codes used for launch vehicle design are stand-alone tools, operated by disciplinary experts. They are, in general, difficult to integrate and use directly for MDO. An alternative has been to utilize response surface methodology (RSM) to obtain polynomial models that approximate the functional relationships between performance characteristics and design variables. These approximation models, called response surface models, are then used to integrate the disciplines using mathematical programming methods for efficient system-level design analysis, MDO and fast sensitivity simulations. A second-order response surface model of the form y = b0 + Σ bi·xi + Σ bii·xi² + Σ bij·xi·xj has been commonly used in RSM, since in many cases it can provide an adequate approximation, especially if the region of interest is sufficiently limited.
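A brief sketch of fitting that second-order response surface by ordinary least squares; the "performance code" below is a fictitious function standing in for a disciplinary analysis tool.

```python
import numpy as np

rng = np.random.default_rng(1)
n, d = 60, 2                                   # runs, design variables
X = rng.uniform(-1, 1, size=(n, d))            # coded design points

def true_response(x):                          # fictitious performance code
    return 5 + 2*x[:, 0] - x[:, 1] + 0.8*x[:, 0]**2 + 0.5*x[:, 0]*x[:, 1]

y = true_response(X) + rng.normal(0, 0.05, n)  # noisy "analysis" results

# quadratic model matrix: 1, x1, x2, x1^2, x2^2, x1*x2
A = np.column_stack([np.ones(n), X, X**2, X[:, 0] * X[:, 1]])
beta, *_ = np.linalg.lstsq(A, y, rcond=None)
print("fitted response surface coefficients:", np.round(beta, 3))
```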
ERIC Educational Resources Information Center
Utah State Univ., Logan. Center for Persons with Disabilities.
This project studied the effects of implementing a computerized management information system developed for special education administrators. The Intelligent Administration Support Program (IASP), an expert system and database program, assisted in information acquisition and analysis pertaining to the district's quality of decisions and procedures…
Interweaving Objects, Gestures, and Talk in Context
ERIC Educational Resources Information Center
Brassac, Christian; Fixmer, Pierre; Mondada, Lorenza; Vinck, Dominique
2008-01-01
In a large French hospital, a group of professional experts (including physicians and software engineers) are working on the computerization of a blood-transfusion traceability device. By focusing on a particular moment in this slow process of design, we analyze their collaborative practices during a work session. The analysis takes a…
1991-05-01
Marine Corps Training Systems (CBESS) memorization training Intelligence Center, Dam Neck Threat memorization training Commander Tactical Wings, Atlantic...News Shipbuilding Technical training AEGIS Training Center, Dare Artificial Intelligence (AI) Tools Computerized front-end analysis tools NETSCPAC...Technology Department and provides computational and electronic mail support for research in areas of artificial intelligence, computer-assisted instruction
ERIC Educational Resources Information Center
Cocks, Errol; Ng, Pin Chee
1983-01-01
The paper discusses an analysis of a computerized data bank on the mentally retarded population in Victoria, Australia. Prevalence rates, severity of handicap, age, sex, and residence type are reviewed and implications for community vocational and residential services for adults are noted. (CL)
Converting the H. W. Wilson Company Indexes to an Automated System: A Functional Analysis.
ERIC Educational Resources Information Center
Regazzi, John J.
1984-01-01
Description of the computerized information system that supports the editorial and manufacturing processes involved in creation of Wilson's subject indexes and catalogs includes the major subsystems--online data entry, batch input processing, validation and release, file generation and database management, online and offline retrieval, publication…
Computer Simulation of Human Behavior: Assessment of Creativity.
ERIC Educational Resources Information Center
Greene, John F.
The major purpose of this study is to further the development of procedures which minimize current limitations of creativity instruments, thus yielding a reliable and functional means for assessing creativity. Computerized content analysis and multiple regression are employed to simulate the creativity ratings of trained judges. The computerized…
Bibliography on Mathematical Abilities.
ERIC Educational Resources Information Center
Kilpatrick, Jeremy; Wagner, Sigrid
The items in this bibliography were collected as part of a project, "An Analysis of Research on Mathematical Abilities," conducted at the University of Georgia. The 1,491 entries in the bibliography are listed alphabetically by author. Each entry is preceded by a line containing a name and date code (used in computerized alphabetizing of…
Topographic Brain Mapping: A Window on Brain Function?
ERIC Educational Resources Information Center
Karniski, Walt M.
1989-01-01
The article reviews the method of topographic mapping of the brain's electrical activity. Multiple electroencephalogram (EEG) electrodes and computerized analysis of the EEG signal are used to generate maps of frequency and voltage (evoked potential). This relatively new technique holds promise in the evaluation of children with behavioral and…
A Markov Model Analysis of Problem-Solving Progress.
ERIC Educational Resources Information Center
Vendlinski, Terry
This study used a computerized simulation and problem-solving tool along with artificial neural networks (ANN) as pattern recognizers to identify the common types of strategies high school and college undergraduate chemistry students would use to solve qualitative chemistry problems. Participants were 134 high school chemistry students who used…
Public Domain Generic Tools: An Overview.
ERIC Educational Resources Information Center
Erjavec, Tomaz
This paper presents an introduction to language engineering software, especially for computerized language and text corpora. The focus of the paper is on small and relatively independent pieces of software designed for specific, often low-level language analysis tasks, and on tools in the public domain. Discussion begins with the application of…
The Alignment of Technology and Structure through Roles and Networks.
ERIC Educational Resources Information Center
Barley, Stephen R.
1990-01-01
Building on Nagel's theory of social structure, this paper argues that the microsocial dynamics occasioned by new technologies systematically reverberate up levels of analysis. This theory is illustrated by ethnographic and sociometric data drawn from a study comparing usage of traditional and computerized imaging devices in two radiology…
Speech Training for Inmate Rehabilitation.
ERIC Educational Resources Information Center
Parkinson, Michael G.; Dobkins, David H.
1982-01-01
Using a computerized content analysis, the authors demonstrate changes in speech behaviors of prison inmates. They conclude that two to four hours of public speaking training can have only limited effect on students who live in a culture in which "prison speech" is the expected and rewarded form of behavior. (PD)
DOE Office of Scientific and Technical Information (OSTI.GOV)
Zhang, S.; Toll, J.; Cothern, K.
1995-12-31
The authors have performed robust sensitivity studies of the physico-chemical Hudson River PCB model PCHEPM to identify the parameters and process uncertainties contributing the most to uncertainty in predictions of water column and sediment PCB concentrations, over the time period 1977-1991 in one segment of the lower Hudson River. The term "robust sensitivity studies" refers to the use of several sensitivity analysis techniques to obtain a more accurate depiction of the relative importance of different sources of uncertainty. Local sensitivity analysis provided data on the sensitivity of PCB concentration estimates to small perturbations in nominal parameter values. Range sensitivity analysis provided information about the magnitude of prediction uncertainty associated with each input uncertainty. Rank correlation analysis indicated which parameters had the most dominant influence on model predictions. Factorial analysis identified important interactions among model parameters. Finally, term analysis looked at the aggregate influence of combinations of parameters representing physico-chemical processes. The authors scored the results of the local and range sensitivity and rank correlation analyses. They considered parameters that scored high on two of the three analyses to be important contributors to PCB concentration prediction uncertainty, and treated them probabilistically in simulations. They also treated probabilistically parameters identified in the factorial analysis as interacting with important parameters. The authors used the term analysis to better understand how uncertain parameters were influencing the PCB concentration predictions. The importance analysis allowed them to reduce the number of parameters to be modeled probabilistically from 16 to 5. This reduced the computational complexity of Monte Carlo simulations and, more importantly, provided a more lucid depiction of prediction uncertainty and its causes.
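A compact sketch of the rank-correlation step: score each uncertain input by the absolute Spearman correlation between its sampled values and the model output. The parameter names and the toy model below are placeholders, not PCHEPM.

```python
import numpy as np
from scipy.stats import spearmanr

rng = np.random.default_rng(7)
n = 5000
# hypothetical uncertain inputs, sampled from assumed distributions
params = {
    "partition_coeff": rng.lognormal(0.0, 0.5, n),
    "settling_rate":   rng.uniform(0.5, 2.0, n),
    "burial_rate":     rng.uniform(0.01, 0.1, n),
}

def model(p):  # toy stand-in for the water-column PCB prediction
    return p["partition_coeff"] / (p["settling_rate"] + 10 * p["burial_rate"])

y = model(params)
ranking = sorted(
    ((abs(spearmanr(v, y)[0]), name) for name, v in params.items()),
    reverse=True,
)
for rho, name in ranking:   # dominant drivers of output uncertainty first
    print(f"{name}: |rho| = {rho:.2f}")
```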
A stochastic approach to uncertainty quantification in residual moveout analysis
NASA Astrophysics Data System (ADS)
Johng-Ay, T.; Landa, E.; Dossou-Gbété, S.; Bordes, L.
2015-06-01
Oil and gas exploration and production usually rely on the interpretation of a single seismic image, which is obtained from observed data. However, the statistical nature of seismic data and the various approximations and assumptions are sources of uncertainty that may corrupt the evaluation of parameters. Quantifying these uncertainties is a major issue, and is intended to support decisions that have important social and commercial implications. Residual moveout analysis, an important step in seismic data processing, is usually performed by a deterministic approach. In this paper we discuss a Bayesian approach to the uncertainty analysis.
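A minimal sketch of what a Bayesian treatment of a single residual-moveout parameter could look like, using a random-walk Metropolis sampler over a toy parabolic moveout model; the physics, noise level and prior are all invented for illustration.

```python
import numpy as np

rng = np.random.default_rng(4)
offsets = np.linspace(0, 2000, 20)          # source-receiver offsets, m
true_r = 1.5e-7                             # toy residual-moveout curvature
t_obs = true_r * offsets**2 + rng.normal(0, 0.004, offsets.size)

def log_post(r):
    if not (0 < r < 1e-6):                  # flat prior on an assumed range
        return -np.inf
    resid = t_obs - r * offsets**2          # Gaussian likelihood, sigma=4 ms
    return -0.5 * np.sum(resid**2) / 0.004**2

r, lp, chain = 2e-7, log_post(2e-7), []
for _ in range(20_000):                     # random-walk Metropolis
    prop = r + rng.normal(0, 2e-8)
    lp_new = log_post(prop)
    if np.log(rng.uniform()) < lp_new - lp:
        r, lp = prop, lp_new
    chain.append(r)

post = np.array(chain[5000:])               # drop burn-in
print("posterior mean:", post.mean(),
      "95% interval:", np.percentile(post, [2.5, 97.5]))
```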
Heetderks-Cox, M J; Alford, B B; Bednar, C M; Heiss, C J; Tauai, L A; Edgren, K K
2001-09-01
This study observed the effect of using a computerized vs manual method of self-monitoring among Air Force personnel receiving nutrition counseling for weight loss. Subjects who enrolled during the first 2 weeks of the 4-week recruitment period completed food records for 6 weeks using a CD-ROM nutrient database (intervention group), whereas those who enrolled during the last 2 weeks used a food record booklet (comparison group). Of the 42 subjects (n = 23 intervention group and n = 19 comparison group), only 13 intervention and 11 comparison group subjects (57% of study enrollees) submitted at least 1 food record during the study and were included in the analysis, which included review of pre- and poststudy questionnaires, food records, and focus group data. There were no significant differences in the number of days per week documented or the average number of items recorded daily. All 9 intervention group subjects who completed a poststudy questionnaire, compared with 2 comparison group subjects, searched for lower-energy and lower-fat items and reported changing their dietary intake as a result. All intervention group subjects who participated in a focus group (n=6) had favorable comments about using the CD-ROM for monitoring and changing eating habits, indicating that it is a beneficial self-monitoring tool. Participants enjoyed the immediate dietary feedback, and computerized food records may be easier for nutrition counselors to interpret. A number of computerized nutrient databases are available to assist patients and consumers in managing nutritional concerns.
NASA Technical Reports Server (NTRS)
Hanks, G. W.; Shomber, H. A.; Dethman, H. A.; Gratzer, L. B.; Maeshiro, A.; Gangsaas, D.; Blight, J. D.; Buchan, S. M.; Crumb, C. B.; Dorwart, R. J.
1981-01-01
The current status of the Active Controls Technology (ACT) for the advanced subsonic transport project is investigated through analysis of the system's technical data. Control systems technologies under examination include computerized reliability analysis, a pitch-axis fly-by-wire actuator, a flaperon actuation system design trade study, control law synthesis and analysis, flutter mode control and gust load alleviation analysis, and implementation of alternative ACT systems. Extensive analysis of the computer techniques involved in each system is included.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Sig Drellack, Lance Prothro
2007-12-01
The Underground Test Area (UGTA) Project of the U.S. Department of Energy, National Nuclear Security Administration Nevada Site Office is in the process of assessing and developing regulatory decision options based on modeling predictions of contaminant transport from underground testing of nuclear weapons at the Nevada Test Site (NTS). The UGTA Project is attempting to develop an effective modeling strategy that addresses and quantifies multiple components of uncertainty including natural variability, parameter uncertainty, conceptual/model uncertainty, and decision uncertainty in translating model results into regulatory requirements. The modeling task presents multiple unique challenges to the hydrological sciences as a result of the complex fractured and faulted hydrostratigraphy, the distributed locations of sources, the suite of reactive and non-reactive radionuclides, and uncertainty in conceptual models. Characterization of the hydrogeologic system is difficult and expensive because of deep groundwater in the arid desert setting and the large spatial setting of the NTS. Therefore, conceptual model uncertainty is partially addressed through the development of multiple alternative conceptual models of the hydrostratigraphic framework and multiple alternative models of recharge and discharge. Uncertainty in boundary conditions is assessed through development of alternative groundwater fluxes through multiple simulations using the regional groundwater flow model. Calibration of alternative models to heads and measured or inferred fluxes has not proven to provide clear measures of model quality. Therefore, model screening by comparison to independently-derived natural geochemical mixing targets through cluster analysis has also been invoked to evaluate differences between alternative conceptual models. Advancing multiple alternative flow models, sensitivity of transport predictions to parameter uncertainty is assessed through Monte Carlo simulations. The simulations are challenged by the distributed sources in each of the Corrective Action Units, by complex mass transfer processes, and by the size and complexity of the field-scale flow models. An efficient methodology utilizing particle tracking results and convolution integrals provides in situ concentrations appropriate for Monte Carlo analysis. Uncertainty in source releases and transport parameters including effective porosity, fracture apertures and spacing, matrix diffusion coefficients, sorption coefficients, and colloid load and mobility are considered. With the distributions of input uncertainties and output plume volumes, global analysis methods including stepwise regression, contingency table analysis, and classification tree analysis are used to develop sensitivity rankings of parameter uncertainties for each model considered, thus assisting a variety of decisions.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Gallimore, David L.
2012-06-13
The measurement uncertainty estimation associated with trace element analysis of impurities in U and Pu was evaluated using the Guide to the Expression of Uncertainty in Measurement (GUM). In this evaluation the uncertainty sources were identified, and standard uncertainties for the components were categorized as either Type A or Type B. The combined standard uncertainty was calculated and a coverage factor k = 2 was applied to obtain the expanded uncertainty, U. The ICP-AES and ICP-MS methods used were developed for the multi-element analysis of U and Pu samples. A typical analytical run consists of standards, process blanks, samples, matrix-spiked samples, post-digestion spiked samples and independent calibration verification standards. The uncertainty estimation was performed on U and Pu samples that had been analyzed previously as part of the U and Pu Sample Exchange Programs. Control chart results and data from the U and Pu metal exchange programs were combined with the GUM into a concentration-dependent estimate of the expanded uncertainty. Trace element uncertainties obtained using this model were compared to those obtained for trace element results as part of the Exchange programs. This process was completed for all trace elements that were above the detection limit for the U and Pu samples.
Assessing Uncertainties in Surface Water Security: A Probabilistic Multi-model Resampling approach
NASA Astrophysics Data System (ADS)
Rodrigues, D. B. B.
2015-12-01
Various uncertainties are involved in the representation of processes that characterize interactions between societal needs, ecosystem functioning, and hydrological conditions. Here, we develop an empirical uncertainty assessment of water security indicators that characterize scarcity and vulnerability, based on a multi-model and resampling framework. We consider several uncertainty sources including those related to: i) observed streamflow data; ii) hydrological model structure; iii) residual analysis; iv) the definition of the Environmental Flow Requirement method; v) the definition of critical conditions for water provision; and vi) the critical demand imposed by human activities. We estimate the overall uncertainty coming from the hydrological model by means of a residual bootstrap resampling approach, and by uncertainty propagation through different methodological arrangements applied to a 291 km² agricultural basin within the Cantareira water supply system in Brazil. Together, the two-component hydrograph residual analysis and the block bootstrap resampling approach result in a more accurate and precise estimate of the uncertainty (95% confidence intervals) in the simulated time series. We then compare the uncertainty estimates associated with water security indicators using a multi-model framework and the estimates provided by each model uncertainty estimation approach. The method is general and can be easily extended, forming the basis for meaningful support to end-users facing water resource challenges by enabling them to incorporate a viable uncertainty analysis into a robust decision-making process.
Assessing uncertainties in surface water security: An empirical multimodel approach
NASA Astrophysics Data System (ADS)
Rodrigues, Dulce B. B.; Gupta, Hoshin V.; Mendiondo, Eduardo M.; Oliveira, Paulo Tarso S.
2015-11-01
Various uncertainties are involved in the representation of processes that characterize interactions among societal needs, ecosystem functioning, and hydrological conditions. Here we develop an empirical uncertainty assessment of water security indicators that characterize scarcity and vulnerability, based on a multimodel and resampling framework. We consider several uncertainty sources including those related to (i) observed streamflow data; (ii) hydrological model structure; (iii) residual analysis; (iv) the method for defining Environmental Flow Requirement; (v) the definition of critical conditions for water provision; and (vi) the critical demand imposed by human activities. We estimate the overall hydrological model uncertainty by means of a residual bootstrap resampling approach, and by uncertainty propagation through different methodological arrangements applied to a 291 km2 agricultural basin within the Cantareira water supply system in Brazil. Together, the two-component hydrograph residual analysis and the block bootstrap resampling approach result in a more accurate and precise estimate of the uncertainty (95% confidence intervals) in the simulated time series. We then compare the uncertainty estimates associated with water security indicators using a multimodel framework and the uncertainty estimates provided by each model uncertainty estimation approach. The range of values obtained for the water security indicators suggests that the models/methods are robust and perform well in a range of plausible situations. The method is general and can be easily extended, thereby forming the basis for meaningful support to end-users facing water resource challenges by enabling them to incorporate a viable uncertainty analysis into a robust decision-making process.
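A short sketch of the residual block-bootstrap idea: resample model residuals in contiguous blocks, so autocorrelation is preserved, and rebuild a 95% interval around the simulated series. The series and block length below are synthetic assumptions, not the Cantareira data.

```python
import numpy as np

rng = np.random.default_rng(3)
T, block, n_boot = 365, 30, 1000
sim = 10 + 3 * np.sin(np.arange(T) * 2 * np.pi / 365)  # "model" streamflow
obs = sim + rng.normal(0, 1, T).cumsum() * 0.05        # autocorrelated error
resid = obs - sim

# resample residuals in blocks and rebuild plausible series
boots = np.empty((n_boot, T))
n_blocks = int(np.ceil(T / block))
for i in range(n_boot):
    starts = rng.integers(0, T - block + 1, size=n_blocks)
    pieces = np.concatenate([resid[s:s + block] for s in starts])[:T]
    boots[i] = sim + pieces

lo, hi = np.percentile(boots, [2.5, 97.5], axis=0)     # 95% interval
print("mean 95% interval width:", (hi - lo).mean())
```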
Lee, We-Kang; Su, Yi-An; Song, Tzu-Jiun; Chiu, Yao-Chu; Lin, Ching-Hung
2014-01-01
The Iowa Gambling Task (IGT), developed by Bechara et al. in 1994, is used to diagnose patients with ventromedial prefrontal cortex (VMPFC) lesions, and it has become a landmark in research on decision making. According to Bechara et al., the manipulation of progressive increments of monetary value can normalize the performance of patients with VMPFC lesions; thus, they developed a computerized version of the IGT. However, the empirical results showed that patients' performances did not improve as a result of this manipulation, which suggested that patients with VMPFC lesions behaved myopically with respect to future consequences. Using the original version of the IGT, some studies have demonstrated that increments of monetary value significantly influence the performance of normal subjects, but other research has produced inconsistent findings. In this study, we used the computerized IGT (1X-IGT) and manipulated the value contrast of the progressive increments (the 10X-IGT, in which the progressive increments were 10 times larger) to investigate the influence of value contrast on the performance of normal subjects. The empirical observations indicated that the value contrast (1X- vs. 10X-IGT) of the progressive increments had no effect on the performance of normal subjects. This study also discusses the issue of value in IGT-related studies. Moreover, we found the "prominent deck B phenomenon" in both versions of the IGT, which indicated that the normal subjects were guided mostly by gain-loss frequency rather than by monetary value contrast. In sum, the behavioral performance of normal subjects demonstrated a low correlation with changes in monetary value, even in the 10X-IGT.
Jibaja‐Weiss, Maria L.; Volk, Robert J.; Friedman, Lois C.; Granchi, Thomas S.; Neff, Nancy E.; Spann, Stephen J.; Robinson, Emily K.; Aoki, Noriaki; Robert Beck, J.
2006-01-01
Objective: To report on the initial testing of a values clarification exercise utilizing a jewellery box within a computerized patient decision aid (CPtDA) designed to assist women in making a surgical breast cancer treatment decision. Design: Pre-post design, with patients interviewed after diagnosis, and then after completing the CPtDA sometime later at their preoperative visit. Sample: Fifty-one female patients, who are low literate and naïve computer users, newly diagnosed with early-stage breast cancer from two urban public hospitals. Intervention: A computerized decision aid that combines entertainment-education (edutainment) with enhanced (factual) content. An interactive jewellery box is featured to assist women in: (1) recording and reflecting over issues of concern with possible treatments, (2) deliberating over the surgery decision, and (3) communicating with their physician and significant others. Outcomes: Patients' use of the jewellery box to store issues during completion of the CPtDA, and perceived clarity of values in making a treatment decision, as measured by a low-literacy version of the Decisional Conflict Scale (DCS). Results: Over half of the participants utilized the jewellery box to store issues they found concerning about the treatments. On average, users flagged over 13 issues of concern with the treatments. Scores on the DCS Uncertainty and Feeling Unclear about Values subscales were lower after the intervention than before the decision was made. Conclusions: A values clarification exercise using an interactive jewellery box may be a promising method for promoting informed treatment decision making by low-literacy breast cancer patients. PMID:16911136
DOE Office of Scientific and Technical Information (OSTI.GOV)
Gauntt, Randall O.; Mattie, Patrick D.; Bixler, Nathan E.
2014-02-01
This paper describes the knowledge advancements from the uncertainty analysis for the State-of-the-Art Reactor Consequence Analyses (SOARCA) unmitigated long-term station blackout accident scenario at the Peach Bottom Atomic Power Station. This work assessed key MELCOR and MELCOR Accident Consequence Code System, Version 2 (MACCS2) modeling uncertainties in an integrated fashion to quantify the relative importance of each uncertain input on potential accident progression, radiological releases, and off-site consequences. This quantitative uncertainty analysis provides measures of the effects on consequences of each of the selected uncertain parameters, both individually and in interaction with other parameters. The results measure the model response (e.g., variance in the output) to uncertainty in the selected input. Investigation into the important uncertain parameters in turn yields insights into important phenomena for accident progression and off-site consequences. This uncertainty analysis confirmed the known importance of some parameters, such as the failure rate of the safety relief valve in accident progression modeling and the dry deposition velocity in off-site consequence modeling. The analysis also revealed some new insights, such as the dependent effect of cesium chemical form for different accident progressions.
Holistic uncertainty analysis in river basin modeling for climate vulnerability assessment
NASA Astrophysics Data System (ADS)
Taner, M. U.; Wi, S.; Brown, C.
2017-12-01
The challenges posed by an uncertain future climate are a prominent concern for water resources managers. A number of frameworks exist for assessing the impacts of climate-related uncertainty, including internal climate variability and anthropogenic climate change, such as scenario-based and vulnerability-based approaches. While in many cases climate uncertainty may be dominant, other factors such as the future evolution of the river basin, hydrologic response, and reservoir operations are potentially significant sources of uncertainty. While uncertainty associated with modeling hydrologic response has received attention, very little attention has focused on the range of uncertainty and possible effects of the water resources infrastructure and management. This work presents a holistic framework for analyzing climate, hydrologic and water management uncertainty in water resources systems analysis, with the aid of a water system model designed to integrate component models for hydrologic processes and water management activities. The uncertainties explored include those associated with climate variability and change, hydrologic model parameters, and water system operation rules. A Bayesian framework is used to quantify and model the uncertainties at each modeling step in an integrated fashion, including prior and likelihood information about model parameters. The framework is demonstrated in a case study for the St. Croix Basin located at the border of the United States and Canada.
NASA Astrophysics Data System (ADS)
Karaszi, Zoltan; Konya, Andrew; Dragan, Feodor; Jakli, Antal; CPIP/LCI; CS Dept. of Kent State University Collaboration
Polarizing optical microscopy (POM) is traditionally the best-established method of studying liquid crystals; its use dates back to Otto Lehmann in 1890. An expert who is familiar with the optics of anisotropic materials and the typical textures of liquid crystals can identify phases with relatively high confidence, but unambiguous identification usually requires other expensive and time-consuming experiments. Replacing subjective, qualitative human eye-based liquid crystal texture analysis with quantitative computerized image analysis began only recently; such techniques have been used to enhance the detection of smooth phase transitions and to determine the order parameter and birefringence of specific liquid crystal phases. We investigate whether a computer can recognize and name the phase from which a texture image was taken. To judge the potential of reliable image recognition based on this procedure, we used 871 images of liquid crystal textures belonging to five main categories: nematic, smectic A, smectic C, cholesteric and crystal, and used a neural network clustering technique included in the Java data mining package "WEKA". A neural network trained on a set of 827 LC textures classified the remaining 44 textures with 80% accuracy.
Probability and possibility-based representations of uncertainty in fault tree analysis.
Flage, Roger; Baraldi, Piero; Zio, Enrico; Aven, Terje
2013-01-01
Expert knowledge is an important source of input to risk analysis. In practice, experts might be reluctant to characterize their knowledge and the related (epistemic) uncertainty using precise probabilities. The theory of possibility allows for imprecision in probability assignments. The associated possibilistic representation of epistemic uncertainty can be combined with, and transformed into, a probabilistic representation; in this article, we show this with reference to a simple fault tree analysis. We apply an integrated (hybrid) probabilistic-possibilistic computational framework for the joint propagation of the epistemic uncertainty on the values of the (limiting relative frequency) probabilities of the basic events of the fault tree, and we use possibility-probability (probability-possibility) transformations for propagating the epistemic uncertainty within purely probabilistic and possibilistic settings. The results of the different approaches (hybrid, probabilistic, and possibilistic) are compared with respect to the representation of uncertainty about the top event (limiting relative frequency) probability. Both the rationale underpinning the approaches and the computational efforts they require are critically examined. We conclude that the approaches relevant in a given setting depend on the purpose of the risk analysis, and that further research is required to make the possibilistic approaches operational in a risk analysis context. © 2012 Society for Risk Analysis.
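One simple way to realize a possibility-to-probability transformation is to sample random alpha-cuts of a possibility distribution, as sketched below for a triangular distribution on a basic-event probability; the numbers and the single AND gate are illustrative, not the article's fault tree.

```python
import numpy as np

rng = np.random.default_rng(11)
# triangular possibility distribution: support [a, b], core c (pi = 1)
a, c, b = 1e-4, 5e-4, 2e-3   # assumed basic-event probability range

def alpha_cut_sample(n):
    alpha = rng.uniform(0, 1, n)          # random cut level
    lo = a + alpha * (c - a)              # left endpoint of the alpha-cut
    hi = b - alpha * (b - c)              # right endpoint of the alpha-cut
    return rng.uniform(lo, hi)            # uniform draw within the cut

p_basic = alpha_cut_sample(100_000)       # sampled basic-event probability
# propagate through an AND gate with an independent, precisely known event
p_top = p_basic * 1e-2
print("top-event probability, 95th percentile:", np.percentile(p_top, 95))
```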
Mozaffar, Hajar; Williams, Robin; Cresswell, Kathrin; Morrison, Zoe; Bates, David W; Sheikh, Aziz
2016-03-01
To understand the evolving market of commercial off-the-shelf Computerized Physician Order Entry (CPOE) and Computerized Decision Support (CDS) applications and its effects on their uptake and implementation in English hospitals. Although CPOE and CDS vendors have been quick to enter the English market, uptake has been slow and uneven. To investigate this, the authors undertook qualitative ethnography of vendors and adopters of hospital CPOE/CDS systems in England. The authors collected data from semi-structured interviews with 11 individuals from 4 vendors, including the 2 most entrenched suppliers, and 6 adopter hospitals, and 21 h of ethnographic observation of 2 user groups, and 1 vendor event. The research and analysis was informed by insights from studies of the evolution of technology fields and the emergence of generic COTS enterprise solutions. Four key themes emerged: (1) adoption of systems that had been developed outside of England, (2) vendors' configuration and customization strategies, (3) localized adopter practices vs generic systems, and (4) unrealistic adopter demands. Evidence for our over-arching finding concerning the current immaturity of the market was derived from vendors' strategies, adopters' reactions to the technology, and policy makers' incomplete insights. The CPOE/CDS market in England is still in an emergent phase. The rapid entrance of diverse products, triggered by federal policy initiatives, has resulted in premature adoption of systems that do not yet adequately meet the needs of hospitals. Vendors and adopters lacked understanding of how to design and implement generic solutions to meet diverse user needs. © The Author 2015. Published by Oxford University Press on behalf of the American Medical Informatics Association. All rights reserved. For Permissions, please email: journals.permissions@oup.com.
Evaluation of Spontaneous Spinal Cerebrospinal Fluid Leaks Disease by Computerized Image Processing.
Yıldırım, Mustafa S; Kara, Sadık; Albayram, Mehmet S; Okkesim, Şükrü
2016-05-17
Spontaneous spinal cerebrospinal fluid leak (SSCFL) is a disease caused by tears in the dura mater. Owing to its widespread symptoms and low frequency, diagnosis is problematic. Diagnostic lumbar puncture is commonly used to diagnose SSCFL, though it is invasive and may cause pain, inflammation or new leakages. T2-weighted MR imaging is also used for diagnosis; however, the literature on T2-weighted MRI states that findings could be erroneous when differentiating patients from controls. Another technique is CT myelography, but this has been suggested to be less successful than T2-weighted MRI, and it requires an initial lumbar puncture. This study aimed to develop an objective, computerized numerical analysis method, using noninvasive routine magnetic resonance images, that can be used in the evaluation and diagnosis of SSCFL. Brain boundaries were automatically detected using methods of mathematical morphology, and a distance transform was employed. According to normalized distances, average densities of certain sites were proportioned and a numerical criterion related to cerebrospinal fluid distribution was calculated. The developed method differentiated 14 patients from 14 control subjects significantly (p = 0.0088, d = 0.958). Pre- and post-treatment MRI of four patients were also obtained and analyzed; the results were statistically distinguishable (p = 0.0320, d = 0.853). An original, noninvasive and objective diagnostic test based on computerized image processing has been developed for evaluating SSCFL. To our knowledge, this is the first computerized image processing method for evaluating the disease. Discrimination between patients and controls shows the validity of the method, and the post-treatment changes observed in four patients support this verdict.
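A toy sketch of the pipeline's core image-processing steps (morphological cleanup of a segmented boundary, Euclidean distance transform, normalized-depth intensity ratio) on a synthetic ellipse standing in for a brain slice; the thresholds, shapes and the final ratio criterion are assumptions, not the paper's exact method.

```python
import numpy as np
from scipy import ndimage

# synthetic "brain slice": a noisy ellipse on a 128x128 grid
yy, xx = np.mgrid[0:128, 0:128]
image = ((yy - 64)**2 / 40**2 + (xx - 64)**2 / 55**2 < 1).astype(float)
image += np.random.default_rng(2).normal(0, 0.05, image.shape)

mask = image > 0.5                                  # crude segmentation
mask = ndimage.binary_closing(mask, iterations=2)   # clean the boundary
dist = ndimage.distance_transform_edt(mask)         # depth inside the mask
norm = dist / dist.max()                            # normalized distance

# compare mean intensity in an outer shell vs. a deep central zone
outer = image[(norm > 0) & (norm < 0.2)].mean()
deep = image[norm > 0.8].mean()
print("outer/deep intensity ratio:", round(outer / deep, 3))
```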
Shah, T; Verdile, G; Sohrabi, H; Campbell, A; Putland, E; Cheetham, C; Dhaliwal, S; Weinborn, M; Maruff, P; Darby, D; Martins, R N
2014-12-02
Physical exercise interventions and cognitive training programs have individually been reported to improve cognition in the healthy elderly population; however, the clinical significance of using a combined approach is currently lacking. This study evaluated whether physical activity (PA), computerized cognitive training and/or a combination of both could improve cognition. In this nonrandomized study, 224 healthy community-dwelling older adults (60-85 years) were assigned to 16 weeks home-based PA (n=64), computerized cognitive stimulation (n=62), a combination of both (combined, n=51) or a control group (n=47). Cognition was assessed using the Rey Auditory Verbal Learning Test, Controlled Oral Word Association Test and the CogState computerized battery at baseline, 8 and 16 weeks post intervention. Physical fitness assessments were performed at all time points. A subset (total n=45) of participants underwent [(18)F] fluorodeoxyglucose positron emission tomography scans at 16 weeks (post-intervention). One hundred and ninety-one participants completed the study and the data of 172 participants were included in the final analysis. Compared with the control group, the combined group showed improved verbal episodic memory and significantly higher brain glucose metabolism in the left sensorimotor cortex after controlling for age, sex, premorbid IQ, apolipoprotein E (APOE) status and history of head injury. The higher cerebral glucose metabolism in this brain region was positively associated with improved verbal memory seen in the combined group only. Our study provides evidence that a specific combination of physical and mental exercises for 16 weeks can improve cognition and increase cerebral glucose metabolism in cognitively intact healthy older adults.
ERIC Educational Resources Information Center
Ouellon, Conrad, Comp.
Presentations from a colloquium on applications of research on natural languages to computer science address the following topics: (1) analysis of complex adverbs; (2) parser use in computerized text analysis; (3) French language utilities; (4) lexicographic mapping of official language notices; (5) phonographic codification of Spanish; (6)…
1987-09-01
a useful average for population studies, does not delay data processing, and is relatively inexpensive. Using HVEM and observing recipe preparation procedures improve the...extensive review of the procedures and problems in design, collection, analysis, processing and interpretation of dietary survey data for individuals
Code of Federal Regulations, 2010 CFR
2010-10-01
... 45 Public Welfare 2 2010-10-01 2010-10-01 false Under what circumstances would FFP be suspended or... SYSTEMS AND OFFICE AUTOMATION Funding for Computerized Tribal IV-D Systems and Office Automation § 310.30 Under what circumstances would FFP be suspended or disallowed in the costs of Computerized Tribal IV-D...
Computerized Biomechanical Man-Model
1976-07-01
Force Systems Command, Wright-Patterson AFB, Ohio. ABSTRACT: The COMputerized BIomechanical MAN-Model (called COMBIMAN) is a computer interactive graphics... The use of mock-ups for biomechanical evaluation has long been a tool... The concept was to build a mock-up which permitted the designer to visualize the... can become an obstacle to design change. At the Aerospace Medical Research Laboratory, we are developing a computerized biomechanical man-model
2010-10-01
facial trustworthiness; facial displays of anger) presented subliminally. Furthermore, the responsiveness of these regions to subliminal stimulation ...develop, or program the computerized stimulation paradigms for use during functional neuroimaging (i.e., MJT; BMAT; EFAT). These paradigms will be...programming began on the computerized functional MRI stimulation paradigms using E-Prime software. • Quarter #2: Programming of all computerized functional
2017-10-01
AWARD NUMBER: W81XWH-15-1-0508 TITLE: Multimodal Intervention Trial for Cognitive Deficits in Neurofibromatosis Type 1: Efficacy of Computerized Cognitive Training and Stimulant Medication PRINCIPAL INVESTIGATOR: Maria T. Acosta, M.D. CONTRACTING ORGANIZATION: Children's National Health...database. SUBJECT TERMS: Neurofibromatosis, cognition, pediatric, computerized training programs, working memory
ERIC Educational Resources Information Center
Bennett, Stephanie J.; Holmes, Joni; Buckley, Sue
2013-01-01
This study evaluated the impact of a computerized visuospatial memory training intervention on the memory and behavioral skills of children with Down syndrome. Teaching assistants were trained to support the delivery of a computerized intervention program to individual children over a 10-16 week period in school. Twenty-one children aged 7-12…
ERIC Educational Resources Information Center
Carriedo, Ruben; And Others
The San Diego Unified School District (California) began operating a computerized routing and scheduling system for its pupil transportation services at the beginning of the 1985-86 academic school year. The computerized system, provided by Ecotran Systems, Inc. (ECO) of Cleveland, Ohio experienced an inordinate number of difficulties. A…
NASA Astrophysics Data System (ADS)
Sun, Wenqing; Tseng, Tzu-Liang B.; Zheng, Bin; Zhang, Jianying; Qian, Wei
2015-03-01
A novel approach is proposed for enhancing the performance of computerized breast cancer risk analysis using bilateral mammograms. Based on the intensity of the breast area, five different sub-regions were acquired from one mammogram, and bilateral features were extracted from every sub-region. Our dataset includes 180 bilateral mammograms from 180 women who underwent routine screening examinations, all interpreted as negative and not recalled by the radiologists during the original screening procedures. A computerized breast cancer risk analysis scheme using four image processing modules, including sub-region segmentation, bilateral feature extraction, feature selection, and classification, was designed to detect and compute image feature asymmetry between the left and right breasts imaged on the mammograms. The highest computed area under the curve (AUC) is 0.763 ± 0.021 when applying the multiple sub-region features to our testing dataset. The positive predictive value and the negative predictive value were 0.60 and 0.73, respectively. The study demonstrates that (1) features extracted from multiple sub-regions can improve the performance of our scheme compared to using features from the whole breast area only; (2) a classifier using asymmetry bilateral features can effectively predict breast cancer risk; (3) incorporating texture and morphological features with density features can boost the classification accuracy.
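A small sketch of the asymmetry-scoring idea and its AUC evaluation: synthetic |left - right| features, a naive mean-asymmetry score, and AUC computed via the rank-sum (Mann-Whitney) identity. Nothing here reproduces the paper's features, classifier or data.

```python
import numpy as np

rng = np.random.default_rng(5)
n = 180
asym = np.abs(rng.normal(0, 1, (n, 4)))   # toy |left - right| features
label = rng.integers(0, 2, n)             # synthetic risk labels
asym[label == 1] += 0.6                   # higher asymmetry if "high risk"

score = asym.mean(axis=1)                 # naive asymmetry score

# AUC via the rank-sum identity (ties ignored; scores are continuous)
order = np.argsort(score)
ranks = np.empty(n)
ranks[order] = np.arange(1, n + 1)
n1 = label.sum()
n0 = n - n1
auc = (ranks[label == 1].sum() - n1 * (n1 + 1) / 2) / (n0 * n1)
print("AUC:", round(auc, 3))
```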
Webb, S M; Ruscalleda, J; Schwarzstein, D; Calaf-Alsina, J; Rovira, A; Matos, G; Puig-Domingo, M; de Leiva, A
1992-05-01
We wished to analyse the relative value of computerized tomography and magnetic resonance in patients referred for evaluation of pituitary and parasellar lesions. We performed a separate evaluation by two independent neuroradiologists of computerized tomography and magnetic resonance images ordered numerically and anonymously, with no clinical data available. We studied 40 patients submitted for hypothalamic-pituitary study; 31 were carried out preoperatively, of which histological confirmation later became available in 14. The remaining nine patients were evaluated postoperatively. Over 40 parameters relating to the bony margins, cavernous sinuses, carotid arteries, optic chiasm, suprasellar cisterns, pituitary, pituitary stalk and extension of the lesion were evaluated. These reports were compared with the initial ones offered when the scans were ordered, and with the final diagnosis. Concordance between initial computerized tomography and magnetic resonance was observed in 27 cases (67.5%); among the discordant cases computerized tomography showed the lesion in two, magnetic resonance in 10, while in the remaining case reported to harbour a microadenoma on computerized tomography the differential diagnosis between a true TSH-secreting microadenoma and pituitary resistance to thyroid hormones is still unclear. Both neuroradiologists coincided in their reports in 32 patients (80%); when the initial report was compared with those of the neuroradiologists, concordance was observed with at least one of them in 34 instances (85%). Discordant results were observed principally in microadenomas secreting ACTH or PRL and in delayed puberty. In the eight patients with Cushing's disease (histologically confirmed in six) magnetic resonance was positive in five and computerized tomography in two; the abnormal image correctly identified the side of the lesion at surgery. In patients referred for evaluation of Cushing's syndrome or hyperprolactinaemia (due to microadenomas) or after surgery, magnetic resonance is clearly preferable to computerized tomography. In macroadenomas both scans are equally diagnostic but magnetic resonance offers more information on pituitary morphology and neighbouring structures. Nevertheless, there are cases in which the results of computerized tomography and magnetic resonance will complement each other, since different parameters are analysed with each examination and discordant results are encountered.
Robustness analysis of non-ordinary Petri nets for flexible assembly systems
NASA Astrophysics Data System (ADS)
Hsieh, Fu-Shiung
2010-05-01
Non-ordinary controlled Petri nets (NCPNs) have the advantage of being able to model flexible assembly systems in which multiple identical resources may be required to perform an operation. However, existing studies on NCPNs are still limited. For example, the robustness properties of NCPNs have not been studied. This motivates us to develop an analysis method for NCPNs. Robustness analysis concerns the ability of a system to maintain operation in the presence of uncertainties; it provides an alternative way to analyse a perturbed system without reanalysis. In our previous research, we analysed the robustness properties of several subclasses of ordinary controlled Petri nets. To study the robustness properties of NCPNs, we augment NCPNs with an uncertainty model, which specifies an upper bound on the uncertainties for each reachable marking. The resulting PN models are called non-ordinary controlled Petri nets with uncertainties (NCPNU). Based on NCPNU, the problem is to characterise the maximal tolerable uncertainties for each reachable marking. The computational complexity of characterising the maximal tolerable uncertainties for each reachable marking grows exponentially with the size of the net. Instead of considering general NCPNU, we therefore limit our scope to a subclass of PN models for assembly systems, called non-ordinary controlled flexible assembly Petri nets with uncertainties (NCFAPNU), extend the robustness analysis to this subclass, and identify two types of uncertainties under which the liveness of NCFAPNU can be maintained.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Dai, Heng; Chen, Xingyuan; Ye, Ming
Sensitivity analysis is an important tool for quantifying uncertainty in the outputs of mathematical models, especially for complex systems with a high dimension of spatially correlated parameters. Variance-based global sensitivity analysis has gained popularity because it can quantify the relative contribution of uncertainty from different sources. However, its computational cost increases dramatically with the complexity of the considered model and the dimension of model parameters. In this study we developed a hierarchical sensitivity analysis method that (1) constructs an uncertainty hierarchy by analyzing the input uncertainty sources, and (2) accounts for the spatial correlation among parameters at each level of the hierarchy using geostatistical tools. The contribution of the uncertainty source at each hierarchy level is measured by sensitivity indices calculated using the variance decomposition method. Using this methodology, we identified the most important uncertainty source for a dynamic groundwater flow and solute transport model at the Department of Energy (DOE) Hanford site. The results indicate that boundary conditions and the permeability field contribute the most uncertainty to the simulated head field and tracer plume, respectively. The relative contribution from each source varied spatially and temporally as driven by the dynamic interaction between groundwater and river water at the site. By using a geostatistical approach to reduce the number of realizations needed for the sensitivity analysis, the computational cost of implementing the developed method was reduced to a practically manageable level. The developed sensitivity analysis method is generally applicable to a wide range of hydrologic and environmental problems that deal with high-dimensional spatially distributed parameters.
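As background to the variance-decomposition indices mentioned above, here is a hedged sketch of a Monte Carlo first-order Sobol index using the Saltelli (2010) pick-freeze estimator on a standard test function; the function and sample size are assumptions, not the study's groundwater model.

    # Hedged sketch: first-order Sobol indices by Monte Carlo on the Ishigami
    # test function (an assumption; any model with sampled inputs works).
    import numpy as np

    def model(x):                       # placeholder model y = f(x1, x2, x3)
        return np.sin(x[:, 0]) + 7.0 * np.sin(x[:, 1])**2 \
               + 0.1 * x[:, 2]**4 * np.sin(x[:, 0])

    rng = np.random.default_rng(1)
    n, d = 100_000, 3
    A = rng.uniform(-np.pi, np.pi, size=(n, d))     # two independent sample blocks
    B = rng.uniform(-np.pi, np.pi, size=(n, d))
    fA, fB = model(A), model(B)
    var_y = np.var(np.concatenate([fA, fB]))

    for i in range(d):
        ABi = A.copy()
        ABi[:, i] = B[:, i]             # "freeze" all inputs except parameter i
        S_i = np.mean(fB * (model(ABi) - fA)) / var_y   # Saltelli (2010) estimator
        print(f"first-order index S_{i + 1} = {S_i:.3f}")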
NASA Astrophysics Data System (ADS)
Ruiz, Rafael O.; Meruane, Viviana
2017-06-01
The goal of this work is to describe a framework to propagate uncertainties in piezoelectric energy harvesters (PEHs). These uncertainties are related to the incomplete knowledge of the model parameters. The framework presented could be employed to conduct prior robust stochastic predictions. The prior analysis assumes a known probability density function for the uncertain variables and propagates the uncertainties to the output voltage. The framework is particularized to evaluate the behavior of the frequency response functions (FRFs) in PEHs, while its implementation is illustrated by the use of different unimorph and bimorph PEHs subjected to different scenarios: free of uncertainties, common uncertainties, and uncertainties as a product of imperfect clamping. The common variability associated with the PEH parameters is tabulated and reported. A global sensitivity analysis is conducted to identify the Sobol indices. Results indicate that the elastic modulus, density, and thickness of the piezoelectric layer are the parameters most relevant to the output variability. The importance of including the model parameter uncertainties in the estimation of the FRFs is revealed. In this sense, the present framework constitutes a powerful tool for the robust design and prediction of PEH performance.
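A hedged sketch of the prior propagation idea: sample uncertain parameters from assumed distributions, push each draw through a frequency response function, and report percentile bands. A single-degree-of-freedom surrogate stands in for the electromechanical harvester models; all parameter values and coefficients of variation below are assumptions, not the paper's tabulated data.

    # Hedged sketch: Monte Carlo propagation of parameter uncertainty to an FRF.
    import numpy as np

    rng = np.random.default_rng(2)
    freq = np.linspace(10, 200, 400)          # Hz
    omega = 2 * np.pi * freq
    n_mc = 2000
    m0, c0, k0 = 0.01, 0.05, 4.0e3            # nominal mass, damping, stiffness (assumed)

    frfs = np.empty((n_mc, freq.size))
    for j in range(n_mc):
        m = m0 * (1 + 0.05 * rng.standard_normal())   # ~5% CoV, assumed
        c = c0 * (1 + 0.10 * rng.standard_normal())   # ~10% CoV, assumed
        k = k0 * (1 + 0.05 * rng.standard_normal())
        H = 1.0 / (-m * omega**2 + 1j * c * omega + k)  # SDOF receptance FRF
        frfs[j] = np.abs(H)

    lo, med, hi = np.percentile(frfs, [5, 50, 95], axis=0)
    peak = np.argmax(med)
    print("median peak at", freq[peak], "Hz; 90% band width at peak:", hi[peak] - lo[peak])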
Joshi, Anuradha; Buch, Jatin; Kothari, Nitin; Shah, Nishal
2016-06-01
A prescription order is an important therapeutic transaction between physician and patient. A good quality prescription is an extremely important factor for minimizing errors in dispensing medication, and it should adhere to guidelines for prescription writing for the benefit of the patient. The aims were to evaluate the frequency and type of prescription errors in outpatient prescriptions and to find whether prescription writing abides by WHO standards of prescription writing. A cross-sectional observational study was conducted at Anand city. Allopathic private practitioners of different specialities practising in Anand city were included in the study. Collection of prescriptions was started a month after consent was obtained, to minimize bias in prescription writing. The prescriptions were collected from local pharmacy stores of Anand city over a period of six months. Prescriptions were analysed for errors in standard information, according to the WHO guide to good prescribing. Descriptive analysis was performed to estimate the frequency of errors; data were expressed as numbers and percentages. A total of 749 (549 handwritten and 200 computerized) prescriptions were collected. Abundant omission errors were identified in handwritten prescriptions: the OPD number was mentioned in 6.19%, the patient's age in 25.50%, gender in 17.30%, address in 9.29% and weight in 11.29%, while among drug items only 2.97% of drugs were prescribed by generic name. Route and dosage form were mentioned in 77.35%-78.15%, dose in 47.25%, unit in 13.91%, and regimens in 72.93%, while signa (directions for drug use) appeared in 62.35%. In clinician and patient details, a total of 4384 errors were found in the 549 handwritten prescriptions and 501 errors in the 200 computerized prescriptions. In drug item details, the total numbers of errors identified were 5015 and 621 in handwritten and computerized prescriptions, respectively. Compared to handwritten prescriptions, computerized prescriptions appeared to be associated with relatively lower rates of error. Since outpatient prescription errors are abundant and often occur in handwritten prescriptions, prescribers need to adapt themselves to computerized prescription order entry in their daily practice.
New analysis strategies for micro aspheric lens metrology
NASA Astrophysics Data System (ADS)
Gugsa, Solomon Abebe
Effective characterization of an aspheric micro lens is critical for understanding and improving processing in micro-optic manufacturing. Since most microlenses are plano-convex, where the convex geometry is a conic surface, current practice is often limited to obtaining an estimate of the lens conic constant, which averages out both departures of the surface geometry from an exact conic and any additional surface irregularities. We have developed a comprehensive approach to estimating the best-fit conic and its uncertainty, and in addition propose an alternative analysis that focuses on surface errors rather than the best-fit conic constant. We describe our new analysis strategy based on the two most dominant micro lens metrology methods in use today, namely, scanning white light interferometry (SWLI) and phase shifting interferometry (PSI). We estimate several parameters from the measurement. The major uncertainty contributors for SWLI are the estimates of the base radius of curvature, the aperture of the lens, the sag of the lens, noise in the measurement, and the center of the lens. In the case of PSI the dominant uncertainty contributors are noise in the measurement, the radius of curvature, and the aperture. Our best-fit conic procedure uses least squares minimization to extract a best-fit conic value, which is then subjected to a Monte Carlo analysis to capture combined uncertainty. In our surface errors analysis procedure, we consider the surface errors as the difference between the measured geometry and the best-fit conic surface, or as the difference between the measured geometry and the design specification for the lens. We focus on a Zernike polynomial description of the surface error, and again a Monte Carlo analysis is used to estimate a combined uncertainty, which in this case is an uncertainty for each Zernike coefficient. Our approach also allows us to investigate the effect of individual uncertainty parameters and measurement noise on both the best-fit conic constant analysis and the surface errors analysis, and to compare the individual contributions to the overall uncertainty.
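A hedged sketch of the best-fit-conic-plus-Monte-Carlo procedure, assuming the standard conic sag equation z(r) = c r^2 / (1 + sqrt(1 - (1+k) c^2 r^2)) with c = 1/R; the lens values and noise level are illustrative, not the paper's data, and only measurement noise is propagated here.

    # Hedged sketch: least-squares conic fit plus a noise-only Monte Carlo
    # uncertainty on the conic constant k. Values are placeholders.
    import numpy as np
    from scipy.optimize import least_squares

    def sag(params, r):
        R, k = params
        c = 1.0 / R
        root = np.sqrt(np.maximum(1.0 - (1.0 + k) * c**2 * r**2, 1e-12))  # clamp guards trial steps
        return c * r**2 / (1.0 + root)

    r = np.linspace(0, 0.4e-3, 200)                     # 0.8 mm aperture (assumed)
    true = (1.0e-3, -0.8)                               # R = 1 mm, k = -0.8 (assumed)
    z_meas = sag(true, r) + 5e-9 * np.random.default_rng(3).standard_normal(r.size)

    fit = least_squares(lambda p: sag(p, r) - z_meas, x0=(1.1e-3, -0.5))
    R_hat, k_hat = fit.x

    ks, rng = [], np.random.default_rng(4)
    for _ in range(500):                                # Monte Carlo over measurement noise
        z_mc = sag(fit.x, r) + 5e-9 * rng.standard_normal(r.size)
        ks.append(least_squares(lambda p: sag(p, r) - z_mc, x0=fit.x).x[1])
    print(f"k = {k_hat:.4f} +/- {np.std(ks):.4f} (1-sigma, noise-only)")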
Estimating Uncertainty in N2O Emissions from US Cropland Soils
USDA-ARS?s Scientific Manuscript database
A Monte Carlo analysis was combined with an empirically-based approach to quantify uncertainties in soil N2O emissions from US croplands estimated with the DAYCENT simulation model. Only a subset of croplands was simulated in the Monte Carlo analysis which was used to infer uncertainties across the ...
NASA Technical Reports Server (NTRS)
Tripp, John S.; Tcheng, Ping
1999-01-01
Statistical tools, previously developed for nonlinear least-squares estimation of multivariate sensor calibration parameters and the associated calibration uncertainty analysis, have been applied to single- and multiple-axis inertial model attitude sensors used in wind tunnel testing to measure angle of attack and roll angle. The analysis provides confidence and prediction intervals of calibrated sensor measurement uncertainty as functions of applied input pitch and roll angles. A comparative performance study of various experimental designs for inertial sensor calibration is presented along with corroborating experimental data. The importance of replicated calibrations over extended time periods has been emphasized; replication provides independent estimates of calibration precision and bias uncertainties, statistical tests for calibration or modeling bias uncertainty, and statistical tests for sensor parameter drift over time. A set of recommendations for a new standardized model attitude sensor calibration method and usage procedures is included. The statistical information provided by these procedures is necessary for the uncertainty analysis of aerospace test results now required by users of industrial wind tunnel test facilities.
Mixed results in the safety performance of computerized physician order entry.
Metzger, Jane; Welebob, Emily; Bates, David W; Lipsitz, Stuart; Classen, David C
2010-04-01
Computerized physician order entry is a required feature for hospitals seeking to demonstrate meaningful use of electronic medical record systems and qualify for federal financial incentives. A national sample of sixty-two hospitals voluntarily used a simulation tool designed to assess how well safety decision support worked when applied to medication orders in computerized order entry. The simulation detected only 53 percent of the medication orders that would have resulted in fatalities and 10-82 percent of the test orders that would have caused serious adverse drug events. It is important to ascertain whether actual implementations of computerized physician order entry are achieving goals such as improved patient safety.
Information technology and medication safety: what is the benefit?
Kaushal, R; Bates, D
2002-01-01
Medication errors occur frequently and have significant clinical and financial consequences. Several types of information technologies can be used to decrease rates of medication errors. Computerized physician order entry with decision support significantly reduces serious inpatient medication error rates in adults. Other available information technologies that may prove effective for inpatients include computerized medication administration records, robots, automated pharmacy systems, bar coding, "smart" intravenous devices, and computerized discharge prescriptions and instructions. In outpatients, computerization of prescribing and patient oriented approaches such as personalized web pages and delivery of web based information may be important. Public and private mandates for information technology interventions are growing, but further development, application, evaluation, and dissemination are required. PMID:12486992
The role of preoperative CT scan in patients with tracheoesophageal fistula: a review.
Garge, Saurabh; Rao, K L N; Bawa, Monika
2013-09-01
The morbidity and mortality associated with esophageal atresia with or without a fistula make it a challenging congenital abnormality for the pediatric surgeon. Anatomic factors like the inter-pouch gap and the origin of the fistula are not taken into consideration in various prognostic classifications. The preoperative evaluation of these cases with computerized tomography (CT) has been used by various investigators to delineate these factors. We reviewed these studies to evaluate the usefulness of this investigation in the intraoperative and postoperative period. A literature search was done of all peer-reviewed articles published on preoperative CT in cases of tracheoesophageal fistula using the PUBMED and MEDLINE search engines. Key words included tracheoesophageal fistula, computerized tomography, virtual bronchoscopy, and 3D computerized tomography reconstruction. Further, additional articles were selected from the list of references obtained from the retrieved publications. A total of 8 articles were selected for analysis. In most of the studies, comprising 96 patients, observations noted on preoperative CT were confirmed during surgery. In a study by Mahalik et al [Mahalik SK, Sodhi KS, Narasimhan KL, Rao KL. Role of preoperative 3D CT reconstruction for evaluation of patients with esophageal atresia and tracheoesophageal fistula. Pediatr Surg Int. 2012 Jun 22. Epub ahead of print]
NASA Astrophysics Data System (ADS)
Guillaume, Joseph H. A.; Helgeson, Casey; Elsawah, Sondoss; Jakeman, Anthony J.; Kummu, Matti
2017-08-01
Uncertainty is recognized as a key issue in water resources research, among other sciences. Discussions of uncertainty typically focus on tools and techniques applied within an analysis, e.g., uncertainty quantification and model validation. But uncertainty is also addressed outside the analysis, in writing scientific publications. The language that authors use conveys their perspective of the role of uncertainty when interpreting a claim, what we call here "framing" the uncertainty. This article promotes awareness of uncertainty framing in four ways. (1) It proposes a typology of eighteen uncertainty frames, addressing five questions about uncertainty. (2) It describes the context in which uncertainty framing occurs. This is an interdisciplinary topic, involving philosophy of science, science studies, linguistics, rhetoric, and argumentation. (3) We analyze the use of uncertainty frames in a sample of 177 abstracts from the Water Resources Research journal in 2015. This helped develop and tentatively verify the typology, and provides a snapshot of current practice. (4) We make provocative recommendations to achieve a more influential, dynamic science. Current practice in uncertainty framing might be described as carefully considered incremental science. In addition to uncertainty quantification and degree of belief (present in ~5% of abstracts), uncertainty is addressed by a combination of limiting scope, deferring to further work (~25%) and indicating evidence is sufficient (~40%), or uncertainty is completely ignored (~8%). There is a need for public debate within our discipline to decide in what context different uncertainty frames are appropriate. Uncertainty framing cannot remain a hidden practice evaluated only by lone reviewers.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Kasai, Satoshi; Li Feng; Shiraishi, Junji
Vertebral fracture (or vertebral deformity) is a very common outcome of osteoporosis, which is one of the major public health concerns in the world. Early detection of vertebral fractures is important because timely pharmacologic intervention can reduce the risk of subsequent additional fractures. Chest radiographs are used routinely for detection of lung and heart diseases, and vertebral fractures can be visible on lateral chest radiographs. However, investigators noted that about 50% of vertebral fractures visible on lateral chest radiographs were underdiagnosed or under-reported, even when the fractures were severe. Therefore, our goal was to develop a computerized method for detection of vertebral fractures on lateral chest radiographs in order to assist radiologists' image interpretation and thus allow the early diagnosis of osteoporosis. The cases used in this study were 20 patients with severe vertebral fractures and 118 patients without fractures, as confirmed by the consensus of two radiologists. Radiologists identified the locations of fractured vertebrae, and they provided morphometric data on the vertebral shape for evaluation of the accuracy of detecting vertebral end plates by computer. In our computerized method, a curved search area, which included a number of vertebral end plates, was first extracted automatically, and was straightened so that vertebral end plates became oriented horizontally. Edge candidates were enhanced by use of a horizontal line-enhancement filter in the straightened image, and a multiple thresholding technique, followed by feature analysis, was used for identification of the vertebral end plates. The height of each vertebra was determined from the locations of the identified vertebral end plates, and fractured vertebrae were detected by comparison of the measured vertebral height with the expected height. The sensitivity of our computerized method for detection of fracture cases was 95% (19/20), with 1.03 (139/135) false-positive fractures per image. The accuracy of identifying vertebral end plates, marked by radiologists in a morphometric study, was 76.6% (400/522) and 70.9% (420/592) for cases used for training and those used for testing, respectively. We prepared 32 additional fracture cases for a validation test, and we examined the detection accuracy of our computerized method. The sensitivity for these cases was 75% (24/32) at 1.03 (33/32) false-positive fractures per image. Our preliminary results show that the automated computerized scheme for detecting vertebral fractures on lateral chest radiographs has the potential to assist radiologists in detecting vertebral fractures.
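A minimal sketch of the final detection step, assuming vertebral heights have already been measured from the identified end plates; the 20% height-deficit threshold and the neighbour-average expectation are assumptions, not the authors' exact rule.

    # Hedged sketch: flag vertebrae whose measured height falls well below an
    # expected height interpolated from neighbouring vertebrae.
    import numpy as np

    heights = np.array([22.0, 22.5, 23.1, 15.0, 24.0, 24.6, 25.1])  # mm; one fractured (placeholder)

    flags = []
    for i, h in enumerate(heights):
        neighbours = [heights[j] for j in (i - 1, i + 1) if 0 <= j < heights.size]
        expected = np.mean(neighbours)           # simple local expectation (assumed)
        flags.append(h < 0.8 * expected)         # >20% height loss => candidate fracture
    print("fracture candidates at indices:", [i for i, f in enumerate(flags) if f])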
MDCT for Computerized Volumetry of Pneumothoraces in Pediatric Patients
Cai, Wenli; Lee, Edward Y.; Vij, Abhinav; Mahmood, Soran A.; Yoshida, Hiroyuki
2010-01-01
OBJECTIVE Our purpose in this study was to develop an automated computer-aided volumetry (CAV) scheme for quantifying pneumothorax in MDCT images for pediatric patients and to investigate the imaging parameters that may affect its accuracy. MATERIALS AND METHODS Fifty-eight consecutive pediatric patients (mean age 12±6 years) with pneumothorax who underwent MDCT for evaluation were collected retrospectively for this study. All cases were imaged by a 16- or 64-MDCT scanner with weight-based kilovoltage, low-dose tube current, 1.0-1.5 pitch, 0.6-5.0 mm slice thickness, and a B70f (sharp) or B31f (soft) reconstruction kernel. Sixty-three pneumothoraces ≥1 cc were visually identified in the left (n = 30) and/or right (n = 33) lungs. Each identified pneumothorax was contoured manually on an Amira workstation V4.1.1 (Mercury Computer Systems, Chelmsford, Massachusetts) by two radiologists in consensus. The computerized volumes of the pneumothoraces were determined by application of our CAV scheme. The accuracy of our automated CAV scheme was evaluated by comparison between computerized volumetry and manual volumetry, for the total volume of pneumothoraces in the left and right lungs. RESULTS The mean difference between the computerized volumetry and the manual volumetry for all 63 pneumothoraces ≥1 cc was 8.2%. For pneumothoraces ≥10 cc, ≥50 cc, and ≥200 cc, the mean differences were 7.7% (n=57), 7.3% (n=33), and 6.4% (n=13), respectively. The correlation coefficient was 0.99 between the computerized volume and the manual volume of pneumothoraces. Bland-Altman analysis showed that computerized volumetry has a mean difference of −5.1% compared to manual volumetry. For all pneumothoraces ≥10 cc, the mean differences for slice thickness ≤1.25 mm, =1.5 mm, and =5.0 mm were 6.1% (n=28), 3.5% (n=10), and 12.2% (n=19), respectively. For the two reconstruction kernels, B70f and B31f, the mean differences were 6.3% (n=42, B70f) and 11.7% (n=15, B31f), respectively. CONCLUSION Our automated CAV scheme provides an accurate measurement of pneumothorax volume in MDCT images of pediatric patients. For accurate volumetric quantification of pneumothorax in children in MDCT images by use of the automated CAV scheme, we recommend reconstruction parameters based on a slice thickness ≤1.5 mm and the reconstruction kernel B70f. PMID:21216160
MDCT for computerized volumetry of pneumothoraces in pediatric patients.
Cai, Wenli; Lee, Edward Y; Vij, Abhinav; Mahmood, Soran A; Yoshida, Hiroyuki
2011-03-01
Our purpose in this study was to develop an automated computer-aided volumetry (CAV) scheme for quantifying pneumothorax in multidetector computed tomography (MDCT) images for pediatric patients and to investigate the imaging parameters that may affect its accuracy. Fifty-eight consecutive pediatric patients (mean age 12 ± 6 years) with pneumothorax who underwent MDCT for evaluation were collected retrospectively for this study. All cases were imaged by a 16- or 64-MDCT scanner with weight-based kilovoltage, low-dose tube current, 1.0-1.5 pitch, 0.6-5.0 mm slice thickness, and a B70f (sharp) or B31f (soft) reconstruction kernel. Sixty-three pneumothoraces ≥1 mL were visually identified in the left (n = 30) and right (n = 33) lungs. Each identified pneumothorax was contoured manually on an Amira workstation V4.1.1 (Mercury Computer Systems, Chelmsford, MA) by two radiologists in consensus. The computerized volumes of the pneumothoraces were determined by application of our CAV scheme. The accuracy of our automated CAV scheme was evaluated by comparison between computerized volumetry and manual volumetry, for the total volume of pneumothoraces in the left and right lungs. The mean difference between the computerized volumetry and the manual volumetry for all 63 pneumothoraces ≥1 mL was 8.2%. For pneumothoraces ≥10 mL, ≥50 mL, and ≥200 mL, the mean differences were 7.7% (n = 57), 7.3% (n = 33), and 6.4% (n = 13), respectively. The correlation coefficient was 0.99 between the computerized volume and the manual volume of pneumothoraces. Bland-Altman analysis showed that computerized volumetry has a mean difference of -5.1% compared to manual volumetry. For all pneumothoraces ≥10 mL, the mean differences for slice thickness ≤1.25 mm, = 1.5 mm, and = 5.0 mm were 6.1% (n = 28), 3.5% (n = 10), and 12.2% (n = 19), respectively. For the two reconstruction kernels, B70f and B31f, the mean differences were 6.3% (n = 42, B70f) and 11.7% (n = 15, B31f), respectively. Our automated CAV scheme provides an accurate measurement of pneumothorax volume in MDCT images of pediatric patients. For accurate volumetric quantification of pneumothorax in children in MDCT images by use of the automated CAV scheme, we recommended reconstruction parameters based on a slice thickness ≤1.5 mm and the reconstruction kernel B70f. Copyright © 2011 AUR. Published by Elsevier Inc. All rights reserved.
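For readers unfamiliar with the agreement statistics quoted in both records above, here is a hedged Bland-Altman sketch on placeholder volumes; the published analysis may differ in detail (e.g., absolute rather than percentage differences).

    # Hedged sketch: Bland-Altman bias and 95% limits of agreement between
    # computerized (CAV) and manual volumetry. Volumes are placeholders.
    import numpy as np

    manual = np.array([12.0, 55.0, 210.0, 34.0, 150.0, 18.0])   # mL (placeholder)
    cav = np.array([11.2, 52.9, 199.0, 32.5, 141.0, 17.3])      # mL (placeholder)

    pct_diff = 100.0 * (cav - manual) / ((cav + manual) / 2.0)  # percentage differences
    bias = pct_diff.mean()
    loa = 1.96 * pct_diff.std(ddof=1)                           # 95% limits of agreement
    print(f"bias = {bias:.1f}%, limits of agreement = [{bias - loa:.1f}%, {bias + loa:.1f}%]")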
Kitamura, Takayuki; Hoshimoto, Hiroyuki; Yamada, Yoshitsugu
2009-10-01
Computerized anesthesia-recording systems are expensive, and their introduction takes time and requires huge effort. Generally speaking, the efficacy of computerized anesthesia-recording systems for anesthetic management is judged by their ability to automatically transfer data from the monitors to the anesthetic records, and it tends to be underestimated. However, once the computerized anesthesia-recording systems are integrated into the medical information network, several features can be developed that definitely contribute to improving the quality of anesthetic management; for example, preventing misidentification of patients, preventing mistakes related to blood transfusion, and protecting patients' personal information. Here we describe our experiences with the introduction of computerized anesthesia-recording systems and the construction of a comprehensive medical information network for patients undergoing surgery in The University of Tokyo Hospital. We also discuss the possible efficacy of the comprehensive medical information network for patients during surgery under anesthetic management.
Antipsychotic treatment in schizophrenia: the role of computerized neuropsychological assessment.
Kertzman, Semion; Reznik, Ilya; Grinspan, Haim; Weizman, Abraham; Kotler, Moshe
2008-01-01
The present study analyzes the role of neurocognitive assessment instruments in detecting the contribution of antipsychotic treatment to cognitive functioning. Recently, a panel of experts suggested six main domains (working memory; attention/vigilance; verbal/visual learning and memory; reasoning and problem solving; speed of processing) implicated in schizophrenia-related cognitive deficits, which serve as a theoretical base for the creation of real-time computerized neurocognitive batteries. The high sensitivity of computerized neuropsychological testing is based on its ability to adopt the reaction time (RT) paradigm for the assessment of brain function in a real-time regime. This testing is highly relevant for monitoring the cognitive effects of antipsychotics. Computerized assessment assists in the identification of state- and trait-related cognitive impairments. The optimal real-time computerized neurocognitive battery should strike a balance between broad and narrow coverage of the cognitive domains relevant to the beneficial effects of antipsychotics, and will enable better planning of treatment and rehabilitation programs.
Baldini, Alberto
2010-01-01
This article details a case report of a subject chosen from among patients treated in the author's clinic in the Posturology and Gnathology Section of the University Milano-Bicocca. It shows how the indispensable clinical analysis of the stomatognathic system and its connections with posture can be supported by instrumental analysis, such as a computerized occlusal analysis system and a force platform, to diagnose and treat dysfunctional patients. PMID:22238703
Feasibility analysis of reciprocating magnetic heat pumps
NASA Technical Reports Server (NTRS)
Larson, A. V.; Hartley, J. G.; Shelton, S. V.; Smith, M. M.
1986-01-01
The conceptual design selected for detailed system analysis and optimization is a reciprocating gadolinium core in a regenerative fluid column within the bore of a superconducting magnet. The thermodynamic properties of gadolinium are given. A computerized literature search for relevant papers was conducted, and the results are being analyzed. Contact was made with suppliers of superconducting magnets and accessories, magnetic materials, and various types of hardware. A description of the model for the thermal analysis of the core and regenerator fluids is included.
[Assessment of gestures and their psychiatric relevance].
Bulucz, Judit; Simon, Lajos
2008-01-01
The analysis and investigation of non-verbal behavior and gestures has received much attention since the last century. Thanks to the pioneering work of Ekman and Friesen, we have a number of descriptive-analytic, categorizing, and semantic-content-related scales and scoring systems. The generation of gestures, their integration with speech, and inter-cultural differences are in the focus of interest. Furthermore, analysis of the gestural changes caused by lesions of distinct neurological areas points toward the formation of new diagnostic approaches. The more widespread application of computerized methods has resulted in an increasing number of experiments that study gesture generation and reproduction in mechanical and virtual reality. Increasing efforts are directed toward the understanding of human and computerized recognition of human gestures. In this review we describe these results, emphasizing their relevance to psychiatric and neuropsychiatric disorders, specifically schizophrenia and the affective spectrum.
Using concept mapping for assessing and promoting relational conceptual change in science
NASA Astrophysics Data System (ADS)
Liu, Xiufeng
2004-05-01
In this article, we adopted the relational conceptual change as our theoretical framework to accommodate current views of conceptual change such as ontological beliefs, epistemological commitment, and social/affective contexts commonly mentioned in the literature. We used a specific concept mapping format and process - digraphs and digraphing - as an operational framework for assessing and promoting relational conceptual change. We wanted to find out how concept mapping can be used to account for relational conceptual change. We collected data from a Grade 12 chemistry class using collaborative computerized concept mapping on an ongoing basis during a unit of instruction. Analysis of progressive concept maps and interview transcripts of representative students and the teacher showed that ongoing and collaborative computerized concept mapping is able to account for student conceptual change in ontological, epistemological, and social/affective domains.
Research on ionospheric tomography based on variable pixel height
NASA Astrophysics Data System (ADS)
Zheng, Dunyong; Li, Peiqing; He, Jie; Hu, Wusheng; Li, Chaokui
2016-05-01
A novel ionospheric tomography technique based on variable pixel height was developed for the tomographic reconstruction of the ionospheric electron density distribution. The method considers the height of each pixel as an unknown variable, which is retrieved during the inversion process together with the electron density values. In contrast to conventional computerized ionospheric tomography (CIT), which parameterizes the model with a fixed pixel height, the variable-pixel-height computerized ionospheric tomography (VHCIT) model applies a disturbance to the height of each pixel. In comparison with conventional CIT models, the VHCIT technique achieved superior results in a numerical simulation. A careful validation of the reliability and superiority of VHCIT was performed. According to the results of the statistical analysis of the average root mean square errors, the proposed model offers an improvement of 15% compared with conventional CIT models.
ERIC Educational Resources Information Center
Al Sarhan, Khaled Ali; AlZboon, Saleem Odeh; Olimat, Khalaf Mufleh; Al-Zboon, Mohammad Saleem
2013-01-01
The study aims at introducing the features of the computerized educational games in sciences at the elementary school in Jordan according to the specialists in teaching science and computer subjects, through answering some questions such as: What are the features of the computerized educational games in sciences at the elementary schools in Jordan…
ERIC Educational Resources Information Center
Schumaker, Jean B.; Fisher, Joseph B.; Walsh, Lisa D.
2010-01-01
Effects of a computerized professional development (PD) program were investigated in two studies. For each, teachers were randomly assigned to either a Virtual Workshop (VW) group that used a computerized program for PD or to an Actual Workshop (AW) group that participated in a live PD session. In Study 1, the teachers' knowledge about and…
2009-01-01
Current care guidelines recommend glucose control (GC) in critically ill patients. To achieve GC, many ICUs have implemented a (nurse-based) protocol on paper. However, such protocols are often complex, time-consuming, and can cause iatrogenic hypoglycemia. Computerized glucose regulation protocols may improve patient safety, efficiency, and nurse compliance. Such computerized clinical decision support systems (CDSSs) use more complex logic to provide an insulin infusion rate based on previous blood glucose levels and other parameters. A computerized CDSS for glucose control has the potential to reduce overall workload, reduce the chance of human cognitive failure, and improve glucose control. Several computer-assisted glucose regulation programs have been published recently. In order of increasing complexity, the three main types of algorithms used are computerized flowcharts, Proportional-Integral-Derivative (PID) control, and Model Predictive Control (MPC). PID is essentially a closed-loop feedback system, whereas MPC models the behavior of glucose and insulin in ICU patients. Although the best approach has not yet been determined, it should be noted that PID controllers are generally thought to be more robust than MPC systems. The computerized CDSSs that are most likely to emerge are those that are fully a part of the routine workflow, use patient-specific characteristics, and apply variable sampling intervals. PMID:19849827
Gurung, Arati; Scrafford, Carolyn G; Tielsch, James M; Levine, Orin S; Checkley, William
2011-01-01
Rationale The standardized use of a stethoscope for chest auscultation in clinical research is limited by its inherent inter-listener variability. Electronic auscultation and automated classification of recorded lung sounds may help prevent some of these shortcomings. Objective We sought to perform a systematic review and meta-analysis of studies implementing computerized lung sounds analysis (CLSA) to aid in the detection of abnormal lung sounds for specific respiratory disorders. Methods We searched for articles on CLSA in MEDLINE, EMBASE, Cochrane Library and ISI Web of Knowledge through July 31, 2010. Following qualitative review, we conducted a meta-analysis to estimate the sensitivity and specificity of CLSA for the detection of abnormal lung sounds. Measurements and Main Results Of 208 articles identified, we selected eight studies for review. Most studies employed either electret microphones or piezoelectric sensors for auscultation, and Fourier Transform and Neural Network algorithms for analysis and automated classification of lung sounds. Overall sensitivity for the detection of wheezes or crackles using CLSA was 80% (95% CI 72–86%) and specificity was 85% (95% CI 78–91%). Conclusions While quality data on CLSA are relatively limited, analysis of existing information suggests that CLSA can provide a relatively high specificity for detecting abnormal lung sounds such as crackles and wheezes. Further research and product development could promote the value of CLSA in research studies or its diagnostic utility in clinical settings. PMID:21676606
Gurung, Arati; Scrafford, Carolyn G; Tielsch, James M; Levine, Orin S; Checkley, William
2011-09-01
The standardized use of a stethoscope for chest auscultation in clinical research is limited by its inherent inter-listener variability. Electronic auscultation and automated classification of recorded lung sounds may help prevent some of these shortcomings. We sought to perform a systematic review and meta-analysis of studies implementing computerized lung sound analysis (CLSA) to aid in the detection of abnormal lung sounds for specific respiratory disorders. We searched for articles on CLSA in MEDLINE, EMBASE, Cochrane Library and ISI Web of Knowledge through July 31, 2010. Following qualitative review, we conducted a meta-analysis to estimate the sensitivity and specificity of CLSA for the detection of abnormal lung sounds. Of 208 articles identified, we selected eight studies for review. Most studies employed either electret microphones or piezoelectric sensors for auscultation, and Fourier Transform and Neural Network algorithms for analysis and automated classification of lung sounds. Overall sensitivity for the detection of wheezes or crackles using CLSA was 80% (95% CI 72-86%) and specificity was 85% (95% CI 78-91%). While quality data on CLSA are relatively limited, analysis of existing information suggests that CLSA can provide a relatively high specificity for detecting abnormal lung sounds such as crackles and wheezes. Further research and product development could promote the value of CLSA in research studies or its diagnostic utility in clinical settings. Copyright © 2011 Elsevier Ltd. All rights reserved.
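A hedged sketch of one common way to pool the per-study operating characteristics reported above: inverse-variance weighting on the logit scale. The study counts are placeholders, and the published meta-analysis may have used a different (e.g., bivariate) model.

    # Hedged sketch: pooled sensitivity with a 95% CI via logit-scale
    # inverse-variance weighting. Specificity pools identically from TN/FP.
    import numpy as np

    tp = np.array([40, 25, 60, 18])      # true positives per study (placeholder)
    fn = np.array([10,  6, 15,  5])      # false negatives per study (placeholder)

    sens = tp / (tp + fn)
    logit = np.log(sens / (1 - sens))
    var = 1.0 / tp + 1.0 / fn            # delta-method variance of the logit
    w = 1.0 / var
    pooled_logit = np.sum(w * logit) / np.sum(w)
    se = np.sqrt(1.0 / np.sum(w))
    lo, mid, hi = [1 / (1 + np.exp(-x)) for x in
                   (pooled_logit - 1.96 * se, pooled_logit, pooled_logit + 1.96 * se)]
    print(f"pooled sensitivity = {mid:.2f} (95% CI {lo:.2f}-{hi:.2f})")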
Anconina, Reut; Zur, Dinah; Kesler, Anat; Lublinsky, Svetlana; Toledano, Ronen; Novack, Victor; Benkobich, Elya; Novoa, Rosa; Novic, Evelyne Farkash; Shelef, Ilan
2017-06-01
Dural sinuses vary in size and shape in many pathological conditions with abnormal intracranial pressure. Size and shape normograms of dural brain sinuses are not available. The creation of such normograms may enable computer-assisted comparison to pathologic exams and facilitate diagnoses. The purpose of this study was to quantitatively evaluate normal magnetic resonance venography (MRV) studies in order to create normograms of dural sinuses using a computerized algorithm for vessel cross-sectional analysis. This was a retrospective analysis of MRV studies of 30 healthy persons. Data were analyzed using a specially developed Matlab algorithm for vessel cross-sectional analysis. The cross-sectional area and shape measurements were evaluated to create normograms. Mean cross-sectional size was 53.27±13.31 for the right transverse sinus (TS), 46.87±12.57 for the left TS (p=0.089) and 36.65±12.38 for the superior sagittal sinus. Normograms were created. The distribution of cross-sectional areas along the vessels showed distinct patterns and a parallel course for the 25th, 50th (median) and 75th percentiles. In conclusion, using a novel computerized method for vessel cross-sectional analysis we were able to quantitatively characterize the dural sinuses of healthy persons and create normograms. Copyright © 2017 Elsevier Ltd. All rights reserved.
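A minimal sketch of the normogram construction, assuming each subject's cross-sectional-area curve has been resampled onto a common normalized centerline; all areas below are placeholders, not the study's measurements.

    # Hedged sketch: percentile normograms of cross-sectional area along a
    # vessel, plus comparison of a new exam against the band.
    import numpy as np

    rng = np.random.default_rng(5)
    n_subjects, n_pos = 30, 100
    pos = np.linspace(0.0, 1.0, n_pos)                 # normalized centerline position
    # Placeholder areas: a smooth taper plus subject-level variation.
    areas = (50 - 15 * pos[None, :]
             + rng.normal(0, 5, (n_subjects, 1))       # per-subject offset
             + rng.normal(0, 2, (n_subjects, n_pos)))  # local variation

    p25, p50, p75 = np.percentile(areas, [25, 50, 75], axis=0)
    # A new exam can then be compared against the band at each position:
    exam = 50 - 15 * pos + rng.normal(0, 2, n_pos)
    outside = np.mean((exam < p25) | (exam > p75))
    print(f"fraction of positions outside the 25th-75th band: {outside:.2f}")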
[Biosensor development in clinical analysis].
Boitieux, J L; Desmet, G; Thomas, D
1985-01-01
The use of enzymes, immobilized or as markers, has been the subject of more than a thousand publications in industrial and biomedical applications during the last five years. Recently, some authors have published work on the immobilization of whole microorganisms for catalytic purposes, while others use enzymatic activity to label molecules involved in immunological assays. Both industrial biotechnology and the medical analysis laboratory are interested in the evolution of these procedures involving immobilized enzyme activity. Enzyme immobilization lowers analysis costs because, in this case, the enzyme can be reused several times. We consider the two main cases encountered when immobilized enzymes are used for analytical purposes: the enzyme is used directly to catalyse the reaction, or it is used as an enzymatic marker. Both aspects are developed mainly for the design of enzymatic and immunoenzymatic electrodes and the realization of automatic computerized devices allowing continuous estimation of numerous biological blood parameters. Using two specific examples, glucose and antigen determination, the authors show the evolution of these technologies in the field of immobilized enzymes and sensors, and the analysis of the signals produced by these electrodes, which requires computerized processing. This new technology opens up important possibilities in the analytical field. The automation of these devices, allowing real-time control, will probably facilitate the optimization of procedures currently used in the biomedical field.
NASA Astrophysics Data System (ADS)
Devendran, A. A.; Lakshmanan, G.
2014-11-01
Data quality for GIS processing and analysis is becoming an increasing concern due to the accelerated application of GIS technology in problem solving and decision making roles. Uncertainty in the geographic representation of the real world arises because these representations are incomplete. Identification of the sources of these uncertainties, and of the ways in which they operate in GIS-based representations, becomes crucial in any spatial data representation and in geospatial analysis applied to any field. This paper reviews articles on the various components of spatial data quality and the uncertainties inherent in them, with special focus on two fields of application: urban simulation and hydrological modelling. Urban growth is a complicated process involving spatio-temporal changes of all socio-economic and physical components at different scales. The Cellular Automata (CA) model is one such simulation model; it randomly selects potential cells for urbanisation, and transition rules evaluate the properties of each cell and its neighbours. Uncertainty arising from CA modelling is assessed mainly using sensitivity analysis, including the Monte Carlo simulation method. Likewise, the importance of hydrological uncertainty analysis has been emphasized in recent years, and there is an urgent need to incorporate uncertainty estimation into water resources assessment procedures. The Soil and Water Assessment Tool (SWAT) is a continuous-time watershed model for evaluating various impacts of land use management and climate on hydrology and water quality. Uncertainties in the SWAT hydrological model are dealt with primarily by the Generalized Likelihood Uncertainty Estimation (GLUE) method.
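A hedged sketch of the GLUE procedure named above, using a toy exponential recession model in place of SWAT; the likelihood measure (Nash-Sutcliffe efficiency) and the behavioral threshold are common choices but are assumptions here.

    # Hedged sketch: GLUE with an informal NSE likelihood on a toy model.
    import numpy as np

    rng = np.random.default_rng(6)
    t = np.linspace(0, 10, 50)
    obs = 3.0 * np.exp(-0.4 * t) + rng.normal(0, 0.05, t.size)   # synthetic "observations"

    def simulate(a, b):
        return a * np.exp(-b * t)                                 # toy hydrological response

    params = rng.uniform([1.0, 0.1], [5.0, 1.0], size=(5000, 2))  # uniform priors (assumed)
    sims = np.array([simulate(a, b) for a, b in params])
    nse = 1 - np.sum((sims - obs)**2, axis=1) / np.sum((obs - obs.mean())**2)

    behavioral = nse > 0.7                                        # assumed threshold
    w = nse[behavioral] / nse[behavioral].sum()                   # informal likelihood weights
    mean_pred = w @ sims[behavioral]                              # likelihood-weighted prediction
    print(f"{behavioral.sum()} behavioral sets; mean prediction at t=0: {mean_pred[0]:.2f}")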
Pain Perception: Computerized versus Traditional Local Anesthesia in Pediatric Patients.
Mittal, M; Kumar, A; Srivastava, D; Sharma, P; Sharma, S
2015-01-01
Local anesthetic injection is one of the most anxiety-provoking procedures for both children and adult patients in dentistry. A computerized system for slow delivery of local anesthetic has been developed as a possible solution to reduce the pain related to the local anesthetic injection. The present study was conducted to evaluate and compare pain perception rates in pediatric patients between the computerized system and the traditional method, both objectively and subjectively. It was a randomized controlled study of one hundred children aged 8-12 years in a healthy physical and mental state, assessed as being cooperative, requiring extraction of maxillary primary molars. Children were divided into two groups by random sampling: Group A received buccal and palatal infiltration injections using the Wand, while Group B received buccal and palatal infiltration using a traditional syringe. A Visual Analog Scale (VAS) was used for subjective evaluation of pain perception by the patient. The Sound, Eye, Motor (SEM) scale was used as an objective method, in which the sound, eye and motor reactions of the patient were observed, and heart rate measurement using a pulse oximeter was used as the physiological parameter for objective evaluation. Patients experienced significantly less pain of injection with the computerized method during palatal infiltration, while the difference during buccal infiltration was not statistically significant. Heart rate increased during both buccal and palatal infiltration with both traditional and computerized local anesthesia, but the difference between the traditional and computerized methods was not statistically significant. It was concluded that pain perception was significantly greater during traditional palatal infiltration injection as compared to computerized palatal infiltration, while there was no difference in pain perception during buccal infiltration between the two groups.
Decision-support information system to manage mass casualty incidents at a level 1 trauma center.
Bar-El, Yaron; Tzafrir, Sara; Tzipori, Idan; Utitz, Liora; Halberthal, Michael; Beyar, Rafael; Reisner, Shimon
2013-12-01
Mass casualty incidents are probably the greatest challenge to a hospital. When such an event occurs, hospitals are required to instantly switch from their routine activity to conditions of great uncertainty and confront needs that exceed resources. We describe an information system that was uniquely designed for managing mass casualty events. The web-based system is activated when a mass casualty event is declared; it displays relevant operating procedures, checklists, and a log book. The system automatically or semiautomatically initiates phone calls and public address announcements. It collects real-time data from computerized clinical and administrative systems in the hospital, and presents them to the managing team in a clear graphic display. It also generates periodic reports and summaries of available or scarce resources that are sent to predefined recipients. When the system was tested in a nationwide exercise, it proved to be an invaluable tool for informed decision making in demanding and overwhelming situations such as mass casualty events.
NASA Astrophysics Data System (ADS)
Marukhina, O. V.; Berestneva, O. G.; Emelyanova, Yu A.; Romanchukov, S. V.; Petrova, L.; Lombardo, C.; Kozlova, N. V.
2018-05-01
Healthcare computerization creates opportunities for the development of clinical decision support systems. In the course of diagnosis, a doctor manipulates a considerable amount of data and makes decisions under uncertainty, drawing on first-hand experience and knowledge. The situation is exacerbated by the fact that the scope of knowledge in medicine is growing incrementally, while the time available for decision making is not. The amount of medical malpractice is growing, leading to various negative effects, including an increase in the mortality rate. Developing IT solutions for clinical purposes is one of the most promising and efficient ways to prevent these effects. That is why the efforts of many IT specialists are directed toward software that simulates doctors' heuristics or toward expert-based medical decision-making algorithms. The objective of this study is thus to develop techniques and approaches for assessing the informative value of indices of the body's physiological systems for evaluating the degree of obesity, based on diagnostic findings.
3D shape measurements with a single interferometric sensor for in-situ lathe monitoring
NASA Astrophysics Data System (ADS)
Kuschmierz, R.; Huang, Y.; Czarske, J.; Metschke, S.; Löffler, F.; Fischer, A.
2015-05-01
Temperature drifts, tool deterioration, unknown vibrations and spindle play are major effects that decrease the achievable precision of computerized numerically controlled (CNC) lathes and lead to shape deviations between processed work pieces. Since currently no measurement system exists for fast, precise, in-situ 3d shape monitoring with keyhole access, much effort has to be made to simulate and compensate for these effects. We therefore introduce an optical interferometric sensor for absolute 3d shape measurements, which was integrated into a working lathe. In accordance with the spindle rotational speed, a measurement rate of 2,500 Hz was achieved. In-situ absolute shape, surface profile and vibration measurements are presented. While thermal drifts of the sensor led to errors of several µm in the absolute shape, reference measurements with a coordinate measuring machine show that the surface profile could be measured with an uncertainty below one micron. Additionally, the spindle play of 0.8 µm was measured with the sensor.
NASA Astrophysics Data System (ADS)
Arnbjerg-Nielsen, Karsten; Zhou, Qianqian
2014-05-01
There has been a significant increase in climatic extremes in many regions. In Central and Northern Europe, this has led to more frequent and more severe floods. Along with improved flood modelling technologies, this has enabled the development of economic assessments of climate change adaptation to increasing urban flood risk. Assessment of adaptation strategies often requires a comprehensive risk-based economic analysis of current risk, drivers of change of risk over time, and measures to reduce the risk. However, such studies are often associated with large uncertainties. The uncertainties arise from basic assumptions in the economic analysis and the hydrological model, but also from the projection of future societies, local climate change impacts, and suitable adaptation options. This presents a challenge to decision makers when trying to identify robust measures. We present an integrated uncertainty analysis, which can assess and quantify the overall uncertainty in relation to climate change adaptation to urban flash floods. The analysis is based on an uncertainty cascade that, by means of Monte Carlo simulations of flood risk assessments, incorporates climate change impacts as a key driver of risk changes over time. The overall uncertainty is then attributed to six bulk processes: climate change impact, urban rainfall-runoff processes, stage-depth functions, unit cost of repair, cost of adaptation measures, and discount rate. We apply the approach to an urban hydrological catchment in Odense, Denmark, and find that the uncertainty on the climate change impact appears to have the least influence on the net present value of the studied adaptation measures. This does not imply that the climate change impact is not important, but rather that its uncertainty is not dominant when deciding between action and inaction. We then consider the uncertainty related to choosing between adaptation options given that a decision to act has been taken. In this case the major part of the uncertainty on the estimated net present values is identical for all adaptation options and will therefore not affect a comparison between adaptation measures; this makes the choice among the options easier. Furthermore, the explicit attribution of uncertainty also enables a reduction of the overall uncertainty by identifying the processes that contribute the most. This knowledge can then be used to further reduce the uncertainty related to decision making, as a substantial part of the remaining uncertainty is epistemic.
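A hedged sketch of the uncertainty-cascade idea: Monte Carlo over a few bulk uncertainty sources feeding a net-present-value calculation, with a crude attribution obtained by freezing one source at a time; the distributions, cash-flow shape, and source list are assumptions, not the Odense study's inputs.

    # Hedged sketch: Monte Carlo NPV of an adaptation measure with a crude
    # variance attribution (freeze one source at its median, compare variances).
    import numpy as np

    rng = np.random.default_rng(7)
    years = np.arange(1, 51)

    def npv(climate_factor, damage_scale, cost, rate):
        avoided = damage_scale * climate_factor * np.ones_like(years, dtype=float)
        return np.sum(avoided / (1 + rate)**years) - cost

    def sample(n, freeze=None):
        draws = {
            "climate": rng.lognormal(0.0, 0.2, n),   # climate change impact factor (assumed)
            "damage": rng.lognormal(3.0, 0.4, n),    # avoided-damage scale per year (assumed)
            "cost": rng.normal(400.0, 60.0, n),      # adaptation capital cost (assumed)
            "rate": rng.uniform(0.02, 0.05, n),      # discount rate (assumed)
        }
        if freeze:
            draws[freeze] = np.full(n, np.median(draws[freeze]))
        return np.array([npv(c, d, k, r) for c, d, k, r in
                         zip(draws["climate"], draws["damage"], draws["cost"], draws["rate"])])

    base = sample(4000)
    for src in ("climate", "damage", "cost", "rate"):
        print(f"{src:8s} share ~ {1 - np.var(sample(4000, freeze=src)) / np.var(base):.2f}")
    print(f"P(NPV > 0) = {np.mean(base > 0):.2f}")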
Traceable Coulomb blockade thermometry
NASA Astrophysics Data System (ADS)
Hahtela, O.; Mykkänen, E.; Kemppinen, A.; Meschke, M.; Prunnila, M.; Gunnarsson, D.; Roschier, L.; Penttilä, J.; Pekola, J.
2017-02-01
We present a measurement and analysis scheme for determining traceable thermodynamic temperature at cryogenic temperatures using Coulomb blockade thermometry. The uncertainty of the electrical measurement is improved by utilizing two sampling digital voltmeters instead of the traditional lock-in technique. The remaining uncertainty is dominated by that of the numerical analysis of the measurement data. Two analysis methods are demonstrated: numerical fitting of the full conductance curve and measuring the height of the conductance dip. The complete uncertainty analysis shows that, using either analysis method, the relative combined standard uncertainty (k = 1) in determining the thermodynamic temperature in the temperature range from 20 mK to 200 mK is below 0.5%. In this temperature range, both analysis methods produced temperature estimates that deviated by 0.39% to 0.67% from the reference temperatures provided by a superconducting reference point device calibrated against the Provisional Low Temperature Scale of 2000.
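For context, the quantities fitted in the two analysis methods are related to temperature through the standard Coulomb blockade thermometry results for an N-junction array in the weak-blockade regime (see, e.g., Pekola et al., Phys. Rev. Lett. 73 (1994) 2903); the series below is the commonly quoted third-order form and is given as background, not as the authors' exact fitting function:

    \frac{\Delta G}{G_T} \approx \frac{u_N}{6} - \frac{u_N^2}{60} + \frac{u_N^3}{630},
    \qquad u_N = \frac{E_C}{k_B T},
    \qquad E_C = \frac{N-1}{N}\,\frac{e^2}{C},

    V_{1/2} \approx 5.439\,\frac{N k_B T}{e}.

Here \Delta G/G_T is the relative depth of the conductance dip, C is the island capacitance, and V_{1/2} is the full width of the dip at half minimum; the primary-thermometry property exploited by the dip-width analysis is that V_{1/2} depends only on N, T, and natural constants.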
NASA Technical Reports Server (NTRS)
Litvin, Faydor L.; Fuentes, Alfonso; Hawkins, J. M.; Handschuh, Robert F.
2001-01-01
A new type of face gear drive for application in transmissions, particularly in helicopters, has been developed. The new geometry differs from the existing geometry by application of asymmetric profiles and double-crowned pinion of the face gear mesh. The paper describes the computerized design, simulation of meshing and contact, and stress analysis by finite element method. Special purpose computer codes have been developed to conduct the analysis. The analysis of this new type of face gear is illustrated with a numerical example.
Development of a Prototype Model-Form Uncertainty Knowledge Base
NASA Technical Reports Server (NTRS)
Green, Lawrence L.
2016-01-01
Uncertainties are generally classified as either aleatory or epistemic. Aleatory uncertainties are those attributed to random variation, either naturally or through manufacturing processes. Epistemic uncertainties are generally attributed to a lack of knowledge. One type of epistemic uncertainty is called model-form uncertainty. The term model-form refers to the fact that, among the choices to be made within an analysis during a design process, there are different forms of the analysis, each of which gives different results for the same configuration at the same flight conditions. Examples of model-form uncertainties include the grid density, grid type, and solver type used within a computational fluid dynamics code, or the choice of the number and type of model elements within a structures analysis. The objectives of this work are to identify and quantify a representative set of model-form uncertainties and to make this information available to designers through an interactive knowledge base (KB). The KB can then be used during probabilistic design sessions, so as to enable the possible reduction of uncertainties in the design process through resource investment. An extensive literature search has been conducted to identify and quantify typical model-form uncertainties present within aerospace design. An initial attempt has been made to assemble the results of this literature search into a searchable KB, usable in real time during probabilistic design sessions. A concept of operations and the basic structure of a model-form uncertainty KB are described. Key operations within the KB are illustrated. Current limitations in the KB, and possible workarounds, are explained.
UNCERTAINTY ANALYSIS IN WATER QUALITY MODELING USING QUAL2E
A strategy for incorporating uncertainty analysis techniques (sensitivity analysis, first order error analysis, and Monte Carlo simulation) into the mathematical water quality model QUAL2E is described. The model, named QUAL2E-UNCAS, automatically selects the input variables or p...
Computational Fluid Dynamics Uncertainty Analysis Applied to Heat Transfer over a Flat Plate
NASA Technical Reports Server (NTRS)
Groves, Curtis Edward; Ilie, Marcel; Schallhorn, Paul A.
2013-01-01
There have been few discussions on using Computational Fluid Dynamics (CFD) without experimental validation. Pairing experimental data, uncertainty analysis, and analytical predictions provides a comprehensive approach to verification and is the current state of the art. With pressed budgets, collecting experimental data is rare or non-existent. This paper investigates and proposes a method to perform CFD uncertainty analysis from computational data alone. The method uses current CFD uncertainty techniques coupled with the Student-t distribution to predict the heat transfer coefficient over a flat plate. The inputs to the CFD model are varied within a specified tolerance or bias error, and the differences in the results are used to estimate the uncertainty. The variation in each input is ranked from least to greatest to determine the order of importance. The results are compared to heat transfer correlations, and conclusions are drawn about the feasibility of using CFD without experimental data. The results provide a tactic to analytically estimate the uncertainty in a CFD model when experimental data is unavailable.
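A minimal sketch of the Student-t step described above: treat the outputs of a handful of CFD runs with perturbed inputs as a small sample and form a coverage interval; the values are placeholders, not the paper's results.

    # Hedged sketch: small-sample uncertainty interval via the Student-t
    # distribution on CFD outputs from input perturbations.
    import numpy as np
    from scipy import stats

    # Heat transfer coefficients (W/m^2-K) from runs with perturbed inputs (placeholder).
    h = np.array([48.2, 47.5, 49.1, 48.8, 47.9, 48.4])

    n = h.size
    mean, s = h.mean(), h.std(ddof=1)
    t = stats.t.ppf(0.975, df=n - 1)                 # 95% two-sided coverage factor
    half_width = t * s / np.sqrt(n)
    print(f"h = {mean:.1f} +/- {half_width:.1f} W/m^2-K (95%, Student-t, n={n})")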
ERIC Educational Resources Information Center
Lorenz, Tierney Ahrold; Meston, Cindy May
2012-01-01
Objectives: To better understand the link between childhood sexual abuse (CSA) and adult sexual functioning and satisfaction, we examined cognitive differences between women with (N = 128) and without (NSA, N = 99) CSA histories. Methods: We used the Linguistic Inquiry Word Count, a computerized text analysis program, to investigate language…
Computerized and Networked Government Information Column
ERIC Educational Resources Information Center
Stratford, Juri
2004-01-01
The efforts of the U.S. federal government to develop E-Government services have been the subject of many recent news stories. In February 2002, Arthur Andersen's Office of Government Services released a usability analysis of federal government Web sites. In addition, in February, Vice-President Cheney announced the release of a report detailing…
Industrial and Biological Analogies Used Creatively by Business Professionals
ERIC Educational Resources Information Center
Kennedy, Emily B.; Miller, Derek J.; Niewiarowski, Peter H.
2018-01-01
The objective of this study was to test the effect of far-field industrial (i.e., man-made) versus biological analogies on creativity of business professionals from two organizations engaged in the idea generation phase of new product development. Psychological effects, as reflected in language use, were measured via computerized text analysis of…
DOE Office of Scientific and Technical Information (OSTI.GOV)
Hanham, R.; Vogt, W.G.; Mickle, M.H.
1986-01-01
This book presents the papers given at a conference on computerized simulation. Topics considered at the conference included expert systems, modeling in electric power systems, power systems operating strategies, energy analysis, a linear programming approach to optimum load shedding in transmission systems, econometrics, simulation in natural gas engineering, solar energy studies, artificial intelligence, vision systems, hydrology, multiprocessors, and flow models.
Super and parallel computers and their impact on civil engineering
DOE Office of Scientific and Technical Information (OSTI.GOV)
Kamat, M.P.
1986-01-01
This book presents the papers given at a conference on the use of supercomputers in civil engineering. Topics considered at the conference included solving nonlinear equations on a hypercube, a custom architectured parallel processing system, distributed data processing, algorithms, computer architecture, parallel processing, vector processing, computerized simulation, and cost benefit analysis.
New Tools for "New" History: Computers and the Teaching of Quantitative Historical Methods.
ERIC Educational Resources Information Center
Burton, Orville Vernon; Finnegan, Terence
1989-01-01
Explains the development of an instructional software package and accompanying workbook which teaches students to apply computerized statistical analysis to historical data, improving the study of social history. Concludes that the use of microcomputers and supercomputers to manipulate historical data enhances critical thinking skills and the use…
Outlier Detection in High-Stakes Certification Testing. Research Report.
ERIC Educational Resources Information Center
Meijer, Rob R.
Recent developments in person-fit analysis in computerized adaptive testing (CAT) are discussed. Methods from statistical process control are presented that have been proposed to classify an item score pattern as fitting or misfitting the underlying item response theory (IRT) model in a CAT. Most person-fit research in CAT is restricted to…
Comparative Minicolumnar Morphometry of Three Distinguished Scientists
ERIC Educational Resources Information Center
Casanova, Manuel F.; Switala, Andrew E.; Trippe, Juan; Fitzgerald, Michael
2007-01-01
It has been suggested that the cell minicolumn is the smallest module capable of information processing within the brain. In this case series, photomicrographs of six regions of interests (Brodmann areas 4, 9, 17, 21, 22, and 40) were analyzed by computerized image analysis for minicolumnar morphometry in the brains of three distinguished…
ERIC Educational Resources Information Center
Palaich, Robert M.; Griffin Good, Dixie; van der Ploeg, Arie
2004-01-01
Driven by growing accountability pressures, states and districts have invested in a variety of computerized systems for data storage, analysis, and reporting. As accountability policies demand access to more transparent and accurate data about every aspect of the education process, developing linkages among historically disparate systems is…
2016-01-01
Pennsylvania, we completed computerized volumetric analysis of the structural MRI scans of the brain collected from the study subjects, using the...
ERIC Educational Resources Information Center
Cunningham, Charles E.; Vaillancourt, Tracy; Rimas, Heather; Deal, Ken; Cunningham, Lesley; Short, Kathy; Chen, Yvonne
2009-01-01
We used discrete choice conjoint analysis to model the bullying prevention program preferences of educators. Using themes from computerized decision support lab focus groups (n = 45 educators), we composed 20 three-level bullying prevention program design attributes. Each of 1,176 educators completed 25 choice tasks presenting experimentally…
Towards On-Line Services Based on a Holistic Analysis of Human Activities
NASA Technical Reports Server (NTRS)
Clancey, William J.
2004-01-01
Very often computer scientists view computerization of services in terms of the logistics of human-machine interaction, including establishing a contract, accessing records, and of course designing an interface. But this analysis often moves too quickly to tactical details, failing to frame the entire service in human terms, and not recognizing the mutual learning required to define and relate goals, constraints, and the personalized value of available services. In particular, on-line services that "computerize communication" can be improved by constructing an activity model of what the person is trying to do, not just filtering, comparing, and selling piecemeal services. For example, from the customer's perspective the task of an on-line travel service is not merely to establish confirmed reservations, but to have a complete travel plan, usually integrating many days of transportation, lodging, and recreation into a happy experience. The task of the travel agent is not merely "ticketing", but helping the customer understand what they want and providing services that will connect everything together in an enjoyable way.
A methodology to estimate uncertainty for emission projections through sensitivity analysis.
Lumbreras, Julio; de Andrés, Juan Manuel; Pérez, Javier; Borge, Rafael; de la Paz, David; Rodríguez, María Encarnación
2015-04-01
Air pollution abatement policies must be based on quantitative information about current and future emissions of pollutants. Because uncertainties in emission projections are inevitable and traditional statistical treatments of uncertainty are highly time- and resource-consuming, this work presents a simplified methodology for nonstatistical uncertainty estimation based on sensitivity analysis. The methodology was applied to the "with measures" scenario for Spain, specifically to the 12 highest-emitting sectors for greenhouse gases and air pollutants. Applications of the methodology to two important sectors (power plants, and agriculture and livestock) are shown and explained in depth. Uncertainty bands were obtained up to 2020 by modifying the driving factors of the 12 selected sectors, and the methodology was tested against a recomputed emission trend under a low economic-growth perspective and against official figures for 2010, showing very good performance. A solid understanding and quantification of the uncertainties in atmospheric emission inventories and projections provides useful information for policy negotiations. However, as many of those uncertainties are irreducible, there is interest in how they can be managed to derive robust policy conclusions. Taking this into account, a method that uses sensitivity analysis as a source of information for deriving nonstatistical uncertainty bands for emission projections is presented and applied to Spain. This method simplifies uncertainty assessment and allows other countries to take advantage of their sensitivity analyses.
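A minimal sketch of the one-factor-at-a-time idea behind such nonstatistical bands: shift each driving factor across an assumed range, scale by an assumed sensitivity, and accumulate the band. The factors, ranges, and sensitivities below are invented, not the paper's data, and combining factor effects additively is a simplification.

    # Sketch: nonstatistical uncertainty band for an emission projection,
    # built by perturbing driving factors one at a time. All numbers are
    # hypothetical illustrations.
    baseline_2020 = 100.0  # projected emissions, arbitrary units

    # (factor, assumed low/high multiplier, assumed emission sensitivity)
    factors = [
        ("electricity demand", 0.95, 1.05, 0.9),
        ("fuel mix share",     0.90, 1.10, 0.5),
        ("livestock heads",    0.97, 1.03, 1.0),
    ]

    low, high = baseline_2020, baseline_2020
    for name, f_low, f_high, sens in factors:
        low  += baseline_2020 * sens * (f_low - 1.0)
        high += baseline_2020 * sens * (f_high - 1.0)

    print(f"2020 projection band: [{low:.1f}, {high:.1f}] around {baseline_2020}")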
Wu, Yiping; Liu, Shuguang; Huang, Zhihong; Yan, Wende
2014-01-01
Ecosystem models are useful tools for understanding ecological processes and for sustainable management of resources. In the biogeochemical field, numerical models have been widely used to investigate carbon dynamics under global change from site to regional and global scales. However, it is still challenging to optimize parameters and estimate parameterization uncertainty for complex process-based models such as the Erosion Deposition Carbon Model (EDCM), a modified version of CENTURY that considers the carbon, water, and nutrient cycles of ecosystems. This study was designed to conduct parameter identifiability, optimization, sensitivity, and uncertainty analysis of EDCM using our EDCM-Auto tool, which incorporates the comprehensive R package Flexible Modeling Framework (FME) and the Shuffled Complex Evolution (SCE) algorithm. Using a forest flux tower site as a case study, we implemented a comprehensive modeling analysis involving nine parameters and four target variables (carbon and water fluxes) with corresponding measurements based on the eddy covariance technique. The local sensitivity analysis shows that the model cost function is most sensitive to the plant production-related parameters (e.g., PPDF1 and PRDX). SCE and FME are comparable, and both performed well in deriving an optimal parameter set with satisfactory simulations of the target variables. Global sensitivity and uncertainty analysis indicates that the parameter uncertainty and the resulting output uncertainty can be quantified, and that the magnitude of parameter-uncertainty effects depends on the variable and the season. This study also demonstrates that cutting-edge R tools such as FME are feasible and attractive for conducting comprehensive parameter analysis in ecosystem modeling.
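The calibrate-then-quantify workflow can be sketched in Python as a stand-in for the R/FME plus SCE setup described above, assuming a toy exponential flux model and synthetic observations. SciPy's differential_evolution plays the role of the global optimizer here, and a residual bootstrap gives a crude parameter spread; none of this is the authors' actual code or data.

    # Sketch: global calibration of a toy flux model followed by a crude
    # bootstrap estimate of parameter uncertainty. Model, data, and bounds
    # are hypothetical.
    import numpy as np
    from scipy.optimize import differential_evolution

    rng = np.random.default_rng(0)
    days = np.arange(1, 201)
    true_flux = 5.0 * np.exp(-days / 80.0) + 1.0
    obs = true_flux + rng.normal(0.0, 0.2, days.size)  # synthetic "tower" data

    def cost(theta):
        a, tau, b = theta
        return np.sum((a * np.exp(-days / tau) + b - obs) ** 2)  # cost function

    bounds = [(0.1, 10.0), (10.0, 200.0), (0.0, 5.0)]
    result = differential_evolution(cost, bounds, seed=1)  # SCE-like global search
    print("optimal parameters:", np.round(result.x, 3))

    # Crude uncertainty: re-fit on bootstrap resamples of the residuals.
    a, tau, b = result.x
    best_pred = a * np.exp(-days / tau) + b
    resid = obs - best_pred
    fits = []
    for _ in range(10):
        obs_b = best_pred + rng.choice(resid, resid.size)
        def cost_b(theta, o=obs_b):
            aa, tt, bb = theta
            return np.sum((aa * np.exp(-days / tt) + bb - o) ** 2)
        fits.append(differential_evolution(cost_b, bounds, seed=2).x)
    print("bootstrap parameter std:", np.round(np.std(fits, axis=0), 3))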
Analysis of uncertainties in turbine metal temperature predictions
NASA Technical Reports Server (NTRS)
Stepka, F. S.
1980-01-01
An analysis was conducted to examine the extent to which various factors influence the accuracy of analytically predicting turbine blade metal temperatures and to determine the uncertainties in these predictions for several accuracies of the influence factors. The advanced turbofan engine gas conditions of 1700 K and 40 atmospheres were considered along with those of a highly instrumented high temperature turbine test rig and a low temperature turbine rig that simulated the engine conditions. The analysis showed that the uncertainty in analytically predicting local blade temperature was as much as 98 K, or 7.6 percent of the metal absolute temperature, with current knowledge of the influence factors. The expected reductions in uncertainties in the influence factors with additional knowledge and tests should reduce the uncertainty in predicting blade metal temperature to 28 K, or 2.1 percent of the metal absolute temperature.
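If the individual influence-factor uncertainties are treated as independent, a standard root-sum-square combination reproduces the flavor of such an analysis. The sensitivities and input uncertainties below are illustrative assumptions, not the report's values.

    # Sketch: root-sum-square combination of influence-factor uncertainties
    # into a blade metal temperature uncertainty. All numbers are assumed.
    import math

    # (influence factor, assumed dT_blade per unit change, assumed uncertainty)
    influences = [
        ("gas temperature, K",     0.60, 50.0),
        ("coolant temperature, K", 0.35, 30.0),
        ("heat transfer coeff, %", 1.10, 15.0),
    ]

    total = math.sqrt(sum((s * u) ** 2 for _, s, u in influences))
    print(f"combined blade temperature uncertainty: {total:.0f} K")
    for name, s, u in influences:
        print(f"  {name}: contributes {abs(s * u):.0f} K")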
Relating Data and Models to Characterize Parameter and Prediction Uncertainty
Applying PBPK models in risk analysis requires that we realistically assess the uncertainty of relevant model predictions in as quantitative a way as possible. The reality of human variability may add a confusing feature to the overall uncertainty assessment, as uncertainty and v...
Uncertainty in flood damage estimates and its potential effect on investment decisions
NASA Astrophysics Data System (ADS)
Wagenaar, Dennis; de Bruijn, Karin; Bouwer, Laurens; de Moel, Hans
2015-04-01
This paper addresses the large differences that are found between damage estimates of different flood damage models. It explains how implicit assumptions in flood damage models can lead to large uncertainties in flood damage estimates. This explanation is used to quantify the uncertainty with a Monte Carlo analysis that uses a damage function library with 272 functions from 7 different flood damage models. The resulting uncertainties are on the order of a factor of 2 to 5, and are typically larger for small water depths and for smaller flood events. The implications of the uncertainty in damage estimates for flood risk management are illustrated by a case study in which the economically optimal investment strategy for a dike segment in the Netherlands is determined. The case study shows that the uncertainty in flood damage estimates can lead to significant over- or under-investments.
Uncertainty in flood damage estimates and its potential effect on investment decisions
NASA Astrophysics Data System (ADS)
Wagenaar, D. J.; de Bruijn, K. M.; Bouwer, L. M.; De Moel, H.
2015-01-01
This paper addresses the large differences that are found between damage estimates of different flood damage models. It explains how implicit assumptions in flood damage models can lead to large uncertainties in flood damage estimates. This explanation is used to quantify the uncertainty with a Monte Carlo analysis. As input, the Monte Carlo analysis uses a damage function library with 272 functions from 7 different flood damage models. This results in uncertainties on the order of a factor of 2 to 5, and the resulting uncertainty is typically larger for small water depths and for smaller flood events. The implications of the uncertainty in damage estimates for flood risk management are illustrated by a case study in which the economically optimal investment strategy for a dike segment in the Netherlands is determined. The case study shows that the uncertainty in flood damage estimates can lead to significant over- or under-investments.
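A minimal sketch of the Monte Carlo idea common to both entries above: draw a damage function at random from a library for each realization and read the spread of damage estimates off the resulting distribution. The three toy depth-damage curves and the building value are hypothetical stand-ins for the 272-function library.

    # Sketch: Monte Carlo over a damage-function library. Curves and the
    # building value are hypothetical.
    import numpy as np

    rng = np.random.default_rng(42)
    building_value = 200_000.0  # euros, assumed

    # Each function maps water depth (m) to a damage fraction in [0, 1].
    library = [
        lambda d: min(1.0, 0.30 * d),
        lambda d: min(1.0, 0.15 * d ** 1.5),
        lambda d: 1.0 - np.exp(-0.5 * d),
    ]

    def mc_damage(depth, n=10_000):
        """Sample a damage function per draw; return the damage distribution."""
        idx = rng.integers(0, len(library), size=n)
        return np.array([library[i](depth) * building_value for i in idx])

    for depth in (0.5, 2.0):
        dmg = mc_damage(depth)
        lo, hi = np.percentile(dmg, [5, 95])
        print(f"depth {depth} m: median {np.median(dmg):,.0f}, "
              f"90% band [{lo:,.0f}, {hi:,.0f}]")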
Overall uncertainty measurement for near infrared analysis of cryptotanshinone in tanshinone extract
NASA Astrophysics Data System (ADS)
Xue, Zhong; Xu, Bing; Shi, Xinyuan; Yang, Chan; Cui, Xianglong; Luo, Gan; Qiao, Yanjiang
2017-01-01
This study presented a new strategy for overall uncertainty measurement in near infrared (NIR) quantitative analysis of cryptotanshinone in tanshinone extract powders. The overall uncertainty of the NIR analysis was fully investigated and discussed using validation data from precision, trueness, and robustness studies. Quality by design (QbD) elements, such as risk assessment and design of experiments (DOE), were utilized to organize the validation data. An I × J × K (number of series I, number of repetitions J, and number of concentration levels K) full factorial design was used to calculate uncertainty from the precision and trueness data, and a 2^(7-4) Plackett-Burman matrix with four influence factors identified by failure mode and effect analysis (FMEA) was adopted for the robustness study. The overall uncertainty profile was introduced as a graphical decision-making tool to evaluate the validity of the NIR method over the predefined concentration range. In comparison with T. Saffaj's method (Analyst, 2013, 138, 4677) for overall uncertainty assessment, the proposed approach gave almost the same results, demonstrating that it is reasonable and valid. Moreover, the proposed method can help identify critical factors that influence the NIR prediction performance, which could be used for further optimization of the NIR analytical procedure in routine use.
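The precision/trueness part of such a validation can be sketched with a small ANOVA-style calculation on an I × J × K design: repeatability from the within-series variance, and intermediate precision after adding the between-series component. The simulated NIR predictions and concentration levels below are synthetic, not the study's data.

    # Sketch: uncertainty from an I x J x K precision/trueness design
    # (series x repetitions x concentration levels). Data are synthetic.
    import numpy as np

    rng = np.random.default_rng(7)
    I, J = 3, 4                      # series, repetitions per series
    levels = [0.5, 1.0, 2.0]         # reference concentrations, mg/g (assumed)

    for ref in levels:
        series_bias = rng.normal(0.0, 0.02 * ref, I)         # between-series
        data = ref + series_bias[:, None] + rng.normal(0.0, 0.03 * ref, (I, J))
        within = data.var(axis=1, ddof=1).mean()             # repeatability
        between = max(data.mean(axis=1).var(ddof=1) - within / J, 0.0)
        u = np.sqrt(within + between)                        # intermediate precision
        bias = data.mean() - ref
        print(f"level {ref}: bias {bias:+.4f}, "
              f"expanded uncertainty {2 * u:.4f} (k=2)")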
Uncertainty and Sensitivity Analysis of Afterbody Radiative Heating Predictions for Earth Entry
NASA Technical Reports Server (NTRS)
West, Thomas K., IV; Johnston, Christopher O.; Hosder, Serhat
2016-01-01
The objective of this work was to perform sensitivity analysis and uncertainty quantification for afterbody radiative heating predictions of the Stardust capsule during Earth entry at peak afterbody radiation conditions. The radiation environment in the afterbody region poses significant challenges for accurate uncertainty quantification and sensitivity analysis due to the complexity of the flow physics, the computational cost, and the large number of uncertain variables. In this study, a sparse-collocation non-intrusive polynomial chaos approach along with global nonlinear sensitivity analysis was first used to identify the most significant uncertain variables and reduce the dimension of the stochastic problem. Then, a total-order stochastic expansion was constructed over only the important parameters for an efficient and accurate estimate of the uncertainty in radiation. Based on previous work, 388 uncertain parameters were considered in the radiation model, arising from the thermodynamics, flow field chemistry, and radiation modeling. The sensitivity analysis showed that only four of these variables contributed significantly to afterbody radiation uncertainty, accounting for almost 95% of it. These included the electronic-impact excitation rate for N between levels 2 and 5 and the rates of three chemical reactions influencing the N, N(+), O, and O(+) number densities in the flow field.
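The screening step can be illustrated with a hand-rolled variance-based estimator (Saltelli's first-order Sobol estimator, used here as a simple stand-in for the paper's sparse-collocation polynomial chaos approach). The toy model and the 0.05 cutoff are assumptions.

    # Sketch: screen uncertain inputs by estimated first-order Sobol indices
    # and keep only the influential ones. The toy model is hypothetical.
    import numpy as np

    rng = np.random.default_rng(3)
    n, k = 20_000, 6                  # samples, number of uncertain inputs

    def model(x):
        # Placeholder: two inputs dominate, the rest are nearly inert.
        return 3.0 * x[:, 0] + 2.0 * x[:, 1] ** 2 + 0.05 * x[:, 2:].sum(axis=1)

    A = rng.uniform(0, 1, (n, k))
    B = rng.uniform(0, 1, (n, k))
    fA, fB = model(A), model(B)
    var = np.var(np.concatenate([fA, fB]))

    s1 = []
    for i in range(k):
        ABi = A.copy()
        ABi[:, i] = B[:, i]           # Saltelli's A_B^i matrix
        s1.append(np.mean(fB * (model(ABi) - fA)) / var)

    important = [i for i, s in enumerate(s1) if s > 0.05]
    print("first-order indices:", np.round(s1, 3))
    print("inputs kept for the refined expansion:", important)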
Kriston, Levente; Meister, Ramona
2014-03-01
Judging applicability (relevance) of meta-analytical findings to particular clinical decision-making situations remains challenging. We aimed to describe an evidence synthesis method that accounts for possible uncertainty regarding applicability of the evidence. We conceptualized uncertainty regarding applicability of the meta-analytical estimates to a decision-making situation as the result of uncertainty regarding applicability of the findings of the trials that were included in the meta-analysis. This trial-level applicability uncertainty can be directly assessed by the decision maker and allows for the definition of trial inclusion probabilities, which can be used to perform a probabilistic meta-analysis with unequal probability resampling of trials (adaptive meta-analysis). A case study with several fictitious decision-making scenarios was performed to demonstrate the method in practice. We present options to elicit trial inclusion probabilities and perform the calculations. The result of an adaptive meta-analysis is a frequency distribution of the estimated parameters from traditional meta-analysis that provides individually tailored information according to the specific needs and uncertainty of the decision maker. The proposed method offers a direct and formalized combination of research evidence with individual clinical expertise and may aid clinicians in specific decision-making situations. Copyright © 2014 Elsevier Inc. All rights reserved.
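The unequal-probability resampling idea can be sketched directly: on each bootstrap replicate, each trial enters a fixed-effect pooling step with its elicited applicability probability. All effects, variances, and probabilities below are invented for illustration.

    # Sketch: adaptive meta-analysis by unequal-probability resampling of
    # trials. Trial-level numbers are hypothetical.
    import numpy as np

    rng = np.random.default_rng(11)
    effects   = np.array([0.30, 0.45, 0.10, 0.60, 0.25])  # trial effect sizes
    variances = np.array([0.02, 0.03, 0.01, 0.05, 0.02])
    p_include = np.array([0.90, 0.50, 0.95, 0.30, 0.70])  # applicability

    pooled = []
    for _ in range(5_000):
        keep = rng.random(p_include.size) < p_include
        if not keep.any():
            continue
        w = 1.0 / variances[keep]                 # inverse-variance weights
        pooled.append(np.sum(w * effects[keep]) / np.sum(w))

    pooled = np.asarray(pooled)
    lo, hi = np.percentile(pooled, [2.5, 97.5])
    print(f"pooled effect: {pooled.mean():.3f} [{lo:.3f}, {hi:.3f}]")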
Uncertainty in BRCA1 cancer susceptibility testing.
Baty, Bonnie J; Dudley, William N; Musters, Adrian; Kinney, Anita Y
2006-11-15
This study investigated uncertainty in individuals undergoing genetic counseling/testing for breast/ovarian cancer susceptibility. Sixty-three individuals from a single kindred with a known BRCA1 mutation rated uncertainty about 12 items on a five-point Likert scale before and 1 month after genetic counseling/testing. Factor analysis identified a five-item total uncertainty scale that was sensitive to changes before and after testing. The items in the scale were related to uncertainty about obtaining health care, positive changes after testing, and coping well with results. The majority of participants (76%) rated reducing uncertainty as an important reason for genetic testing. The importance of reducing uncertainty was stable across time and unrelated to anxiety or demographics. Yet, at baseline, total uncertainty was low and decreased after genetic counseling/testing (P = 0.004). Analysis of individual items showed that after genetic counseling/testing, there was less uncertainty about the participant detecting cancer early (P = 0.005) and coping well with their result (P < 0.001). Our findings support the importance to clients of genetic counseling/testing as a means of reducing uncertainty. Testing may help clients to reduce the uncertainty about items they can control, and it may be important to differentiate the sources of uncertainty that are more or less controllable. Genetic counselors can help clients by providing anticipatory guidance about the role of uncertainty in genetic testing. (c) 2006 Wiley-Liss, Inc.
Interactive computer graphics system for structural sizing and analysis of aircraft structures
NASA Technical Reports Server (NTRS)
Bendavid, D.; Pipano, A.; Raibstein, A.; Somekh, E.
1975-01-01
A computerized system for preliminary sizing and analysis of aircraft wing and fuselage structures was described. The system is based upon repeated application of analytical program modules, which are interactively interfaced and sequence-controlled during the iterative design process with the aid of design-oriented graphics software modules. The entire process is initiated and controlled via low-cost interactive graphics terminals driven by a remote computer in a time-sharing mode.
Assessment of Severity of Ovine Smoke Inhalation Injury by Analysis of Computed Tomographic Scans
2003-09-01
Computerized analysis of three-dimensional reconstructed scans was also performed, based on Hounsfield unit ranges: hyperinflated, -1,000 to -900; normal...the interactive segmentation function of the software. The pulmonary parenchyma was separated into four regions based on the Hounsfield unit (HU...SII) severity. Methods: Twenty anesthetized sheep underwent graded SII: group I, no smoke; group II, 5 smoke units; group III, 10 units; and group IV
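The four-region HU classification the snippet describes can be sketched as simple thresholding. Only the hyperinflated band (-1,000 to -900 HU) is given in the text, so the remaining cut points below are assumptions, and the voxel values are synthetic.

    # Sketch: classify lung voxels into four Hounsfield-unit bands. Only the
    # hyperinflated band comes from the text; the others are assumed.
    import numpy as np

    rng = np.random.default_rng(5)
    hu = rng.normal(-700, 250, size=100_000)  # synthetic lung-field voxels

    bands = {
        "hyperinflated":    (-1000, -900),
        "normally aerated": (-900, -500),     # assumed
        "poorly aerated":   (-500, -100),     # assumed
        "non-aerated":      (-100, 100),      # assumed
    }

    for name, (lo, hi) in bands.items():
        frac = np.mean((hu >= lo) & (hu < hi))
        print(f"{name:>16s}: {100 * frac:5.1f}% of voxels")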