NASA Astrophysics Data System (ADS)
Makahinda, T.
2018-02-01
The purpose of this research is to determine the effect of a technology-based learning model and assessment technique on thermodynamics achievement, controlling for students' intelligence. This is an experimental study. The sample was taken through cluster random sampling, with a total of 80 student respondents. The results show that, after controlling for student intelligence, the thermodynamics achievement of students taught with the environmental-utilization learning model is higher than that of students taught with simulation animation. There is also an interaction effect between the technology-based learning model and the assessment technique on students' thermodynamics achievement, after controlling for student intelligence. Based on these findings, thermodynamics lectures should use the environmental-utilization learning model together with a project assessment technique.
Precision Learning Assessment: An Alternative to Traditional Assessment Techniques.
ERIC Educational Resources Information Center
Caltagirone, Paul J.; Glover, Christopher E.
1985-01-01
A continuous and curriculum-based assessment method, Precision Learning Assessment (PLA), which integrates precision teaching and norm-referenced techniques, was applied to a math computation curriculum for 214 third graders. The resulting districtwide learning curves defining average annual progress through the computation curriculum provided…
Objective Assessment of Patient Inhaler User Technique Using an Audio-Based Classification Approach.
Taylor, Terence E; Zigel, Yaniv; Egan, Clarice; Hughes, Fintan; Costello, Richard W; Reilly, Richard B
2018-02-01
Many patients make critical user technique errors when using pressurised metered dose inhalers (pMDIs), which reduce the clinical efficacy of respiratory medication. Such critical errors include poor actuation coordination (poor timing of medication release during inhalation) and inhaling too fast (peak inspiratory flow rate over 90 L/min). Here, we present a novel audio-based method that objectively assesses patient pMDI user technique. The Inhaler Compliance Assessment device was employed to record inhaler audio signals from 62 respiratory patients as they used a pMDI with an In-Check Flo-Tone device attached to the inhaler mouthpiece. Using a quadratic discriminant analysis approach, the audio-based method achieved a total frame-by-frame accuracy of 88.2% in classifying sound events (actuation, inhalation and exhalation). The audio-based method estimated the peak inspiratory flow rate and volume of inhalations with accuracies of 88.2% and 83.94%, respectively. It was found that 89% of patients made at least one critical user technique error even after tuition from an expert clinical reviewer. This method provides a more clinically accurate assessment of patient inhaler user technique than standard checklist methods.
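As an illustration of the frame-by-frame classification step described above, the following sketch trains a quadratic discriminant analysis classifier on per-frame acoustic features and scores its accuracy; the feature dimensions and data here are placeholders, not the Inhaler Compliance Assessment recordings.

```python
# Sketch of frame-by-frame inhaler-audio event classification with quadratic
# discriminant analysis (QDA), assuming per-frame acoustic features have
# already been extracted; the feature values below are random placeholders.
import numpy as np
from sklearn.discriminant_analysis import QuadraticDiscriminantAnalysis
from sklearn.metrics import accuracy_score

# X: (n_frames, n_features) acoustic features; y: labels per frame
# 0 = actuation, 1 = inhalation, 2 = exhalation (toy random data here)
rng = np.random.default_rng(0)
X_train = rng.normal(size=(600, 13))
y_train = rng.integers(0, 3, size=600)
X_test = rng.normal(size=(200, 13))
y_test = rng.integers(0, 3, size=200)

qda = QuadraticDiscriminantAnalysis()
qda.fit(X_train, y_train)
frame_accuracy = accuracy_score(y_test, qda.predict(X_test))
print(f"frame-by-frame accuracy: {frame_accuracy:.3f}")
```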
Soft computing-based terrain visual sensing and data fusion for unmanned ground robotic systems
NASA Astrophysics Data System (ADS)
Shirkhodaie, Amir
2006-05-01
In this paper, we have primarily discussed technical challenges and navigational skill requirements of mobile robots for traversability path planning in natural terrain environments similar to Mars surface terrains. We have described different methods for detection of salient terrain features based on imaging texture analysis techniques. We have also presented three competing techniques for terrain traversability assessment of mobile robots navigating in unstructured natural terrain environments. These three techniques include: a rule-based terrain classifier, a neural network-based terrain classifier, and a fuzzy-logic terrain classifier. Each proposed terrain classifier divides a region of natural terrain into finite sub-terrain regions and classifies terrain condition exclusively within each sub-terrain region based on terrain visual clues. The Kalman Filtering technique is applied for aggregative fusion of sub-terrain assessment results. The last two terrain classifiers are shown to have remarkable capability for terrain traversability assessment of natural terrains. We have conducted a comparative performance evaluation of all three terrain classifiers and presented the results in this paper.
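A minimal sketch of the aggregative fusion idea, assuming each sub-terrain region yields a noisy traversability score in [0, 1] that is assimilated by a scalar Kalman filter; the noise variances and scores are hypothetical, not values from the paper.

```python
# Minimal sketch: fuse noisy traversability scores from successive sub-terrain
# regions with a scalar Kalman filter (scores and noise levels are hypothetical).
def kalman_fuse(scores, meas_var=0.04, proc_var=0.01):
    """Return the fused traversability estimate after sequentially
    assimilating each sub-terrain score in [0, 1]."""
    x, p = scores[0], meas_var          # initial state and variance
    for z in scores[1:]:
        p += proc_var                   # predict: terrain may change slightly
        k = p / (p + meas_var)          # Kalman gain
        x += k * (z - x)                # update with new sub-terrain score
        p *= (1.0 - k)
    return x

sub_terrain_scores = [0.72, 0.65, 0.80, 0.70]   # e.g. from the terrain classifiers
print(round(kalman_fuse(sub_terrain_scores), 3))
```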
Authoring of Adaptive Computer Assisted Assessment of Free-Text Answers
ERIC Educational Resources Information Center
Alfonseca, Enrique; Carro, Rosa M.; Freire, Manuel; Ortigosa, Alvaro; Perez, Diana; Rodriguez, Pilar
2005-01-01
Adaptation techniques can be applied not only to the multimedia contents or navigational possibilities of a course, but also to the assessment. In order to facilitate the authoring of adaptive free-text assessment and its integration within adaptive web-based courses, Adaptive Hypermedia techniques and Free-text Computer Assisted Assessment are…
Visual terrain mapping for traversable path planning of mobile robots
NASA Astrophysics Data System (ADS)
Shirkhodaie, Amir; Amrani, Rachida; Tunstel, Edward W.
2004-10-01
In this paper, we have primarily discussed technical challenges and navigational skill requirements of mobile robots for traversability path planning in natural terrain environments similar to Mars surface terrains. We have described different methods for detection of salient terrain features based on imaging texture analysis techniques. We have also presented three competing techniques for terrain traversability assessment of mobile robots navigating in unstructured natural terrain environments. These three techniques include: a rule-based terrain classifier, a neural network-based terrain classifier, and a fuzzy-logic terrain classifier. Each proposed terrain classifier divides a region of natural terrain into finite sub-terrain regions and classifies terrain condition exclusively within each sub-terrain region based on terrain visual clues. The Kalman Filtering technique is applied for aggregative fusion of sub-terrain assessment results. The last two terrain classifiers are shown to have remarkable capability for terrain traversability assessment of natural terrains. We have conducted a comparative performance evaluation of all three terrain classifiers and presented the results in this paper.
Computational technique for stepwise quantitative assessment of equation correctness
NASA Astrophysics Data System (ADS)
Othman, Nuru'l Izzah; Bakar, Zainab Abu
2017-04-01
Many of the computer-aided mathematics assessment systems that are available today possess the capability to implement stepwise correctness checking of a working scheme for solving equations. The computational technique for assessing the correctness of each response in the scheme mainly involves checking mathematical equivalence and providing qualitative feedback. This paper presents a technique, known as the Stepwise Correctness Checking and Scoring (SCCS) technique, that checks the correctness of each equation in terms of structural equivalence and provides quantitative feedback. The technique, which is based on the Multiset framework, adapts certain techniques from textual information retrieval involving tokenization, document modelling and similarity evaluation. The performance of the SCCS technique was tested using worked solutions for linear algebraic equations in one variable. 350 working schemes comprising 1385 responses were collected using a marking engine prototype developed based on the technique. The results show that both the automated analytical scores and the automated overall scores generated by the marking engine exhibit high percent agreement, high correlation and a high degree of agreement with manual scores, with small average absolute and mixed errors.
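The tokenize-and-compare idea behind such stepwise checking can be sketched as a bag-of-tokens overlap between the expected step and the student's response; this is an illustrative simplification, not the authors' exact SCCS algorithm or scoring rubric.

```python
# Simplified sketch of the tokenize-and-compare idea behind stepwise checking:
# each equation response is tokenized and compared with the expected step using
# a multiset (bag-of-tokens) overlap score.
import re
from collections import Counter

def tokenize(equation: str) -> Counter:
    # split into numbers, variables, and operators, ignoring whitespace
    return Counter(re.findall(r"\d+|[a-zA-Z]+|[=+\-*/()]", equation))

def similarity(expected: str, response: str) -> float:
    a, b = tokenize(expected), tokenize(response)
    overlap = sum((a & b).values())           # common tokens (with multiplicity)
    total = max(sum(a.values()), sum(b.values()))
    return overlap / total if total else 1.0

# Example: one step of solving 2x + 3 = 7
print(similarity("2x = 7 - 3", "2x = 7 - 3"))   # 1.0 -> structurally identical
print(similarity("2x = 7 - 3", "2x = 4"))       # partial credit for this step
```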
Digital education and dynamic assessment of tongue diagnosis based on Mashup technique.
Tsai, Chin-Chuan; Lo, Yen-Cheng; Chiang, John Y; Sainbuyan, Natsagdorj
2017-01-24
To assess digital education and dynamic assessment of tongue diagnosis based on the Mashup technique (DEDATD) according to a specific user's answering pattern, and to provide pertinent information tailored to the user's specific needs, supplemented by teaching materials constantly updated through the Mashup technique. Fifty-four undergraduate students were tested with the developed DEDATD. The efficacy of the DEDATD was evaluated based on pre- and post-test performance, with interleaving training sessions targeting the weaknesses of the student under test. The t-test demonstrated a significant difference in scores between the pre- and post-test sessions and a positive correlation between scores gained and length of time spent on learning, while there were no significant differences between gender and post-test score, or between the students' year in school and the progress in scores gained. DEDATD, coupled with the Mashup technique, could provide updated materials filtered from diverse sources located across the network. The dynamic assessment could tailor learning to each individual learner's needs and offer custom-made learning materials. DEDATD poses a great improvement over traditional teaching methods.
NASA Technical Reports Server (NTRS)
Lakshminarayana, B.
1991-01-01
Various computational fluid dynamic techniques are reviewed focusing on the Euler and Navier-Stokes solvers with a brief assessment of boundary layer solutions, and quasi-3D and quasi-viscous techniques. Particular attention is given to a pressure-based method, explicit and implicit time marching techniques, a pseudocompressibility technique for incompressible flow, and zonal techniques. Recommendations are presented with regard to the most appropriate technique for various flow regimes and types of turbomachinery, incompressible and compressible flows, cascades, rotors, stators, liquid-handling, and gas-handling turbomachinery.
Baldwin, Lydia J L; Jones, Christopher M; Hulme, Jonathan; Owen, Andrew
2015-11-01
Feedback is vital for the effective delivery of skills-based education. We sought to compare the sandwich technique and learning conversation structured methods of feedback delivery in competency-based basic life support (BLS) training. Open randomised crossover study undertaken between October 2014 and March 2015 at the University of Birmingham, United Kingdom. Six-hundred and forty healthcare students undertaking a European Resuscitation Council (ERC) BLS course were enrolled, each of whom was randomised to receive teaching using either the sandwich technique or the learning conversation. Fifty-eight instructors were randomised to initially teach using either the learning conversation or sandwich technique, prior to crossing-over and teaching with the alternative technique after a pre-defined time period. Outcome measures included skill acquisition as measured by an end-of-course competency assessment, instructors' perception of teaching with each feedback technique and candidates' perception of the feedback they were provided with. Scores assigned to use of the learning conversation by instructors were significantly more favourable than for the sandwich technique across all but two assessed domains relating to instructor perception of the feedback technique, including all skills-based domains. No difference was seen in either assessment pass rates (80.9% sandwich technique vs. 77.2% learning conversation; OR 1.2, 95% CI 0.85-1.84; p=0.29) or any domain relating to candidates' perception of their teaching technique. This is the first direct comparison of two feedback techniques in clinical medical education using both quantitative and qualitative methodology. The learning conversation is preferred by instructors providing competency-based life support training and is perceived to favour skills acquisition. Copyright © 2015 Elsevier Ireland Ltd. All rights reserved.
Use of Formative Classroom Assessment Techniques in a Project Management Course
ERIC Educational Resources Information Center
Purcell, Bernice M.
2014-01-01
Formative assessment is considered to be an evaluation technique that informs the instructor of the level of student learning, giving evidence when it may be necessary for the instructor to make a change in delivery based upon the results. Several theories of formative assessment exist, all which propound the importance of feedback to the student.…
Attitudes of Nigerian Secondary School Teachers towards Media-Based Learning.
ERIC Educational Resources Information Center
Ekpo, C. M.
This document presents results of a study assessing the attitudes of secondary school teachers towards media based learning. The study explores knowledge of and exposure to media based learning techniques of a cross section of Nigerian secondary school teachers. Factors that affect the use of media based learning technique are sought. Media based…
Program risk analysis handbook
NASA Technical Reports Server (NTRS)
Batson, R. G.
1987-01-01
NASA regulations specify that formal risk analysis be performed on a program at each of several milestones. Program risk analysis is discussed as a systems analysis approach, an iterative process (identification, assessment, management), and a collection of techniques. These techniques, which range from extremely simple to complex network-based simulation, are described in this handbook in order to provide both analyst and manager with a guide for selection of the most appropriate technique. All program risk assessment techniques are shown to be based on elicitation and encoding of subjective probability estimates from the various area experts on a program. Techniques to encode the five most common distribution types are given. Then, a total of twelve distinct approaches to risk assessment are given. Steps involved, good and bad points, time involved, and degree of computer support needed are listed. Why risk analysis should be used by all NASA program managers is discussed. Tools available at NASA-MSFC are identified, along with commercially available software. Bibliography (150 entries) and a program risk analysis check-list are provided.
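A small sketch of the elicitation-and-encoding step: an expert's low/most-likely/high cost estimates are encoded as triangular distributions and propagated by Monte Carlo simulation; the work packages and dollar figures are invented for illustration.

```python
# Sketch: encode an expert's (low, most likely, high) cost estimates as
# triangular distributions and combine them by Monte Carlo simulation.
# The work-package names and dollar figures are purely hypothetical.
import numpy as np

rng = np.random.default_rng(42)
work_packages = {                 # (low, mode, high) cost in $M, from experts
    "propulsion": (10.0, 12.0, 18.0),
    "avionics":   (4.0, 5.0, 9.0),
    "structure":  (7.0, 8.0, 10.0),
}
n = 100_000
total = sum(rng.triangular(lo, mode, hi, size=n)
            for lo, mode, hi in work_packages.values())
print(f"mean program cost : {total.mean():.1f} $M")
print(f"80th percentile   : {np.percentile(total, 80):.1f} $M")
```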
Cost considerations in using simulations for medical training.
Fletcher, J D; Wind, Alexander P
2013-10-01
This article reviews simulation used for medical training, techniques for assessing simulation-based training, and cost analyses that can be included in such assessments. Simulation in medical training appears to take four general forms: human actors who are taught to simulate illnesses and ailments in standardized ways; virtual patients who are generally presented via computer-controlled, multimedia displays; full-body manikins that simulate patients using electronic sensors, responders, and controls; and part-task anatomical simulations of various body parts and systems. Techniques for assessing costs include benefit-cost analysis, return on investment, and cost-effectiveness analysis. Techniques for assessing the effectiveness of simulation-based medical training include the use of transfer effectiveness ratios and incremental transfer effectiveness ratios to measure transfer of knowledge and skill provided by simulation to the performance of medical procedures. Assessment of costs and simulation effectiveness can be combined with measures of transfer using techniques such as isoperformance analysis to identify ways of minimizing costs without reducing performance effectiveness or maximizing performance without increasing costs. In sum, economic analysis must be considered in training assessments if training budgets are to compete successfully with other requirements for funding. Reprint & Copyright © 2013 Association of Military Surgeons of the U.S.
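A worked illustration of a transfer effectiveness ratio, computed here as real-procedure training time saved per hour of simulator time; the figures and the simple cost comparison are hypothetical and follow one common formulation rather than the article's specific data.

```python
# Worked example (hypothetical numbers) of a transfer effectiveness ratio:
# time-to-criterion saved on the real procedure per hour spent in simulation.
hours_to_criterion_control = 12.0   # trainees with no simulator practice
hours_to_criterion_sim     = 7.0    # trainees who first trained on the simulator
simulator_hours            = 10.0   # time spent in the simulator

ter = (hours_to_criterion_control - hours_to_criterion_sim) / simulator_hours
print(f"TER = {ter:.2f} hours of real training saved per simulator hour")

# Combining with cost gives a simple cost-effectiveness comparison:
cost_real_hour, cost_sim_hour = 1500.0, 200.0      # hypothetical $/hour
savings = (hours_to_criterion_control - hours_to_criterion_sim) * cost_real_hour
print(f"net benefit per trainee = ${savings - simulator_hours * cost_sim_hour:,.0f}")
```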
Fantini, Sergio; Sassaroli, Angelo; Tgavalekos, Kristen T.; Kornbluth, Joshua
2016-01-01
Abstract. Cerebral blood flow (CBF) and cerebral autoregulation (CA) are critically important to maintain proper brain perfusion and supply the brain with the necessary oxygen and energy substrates. Adequate brain perfusion is required to support normal brain function, to achieve successful aging, and to navigate acute and chronic medical conditions. We review the general principles of CBF measurements and the current techniques to measure CBF based on direct intravascular measurements, nuclear medicine, X-ray imaging, magnetic resonance imaging, ultrasound techniques, thermal diffusion, and optical methods. We also review techniques for arterial blood pressure measurements as well as theoretical and experimental methods for the assessment of CA, including recent approaches based on optical techniques. The assessment of cerebral perfusion in the clinical practice is also presented. The comprehensive description of principles, methods, and clinical requirements of CBF and CA measurements highlights the potentially important role that noninvasive optical methods can play in the assessment of neurovascular health. In fact, optical techniques have the ability to provide a noninvasive, quantitative, and continuous monitor of CBF and autoregulation. PMID:27403447
Optical coherence tomography angiography in glaucoma care.
Chansangpetch, Sunee; Lin, Shan C
2018-05-14
Rapid improvements in optical coherence tomography (OCT) technology have allowed for enhancement of both image resolution and scanning speed, and the development of vascular assessment modality. Optical coherence tomography angiography (OCTA) is the non-invasive in vivo imaging of the vasculature located within the retina and optic nerve head area. The principle of OCTA is to use the variations in OCT signals caused by moving particles as the contrast mechanism for imaging of flow. Several algorithms which aim to maximize the contrast signal and minimize the noise have been developed including the phase-based techniques, intensity-based techniques (e.g., split-spectrum amplitude decorrelation angiography (SSADA)), and complex-based techniques (e.g., optical microangiography (OMAG)). With its reliable technique, high image resolution, and current availability, OCTA has been widely used in the assessment of posterior segment diseases including glaucoma in which ocular perfusion dysfunction has been proposed as a pathophysiological mechanism. This review will provide the reader with information on the principle techniques of OCTA; the current literature on OCTA reproducibility; its applications to glaucoma detection and monitoring of progression; and the role of OCTA in the assessment of the vascular component in glaucoma pathogenesis.
Health Risk-Based Assessment and Management of Heavy Metals-Contaminated Soil Sites in Taiwan
Lai, Hung-Yu; Hseu, Zeng-Yei; Chen, Ting-Chien; Chen, Bo-Ching; Guo, Horng-Yuh; Chen, Zueng-Sang
2010-01-01
Risk-based assessment is a way to evaluate the potential hazards of contaminated sites and is based on considering linkages between pollution sources, pathways, and receptors. These linkages can be broken by source reduction, pathway management, and modifying exposure of the receptors. In Taiwan, the Soil and Groundwater Pollution Remediation Act (SGWPR Act) uses one target regulation to evaluate the contamination status of soil and groundwater pollution. More than 600 sites contaminated with heavy metals (HMs) have been remediated, and the costs of this process are always high. Besides using soil remediation techniques to remove contaminants from these sites, the selection of possible remediation methods to obtain rapid risk reduction is permissible and of increasing interest. This paper discusses previous soil remediation techniques applied to different sites in Taiwan and also clarifies the differences in risk assessment before and after soil remediation obtained by applying different risk assessment models. This paper also includes many case studies on: (1) food safety risk assessment for brown rice growing in a HMs-contaminated site; (2) a tiered approach to health risk assessment for a contaminated site; (3) risk assessment for phytoremediation techniques applied in HMs-contaminated sites; and (4) soil remediation cost analysis for contaminated sites in Taiwan. PMID:21139851
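A generic sketch of the kind of exposure calculation that underlies risk-based site assessment, expressed as an average daily dose from soil ingestion divided by a reference dose to give a hazard quotient; all parameter values are hypothetical defaults, not data from the Taiwanese sites.

```python
# Generic health-risk calculation of the kind used in risk-based site assessment:
# average daily dose via soil ingestion and the resulting hazard quotient.
# All parameter values below are hypothetical illustrations, not the paper's data.
def hazard_quotient(conc_mg_per_kg, rfd_mg_per_kg_day,
                    ingestion_kg_day=1e-4,   # 100 mg soil/day
                    exposure_freq=350,       # days/year
                    exposure_dur=30,         # years
                    body_weight=70,          # kg
                    avg_time_days=30 * 365):
    add = (conc_mg_per_kg * ingestion_kg_day * exposure_freq * exposure_dur) / \
          (body_weight * avg_time_days)      # average daily dose, mg/kg-day
    return add / rfd_mg_per_kg_day

# e.g. cadmium-contaminated soil at 15 mg/kg against an oral RfD of 1e-3 mg/kg-day
print(f"HQ = {hazard_quotient(15.0, 1e-3):.3f}  (HQ > 1 would suggest potential concern)")
```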
Wood and Wood-Based Materials as Sensors—A Review of the Piezoelectric Effect in Wood
Robert J. Ross; Jiangming Kan; Xiping Wang; Julie Blankenburg; Janet I. Stockhausen; Roy F. Pellerin
2012-01-01
A variety of techniques have been investigated for use in assessing the physical and mechanical properties of wood products and structures. Ultrasound, transverse vibration, and stress-wave based methods are all techniques that have shown promise for many nondestructive evaluation applications. These techniques and others rely on the use of measurement systems to...
In Vivo Methods for the Assessment of Topical Drug Bioavailability
Herkenne, Christophe; Alberti, Ingo; Naik, Aarti; Kalia, Yogeshvar N.; Mathy, François-Xavier; Préat, Véronique
2007-01-01
This paper reviews some current methods for the in vivo assessment of local cutaneous bioavailability in humans after topical drug application. After an introduction discussing the importance of local drug bioavailability assessment and the limitations of model-based predictions, the focus turns to the relevance of experimental studies. The available techniques are then reviewed in detail, with particular emphasis on the tape stripping and microdialysis methodologies. Other less developed techniques, including the skin biopsy, suction blister, follicle removal and confocal Raman spectroscopy techniques are also described. PMID:17985216
Olyaeemanesh, Alireza; Bavandpour, Elahe; Mobinizadeh, Mohammadreza; Ashrafinia, Mansoor; Bavandpour, Maryam; Nouhi, Mojtaba
2017-01-01
Background: Caesarean section (C-section) is the most common surgery among women worldwide, and the global rate of this surgical procedure has been continuously rising. Hence, it is crucial to develop and apply highly effective and safe caesarean section techniques. In this review study, we aimed at assessing the safety and effectiveness of the Joel-Cohen-based technique and comparing the results with the transverse Pfannenstiel incision for C-section. Methods: In this study, various reliable databases such as PubMed Central, COCHRANE, DARE, and Ovid MEDLINE were targeted. Reviews, systematic reviews, and randomized clinical trial studies comparing the Joel-Cohen-based technique and the transverse Pfannenstiel incision were selected based on the inclusion criteria. Selected studies were checked by 2 independent reviewers based on the inclusion criteria, and the quality of these studies was assessed. Then, their data were extracted and analyzed. Results: Five randomized clinical trial studies met the inclusion criteria. According to the existing evidence, statistical results of the Joel-Cohen-based technique showed that this technique is more effective compared to the transverse Pfannenstiel incision. Meta-analysis results of the 3 outcomes were as follows: operation time (5 trials, 764 women; WMD -9.78 minutes; 95% CI: -14.49 to -5.07 minutes, p<0.001), blood loss (3 trials, 309 women; WMD -53.23 ml; 95% CI: -90.20 to -16.26 ml, p=0.004), and post-operative hospital stay (3 trials, 453 women; WMD -0.69 day; 95% CI: -1.4 to -0.03 day, p<0.001). Statistical results revealed a significant difference between the 2 techniques. Conclusion: According to the literature, despite having a number of side effects, the Joel-Cohen-based technique is generally more effective than the Pfannenstiel incision technique. In addition, it was recommended that the Joel-Cohen-based technique be used as a replacement for the Pfannenstiel incision technique according to the surgeons' preferences and the patients' conditions.
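A sketch of the fixed-effect inverse-variance pooling that produces a weighted mean difference of the sort reported above, with standard errors back-calculated from each trial's confidence interval; the trial numbers are placeholders, not the review's data.

```python
# Sketch of fixed-effect inverse-variance pooling of mean differences across
# trials, with standard errors back-calculated from 95% CIs. The numbers are
# placeholders for illustration, not the trial data from this review.
import math

trials = [  # (mean difference in operation time, 95% CI low, 95% CI high)
    (-8.0, -13.0, -3.0),
    (-11.0, -17.0, -5.0),
    (-9.5, -15.5, -3.5),
]
weights, weighted = [], []
for md, lo, hi in trials:
    se = (hi - lo) / (2 * 1.96)          # SE from the CI half-width
    w = 1.0 / se**2                      # inverse-variance weight
    weights.append(w)
    weighted.append(w * md)

pooled = sum(weighted) / sum(weights)
pooled_se = math.sqrt(1.0 / sum(weights))
print(f"pooled WMD = {pooled:.2f} (95% CI {pooled - 1.96*pooled_se:.2f} "
      f"to {pooled + 1.96*pooled_se:.2f})")
```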
Design Considerations of a Compounded Sterile Preparations Course
Petraglia, Christine; Mattison, Melissa J.
2016-01-01
Objective. To design a comprehensive learning and assessment environment for the practical application of compounded sterile preparations using a constructivist approach. Design. Compounded Sterile Preparations Laboratory is a required 1-credit course that builds upon the themes of training aseptic technique typically used in health system settings and threads application of concepts from other courses in the curriculum. Students used critical-thinking skills to devise appropriate strategies to compound sterile preparations. Assessment. Aseptic technique skills were assessed with objective, structured, checklist-based rubrics. Most students successfully completed practical assessments using appropriate technique (mean assessment grade=83.2%). Almost all students passed the practical media fill (98%) and gloved fingertip sampling (86%) tests on the first attempt; all passed on the second attempt. Conclusion. Employing a constructivist scaffold approach to teaching proper hygiene and aseptic technique prepared students to pass media fill and gloved fingertip tests and to perform well on practical compounding assessments. PMID:26941438
Combining heuristic and statistical techniques in landslide hazard assessments
NASA Astrophysics Data System (ADS)
Cepeda, Jose; Schwendtner, Barbara; Quan, Byron; Nadim, Farrokh; Diaz, Manuel; Molina, Giovanni
2014-05-01
As a contribution to the Global Assessment Report 2013 - GAR2013, coordinated by the United Nations International Strategy for Disaster Reduction - UNISDR, a drill-down exercise for landslide hazard assessment was carried out by entering the results of both heuristic and statistical techniques into a new but simple combination rule. The data available for this evaluation included landslide inventories, both historical and event-based. In addition to the application of a heuristic method used in the previous editions of GAR, the availability of inventories motivated the use of statistical methods. The heuristic technique is largely based on the Mora & Vahrson method, which estimates hazard as the product of susceptibility and triggering factors, where classes are weighted based on expert judgment and experience. Two statistical methods were also applied: the landslide index method, which estimates weights of the classes for the susceptibility and triggering factors based on the evidence provided by the density of landslides in each class of the factors; and the weights of evidence method, which extends the previous technique to include both positive and negative evidence of landslide occurrence in the estimation of weights for the classes. One key aspect during the hazard evaluation was the decision on the methodology to be chosen for the final assessment. Instead of opting for a single methodology, it was decided to combine the results of the three implemented techniques using a combination rule based on a normalization of the results of each method. The hazard evaluation was performed for both earthquake- and rainfall-induced landslides. The country chosen for the drill-down exercise was El Salvador. The results indicate that highest hazard levels are concentrated along the central volcanic chain and at the centre of the northern mountains.
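A minimal sketch of the normalize-and-combine step, rescaling each method's hazard output to [0, 1] and averaging cell by cell; equal weights are assumed here for illustration, and the study's exact combination rule may differ.

```python
# Sketch of combining heuristic and statistical hazard maps by normalizing each
# method's output to [0, 1] and averaging cell by cell. Equal weighting is an
# assumption for illustration; the study's exact combination rule may differ.
import numpy as np

def normalize(grid):
    g = np.asarray(grid, dtype=float)
    return (g - g.min()) / (g.max() - g.min())

mora_vahrson = np.array([[2.0, 5.0], [8.0, 3.0]])        # heuristic scores
landslide_index = np.array([[0.1, 0.6], [0.9, 0.2]])     # statistical weights
weights_of_evidence = np.array([[-1.2, 0.4], [1.8, -0.5]])

combined = np.mean(
    [normalize(mora_vahrson), normalize(landslide_index), normalize(weights_of_evidence)],
    axis=0,
)
print(np.round(combined, 2))   # combined relative hazard per grid cell
```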
Designing a Successful Bidding Strategy Using Fuzzy Sets and Agent Attitudes
NASA Astrophysics Data System (ADS)
Ma, Jun; Goyal, Madhu Lata
To be successful in a multi-attribute auction, agents must be capable of adapting to continuously changing bidding prices. This chapter presents a novel fuzzy attitude-based bidding strategy (FA-Bid), which employs a dual assessment technique, i.e., assessment of multiple attributes of the goods as well as assessment of the agent's attitude (eagerness) to procure an item in an automated auction. The assessment of attributes adapts fuzzy set techniques to handle the uncertainty of the bidding process, while heuristic rules are used to determine the attitude of bidding agents in simulated auctions to procure goods. The overall assessment is used to determine a price range based on the current bid, from which the best price is finally selected as the new bid.
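A sketch of how an attitude (eagerness) assessment might be folded into a bid update: a triangular membership function converts eagerness into a degree in [0, 1], which is combined with the attribute score to pick a new bid within a range above the current bid. The membership parameters and price-range rule are assumptions, not FA-Bid's exact rules.

```python
# Illustrative sketch of an attitude-based bid update; the membership
# parameters and the price-range rule are assumptions, not FA-Bid's rules.
def triangular(x, a, b, c):
    """Triangular fuzzy membership with support [a, c] and peak at b."""
    if x <= a or x >= c:
        return 0.0
    return (x - a) / (b - a) if x <= b else (c - x) / (c - b)

def next_bid(current_bid, attribute_score, eagerness, max_increment=0.10):
    """attribute_score, eagerness in [0, 1]; returns the new bid price."""
    eager_degree = triangular(eagerness, 0.2, 0.7, 1.0)   # "high eagerness" set
    overall = 0.5 * attribute_score + 0.5 * eager_degree  # overall assessment
    return current_bid * (1.0 + max_increment * overall)

print(round(next_bid(100.0, attribute_score=0.8, eagerness=0.9), 2))
```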
Sun, Tie Gang; Xiao, Rong Bo; Cai, Yun Nan; Wang, Yao Wu; Wu, Chang Guang
2016-08-01
Quantitative assessment of the urban thermal environment has become a focus of urban climate and environmental science since the concept of the urban heat island was proposed. With the continued development of spatial information and computer simulation technology, substantial progress has been made in quantitative assessment techniques and methods for the urban thermal environment. These techniques have evolved from statistical analysis of the urban-scale thermal environment using historical weather station data to dynamic simulation and forecasting of the thermal environment at various scales. This study reviewed the development of ground meteorological observation, thermal infrared remote sensing, and numerical simulation. Moreover, the potential advantages and disadvantages, applicability, and development trends of these techniques were summarized, aiming to add fundamental knowledge for understanding urban thermal environment assessment and optimization.
NASA Astrophysics Data System (ADS)
Lestariani, Ida; Sujadi, Imam; Pramudya, Ikrar
2018-05-01
Portfolio assessment can show the development of learners' abilities over a period through their work, so that the learning progress of each learner can be monitored. The purpose of this research is to describe the implementation of portfolio assessment in the mathematics learning process, with senior high school mathematics teachers of class X as the subjects, given the importance of applying this assessment to monitor learners' progress. This is a descriptive qualitative study. Data were collected through observation, interviews, and documentation, and then validated using triangulation of these three techniques. Data analysis was carried out through data reduction, data presentation, and conclusion drawing. The results showed that the steps taken by teachers in applying portfolio assessment focused on learning outcomes, which included homework and daily tests. Based on the results, it can be concluded that portfolio assessment was implemented only as scored learning results; teachers had not yet implemented other portfolio assessment techniques, such as collections of student work.
Lee, Yi-Hsuan; von Davier, Alina A
2013-07-01
Maintaining a stable score scale over time is critical for all standardized educational assessments. Traditional quality control tools and approaches for assessing scale drift either require special equating designs, or may be too time-consuming to be considered on a regular basis with an operational test that has a short time window between an administration and its score reporting. Thus, the traditional methods are not sufficient to catch unusual testing outcomes in a timely manner. This paper presents a new approach for score monitoring and assessment of scale drift. It involves quality control charts, model-based approaches, and time series techniques to accommodate the following needs of monitoring scale scores: continuous monitoring, adjustment of customary variations, identification of abrupt shifts, and assessment of autocorrelation. Performance of the methodologies is evaluated using manipulated data based on real responses from 71 administrations of a large-scale high-stakes language assessment.
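A simple sketch of the control-chart idea applied to mean scale scores by administration, flagging administrations outside three standard deviations of a baseline period; the baseline window and 3-sigma limits are conventional choices rather than the paper's specific settings.

```python
# Sketch of a simple Shewhart-style check on mean scale scores by administration:
# administrations falling outside the baseline mean +/- 3 SD are flagged for
# review. The data are synthetic and the 3-sigma limits are conventional.
import numpy as np

rng = np.random.default_rng(1)
mean_scores = rng.normal(500, 3, size=71)      # hypothetical mean scaled scores
mean_scores[60] += 12                          # inject an abrupt shift

baseline = mean_scores[:30]                    # early administrations as baseline
center, sd = baseline.mean(), baseline.std(ddof=1)
ucl, lcl = center + 3 * sd, center - 3 * sd

flags = np.where((mean_scores > ucl) | (mean_scores < lcl))[0]
print("flagged administrations:", flags.tolist())
```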
Alternative evaluation of innovations’ effectiveness in mechanical engineering
NASA Astrophysics Data System (ADS)
Puryaev, A. S.
2017-09-01
The aim of the present work is the approbation of a developed technique for assessing the effectiveness of innovations. We demonstrate an alternative assessment of the effectiveness of innovations (innovation projects) in mechanical engineering using an illustrative example. The technique, which is based on the value concept and the cash flow method, is proposed as an alternative to the traditional approach.
NASA Astrophysics Data System (ADS)
Vanvyve, E.; Magontier, P.; Vandenberghe, F. C.; Delle Monache, L.; Dickinson, K.
2012-12-01
Wind energy is amongst the fastest growing sources of renewable energy in the U.S. and could supply up to 20% of the U.S. power production by 2030. An accurate and reliable wind resource assessment for prospective wind farm sites is a challenging task, yet is crucial for evaluating the long-term profitability and feasibility of a potential development. We have developed an accurate and computationally efficient wind resource assessment technique for prospective wind farm sites, which incorporates innovative statistical techniques and the new NASA Earth science dataset MERRA. This technique produces a wind resource estimate that is more accurate than that obtained by the wind energy industry's standard technique, while providing a reliable quantification of its uncertainty. The focus now is on evaluating the socio-economic value of this new technique relative to the industry's standard technique. Would it yield lower financing costs? Could it result in lower electricity prices? Are there further down-the-line positive consequences, e.g. job creation, time saved, reduced greenhouse gas emissions? Ultimately, we expect our results will inform efforts to refine and disseminate the new technique to support the development of the U.S. renewable energy infrastructure. In order to address the above questions, we are carrying out a cost-benefit analysis based on the net present worth of the technique. We will describe this approach, including the cash-flow process of wind farm financing, how the wind resource assessment factors in, and will present current results for various hypothetical candidate wind farm sites.
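A minimal sketch of the net-present-worth comparison underlying such a cost-benefit analysis, discounting annual cash flows implied by two resource estimates for a hypothetical wind farm; all cash-flow numbers are invented.

```python
# Sketch of the net-present-worth idea behind the cost-benefit analysis:
# discount annual cash flows of a hypothetical wind farm and compare the NPV
# implied by two resource estimates. All cash-flow numbers are invented.
def npv(rate, cashflows):
    """cashflows[0] is the year-0 outlay (negative); later entries are revenues."""
    return sum(cf / (1.0 + rate) ** t for t, cf in enumerate(cashflows))

capex = -120.0                                   # $M up-front
standard_estimate = [capex] + [14.0] * 20        # revenue implied by standard method
improved_estimate = [capex] + [15.2] * 20        # revenue implied by new method

for name, flows in [("standard", standard_estimate), ("improved", improved_estimate)]:
    print(f"{name:>8s} technique NPV @ 8%: {npv(0.08, flows):7.1f} $M")
```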
Model averaging techniques for quantifying conceptual model uncertainty.
Singh, Abhishek; Mishra, Srikanta; Ruskauff, Greg
2010-01-01
In recent years a growing understanding has emerged regarding the need to expand the modeling paradigm to include conceptual model uncertainty for groundwater models. Conceptual model uncertainty is typically addressed by formulating alternative model conceptualizations and assessing their relative likelihoods using statistical model averaging approaches. Several model averaging techniques and likelihood measures have been proposed in the recent literature for this purpose with two broad categories--Monte Carlo-based techniques such as Generalized Likelihood Uncertainty Estimation or GLUE (Beven and Binley 1992) and criterion-based techniques that use metrics such as the Bayesian and Kashyap Information Criteria (e.g., the Maximum Likelihood Bayesian Model Averaging or MLBMA approach proposed by Neuman 2003) and Akaike Information Criterion-based model averaging (AICMA) (Poeter and Anderson 2005). These different techniques can often lead to significantly different relative model weights and ranks because of differences in the underlying statistical assumptions about the nature of model uncertainty. This paper provides a comparative assessment of the four model averaging techniques (GLUE, MLBMA with KIC, MLBMA with BIC, and AIC-based model averaging) mentioned above for the purpose of quantifying the impacts of model uncertainty on groundwater model predictions. Pros and cons of each model averaging technique are examined from a practitioner's perspective using two groundwater modeling case studies. Recommendations are provided regarding the use of these techniques in groundwater modeling practice.
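A short sketch of AIC-based model averaging, where each conceptual model's weight is proportional to exp(-0.5 * delta AIC); the AIC values are hypothetical.

```python
# Sketch of AIC-based model-averaging weights: each conceptual model's weight is
# proportional to exp(-0.5 * delta_AIC). The AIC values below are hypothetical.
import numpy as np

aic = {"model_A": 212.4, "model_B": 215.1, "model_C": 210.9, "model_D": 220.3}
values = np.array(list(aic.values()))
delta = values - values.min()
w = np.exp(-0.5 * delta)
w /= w.sum()

for name, weight in zip(aic, w):
    print(f"{name}: weight = {weight:.3f}")
# A model-averaged prediction would then be sum_i w_i * prediction_i.
```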
NASA Astrophysics Data System (ADS)
Straub, Jeremy
2016-05-01
Quality control is critical to manufacturing. Frequently, techniques are used to define object conformity bounds, based on historical quality data. This paper considers techniques for bespoke and small batch jobs that are not statistical model based. These techniques also serve jobs where 100% validation is needed due to the mission or safety critical nature of particular parts. One issue with this type of system is alignment discrepancies between the generated model and the physical part. This paper discusses and evaluates techniques for characterizing and correcting alignment issues between the projected and perceived data sets to prevent errors attributable to misalignment.
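One common way to characterize and correct rigid misalignment between generated-model points and perceived (scanned) points is a least-squares Kabsch fit; the paper does not specify its alignment method, so the following is a substitute sketch for illustration.

```python
# Least-squares (Kabsch) rigid alignment of scanned part points to model points;
# a generic substitute for illustration, not the paper's specific method.
import numpy as np

def rigid_align(model_pts, scan_pts):
    """Return rotation R and translation t mapping model_pts onto scan_pts."""
    mc, sc = model_pts.mean(axis=0), scan_pts.mean(axis=0)
    H = (model_pts - mc).T @ (scan_pts - sc)
    U, _, Vt = np.linalg.svd(H)
    d = np.sign(np.linalg.det(Vt.T @ U.T))          # avoid reflections
    R = Vt.T @ np.diag([1.0, 1.0, d]) @ U.T
    t = sc - R @ mc
    return R, t

model = np.random.default_rng(0).normal(size=(50, 3))
scan = model @ np.array([[0, -1, 0], [1, 0, 0], [0, 0, 1]]).T + np.array([5.0, 0.0, 2.0])
R, t = rigid_align(model, scan)
residual = np.linalg.norm((model @ R.T + t) - scan, axis=1).max()
print(f"max residual after alignment: {residual:.2e}")
```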
A probability-based approach for assessment of roadway safety hardware.
DOT National Transportation Integrated Search
2017-03-14
This report presents a general probability-based approach for assessment of roadway safety hardware (RSH). It was achieved using a reliability : analysis method and computational techniques. With the development of high-fidelity finite element (FE) m...
Personalized Multi-Student Improvement Based on Bayesian Cybernetics
ERIC Educational Resources Information Center
Kaburlasos, Vassilis G.; Marinagi, Catherine C.; Tsoukalas, Vassilis Th.
2008-01-01
This work presents innovative cybernetics (feedback) techniques based on Bayesian statistics for drawing questions from an Item Bank towards personalized multi-student improvement. A novel software tool, namely "Module for Adaptive Assessment of Students" (or, "MAAS" for short), implements the proposed (feedback) techniques. In conclusion, a pilot…
NASA Astrophysics Data System (ADS)
Desa, M. S. M.; Ibrahim, M. H. W.; Shahidan, S.; Ghadzali, N. S.; Misri, Z.
2018-04-01
The acoustic emission (AE) technique is a non-destructive testing (NDT) method that can be used to characterize damage in concrete structures, such as cracking and corrosion, to assess stability and sensitivity, to serve as a structural monitoring tool, and to capture the energy released as cracks open and grow in the concrete. This article gives a comprehensive review of AE testing and its application to concrete structures for structural health monitoring (SHM). Assessments of the AE technique used for structures such as dams, bridges, and buildings are reviewed, based on previous research on AE applications, to convey its role in structural engineering. The assessment of the AE technique focuses on the fundamentals of parametric and signal waveform analysis during processing and on its capability for structural monitoring. Moreover, the assessment and applications of AE are summarized and highlighted for future reference.
Yang, Yong; Christakos, George; Huang, Wei; Lin, Chengda; Fu, Peihong; Mei, Yang
2016-04-12
Because of the rapid economic growth in China, many regions are subjected to severe particulate matter pollution. Thus, improving the methods of determining the spatiotemporal distribution and uncertainty of air pollution can provide considerable benefits when developing risk assessments and environmental policies. The uncertainty assessment methods currently in use include the sequential indicator simulation (SIS) and indicator kriging techniques. However, these methods cannot be employed to assess multi-temporal data. In this work, a spatiotemporal sequential indicator simulation (STSIS) based on a non-separable spatiotemporal semivariogram model was used to assimilate multi-temporal data in the mapping and uncertainty assessment of PM2.5 distributions in a contaminated atmosphere. PM2.5 concentrations recorded throughout 2014 in Shandong Province, China were used as the experimental dataset. Based on the number of STSIS procedures, we assessed various types of mapping uncertainties, including single-location uncertainties over one day and multiple days and multi-location uncertainties over one day and multiple days. A comparison of the STSIS technique with the SIS technique indicates that a better performance was obtained with the STSIS method.
Palm, Peter; Josephson, Malin; Mathiassen, Svend Erik; Kjellberg, Katarina
2016-06-01
We evaluated the intra- and inter-observer reliability and criterion validity of an observation protocol, developed in an iterative process involving practicing ergonomists, for assessment of working technique during cash register work for the purpose of preventing upper extremity symptoms. Two ergonomists independently assessed 17 15-min videos of cash register work on two occasions each, as a basis for examining reliability. Criterion validity was assessed by comparing these assessments with meticulous video-based analyses by researchers. Intra-observer reliability was acceptable (i.e. proportional agreement >0.7 and kappa >0.4) for 10/10 questions. Inter-observer reliability was acceptable for only 3/10 questions. An acceptable inter-observer reliability combined with an acceptable criterion validity was obtained only for one working technique aspect, 'Quality of movements'. Thus, major elements of the cashiers' working technique could not be assessed with an acceptable accuracy from short periods of observations by one observer, such as often desired by practitioners. Practitioner Summary: We examined an observation protocol for assessing working technique in cash register work. It was feasible in use, but inter-observer reliability and criterion validity were generally not acceptable when working technique aspects were assessed from short periods of work. We recommend the protocol to be used for educational purposes only.
Flynn, Priscilla; Acharya, Amit; Schwei, Kelsey; VanWormer, Jeffrey; Skrzypcak, Kaitlyn
2016-06-01
The primary aim of this study was to assess communication techniques used with low oral health literacy patients by dental hygienists in rural Wisconsin dental clinics. A secondary aim was to determine the utility of the survey instrument used in this study. A mixed methods study consisting of a cross-sectional survey, immediately followed by focus groups, was conducted among dental hygienists in the Marshfield Clinic (Wisconsin) service area. The survey quantified the routine use of 18 communication techniques previously shown to be effective with low oral health literacy patients. Linear regression was used to analyze the association between routine use of each communication technique and several indicator variables, including geographic practice region, oral health literacy familiarity, communication skills training and demographic indicators. Qualitative analyses included code mapping to the 18 communication techniques identified in the survey, and generating new codes based on discussion content. On average, the 38 study participants routinely used 6.3 communication techniques. Dental hygienists who used an oral health literacy assessment tool reported using significantly more communication techniques compared to those who did not use an oral health literacy assessment tool. Focus group results differed from survey responses as few dental hygienists stated familiarity with the term "oral health literacy." Motivational interviewing techniques and using an integrated electronic medical-dental record were additional communication techniques identified as useful with low oral health literacy patients. Dental hygienists in this study routinely used approximately one-third of the communication techniques recommended for low oral health literacy patients, supporting the need for training on this topic. Based on focus group results, the survey used in this study warrants modification and psychometric testing prior to further use. Copyright © 2016 The American Dental Hygienists' Association.
Zeiler, Frederick A; Donnelly, Joseph; Calviello, Leanne; Menon, David K; Smielewski, Peter; Czosnyka, Marek
2017-12-01
The purpose of this study was to perform a systematic, scoping review of commonly described intermittent/semi-intermittent autoregulation measurement techniques in adult traumatic brain injury (TBI). Nine separate systematic reviews were conducted for each intermittent technique: computed tomographic perfusion (CTP)/Xenon-CT (Xe-CT), positron emission tomography (PET), magnetic resonance imaging (MRI), arteriovenous difference in oxygen (AVDO2) technique, thigh cuff deflation technique (TCDT), transient hyperemic response test (THRT), orthostatic hypotension test (OHT), mean flow index (Mx), and transfer function autoregulation index (TF-ARI). MEDLINE®, BIOSIS, EMBASE, Global Health, Scopus, Cochrane Library (inception to December 2016), and reference lists of relevant articles were searched. A two-tier filter of references was conducted. The total numbers of articles utilizing each of the nine searched techniques for intermittent/semi-intermittent autoregulation assessment in adult TBI were: CTP/Xe-CT (10), PET (6), MRI (0), AVDO2 (10), ARI-based TCDT (9), THRT (6), OHT (3), Mx (17), and TF-ARI (6). The premise behind all of the intermittent techniques is manipulation of systemic blood pressure/blood volume via either chemical (such as vasopressors) or mechanical (such as thigh cuffs or carotid compression) means. Exceptionally, Mx and TF-ARI are based on spontaneous fluctuations of cerebral perfusion pressure (CPP) or mean arterial pressure (MAP). The method for assessing the cerebral circulation during these manipulations varies, with both imaging-based techniques and TCD utilized. Despite the limited literature for intermittent/semi-intermittent techniques in adult TBI (with the exception of Mx), it is important to acknowledge the availability of such tests. They have provided fundamental insight into human autoregulatory capacity, leading to the development of continuous and more commonly applied techniques in the intensive care unit (ICU). Numerous methods of intermittent/semi-intermittent pressure autoregulation assessment in adult TBI exist, including: CTP/Xe-CT, PET, AVDO2 technique, TCDT-based ARI, THRT, OHT, Mx, and TF-ARI. MRI-based techniques in adult TBI are yet to be described, with the main focus of MRI techniques on metabolic-based cerebrovascular reactivity (CVR) and not pressure-based autoregulation.
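A sketch of the mean flow index (Mx) computation, the moving Pearson correlation between slow fluctuations of CPP and transcranial Doppler mean flow velocity; the ten-second averaging and roughly five-minute windows follow common practice, and the signals below are synthetic.

```python
# Sketch of the mean flow index (Mx): the moving Pearson correlation between
# slow fluctuations of cerebral perfusion pressure (CPP) and TCD mean flow
# velocity, computed over 30-sample (~5 min) windows of 10-s averages.
import numpy as np

rng = np.random.default_rng(3)
n = 360                                            # 360 ten-second averages (1 hour)
cpp = 70 + np.cumsum(rng.normal(0, 0.3, n))        # slow CPP waves (mmHg)
fv = 55 + 0.4 * (cpp - 70) + rng.normal(0, 1.0, n) # mean flow velocity (cm/s)

window = 30
mx = np.array([np.corrcoef(cpp[i:i + window], fv[i:i + window])[0, 1]
               for i in range(n - window)])
print(f"mean Mx = {mx.mean():.2f}  (values near 0 suggest intact autoregulation,"
      " values approaching 1 suggest impairment)")
```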
EFL Teachers' Formal Assessment Practices Based on Exam Papers
ERIC Educational Resources Information Center
Kiliçkaya, Ferit
2016-01-01
This study reports initial findings from a small-scale qualitative study aimed at gaining insights into English language teachers' assessment practices in Turkey by examining the formal exam papers. Based on the technique of content analysis, formal exam papers were analyzed in terms of assessment items, language skills tested as well as the…
Mobile Formative Assessment Tool Based on Data Mining Techniques for Supporting Web-Based Learning
ERIC Educational Resources Information Center
Chen, Chih-Ming; Chen, Ming-Chuan
2009-01-01
Current trends clearly indicate that online learning has become an important learning mode. However, no effective assessment mechanism for learning performance yet exists for e-learning systems. Learning performance assessment aims to evaluate what learners learned during the learning process. Traditional summative evaluation only considers final…
DOT National Transportation Integrated Search
1974-02-17
A number of satellite system techniques have been suggested as candidates to provide ATC surveillance, communication, and/or navigation service over CONUS. All techniques determine the aircraft positions by multilateration based on the arrival times ...
Assessment and Classification of Attention Deficit Hyperactive Disorders.
ERIC Educational Resources Information Center
Schaughency, Elizabeth A.; Rothlind, Johannes
1991-01-01
Issues concerning evaluation, assessment, and classification of Attention-Deficit Hyperactive Disorders (ADHD) are discussed. The diagnosis of ADHD should be a best-estimate diagnosis, based on a behavioral assessment strategy with multimethod assessment. The selection and use of assessment techniques are discussed. (SLD)
Idea Sharing: Using Peer Assessment to Teach How to Make Oral Summaries in English Language Classes
ERIC Educational Resources Information Center
Ivanova, Olimpiada F.
2014-01-01
In this "Idea Sharing" article, the author describes the techniques used when teaching oral summary making to second-year students studying Business English at the Faculty of World Economy and International Affairs of the National Research University Higher School of Economics, Moscow. The techniques are based on peer assessment, which…
Formative Assessment and the Intuitive Incorporation of Research-Based Instruction Techniques
ERIC Educational Resources Information Center
Kuiper, Paula; VanOeffelen, Rachel; Veldkamp, Simon; Bokma, Isaac; Breems, Luke; Fynewever, Herb
2015-01-01
Using Max Weber's theory of ideal types, the authors classify the formative assessment techniques used by 12 college instructors. Their data reveal two pairs of opposing preferences: (1) highly preplanned vs. highly emergent and (2) focused on individual students vs. focused on the class as a whole. Using interview data, they illustrate how each…
NASA Astrophysics Data System (ADS)
Kumar, Shashi; Khati, Unmesh G.; Chandola, Shreya; Agrawal, Shefali; Kushwaha, Satya P. S.
2017-08-01
The regulation of the carbon cycle is a critical ecosystem service provided by forests globally. It is, therefore, necessary to have robust techniques for speedy assessment of forest biophysical parameters at the landscape level. It is arduous and time-consuming to monitor the status of vast forest landscapes using traditional field methods. Remote sensing and GIS techniques are efficient tools that can monitor the health of forests regularly. Biomass estimation is a key parameter in the assessment of forest health. Polarimetric SAR (PolSAR) remote sensing has already shown its potential for forest biophysical parameter retrieval. The current research work focuses on the retrieval of forest biophysical parameters of tropical deciduous forest, using fully polarimetric spaceborne C-band data with Polarimetric SAR Interferometry (PolInSAR) techniques. A PolSAR-based Interferometric Water Cloud Model (IWCM) has been used to estimate aboveground biomass (AGB). Input parameters to the IWCM have been extracted from the decomposition modeling of SAR data as well as PolInSAR coherence estimation. The forest tree height retrieval utilized a PolInSAR coherence-based modeling approach. Two techniques - Coherence Amplitude Inversion (CAI) and Three Stage Inversion (TSI) - for forest height estimation are discussed, compared and validated. These techniques allow estimation of forest stand height and true ground topography. The accuracy of the estimated forest height is assessed using ground-based measurements. PolInSAR-based forest height models showed limitations in identifying forest vegetation, and as a result height values were obtained in river channels and plain areas. Overestimation of forest height was also noticed at several patches of the forest. To overcome this problem, a coherence- and backscatter-based threshold technique is introduced for forest area identification and accurate height estimation in non-forested regions. IWCM-based modeling for forest AGB retrieval showed an R2 value of 0.5, an RMSE of 62.73 t ha-1 and a percent accuracy of 51%. TSI-based PolInSAR inversion modeling showed the most accurate result for forest height estimation. The correlation between the field-measured forest height and the estimated tree height using the TSI technique is 62%, with an average accuracy of 91.56% and an RMSE of 2.28 m. The study suggested that the PolInSAR coherence-based modeling approach has significant potential for retrieval of forest biophysical parameters.
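A minimal sketch of the coherence- and backscatter-threshold step used to restrict height inversion to forested pixels; the threshold values and arrays are purely illustrative, not those derived in the study.

```python
# Sketch of the coherence- and backscatter-threshold step used to restrict the
# height inversion to forested pixels; thresholds and arrays are illustrative,
# not values derived in the study.
import numpy as np

rng = np.random.default_rng(7)
coherence = rng.uniform(0.2, 0.95, size=(4, 4))     # PolInSAR coherence magnitude
sigma0_db = rng.uniform(-18.0, -5.0, size=(4, 4))   # C-band backscatter (dB)
height_m = rng.uniform(0.0, 30.0, size=(4, 4))      # inverted heights (m)

forest_mask = (coherence < 0.85) & (sigma0_db > -12.0)   # hypothetical thresholds
masked_height = np.where(forest_mask, height_m, 0.0)     # zero out non-forest pixels
print(np.round(masked_height, 1))
```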
NASA Astrophysics Data System (ADS)
Lin, Yuan; Choudhury, Kingshuk R.; McAdams, H. Page; Foos, David H.; Samei, Ehsan
2014-03-01
We previously proposed a novel image-based quality assessment technique [1] to assess the perceptual quality of clinical chest radiographs. In this paper, an observer study was designed and conducted to systematically validate this technique. Ten metrics were involved in the observer study, i.e., lung grey level, lung detail, lung noise, rib-lung contrast, rib sharpness, mediastinum detail, mediastinum noise, mediastinum alignment, subdiaphragm-lung contrast, and subdiaphragm area. For each metric, three tasks were successively presented to the observers. In each task, six ROI images were randomly presented in a row and observers were asked to rank the images only based on a designated quality and disregard the other qualities. A range slider on the top of the images was used for observers to indicate the acceptable range based on the corresponding perceptual attribute. Five board-certified radiologists from Duke participated in this observer study on a DICOM-calibrated diagnostic display workstation and under low ambient lighting conditions. The observer data were analyzed in terms of the correlations between the observer ranking orders and the algorithmic ranking orders. Based on the collected acceptable ranges, quality consistency ranges were statistically derived. The observer study showed that, for each metric, the averaged ranking orders of the participating observers were strongly correlated with the algorithmic orders. For the lung grey level, the observer ranking orders completely accorded with the algorithmic ranking orders. The quality consistency ranges derived from this observer study were close to those derived from our previous study. The observer study indicates that the proposed image-based quality assessment technique provides a robust reflection of the perceptual image quality of the clinical chest radiographs. The derived quality consistency ranges can be used to automatically predict the acceptability of a clinical chest radiograph.
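Agreement between observer and algorithmic ranking orders of the six ROI images can be quantified with a rank correlation; the sketch below uses Spearman's rho on made-up ranks as an illustration of that comparison.

```python
# Sketch of quantifying agreement between an observer's ranking of the six ROI
# images and the algorithmic ranking using Spearman's rank correlation.
# The example ranks are made up for illustration.
from scipy.stats import spearmanr

algorithmic_order = [1, 2, 3, 4, 5, 6]     # ranks assigned by the metric
observer_order    = [1, 3, 2, 4, 5, 6]     # ranks assigned by one radiologist

rho, p_value = spearmanr(algorithmic_order, observer_order)
print(f"Spearman rho = {rho:.2f} (p = {p_value:.3f})")
```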
ERIC Educational Resources Information Center
Brown, Nicola
2017-01-01
While teaching methods tend to be updated frequently, the implementation of new innovative assessment tools is much slower. For example project based learning has become popular as a teaching technique, however, the assessment tends to be via traditional reports. This paper reports on the implementation and evaluation of using website development…
Mulhall, Aaron M; Zafar, Muhammad A; Record, Samantha; Channell, Herman; Panos, Ralph J
2017-02-01
Although inhaled medications are effective therapies for COPD, many patients and providers use them incorrectly. We recruited providers who prescribe inhalers or teach inhaler technique and assessed their use of metered-dose inhalers (MDIs), various dry powder inhalers (DPIs), and Respimat using predefined checklists. Then they watched tablet-based multimedia educational videos that demonstrated correct inhaler technique by a clinical pharmacist with teach-back from a patient and were re-evaluated. We also recruited patients with COPD and assessed their use of their prescribed inhalers and then retested them after 3-6 months. Baseline and follow-up respiratory symptoms were measured by the COPD Assessment Test. Fifty-eight providers and 50 subjects participated. For all providers, correct inhaler technique (reported as percentage correct steps) increased after the videos: MDI without a spacer (72% vs 97%) MDI with a spacer (72% vs 96%), formoterol DPI (50% vs 94%), mometasone DPI (43% vs 95%), tiotropium DPI (73% vs 99%), and Respimat (32% vs 93%) (before vs after, P < .001 for all comparisons). Subjects also improved their inhaler use technique after viewing the educational videos: MDI without a spacer (69% vs 92%), MDI with a spacer (73% vs 95%), and tiotropium DPI (83% vs 96%) (before vs after, P < .001 for all comparisons). The beneficial effect of this educational intervention declined slightly for subjects but was durably improved after several months. COPD Assessment Test scores did not demonstrate any change in respiratory symptoms. A tablet-based inhaler education tool improved inhaler technique for both providers and subjects. Although this intervention did show durable efficacy for improving inhaler use by patients, it did not reduce their respiratory symptoms. Copyright © 2017 by Daedalus Enterprises.
Olyaeemanesh, Alireza; Bavandpour, Elahe; Mobinizadeh, Mohammadreza; Ashrafinia, Mansoor; Bavandpour, Maryam; Nouhi, Mojtaba
2017-01-01
Background: Caesarean section (C-section) is the most common surgery among women worldwide, and the global rate of this surgical procedure has been continuously rising. Hence, it is crucial to develop and apply highly effective and safe caesarean section techniques. In this review study, we aimed at assessing the safety and effectiveness of the Joel-Cohen-based technique and comparing the results with the transverse Pfannenstiel incision for C-section. Methods: In this study, reliable databases such as PubMed Central, COCHRANE, DARE, and Ovid MEDLINE were targeted. Reviews, systematic reviews, and randomized clinical trial studies comparing the Joel-Cohen-based technique and the transverse Pfannenstiel incision were selected based on the inclusion criteria. Selected studies were checked by 2 independent reviewers based on the inclusion criteria, and the quality of these studies was assessed. Then, their data were extracted and analyzed. Results: Five randomized clinical trial studies met the inclusion criteria. According to the existing evidence, statistical results of the Joel-Cohen-based technique showed that this technique is more effective compared to the transverse Pfannenstiel incision. Meta-analysis results of the 3 outcomes were as follows: operation time (5 trials, 764 women; WMD -9.78 minutes; 95% CI: -14.49 to -5.07 minutes, p<0.001), blood loss (3 trials, 309 women; WMD -53.23 ml; 95% CI: -90.20 to -16.26 ml, p=0.004), and post-operative hospital stay (3 trials, 453 women; WMD -0.69 day; 95% CI: -1.4 to -0.03 day, p<0.001). Statistical results revealed a significant difference between the 2 techniques. Conclusion: According to the literature, despite having a number of side effects, the Joel-Cohen-based technique is generally more effective than the Pfannenstiel incision technique. In addition, it was recommended that the Joel-Cohen-based technique be used as a replacement for the Pfannenstiel incision technique according to the surgeons' preferences and the patients' conditions. PMID:29445683
Concrete Condition Assessment Using Impact-Echo Method and Extreme Learning Machines
Zhang, Jing-Kui; Yan, Weizhong; Cui, De-Mi
2016-01-01
The impact-echo (IE) method is a popular non-destructive testing (NDT) technique widely used for measuring the thickness of plate-like structures and for detecting certain defects inside concrete elements or structures. However, the IE method is not effective for full condition assessment (i.e., defect detection, defect diagnosis, defect sizing and location), because the simple frequency spectrum analysis involved in the existing IE method is not sufficient to capture the IE signal patterns associated with different conditions. In this paper, we attempt to enhance the IE technique and enable it for full condition assessment of concrete elements by introducing advanced machine learning techniques for performing comprehensive analysis and pattern recognition of IE signals. Specifically, we use wavelet decomposition for extracting signatures or features out of the raw IE signals and apply extreme learning machines, a recently developed machine learning technique, as classification models for full condition assessment. To validate the capabilities of the proposed method, we build a number of specimens with various types, sizes, and locations of defects and perform IE testing on these specimens in a lab environment. Based on analysis of the collected IE signals using the proposed machine learning based IE method, we demonstrate that the proposed method is effective in performing full condition assessment of concrete elements or structures. PMID:27023563
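The two-stage pipeline described above, wavelet-based feature extraction followed by an extreme learning machine (ELM) classifier, can be sketched roughly as follows. The library choices (pywt, NumPy), wavelet family, feature definition, and network size are illustrative assumptions rather than the authors' implementation.

```python
# Sketch: wavelet-band energies as features, then a minimal ELM classifier
# (random hidden layer, closed-form output weights via pseudoinverse).
import numpy as np
import pywt

def wavelet_features(signal, wavelet="db4", level=4):
    """Energy of each wavelet decomposition band as a feature vector."""
    coeffs = pywt.wavedec(signal, wavelet, level=level)
    return np.array([np.sum(c ** 2) for c in coeffs])

class ELMClassifier:
    def __init__(self, n_hidden=100, rng=None):
        self.n_hidden = n_hidden
        self.rng = rng or np.random.default_rng(0)

    def fit(self, X, y):
        n_classes = int(y.max()) + 1
        T = np.eye(n_classes)[y]                  # one-hot targets
        self.W = self.rng.normal(size=(X.shape[1], self.n_hidden))
        self.b = self.rng.normal(size=self.n_hidden)
        H = np.tanh(X @ self.W + self.b)          # random hidden layer
        self.beta = np.linalg.pinv(H) @ T         # closed-form output weights
        return self

    def predict(self, X):
        H = np.tanh(X @ self.W + self.b)
        return np.argmax(H @ self.beta, axis=1)

# usage with made-up signals and two condition classes
X = np.vstack([wavelet_features(np.random.randn(1024)) for _ in range(40)])
y = np.random.randint(0, 2, size=40)
print("training accuracy:", (ELMClassifier(50).fit(X, y).predict(X) == y).mean())
```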
Objective. Epidemiologic and community health studies of traffic-related air pollution and childhood asthma have been limited by resource intensive exposure assessment techniques. The current study utilized a novel participant-based approach to collect air monitoring data f...
Techniques for Liquid Rocket Combustion Spontaneous Stability and Rough Combustion Assessments
NASA Technical Reports Server (NTRS)
Kenny, R. J.; Giacomoni, C.; Casiano, M. J.; Fischbach, S. R.
2016-01-01
This work presents techniques for liquid rocket engine combustion stability assessments with respect to spontaneous stability and rough combustion. Techniques covering empirical parameter extraction, which were established in prior works, are applied for three additional programs: the F-1 Gas Generator (F1GG) component test program, the RS-84 preburner component test program, and the Marshall Integrated Test Rig (MITR) program. Stability assessment parameters from these programs are compared against prior established spontaneous stability metrics and updates are identified. Also, a procedure for comparing measured with predicted mode shapes is presented, based on an extension of the Modal Assurance Criterion (MAC).
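The Modal Assurance Criterion mentioned above compares a measured mode shape against a predicted one; a minimal sketch of the standard MAC formula follows, using made-up mode-shape vectors.

```python
# Sketch of the Modal Assurance Criterion (MAC) between two mode shapes.
import numpy as np

def mac(phi_measured, phi_predicted):
    """MAC between two mode-shape vectors (1.0 = identical shape)."""
    num = np.abs(np.vdot(phi_measured, phi_predicted)) ** 2
    den = (np.vdot(phi_measured, phi_measured).real
           * np.vdot(phi_predicted, phi_predicted).real)
    return num / den

phi_meas = np.array([0.0, 0.59, 0.95, 1.0, 0.71])   # illustrative values
phi_pred = np.array([0.0, 0.62, 0.93, 1.0, 0.68])
print(f"MAC = {mac(phi_meas, phi_pred):.3f}")
```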
High-speed engine/component performance assessment using exergy and thrust-based methods
NASA Technical Reports Server (NTRS)
Riggins, D. W.
1996-01-01
This investigation summarizes a comparative study of two high-speed engine performance assessment techniques based on exergy (available work) and thrust-potential (thrust availability). Simple flow-fields utilizing Rayleigh heat addition and one-dimensional flow with friction are used to demonstrate the fundamental inability of conventional exergy techniques to predict engine component performance, aid in component design, or accurately assess flow losses. The use of the thrust-based method on these same examples demonstrates its ability to yield useful information in all these categories. Exergy and thrust are related and discussed from the standpoint of their fundamental thermodynamic and fluid dynamic definitions in order to explain the differences in information obtained using the two methods. The conventional definition of exergy is shown to include work which is inherently unavailable to an aerospace Brayton engine. An engine-based exergy is then developed which accurately accounts for this inherently unavailable work; performance parameters based on this quantity are then shown to yield design and loss information equivalent to the thrust-based method.
Proton magnetic resonance spectroscopy for assessment of human body composition.
Kamba, M; Kimura, K; Koda, M; Ogawa, T
2001-02-01
The usefulness of magnetic resonance spectroscopy (MRS)-based techniques for assessment of human body composition has not been established. We compared a proton MRS-based technique with the total body water (TBW) method to determine the usefulness of the former technique for assessment of human body composition. Proton magnetic resonance spectra of the chest to abdomen, abdomen to pelvis, and pelvis to thigh regions were obtained from 16 volunteers by using a single free induction decay measurement with a clinical magnetic resonance system operating at 1.5 T. The MRS-derived metabolite ratio was determined as the ratio of fat methyl and methylene proton resonance to water proton resonance. The peak areas for the chest to abdomen and the pelvis to thigh regions were normalized to an external reference (approximately 2200 g benzene) and a weighted average of the MRS-derived metabolite ratios for the 2 positions was calculated. TBW for each subject was determined by the deuterium oxide dilution technique. The MRS-derived metabolite ratios were significantly correlated with the ratio of body fat to lean body mass estimated by TBW. The MRS-derived metabolite ratio for the abdomen to pelvis region correlated best with the ratio of body fat to lean body mass on simple regression analyses (r = 0.918). The MRS-derived metabolite ratio for the abdomen to pelvis region and that for the pelvis to thigh region were selected for a multivariate regression model (R = 0.947, adjusted R² = 0.881). This MRS-based technique is sufficiently accurate for assessment of human body composition.
Lü, Fan; Shao, Li-Ming; Zhang, Hua; Fu, Wen-Ding; Feng, Shi-Jin; Zhan, Liang-Tong; Chen, Yun-Min; He, Pin-Jing
2018-01-01
Bio-stability is a key feature for the utilization and final disposal of biowaste-derived residues, such as aerobic compost or vermicompost of food waste, bio-dried waste, anaerobic digestate or landfilled waste. The present paper reviews conventional methods and advanced techniques used for the assessment of bio-stability. The conventional methods are reclassified into two categories. Advanced techniques, including spectroscopic (fluorescent, ultraviolet-visible, infrared, Raman, nuclear magnetic resonance), thermogravimetric and thermochemolysis analysis, are emphasized for their application in bio-stability assessment in recent years. Their principles, pros and cons are critically discussed. These advanced techniques are found to be convenient in sample preparation and to supply diversified information. However, the viability of these techniques as potential indicators for bio-stability assessment ultimately lies in establishing their relationship with the conventional methods, especially with the methods based on biotic response. Furthermore, some misuses in data explanation should be noted. Copyright © 2017 Elsevier Ltd. All rights reserved.
GIS based procedure of cumulative environmental impact assessment.
Balakrishna Reddy, M; Blah, Baiantimon
2009-07-01
Scale and spatial limits of an impact assessment study on a GIS platform are two very important factors that could have a bearing on the genuineness and quality of impact assessment. While the effect of scale has been documented and well understood, no significant study has been carried out on spatial considerations in impact assessment studies employing GIS techniques. A novel technique of impact assessment demonstrable through a GIS approach, termed here the 'spatial data integrated GIS impact assessment method' (SGIAM), is described in this paper. The technique makes a fundamental presumption that the importance of environmental impacts is dependent, among other things, on the spatial distribution of the effects of the proposed action and of the affected receptors in a study area. For each environmental component considered (e.g., air quality), impact indices are calculated through aggregation of impact indicators, which are measures of the severity of the impact. The presence and spread of environmental descriptors are suitably quantified through modeling techniques and depicted. The environmental impact index is calculated from data exported from ArcINFO, thus giving significant importance to spatial data in the impact assessment exercise.
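As a loose illustration of the aggregation step described above, the sketch below combines normalized impact indicators into a single impact index for one environmental component using assumed weights; SGIAM's actual spatially weighted aggregation based on data exported from ArcINFO is not reproduced here.

```python
# Illustrative aggregation of impact indicators into one impact index for a
# single component (e.g., air quality). Indicator values and weights are
# invented placeholders, not the paper's scheme.
import numpy as np

indicators = np.array([0.4, 0.7, 0.2])   # normalized severity of each indicator
weights    = np.array([0.5, 0.3, 0.2])   # relative importance (sums to 1)

impact_index = float(np.dot(weights, indicators))
print(f"Environmental impact index: {impact_index:.2f}")
```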
Multi-intelligence critical rating assessment of fusion techniques (MiCRAFT)
NASA Astrophysics Data System (ADS)
Blasch, Erik
2015-06-01
Assessment of multi-intelligence fusion techniques includes credibility of algorithm performance, quality of results against mission needs, and usability in a work-domain context. Situation awareness (SAW) brings together low-level information fusion (tracking and identification), high-level information fusion (threat and scenario-based assessment), and information fusion level 5 user refinement (physical, cognitive, and information tasks). To measure SAW, we discuss the SAGAT (Situational Awareness Global Assessment Technique) technique for a multi-intelligence fusion (MIF) system assessment that focuses on the advantages of MIF against single intelligence sources. Building on the NASA TLX (Task Load Index), SAGAT probes, SART (Situational Awareness Rating Technique) questionnaires, and CDM (Critical Decision Method) decision points; we highlight these tools for use in a Multi-Intelligence Critical Rating Assessment of Fusion Techniques (MiCRAFT). The focus is to measure user refinement of a situation over the information fusion quality of service (QoS) metrics: timeliness, accuracy, confidence, workload (cost), and attention (throughput). A key component of any user analysis includes correlation, association, and summarization of data; so we also seek measures of product quality and QuEST of information. Building a notion of product quality from multi-intelligence tools is typically subjective which needs to be aligned with objective machine metrics.
Intramuscular injection technique: an evidence-based approach.
Ogston-Tuck, Sherri
2014-09-30
Intramuscular injections require a thorough and meticulous approach to patient assessment and injection technique. This article, the second in a series of two, reviews the evidence base to inform safer practice and to consider the evidence for nursing practice in this area. A framework for safe practice is included, identifying important points for safe technique, patient care and clinical decision making. It also highlights the ongoing debate in the selection of intramuscular injection sites, predominantly the ventrogluteal and dorsogluteal muscles.
Data-Mining Techniques in Detecting Factors Linked to Academic Achievement
ERIC Educational Resources Information Center
Martínez Abad, Fernando; Chaparro Caso López, Alicia A.
2017-01-01
In light of the emergence of statistical analysis techniques based on data mining in education sciences, and the potential they offer to detect non-trivial information in large databases, this paper presents a procedure used to detect factors linked to academic achievement in large-scale assessments. The study is based on a non-experimental,…
USDA-ARS?s Scientific Manuscript database
Ultra high resolution digital aerial photography has great potential to complement or replace ground measurements of vegetation cover for rangeland monitoring and assessment. We investigated object-based image analysis (OBIA) techniques for classifying vegetation in southwestern U.S. arid rangelands...
In vivo optical elastography: stress and strain imaging of human skin lesions
NASA Astrophysics Data System (ADS)
Es'haghian, Shaghayegh; Gong, Peijun; Kennedy, Kelsey M.; Wijesinghe, Philip; Sampson, David D.; McLaughlin, Robert A.; Kennedy, Brendan F.
2015-03-01
Probing the mechanical properties of skin at high resolution could aid in the assessment of skin pathologies by, for example, detecting the extent of cancerous skin lesions and assessing pathology in burn scars. Here, we present two elastography techniques based on optical coherence tomography (OCT) to probe the local mechanical properties of skin. The first technique, optical palpation, is a high-resolution tactile imaging technique, which uses a compliant silicone layer positioned on the tissue surface to measure spatially-resolved stress imparted by compressive loading. We assess the performance of optical palpation, using a handheld imaging probe on a skin-mimicking phantom, and demonstrate its use on human skin. The second technique, phase-sensitive compression optical coherence elastography (OCE), is a strain imaging technique that maps depth-resolved mechanical variations within skin. We show preliminary results of in vivo phase-sensitive compression OCE on a human skin lesion.
NASA Astrophysics Data System (ADS)
Sergio, de los Santos-Villalobos; Claudio, Bravo-Linares; dos Anjos Roberto, Meigikos; Renan, Cardoso; Max, Gibbs; Andrew, Swales; Lionel, Mabit; Gerd, Dercon
Soil erosion is one of the biggest challenges for food production around the world. Many techniques have been used to evaluate and mitigate soil degradation. Nowadays, isotopic techniques are becoming a powerful tool to assess soil apportionment. One of the innovative techniques used is Compound Specific Stable Isotope (CSSI) analysis, which has been used to track down sediments and specify their sources by the isotopic signature of δ13C in specific fatty acids. The application of this technique to soil apportionment has been developed recently; however, there is a lack of user-friendly software for data processing and interpretation. The aim of this article is to introduce a new open-source tool for working with data sets generated by the use of the CSSI technique to assess soil apportionment, called the CSSIARv1.00 software.
ERIC Educational Resources Information Center
Otani, Akira
1989-01-01
Examines several basic hypnotherapeutic techniques (rapport building, problem assessment, resistance management, and behavior change) based on Milton H. Erickson's hypnotherapeutic principles that can be translated into the general framework of counseling. (Author/CM)
ERIC Educational Resources Information Center
Montoya, Isaac D.
2008-01-01
Three classification techniques (Chi-square Automatic Interaction Detection [CHAID], Classification and Regression Tree [CART], and discriminant analysis) were tested to determine their accuracy in predicting Temporary Assistance for Needy Families program recipients' future employment. Technique evaluation was based on proportion of correctly…
Rahman, Mohd Nasrull Abdol; Mohamad, Siti Shafika
2017-01-01
Computer work is associated with musculoskeletal disorders (MSDs). Several methods have been developed to assess computer work risk factors related to MSDs. This review aims to give an overview of current techniques available for pen-and-paper-based observational methods in assessing ergonomic risk factors of computer work. We searched an electronic database for materials from 1992 until 2015. The selected methods were focused on computer work, pen-and-paper observational methods, office risk factors and musculoskeletal disorders. This review was developed to assess the risk factors, reliability and validity of pen-and-paper observational methods associated with computer work. Two evaluators independently carried out this review. Seven observational methods used to assess exposure to office risk factors for work-related musculoskeletal disorders were identified. The risk factors covered by current pen-and-paper-based observational tools were postures, office components, force and repetition. Of the seven methods, only five had been tested for reliability; they were proven to be reliable and were rated as moderate to good. For validity testing, only four of the seven methods were tested, and the results were moderate. Many observational tools already exist, but no single tool appears to cover all of the risk factors, including working posture, office components, force, repetition and office environment, at office workstations and computer work. Although the most important factor in developing a tool is proper validation of exposure assessment techniques, not all of the existing observational methods have been tested for reliability and validity. Furthermore, this review could provide researchers with ways to improve pen-and-paper-based observational methods for assessing ergonomic risk factors of computer work.
Automatic limb identification and sleeping parameters assessment for pressure ulcer prevention.
Baran Pouyan, Maziyar; Birjandtalab, Javad; Nourani, Mehrdad; Matthew Pompeo, M D
2016-08-01
Pressure ulcers (PUs) are common among vulnerable patients such as the elderly, the bedridden, and diabetics. PUs are very painful for patients and costly for hospitals and nursing homes. Assessment of sleeping parameters on at-risk limbs is critical for ulcer prevention. An effective assessment depends on automatic identification and tracking of at-risk limbs. An accurate limb identification can be used to analyze the pressure distribution and assess risk for each limb. In this paper, we propose a graph-based clustering approach to extract the body limbs from the pressure data collected by a commercial pressure map system. A robust signature-based technique is employed to automatically label each limb. Finally, an assessment technique is applied to evaluate the stress experienced by each limb over time. The experimental results indicate high performance and more than 94% average accuracy of the proposed approach. Copyright © 2016 Elsevier Ltd. All rights reserved.
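A rough sketch of the graph-based clustering idea described above follows: loaded cells of a pressure map are treated as graph nodes and grouped with spectral clustering, one cluster per candidate limb. The data, threshold, and cluster count are placeholders, and the paper's specific clustering formulation and signature-based limb labeling step are not reproduced.

```python
# Sketch: cluster the active cells of a pressure frame into candidate limbs.
import numpy as np
from sklearn.cluster import SpectralClustering

pressure_map = np.random.rand(32, 64)            # stand-in for one pressure frame
active = np.argwhere(pressure_map > 0.7)         # (row, col) of loaded cells

clusterer = SpectralClustering(n_clusters=4, affinity="nearest_neighbors",
                               n_neighbors=8, random_state=0)
labels = clusterer.fit_predict(active)           # one candidate limb per cluster

for limb_id in np.unique(labels):
    cells = active[labels == limb_id]
    mean_pressure = pressure_map[cells[:, 0], cells[:, 1]].mean()
    print(f"cluster {limb_id}: {len(cells)} cells, mean pressure {mean_pressure:.2f}")
```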
Molecular profiling--a tool for addressing emerging gaps in the comparative risk assessment of GMOs.
Heinemann, Jack A; Kurenbach, Brigitta; Quist, David
2011-10-01
Assessing the risks of genetically modified organisms (GMOs) is required by both international agreement and domestic legislation. Many view the use of the "omics" tools for profiling classes of molecules as useful in risk assessment, but no consensus has formed on the need or value of these techniques for assessing the risks of all GMOs. In this and many other cases, experts support case-by-case use of molecular profiling techniques for risk assessment. We review the latest research on the applicability and usefulness of molecular profiling techniques for GMO risk assessment. As more and more kinds of GMOs and traits are developed, broader use of molecular profiling in a risk assessment may be required to supplement the comparative approach to risk assessment. The literature-based discussions on the use of profiling appear to have settled on two findings: 1. profiling techniques are reliable and relevant, at least no less so than other techniques used in risk assessment; and 2. although not required routinely, regulators should be aware of when they are needed. The dismissal of routine molecular profiling may be confusing to regulators who then lack guidance on when molecular profiling might be worthwhile. Molecular profiling is an important way to increase confidence in risk assessments if the profiles are properly designed to address relevant risks and are applied at the correct stage of the assessment. Copyright © 2011 Elsevier Ltd. All rights reserved.
ERIC Educational Resources Information Center
Graves, Scott L., Jr., Ed.; Blake, Jamilia J., Ed.
2016-01-01
School-based mental health professionals receive extensive training in assessment and treatment techniques with children. However, most of this training is based on research with white, middle-class populations, whose experiences are hardly universal. In the next decade, ethnic minority students are projected to become the numerical majority in…
Competence Assessment Integrating Reflective Practice in a Professional Psychology Program
ERIC Educational Resources Information Center
Lewis, Deborah; Virden, Tom; Hutchings, Philinda Smith; Bhargava, Ruchi
2011-01-01
The Midwestern University Clinical Psychology Program--Glendale Campus (MWU) created a Comprehensive Assessment Method in Psychology (CAMP) comprised of 35 different "tasks" of authentic work products representing a variety of assessment techniques based on pedagogical theory. Each task assesses one or more components of one of the…
Assessing Students in the Margin: Challenges, Strategies, and Techniques
ERIC Educational Resources Information Center
Russell, Michael; Kavanaugh, Maureen
2011-01-01
The importance of student assessment, particularly for summative purposes, has increased greatly over the past thirty years. At the same time, emphasis on including all students in assessment programs has also increased. Assessment programs, whether they are large-scale, district-based, or teacher developed, have traditionally attempted to assess…
Trautwein, C.M.; Rowan, L.C.
1987-01-01
Linear structural features and hydrothermally altered rocks that were interpreted from Landsat data have been used by the U.S. Geological Survey (USGS) in regional mineral resource appraisals for more than a decade. In the past, linear features and alterations have been incorporated into models for assessing mineral resources potential by manually overlaying these and other data sets. Recently, USGS research into computer-based geographic information systems (GIS) for mineral resources assessment programs has produced several new techniques for data analysis, quantification, and integration to meet assessment objectives.
ERIC Educational Resources Information Center
McNamara, Tim; Hill, Kathryn; May, Lynette
2002-01-01
Focuses on the increase in discourse-based studies of oral proficiency assessment techniques. Discusses research carried out on a number of factors in the assessment setting, including the role of interlocutor, candidate, and rater, and the impact of tasks, task performance conditions, and rating criteria. (Author/VWL)
A new scenario-based approach to damage detection using operational modal parameter estimates
NASA Astrophysics Data System (ADS)
Hansen, J. B.; Brincker, R.; López-Aenlle, M.; Overgaard, C. F.; Kloborg, K.
2017-09-01
In this paper a vibration-based damage localization and quantification method, based on natural frequencies and mode shapes, is presented. The proposed technique is inspired by a damage assessment methodology based solely on the sensitivity of mass-normalized, experimentally determined mode shapes. The present method differs by being based on modal data extracted by means of Operational Modal Analysis (OMA), combined with a reasonable Finite Element (FE) representation of the test structure, and implemented in a scenario-based framework. Besides a review of the basic methodology, this paper addresses fundamental theoretical as well as practical considerations which are crucial to the applicability of a given vibration-based damage assessment configuration. Lastly, the technique is demonstrated on an experimental test case using automated OMA. Both the numerical study and the experimental test case presented in this paper are restricted to perturbations concerning mass change.
The role of optimization in the next generation of computer-based design tools
NASA Technical Reports Server (NTRS)
Rogan, J. Edward
1989-01-01
There is a close relationship between design optimization and the emerging new generation of computer-based tools for engineering design. With some notable exceptions, the development of these new tools has not taken full advantage of recent advances in numerical design optimization theory and practice. Recent work in the field of design process architecture has included an assessment of the impact of next-generation computer-based design tools on the design process. These results are summarized, and insights into the role of optimization in a design process based on these next-generation tools are presented. An example problem has been worked out to illustrate the application of this technique. The example problem - layout of an aircraft main landing gear - is one that is simple enough to be solved by many other techniques. Although the mathematical relationships describing the objective function and constraints for the landing gear layout problem can be written explicitly and are quite straightforward, an approximation technique has been used in the solution of this problem that can just as easily be applied to integrate supportability or producibility assessments using theory of measurement techniques into the design decision-making process.
Confidence Intervals from Realizations of Simulated Nuclear Data
DOE Office of Scientific and Technical Information (OSTI.GOV)
Younes, W.; Ratkiewicz, A.; Ressler, J. J.
2017-09-28
Various statistical techniques are discussed that can be used to assign a level of confidence in the prediction of models that depend on input data with known uncertainties and correlations. The particular techniques reviewed in this paper are: 1) random realizations of the input data using Monte-Carlo methods, 2) the construction of confidence intervals to assess the reliability of model predictions, and 3) resampling techniques to impose statistical constraints on the input data based on additional information. These techniques are illustrated with a calculation of the k-eff value, based on the 235U(n,f) and 239Pu(n,f) cross sections.
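A minimal sketch of the first two techniques, Monte-Carlo realizations of correlated input data followed by a percentile confidence interval on the resulting prediction, is shown below. The covariance values and the linear surrogate standing in for the k-eff calculation are placeholders, not evaluated nuclear data.

```python
# Sketch: correlated random realizations of two cross-section scale factors,
# propagated through a placeholder model to build a 95% confidence interval.
import numpy as np

rng = np.random.default_rng(42)
mean = np.array([1.0, 1.0])                       # nominal scale factors
cov = np.array([[0.02**2, 0.5 * 0.02 * 0.03],     # assumed uncertainties
                [0.5 * 0.02 * 0.03, 0.03**2]])    # and correlation

def model(u235_scale, pu239_scale):
    """Placeholder for a k-eff calculation driven by the scaled cross sections."""
    return 1.000 + 0.4 * (u235_scale - 1.0) + 0.3 * (pu239_scale - 1.0)

samples = rng.multivariate_normal(mean, cov, size=10_000)
keff = np.array([model(u, p) for u, p in samples])
lo, hi = np.percentile(keff, [2.5, 97.5])
print(f"k-eff 95% confidence interval: [{lo:.4f}, {hi:.4f}]")
```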
Rath, Hemamalini; Rath, Rachna; Mahapatra, Sandeep; Debta, Tribikram
2017-01-01
The age of an individual can be assessed by a plethora of widely available tooth-based techniques, among which radiological methods prevail. Demirjian's technique of age assessment based on tooth development stages has been extensively investigated in different populations of the world. The present study aims to assess the applicability of Demirjian's modified 8-teeth technique in age estimation of a population of East India (Odisha), utilizing Acharya's Indian-specific cubic functions. One hundred and six pretreatment orthodontic radiographs of patients aged 7-23 years, with representation from both genders, were assessed for eight left mandibular teeth and scored as per Demirjian's 9-stage criteria for tooth development stages. Age was calculated on the basis of Acharya's Indian formula. Statistical analysis was performed to compare the estimated and actual age. All data were analyzed using SPSS 20.0 (SPSS Inc., Chicago, Illinois, USA) and the MS Excel package. The results revealed that the mean absolute error (MAE) in age estimation of the entire sample was 1.3 years, with 50% of the cases having an error rate within ±1 year. The MAE in males and females (7-16 years) was 1.8 and 1.5, respectively. Likewise, the MAE in males and females (16.1-23 years) was 1.1 and 1.3, respectively. The low error rate in estimating age justifies the application of this modified technique and Acharya's Indian formulas in the present East Indian population.
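The error statistic reported above can be illustrated with a short sketch that computes the mean absolute error between estimated and actual ages and the share of cases within ±1 year; the ages shown are invented values, not the study's data.

```python
# Sketch: MAE and percentage of cases within +/- 1 year (hypothetical ages).
import numpy as np

actual_age    = np.array([9.2, 12.5, 15.1, 18.3, 21.0])
estimated_age = np.array([10.0, 11.8, 15.9, 19.5, 20.2])

errors = np.abs(estimated_age - actual_age)
mae = errors.mean()
within_one_year = (errors <= 1.0).mean() * 100
print(f"MAE = {mae:.1f} years; {within_one_year:.0f}% of cases within ±1 year")
```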
A novel optical investigation technique for railroad track inspection and assessment
NASA Astrophysics Data System (ADS)
Sabato, Alessandro; Beale, Christopher H.; Niezrecki, Christopher
2017-04-01
Track failures due to cross tie degradation or loss in ballast support may result in a number of problems ranging from simple service interruptions to derailments. Structural Health Monitoring (SHM) of railway track is important for safety reasons and to reduce downtime and maintenance costs. Existing track inspection technologies for assessing track health are insufficient, and novel, cost-effective methods are needed. Advancements achieved in recent years in camera technology, optical sensors, and image-processing algorithms have made machine vision, Structure from Motion (SfM), and three-dimensional (3D) Digital Image Correlation (DIC) systems extremely appealing techniques for extracting structural deformations and geometry profiles. Therefore, optically based, non-contact measurement techniques may be used for assessing surface defects, rail and tie deflection profiles, and ballast condition. In this study, the design of two camera-based measurement systems is proposed for crosstie-ballast condition assessment and track examination purposes. The first consists of four pairs of cameras installed on the underside of a rail car to detect the induced deformation and displacement along the whole length of the track's cross tie using 3D DIC measurement techniques. The second consists of another set of cameras using SfM techniques for obtaining a 3D rendering of the infrastructure from a series of two-dimensional (2D) images to evaluate the state of the track qualitatively. The feasibility of the proposed optical systems is evaluated through extensive laboratory tests, demonstrating their ability to measure parameters of interest (e.g., crosstie full-field displacement, vertical deflection, shape, etc.) for assessment and SHM of railroad track.
NASA Technical Reports Server (NTRS)
Dickson, B.; Cronkhite, J.; Bielefeld, S.; Killian, L.; Hayden, R.
1996-01-01
The objective of this study was to evaluate two techniques, Flight Condition Recognition (FCR) and Flight Load Synthesis (FLS), for usage monitoring and to assess the potential benefits of extending the retirement intervals of life-limited components, thus reducing the operator's maintenance and replacement costs. Both techniques involve indirect determination of loads using measured flight parameters and subsequent fatigue analysis to calculate the life expended on the life-limited components. To assess the potential benefit of usage monitoring, the two usage techniques were compared to current methods of component retirement. In addition, comparisons were made with direct load measurements to assess the accuracy of the two techniques. The data used for the evaluation of the usage monitoring techniques were collected under an independent HUMS flight trial program, using a commercially available HUMS and data recording system. The usage data collected from the HUMS trial aircraft were analyzed off-line using PC-based software that included the FCR and FLS techniques. In the future, if the techniques prove feasible, usage monitoring would be incorporated into the onboard HUMS.
NASA Astrophysics Data System (ADS)
Sentenac, Philippe; Benes, Vojtech; Budinsky, Vladimir; Keenan, Helen; Baron, Ron
2017-11-01
This paper describes the use of four geophysical techniques to map the structural integrity of historical earth reservoir embankments, which are susceptible to natural decay with time. The four techniques used to assess the post-flood damage were: (1) a fast scanning technique using a dipole electromagnetic profiling apparatus (GEM2); (2) Electrical Resistivity Tomography (ERT), to obtain a high-resolution image of the shape of the damaged/seepage zone; (3) self-potential surveys, to relate the detected seepage evolution to changes in water displacement inside the embankment; and (4) microgravimetry, to characterise the washed zones in areas with piping. The four geophysical techniques were evaluated against case studies of two reservoirs in South Bohemia, Czech Republic. A risk approach based on the geophysical results was undertaken for the reservoir embankments. The four techniques together enabled a comprehensive non-invasive assessment whereby remedial action could be recommended where required. Conclusions were also drawn on the efficiency of the techniques when applied to embankments with wood structures.
ERIC Educational Resources Information Center
Horm-Wingerd, Diane M.; Winter, Phoebe C.; Plofchan, Paula
The purpose of this paper is twofold: to review appropriate assessment techniques in prekindergarten through grade 3 settings and to serve as a catalyst for further discussion and work on the topic of developmentally appropriate accountability assessment. The discussion is based on the thesis that developmentally appropriate assessment and…
Katz, Josh M; Winter, Carl K; Buttrey, Samuel E; Fadel, James G
2012-03-01
Western and guideline based diets were compared to determine if dietary improvements resulting from following dietary guidelines reduce acrylamide intake. Acrylamide forms in heat treated foods and is a human neurotoxin and animal carcinogen. Acrylamide intake from the Western diet was estimated with probabilistic techniques using teenage (13-19 years) National Health and Nutrition Examination Survey (NHANES) food consumption estimates combined with FDA data on the levels of acrylamide in a large number of foods. Guideline based diets were derived from NHANES data using linear programming techniques to comport to recommendations from the Dietary Guidelines for Americans, 2005. Whereas the guideline based diets were more properly balanced and rich in consumption of fruits, vegetables, and other dietary components than the Western diets, acrylamide intake (mean±SE) was significantly greater (P<0.001) from consumption of the guideline based diets (0.508±0.003 μg/kg/day) than from consumption of the Western diets (0.441±0.003 μg/kg/day). Guideline based diets contained less acrylamide contributed by French fries and potato chips than Western diets. Overall acrylamide intake, however, was higher in guideline based diets as a result of more frequent breakfast cereal intake. This is believed to be the first example of a risk assessment that combines probabilistic techniques with linear programming and results demonstrate that linear programming techniques can be used to model specific diets for the assessment of toxicological and nutritional dietary components. Copyright © 2011 Elsevier Ltd. All rights reserved.
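A hedged sketch of the linear-programming step is shown below: daily servings of a few foods are chosen to minimize acrylamide intake subject to simple nutrient constraints. The food composition values and constraints are invented for illustration and do not come from the NHANES or FDA data used in the study.

```python
# Sketch: pick servings per food to minimize acrylamide while meeting
# minimal energy and fiber targets (all numbers are assumed).
import numpy as np
from scipy.optimize import linprog

# columns: cereal, fries, fruit, vegetables (servings per day)
acrylamide = np.array([0.12, 0.30, 0.00, 0.01])   # ug/kg/day per serving (assumed)
energy     = np.array([150, 220, 80, 40])         # kcal per serving (assumed)
fiber      = np.array([3.0, 2.0, 2.5, 2.0])       # g per serving (assumed)

# minimize acrylamide subject to energy >= 1800 kcal and fiber >= 25 g
res = linprog(c=acrylamide,
              A_ub=np.vstack([-energy, -fiber]),
              b_ub=np.array([-1800, -25]),
              bounds=[(0, 10)] * 4,
              method="highs")
print("servings per food:", np.round(res.x, 1))
print("acrylamide intake:", round(res.fun, 3), "ug/kg/day")
```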
ERIC Educational Resources Information Center
Hall, Pippa; O'Reilly, Jane; Dojeiji, Sue; Blair, Richard; Harley, Anne
2009-01-01
The purpose of this study was to assess the ethical and professional learning needs of medical trainees on clinical placements at a care-based facility, as they shifted from acute care to care-based philosophy. Using qualitative data analysis and grounded theory techniques, 12 medical learners and five clinical supervisors were interviewed. Five…
NASA Astrophysics Data System (ADS)
Hardyanti, R. C.; Hartono; Fianti
2018-03-01
Physics learning in the Curriculum of 2013 is closely related to the implementation of a scientific approach and authentic assessment in learning. This study aims to analyze the implementation of the scientific approach and authentic assessment in physics learning, as well as to analyze the constraints on both. The data collection techniques used in this study were questionnaires, observations, interviews, and documentation. The data were analyzed using percentage techniques and a qualitative descriptive approach. Based on the results of the research and discussion, the implementation of physics learning based on the scientific approach goes well, with a percentage of 84.60%. Physics learning based on authentic assessment also goes well, with a percentage of 88%. Both percentages are less than 100%, which shows that there are obstacles to the implementation of the scientific approach and of authentic assessment. The obstacles to the implementation of the scientific approach include time, the heavy load of material, the input or ability of learners, the willingness of learners to ask questions, laboratory support, and the ability of students to process data. The obstacles to the implementation of authentic assessment include the limited time for carrying out authentic assessment, the components of the criteria for carrying out authentic assessment, the lack of discipline in administration, the difficulty of changing assessment habits from traditional assessment to authentic assessment, and the difficulty of processing scores in accordance with the Curriculum of 2013 format.
NASA Astrophysics Data System (ADS)
Arumugam, Vinodiran
2013-08-01
Breast cancer remains a significant cause of morbidity and mortality. Assessment of the axillary lymph nodes is part of the staging of the disease. Advances in the surgical management of breast cancer have seen a move towards intra-operative lymph node assessment that facilitates an immediate axillary clearance if it is indicated. Raman spectroscopy, a technique based on the inelastic scattering of light, has previously been shown to be capable of differentiating between normal and malignant tissue. These results, based on the biochemical composition of the tissue, potentially allow for this technique to be utilised in this clinical context. The aim of this study was to evaluate the ability of Raman spectroscopy both to assess axillary lymph node tissue within the theatre setting and to achieve results comparable to other intra-operative techniques within a clinically relevant time frame. Initial experiments demonstrated that these aims were feasible within the context of both the theatre environment and current surgical techniques. A laboratory-based feasibility study involving 17 patients and 38 lymph node samples achieved sensitivities and specificities of >90% in unsupervised testing. 339 lymph node samples from 66 patients were subsequently assessed within the theatre environment. Chemometric analysis of these data demonstrated sensitivities of up to 94% and specificities of up to 99% in unsupervised testing. The best results were achieved when comparing negative nodes from N0 patients and nodes containing macrometastases. Spectral analysis revealed increased levels of lipid in the negative nodes and increased DNA and protein levels in the positive nodes. Further studies highlighted the reproducibility of these results using different equipment, users and time from excision. This study uses Raman spectroscopy for the first time in an operating theatre and demonstrates that the results obtained, in real time, are comparable, if not superior, to current intra-operative techniques of lymph node assessment.
Damage Assessment of Aerospace Structural Components by Impedance Based Health Monitoring
NASA Technical Reports Server (NTRS)
Gyekenyesi, Andrew L.; Martin, Richard E.; Sawicki, Jerzy T.; Baaklini, George Y.
2005-01-01
This paper addresses recent efforts at the NASA Glenn Research Center at Lewis Field relating to the set-up and assessment of electro-mechanical (E/M) impedance based structural health monitoring. The overall aim is the application of the impedance based technique to aeronautic and space based structural components. As initial steps, a laboratory was created, software written, and experiments conducted on aluminum plates in undamaged and damaged states. A simulated crack, in the form of a narrow notch at various locations, was analyzed using piezoelectric ceramic (PZT: lead zirconate titanate) patches as impedance measuring transducers. Descriptions of the impedance quantifying hardware and software are provided as well as experimental results. In summary, an impedance based health monitoring system was assembled and tested. The preliminary data showed that the impedance based technique was successful in recognizing the damage state of notched aluminum plates.
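One common way to quantify impedance-signature changes in this kind of monitoring is a root-mean-square deviation (RMSD) metric between a baseline spectrum and a current one; the sketch below uses synthetic spectra and an assumed frequency sweep, and is not the processing used in the experiments above.

```python
# Sketch: RMSD damage metric between baseline and current impedance spectra.
import numpy as np

freq = np.linspace(30e3, 90e3, 400)                     # Hz, assumed sweep
baseline = 100 + 5 * np.sin(freq / 3e3)                 # undamaged signature
current = baseline + np.where((freq > 50e3) & (freq < 55e3), 2.0, 0.0)

rmsd = np.sqrt(np.sum((current - baseline) ** 2) / np.sum(baseline ** 2)) * 100
print(f"RMSD damage metric: {rmsd:.2f}%")   # larger values suggest more damage
```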
Bujar, Magdalena; McAuslane, Neil; Walker, Stuart R; Salek, Sam
2017-01-01
Introduction: Although pharmaceutical companies, regulatory authorities, and health technology assessment (HTA) agencies have been increasingly using decision-making frameworks, it is not certain whether these enable better quality decision making. This could be addressed by formally evaluating the quality of decision-making process within those organizations. The aim of this literature review was to identify current techniques (tools, questionnaires, surveys, and studies) for measuring the quality of the decision-making process across the three stakeholders. Methods: Using MEDLINE, Web of Knowledge, and other Internet-based search engines, a literature review was performed to systematically identify techniques for assessing quality of decision making in medicines development, regulatory review, and HTA. A structured search was applied using key words and a secondary review was carried out. In addition, the measurement properties of each technique were assessed and compared. Ten Quality Decision-Making Practices (QDMPs) developed previously were then used as a framework for the evaluation of techniques identified in the review. Due to the variation in studies identified, meta-analysis was inappropriate. Results: This review identified 13 techniques, where 7 were developed specifically to assess decision making in medicines' development, regulatory review, or HTA; 2 examined corporate decision making, and 4 general decision making. Regarding how closely each technique conformed to the 10 QDMPs, the 13 techniques assessed a median of 6 QDMPs, with a mode of 3 QDMPs. Only 2 techniques evaluated all 10 QDMPs, namely the Organizational IQ and the Quality of Decision Making Orientation Scheme (QoDoS), of which only one technique, QoDoS could be applied to assess decision making of both individuals and organizations, and it possessed generalizability to capture issues relevant to companies as well as regulatory authorities. Conclusion: This review confirmed a general paucity of research in this area, particularly regarding the development and systematic application of techniques for evaluating quality decision making, with no consensus around a gold standard. This review has identified QoDoS as the most promising available technique for assessing decision making in the lifecycle of medicines and the next steps would be to further test its validity, sensitivity, and reliability.
Isotopic Techniques for Assessment of Groundwater Discharge to the Coastal Ocean
2003-09-30
William C. Burnett, Department of Oceanography, Florida State University. The report concerns evaluating the influence of submarine groundwater discharge (SGD) into the ocean; the stated long-term goal is to develop geochemical tools (e.g., radon and …). A recovered figure note indicates that the best estimate of pore water Rn activity is based on an average groundwater concentration of 170 dpm/L.
NASA Technical Reports Server (NTRS)
Black, D. C. (Editor); Brunk, W. E. (Editor)
1980-01-01
The feasibility and limitations of ground-based techniques for detecting other planetary systems are discussed, as well as the level of accuracy at which these limitations would occur and the extent to which they can be overcome by new technology and instrumentation. Workshop conclusions and recommendations are summarized and a proposed high priority program is considered.
EFFECTS-BASED CUMULATIVE RISK ASSESSMENT IN A LOW-INCOME URBAN COMMUNITY NEAR A SUPERFUND SITE
We will introduce into the cumulative risk assessment framework novel methods for non-cancer risk assessment, techniques for dose-response modeling that extend insights from chemical mixtures frameworks to non-chemical stressors, multilevel statistical methods used to address ...
Mobile phone based laser speckle contrast imager for assessment of skin blood flow
NASA Astrophysics Data System (ADS)
Jakovels, Dainis; Saknite, Inga; Krievina, Gita; Zaharans, Janis; Spigulis, Janis
2014-10-01
Assessment of skin blood flow is of interest for evaluation of skin viability as well as for reflection of the overall condition of the circulatory system. Laser Doppler perfusion imaging (LDPI) and laser speckle contrast imaging (LASCI) are optical techniques used for assessment of skin perfusion. However, these systems are still too expensive and bulky to be widely available. Implementation of such techniques as connection kits for mobile phones has potential for primary diagnostics. In this work we demonstrate a simple and low-cost LASCI connection kit for a mobile phone and compare it to a laser Doppler perfusion imager. Post-occlusive hyperemia and local thermal hyperemia tests are used to compare both techniques and to demonstrate the potential of the LASCI device.
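The core LASCI computation can be sketched as the local speckle contrast K = σ/μ evaluated over a sliding window, where lower contrast indicates faster flow. The frame and window size below are placeholders, not the connection kit's actual processing.

```python
# Sketch: local speckle contrast map K = std/mean over a sliding window.
import numpy as np
from scipy.ndimage import uniform_filter

raw = np.random.rand(256, 256).astype(float)      # stand-in speckle frame
win = 7                                           # sliding-window size (pixels)

mean = uniform_filter(raw, size=win)
mean_sq = uniform_filter(raw ** 2, size=win)
std = np.sqrt(np.clip(mean_sq - mean ** 2, 0, None))
contrast = std / (mean + 1e-12)                   # speckle contrast map K
print("mean speckle contrast:", contrast.mean())
```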
Tablet-Based Math Assessment: What Can We Learn from Math Apps?
ERIC Educational Resources Information Center
Cayton-Hodges, Gabrielle A.; Feng, Gary; Pan, Xingyu
2015-01-01
In this report, we describe a survey of mathematics education apps in the Apple App Store, conducted as part of a research project to develop a tablet-based assessment prototype for elementary mathematics. This survey was performed with the goal of understanding the design principles and techniques used in mathematics apps designed for tablets. We…
Decision support tool to assess importance of transportation facilities.
DOT National Transportation Integrated Search
2008-01-01
Assessing the importance of transportation facilities is an increasingly growing topic of interest to federal and state transportation agencies. This work proposes an optimization based model that uses concepts and techniques of complex networks scie...
Probst, Yasmine; Nguyen, Duc Thanh; Tran, Minh Khoi; Li, Wanqing
2015-07-27
Dietary assessment, while traditionally based on pen-and-paper, is rapidly moving towards automatic approaches. This study describes an Australian automatic food record method and its prototype for dietary assessment via the use of a mobile phone and techniques of image processing and pattern recognition. Common visual features including the scale-invariant feature transform (SIFT), local binary patterns (LBP), and colour are used for describing food images. The popular bag-of-words (BoW) model is employed for recognizing the images taken by a mobile phone for dietary assessment. Technical details are provided together with discussions on the issues and future work.
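A condensed, assumption-laden sketch of the bag-of-words pipeline described above follows: SIFT descriptors are quantized into visual words with k-means, and each food photo is represented by a word histogram fed to a linear classifier. OpenCV and scikit-learn are assumed, and the file names, codebook size, and classifier choice are illustrative only.

```python
# Sketch: SIFT descriptors -> k-means codebook -> BoW histogram -> classifier.
import cv2
import numpy as np
from sklearn.cluster import KMeans
from sklearn.svm import LinearSVC

def sift_descriptors(image_path):
    """Local SIFT descriptors for one image (empty array if none found)."""
    gray = cv2.imread(image_path, cv2.IMREAD_GRAYSCALE)
    _, desc = cv2.SIFT_create().detectAndCompute(gray, None)
    return desc if desc is not None else np.empty((0, 128), np.float32)

def bow_histogram(desc, codebook):
    """Normalized visual-word histogram for one image's descriptors."""
    words = codebook.predict(desc)
    hist = np.bincount(words, minlength=codebook.n_clusters).astype(float)
    return hist / max(hist.sum(), 1.0)

def train_food_recognizer(image_paths, labels, n_words=100):
    """Build a codebook from all training descriptors, then fit a classifier."""
    all_desc = np.vstack([sift_descriptors(p) for p in image_paths])
    codebook = KMeans(n_clusters=n_words, n_init=10, random_state=0).fit(all_desc)
    X = np.array([bow_histogram(sift_descriptors(p), codebook) for p in image_paths])
    return codebook, LinearSVC().fit(X, labels)

# usage (hypothetical file names):
# codebook, clf = train_food_recognizer(["rice.jpg", "salad.jpg"], [0, 1])
```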
Bilek, Maciej; Namieśnik, Jacek
2016-01-01
For a long time, chromatographic techniques and techniques related to them have stimulated the development of new procedures in the field of pharmaceutical analysis. The newly developed methods, characterized by improved metrological parameters, allow for more accurate testing of, among others, the composition of raw materials, intermediates and final products. The chromatographic techniques also enable studies on waste generated in research laboratories and factories producing pharmaceuticals and parapharmaceuticals. Based on the review of reports published in Polish pharmaceutical journals, we assessed the impact of chromatographic techniques on the development of pharmaceutical analysis. The first chromatographic technique used in pharmaceutical analysis was a so-called capillary analysis. It was applied in the 1930s to control the identity of pharmaceutical formulations. In the 1940s and 1950s, the chromatographic techniques were mostly a subject of review publications, while their use in experimental work was rare. Paper chromatography and thin layer chromatography were introduced in the 1960s and 1970s, respectively. These new analytical tools have contributed to the intensive development of research in the field of phytochemistry and the analysis of herbal medicines. The development of column chromatography-based techniques, i.e., gas chromatography and high performance liquid chromatography, took place at the end of the 20th century. Both aforementioned techniques were widely applied in pharmaceutical analysis, for example, to assess the stability of drugs, test for impurities and degradation products as well as in pharmacokinetics studies. The first decade of the 21st century was the time of new detection methods in gas and liquid chromatography. The information sources used to write this article were Polish pharmaceutical journals, both professional and scientific, originating from the interwar and post-war period, i.e., "Kronika Farmaceutyczna", "Farmacja Współczesna", "Wiadomości Farmaceutyczne", "Acta Poloniae Pharmaceutica", "Farmacja Polska", "Dissertationes Pharmaceuticae", "Annales UMCS sectio DDD Phamacia". The number of published works using various chromatography techniques was assessed based on the content description of individual issues of the journal "Acta Poloniae Pharmaceutica".
DOE Office of Scientific and Technical Information (OSTI.GOV)
Maxwell, D.P.; Richardson, C.F.
Three mercury measurement techniques were performed on synthesis gas streams before and after an amine-based sulfur removal system. The syngas was sampled using (1) gas impingers containing a nitric acid-hydrogen peroxide solution, (2) coconut-based charcoal sorbent, and (3) an on-line atomic absorption spectrophotometer equipped with a gold amalgamation trap and cold vapor cell. Various impinger solutions were applied upstream of the gold amalgamation trap to remove hydrogen sulfide and isolate oxidized and elemental species of mercury. The results from these three techniques are compared to provide an assessment of these measurement techniques in reducing gas atmospheres.
Preservice Elementary Teachers' Beliefs about Science Teaching
ERIC Educational Resources Information Center
Yilmaz-Tuzun, Ozgul
2008-01-01
In this study, a Beliefs About Teaching (BAT) scale was created to examine preservice elementary science teachers' self-reported comfort level with both traditional and reform-based teaching methods, assessment techniques, classroom management techniques, and science content. Participants included 166 preservice teachers from three different US…
The US EPA, Environmental Sciences Division-Las Vegas is using a variety of geospatial and statistical modeling approaches to locate and assess the complex functions of wetland ecosystems. These assessments involve measuring landscape characteristics and change, at multiple s...
Model-based RSA of a femoral hip stem using surface and geometrical shape models.
Kaptein, Bart L; Valstar, Edward R; Spoor, Cees W; Stoel, Berend C; Rozing, Piet M
2006-07-01
Roentgen stereophotogrammetry (RSA) is a highly accurate three-dimensional measuring technique for assessing micromotion of orthopaedic implants. A drawback is that markers have to be attached to the implant. Model-based techniques have been developed to avoid the need for specially marked implants. We compared two model-based RSA methods with standard marker-based RSA techniques. The first model-based RSA method used surface models, and the second method used elementary geometrical shape (EGS) models. We used a commercially available stem to perform experiments with a phantom as well as reanalysis of patient RSA radiographs. The data from the phantom experiment indicated that the accuracy and precision of the elementary geometrical shape model-based RSA method are equal to those of marker-based RSA. For model-based RSA using surface models, the accuracy is equal to the accuracy of marker-based RSA, but its precision is worse. We found no difference in accuracy and precision between the two model-based RSA techniques in clinical data. For this particular hip stem, EGS model-based RSA is a good alternative to marker-based RSA.
Innovative Techniques for Evaluating Behavioral Nutrition Interventions
Laugero, Kevin D; Cunningham, Brian T; Lora, Karina R; Reicks, Marla
2017-01-01
Assessing outcomes and the impact from behavioral nutrition interventions has remained challenging because of the lack of methods available beyond traditional nutrition assessment tools and techniques. With the current high global obesity and related chronic disease rates, novel methods to evaluate the impact of behavioral nutrition-based interventions are much needed. The objective of this narrative review is to describe and review the current status of knowledge as it relates to 4 different innovative methods or tools to assess behavioral nutrition interventions. Methods reviewed include 1) the assessment of stress and stress responsiveness to enhance the evaluation of nutrition interventions, 2) eye-tracking technology in nutritional interventions, 3) smartphone biosensors to assess nutrition and health-related outcomes, and 4) skin carotenoid measurements to assess fruit and vegetable intake. Specifically, the novel use of functional magnetic resonance imaging, by characterizing the brain’s responsiveness to an intervention, can help researchers develop programs with greater efficacy. Similarly, if eye-tracking technology can enable researchers to get a better sense as to how participants view materials, the materials may be better tailored to create an optimal impact. The latter 2 techniques reviewed, smartphone biosensors and methods to detect skin carotenoids, can provide the research community with portable, effective, nonbiased ways to assess dietary intake and quality and more in the field. The information gained from using these types of methodologies can improve the efficacy and assessment of behavior-based nutrition interventions. PMID:28096132
Improving Initial Assessment: Guide to Good Practice
ERIC Educational Resources Information Center
Knasel, Eddy; Meed, John; Rossetti, Anna; Read, Hilary
2006-01-01
This guide is aimed at anyone in work-based training who is responsible for learners during their first few weeks. Readers will (1) understand the value and purpose of initial assessment in key skills and Skills for Life; (2) become familiar with a range of techniques for the initial assessment; (3) plan an initial assessment system that is…
Proposal for a new trajectory for subaxial cervical lateral mass screws.
Amhaz-Escanlar, Samer; Jorge-Mora, Alberto; Jorge-Mora, Teresa; Febrero-Bande, Manuel; Diez-Ulloa, Maximo-Alberto
2018-06-20
Lateral mass screws combined with rods are the standard method for posterior cervical spine subaxial fixation. Several techniques have been described, among which the most used are Roy Camille, Magerl, Anderson and An. All of them are based on tridimensional angles. The reliability of freehand angle estimation remains poorly investigated. We propose a new technique based on on-site spatial references and compare it with previously described ones, assessing screw length and potential neurovascular complications. Four different lateral mass screw insertion techniques (Magerl, Anderson, An and the newly described technique) were performed bilaterally, from C3 to C6, in ten human spine specimens. A drill tip guide wire was inserted as originally described for each trajectory, and screw length was measured. The exit point was examined, and potential vertebral artery or nerve root injury was assessed. Mean screw length was 14.05 mm using Magerl's technique, 13.47 mm using Anderson's, 12.8 mm using An's and 17.03 mm using the new technique. Data analysis showed significantly longer lateral mass screw length using the new technique (p value < 0.00001). Potential nerve injury occurred 37 times using Magerl's technique, 28 using Anderson's, 13 using An's and twice using the new technique. Potential vertebral artery injury occurred once using Magerl's technique, 8 times using Anderson's and none using either An's or the new proposed technique. The risk of neurovascular complication was significantly lower using the new technique (p value < 0.01). The new proposed technique allows for longer screws, maximizing purchase and stability, while lowering the complication rate.
NASA Astrophysics Data System (ADS)
Robins, Marthony; Solomon, Justin; Sahbaee, Pooyan; Sedlmair, Martin; Choudhury, Kingshuk Roy; Pezeshk, Aria; Sahiner, Berkman; Samei, Ehsan
2017-09-01
Virtual nodule insertion paves the way towards the development of standardized databases of hybrid CT images with known lesions. The purpose of this study was to assess three methods (an established and two newly developed techniques) for inserting virtual lung nodules into CT images. Assessment was done by comparing virtual nodule volume and shape to the CT-derived volume and shape of synthetic nodules. 24 synthetic nodules (three sizes, four morphologies, two repeats) were physically inserted into the lung cavity of an anthropomorphic chest phantom (KYOTO KAGAKU). The phantom was imaged with and without nodules on a commercial CT scanner (SOMATOM Definition Flash, Siemens) using a standard thoracic CT protocol at two dose levels (1.4 and 22 mGy CTDIvol). Raw projection data were saved and reconstructed with filtered back-projection and sinogram affirmed iterative reconstruction (SAFIRE, strength 5) at 0.6 mm slice thickness. Corresponding 3D idealized, virtual nodule models were co-registered with the CT images to determine each nodule's location and orientation. Virtual nodules were voxelized, partial volume corrected, and inserted into nodule-free CT data (accounting for system imaging physics) using two methods: projection-based Technique A, and image-based Technique B. Also a third Technique C based on cropping a region of interest from the acquired image of the real nodule and blending it into the nodule-free image was tested. Nodule volumes were measured using a commercial segmentation tool (iNtuition, TeraRecon, Inc.) and deformation was assessed using the Hausdorff distance. Nodule volumes and deformations were compared between the idealized, CT-derived and virtual nodules using a linear mixed effects regression model which utilized the mean, standard deviation, and coefficient of variation (Mean_RHD, STD_RHD and CV_RHD) of the regional Hausdorff distance. Overall, there was a close concordance between the volumes of the CT-derived and virtual nodules. Percent differences between them were less than 3% for all insertion techniques and were not statistically significant in most cases. Correlation coefficient values were greater than 0.97. The deformation according to the Hausdorff distance was also similar between the CT-derived and virtual nodules with minimal statistical significance in CV_RHD for Techniques A, B, and C. This study shows that both projection-based and image-based nodule insertion techniques yield realistic nodule renderings with statistical similarity to the synthetic nodules with respect to nodule volume and deformation. These techniques could be used to create a database of hybrid CT images containing nodules of known size, location and morphology.
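To make the notation above concrete, a minimal sketch (not the authors' code) of how the regional Hausdorff distance summary statistics (Mean_RHD, STD_RHD, CV_RHD) could be computed with SciPy is shown below; the regional partitioning is assumed to have been done already, and each region is represented by a pair of surface point sets.

```python
# Hypothetical helper for regional Hausdorff distance statistics.
import numpy as np
from scipy.spatial.distance import directed_hausdorff

def regional_hausdorff_stats(regions):
    """regions: list of (pts_ct, pts_virtual) arrays of shape (n, 3)."""
    rhd = []
    for pts_ct, pts_virtual in regions:
        # Symmetric Hausdorff distance for this region.
        d = max(directed_hausdorff(pts_ct, pts_virtual)[0],
                directed_hausdorff(pts_virtual, pts_ct)[0])
        rhd.append(d)
    rhd = np.asarray(rhd)
    mean_rhd = rhd.mean()
    std_rhd = rhd.std(ddof=1)
    cv_rhd = std_rhd / mean_rhd
    return mean_rhd, std_rhd, cv_rhd
```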
Checklists for powder inhaler technique: a review and recommendations.
Basheti, Iman A; Bosnic-Anticevich, Sinthia Z; Armour, Carol L; Reddel, Helen K
2014-07-01
Turbuhaler and Diskus are commonly used powder inhaler devices for patients with respiratory disease. Their effectiveness is limited in part by a patient's ability to use them correctly. This has led to numerous studies being conducted over the last decade to assess the correct use of these devices by patients and health care professionals. These studies have generally used device-specific checklists to assess technique, this being the most feasible and accessible method for assessment. However, divergence between the checklists and scoring systems for the same device in different studies makes direct comparison of results difficult and at times inappropriate. Little evidence is available to assess the relative importance of different criteria; however, brief patient training based on specific inhaler technique checklists leads to significant improvement in asthma outcomes. This paper reviews common checklists and scoring systems used for Turbuhaler and Diskus, discusses the problem of heterogeneity between different checklists, and finally recommends suitable checklists and scoring systems for these devices based on the literature and previous findings. Only when similar checklists are used across different research studies will accurate comparisons and meta-analysis be possible. Copyright © 2014 by Daedalus Enterprises.
Exploration of a variation of the bottle buoyancy technique for the assessment of body composition.
Gulick, Dawn T; Geigle, Paula Richley
2003-05-01
Hydrostatic weighing has long been recognized as a reliable and valid method for the assessment of body composition. An alternative method known as bottle buoyancy (BB) was introduced by Katch, Hortobagyi, and Denahan in 1989. The purpose of this clinical investigation was to determine the accuracy of the BB technique using an 11-L container. Sixteen individuals (8 men, 8 women) were weighed hydrostatically using a chair/scale and the BB technique. The overall intraclass correlation coefficient for the two techniques was 0.9537. A 2-variable ANOVA was significant for gender but not for technique, and there was no interaction between variables. Thus, the BB technique appears to be an accurate substitute for the chair/scale technique for hydrostatic weighing. The BB method does not involve elaborate equipment and is portable. It could be improved with the use of multiple bottles of various volumes or a calibrated bottle to minimize the number of trials needed for accurate measurements. BB is a valuable, simple clinical tool for assessing body composition based on the principles of hydrostatic weighing and can be performed in any high school, college, or community swimming pool.
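For readers unfamiliar with the underlying calculation, the following is a hedged illustration of the standard hydrostatic-weighing arithmetic that both the chair/scale and bottle buoyancy measurements ultimately feed into; the residual-volume correction and the Siri equation are textbook assumptions, not values reported in the study.

```python
# Sketch of the hydrostatic body-composition calculation (assumed constants).
def percent_body_fat(dry_mass_kg, underwater_mass_kg, water_density=0.9957,
                     residual_volume_l=1.2, gi_gas_l=0.1):
    # Body volume from Archimedes' principle, corrected for trapped air (litres).
    body_volume_l = (dry_mass_kg - underwater_mass_kg) / water_density \
                    - (residual_volume_l + gi_gas_l)
    body_density = dry_mass_kg / body_volume_l          # kg/L = g/mL
    return 495.0 / body_density - 450.0                 # Siri (1961) equation

print(round(percent_body_fat(70.0, 3.4), 1))            # example subject, ~13.8 %
```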
Elastography in Chronic Liver Disease: Modalities, Techniques, Limitations, and Future Directions
Srinivasa Babu, Aparna; Wells, Michael L.; Teytelboym, Oleg M.; Mackey, Justin E.; Miller, Frank H.; Yeh, Benjamin M.; Ehman, Richard L.
2016-01-01
Chronic liver disease has multiple causes, many of which are increasing in prevalence. The final common pathway of chronic liver disease is tissue destruction and attempted regeneration, a pathway that triggers fibrosis and eventual cirrhosis. Assessment of fibrosis is important not only for diagnosis but also for management, prognostic evaluation, and follow-up of patients with chronic liver disease. Although liver biopsy has traditionally been considered the reference standard for assessment of liver fibrosis, noninvasive techniques are the emerging focus in this field. Ultrasound-based elastography and magnetic resonance (MR) elastography are gaining popularity as the modalities of choice for quantifying hepatic fibrosis. These techniques have been proven superior to conventional cross-sectional imaging for evaluation of fibrosis, especially in the precirrhotic stages. Moreover, elastography has added utility in the follow-up of previously diagnosed fibrosis, the assessment of treatment response, evaluation for the presence of portal hypertension (spleen elastography), and evaluation of patients with unexplained portal hypertension. In this article, a brief overview is provided of chronic liver disease and the tools used for its diagnosis. Ultrasound-based elastography and MR elastography are explored in depth, including a brief glimpse into the evolution of elastography. Elastography is based on the principle of measuring tissue response to a known mechanical stimulus. Specific elastographic techniques used to exploit this principle include MR elastography and ultrasonography-based static or quasistatic strain imaging, one-dimensional transient elastography, point shear-wave elastography, and supersonic shear-wave elastography. The advantages, limitations, and pitfalls of each modality are emphasized. ©RSNA, 2016 PMID:27689833
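The principle stated above, measuring tissue response to a known mechanical stimulus, reduces for shear-wave techniques to converting shear-wave speed into stiffness. The sketch below uses the standard incompressibility approximation (Young's modulus roughly three times the shear modulus), which is a textbook assumption and not a value taken from the article.

```python
# Stiffness from shear-wave speed: mu = rho * c^2, E ~ 3 * mu for soft tissue.
def shear_stiffness_kpa(shear_wave_speed_m_s, tissue_density_kg_m3=1000.0):
    mu = tissue_density_kg_m3 * shear_wave_speed_m_s ** 2   # shear modulus, Pa
    youngs_modulus = 3.0 * mu                                # Pa, incompressible tissue
    return youngs_modulus / 1000.0                           # kPa

# e.g. a 1.2 m/s shear wave (soft liver) vs 3.0 m/s (stiff, fibrotic liver)
print(shear_stiffness_kpa(1.2), shear_stiffness_kpa(3.0))
```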
Tsipouras, Markos G; Giannakeas, Nikolaos; Tzallas, Alexandros T; Tsianou, Zoe E; Manousou, Pinelopi; Hall, Andrew; Tsoulos, Ioannis; Tsianos, Epameinondas
2017-03-01
Collagen proportional area (CPA) extraction in liver biopsy images provides the degree of fibrosis expansion in liver tissue, which is the most characteristic histological alteration in hepatitis C virus (HCV). Assessment of the fibrotic tissue is currently based on semiquantitative staging scores such as Ishak and Metavir. Since its introduction as a fibrotic tissue assessment technique, CPA calculation based on image analysis techniques has proven to be more accurate than semiquantitative scores. However, CPA has yet to reach everyday clinical practice, since the lack of standardized and robust methods for computerized image analysis for CPA assessment have proven to be a major limitation. The current work introduces a three-stage fully automated methodology for CPA extraction based on machine learning techniques. Specifically, clustering algorithms have been employed for background-tissue separation, as well as for fibrosis detection in liver tissue regions, in the first and the third stage of the methodology, respectively. Due to the existence of several types of tissue regions in the image (such as blood clots, muscle tissue, structural collagen, etc.), classification algorithms have been employed to identify liver tissue regions and exclude all other non-liver tissue regions from CPA computation. For the evaluation of the methodology, 79 liver biopsy images have been employed, obtaining 1.31% mean absolute CPA error, with 0.923 concordance correlation coefficient. The proposed methodology is designed to (i) avoid manual threshold-based and region selection processes, widely used in similar approaches presented in the literature, and (ii) minimize CPA calculation time. Copyright © 2016 Elsevier Ireland Ltd. All rights reserved.
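A simplified, hypothetical sketch of CPA estimation by clustering is given below, loosely following the staged idea described above (background-tissue separation, then fibrosis detection); it omits the classification stage that excludes non-liver tissue regions and is not the authors' pipeline.

```python
# Illustrative two-stage clustering for CPA; stain colour assumptions are invented.
import numpy as np
from skimage import io
from sklearn.cluster import KMeans

def estimate_cpa(image_path):
    img = io.imread(image_path)[:, :, :3].reshape(-1, 3).astype(float)
    # Stage 1: separate bright background from tissue (2 clusters on colour).
    bg_labels = KMeans(n_clusters=2, n_init=10, random_state=0).fit_predict(img)
    bg_cluster = np.argmax([img[bg_labels == k].mean() for k in (0, 1)])
    tissue = img[bg_labels != bg_cluster]
    # Stage 2: within tissue, separate collagen from parenchyma; the redder
    # cluster is assumed to be the collagen stain in this example.
    fib_labels = KMeans(n_clusters=2, n_init=10, random_state=0).fit_predict(tissue)
    redness = [(tissue[fib_labels == k][:, 0] - tissue[fib_labels == k][:, 1]).mean()
               for k in (0, 1)]
    collagen = (fib_labels == int(np.argmax(redness))).sum()
    return 100.0 * collagen / len(tissue)   # CPA as a percentage of tissue area
```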
Kucha, Christopher T.; Liu, Li; Ngadi, Michael O.
2018-01-01
Fat is one of the most important traits determining the quality of pork. The composition of the fat greatly influences the quality of pork and its processed products, and contributes to defining the overall carcass value. However, establishing an efficient method for assessing fat quality parameters such as fatty acid composition, solid fat content, oxidative stability, iodine value, and fat color remains a challenge that must be addressed. Conventional methods such as visual inspection, mechanical methods, and chemical methods are used off the production line, which often results in an inaccurate representation of the process because the dynamics are lost due to the time required to perform the analysis. Consequently, rapid and non-destructive alternative methods are needed. In this paper, the traditional fat quality assessment techniques are discussed with emphasis on spectroscopic techniques as an alternative. Potential spectroscopic techniques include infrared spectroscopy, nuclear magnetic resonance and Raman spectroscopy. Hyperspectral imaging, an emerging advanced spectroscopy-based technology, is introduced and discussed in light of recent developments in the assessment of fat quality attributes. All techniques are described in terms of their operating principles and the research advances involving their application for pork fat quality parameters. Future trends for the non-destructive spectroscopic techniques are also discussed. PMID:29382092
Whatever Happened to School-Based Assessment in England's GCSEs and A Levels?
ERIC Educational Resources Information Center
Opposs, Dennis
2016-01-01
For the past 30 years, school-based assessment (SBA) has been a major feature of GCSEs and A levels, the main school examinations in England. SBA has allowed teachers to allocate marks to their students for the level of skills that they show in their work. Such skills include for example, experimental techniques in science, performance in drama…
Unterseher, Martin; Schnittler, Martin
2009-05-01
Two cultivation-based isolation techniques - the incubation of leaf fragments (fragment plating) and dilution-to-extinction culturing on malt extract agar - were compared for recovery of foliar endophytic fungi from Fagus sylvatica near Greifswald, north-east Germany. Morphological-anatomical characters of vegetative and sporulating cultures and ITS sequences were used to assign morphotypes and taxonomic information to the isolates. Data analysis included species-accumulation curves, richness estimators, multivariate statistics and null model testing. Fragment plating and extinction culturing were significantly complementary with regard to species composition, because around two-thirds of the 35 fungal taxa were isolated with only one of the two cultivation techniques. The difference in outcomes highlights the need for caution in assessing fungal biodiversity based upon single isolation techniques. The efficiency of cultivation-based studies of fungal endophytes and the estimates of species richness were significantly increased by combining the two isolation methods, when compared with a 20-year-old reference study, which needed three times more isolates with fragment plating to attain the same species richness. Intensified testing and optimisation of extinction culturing in endophyte research is advocated.
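As an illustration of the richness estimators mentioned above, the snippet below computes a bias-corrected Chao1 estimate from per-species isolate counts; the article does not state which estimator was applied, so this is indicative only and the example counts are invented.

```python
# Bias-corrected Chao1 richness estimator from species abundance counts.
from collections import Counter

def chao1(species_counts):
    """species_counts: iterable of per-species isolate counts (> 0)."""
    s_obs = len(species_counts)
    f = Counter(species_counts)
    f1, f2 = f.get(1, 0), f.get(2, 0)          # singletons and doubletons
    return s_obs + f1 * (f1 - 1) / (2.0 * (f2 + 1))

# e.g. 35 observed taxa with 12 singletons and 6 doubletons among the counts
print(chao1([1] * 12 + [2] * 6 + [5] * 17))
```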
Development of the implant surgical technique and assessment rating system
Park, Jung-Chul; Hwang, Ji-Wan; Lee, Jung-Seok; Jung, Ui-Won; Choi, Seong-Ho; Cho, Kyoo-Sung; Chai, Jung-Kiu
2012-01-01
Purpose There has been no attempt to establish an objective implant surgical evaluation protocol to assess residents' surgical competence and improve their surgical outcomes. The present study presents a newly developed assessment and rating system and simulation model that can assist the teaching staffs to evaluate the surgical events and surgical skills of residents objectively. Methods Articles published in peer-reviewed English journals were selected using several scientific databases and subsequently reviewed regarding surgical competence and assessment tools. Particularly, medical journals reporting rating and evaluation protocols for various types of medical surgeries were thoroughly analyzed. Based on these studies, an implant surgical technique assessment and rating system (iSTAR) has been developed. Also, a specialized dental typodont was developed for the valid and reliable assessment of surgery. Results The iSTAR consists of two parts including surgical information and task-specific checklists. Specialized simulation model was subsequently produced and can be used in combination with iSTAR. Conclusions The assessment and rating system provided may serve as a reference guide for teaching staffs to evaluate the residents' implant surgical techniques. PMID:22413071
NASA Astrophysics Data System (ADS)
Abdenov, A. Zh; Trushin, V. A.; Abdenova, G. A.
2018-01-01
The paper considers the questions of filling the relevant SIEM nodes based on calculations of objective assessments in order to improve the reliability of subjective expert assessments. The proposed methodology is necessary for the most accurate security risk assessment of information systems. This technique is also intended for the purpose of establishing real-time operational information protection in the enterprise information systems. Risk calculations are based on objective estimates of the adverse events implementation probabilities, predictions of the damage magnitude from information security violations. Calculations of objective assessments are necessary to increase the reliability of the proposed expert assessments.
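The objective risk estimate described above can be illustrated with a minimal sketch in which per-node risk is the product of an adverse-event probability and a predicted damage magnitude; the node names and figures below are hypothetical.

```python
# Expected-loss style risk estimate per SIEM node (illustrative values only).
def node_risk(event_probability, damage_estimate):
    return event_probability * damage_estimate

assets = {
    "web_server":   node_risk(0.15, 120_000),   # probability per year, damage in currency units
    "db_server":    node_risk(0.05, 400_000),
    "workstations": node_risk(0.30,  20_000),
}
for node, risk in sorted(assets.items(), key=lambda kv: -kv[1]):
    print(f"{node}: expected annual loss = {risk:,.0f}")
```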
Assessment of scaffold porosity: the new route of micro-CT.
Bertoldi, Serena; Farè, Silvia; Tanzi, Maria Cristina
2011-01-01
A complete morphologic characterization of porous scaffolds for tissue engineering application is fundamental, as the architectural parameters, in particular porosity, strongly affect the mechanical and biological performance of the structures. Therefore, appropriate techniques for this purpose need to be selected. Several techniques for the assessment of scaffold porosity have been proposed, including Scanning Electron Microscopy observation, mercury and liquid extrusion porosimetry, gas pycnometry, and capillary flow porometry. Each of these techniques has several drawbacks, and a combination of different techniques is often required to achieve an in-depth study of the morphologic properties of the scaffold. A single technique is often limited and suitable only for the assessment of a specific parameter. To overcome this limit, the most attractive option would be a single nondestructive technique that is nonetheless capable of providing a comprehensive set of data. It appears that micro-computed tomography (micro-CT) can potentially fulfill this role. Initially developed to characterize the 3D trabecular microarchitecture of bone, its use has recently been exploited by researchers for the morphologic characterization of porous biomaterials, as it enables a full assessment of the porous structures both in terms of pore size and interconnected porosity. This review aims to explore the use of micro-CT in scaffold characterization, comparing it with other previously developed techniques; we also focus on the contribution of this innovative tool to the development of scaffold-based tissue engineering applications.
Assessing clutter reduction in parallel coordinates using image processing techniques
NASA Astrophysics Data System (ADS)
Alhamaydh, Heba; Alzoubi, Hussein; Almasaeid, Hisham
2018-01-01
Information visualization has appeared as an important research field for multidimensional data and correlation analysis in recent years. Parallel coordinates (PCs) are one of the popular techniques to visualize high-dimensional data. A problem with the PC technique is that it suffers from crowding, a form of clutter that hides important data and obfuscates information. Earlier research has been conducted to reduce clutter without loss of data content. We introduce the use of image processing techniques as an approach for assessing the performance of clutter reduction techniques in PCs. We use histogram analysis as our first measure, where the mean feature of the color histograms of the possible alternative orderings of coordinates for the PC images is calculated and compared. The second measure is the contrast feature extracted from the texture of PC images based on gray-level co-occurrence matrices. The results show that the best PC image is the one that has the minimal mean value of the color histogram feature and the maximal contrast value of the texture feature. In addition to its simplicity, the proposed assessment method has the advantage of objectively assessing alternative orderings of PC visualization.
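The two image-based measures described above can be sketched as follows; the function names assume scikit-image 0.19 or later (graycomatrix/graycoprops), and the "mean feature" of the colour histogram is interpreted here as the histogram-weighted mean intensity, which is an assumption of this sketch.

```python
# Clutter features for a rendered parallel-coordinates image (illustrative).
import numpy as np
from skimage import io, color, img_as_ubyte
from skimage.feature import graycomatrix, graycoprops

def clutter_features(pc_image_path):
    rgb = io.imread(pc_image_path)[:, :, :3]
    # Measure 1: histogram-weighted mean intensity, averaged over RGB channels.
    means = []
    for c in range(3):
        hist, edges = np.histogram(rgb[:, :, c].ravel(), bins=256, range=(0, 256))
        centers = (edges[:-1] + edges[1:]) / 2
        means.append((hist * centers).sum() / hist.sum())
    hist_mean = float(np.mean(means))
    # Measure 2: GLCM contrast of the greyscale texture.
    gray = img_as_ubyte(color.rgb2gray(rgb))
    glcm = graycomatrix(gray, distances=[1], angles=[0], levels=256,
                        symmetric=True, normed=True)
    contrast = float(graycoprops(glcm, "contrast")[0, 0])
    return hist_mean, contrast

# Per the study, the preferred axis ordering is the image with the minimal
# histogram-mean value and the maximal contrast value.
```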
DOE Office of Scientific and Technical Information (OSTI.GOV)
Moore, D.G.; Sorensen, N.R.
1998-02-01
This report presents a nondestructive inspection assessment of eddy current and electrochemical analysis to separate Inconel alloys from stainless steel alloys, as well as an evaluation of cleaning techniques to remove a thermal oxide layer on aircraft exhaust components. The results of this assessment are presented in terms of how effectively each technique classifies a known exhaust material. Results indicate that either inspection technique can separate Inconel and stainless steel alloys. Based on the experiments conducted, the electrochemical spot test is the optimum for use by airframe and powerplant mechanics. A spot test procedure is proposed for incorporation into the Federal Aviation Administration Advisory Circular 65-9A Airframe & Powerplant Mechanic - General Handbook. 3 refs., 70 figs., 7 tabs.
An assessment of PERT as a technique for schedule planning and control
NASA Technical Reports Server (NTRS)
Sibbers, C. W.
1982-01-01
The PERT technique including the types of reports which can be computer generated using the NASA/LaRC PPARS System is described. An assessment is made of the effectiveness of PERT on various types of efforts as well as for specific purposes, namely, schedule planning, schedule analysis, schedule control, monitoring contractor schedule performance, and management reporting. This assessment is based primarily on the author's knowledge of the usage of PERT by NASA/LaRC personnel since the early 1960's. Both strengths and weaknesses of the technique for various applications are discussed. It is intended to serve as a reference guide for personnel performing project planning and control functions and technical personnel whose responsibilities either include schedule planning and control or require a general knowledge of the subject.
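For readers new to the technique, the core PERT three-point estimate that such schedule reports are built on is shown below; the activity names and durations are invented for illustration.

```python
# PERT expected duration and standard deviation from three-point estimates.
def pert_estimate(optimistic, most_likely, pessimistic):
    expected = (optimistic + 4 * most_likely + pessimistic) / 6.0
    std_dev = (pessimistic - optimistic) / 6.0
    return expected, std_dev

activities = {"design": (4, 6, 10), "fabrication": (8, 12, 20), "test": (2, 3, 6)}
for name, (o, m, p) in activities.items():
    te, sd = pert_estimate(o, m, p)
    print(f"{name}: expected {te:.1f} weeks (std dev {sd:.1f})")
```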
Using qPCR for Water Microbial Risk Assessments
Microbial risk assessment (MRA) has traditionally utilized microbiological data that was obtained by culture-based techniques that are expensive and time consuming. With the advent of PCR methods there is a realistic opportunity to conduct MRA studies economically, in less time,...
Microplastics in sediments: A review of techniques, occurrence and effects.
Van Cauwenberghe, Lisbeth; Devriese, Lisa; Galgani, François; Robbens, Johan; Janssen, Colin R
2015-10-01
Microplastics are omnipresent in the marine environment and sediments are hypothesized to be major sinks of these plastics. Here, over 100 articles spanning the last 50 years are reviewed with the following objectives: (i) to evaluate current microplastic extraction techniques, (ii) to discuss the occurrence and worldwide distribution of microplastics in sediments, and (iii) to make a comprehensive assessment of the possible adverse effects of this type of pollution on marine organisms. Based on this review we propose future research needs and conclude that there is a clear need for standardized techniques, unified reporting units and more realistic effect assessments. Copyright © 2015 Elsevier Ltd. All rights reserved.
Determination of Ammonia in Household Cleaners: An Instrumental Analysis Experiment.
ERIC Educational Resources Information Center
Graham, Richard C.; DePew, Steven
1983-01-01
Briefly discusses three techniques for assessing amount of ammonia present in household cleaners. Because of disadvantages with these methods, the thermometric titration technique is suggested in which students judge the best buy based on relative cost of ammonia present in samples. Laboratory procedures, typical results, and reactions involved…
Modelling and Simulation for Requirements Engineering and Options Analysis
2010-05-01
should be performed to work successfully in the domain; and process-based techniques model the processes that occur in the work domain. Can the current technique for developing simulation models for assessments…
The GenTechnique Project: Developing an Open Environment for Learning Molecular Genetics.
ERIC Educational Resources Information Center
Calza, R. E.; Meade, J. T.
1998-01-01
The GenTechnique project at Washington State University uses a networked learning environment for molecular genetics learning. The project is developing courseware featuring animation, hyper-link controls, and interactive self-assessment exercises focusing on fundamental concepts. The first pilot course featured a Web-based module on DNA…
Probst, Yasmine; Nguyen, Duc Thanh; Tran, Minh Khoi; Li, Wanqing
2015-01-01
Dietary assessment, while traditionally based on pen-and-paper, is rapidly moving towards automatic approaches. This study describes an Australian automatic food record method and its prototype for dietary assessment via the use of a mobile phone and techniques of image processing and pattern recognition. Common visual features including scale invariant feature transformation (SIFT), local binary patterns (LBP), and colour are used for describing food images. The popular bag-of-words (BoW) model is employed for recognizing the images taken by a mobile phone for dietary assessment. Technical details are provided together with discussions on the issues and future work. PMID:26225994
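One of the listed descriptors can be illustrated with a short, hypothetical sketch: uniform local binary pattern (LBP) histograms used as a fixed-length feature for a food-category classifier. This is not the authors' prototype, and the training-data variable names are assumptions.

```python
# LBP-histogram feature extraction for food images, with a linear SVM classifier.
import numpy as np
from skimage import io, color
from skimage.feature import local_binary_pattern
from sklearn.svm import LinearSVC

P, R = 8, 1   # LBP neighbourhood size and radius

def lbp_histogram(image_path):
    gray = color.rgb2gray(io.imread(image_path)[:, :, :3])
    lbp = local_binary_pattern(gray, P, R, method="uniform")
    hist, _ = np.histogram(lbp, bins=P + 2, range=(0, P + 2), density=True)
    return hist

# train_paths / train_labels are assumed to exist; each image shows one food item.
# X = np.array([lbp_histogram(p) for p in train_paths])
# clf = LinearSVC().fit(X, train_labels)
# predicted = clf.predict([lbp_histogram("unknown_meal.jpg")])
```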
Meyer, Frans J C; Davidson, David B; Jakobus, Ulrich; Stuchly, Maria A
2003-02-01
A hybrid finite-element method (FEM)/method of moments (MoM) technique is employed for specific absorption rate (SAR) calculations in a human phantom in the near field of a typical group special mobile (GSM) base-station antenna. The MoM is used to model the metallic surfaces and wires of the base-station antenna, and the FEM is used to model the heterogeneous human phantom. The advantages of each of these frequency domain techniques are, thus, exploited, leading to a highly efficient and robust numerical method for addressing this type of bioelectromagnetic problem. The basic mathematical formulation of the hybrid technique is presented. This is followed by a discussion of important implementation details-in particular, the linear algebra routines for sparse, complex FEM matrices combined with dense MoM matrices. The implementation is validated by comparing results to MoM (surface equivalence principle implementation) and finite-difference time-domain (FDTD) solutions of human exposure problems. A comparison of the computational efficiency of the different techniques is presented. The FEM/MoM implementation is then used for whole-body and critical-organ SAR calculations in a phantom at different positions in the near field of a base-station antenna. This problem cannot, in general, be solved using the MoM or FDTD due to computational limitations. This paper shows that the specific hybrid FEM/MoM implementation is an efficient numerical tool for accurate assessment of human exposure in the near field of base-station antennas.
Virtual environment assessment for laser-based vision surface profiling
NASA Astrophysics Data System (ADS)
ElSoussi, Adnane; Al Alami, Abed ElRahman; Abu-Nabah, Bassam A.
2015-03-01
Oil and gas businesses have been raising the demand from original equipment manufacturers (OEMs) to implement a reliable metrology method in assessing surface profiles of welds before and after grinding. This certainly mandates the deviation from the commonly used surface measurement gauges, which are not only operator dependent, but also limited to discrete measurements along the weld. Due to its potential accuracy and speed, the use of laser-based vision surface profiling systems have been progressively rising as part of manufacturing quality control. This effort presents a virtual environment that lends itself for developing and evaluating existing laser vision sensor (LVS) calibration and measurement techniques. A combination of two known calibration techniques is implemented to deliver a calibrated LVS system. System calibration is implemented virtually and experimentally to scan simulated and 3D printed features of known profiles, respectively. Scanned data is inverted and compared with the input profiles to validate the virtual environment capability for LVS surface profiling and preliminary assess the measurement technique for weld profiling applications. Moreover, this effort brings 3D scanning capability a step closer towards robust quality control applications in a manufacturing environment.
Bujar, Magdalena; McAuslane, Neil; Walker, Stuart R.; Salek, Sam
2017-01-01
Introduction: Although pharmaceutical companies, regulatory authorities, and health technology assessment (HTA) agencies have been increasingly using decision-making frameworks, it is not certain whether these enable better quality decision making. This could be addressed by formally evaluating the quality of decision-making process within those organizations. The aim of this literature review was to identify current techniques (tools, questionnaires, surveys, and studies) for measuring the quality of the decision-making process across the three stakeholders. Methods: Using MEDLINE, Web of Knowledge, and other Internet-based search engines, a literature review was performed to systematically identify techniques for assessing quality of decision making in medicines development, regulatory review, and HTA. A structured search was applied using key words and a secondary review was carried out. In addition, the measurement properties of each technique were assessed and compared. Ten Quality Decision-Making Practices (QDMPs) developed previously were then used as a framework for the evaluation of techniques identified in the review. Due to the variation in studies identified, meta-analysis was inappropriate. Results: This review identified 13 techniques, where 7 were developed specifically to assess decision making in medicines' development, regulatory review, or HTA; 2 examined corporate decision making, and 4 general decision making. Regarding how closely each technique conformed to the 10 QDMPs, the 13 techniques assessed a median of 6 QDMPs, with a mode of 3 QDMPs. Only 2 techniques evaluated all 10 QDMPs, namely the Organizational IQ and the Quality of Decision Making Orientation Scheme (QoDoS), of which only one technique, QoDoS could be applied to assess decision making of both individuals and organizations, and it possessed generalizability to capture issues relevant to companies as well as regulatory authorities. Conclusion: This review confirmed a general paucity of research in this area, particularly regarding the development and systematic application of techniques for evaluating quality decision making, with no consensus around a gold standard. This review has identified QoDoS as the most promising available technique for assessing decision making in the lifecycle of medicines and the next steps would be to further test its validity, sensitivity, and reliability. PMID:28443022
Practical Team-Based Learning from Planning to Implementation
Bell, Edward; Eng, Marty; Fuentes, David G.; Helms, Kristen L.; Maki, Erik D.; Vyas, Deepti
2015-01-01
Team-based learning (TBL) helps instructors develop an active teaching approach for the classroom through group work. The TBL infrastructure engages students in the learning process through the Readiness Assessment Process, problem-solving through team discussions, and peer feedback to ensure accountability. This manuscript describes the benefits and barriers of TBL, and the tools necessary for developing, implementing, and critically evaluating the technique within coursework in a user-friendly method. Specifically, the manuscript describes the processes underpinning effective TBL development, preparation, implementation, assessment, and evaluation, as well as practical techniques and advice from authors’ classroom experiences. The paper also highlights published articles in the area of TBL in education, with a focus on pharmacy education. PMID:26889061
Smart Sensor-Based Motion Detection System for Hand Movement Training in Open Surgery.
Sun, Xinyao; Byrns, Simon; Cheng, Irene; Zheng, Bin; Basu, Anup
2017-02-01
We introduce a smart sensor-based motion detection technique for objective measurement and assessment of surgical dexterity among users at different experience levels. The goal is to allow trainees to evaluate their performance based on a reference model shared through communication technology, e.g., the Internet, without the physical presence of an evaluating surgeon. While in the current implementation we used a Leap Motion Controller to obtain motion data for analysis, our technique can be applied to motion data captured by other smart sensors, e.g., OptiTrack. To differentiate motions captured from different participants, measurement and assessment in our approach are achieved using two strategies: (1) low level descriptive statistical analysis, and (2) Hidden Markov Model (HMM) classification. Based on our surgical knot tying task experiment, we can conclude that finger motions generated from users with different surgical dexterity, e.g., expert and novice performers, display differences in path length, number of movements and task completion time. In order to validate the discriminatory ability of HMM for classifying different movement patterns, a non-surgical task was included in our analysis. Experimental results demonstrate that our approach had 100 % accuracy in discriminating between expert and novice performances. Our proposed motion analysis technique applied to open surgical procedures is a promising step towards the development of objective computer-assisted assessment and training systems.
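A hedged sketch of the HMM classification strategy described above follows: one Gaussian HMM is fitted per skill level and a new motion sequence is assigned to the model with the highest log-likelihood. It uses the hmmlearn package, and feature extraction from the raw sensor stream is assumed to have been done elsewhere.

```python
# Per-class Gaussian HMMs; classification by maximum log-likelihood.
import numpy as np
from hmmlearn.hmm import GaussianHMM

def fit_class_model(sequences, n_states=5):
    """sequences: list of (n_frames, n_features) arrays for one class."""
    X = np.vstack(sequences)
    lengths = [len(s) for s in sequences]
    model = GaussianHMM(n_components=n_states, covariance_type="diag", n_iter=100)
    return model.fit(X, lengths)

def classify(sequence, models):
    """models: dict such as {'expert': hmm_expert, 'novice': hmm_novice}."""
    return max(models, key=lambda label: models[label].score(sequence))
```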
NASA Technical Reports Server (NTRS)
Dasarathy, B. V.
1976-01-01
An algorithm is proposed for dimensionality reduction in the context of clustering techniques based on histogram analysis. The approach is based on an evaluation of the hills and valleys in the unidimensional histograms along the different features and provides an economical means of assessing the significance of the features in a nonparametric unsupervised data environment. The method has relevance to remote sensing applications.
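The idea of ranking features by the hills and valleys of their one-dimensional histograms can be illustrated with the hypothetical sketch below; the peak-prominence threshold is an arbitrary choice for the example, and the sketch is not the algorithm exactly as specified in the report.

```python
# Rank features by the number of clear modes ("hills") in their histograms.
import numpy as np
from scipy.signal import find_peaks

def feature_significance(data, bins=64, prominence=0.02):
    """data: (n_samples, n_features) array; returns feature indices, best first."""
    scores = []
    for j in range(data.shape[1]):
        hist, _ = np.histogram(data[:, j], bins=bins, density=True)
        peaks, _ = find_peaks(hist, prominence=prominence * hist.max())
        scores.append(len(peaks))          # more hills suggests more cluster structure
    return np.argsort(scores)[::-1]
```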
Bhaduri, Anirban; Ghosh, Dipak
2016-01-01
The cardiac dynamics during meditation is explored quantitatively with two chaos-based non-linear techniques, viz. multi-fractal detrended fluctuation analysis and visibility network analysis. The data used are the instantaneous heart rate (in beats/minute) of subjects performing Kundalini Yoga and Chi meditation from PhysioNet. The results show consistent differences between the quantitative parameters obtained by both the analysis techniques. This indicates an interesting phenomenon of change in the complexity of the cardiac dynamics during meditation, supported with quantitative parameters. The results also provide preliminary evidence that these techniques can be used as a measure of physiological impact on subjects performing meditation. PMID:26909045
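One of the two techniques named above, the natural visibility graph, can be sketched briefly; the construction follows the standard visibility criterion of Lacasa et al. (2008), and using the mean node degree as a summary statistic is an illustrative choice rather than the paper's exact parameter.

```python
# Natural visibility graph of a heart-rate series (O(n^2) reference construction).
import numpy as np

def visibility_graph_degrees(y):
    """y: 1-D array of heart-rate samples; returns the degree of each node."""
    n = len(y)
    degree = np.zeros(n, dtype=int)
    for a in range(n - 1):
        for b in range(a + 1, n):
            # (a, b) are connected if every intermediate sample stays strictly
            # below the straight line joining them.
            visible = all(y[c] < y[b] + (y[a] - y[b]) * (b - c) / (b - a)
                          for c in range(a + 1, b))
            if visible:
                degree[a] += 1
                degree[b] += 1
    return degree

# mean degree as a simple complexity indicator for a segment of the series
# print(visibility_graph_degrees(heart_rate_series).mean())
```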
Farquharson, Barbara; Johnston, Marie; Smith, Karen; Williams, Brian; Treweek, Shaun; Dombrowski, Stephan U; Dougall, Nadine; Abhyankar, Purva; Grindle, Mark
2017-05-01
To evaluate the efficacy of a behaviour change technique-based intervention and compare two possible modes of delivery (text + visual and text-only) with usual care. Patient delay prevents many people from achieving optimal benefit of time-dependent treatments for acute coronary syndrome. Reducing delay would reduce mortality and morbidity, but interventions to change behaviour have had mixed results. Systematic inclusion of behaviour change techniques or a visual mode of delivery might improve the efficacy of interventions. A three-arm web-based, parallel randomized controlled trial of a theory-based intervention. The intervention comprises 12 behaviour change techniques systematically identified following systematic review and a consensus exercise undertaken with behaviour change experts. We aim to recruit n = 177 participants who have experienced acute coronary syndrome in the previous 6 months from a National Health Service Hospital. Consenting participants will be randomly allocated in equal numbers to one of three study groups: i) usual care, ii) usual care plus text-only behaviour change technique-based intervention or iii) usual care plus text + visual behaviour change technique-based intervention. The primary outcome will be the change in intention to phone an ambulance immediately with symptoms of acute coronary syndrome ≥15-minute duration, assessed using two randomized series of eight scenarios representing varied symptoms before and after delivery of the interventions or control condition (usual care). Funding granted January 2014. Positive results changing intentions would lead to a randomized controlled trial of the behaviour change intervention in clinical practice, assessing patient delay in the event of actual symptoms. Registered at ClinicalTrials.gov: NCT02820103. © 2016 John Wiley & Sons Ltd.
Development of fuzzy air quality index using soft computing approach.
Mandal, T; Gorai, A K; Pathak, G
2012-10-01
Proper assessment of air quality status in an atmosphere based on limited observations is an essential task for meeting the goals of environmental management. A number of classification methods are available for estimating the changing status of air quality. However, discrepancies frequently arise from the air quality criteria employed and the vagueness or fuzziness embedded in decision-making output values. Owing to inherent imprecision, difficulties always exist in conventional methodologies such as the air quality index when describing integrated air quality conditions with respect to various pollutant parameters and times of exposure. In recent years, fuzzy logic-based methods have been shown to be appropriate for addressing uncertainty and subjectivity in environmental issues. In the present study, a methodology based on fuzzy inference systems (FIS) to assess air quality is proposed. This paper presents a comparative study assessing the status of air quality using the fuzzy logic technique and the conventional technique. The findings clearly indicate that the FIS may successfully harmonize inherent discrepancies and interpret complex conditions.
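A compact, hypothetical sketch of the fuzzy inference idea follows: pollutant concentrations are fuzzified with triangular membership functions, simple rules are combined with min/max operators, and a crisp index is recovered by a weighted average. The breakpoints, rules, and output levels are invented for illustration and are not taken from the paper.

```python
# Minimal Sugeno-style fuzzy air quality index (all parameters illustrative).
def tri(x, a, b, c):
    """Triangular membership function with feet a, c and peak b."""
    if x <= a or x >= c:
        return 0.0
    return (x - a) / (b - a) if x <= b else (c - x) / (c - b)

def fuzzy_air_quality(pm10, so2):
    # Fuzzification: membership of each pollutant in 'low' and 'high'.
    pm_low, pm_high = tri(pm10, -1, 0, 80), tri(pm10, 50, 150, 400)
    so_low, so_high = tri(so2, -1, 0, 60), tri(so2, 40, 120, 300)
    # Rules: good if both low; poor if either high (crisp outputs 25 and 85).
    w_good = min(pm_low, so_low)
    w_poor = max(pm_high, so_high)
    if w_good + w_poor == 0:
        return 50.0   # fall-back mid value when no rule fires
    return (25 * w_good + 85 * w_poor) / (w_good + w_poor)

print(round(fuzzy_air_quality(pm10=30, so2=20), 1))
```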
Mid Term Progress Report: Desertification Assessment and Monitoring in China Based on Remote Sensing
NASA Astrophysics Data System (ADS)
Gao, Zhihai; del Barrio, Gabriel; Li, Xiaosong; Wang, Bengyu; Puigdefabregas, Juan; Sanjuan, Maria E.; Bai, Lina; Wu, Junjun; Sun, Bin; Li, Changlong
2014-11-01
The objective of Dragon 3 Project 10367 is the development of techniques research for desertification assessment and monitoring in China using remote sensing data in combination with climate and environmental-related data. The main achievements acquired since 2012 could be summarized as follows: (1) Photosynthetic vegetation (PV) and non-photosynthetic vegetation (NPV) fractions were retrieved separately by utilizing the Auto Monte Carlo Unmixing technique (AutoMCU), based on BJ-1 data and a field-measured spectral library. (2) The accuracy of sandy land classification was as high as 81.52% when the object-oriented method and Support Vector Machine (SVM) classifiers were used. (3) A new monthly net primary productivity (NPP) dataset from 2002 to 2010 for the whole of China was established with Envisat-MERIS fraction of absorbed photosynthetically active radiation (FPAR) data. (4) The 2dRUE proved to be a good indicator for land degradation, based on which land degradation status in the general potential extent of desertification in China (PEDC) was assessed preliminarily.
Soil solution extraction techniques for microbial ecotoxicity testing: a comparative evaluation.
Tiensing, T; Preston, S; Strachan, N; Paton, G I
2001-02-01
The suitability of two different techniques (centrifugation and Rhizon sampler) for obtaining the interstitial pore water of soil (soil solution), integral to the ecotoxicity assessment of metal contaminated soil, was investigated by combining chemical analyses and a luminescence-based microbial biosensor. Two different techniques, centrifugation and Rhizon sampler, were used to extract the soil solution from Insch (a loamy sand) and Boyndie (a sandy loam) soils, which had been amended with different concentrations of Zn and Cd. The concentrations of dissolved organic carbon (DOC), major anions (F-, Cl-, NO3-, SO4(2-)) and major cations (K+, Mg2+, Ca2+) in the soil solutions varied depending on the extraction technique used. Overall, the concentrations of Zn and Cd were significantly higher in the soil solution extracted using the centrifugation technique compared with that extracted using the Rhizon sampler technique. Furthermore, the differences observed between the two extraction techniques depended on the type of soil from which the solution was being extracted. The luminescence-based biosensor Escherichia coli HB101 pUCD607 was shown to respond to the free metal concentrations in the soil solutions and showed that different toxicities were associated with each soil, depending on the technique used to extract the soil solution. This study highlights the need to characterise the type of extraction technique used to obtain the soil solution for ecotoxicity testing in order that a representative ecotoxicity assessment can be carried out.
Problem-based learning in laboratory medicine resident education: a satisfaction survey.
Lepiller, Quentin; Solis, Morgane; Velay, Aurélie; Gantner, Pierre; Sueur, Charlotte; Stoll-Keller, Françoise; Barth, Heidi; Fafi-Kremer, Samira
2017-04-01
Theoretical knowledge in biology and medicine plays a substantial role in laboratory medicine resident education. In this study, we assessed the contribution of problem-based learning (PBL) to improve the training of laboratory medicine residents during their internship in the department of virology, Strasbourg University Hospital, France. We compared the residents' satisfaction regarding an educational program based on PBL and a program based on lectures and presentations. PBL induced a high level of satisfaction (100%) among residents compared to lectures and presentations (53%). The main advantages of this technique were to create a situational interest regarding virological problems, to boost the residents' motivation and to help them identify the most relevant learning objectives in virology. However, it appears pertinent to educate the residents in appropriate bibliographic research techniques prior to PBL use and to monitor their learning by regular formative assessment sessions.
Dynamic frame resizing with convolutional neural network for efficient video compression
NASA Astrophysics Data System (ADS)
Kim, Jaehwan; Park, Youngo; Choi, Kwang Pyo; Lee, JongSeok; Jeon, Sunyoung; Park, JeongHoon
2017-09-01
In the past, video codecs such as VC-1 and H.263 used techniques that encode reduced-resolution video and restore the original resolution at the decoder to improve coding efficiency. The techniques in VC-1 and H.263 Annex Q are called dynamic frame resizing and reduced-resolution update mode, respectively. However, these techniques have not been widely used because of limited performance improvements that appear only under specific conditions. In this paper, a video frame resizing (reduce/restore) technique based on machine learning is proposed to improve coding efficiency. The proposed method produces low-resolution video with a convolutional neural network (CNN) in the encoder and reconstructs the original resolution with a CNN in the decoder. The proposed method shows improved subjective performance on the high-resolution videos that dominate current consumption. In order to assess the subjective quality of the proposed method, Video Multi-method Assessment Fusion (VMAF), which has shown high reliability among subjective measurement tools, was used as the subjective metric. Moreover, to assess general performance, diverse bitrates were tested. Experimental results showed that the VMAF-based BD-rate was improved by about 51% compared to conventional HEVC. In particular, VMAF values were significantly improved at low bitrates. When the method was subjectively tested, it also showed better visual quality at similar bit rates.
NASA Astrophysics Data System (ADS)
Stam, Christina N.; Bruckner, James; Spry, J. Andy; Venkateswaran, Kasthuri; La Duc, Myron T.
2012-07-01
Current assessments of bioburden embedded in spacecraft materials are based on work performed in the Viking era (1970s), and the ability to culture organisms extracted from such materials. To circumvent the limitations of such approaches, DNA-based techniques were evaluated alongside established culturing techniques to determine the recovery and survival of bacterial spores encapsulated in spacecraft-qualified polymer materials. Varying concentrations of Bacillus pumilus SAFR-032 spores were completely embedded in silicone epoxy. An organic dimethylacetamide-based solvent was used to digest the epoxy, and spore recovery was evaluated via gyrB-targeted qPCR, direct agar plating, most probable number analysis, and microscopy. Although full-strength solvent was shown to inhibit the germination and/or outgrowth of spores, dilution in excess of 100-fold allowed recovery with no significant decrease in cultivability. Similarly, qPCR (quantitative PCR) detection sensitivities as low as ~10³ CFU ml⁻¹ were achieved upon removal of inhibitory substances associated with the epoxy and/or solvent. These detection and enumeration methods show promise for use in assessing the embedded bioburden of spacecraft hardware.
Ohno, Yoshiharu; Koyama, Hisanobu; Lee, Ho Yun; Miura, Sachiko; Yoshikawa, Takeshi; Sugimura, Kazuro
2016-01-01
Assessment of regional pulmonary perfusion as well as nodule and tumor perfusions in various pulmonary diseases are currently performed by means of nuclear medicine studies requiring radioactive macroaggregates, dual-energy computed tomography (CT), and dynamic first-pass contrast-enhanced perfusion CT techniques and unenhanced and dynamic first-pass contrast enhanced perfusion magnetic resonance imaging (MRI), as well as time-resolved three-dimensional or four-dimensional contrast-enhanced magnetic resonance angiography (MRA). Perfusion scintigraphy, single-photon emission tomography (SPECT) and SPECT fused with CT have been established as clinically available scintigraphic methods; however, they are limited by perfusion information with poor spatial resolution and other shortcomings. Although positron emission tomography with 15O water can measure absolute pulmonary perfusion, it requires a cyclotron for generation of a tracer with an extremely short half-life (2 min), and can only be performed for academic purposes. Therefore, clinicians are concentrating their efforts on the application of CT-based and MRI-based quantitative and qualitative perfusion assessment to various pulmonary diseases. This review article covers 1) the basics of dual-energy CT and dynamic first-pass contrast-enhanced perfusion CT techniques, 2) the basics of time-resolved contrast-enhanced MRA and dynamic first-pass contrast-enhanced perfusion MRI, and 3) clinical applications of contrast-enhanced CT- and MRI-based perfusion assessment for patients with pulmonary nodule, lung cancer, and pulmonary vascular diseases. We believe that these new techniques can be useful in routine clinical practice for not only thoracic oncology patients, but also patients with different pulmonary vascular diseases. PMID:27523813
ERIC Educational Resources Information Center
Krain, Matthew
2016-01-01
This study revisits case learning's effects on student engagement and assesses student learning as a result of the use of case studies and problem-based learning. The author replicates a previous study that used indirect assessment techniques to get at case learning's impact, and then extends the analysis using a pre- and post-test experimental…
NASA Astrophysics Data System (ADS)
Gani, M. R. A.; Nazir, F.; Pawiro, S. A.; Soejoko, D. S.
2016-03-01
Suspicion of coronary heart disease can be confirmed by observing the function of the left ventricular cardiac muscle with Myocardial Perfusion Imaging (MPI) techniques. Perfusion function itself is indicated by the uptake of a radiopharmaceutical tracer. Thirty-one patients undergoing MPI examination at Gatot Soebroto Hospital were studied using 99mTc-sestamibi radiopharmaceutical under stress and rest conditions. Stress was stimulated by physical exercise or a pharmacological agent. Two hours later, on the same day, the patients were imaged under the rest condition. The difference in uptake percentage between stress and rest conditions was used to determine perfusion malfunction due to ischemia or infarct. Degradation of cardiac function was determined based on image-based assessment of five segments of the left ventricle. As a result, 8 (25.8%) patients had normal myocardial perfusion and 11 (35.5%) patients were suspected of having partial ischemia. Total ischemia occurred in 8 (25.8%) patients with reversible and irreversible ischemia, and the remaining 4 (12.9%) patients had partial infarct, characterized by a perfusion percentage ≤50%. It is concluded that the MPI technique of image-based assessment of the uptake percentage difference between stress and rest conditions can be employed to predict abnormal perfusion as complementary information for diagnosing cardiac function.
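A simplified, hypothetical decision rule showing how per-segment stress and rest uptake percentages might be turned into the categories above is sketched below; apart from the ≤50% infarct criterion mentioned in the abstract, the thresholds are assumptions made for the example only.

```python
# Illustrative per-segment classification from stress/rest uptake percentages.
def classify_segment(stress_uptake_pct, rest_uptake_pct, normal_cutoff=70.0,
                     reversibility_margin=10.0):
    if stress_uptake_pct >= normal_cutoff and rest_uptake_pct >= normal_cutoff:
        return "normal"
    if rest_uptake_pct <= 50.0 and (rest_uptake_pct - stress_uptake_pct) < reversibility_margin:
        return "infarct (fixed defect)"
    if (rest_uptake_pct - stress_uptake_pct) >= reversibility_margin:
        return "ischemia (reversible defect)"
    return "equivocal"

# five left-ventricular segments for one patient: (stress %, rest %)
segments = [(82, 85), (55, 72), (45, 48), (60, 65), (75, 78)]
print([classify_segment(s, r) for s, r in segments])
```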
Skin blotting: a noninvasive technique for evaluating physiological skin status.
Minematsu, Takeo; Horii, Motoko; Oe, Makoto; Sugama, Junko; Mugita, Yuko; Huang, Lijuan; Nakagami, Gojiro; Sanada, Hiromi
2014-06-01
The skin performs important structural and physiological functions, and skin assessment represents an important step in identifying skin problems. Although noninvasive techniques for assessing skin status exist, no such techniques for monitoring its physiological status are available. This study aimed to develop a novel skin-assessment technique known as skin blotting, based on the leakage of secreted proteins from inside the skin following overhydration in mice. The applicability of this technique was further investigated in a clinical setting. Skin blotting involves 2 steps: collecting proteins by attaching a damp nitrocellulose membrane to the surface of the skin, and immunostaining the collected proteins. The authors implanted fluorescein-conjugated dextran (F-DEX)-containing agarose gels into mice and detected the tissue distribution of F-DEX under different blotting conditions. They also analyzed the correlations between inflammatory cytokine secretion and leakage following ultraviolet irradiation in mice and in relation to body mass index in humans. The F-DEX in mice was distributed in the deeper and shallower layers of skin and leaked through the transfollicular and transepidermal routes, respectively. Ultraviolet irradiation induced tumor necrosis factor secretion in the epidermis in mice, which was detected by skin blotting, whereas follicular tumor necrosis factor was associated with body mass index in obese human subjects. These results support the applicability of skin blotting for skin assessment. Skin blotting represents a noninvasive technique for assessing skin physiology and has potential as a predictive and diagnostic tool for skin disorders.
Values of Local Wisdom: A Potential to Develop an Assessment and Remedial
ERIC Educational Resources Information Center
Toharudin, Uus; Kurniawan, Iwan Setia
2017-01-01
The development of assessment and remedial instruments needs to be done because they are an important part of a learning process. This study aimed to describe the ability of student teachers of biology to develop assessment and remedial instruments based on local wisdom, using a quasi-experimental research method with quantitative descriptive analysis techniques. The research…
ERIC Educational Resources Information Center
Morales-Martinez, Guadalupe Elizabeth; Lopez-Ramirez, Ernesto Octavio; Castro-Campos, Claudia; Villarreal-Treviño, Maria Guadalupe; Gonzales-Trujillo, Claudia Jaquelina
2017-01-01
Empirical directions to innovate e-assessments and to support the theoretical development of e-learning are discussed by presenting a new learning assessment system based on cognitive technology. Specifically, this system encompassing trained neural nets that can discriminate between students who successfully integrated new knowledge course…
Avoiding treatment bias of REDD+ monitoring by sampling with partial replacement
Michael Kohl; Charles T Scott; Andrew J Lister; Inez Demon; Daniel Plugge
2015-01-01
Implementing REDD+ renders the development of a measurement, reporting and verification (MRV) system necessary to monitor carbon stock changes. MRV systems generally apply a combination of remote sensing techniques and in-situ field assessments. In-situ assessments can be based on 1) permanent plots, which are assessed on all successive occasions, 2) temporary plots,...
International NMR-based Environmental Metabolomics Intercomparison Exercise
Several fundamental requirements must be met so that NMR-based metabolomics and the related technique of metabonomics can be formally adopted into environmental monitoring and chemical risk assessment. Here we report an intercomparison exercise which has evaluated the effectivene...
Villamonte-Chevalier, A; van Bree, H; Broeckx, Bjg; Dingemanse, W; Soler, M; Van Ryssen, B; Gielen, I
2015-09-25
Diagnostic imaging is essential to assess the lame patient; lesions of the elbow joint have traditionally been evaluated radiographically; however, computed tomography (CT) has been suggested as a useful technique to diagnose various elbow pathologies. The primary objective of this study was to determine the sensitivity and specificity of CT to assess medial coronoid disease (MCD), using arthroscopy as the gold standard. The secondary objective was to ascertain the radiographic sensitivity and specificity for MCD compared with CT. For this study 180 elbow joints were assessed, of which 141 had been examined with radiography, CT and arthroscopy, and 39 joints had radiographic and CT assessment. Sensitivity and specificity were calculated for CT and radiographic findings using available statistical software. Using arthroscopy as the gold standard, CT showed high sensitivity (100%) and specificity (93%) for the assessment of MCD. For the radiographic evaluation, a sensitivity of 98% and a specificity of 64-69% were found, using CT as the technique of reference. These results suggest that in cases of doubt during radiographic assessment, CT could be used as a non-invasive technique to assess the presence of MCD. Based on the high sensitivity and specificity obtained in this study, CT, rather than arthroscopy, is considered the preferred noninvasive technique to assess MCD lesions of the canine elbow joint.
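For reference, the sensitivity and specificity figures quoted above follow directly from a 2x2 contingency table against the gold standard; a small Python illustration with made-up counts (not the study's data):

    # Sensitivity and specificity from a 2x2 contingency table (illustrative counts only).
    tp, fn = 95, 0   # CT positive / negative among joints arthroscopy calls diseased
    tn, fp = 42, 3   # CT negative / positive among joints arthroscopy calls healthy

    sensitivity = tp / (tp + fn)   # proportion of diseased joints detected by CT
    specificity = tn / (tn + fp)   # proportion of healthy joints correctly cleared
    print(f"sensitivity = {sensitivity:.1%}, specificity = {specificity:.1%}")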
Specificity of Structural Assessment of Knowledge
ERIC Educational Resources Information Center
Trumpower, David L.; Sharara, Harold; Goldsmith, Timothy E.
2010-01-01
This study examines the specificity of information provided by structural assessment of knowledge (SAK). SAK is a technique which uses the Pathfinder scaling algorithm to transform ratings of concept relatedness into network representations (PFnets) of individuals' knowledge. Inferences about individuals' overall domain knowledge based on the…
Impact of ensemble learning in the assessment of skeletal maturity.
Cunha, Pedro; Moura, Daniel C; Guevara López, Miguel Angel; Guerra, Conceição; Pinto, Daniela; Ramos, Isabel
2014-09-01
The assessment of the bone age, or skeletal maturity, is an important task in pediatrics that measures the degree of maturation of children's bones. Nowadays, there is no standard clinical procedure for assessing bone age and the most widely used approaches are the Greulich and Pyle and the Tanner and Whitehouse methods. Computer methods have been proposed to automate the process; however, there is a lack of exploration about how to combine the features of the different parts of the hand, and how to take advantage of ensemble techniques for this purpose. This paper presents a study where the use of ensemble techniques for improving bone age assessment is evaluated. A new computer method was developed that extracts descriptors for each joint of each finger, which are then combined using different ensemble schemes for obtaining a final bone age value. Three popular ensemble schemes are explored in this study: bagging, stacking and voting. Best results were achieved by bagging with a rule-based regression (M5P), scoring a mean absolute error of 10.16 months. Results show that ensemble techniques improve the prediction performance of most of the evaluated regression algorithms, always achieving best or comparable to best results. Therefore, the success of the ensemble methods allows us to conclude that their use may improve computer-based bone age assessment, offering a scalable option for utilizing multiple regions of interest and combining their output.
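A minimal sketch of the bagging scheme described above, using scikit-learn; a decision-tree regressor stands in for the rule-based M5P model, and the joint descriptors and bone ages are synthetic, so this only illustrates the ensemble mechanics, not the paper's pipeline.

    # Bagging an ensemble of regressors over per-joint descriptors (synthetic data).
    # DecisionTreeRegressor stands in for the M5P rule-based model used in the study.
    import numpy as np
    from sklearn.ensemble import BaggingRegressor
    from sklearn.tree import DecisionTreeRegressor
    from sklearn.metrics import mean_absolute_error
    from sklearn.model_selection import train_test_split

    rng = np.random.default_rng(0)
    X = rng.normal(size=(300, 20))                    # 20 descriptors per hand (fake)
    y = 120 + 6 * X[:, :5].sum(axis=1) + rng.normal(scale=8, size=300)  # age, months

    X_tr, X_te, y_tr, y_te = train_test_split(X, y, random_state=0)
    model = BaggingRegressor(DecisionTreeRegressor(max_depth=5),
                             n_estimators=50, random_state=0).fit(X_tr, y_tr)
    print("MAE (months):", mean_absolute_error(y_te, model.predict(X_te)))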
Instantiating the art of war for effects-based operations
NASA Astrophysics Data System (ADS)
Burns, Carla L.
2002-07-01
Effects-Based Operations (EBO) is a mindset, a philosophy and an approach for planning, executing and assessing military operations for the effects they produce rather than the targets or even objectives they deal with. An EBO approach strives to provide economy of force, dynamic tasking, and reduced collateral damage. The notion of EBO is not new. Military Commanders certainly have desired effects in mind when conducting military operations. However, to date EBO has been an art of war that lacks automated techniques and tools that enable effects-based analysis and assessment. Modeling and simulation is at the heart of this challenge. The Air Force Research Laboratory (AFRL) EBO Program is developing modeling techniques and corresponding tool capabilities that can be brought to bear against the challenges presented by effects-based analysis and assessment. Effects-based course-of-action development, center of gravity/target system analysis, and wargaming capabilities are being developed and integrated to help give Commanders the information and decision support required to achieve desired national security objectives. This paper presents an introduction to effects-based operations, discusses the benefits of an EBO approach, and focuses on modeling and analysis for effects-based strategy development. An overview of modeling and simulation challenges for EBO is presented, setting the stage for the detailed technical papers in the subject session.
Ultrasound elastographic techniques in focal liver lesions
Conti, Clara Benedetta; Cavalcoli, Federica; Fraquelli, Mirella; Conte, Dario; Massironi, Sara
2016-01-01
Elastographic techniques are new ultrasound-based imaging techniques developed to estimate tissue deformability/stiffness. Several ultrasound elastographic approaches have been developed, such as static elastography, transient elastography and acoustic radiation force imaging methods, which include point shear wave and shear wave imaging elastography. The application of these methods in clinical practice aims at estimating the mechanical tissue properties. One of the main settings for the application of these tools has been liver stiffness assessment in chronic liver disease, which has been studied mainly using transient elastography. Another field of application for these techniques is the assessment of focal lesions, detected by ultrasound in organs such as pancreas, prostate, breast, thyroid, lymph nodes. Considering the frequency and importance of the detection of focal liver lesions through routine ultrasound, some studies have also aimed to assess the role that elastography can play in studying the stiffness of different types of liver lesions, in order to predict their nature and thus offer valuable non-invasive methods for the diagnosis of liver masses. PMID:26973405
USDA-ARS's Scientific Manuscript database
New, non-destructive sensing techniques for fast and more effective quality assessment of fruits and vegetables are needed to meet the ever-increasing consumer demand for better, more consistent and safer food products. Over the past 15 years, hyperspectral imaging has emerged as a new generation of...
The Management of NASA Employee Health Problem; Status 1971
NASA Technical Reports Server (NTRS)
Arnoldi, L. B.
1971-01-01
A system for assessing employee health problems is introduced. The automated billing system is based on an input format that includes the cost of medical services by user and measures, in dollars, the portion of resources spent on preventive versus therapeutic techniques. The system is capable of printing long-term medical histories of any employee.
Radiation exposure in X-ray-based imaging techniques used in osteoporosis
Adams, Judith E.; Guglielmi, Giuseppe; Link, Thomas M.
2010-01-01
Recent advances in medical X-ray imaging have enabled the development of new techniques capable of assessing not only bone quantity but also structure. This article provides (a) a brief review of the current X-ray methods used for quantitative assessment of the skeleton, (b) data on the levels of radiation exposure associated with these methods and (c) information about radiation safety issues. Radiation doses associated with dual-energy X-ray absorptiometry are very low. However, as with any X-ray imaging technique, each particular examination must always be clinically justified. When an examination is justified, the emphasis must be on dose optimisation of imaging protocols. Dose optimisation is more important for paediatric examinations because children are more vulnerable to radiation than adults. Methods based on multi-detector CT (MDCT) are associated with higher radiation doses. New 3D volumetric hip and spine quantitative computed tomography (QCT) techniques and high-resolution MDCT for evaluation of bone structure deliver doses to patients from 1 to 3 mSv. Low-dose protocols are needed to reduce radiation exposure from these methods and minimise associated health risks. PMID:20559834
Ultrasound Elastography: Review of Techniques and Clinical Applications
Sigrist, Rosa M.S.; Liau, Joy; Kaffas, Ahmed El; Chammas, Maria Cristina; Willmann, Juergen K.
2017-01-01
Elastography-based imaging techniques have received substantial attention in recent years for non-invasive assessment of tissue mechanical properties. These techniques take advantage of changed soft tissue elasticity in various pathologies to yield qualitative and quantitative information that can be used for diagnostic purposes. Measurements are acquired in specialized imaging modes that can detect tissue stiffness in response to an applied mechanical force (compression or shear wave). Ultrasound-based methods are of particular interest due to their many inherent advantages, such as wide availability (including at the bedside) and relatively low cost. Several ultrasound elastography techniques using different excitation methods have been developed. In general, these can be classified into strain imaging methods that use internal or external compression stimuli, and shear wave imaging that uses ultrasound-generated traveling shear wave stimuli. While ultrasound elastography has shown promising results for non-invasive assessment of liver fibrosis, new applications in breast, thyroid, prostate, kidney and lymph node imaging are emerging. Here, we review the basic principles, foundation physics, and limitations of ultrasound elastography and summarize its current clinical use and ongoing developments in various clinical applications. PMID:28435467
Logic-based assessment of the compatibility of UMLS ontology sources
2011-01-01
Background The UMLS Metathesaurus (UMLS-Meta) is currently the most comprehensive effort for integrating independently-developed medical thesauri and ontologies. UMLS-Meta is being used in many applications, including PubMed and ClinicalTrials.gov. The integration of new sources combines automatic techniques, expert assessment, and auditing protocols. The automatic techniques currently in use, however, are mostly based on lexical algorithms and often disregard the semantics of the sources being integrated. Results In this paper, we argue that UMLS-Meta’s current design and auditing methodologies could be significantly enhanced by taking into account the logic-based semantics of the ontology sources. We provide empirical evidence suggesting that UMLS-Meta in its 2009AA version contains a significant number of errors; these errors become immediately apparent if the rich semantics of the ontology sources is taken into account, manifesting themselves as unintended logical consequences that follow from the ontology sources together with the information in UMLS-Meta. We then propose general principles and specific logic-based techniques to effectively detect and repair such errors. Conclusions Our results suggest that the methodologies employed in the design of UMLS-Meta are not only very costly in terms of human effort, but also error-prone. The techniques presented here can be useful for both reducing human effort in the design and maintenance of UMLS-Meta and improving the quality of its contents. PMID:21388571
DOE Office of Scientific and Technical Information (OSTI.GOV)
Wike, L.D.
Environmental impact can be difficult to assess, especially at the ecosystem level. Any impact assessment methodology that can give cost effective and timely results is highly desirable. Rapid bioassessment (RBA) is cost effective and produces timely results. Several types of RBA have been used at the Savannah River Site (SRS) to assess stream conditions, including the Index of Biotic Integrity (IBI) based on fish community characteristics, and various techniques using aquatic macroinvertebrate species diversity and abundance. In an attempt to broaden the applicability of the RBA concept, we have also begun to develop RBA techniques for seep-fed wetlands and terrestrial habitats. These techniques will focus on vertebrate and macroinvertebrate assemblages for seep-fed wetlands and arthropod assemblages for terrestrial habitats. In situ bioassay is another technique that could be used for rapid and economical assessment of the effects of anthropogenic disturbance. We propose the development of two methods of in situ bioassay that can address bioavailability of constituents of concern. The use of caged bioassay organisms can be applied to terrestrial systems such as capped or existing waste sites using the common house cricket. Another proposed bioassay could use a resident species, such as the imported red fire ant, which is found in disturbed habitats and open areas such as waste sites. Combining in situ techniques with RBA methodologies has the potential to provide a comprehensive assessment of chemical and physical impacts to a wide range of ecosystem types.
NASA Astrophysics Data System (ADS)
Roushangar, Kiyoumars; Mehrabani, Fatemeh Vojoudi; Shiri, Jalal
2014-06-01
This study presents Artificial Intelligence (AI)-based modeling of total bed material load aimed at improving the accuracy of predictions from traditional models. Gene expression programming (GEP) and adaptive neuro-fuzzy inference system (ANFIS)-based models were developed and validated for the estimations. Sediment data from the Qotur River (Northwestern Iran) were used for development and validation of the applied techniques. To assess the applied techniques in relation to traditional models, stream power-based and shear stress-based physical models were also applied to the studied case. The obtained results reveal that the developed AI-based models, using a minimum number of dominant factors, give more accurate results than the other applied models. Nonetheless, it was found that the k-fold test, although computationally costly, is a practical technique for completely scanning the applied data and avoiding over-fitting.
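A sketch of the k-fold test mentioned above, with scikit-learn's KFold; a gradient-boosting regressor stands in for the GEP/ANFIS models and the hydraulic predictors are synthetic, so this only illustrates how the folds scan the data.

    # k-fold cross-validation of a sediment-load regressor (synthetic stand-in data).
    import numpy as np
    from sklearn.model_selection import KFold
    from sklearn.ensemble import GradientBoostingRegressor
    from sklearn.metrics import mean_squared_error

    rng = np.random.default_rng(1)
    X = rng.uniform(size=(200, 3))      # e.g. discharge, slope, median grain size
    y = 50 * X[:, 0] ** 1.5 + 10 * X[:, 1] + rng.normal(scale=2, size=200)

    errors = []
    for train_idx, test_idx in KFold(n_splits=5, shuffle=True, random_state=1).split(X):
        model = GradientBoostingRegressor().fit(X[train_idx], y[train_idx])
        errors.append(mean_squared_error(y[test_idx], model.predict(X[test_idx])))
    print("RMSE per fold:", np.round(np.sqrt(errors), 2))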
Automatic assessment of voice quality according to the GRBAS scale.
Sáenz-Lechón, Nicolás; Godino-Llorente, Juan I; Osma-Ruiz, Víctor; Blanco-Velasco, Manuel; Cruz-Roldán, Fernando
2006-01-01
Nowadays, the most widespread techniques for measuring voice quality are based on perceptual evaluation by well-trained professionals. The GRBAS scale is a widely used method for perceptual evaluation of voice quality; it is well established in Japan and there is increasing interest in both Europe and the United States. However, this technique requires well-trained experts, relies on the evaluator's expertise, and depends strongly on the evaluator's own psycho-physical state. Furthermore, great variability is observed between the assessments of different evaluators. Therefore, an objective method providing such a measurement of voice quality would be very valuable. In this paper, the automatic assessment of voice quality is addressed by means of short-term Mel-frequency cepstral coefficients (MFCC) and learning vector quantization (LVQ) in a pattern recognition stage. Results show that this approach provides acceptable performance for this purpose, with accuracy around 65% at best.
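A rough sketch of the front end described above: short-term MFCCs are extracted with librosa and summarized per recording; a nearest-neighbour classifier is used here as a simple stand-in for the LVQ stage, and the signals and grade labels are synthetic placeholders.

    # MFCC-based voice-quality classification sketch; KNeighborsClassifier stands in
    # for the LVQ stage of the paper, and signals/labels are synthetic placeholders.
    import numpy as np
    import librosa
    from sklearn.neighbors import KNeighborsClassifier

    def mfcc_summary(y, sr, n_mfcc=12):
        """Mean and std of short-term MFCCs as a fixed-length recording descriptor."""
        mfcc = librosa.feature.mfcc(y=y, sr=sr, n_mfcc=n_mfcc)   # (n_mfcc, frames)
        return np.concatenate([mfcc.mean(axis=1), mfcc.std(axis=1)])

    sr = 16000
    rng = np.random.default_rng(0)
    signals = [np.sin(2 * np.pi * f * np.arange(sr) / sr) + 0.05 * rng.normal(size=sr)
               for f in (120, 125, 220, 230)]             # real use: patient recordings
    X = np.vstack([mfcc_summary(s, sr) for s in signals])
    grades = np.array([0, 0, 1, 1])                       # hypothetical perceptual grades

    clf = KNeighborsClassifier(n_neighbors=1).fit(X, grades)
    print("training accuracy:", clf.score(X, grades))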
Jaspers, Mariëlle E H; van Haasterecht, Ludo; van Zuijlen, Paul P M; Mokkink, Lidwine B
2018-06-22
Reliable and valid assessment of burn wound depth or healing potential is essential to treatment decision-making, to provide a prognosis, and to compare studies evaluating different treatment modalities. The aim of this review was to critically appraise, compare and summarize the quality of relevant measurement properties of techniques that aim to assess burn wound depth or healing potential. A systematic literature search was performed using PubMed, EMBASE and Cochrane Library. Two reviewers independently evaluated the methodological quality of included articles using an adapted version of the Consensus-based Standards for the selection of health Measurement INstruments (COSMIN) checklist. A synthesis of evidence was performed to rate the measurement properties for each technique and to draw an overall conclusion on quality of the techniques. Thirty-six articles were included, evaluating various techniques, classified as (1) laser Doppler techniques; (2) thermography or thermal imaging; (3) other measurement techniques. Strong evidence was found for adequate construct validity of laser Doppler imaging (LDI). Moderate evidence was found for adequate construct validity of thermography, videomicroscopy, and spatial frequency domain imaging (SFDI). Only two studies reported on the measurement property reliability. Furthermore, considerable variation was observed among comparator instruments. Considering the evidence available, it appears that LDI is currently the most favorable technique; thereby assessing burn wound healing potential. Additional research is needed into thermography, videomicroscopy, and SFDI to evaluate their full potential. Future studies should focus on reliability and measurement error, and provide a precise description of which construct is aimed to measure. Copyright © 2018 Elsevier Ltd and ISBI. All rights reserved.
A relative quantitative assessment of myocardial perfusion by first-pass technique: animal study
NASA Astrophysics Data System (ADS)
Chen, Jun; Zhang, Zhang; Yu, Xuefang; Zhou, Kenneth J.
2015-03-01
The purpose of this study is to quantitatively assess myocardial perfusion by the first-pass technique in a swine model. Numerous techniques based on the analysis of Computed Tomography (CT) Hounsfield Unit (HU) density have emerged. Although these methods have been proposed to assess haemodynamically significant coronary artery stenosis, they have recognized limitations, and new techniques are still needed. Experiments were performed on five (5) closed-chest swine. Balloon catheters were placed into the coronary artery to simulate different degrees of luminal stenosis. Myocardial Blood Flow (MBF) was measured using the colored microsphere technique. Fractional Flow Reserve (FFR) was measured using a pressure wire. CT examinations were performed twice during the first-pass phase under adenosine-stress conditions. CT HU Density (HUDCT) and CT HU Density Ratio (HUDRCT) were calculated from the acquired CT images. Our study shows that HUDRCT correlates well with MBF and FFR (y = 0.07245 + 0.09963x, r2 = 0.898). In receiver operating characteristic (ROC) curve analyses, HUDRCT provides excellent diagnostic performance for the detection of significant ischemia during adenosine stress as defined by FFR, with an Area Under the Curve (AUC) of 0.927. HUDRCT has the potential to be developed as a useful indicator for quantitative assessment of myocardial perfusion.
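The diagnostic-performance figure quoted above (area under the ROC curve) can be computed directly from per-vessel index values and the FFR-derived ischemia labels; a toy Python example with made-up numbers:

    # ROC AUC of an index (e.g. a HUDR-like ratio) against binary ischemia labels.
    # The numbers below are made up purely to illustrate the computation.
    from sklearn.metrics import roc_auc_score

    ischemic = [1, 1, 1, 0, 0, 0, 0, 1]     # 1 = significant ischemia by FFR
    index    = [0.21, 0.18, 0.25, 0.55, 0.61, 0.47, 0.52, 0.30]

    # A lower index indicates ischemia here, so negate it so that higher = positive.
    print("AUC:", roc_auc_score(ischemic, [-v for v in index]))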
Employment of adaptive learning techniques for the discrimination of acoustic emissions
NASA Astrophysics Data System (ADS)
Erkes, J. W.; McDonald, J. F.; Scarton, H. A.; Tam, K. C.; Kraft, R. P.
1983-11-01
The following aspects of this study on the discrimination of acoustic emissions (AE) were examined: (1) The analytical development and assessment of digital signal processing techniques for AE signal dereverberation, noise reduction, and source characterization; (2) The modeling and verification of some aspects of key selected techniques through a computer-based simulation; and (3) The study of signal propagation physics and their effect on received signal characteristics for relevant physical situations.
Operational Based Vision Assessment Cone Contrast Test: Description and Operation
2016-06-02
The work detailed in this report was conducted by the Operational Based Vision Assessment (OBVA...currently used by the Air Force for aircrew color vision screening. The new OBVA CCT is differentiated from the Rabin device primarily by hardware...test procedures, and analysis techniques. Like the Rabin CCT, the OBVA CCT uses colors that selectively stimulate the cone photoreceptors of the...
Space station/base food system study. Volume 2: System assessments
NASA Technical Reports Server (NTRS)
1970-01-01
The evaluation modeling technique is described which was used to combine the candidate element concepts into systems that meet mission requirements. Results of the assessment are presented in terms of systems performance data and plots of system trade-off data by highest ranking variable.
ERIC Educational Resources Information Center
Gable, Robert A.; Hendrickson, Jo M.
2000-01-01
This article discusses strategies and procedures for promoting maintenance and generalization of student behavior changes resulting from interventions based on functional behavioral assessment. Strategies include self-management techniques, cognitive mediation, self-advocacy training, use of peers, booster training, environmental modifications,…
Informal Reading Inventories: Creating Teacher-Designed Literature-Based Assessments
ERIC Educational Resources Information Center
Provost, Mary C.; Lambert, Monica A.; Babkie, Andrea M.
2010-01-01
Mandates emphasizing student achievement have increased the importance of appropriate assessment techniques for students in general and special education classrooms. Informal reading inventories (IRIs), designed by classroom teachers, have been proven to be an efficient and effective way to determine students' strengths, weaknesses, and strategies…
Computer Applications in Assessment and Counseling.
ERIC Educational Resources Information Center
Veldman, Donald J.; Menaker, Shirley L.
Public school counselors and psychologists can expect valuable assistance from computer-based assessment and counseling techniques within a few years, as programs now under development become generally available for the typical computers now used by schools for grade-reporting and class-scheduling. Although routine information-giving and gathering…
A Pressure Plate-Based Method for the Automatic Assessment of Foot Strike Patterns During Running.
Santuz, Alessandro; Ekizos, Antonis; Arampatzis, Adamantios
2016-05-01
The foot strike pattern (FSP, description of how the foot touches the ground at impact) is recognized to be a predictor of both performance and injury risk. The objective of the current investigation was to validate an original foot strike pattern assessment technique based on the numerical analysis of foot pressure distribution. We analyzed the strike patterns during running of 145 healthy men and women (85 male, 60 female). The participants ran on a treadmill with an integrated pressure plate at three different speeds: preferred (shod and barefoot 2.8 ± 0.4 m/s), faster (shod 3.5 ± 0.6 m/s) and slower (shod 2.3 ± 0.3 m/s). A custom-designed algorithm allowed automatic footprint recognition and FSP evaluation. Incomplete footprints were simultaneously identified and corrected by the software itself. The widely used technique of analyzing high-speed video recordings was checked for its reliability and used to validate the numerical technique. The automatic numerical approach showed good conformity with the reference video-based technique (ICC = 0.93, p < 0.01). The great improvement in data throughput and the increased completeness of results allow the use of this software as a powerful feedback tool in a simple experimental setup.
Assessing LiDAR elevation data for KDOT applications.
DOT National Transportation Integrated Search
2013-02-01
LiDAR-based elevation surveys are a cost-effective means for mapping topography over large areas. LiDAR : surveys use an airplane-mounted or ground-based laser radar unit to scan terrain. Post-processing techniques are : applied to remove vegetation ...
NASA Technical Reports Server (NTRS)
Connor, S. A.; Wierwille, W. W.
1983-01-01
A comparison of the sensitivity and intrusion of twenty pilot workload assessment techniques was conducted using a psychomotor loading task in a three degree of freedom moving base aircraft simulator. The twenty techniques included opinion measures, spare mental capacity measures, physiological measures, eye behavior measures, and primary task performance measures. The primary task was an instrument landing system (ILS) approach and landing. All measures were recorded between the outer marker and the middle marker on the approach. Three levels (low, medium, and high) of psychomotor load were obtained by the combined manipulation of windgust disturbance level and simulated aircraft pitch stability. Six instrument rated pilots participated in four sessions lasting approximately three hours each.
Quinone-based stable isotope probing for assessment of 13C substrate-utilizing bacteria
NASA Astrophysics Data System (ADS)
Kunihiro, Tadao; Katayama, Arata; Demachi, Toyoko; Veuger, Bart; Boschker, Henricus T. S.; van Oevelen, Dick
2015-04-01
In this study, we attempted to establish a quinone-based stable-isotope probing (SIP) technique that links substrate-utilizing bacterial groups to chemotaxonomic groups within the bacterial community. SIP techniques combined with biomarkers have been widely used as an attractive method for identifying metabolically active bacterial groups in various environments. Quantitative SIP approaches have the unique advantage of assessing substrate incorporation into bacteria. As the most widely used quantitative approach, SIP based on phospholipid-derived fatty acids (PLFA) has been applied to simultaneously assess the substrate-incorporation rate into bacteria and the microbial community structure. This approach is powerful for estimating incorporation rates because of the high sensitivity of detection by gas chromatography-combustion-isotope ratio mass spectrometry (GC-c-IRMS). However, its phylogenetic resolution is limited by the specificity of the compound-specific marker. We focused on respiratory quinones as biomarkers. Our previous study found a good correlation between concentrations of bacteria-specific PLFAs and quinones over several orders of magnitude in various marine sediments, and the quinone method resolves differences in bacterial community composition at a higher resolution (bacterial phylum level) than bacterial PLFA. Therefore, respiratory quinones are potentially good biomarkers for quantitative SIP approaches. The LC-APCI-MS method, a molecular-mass-based detection method for quinones, provides useful structural information for identifying quinone molecular species in environmental samples. LC-MS/MS on a hybrid triple quadrupole/linear ion trap instrument, which enables simultaneous identification and quantification of compounds in a single analysis, can detect high-molecular-mass compounds together with their isotope ions. The use of LC-MS/MS allows us to develop quinone-SIP based on the molecular mass differences arising from 13C abundance in the quinone. In this study, we compared the carbon stable isotope composition of quinones with the bulk carbon stable isotope composition of bacterial cultures. The results indicated a good correlation between the two. However, our measurement conditions for detecting quinone isotope ions led to an underestimation of 13C abundance in the quinone. The quinone-SIP technique therefore needs further optimization of the LC-MS/MS measurement conditions.
Mélin, Frédéric; Zibordi, Giuseppe
2007-06-20
An optically based technique is presented that produces merged spectra of normalized water-leaving radiances L(WN) by combining spectral data provided by independent satellite ocean color missions. The assessment of the merging technique is based on a four-year field data series collected by an autonomous above-water radiometer located on the Acqua Alta Oceanographic Tower in the Adriatic Sea. The uncertainties associated with the merged L(WN) obtained from the Sea-viewing Wide Field-of-view Sensor and the Moderate Resolution Imaging Spectroradiometer are consistent with the validation statistics of the individual sensor products. The merging including the third mission Medium Resolution Imaging Spectrometer is also addressed for a reduced ensemble of matchups.
Early warning and crop condition assessment research
NASA Technical Reports Server (NTRS)
Boatwright, G. O.; Whitehead, V. S.
1986-01-01
The Early Warning Crop Condition Assessment Project of AgRISTARS was a multiagency and multidisciplinary effort. Its mission and objectives were centered around development and testing of remote-sensing techniques that enhance operational methodologies for global crop-condition assessments. The project developed crop stress indicator models that provide data-filtering and alert capabilities for monitoring global agricultural conditions. The project developed a technique for using NOAA-n satellite advanced very-high-resolution radiometer (AVHRR) data for operational crop-condition assessments. This technology was transferred to the Foreign Agricultural Service of the USDA. The project developed a U.S. Great Plains data base that contains various meteorological parameters and vegetative index numbers (VIN) derived from AVHRR satellite data. It developed cloud screening techniques and scan angle correction models for AVHRR data. It also developed technology for using remotely acquired thermal data for crop water stress indicator modeling. The project provided basic technology including spectral characteristics of soils, water, stressed and nonstressed crop and range vegetation, solar zenith angle, and atmospheric and canopy structure effects.
Bex, Axel; Fournier, Laure; Lassau, Nathalie; Mulders, Peter; Nathan, Paul; Oyen, Wim J G; Powles, Thomas
2014-04-01
The introduction of targeted agents for the treatment of renal cell carcinoma (RCC) has resulted in new challenges for assessing response to therapy, and conventional response criteria using computed tomography (CT) are limited. It is widely recognised that targeted therapies may lead to significant necrosis without significant reduction in tumour size. In addition, the vascular effects of antiangiogenic therapy may occur long before there is any reduction in tumour size. To perform a systematic review of conventional and novel imaging methods for the assessment of response to targeted agents in RCC and to discuss their use from a clinical perspective. Relevant databases covering the period January 2006 to April 2013 were searched for studies reporting on the use of anatomic and functional imaging techniques to predict response to targeted therapy in RCC. Inclusion criteria were randomised trials, nonrandomised controlled studies, retrospective case series, and cohort studies. Reviews, animal and preclinical studies, case reports, and commentaries were excluded. A narrative synthesis of the evidence is presented. A total of 331 abstracts and 76 full-text articles were assessed; 34 studies met the inclusion criteria. Current methods of response assessment in RCC include anatomic methods--based on various criteria including Choi, size and attenuation CT, and morphology, attenuation, size, and structure--and functional techniques including dynamic contrast-enhanced (DCE) CT, DCE-magnetic resonance imaging, DCE-ultrasonography, positron emission tomography, and approaches utilising radiolabelled monoclonal antibodies. Functional imaging techniques are promising surrogate biomarkers of response in RCC and may be more appropriate than anatomic CT-based methods. By enabling quantification of tumour vascularisation, functional techniques can directly and rapidly detect the biologic effects of antiangiogenic therapies compared with the indirect detection of belated effects on tumour size by anatomic methods. However, larger prospective studies are needed to validate early results and standardise techniques. Copyright © 2013 European Association of Urology. All rights reserved.
Wass, Sam V
2014-08-01
Convergent research points to the importance of studying the ontogenesis of sustained attention during the early years of life, but little research hitherto has compared and contrasted different techniques available for measuring sustained attention. Here, we compare methods that have been used to assess one parameter of sustained attention, namely infants' peak look duration to novel stimuli. Our focus was to assess whether individual differences in peak look duration are stable across different measurement techniques. In a single cohort of 42 typically developing 11-month-old infants we assessed peak look duration using six different measurement paradigms (four screen-based, two naturalistic). Zero-order correlations suggested that individual differences in peak look duration were stable across all four screen-based paradigms, but no correlations were found between peak look durations observed on the screen-based and the naturalistic paradigms. A factor analysis conducted on the dependent variable of peak look duration identified two factors. All four screen-based tasks loaded onto the first factor, but the two naturalistic tasks did not relate, and mapped onto a different factor. Our results question how individual differences observed on screen-based tasks manifest in more ecologically valid contexts. Copyright © 2014 The Authors. Published by Elsevier Inc. All rights reserved.
A constrained modulus reconstruction technique for breast cancer assessment.
Samani, A; Bishop, J; Plewes, D B
2001-09-01
A reconstruction technique for breast tissue elasticity modulus is described. This technique assumes that the geometry of normal and suspicious tissues is available from a contrast-enhanced magnetic resonance image. Furthermore, it is assumed that the modulus is constant throughout each tissue volume. The technique, which uses quasi-static strain data, is iterative where each iteration involves modulus updating followed by stress calculation. Breast mechanical stimulation is assumed to be done by two compressional rigid plates. As a result, stress is calculated using the finite element method based on the well-controlled boundary conditions of the compression plates. Using the calculated stress and the measured strain, modulus updating is done element-by-element based on Hooke's law. Breast tissue modulus reconstruction using simulated data and phantom modulus reconstruction using experimental data indicate that the technique is robust.
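A schematic of the element-by-element iterative update described above, assuming a drastically simplified stress calculation in place of the finite-element solve; it illustrates only the Hooke's-law update logic, not the authors' solver or boundary conditions.

    # Iterative modulus reconstruction sketch: update E per element via Hooke's law,
    # then recompute stresses. compute_stresses() is a crude placeholder for the
    # finite-element solve under the compression-plate boundary conditions.
    import numpy as np

    def compute_stresses(E, applied_pressure=500.0):
        # Placeholder: assume every element carries the applied plate pressure.
        return np.full_like(E, applied_pressure)

    measured_strain = np.array([0.048, 0.052, 0.020, 0.050])   # from imaging (made up)
    E = np.full(4, 1.0e4)                                      # initial modulus guess, Pa

    for iteration in range(20):
        stress = compute_stresses(E)
        E_new = stress / measured_strain            # Hooke's law, element by element
        if np.max(np.abs(E_new - E) / E) < 1e-4:    # relative convergence check
            break
        E = E_new

    print("reconstructed moduli (Pa):", np.round(E, 1))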
Developing an ICT-Literacy Task-Based Assessment Instrument: The Findings on the Final Testing Phase
ERIC Educational Resources Information Center
Mat-jizat, Jessnor Elmy
2013-01-01
This paper reports the findings of a study which seeks to identify the information and communications technology (ICT) literacy levels of trainee teachers, by investigating their ICT proficiency using a task-based assessment instrument. The Delphi technique was used as a primary validation method for the new assessment tool and the ICT literacy…
NetCoDer: A Retransmission Mechanism for WSNs Based on Cooperative Relays and Network Coding
Valle, Odilson T.; Montez, Carlos; Medeiros de Araujo, Gustavo; Vasques, Francisco; Moraes, Ricardo
2016-01-01
Some of the most difficult problems to deal with when using Wireless Sensor Networks (WSNs) are related to the unreliable nature of communication channels. In this context, the use of cooperative diversity techniques and the application of network coding concepts may be promising solutions to improve the communication reliability. In this paper, we propose the NetCoDer scheme to address this problem. Its design is based on merging cooperative diversity techniques and network coding concepts. We evaluate the effectiveness of the NetCoDer scheme through both an experimental setup with real WSN nodes and a simulation assessment, comparing NetCoDer performance against state-of-the-art TDMA-based (Time Division Multiple Access) retransmission techniques: BlockACK, Master/Slave and Redundant TDMA. The obtained results highlight that the proposed NetCoDer scheme clearly improves the network performance when compared with other retransmission techniques. PMID:27258280
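As a concrete illustration of the network-coding idea that such schemes build on (not NetCoDer's actual encoding), a relay can XOR two overheard packets so that a single coded retransmission repairs the loss of either one at the sink:

    # Toy network-coding illustration: a relay XORs two overheard packets so that one
    # retransmission can repair the loss of either packet at the sink.
    p1 = bytes([0x10, 0x22, 0x3A, 0x4F])          # packet from sensor node A
    p2 = bytes([0xA1, 0x05, 0x7C, 0x00])          # packet from sensor node B
    coded = bytes(a ^ b for a, b in zip(p1, p2))  # relay transmits p1 XOR p2

    # The sink received p2 but lost p1: it recovers p1 from the coded packet and p2.
    recovered_p1 = bytes(c ^ b for c, b in zip(coded, p2))
    assert recovered_p1 == p1
    print("recovered packet:", recovered_p1.hex())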
Introduction to a New Approach to Experiential Learning.
ERIC Educational Resources Information Center
Jackson, Lewis; MacIsaac, Doug
1994-01-01
A process model for experiential learning (EL) in adult education begins with the characteristics and needs of adult learners and conceptual foundations of EL. It includes methods and techniques for in-class and field-based experiences, building a folio (point-in-time performance assessment), and portfolio construction (assessing transitional…
TRANSFERRING TECHNOLOGIES, TOOLS AND TECHNIQUES: THE NATIONAL COASTAL ASSESSMENT
The purpose of the National Coastal Assessment (NCA) is to estimate the status and trends of the condition of the nation's coastal resources on a state, regional and national basis. Based on NCA monitoring from 1999-2001, 100% of the nation's estuarine waters (at over 2500 locati...
Assessing Institutional Ineffectiveness: A Strategy for Improvement.
ERIC Educational Resources Information Center
Cameron, Kim S.
1984-01-01
Based on the theory that institutional change and improvement are motivated more by knowledge of problems than by knowledge of successes, a fault tree analysis technique using Boolean logic for assessing institutional ineffectiveness by determining weaknesses in the system is presented. Advantages and disadvantages of focusing on weakness rather…
Modelling Subjectivity in Visual Perception of Orientation for Image Retrieval.
ERIC Educational Resources Information Center
Sanchez, D.; Chamorro-Martinez, J.; Vila, M. A.
2003-01-01
Discussion of multimedia libraries and the need for storage, indexing, and retrieval techniques focuses on the combination of computer vision and data mining techniques to model high-level concepts for image retrieval based on perceptual features of the human visual system. Uses fuzzy set theory to measure users' assessments and to capture users'…
NASA Astrophysics Data System (ADS)
Vidya Sagar, R.; Raghu Prasad, B. K.
2012-03-01
This article presents a review of recent developments in parameter-based acoustic emission (AE) techniques applied to concrete structures. It recapitulates the significant milestones achieved by previous researchers, including various methods and models developed in AE testing of concrete structures. The aim is to provide an overview of the specific features of parameter-based AE techniques for concrete structures carried out over the years. Emphasis is given to traditional parameter-based AE techniques applied to concrete structures. A significant amount of research on AE techniques applied to concrete structures has already been published and considerable attention has been given to those publications. Some recent studies, such as AE energy analysis and b-value analysis used to assess damage of concrete bridge beams, have also been discussed. The formation of the fracture process zone and the AE energy released during the fracture process in concrete beam specimens have been summarised. A large body of experimental data on AE characteristics of concrete has accumulated over the last three decades. This review of parameter-based AE techniques applied to concrete structures may help researchers and engineers to better understand the failure mechanism of concrete and to evolve more useful methods and approaches for diagnostic inspection of structural elements and failure prediction/prevention of concrete structures.
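A minimal sketch of the b-value analysis mentioned above: the slope of the cumulative amplitude distribution of AE hits is fitted on a log scale. The amplitude data are synthetic, and the dB/20 magnitude scaling is the commonly used convention, stated here as an assumption.

    # AE b-value sketch: slope of log10(cumulative hit count) versus amplitude-based
    # magnitude. Amplitudes are synthetic; magnitude = A_dB / 20 is a common convention.
    import numpy as np

    rng = np.random.default_rng(0)
    amplitudes_db = 40 + rng.exponential(scale=10.0, size=2000)   # AE hit amplitudes, dB

    thresholds = np.arange(40, 90, 2.0)
    counts = np.array([(amplitudes_db >= t).sum() for t in thresholds])
    valid = counts > 0

    magnitude = thresholds[valid] / 20.0
    slope, intercept = np.polyfit(magnitude, np.log10(counts[valid]), 1)
    print("AE b-value:", round(-slope, 2))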
Combining model based and data based techniques in a robust bridge health monitoring algorithm.
DOT National Transportation Integrated Search
2014-09-01
Structural Health Monitoring (SHM) aims to analyze civil, mechanical and aerospace systems in order to assess : incipient damage occurrence. In this project, we are concerned with the development of an algorithm within the : SHM paradigm for applicat...
Assessing LiDAR elevation data for KDOT applications : [technical summary].
DOT National Transportation Integrated Search
2013-02-01
LiDAR-based elevation surveys are a cost-effective means for mapping topography over large areas. LiDAR surveys use an airplane-mounted or ground-based laser radar unit to scan terrain. Post-processing techniques are applied to remove v...
Computer-aided assessment of pulmonary disease in novel swine-origin H1N1 influenza on CT
NASA Astrophysics Data System (ADS)
Yao, Jianhua; Dwyer, Andrew J.; Summers, Ronald M.; Mollura, Daniel J.
2011-03-01
The 2009 pandemic is a global outbreak of novel H1N1 influenza. Radiologic images can be used to assess the presence and severity of pulmonary infection. We develop a computer-aided assessment system to analyze the CT images from Swine-Origin Influenza A virus (S-OIV) novel H1N1 cases. The technique is based on the analysis of lung texture patterns and classification using a support vector machine (SVM). Pixel-wise tissue classification is computed from the SVM value. The method was validated on four H1N1 cases and ten normal cases. We demonstrated that the technique can detect regions of pulmonary abnormality in novel H1N1 patients and differentiate these regions from visually normal lung (area under the ROC curve is 0.993). This technique can also be applied to differentiate regions infected by different pulmonary diseases.
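A schematic of the classification stage described above: an SVM is trained on per-patch texture feature vectors and its decision value is used for scoring. The features and labels here are synthetic placeholders, not CT-derived texture measures.

    # SVM classification of lung texture patches (synthetic features and labels).
    import numpy as np
    from sklearn.svm import SVC
    from sklearn.model_selection import train_test_split
    from sklearn.metrics import roc_auc_score

    rng = np.random.default_rng(0)
    normal   = rng.normal(0.0, 1.0, size=(200, 8))   # texture features, normal lung
    abnormal = rng.normal(1.2, 1.0, size=(200, 8))   # texture features, infected lung
    X = np.vstack([normal, abnormal])
    y = np.r_[np.zeros(200), np.ones(200)]

    X_tr, X_te, y_tr, y_te = train_test_split(X, y, stratify=y, random_state=0)
    svm = SVC(kernel="rbf", C=1.0).fit(X_tr, y_tr)
    scores = svm.decision_function(X_te)             # per-patch SVM value
    print("patch-level AUC:", round(roc_auc_score(y_te, scores), 3))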
Advanced MR Imaging of the Placenta: Exploring the in utero placenta-brain connection
Andescavage, Nickie Niforatos; DuPlessis, Adre; Limperopoulos, Catherine
2015-01-01
The placenta is a vital organ necessary for the healthy neurodevelopment of the fetus. Despite the known associations between placental dysfunction and neurologic impairment, there is a paucity of tools available to reliably assess in vivo placental health and function. Existing clinical tools for placental assessment remain insensitive in predicting and assessing placental well-being. Advanced MRI techniques hold significant promise for the dynamic, non-invasive, real-time assessment of placental health and identification of early placental-based disorders. In this review, we summarize the available clinical tools for placental assessment including ultrasound, Doppler, and conventional MRI. We then explore the emerging role of advanced placental MR imaging techniques for supporting the developing fetus, and appraise the strengths and limitations of quantitative MRI in identifying early markers of placental dysfunction for improved pregnancy monitoring and fetal outcomes. PMID:25765905
Employing continuous quality improvement in community-based substance abuse programs.
Chinman, Matthew; Hunter, Sarah B; Ebener, Patricia
2012-01-01
This article aims to describe continuous quality improvement (CQI) for substance abuse prevention and treatment programs in a community-based organization setting. CQI (e.g., plan-do-study-act (PDSA) cycles) applied in healthcare and industry was adapted for substance abuse prevention and treatment programs in a community setting. The authors assessed the resources needed, acceptability and CQI feasibility for ten programs by evaluating CQI training workshops with program staff and a series of three qualitative interviews over a nine-month implementation period with program participants. The CQI activities, PDSA cycle progress, effort, enthusiasm, benefits and challenges were examined. Results indicated that CQI was feasible and acceptable for community-based substance abuse prevention and treatment programs; however, some notable resource challenges remain. Future studies should examine CQI impact on service quality and intended program outcomes. The study was conducted on a small number of programs and did not assess CQI impact on service quality or intended program outcomes. In terms of practical implications, this project shows that it is feasible to adapt CQI techniques and processes for community-based substance abuse prevention and treatment programs. These techniques may help community-based program managers to improve service quality and achieve program outcomes. This is one of the first studies to adapt traditional CQI techniques for community-based settings delivering substance abuse prevention and treatment programs.
Gaussian process regression for tool wear prediction
NASA Astrophysics Data System (ADS)
Kong, Dongdong; Chen, Yongjie; Li, Ning
2018-05-01
To realize and accelerate the pace of intelligent manufacturing, this paper presents a novel tool wear assessment technique based on integrated radial basis function based kernel principal component analysis (KPCA_IRBF) and Gaussian process regression (GPR) for monitoring the in-process tool wear parameter (flank wear width) accurately and in real time. KPCA_IRBF is a new nonlinear dimension-increment technique, proposed here for the first time for feature fusion. The tool wear predictive value and the corresponding confidence interval are both provided by the GPR model. Moreover, GPR achieves better prediction accuracy than artificial neural networks (ANN) and support vector machines (SVM), since Gaussian noise can be modeled quantitatively in the GPR model. However, noise seriously affects the stability of the confidence interval. In this work, the proposed KPCA_IRBF technique helps to remove the noise and weaken its negative effects, which greatly compresses and smooths the confidence interval and is conducive to monitoring the tool wear accurately. Moreover, the selection of the kernel parameter in KPCA_IRBF can be carried out easily over a much larger selectable region than in the conventional KPCA_RBF technique, which helps to improve the efficiency of model construction. Ten sets of cutting tests are conducted to validate the effectiveness of the presented tool wear assessment technique. The experimental results show that the in-process flank wear width of tool inserts can be monitored accurately by the presented technique, which is robust under a variety of cutting conditions. This study lays the foundation for tool wear monitoring in real industrial settings.
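A compact illustration of the GPR stage with scikit-learn: the kernel includes a white-noise term, and the predictive standard deviation yields a confidence interval around the wear estimate. Features and wear values are synthetic, and the paper's KPCA_IRBF fusion step is not reproduced here.

    # Gaussian process regression with a noise-aware kernel; the predictive standard
    # deviation gives a confidence interval for the flank wear estimate (synthetic data).
    import numpy as np
    from sklearn.gaussian_process import GaussianProcessRegressor
    from sklearn.gaussian_process.kernels import RBF, WhiteKernel

    rng = np.random.default_rng(0)
    X = rng.uniform(0, 1, size=(60, 4))          # fused cutting-signal features (fake)
    wear = 0.3 * X[:, 0] + 0.1 * np.sin(6 * X[:, 1]) + rng.normal(scale=0.01, size=60)

    kernel = 1.0 * RBF(length_scale=0.5) + WhiteKernel(noise_level=1e-4)
    gpr = GaussianProcessRegressor(kernel=kernel, normalize_y=True).fit(X, wear)

    mean, std = gpr.predict(rng.uniform(0, 1, size=(3, 4)), return_std=True)
    for m, s in zip(mean, std):
        print(f"predicted wear {m:.3f} mm, 95% interval +/- {1.96 * s:.3f} mm")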
Saha, Abhijoy; Banerjee, Sayantan; Kurtek, Sebastian; Narang, Shivali; Lee, Joonsang; Rao, Ganesh; Martinez, Juan; Bharath, Karthik; Rao, Arvind U K; Baladandayuthapani, Veerabhadran
2016-01-01
Tumor heterogeneity is a crucial area of cancer research wherein inter- and intra-tumor differences are investigated to assess and monitor disease development and progression, especially in cancer. The proliferation of imaging and linked genomic data has enabled us to evaluate tumor heterogeneity on multiple levels. In this work, we examine magnetic resonance imaging (MRI) in patients with brain cancer to assess image-based tumor heterogeneity. Standard approaches to this problem use scalar summary measures (e.g., intensity-based histogram statistics) that do not adequately capture the complete and finer scale information in the voxel-level data. In this paper, we introduce a novel technique, DEMARCATE (DEnsity-based MAgnetic Resonance image Clustering for Assessing Tumor hEterogeneity) to explore the entire tumor heterogeneity density profiles (THDPs) obtained from the full tumor voxel space. THDPs are smoothed representations of the probability density function of the tumor images. We develop tools for analyzing such objects under the Fisher-Rao Riemannian framework that allows us to construct metrics for THDP comparisons across patients, which can be used in conjunction with standard clustering approaches. Our analyses of The Cancer Genome Atlas (TCGA) based Glioblastoma dataset reveal two significant clusters of patients with marked differences in tumor morphology, genomic characteristics and prognostic clinical outcomes. In addition, we see enrichment of image-based clusters with known molecular subtypes of glioblastoma multiforme, which further validates our representation of tumor heterogeneity and subsequent clustering techniques.
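Under the square-root density representation commonly used in the Fisher-Rao framework, the distance between two density profiles reduces to the arc length between their square roots on the unit sphere; a small numerical sketch with arbitrary example densities (some references include an additional factor of 2 in the distance):

    # Fisher-Rao-type distance between two densities via the square-root representation:
    # d(p, q) = arccos( integral of sqrt(p(x) q(x)) dx ). Densities are toy examples.
    import numpy as np

    x = np.linspace(-6, 6, 2001)
    dx = x[1] - x[0]

    def gaussian(x, mu, sigma):
        return np.exp(-0.5 * ((x - mu) / sigma) ** 2) / (sigma * np.sqrt(2 * np.pi))

    p = gaussian(x, 0.0, 1.0)      # e.g. intensity density of one tumor
    q = gaussian(x, 1.0, 1.5)      # intensity density of another tumor

    bc = np.sum(np.sqrt(p * q)) * dx          # Bhattacharyya coefficient, in [0, 1]
    distance = np.arccos(np.clip(bc, 0.0, 1.0))
    print("Fisher-Rao distance:", round(float(distance), 4))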
Jha, Abhinav K.; Mena, Esther; Caffo, Brian; Ashrafinia, Saeed; Rahmim, Arman; Frey, Eric; Subramaniam, Rathan M.
2017-01-01
Recently, a class of no-gold-standard (NGS) techniques has been proposed to evaluate quantitative imaging methods using patient data. These techniques provide figures of merit (FoMs) quantifying the precision of the estimated quantitative value without requiring repeated measurements and without requiring a gold standard. However, applying these techniques to patient data presents several practical difficulties including assessing the underlying assumptions, accounting for patient-sampling-related uncertainty, and assessing the reliability of the estimated FoMs. To address these issues, we propose statistical tests that provide confidence in the underlying assumptions and in the reliability of the estimated FoMs. Furthermore, the NGS technique is integrated within a bootstrap-based methodology to account for patient-sampling-related uncertainty. The developed NGS framework was applied to evaluate four methods for segmenting lesions from 18F-Fluoro-2-deoxyglucose positron emission tomography images of patients with head-and-neck cancer on the task of precisely measuring the metabolic tumor volume. The NGS technique consistently predicted the same segmentation method as the most precise method. The proposed framework provided confidence in these results, even when gold-standard data were not available. The bootstrap-based methodology indicated improved performance of the NGS technique with larger numbers of patient studies, as was expected, and yielded consistent results as long as data from more than 80 lesions were available for the analysis. PMID:28331883
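A schematic of the bootstrap step described above: lesions (or patients) are resampled with replacement and the figure of merit is recomputed on each resample to obtain a sampling-uncertainty interval. The figure of merit used here is a simple placeholder, not the NGS precision estimator itself.

    # Bootstrap over cases to quantify sampling-related uncertainty of a figure of merit.
    # fom() is a placeholder; the NGS precision estimator would replace it in practice.
    import numpy as np

    rng = np.random.default_rng(0)
    per_lesion_values = rng.normal(loc=12.0, scale=3.0, size=90)   # e.g. MTV estimates

    def fom(values):
        return np.mean(np.abs(values - np.median(values)))

    boot = [fom(rng.choice(per_lesion_values, size=per_lesion_values.size, replace=True))
            for _ in range(2000)]
    low, high = np.percentile(boot, [2.5, 97.5])
    print(f"FoM = {fom(per_lesion_values):.2f}, 95% bootstrap CI [{low:.2f}, {high:.2f}]")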
Assessment of simulation fidelity using measurements of piloting technique in flight
NASA Technical Reports Server (NTRS)
Clement, W. F.; Cleveland, W. B.; Key, D. L.
1984-01-01
The U.S. Army and NASA joined together on a project to conduct a systematic investigation and validation of a ground based piloted simulation of the Army/Sikorsky UH-60A helicopter. Flight testing was an integral part of the validation effort. Nap-of-the-Earth (NOE) piloting tasks which were investigated included the bob-up, the hover turn, the dash/quickstop, the sidestep, the dolphin, and the slalom. Results from the simulation indicate that the pilot's NOE task performance in the simulator is noticeably and quantifiably degraded when compared with the task performance results generated in flight test. The results of the flight test and ground based simulation experiments support a unique rationale for the assessment of simulation fidelity: flight simulation fidelity should be judged quantitatively by measuring pilot's control strategy and technique as induced by the simulator. A quantitative comparison is offered between the piloting technique observed in a flight simulator and that observed in flight test for the same tasks performed by the same pilots.
Kaale, Eliangiringa; Hope, Samuel M; Jenkins, David; Layloff, Thomas
2016-01-01
To assess the quality of cotrimoxazole tablets produced by a Tanzanian manufacturer by a newly instituted quality assurance programme. Tablets underwent a diffuse reflectance spectroscopy procedure with periodic quality assessment confirmation by assay and dissolution testing using validated HPTLC techniques (including weight variation and disintegration evaluations). Based on results from the primary test methods, the first group of product was <80% compliant, whereas subsequent groups reached >99% compliance. This approach provides a model for rapidly assuring product quality of future procurements of other products that is more cost-effective than traditional pharmaceutical testing techniques. © 2015 John Wiley & Sons Ltd.
NASA Technical Reports Server (NTRS)
Oommen, Thomas; Rebbapragada, Umaa; Cerminaro, Daniel
2012-01-01
In this study, we perform a case study on imagery from the Haiti earthquake that evaluates a novel object-based approach for characterizing earthquake induced surface effects of liquefaction against a traditional pixel based change technique. Our technique, which combines object-oriented change detection with discriminant/categorical functions, shows the power of distinguishing earthquake-induced surface effects from changes in buildings using the object properties concavity, convexity, orthogonality and rectangularity. Our results suggest that object-based analysis holds promise in automatically extracting earthquake-induced damages from high-resolution aerial/satellite imagery.
West, Robert; Evans, Adam; Michie, Susan
2011-12-01
To develop a reliable coding scheme for components of group-based behavioral support for smoking cessation, to establish the frequency of inclusion in English Stop-Smoking Service (SSS) treatment manuals of specific components, and to investigate the associations between inclusion of behavior change techniques (BCTs) and service success rates. A taxonomy of BCTs specific to group-based behavioral support was developed and reliability of use assessed. All English SSSs (n = 145) were contacted to request their group-support treatment manuals. BCTs included in the manuals were identified using this taxonomy. Associations between inclusion of specific BCTs and short-term (4-week) self-reported quit outcomes were assessed. Fourteen group-support BCTs were identified with >90% agreement between coders. One hundred and seven services responded to the request for group-support manuals of which 30 had suitable documents. On average, 7 BCTs were included in each manual. Two were positively associated with 4-week quit rates: "communicate group member identities" and a "betting game" (a financial deposit that is lost if a stop-smoking "buddy" relapses). It is possible to reliably code group-specific BCTs for smoking cessation. Fourteen such techniques are present in guideline documents of which 2 appear to be associated with higher short-term self-reported quit rates when included in treatment manuals of English SSSs.
Carbo, Alexander R; Blanco, Paola G; Graeme-Cooke, Fiona; Misdraji, Joseph; Kappler, Steven; Shaffer, Kitt; Goldsmith, Jeffrey D; Berzin, Tyler; Leffler, Daniel; Najarian, Robert; Sepe, Paul; Kaplan, Jennifer; Pitman, Martha; Goldman, Harvey; Pelletier, Stephen; Hayward, Jane N; Shields, Helen M
2012-05-15
In 2008, we changed the gastrointestinal pathology laboratories in a gastrointestinal pathophysiology course to a more interactive format using modified team-based learning techniques and multimedia presentations. The results were remarkably positive and can be used as a model for pathology laboratory improvement in any organ system. Over a two-year period, engaging and interactive pathology laboratories were designed. The initial restructuring of the laboratories included new case material, Digital Atlas of Video Education Project videos, animations and overlays. Subsequent changes included USMLE board-style quizzes at the beginning of each laboratory, with individual readiness assessment testing and group readiness assessment testing, incorporation of a clinician as a co-teacher and role playing for the student groups. Student responses for pathology laboratory contribution to learning improved significantly compared to baseline. Increased voluntary attendance at pathology laboratories was observed. Spontaneous student comments noted the positive impact of the laboratories on their learning. Pathology laboratory innovations, including modified team-based learning techniques with individual and group self-assessment quizzes, multimedia presentations, and paired teaching by a pathologist and clinical gastroenterologist led to improvement in student perceptions of pathology laboratory contributions to their learning and better pathology faculty evaluations. These changes can be universally applied to other pathology laboratories to improve student satisfaction. Copyright © 2012 Elsevier GmbH. All rights reserved.
Efficient use of mobile devices for quantification of pressure injury images.
Garcia-Zapirain, Begonya; Sierra-Sosa, Daniel; Ortiz, David; Isaza-Monsalve, Mariano; Elmaghraby, Adel
2018-01-01
Pressure Injuries are chronic wounds that form due to the constriction of soft tissues against bony prominences. In order to assess these injuries, medical personnel carry out the evaluation and diagnosis using visual methods and manual measurements, which can be inaccurate and may cause discomfort in patients. By using segmentation techniques, Pressure Injuries can be extracted from an image and accurately parameterized, leading to a correct diagnosis. In general, these techniques are based on the solution of differential equations, and the numerical methods involved are demanding in terms of computational resources. In previous work, we proposed a technique developed using toroidal parametric equations for image decomposition and segmentation without solving differential equations. In this paper, we present the development of a mobile application useful for the non-contact assessment of Pressure Injuries based on the toroidal decomposition of images. Using this technique, we achieve an accurate segmentation almost 8 times faster than the Active Contours without Edges (ACWE) and Dynamic Contours methods. We describe the techniques and the implementation for Android devices using Python and Kivy. This application allows for the segmentation and parameterization of injuries, obtaining relevant information for diagnosis and for tracking the evolution of patients' injuries.
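As a concrete illustration of the parameterization step, the sketch below shows how a segmented injury mask could be reduced to clinically useful measurements (area and perimeter) given a pixel-size calibration. It is not the toroidal decomposition described above, whose equations are not reproduced here; the function name and the `pixel_size_cm` input are assumptions.

```python
# Minimal sketch of parameterizing a segmented pressure-injury mask.
# This is not the toroidal decomposition described above; it only illustrates
# the kind of non-contact measurements (area, perimeter) that a segmentation
# makes possible. `pixel_size_cm` is an assumed calibration input.
from skimage import measure

def wound_parameters(mask, pixel_size_cm):
    """mask: 2-D boolean array from segmentation; pixel_size_cm: cm per pixel."""
    labeled = measure.label(mask)
    props = measure.regionprops(labeled)
    if not props:
        return {"area_cm2": 0.0, "perimeter_cm": 0.0}
    wound = max(props, key=lambda r: r.area)       # largest segmented region
    return {
        "area_cm2": wound.area * pixel_size_cm ** 2,
        "perimeter_cm": wound.perimeter * pixel_size_cm,
    }
```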
Kate, Rohit J.; Swartz, Ann M.; Welch, Whitney A.; Strath, Scott J.
2016-01-01
Wearable accelerometers can be used to objectively assess physical activity. However, the accuracy of this assessment depends on the underlying method used to process the time series data obtained from accelerometers. Several methods have been proposed that use these data to identify the type of physical activity and estimate its energy cost. Most of the newer methods employ some machine learning technique along with suitable features to represent the time series data. This paper experimentally compares several of these techniques and features on a large dataset of 146 subjects performing eight different physical activities while wearing an accelerometer on the hip. Besides features based on statistics, distance-based features and simple discrete features taken straight from the time series were also evaluated. On the physical activity type identification task, the results show that using more features significantly improves results. Choice of machine learning technique was also found to be important. However, on the energy cost estimation task, the choice of features and machine learning technique were found to be less influential. On that task, separate energy cost estimation models trained specifically for each type of physical activity were found to be more accurate than a single model trained for all types of physical activities. PMID:26862679
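The sketch below illustrates the general pattern such methods follow: compute statistics-based features from fixed windows of tri-axial accelerometer data and feed them to a machine learning classifier. The feature set and the choice of a random forest are illustrative assumptions; the paper compares several techniques rather than prescribing this one.

```python
# Hedged sketch: statistics-based features from accelerometer windows feeding a
# generic classifier. Feature choices and the classifier are illustrative.
import numpy as np
from sklearn.ensemble import RandomForestClassifier

def window_features(window):
    """window: (n_samples, 3) array of tri-axial accelerometer values."""
    mags = np.linalg.norm(window, axis=1)
    feats = []
    for axis in range(window.shape[1]):
        x = window[:, axis]
        feats += [x.mean(), x.std(), x.min(), x.max()]
    feats += [mags.mean(), mags.std()]
    return np.array(feats)

def train_activity_classifier(windows, labels):
    """windows: list of per-window arrays; labels: activity type per window."""
    X = np.vstack([window_features(w) for w in windows])
    clf = RandomForestClassifier(n_estimators=200, random_state=0)
    clf.fit(X, labels)
    return clf
```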
Microbial monitoring of spacecraft and associated environments
NASA Technical Reports Server (NTRS)
La Duc, M. T.; Kern, R.; Venkateswaran, K.
2004-01-01
Rapid microbial monitoring technologies are invaluable in assessing contamination of spacecraft and associated environments. Universal and widespread elements of microbial structure and chemistry are logical targets for assessing microbial burden. Several biomarkers, such as ATP, LPS, and DNA (ribosomal or spore-specific), were targeted to quantify either total bioburden or specific types of microbial contamination. The findings of these assays were compared with conventional, culture-dependent methods. This review evaluates the applicability and efficacy of some of these methods in monitoring the microbial burden of spacecraft and associated environments. Samples were collected from the surfaces of spacecraft, from surfaces of assembly facilities, and from drinking water reservoirs aboard the International Space Station (ISS). Culture-dependent techniques found species of Bacillus to be dominant on these surfaces. In contrast, rapid, culture-independent techniques revealed the presence of many Gram-positive and Gram-negative microorganisms, as well as actinomycetes and fungi. These included both cultivable and noncultivable microbes, findings further confirmed by DNA-based microbial detection techniques. Although the ISS drinking water was devoid of cultivable microbes, molecular-based techniques retrieved DNA sequences of numerous opportunistic pathogens. Each of the methods tested in this study has its advantages, and by coupling two or more of these techniques, even more reliable information on microbial burden can be obtained rapidly. Copyright 2004 Springer-Verlag.
A novel spectral imaging system for quantitative analysis of hypertrophic scar
NASA Astrophysics Data System (ADS)
Ghassemi, Pejhman; Shupp, Jeffrey W.; Moffatt, Lauren T.; Ramella-Roman, Jessica C.
2013-03-01
Scarring can lead to significant cosmetic, psychosocial, and functional consequences in patients with hypertrophic scars from burn and trauma injuries. Therefore, quantitative assessment of scars is needed in clinical diagnosis and treatment. The Vancouver Scar Scale (VSS), the accepted clinical scar assessment tool, was introduced in the nineties and relies only on the physician's subjective evaluation of skin pliability, height, vascularity, and pigmentation. To date, no entirely objective method has been available for scar assessment, so there is a continued need for better techniques to monitor patients with scars. We introduce a new spectral imaging system combining out-of-plane Stokes polarimetry, Spatial Frequency Domain Imaging (SFDI), and three-dimensional (3D) reconstruction. The main idea behind this system is to estimate the hemoglobin and melanin contents of the scar using the SFDI technique, roughness and directional anisotropy features with Stokes polarimetry, and height and general shape with 3D reconstruction. Our proposed tool has several advantages compared to current methodologies. First and foremost, it is non-contact and non-invasive and thus can be used at any stage in wound healing without causing harm to the patient. Second, the height, pigmentation, and hemoglobin assessments are co-registered and are based on imaging rather than point measurement, allowing for more meaningful interpretation of the data. Finally, the algorithms used in the data analysis are physics based, which will be very beneficial in the standardization of the technique. A swine model has also been developed for hypertrophic scarring and an ongoing pre-clinical evaluation of the technique is being conducted.
Wehage, Kristopher; Chenhansa, Panan; Schoenung, Julie M
2017-01-01
GreenScreen® for Safer Chemicals is a framework for comparative chemical hazard assessment. It is the first transparent, open and publicly accessible framework of its kind, allowing manufacturers and governmental agencies to make informed decisions about the chemicals and substances used in consumer products and buildings. In the GreenScreen® benchmarking process, chemical hazards are assessed and classified based on 18 hazard endpoints from up to 30 different sources. The result is a simple numerical benchmark score and accompanying assessment report that allows users to flag chemicals of concern and identify safer alternatives. Although the screening process is straightforward, aggregating and sorting hazard data is tedious, time-consuming, and prone to human error. In light of these challenges, the present work demonstrates the usage of automation to cull chemical hazard data from publicly available internet resources, assign metadata, and perform a GreenScreen® hazard assessment using the GreenScreen® "List Translator." The automated technique, written as a module in the Python programming language, generates GreenScreen® List Translation data for over 3000 chemicals in approximately 30 s. Discussion of the potential benefits and limitations of automated techniques is provided. By embedding the library into a web-based graphical user interface, the extensibility of the library is demonstrated. The accompanying source code is made available to the hazard assessment community. Integr Environ Assess Manag 2017;13:167-176. © 2016 SETAC. © 2016 SETAC.
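A minimal, hypothetical sketch of the aggregation step is shown below: per-endpoint hazard listings pulled from multiple sources are collapsed into a simple screening flag. The endpoint names, level codes and flagging rule are assumptions for illustration only and do not reproduce the actual GreenScreen® benchmarking or List Translator rules.

```python
# Hypothetical sketch of aggregating per-endpoint hazard listings into a simple
# screening flag, in the spirit of the automated List Translation described
# above. Endpoint names, level codes and the flagging rule are assumptions,
# not the actual GreenScreen(R) rules.
def screen_chemical(endpoint_listings):
    """endpoint_listings: dict mapping hazard endpoint -> list of (source, level)
    tuples, where level is 'High', 'Moderate' or 'Low'."""
    high_hits = {ep: [src for src, lvl in hits if lvl == "High"]
                 for ep, hits in endpoint_listings.items()}
    high_hits = {ep: srcs for ep, srcs in high_hits.items() if srcs}
    return {
        "flagged": bool(high_hits),            # chemical of potential concern
        "high_hazard_endpoints": sorted(high_hits),
    }

# Example usage with made-up data:
result = screen_chemical({
    "Carcinogenicity": [("Authoritative list A", "High")],
    "Acute toxicity": [("Screening list B", "Low")],
})
```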
Reviewing effectiveness of ankle assessment techniques for use in robot-assisted therapy.
Zhang, Mingming; Davies, T Claire; Zhang, Yanxin; Xie, Shane
2014-01-01
This article provides a comprehensive review of studies that investigated ankle assessment techniques to better understand those that can be used in the real-time monitoring of rehabilitation progress for implementation in conjunction with robot-assisted therapy. Seventy-six publications published between January 1980 and August 2013 were selected from eight databases. They were divided into two main categories (16 qualitative and 60 quantitative studies): 13 goniometer studies, 18 dynamometer studies, and 29 studies about innovative techniques. A total of 465 subjects participated in the 29 quantitative studies of innovative measurement techniques that may potentially be integrated in a real-time monitoring device, of which 19 studies included fewer than 10 participants. Results show that qualitative ankle assessment methods are not suitable for real-time monitoring in robot-assisted therapy, though they are reliable for certain patients, while the quantitative methods show great potential. The majority of quantitative techniques are reliable in measuring ankle kinematics and kinetics but are usually available only for use in the sagittal plane. Limited studies determine kinematics and kinetics in all three planes (sagittal, transverse, and frontal) where motions of the ankle joint and the subtalar joint actually occur.
Cardiac output monitoring using indicator-dilution techniques: basics, limits, and perspectives.
Reuter, Daniel A; Huang, Cecil; Edrich, Thomas; Shernan, Stanton K; Eltzschig, Holger K
2010-03-01
The ability to monitor cardiac output is one of the important cornerstones of hemodynamic assessment for managing critically ill patients at increased risk for developing cardiac complications, and in particular in patients with preexisting cardiovascular comorbidities. For >30 years, single-bolus thermodilution measurement through a pulmonary artery catheter for assessment of cardiac output has been widely accepted as the "clinical standard" for advanced hemodynamic monitoring. In this article, we review this clinical standard, along with current alternatives also based on the indicator-dilution technique, such as the transcardiopulmonary thermodilution and lithium dilution techniques. In this review, not only the underlying technical principles and the unique features but also the limitations of each application of indicator dilution are outlined.
DOT National Transportation Integrated Search
2010-07-01
The objective of this work was to develop a low-cost portable damage detection tool to assess and predict damage areas in highway bridges. The proposed tool was based on standard vibration-based damage identification (VBDI) techniques but...
CONTINUOUS FORMALDEHYDE MEASUREMENT SYSTEM BASED ON MODIFIED FOURIER TRANSFORM INFRARED SPECTROSCOPY
EPA is developing advanced open-path and cell-based optical techniques for time-resolved measurement of priority hazardous air pollutants such as formaldehyde (HCHO). Due to its high National Air Toxics Assessment risk factor, there is increasing interest in continuous measuremen...
20180312 - Structure-based QSAR Models to Predict Systemic Toxicity Points of Departure (SOT)
Human health risk assessment associated with environmental chemical exposure is limited by the tens of thousands of chemicals with little or no experimental in vivo toxicity data. Data gap filling techniques, such as quantitative structure activity relationship (QSAR) models base...
Khan, F I; Abbasi, S A
2000-07-10
Fault tree analysis (FTA) is based on constructing a hypothetical tree of base events (initiating events) branching into numerous other sub-events, propagating the fault and eventually leading to the top event (accident). It has been a powerful technique used traditionally in identifying hazards in nuclear installations and power industries. As the systematic articulation of the fault tree is associated with assigning probabilities to each fault, the exercise is also sometimes called probabilistic risk assessment. But powerful as this technique is, it is also very cumbersome and costly, limiting its area of application. We have developed a new algorithm based on analytical simulation (named as AS-II), which makes the application of FTA simpler, quicker, and cheaper; thus opening up the possibility of its wider use in risk assessment in chemical process industries. Based on the methodology we have developed a computer-automated tool. The details are presented in this paper.
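The probability propagation at the heart of FTA can be sketched in a few lines: base-event probabilities are combined through OR and AND gates up to the top event, assuming independent events. The two-gate tree below is a made-up example, not the AS-II algorithm described in the paper.

```python
# Minimal sketch of fault tree probability propagation, assuming independent
# base events. The example tree is hypothetical, not the AS-II algorithm.
def gate_or(probs):
    """P(at least one input event occurs), assuming independence."""
    p = 1.0
    for q in probs:
        p *= (1.0 - q)
    return 1.0 - p

def gate_and(probs):
    """P(all input events occur), assuming independence."""
    p = 1.0
    for q in probs:
        p *= q
    return p

# Hypothetical tree: top event = (pump fails OR valve fails) AND alarm fails
p_pump, p_valve, p_alarm = 1e-3, 5e-4, 1e-2
p_top = gate_and([gate_or([p_pump, p_valve]), p_alarm])
print(f"Top-event probability: {p_top:.2e}")
```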
Ishikawa, Shun; Okamoto, Shogo; Isogai, Kaoru; Akiyama, Yasuhiro; Yanagihara, Naomi; Yamada, Yoji
2015-01-01
Robots that simulate patients suffering from joint resistance caused by biomechanical and neural impairments are used to aid the training of physical therapists in manual examination techniques. However, there are few methods for assessing such robots. This article proposes two types of assessment measures based on typical judgments of clinicians. One of the measures involves the evaluation of how well the simulator presents different severities of a specified disease. Experienced clinicians were requested to rate the simulated symptoms in terms of severity, and the consistency of their ratings was used as a performance measure. The other measure involves the evaluation of how well the simulator presents different types of symptoms. In this case, the clinicians were requested to classify the simulated resistances in terms of symptom type, and the average ratios of their answers were used as performance measures. For both types of assessment measures, a higher index implied higher agreement among the experienced clinicians that subjectively assessed the symptoms based on typical symptom features. We applied these two assessment methods to a patient knee robot and achieved positive appraisals. The assessment measures have potential for use in comparing several patient simulators for training physical therapists, rather than as absolute indices for developing a standard. PMID:25923719
Khan, Muhammad Burhan; Nisar, Humaira; Ng, Choon Aun; Yeap, Kim Ho; Lai, Koon Chun
2017-12-01
Image processing and analysis is an effective tool for monitoring and fault diagnosis of activated sludge (AS) wastewater treatment plants. AS images comprise flocs (microbial aggregates) and filamentous bacteria. In this paper, nine different approaches are proposed for image segmentation of phase-contrast microscopic (PCM) images of AS samples. The proposed strategies are assessed for their effectiveness from the perspective of microscopic artifacts associated with PCM. The first approach uses an algorithm based on the idea that color space representations of images other than red-green-blue may have better contrast. The second uses an edge detection approach. The third strategy employs a clustering algorithm for the segmentation, and the fourth applies local adaptive thresholding. The fifth technique is based on texture-based segmentation and the sixth uses the watershed algorithm. The seventh adopts a split-and-merge approach. The eighth employs Kittler's thresholding. Finally, the ninth uses a top-hat and bottom-hat filtering-based technique. The approaches are assessed and analyzed critically with reference to the artifacts of PCM. Gold approximations of ground truth images are prepared to assess the segmentations. Overall, the edge detection-based approach exhibits the best results in terms of accuracy, and the texture-based algorithm in terms of false negative ratio. The respective scenarios are explained for the suitability of the edge detection and texture-based algorithms.
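As an illustration of the best-performing strategy in terms of accuracy, the following is a minimal sketch of an edge-detection-based segmentation of a grayscale PCM image using Canny edges, morphological closing and hole filling. The parameter values (sigma, closing radius, minimum object size) are assumptions, not the settings used in the study.

```python
# Hedged sketch of an edge-detection-based floc segmentation for grayscale PCM
# images. Parameter values are illustrative assumptions.
from skimage import feature, morphology
from scipy import ndimage as ndi

def segment_flocs_edges(gray_image, sigma=2.0, closing_radius=3, min_size=64):
    edges = feature.canny(gray_image, sigma=sigma)                  # floc boundaries
    closed = morphology.binary_closing(edges, morphology.disk(closing_radius))
    filled = ndi.binary_fill_holes(closed)                          # solid floc regions
    return morphology.remove_small_objects(filled, min_size=min_size)
```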
Ronald E. McRoberts; Warren B. Cohen; Erik Naesset; Stephen V. Stehman; Erkki O. Tomppo
2010-01-01
Tremendous advances in the construction and assessment of forest attribute maps and related spatial products have been realized in recent years, partly as a result of the use of remotely sensed data as an information source. This review focuses on the current state of techniques for the construction and assessment of remote sensing-based maps and addresses five topic...
An assessment of the adaptive unstructured tetrahedral grid, Euler Flow Solver Code FELISA
NASA Technical Reports Server (NTRS)
Djomehri, M. Jahed; Erickson, Larry L.
1994-01-01
A three-dimensional solution-adaptive Euler flow solver for unstructured tetrahedral meshes is assessed, and the accuracy and efficiency of the method for predicting sonic boom pressure signatures about simple generic models are demonstrated. Comparison of computational and wind tunnel data and enhancement of numerical solutions by means of grid adaptivity are discussed. The mesh generation is based on the advancing front technique. The FELISA code consists of two solvers, the Taylor-Galerkin and the Runge-Kutta-Galerkin schemes, both of which are spatially discretized by the usual Galerkin weighted residual finite-element methods but with different explicit time-marching schemes to steady state. The solution-adaptive grid procedure is based on either remeshing or mesh refinement techniques. An alternative geometry adaptive procedure is also incorporated.
Kassou, Koussila; Remram, Youcef; Laugier, Pascal; Minonzio, Jean-Gabriel
2017-11-01
Guided waves-based techniques are currently under development for quantitative cortical bone assessment. However, the signal interpretation is challenging due to multiple mode overlapping. To overcome this limitation, dry point-contact transducers have been used at low frequencies for a selective excitation of the zeroth-order anti-symmetric Lamb A0 mode, a mode whose dispersion characteristics can be used to infer the thickness of the waveguide. In this paper, our purpose was to extend the technique by combining a dry point-contact transducer approach with the SVD-enhanced 2-D Fourier transform in order to measure the dispersion characteristics of the flexural mode. The robustness of our approach is assessed on bone-mimicking phantoms covered or not with a soft tissue-mimicking layer. Experiments were also performed on a bovine bone. Dispersion characteristics of measured modes were extracted using a SVD-based signal processing technique. The thickness was obtained by fitting a free plate model to experimental data. The results show that, in all studied cases, the estimated thickness values are in good agreement with the actual thickness values. From the results, we speculate that in vivo cortical thickness assessment by measuring the flexural wave using point-contact transducers is feasible. However, this assumption has to be confirmed by further in vivo studies. Copyright © 2017 Elsevier B.V. All rights reserved.
Theorists and Techniques: Connecting Education Theories to Lamaze Teaching Techniques
Podgurski, Mary Jo
2016-01-01
Should childbirth educators connect education theory to technique? Is there more to learning about theorists than memorizing facts for an assessment? Are childbirth educators uniquely poised to glean wisdom from theorists and enhance their classes with interactive techniques inspiring participant knowledge and empowerment? Yes, yes, and yes. This article will explore how an awareness of education theory can enhance retention of material through interactive learning techniques. Lamaze International childbirth classes already prepare participants for the childbearing year by using positive group dynamics; theory will empower childbirth educators to address education through well-studied avenues. Childbirth educators can provide evidence-based learning techniques in their classes and create true behavioral change. PMID:26848246
Earth Observation System Flight Dynamics System Covariance Realism
NASA Technical Reports Server (NTRS)
Zaidi, Waqar H.; Tracewell, David
2016-01-01
This presentation applies a covariance realism technique to the National Aeronautics and Space Administration (NASA) Earth Observation System (EOS) Aqua and Aura spacecraft based on inferential statistics. The technique consists of three parts: collection of definitive state estimates through orbit determination, calculation of covariance realism test statistics at each covariance propagation point, and proper assessment of those test statistics.
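A common way to build such a test statistic, shown in the hedged sketch below, is the squared Mahalanobis distance of the definitive estimate with respect to the propagated state and covariance, which should follow a chi-square distribution when the covariance is realistic. This illustrates the general inferential idea only; it is not the specific EOS procedure.

```python
# Hedged sketch of a covariance realism test statistic and a crude consistency
# check. Illustrative only; not the specific EOS assessment procedure.
import numpy as np
from scipy import stats

def realism_statistic(x_prop, P_prop, x_definitive):
    """Squared Mahalanobis distance of the definitive state with respect to the
    propagated state estimate and covariance."""
    dx = np.asarray(x_definitive) - np.asarray(x_prop)
    return float(dx @ np.linalg.solve(P_prop, dx))

def covariance_consistent(statistics, dof, alpha=0.05):
    """Check the mean of the statistics against its chi-square expectation."""
    n = len(statistics)
    mean_stat = np.mean(statistics)
    lo, hi = stats.chi2.ppf([alpha / 2, 1 - alpha / 2], dof * n) / n
    return lo <= mean_stat <= hi
```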
ERIC Educational Resources Information Center
Bolton Inst., Inc., Wellesley, MA.
This model, developed and tested in Vermont, involved a two-day assembly using a technique concentrating on small-group dynamics to encourage productive community assessment and decision-making. Included are exhaustive documentation of an example assembly to consider options for integrating human and environmental requirements in Vermont as a…
Denys Yemshanov; Frank H. Koch; Mark Ducey; Klaus Koehler
2013-01-01
Geographic mapping of risks is a useful analytical step in ecological risk assessments and in particular, in analyses aimed to estimate risks associated with introductions of invasive organisms. In this paper, we approach invasive species risk mapping as a portfolio allocation problem and apply techniques from decision theory to build an invasion risk map that combines...
ERIC Educational Resources Information Center
Saleh, Salmiza
2012-01-01
The aim of this study was to assess the effectiveness of Brain Based Teaching Approach in enhancing students' scientific understanding of Newtonian Physics in the context of Form Four Physics instruction. The technique was implemented based on the Brain Based Learning Principles developed by Caine & Caine (1991, 2003). This brain compatible…
The mobile image quality survey game
NASA Astrophysics Data System (ADS)
Rasmussen, D. René
2012-01-01
In this paper we discuss human assessment of the quality of photographic still images that are degraded in various ways relative to an original, for example due to compression or noise. In particular, we examine and present results from a technique in which observers view images on a mobile device, perform pairwise comparisons, identify defects in the images, and interact with the display to indicate the location of the defects. The technique measures the response time and accuracy of the responses. By posing the survey in a form similar to a game and providing performance feedback to the observer, the technique attempts to increase observer engagement and to avoid observer fatigue, which is often a problem for subjective surveys. The results are compared with the known physical magnitudes of the defects and with results from similar web-based surveys. The strengths and weaknesses of the technique are discussed. Possible extensions of the technique to video quality assessment are also discussed.
Advanced reliability modeling of fault-tolerant computer-based systems
NASA Technical Reports Server (NTRS)
Bavuso, S. J.
1982-01-01
Two methodologies for the reliability assessment of fault tolerant digital computer based systems are discussed. The computer-aided reliability estimation 3 (CARE 3) and gate logic software simulation (GLOSS) are assessment technologies that were developed to mitigate a serious weakness in the design and evaluation process of ultrareliable digital systems. The weak link is based on the unavailability of a sufficiently powerful modeling technique for comparing the stochastic attributes of one system against others. Some of the more interesting attributes are reliability, system survival, safety, and mission success.
Automatic Grading of 3D Computer Animation Laboratory Assignments
ERIC Educational Resources Information Center
Lamberti, Fabrizio; Sanna, Andrea; Paravati, Gianluca; Carlevaris, Gilles
2014-01-01
Assessment is a delicate task in the overall teaching process because it may require significant time and may be prone to subjectivity. Subjectivity is especially true for disciplines in which perceptual factors play a key role in the evaluation. In previous decades, computer-based assessment techniques were developed to address the…
A Fault Tree Approach to Needs Assessment -- An Overview.
ERIC Educational Resources Information Center
Stephens, Kent G.
A "failsafe" technology is presented based on a new unified theory of needs assessment. Basically the paper discusses fault tree analysis as a technique for enhancing the probability of success in any system by analyzing the most likely modes of failure that could occur and then suggesting high priority avoidance strategies for those…
The Social Validity Assessment of Social Competence Intervention Behavior Goals
ERIC Educational Resources Information Center
Hurley, Jennifer J.; Wehby, Joseph H.; Feurer, Irene D.
2010-01-01
Social validation is the value judgment from society on the importance of a study. The social validity of behavior goals used in the social competence intervention literature was assessed using the Q-sort technique. The stimulus items were 80 different social competence behavior goals taken from 78 classroom-based social competence intervention…
Confidence-Based Assessments within an Adult Learning Environment
ERIC Educational Resources Information Center
Novacek, Paul
2013-01-01
Traditional knowledge assessments rely on multiple-choice type questions that only report a right or wrong answer. The reliance within the education system on this technique infers that a student who provides a correct answer purely through guesswork possesses knowledge equivalent to a student who actually knows the correct answer. A more complete…
Pallaro, Anabel; Tarducci, Gabriel
2014-12-01
The application of nuclear techniques in the area of nutrition is safe because they use stable isotopes. The deuterium dilution method is used in body composition and human milk intake analysis. It is a reference method for body fat and is used to validate inexpensive tools, owing to its accuracy, its simplicity of application in individuals and populations, and its proven usefulness in adults and children as an evaluation tool in clinical and health programs. It is a non-invasive technique, as it uses saliva, which facilitates assessment in pediatric populations. Changes in body fat are associated with non-communicable diseases; moreover, normal-weight individuals with high fat deposition have been reported. Furthermore, this technique is the only accurate way to determine whether infants are exclusively breast-fed and to validate conventional methods based on surveys of mothers.
Requirements for radiation emergency urine bioassay techniques for the public and first responders.
Li, Chunsheng; Vlahovich, Slavica; Dai, Xiongxin; Richardson, Richard B; Daka, Joseph N; Kramer, Gary H
2010-11-01
Following a radiation emergency, the affected public and the first responders may need to be quickly assessed for internal contamination by the radionuclides involved. Urine bioassay is one of the most commonly used methods for assessing radionuclide intake and radiation dose. This paper attempts to derive the sensitivity requirements (from inhalation exposure) for the urine bioassay techniques for the top 10 high-risk radionuclides that might be used in a terrorist attack. The requirements are based on a proposed reference dose to adults of 0.1 Sv (CED, committed effective dose). In addition, requirements related to sample turnaround time and field deployability of the assay techniques are also discussed. A review of currently available assay techniques summarized in this paper reveals that method development for ²⁴¹Am, ²²⁶Ra, ²³⁸Pu, and ⁹⁰Sr urine bioassay is needed.
NASA Astrophysics Data System (ADS)
Lewis, Donna L.; Phinn, Stuart
2011-01-01
Aerial photography interpretation is the most common mapping technique in the world. However, unlike an algorithm-based classification of satellite imagery, the accuracy of maps generated by aerial photography interpretation is rarely assessed. Vegetation communities covering an area of 530 km2 on Bullo River Station, Northern Territory, Australia, were mapped using an interpretation of 1:50,000 color aerial photography. Manual stereoscopic line-work was delineated at 1:10,000 and thematic maps were generated at 1:25,000 and 1:100,000. Multivariate and intuitive analysis techniques were employed to identify 22 vegetation communities within the study area. The accuracy assessment was based on 50% of a field dataset collected over a 4 year period (2006 to 2009); the remaining 50% of sites were used for map attribution. The overall accuracy and Kappa coefficient for both thematic maps were 66.67% and 0.63, respectively, calculated from standard error matrices. Our findings highlight the need for appropriate scales of mapping and for accuracy assessment of vegetation community maps generated by aerial photography interpretation.
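For reference, the overall accuracy and Kappa coefficient quoted above can be computed from a standard error (confusion) matrix as in the sketch below; the example matrix values are placeholders.

```python
# Minimal sketch of overall accuracy and Cohen's kappa from an error matrix.
# The example matrix is a placeholder, not data from the study.
import numpy as np

def overall_accuracy_and_kappa(error_matrix):
    m = np.asarray(error_matrix, dtype=float)
    n = m.sum()
    p_observed = np.trace(m) / n                                  # overall accuracy
    p_expected = (m.sum(axis=0) * m.sum(axis=1)).sum() / n ** 2   # chance agreement
    kappa = (p_observed - p_expected) / (1.0 - p_expected)
    return p_observed, kappa

# Example with a made-up 3-class matrix (rows: mapped class, cols: reference class)
oa, kappa = overall_accuracy_and_kappa([[50, 5, 2], [4, 40, 6], [1, 3, 30]])
```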
NASA Astrophysics Data System (ADS)
McShane, Gareth; James, Mike R.; Quinton, John; Anderson, Karen; DeBell, Leon; Evans, Martin; Farrow, Luke; Glendell, Miriam; Jones, Lee; Kirkham, Matthew; Lark, Murray; Rawlins, Barry; Rickson, Jane; Quine, Tim; Wetherelt, Andy; Brazier, Richard
2014-05-01
3D topographic or surface models are increasingly being utilised for a wide range of applications and are established tools in geomorphological research. In this pilot study, 'A cost-effective framework for monitoring soil erosion in England and Wales', funded by the UK Department for Environment, Food and Rural Affairs (Defra), we compare methods of collecting topographic measurements via remote sensing for detailed studies of dynamic processes such as erosion and mass movement. The techniques assessed are terrestrial laser scanning (TLS), unmanned aerial vehicle (UAV) photography and ground-based photography, the latter two processed using structure-from-motion (SfM) 3D reconstruction software. The methods will be applied in regions of different land use, including arable and horticultural, upland and semi-natural habitats, and grassland, to quantify visible erosion pathways at the site scale. Volumetric estimates of soil loss will be quantified using the digital surface models (DSMs) provided by each technique and a modelled pre-erosion surface. Visible erosion and severity will be independently established through each technique, with their results compared and their combined effectiveness assessed. A fixed delta-wing UAV (QuestUAV, http://www.questuav.com/) captures photos from a range of altitudes and angles over the study area, with automated SfM-based processing enabling rapid orthophoto production to support ground-based data acquisition. At sites with erosion features of suitable scale, UAV data will also provide a DSM for volume loss measurement. Terrestrial laser scanning will provide detailed, accurate, high-density measurements of the ground surface over long (100s m) distances. Ground-based photography is anticipated to be most useful for characterising small and difficult-to-view features. By using a consumer-grade digital camera and an SfM-based approach (using Agisoft Photoscan version 1.0.0, http://www.agisoft.ru/products/photoscan/), less expertise and fewer control measurements are required compared with traditional photogrammetry, and image processing is automated. Differential GPS data will be used to geo-reference the models to facilitate comparison. The relative advantages, limitations and cost-effectiveness of each approach will be established, and the technique, or combination of techniques, most appropriate for monitoring, modelling and estimating soil erosion at the national scale will be determined.
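The volumetric soil-loss estimate described above amounts to differencing the measured DSM against the modelled pre-erosion surface and summing the deficit over the cell area, as in the following sketch; the array names and cell size are assumptions.

```python
# Hedged sketch of a DSM-difference volume estimate. Array names and the cell
# size are assumptions; both grids must be co-registered elevation rasters (m).
import numpy as np

def eroded_volume(pre_erosion_surface, dsm, cell_size_m):
    """Return the estimated eroded volume in cubic metres."""
    depth = pre_erosion_surface - dsm          # positive where material was lost
    depth = np.where(depth > 0, depth, 0.0)    # ignore deposition/noise above the model
    return float(depth.sum() * cell_size_m ** 2)
```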
MO-C-18A-01: Advances in Model-Based 3D Image Reconstruction
DOE Office of Scientific and Technical Information (OSTI.GOV)
Chen, G; Pan, X; Stayman, J
2014-06-15
Recent years have seen the emergence of CT image reconstruction techniques that exploit physical models of the imaging system, photon statistics, and even the patient to achieve improved 3D image quality and/or reduction of radiation dose. With numerous advantages in comparison to conventional 3D filtered backprojection, such techniques bring a variety of challenges as well, including: a demanding computational load associated with sophisticated forward models and iterative optimization methods; nonlinearity and nonstationarity in image quality characteristics; a complex dependency on multiple free parameters; and the need to understand how best to incorporate prior information (including patient-specific prior images) within the reconstruction process. The advantages, however, are even greater – for example: improved image quality; reduced dose; robustness to noise and artifacts; task-specific reconstruction protocols; suitability to novel CT imaging platforms and noncircular orbits; and incorporation of known characteristics of the imager and patient that are conventionally discarded. This symposium features experts in 3D image reconstruction, image quality assessment, and the translation of such methods to emerging clinical applications. Dr. Chen will address novel methods for the incorporation of prior information in 3D and 4D CT reconstruction techniques. Dr. Pan will show recent advances in optimization-based reconstruction that enable potential reduction of dose and sampling requirements. Dr. Stayman will describe a "task-based imaging" approach that leverages models of the imaging system and patient in combination with a specification of the imaging task to optimize both the acquisition and reconstruction process. Dr. Samei will describe the development of methods for image quality assessment in such nonlinear reconstruction techniques and the use of these methods to characterize and optimize image quality and dose in a spectrum of clinical applications. Learning Objectives: Learn the general methodologies associated with model-based 3D image reconstruction. Learn the potential advantages in image quality and dose associated with model-based image reconstruction. Learn the challenges associated with computational load and image quality assessment for such reconstruction methods. Learn how imaging task can be incorporated as a means to drive optimal image acquisition and reconstruction techniques. Learn how model-based reconstruction methods can incorporate prior information to improve image quality, ease sampling requirements, and reduce dose.
Cryptosporidium is an important waterborne protozoan parasite that can cause severe diarrhea and death in the immunocompromised. Current methods to monitor for Cryptosporidium oocysts in water are microscopy-based USEPA Methods 1622 and 1623. These methods assess total levels o...
Zinken, Katarzyna M; Cradock, Sue; Skinner, T Chas
2008-08-01
The paper presents the development of a coding tool for self-efficacy orientated interventions in diabetes self-management programmes (Analysis System for Self-Efficacy Training, ASSET) and explores its construct validity and clinical utility. Based on four sources of self-efficacy (i.e., mastery experience, role modelling, verbal persuasion and physiological and affective states), published self-efficacy based interventions for diabetes care were analysed in order to identify specific verbal behavioural techniques. Video-recorded facilitating behaviours were evaluated using ASSET. The reliability between four coders was high (K=0.71). ASSET enabled assessment of both self-efficacy based techniques and participants' response to those techniques. Individual patterns of delivery and shifts over time across facilitators were found. In the presented intervention we observed that self-efficacy utterances were followed by longer patient verbal responses than non-self-efficacy utterances. These detailed analyses with ASSET provide rich data and give the researcher an insight into the underlying mechanism of the intervention process. By providing a detailed description of self-efficacy strategies ASSET can be used by health care professionals to guide reflective practice and support training programmes.
Pauchard, Y; Smith, M; Mintchev, M
2004-01-01
Magnetic resonance imaging (MRI) suffers from geometric distortions arising from various sources. One such source is the non-linearities associated with the presence of metallic implants, which can profoundly distort the obtained images. These non-linearities result in pixel shifts and intensity changes in the vicinity of the implant, often precluding any meaningful assessment of the entire image. This paper presents a method for correcting these distortions based on non-rigid image registration techniques. Two images from a modelled three-dimensional (3D) grid phantom were subjected to point-based thin-plate spline registration. The reference image (without distortions) was obtained from a grid model including a spherical implant, and the corresponding test image containing the distortions was obtained using a previously reported technique for spatial modelling of magnetic susceptibility artifacts. After identifying the non-recoverable area in the distorted image, the calculated spline model was able to quantitatively account for the distortions, thus facilitating their compensation. Upon the completion of the compensation procedure, the non-recoverable area was removed from the reference image and the latter was compared to the compensated image. A quantitative assessment of the goodness of the proposed compensation technique is presented.
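A hedged sketch of the point-based correction idea follows: matched control points define a smooth thin-plate spline displacement field that is then used to resample the distorted image. SciPy's RBFInterpolator with a thin-plate-spline kernel is used here as a stand-in for the authors' spline model, and the function and argument names are assumptions.

```python
# Hedged sketch of point-based thin-plate spline distortion correction.
# scipy's RBFInterpolator (thin-plate-spline kernel) stands in for the
# authors' spline model; names and interpolation order are assumptions.
import numpy as np
from scipy.interpolate import RBFInterpolator
from scipy.ndimage import map_coordinates

def tps_correct(distorted, pts_distorted, pts_reference):
    """pts_*: (N, 2) arrays of matched (row, col) control points."""
    # Map reference-frame coordinates to the locations they came from in the
    # distorted image, then resample the distorted image on the output grid.
    tps = RBFInterpolator(pts_reference, pts_distorted, kernel="thin_plate_spline")
    rows, cols = np.mgrid[0:distorted.shape[0], 0:distorted.shape[1]]
    grid = np.column_stack([rows.ravel(), cols.ravel()]).astype(float)
    src = tps(grid)
    corrected = map_coordinates(distorted, [src[:, 0], src[:, 1]], order=1)
    return corrected.reshape(distorted.shape)
```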
Failure warning of hydrous sandstone based on electroencephalogram technique
NASA Astrophysics Data System (ADS)
Tao, Kai; Zheng, Wei
2018-06-01
Sandstone is a type of rock mass that widely exists in nature. Moisture is an important factor that leads to sandstone structural failure. The major failure assessment methods for hydrous sandstone at present cannot satisfy real-time and portability requirements and, in particular, lack a warning function. In this study, acoustic emission (AE) and computed tomography (CT) techniques are combined for real-time failure assessment of hydrous sandstone. Eight visual warning colors are screened according to different failure states, and an electroencephalogram (EEG) experiment is conducted to demonstrate the different levels of attention they elicit in the human brain.
Camacho-Basallo, Paula; Yáñez-Vico, Rosa-María; Solano-Reina, Enrique; Iglesias-Linares, Alejandro
2017-03-01
The need for accurate techniques for estimating age has sharply increased in line with the rise in illegal migration and the political, economic and socio-demographic problems that this poses in developed countries today. The methods routinely employed for determining chronological age are mainly based on assessing skeletal maturation using radiological techniques. The objective of this study was to correlate five different methods of assessing skeletal maturation. A total of 606 radiographs of growing patients were analyzed, and each patient was classified according to two cervical vertebral-based methods, two hand-wrist-based methods and one tooth-based method. Spearman's rank-order correlation coefficient was applied to assess the relationship between chronological age and the five methods of assessing maturation, as well as the correlations between the five methods (p < 0.05). Spearman's rank correlation coefficients for chronological age and cervical vertebral maturation stage using the two methods were 0.656 and 0.693 (p < 0.001), respectively, for males. For females, the correlation was stronger for both methods. The correlation coefficients for chronological age against the two hand-wrist assessment methods were statistically significant only for Fishman's method, 0.722 (p < 0.001) and 0.839 (p < 0.001), respectively, for males and females. The cervical vertebral, hand-wrist and dental maturation methods of assessment were all found to correlate strongly with each other, irrespective of gender, except for Grave and Brown's method. The strongest correlations were found for the second molars in females and the second premolars in males. This study sheds light on, and correlates, the five radiographic methods most commonly used for assessing skeletal maturation in a Spanish population in southern Europe.
Dercon, G; Mabit, L; Hancock, G; Nguyen, M L; Dornhofer, P; Bacchi, O O S; Benmansour, M; Bernard, C; Froehlich, W; Golosov, V N; Haciyakupoglu, S; Hai, P S; Klik, A; Li, Y; Lobb, D A; Onda, Y; Popa, N; Rafiq, M; Ritchie, J C; Schuller, P; Shakhashiro, A; Wallbrink, P; Walling, D E; Zapata, F; Zhang, X
2012-05-01
This paper summarizes key findings and identifies the main lessons learnt from a 5-year (2002-2008) coordinated research project (CRP) on "Assessing the effectiveness of soil conservation measures for sustainable watershed management and crop production using fallout radionuclides" (D1.50.08), organized and funded by the International Atomic Energy Agency through the Joint FAO/IAEA Division of Nuclear Techniques in Food and Agriculture. The project brought together nineteen participants, from Australia, Austria, Brazil, Canada, Chile, China, Japan, Morocco, Pakistan, Poland, Romania, Russian Federation, Turkey, United Kingdom, United States of America and Vietnam, involved in the use of nuclear techniques and, more particularly, fallout radionuclides (FRN) to assess the relative impacts of different soil conservation measures on soil erosion and land productivity. The overall objective of the CRP was to develop improved land use and management strategies for sustainable watershed management through effective soil erosion control practices, by the use of ¹³⁷Cs (half-life of 30.2 years), ²¹⁰Pb(ex) (half-life of 22.3 years) and ⁷Be (half-life of 53.4 days) for measuring soil erosion over several spatial and temporal scales. The environmental conditions under which the different research teams applied the tools based on the use of fallout radionuclides varied considerably--a variety of climates, soils, topographies and land uses. Nevertheless, the achievements of the CRP, as reflected in this overview paper, demonstrate that fallout radionuclide-based techniques are powerful tools to assess soil erosion/deposition at several spatial and temporal scales in a wide range of environments, and offer potential to monitor soil quality. The success of the CRP has stimulated an interest in many IAEA Member States in the use of these methodologies to identify factors and practices that can enhance sustainable agriculture and minimize land degradation. Copyright © 2012 Elsevier Ltd. All rights reserved.
Dynamic drought risk assessment using crop model and remote sensing techniques
NASA Astrophysics Data System (ADS)
Sun, H.; Su, Z.; Lv, J.; Li, L.; Wang, Y.
2017-02-01
Drought risk assessment is of great significance for reducing agricultural drought losses and ensuring food security. The conventional drought risk assessment method evaluates a region's exposure to the hazard and its vulnerability to extended periods of water shortage, which is a static evaluation. Dynamic Drought Risk Assessment (DDRA) estimates drought risk from crop growth and water stress conditions in real time. In this study, a DDRA method using a crop model and remote sensing techniques was proposed. The crop model we employed is the DeNitrification and DeComposition (DNDC) model. The drought risk was quantified by the yield losses predicted by the crop model in a scenario-based method. The crop model was re-calibrated to improve its performance using the Leaf Area Index (LAI) retrieved from MODerate Resolution Imaging Spectroradiometer (MODIS) data. The in-situ, station-based crop model was extended to assess regional drought risk by integrating crop planting maps. The crop planted area was extracted from MODIS data with an extended CPPI method. This study was implemented and validated on maize in Liaoning Province, China.
NASA Astrophysics Data System (ADS)
Jha, Madan K.; Chowdary, V. M.; Chowdhury, Alivia
2010-11-01
An approach is presented for the evaluation of groundwater potential using remote sensing, geographic information system, geoelectrical, and multi-criteria decision analysis techniques. The approach divides the available hydrologic and hydrogeologic data into two groups, exogenous (hydrologic) and endogenous (subsurface). A case study in Salboni Block, West Bengal (India), uses six thematic layers of exogenous parameters and four thematic layers of endogenous parameters. These thematic layers and their features were assigned suitable weights which were normalized by analytic hierarchy process and eigenvector techniques. The layers were then integrated using ArcGIS software to generate two groundwater potential maps. The hydrologic parameters-based groundwater potential zone map indicated that the `good' groundwater potential zone covers 27.14% of the area, the `moderate' zone 45.33%, and the `poor' zone 27.53%. A comparison of this map with the groundwater potential map based on subsurface parameters revealed that the hydrologic parameters-based map accurately delineates groundwater potential zones in about 59% of the area, and hence it is dependable to a certain extent. More than 80% of the study area has moderate-to-poor groundwater potential, which necessitates efficient groundwater management for long-term water security. Overall, the integrated technique is useful for the assessment of groundwater resources at a basin or sub-basin scale.
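The analytic hierarchy process step mentioned above can be sketched as follows: criterion weights are taken from the principal eigenvector of a pairwise comparison matrix and checked with a consistency ratio. The three-criterion matrix in the example is made up; only the general AHP/eigenvector mechanics are shown.

```python
# Hedged sketch of AHP weight derivation via the principal eigenvector, with a
# consistency check. The example pairwise matrix is hypothetical.
import numpy as np

RI = {1: 0.0, 2: 0.0, 3: 0.58, 4: 0.90, 5: 1.12, 6: 1.24}  # Saaty random indices

def ahp_weights(pairwise):
    A = np.asarray(pairwise, dtype=float)
    eigvals, eigvecs = np.linalg.eig(A)
    k = np.argmax(eigvals.real)
    w = np.abs(eigvecs[:, k].real)
    w /= w.sum()                                  # normalized criterion weights
    n = A.shape[0]
    ci = (eigvals.real[k] - n) / (n - 1)          # consistency index
    cr = ci / RI[n] if RI.get(n, 0) > 0 else 0.0  # consistency ratio (<0.1 acceptable)
    return w, cr

weights, cr = ahp_weights([[1, 3, 5], [1 / 3, 1, 2], [1 / 5, 1 / 2, 1]])
```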
Caries Detection Methods Based on Changes in Optical Properties between Healthy and Carious Tissue
Karlsson, Lena
2010-01-01
A conservative, noninvasive or minimally invasive approach to clinical management of dental caries requires diagnostic techniques capable of detecting and quantifying lesions at an early stage, when progression can be arrested or reversed. Objective evidence of initiation of the disease can be detected in the form of distinct changes in the optical properties of the affected tooth structure. Caries detection methods based on changes in a specific optical property are collectively referred to as optically based methods. This paper presents a simple overview of the feasibility of three such technologies for quantitative or semiquantitative assessment of caries lesions. Two of the techniques are well-established: quantitative light-induced fluorescence, which is used primarily in caries research, and laser-induced fluorescence, a commercially available method used in clinical dental practice. The third technique, based on near-infrared transillumination of dental enamel is in the developmental stages. PMID:20454579
Life cycle assessment of mobile phone housing.
Yang, Jian-xin; Wang, Ru-song; Fu, Hao; Liu, Jing-ru
2004-01-01
A life cycle assessment of mobile phone housing at Motorola (China) Electronics Ltd. was carried out, in which material flows and environmental emissions based on a basic production scheme were analyzed and assessed. In the manufacturing stage, such primary processes as polycarbonate molding and surface painting are included, and different surface finishing technologies such as normal painting, electroplating, IMD and VDM were assessed. The results showed that housing decoration plays a significant role within the housing life cycle. The most significant environmental impact from housing production is the photochemical ozone formation potential. Environmental impacts of different decoration techniques varied widely; for example, the electroplating technique is more environmentally friendly than VDM, which consumes much more energy and raw material. In addition, the results of two alternative dematerialization scenarios showed that material flow analysis and assessment is very important and valuable in selecting an environmentally friendly process.
Web-Based Interactive 3D Visualization as a Tool for Improved Anatomy Learning
ERIC Educational Resources Information Center
Petersson, Helge; Sinkvist, David; Wang, Chunliang; Smedby, Orjan
2009-01-01
Despite a long tradition, conventional anatomy education based on dissection is declining. This study tested a new virtual reality (VR) technique for anatomy learning based on virtual contrast injection. The aim was to assess whether students value this new three-dimensional (3D) visualization method as a learning tool and what value they gain…
NASA Astrophysics Data System (ADS)
Fahey, R. T.; Tallant, J.; Gough, C. M.; Hardiman, B. S.; Atkins, J.; Scheuermann, C. M.
2016-12-01
Canopy structure can be an important driver of forest ecosystem functioning - affecting factors such as radiative transfer and light use efficiency, and consequently net primary production (NPP). Both above- (aerial) and below-canopy (terrestrial) remote sensing techniques are used to assess canopy structure and each has advantages and disadvantages. Aerial techniques can cover large geographical areas and provide detailed information on canopy surface and canopy height, but are generally unable to quantitatively assess interior canopy structure. Terrestrial methods provide high resolution information on interior canopy structure and can be cost-effectively repeated, but are limited to very small footprints. Although these methods are often utilized to derive similar metrics (e.g., rugosity, LAI) and to address equivalent ecological questions and relationships (e.g., link between LAI and productivity), rarely are inter-comparisons made between techniques. Our objective is to compare methods for deriving canopy structural complexity (CSC) metrics and to assess the capacity of commonly available aerial remote sensing products (and combinations) to match terrestrially-sensed data. We also assess the potential to combine CSC metrics with image-based analysis to predict plot-based NPP measurements in forests of different ages and different levels of complexity. We use combinations of data from drone-based imagery (RGB, NIR, Red Edge), aerial LiDAR (commonly available medium-density leaf-off), terrestrial scanning LiDAR, portable canopy LiDAR, and a permanent plot network - all collected at the University of Michigan Biological Station. Our results will highlight the potential for deriving functionally meaningful CSC metrics from aerial imagery, LiDAR, and combinations of data sources. We will also present results of modeling focused on predicting plot-level NPP from combinations of image-based vegetation indices (e.g., NDVI, EVI) with LiDAR- or image-derived metrics of CSC (e.g., rugosity, porosity), canopy density, (e.g., LAI), and forest structure (e.g., canopy height). This work builds toward future efforts that will use other data combinations, such as those available at NEON sites, and could be used to inform and test popular ecosystem models (e.g., ED2) incorporating structure.
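As a small concrete example of the image-based vegetation indices referred to above, NDVI can be computed from co-registered red and near-infrared reflectance bands as in the sketch below; the band names and the epsilon guard are assumptions.

```python
# Minimal sketch of NDVI from co-registered red and near-infrared bands.
# Band inputs are assumed to be reflectance arrays of the same shape.
import numpy as np

def ndvi(nir, red, eps=1e-6):
    nir = nir.astype(float)
    red = red.astype(float)
    return (nir - red) / (nir + red + eps)   # values near 1 indicate dense green canopy
```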
Blitz, Ari Meir; Aygun, Nafi; Herzka, Daniel A; Ishii, Masaru; Gallia, Gary L
2017-01-01
High-resolution 3D MRI of the skull base allows for a more detailed and accurate assessment of normal anatomic structures as well as the location and extent of skull base pathologies than has previously been possible. This article describes the techniques employed for high-resolution skull base MRI including pre- and post-contrast constructive interference in the steady state (CISS) imaging and their utility for evaluation of the many small structures of the skull base, focusing on those regions and concepts most pertinent to localization of cranial nerve palsies and in providing pre-operative guidance and post-operative assessment. The concept of skull base compartments as a means of conceptualizing the various layers of the skull base and their importance in assessment of masses of the skull base is discussed. Copyright © 2016 Elsevier Inc. All rights reserved.
The workload book: Assessment of operator workload to engineering systems
NASA Technical Reports Server (NTRS)
Gopher, D.
1983-01-01
The structure of and initial work performed toward the creation of a handbook for workload analysis directed at the operational community of engineers and human factors psychologists are described. When complete, the handbook will make accessible to such individuals the results of theoretically based research that are of practical interest and utility in the analysis and prediction of operator workload in advanced and existing systems. In addition, the results of a laboratory study focused on the development of a subjective workload rating technique based on psychophysical scaling techniques are described.
Periodontal plastic surgery of gingival recessions at single and multiple teeth.
Cairo, Francesco
2017-10-01
This manuscript aims to review periodontal plastic surgery for root coverage at single and multiple gingival recessions. Techniques are assessed based on biological principles, surgical procedures, prognostic factors and expected clinical and esthetic outcomes. The use of the coronally advanced flap, laterally sliding flap, free gingival graft, tunnel grafting technique, barrier membranes, enamel matrix derivative, collagen matrix and acellular dermal matrix is evaluated. The clinical scenario and practical implications are analyzed according to a modern evidence-based approach. © 2017 John Wiley & Sons A/S. Published by John Wiley & Sons Ltd.
Bashir, Mustafa R; Weber, Paul W; Husarik, Daniela B; Howle, Laurens E; Nelson, Rendon C
2012-08-01
To assess whether a scan triggering technique based on the slope of the time-attenuation curve combined with table speed optimization may improve arterial enhancement in aortic CT angiography compared to conventional threshold-based triggering techniques. Measurements of arterial enhancement were performed in a physiologic flow phantom over a range of simulated cardiac outputs (2.2-8.1 L/min) using contrast media boluses of 80 and 150 mL injected at 4 mL/s. These measurements were used to construct computer models of aortic attenuation in CT angiography, using cardiac output, aortic diameter, and CT table speed as input parameters. In-plane enhancement was calculated for normal and aneurysmal aortic diameters. Calculated arterial enhancement was poor (<150 HU) along most of the scan length using the threshold-based triggering technique for low cardiac outputs and the aneurysmal aorta model. Implementation of the slope-based triggering technique with table speed optimization improved enhancement in all scenarios and yielded good- (>200 HU; 13/16 scenarios) to excellent-quality (>300 HU; 3/16 scenarios) enhancement in all cases. Slope-based triggering with table speed optimization may improve the technical quality of aortic CT angiography over conventional threshold-based techniques, and may reduce technical failures related to low cardiac output and slow flow through an aneurysmal aorta.
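The triggering idea can be sketched as monitoring the aortic time-attenuation curve from repeated low-dose monitoring scans and starting the diagnostic acquisition once the enhancement slope exceeds a threshold, as below. The threshold value and sampling assumptions are illustrative, not the parameters evaluated in the study.

```python
# Hedged sketch of slope-based trigger logic on a sampled time-attenuation
# curve. The threshold and sampling interval are illustrative assumptions.
def slope_trigger(times_s, attenuations_hu, slope_threshold_hu_per_s=15.0):
    """Return the time at which to trigger, or None if the threshold is never met."""
    for i in range(1, len(times_s)):
        dt = times_s[i] - times_s[i - 1]
        slope = (attenuations_hu[i] - attenuations_hu[i - 1]) / dt
        if slope >= slope_threshold_hu_per_s:
            return times_s[i]
    return None
```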
Phase Retrieval System for Assessing Diamond Turning and Optical Surface Defects
NASA Technical Reports Server (NTRS)
Dean, Bruce; Maldonado, Alex; Bolcar, Matthew
2011-01-01
An optical design is presented for a measurement system used to assess the impact of surface errors originating from diamond turning artifacts. Diamond turning artifacts are common by-products of optical surface shaping using the diamond turning process (a diamond-tipped cutting tool used in a lathe configuration). Assessing and evaluating the errors imparted by diamond turning (including other surface errors attributed to optical manufacturing techniques) can be problematic and generally requires the use of an optical interferometer. Commercial interferometers can be expensive when compared to the simple optical setup developed here, which is used in combination with an image-based sensing technique (phase retrieval). Phase retrieval is a general term used in optics to describe the estimation of optical imperfections or aberrations. This turnkey system uses only image-based data and has minimal hardware requirements. The system is straightforward to set up, easy to align, and can provide nanometer accuracy on the measurement of optical surface defects.
Molloi, Sabee; Ding, Huanjun; Feig, Stephen
2015-01-01
Purpose: The purpose of this study was to compare the precision of mammographic breast density measurement using radiologist reader assessment, histogram threshold segmentation, fuzzy C-mean segmentation and spectral material decomposition. Materials and Methods: Spectral mammography images from a total of 92 consecutive asymptomatic women (50–69 years old) who presented for annual screening mammography were retrospectively analyzed for this study. Breast density was estimated using assessments by 10 radiologist readers, standard histogram thresholding, a fuzzy C-mean algorithm and spectral material decomposition. The breast density correlation between left and right breasts was used to assess the precision of these techniques in measuring breast composition relative to dual-energy material decomposition. Results: In comparison to the other techniques, breast density measurements using dual-energy material decomposition showed the highest correlation. The relative standard error of estimate for breast density measurements from left and right breasts using radiologist reader assessment, standard histogram thresholding, the fuzzy C-mean algorithm and dual-energy material decomposition was calculated to be 1.95, 2.87, 2.07 and 1.00, respectively. Conclusion: The results indicate that the precision of dual-energy material decomposition was approximately a factor of two higher than that of the other techniques, with regard to better correlation of breast density measurements from the right and left breasts. PMID:26031229
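A minimal sketch of the left-right precision metric described above, assuming that precision is summarized by regressing right-breast density on left-breast density and reporting the correlation and standard error of estimate; the paired density values and function name are hypothetical.

```python
import numpy as np
from scipy import stats

def left_right_precision(left, right):
    """Precision surrogate when no ground truth exists: left- and right-breast
    densities should agree, so a tighter left-right regression implies a more
    precise technique. Returns (Pearson r, standard error of estimate)."""
    left, right = np.asarray(left, float), np.asarray(right, float)
    slope, intercept, r, _, _ = stats.linregress(left, right)
    residuals = right - (slope * left + intercept)
    see = np.sqrt(np.sum(residuals ** 2) / (left.size - 2))
    return r, see

# Hypothetical paired density estimates (%) from one measurement technique
left = np.array([12.0, 25.0, 33.0, 8.0, 40.0, 18.0])
right = np.array([14.0, 23.0, 35.0, 9.0, 38.0, 20.0])
print(left_right_precision(left, right))
```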
An inexpensive active optical remote sensing instrument for assessing aerosol distributions.
Barnes, John E; Sharma, Nimmi C P
2012-02-01
Air quality studies on a broad variety of topics, from health impacts to source/sink analyses, require information on the distributions of atmospheric aerosols over both altitude and time. An inexpensive, simple-to-implement, ground-based optical remote sensing technique has been developed to assess aerosol distributions. The technique, called CLidar (Charge Coupled Device Camera Light Detection and Ranging), provides aerosol altitude profiles over time. In the CLidar technique a relatively low-power laser transmits light vertically into the atmosphere. The transmitted laser light scatters off of air molecules, clouds, and aerosols. The entire beam from ground to zenith is imaged using a CCD camera and wide-angle (100 degree) optics which are a few hundred meters from the laser. The CLidar technique is optimized for low altitude (boundary layer and lower troposphere) measurements where most aerosols are found and where many other profiling techniques face difficulties. Currently the technique is limited to nighttime measurements. Using the CLidar technique aerosols may be mapped over both altitude and time. The instrumentation required is portable and can easily be moved to locations of interest (e.g. downwind from factories or power plants, near highways). This paper describes the CLidar technique, implementation and data analysis and offers specifics for users wishing to apply the technique for aerosol profiles.
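The mapping from camera pixel to beam altitude in a bistatic geometry like CLidar can be illustrated with simple flat-earth triangulation, assuming a vertical beam and a camera at the same ground level a known baseline away; the function and numbers below are illustrative, not the instrument's calibration.

```python
import numpy as np

def beam_altitude(elevation_deg, baseline_m):
    """Altitude (m) of the scattering point on a vertical laser beam seen at a
    given elevation angle by a camera located baseline_m from the beam base:
    flat-earth triangulation, z = D * tan(elevation)."""
    return baseline_m * np.tan(np.radians(elevation_deg))

# With an assumed 300 m camera-laser baseline, pixels spanning 20-80 degrees of
# elevation map to roughly 0.1-1.7 km altitude; near-zenith pixels stretch
# rapidly toward higher altitudes.
for angle in (20, 45, 80):
    print(angle, round(float(beam_altitude(angle, 300.0)), 1))
```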
The Evolution of the American Board of Ophthalmology Written Qualifying Examination.
Wilson, David J; Tasman, William S; Skuta, Gregory L; Sheth, Bhavna P
2016-09-01
Since the inception of board certification in ophthalmology in 1916, a written assessment of candidates' knowledge base has been an integral part of the certification process. Although the committee structure and techniques for writing examination questions have evolved over the past 100 years, the written qualifying examination remains an essential tool for assessing the competency of physicians entering the workforce. To develop a fair and valid examination, the American Board of Ophthalmology builds examination questions using evidence-based, peer-reviewed literature and adheres to accepted psychometric assessment standards. Copyright © 2016 American Academy of Ophthalmology. All rights reserved.
Visualization of Skin Perfusion by Indocyanine Green Fluorescence Angiography—A Feasibility Study
Steinbacher, Johannes; Yoshimatsu, Hidehiko; Meng, Stefan; Hamscha, Ulrike M.; Chan, Chun-Sheng; Weninger, Wolfgang J.; Wu, Chieh-Tsai; Cheng, Ming-Huei
2017-01-01
Summary: Plastic and reconstructive surgery relies on the knowledge of angiosomes in the raising of microsurgical flaps. Growing interest in muscle-sparing perforator flaps calls for reliable methods to assess the clinical feasibility of new donor sites in anatomical studies. Several injection techniques are known for the evaluation of vascular territories. Indocyanine green–based fluorescence angiography has found wide application in the clinical assessment of tissue perfusion. In this article, the use of indocyanine green–based fluorescence angiography for the assessment of perforasomes in anatomical studies is described for the first time. PMID:29062637
Use of GIS-based Site-specific Nitrogen Management for Improving Energy Efficiency
USDA-ARS?s Scientific Manuscript database
To our knowledge, geographical information system (GIS)-based site-specific nitrogen management (SSNM) techniques have not been used to assess agricultural energy costs and efficiency. This chapter uses SSNM case studies for corn (Zea mays L.) grown in Missouri and cotton (Gossypium hirsutum L.) gro...
Developing and Assessing E-Learning Techniques for Teaching Forecasting
ERIC Educational Resources Information Center
Gel, Yulia R.; O'Hara Hines, R. Jeanette; Chen, He; Noguchi, Kimihiro; Schoner, Vivian
2014-01-01
In the modern business environment, managers are increasingly required to perform decision making and evaluate related risks based on quantitative information in the face of uncertainty, which in turn increases demand for business professionals with sound skills and hands-on experience with statistical data analysis. Computer-based training…
Recent literature has shown that bioavailability-based techniques, such as Tenax extraction, can estimate sediment exposure to benthos. In a previous study by the authors, Tenax extraction was used to create and validate a literature-based Tenax model to predict oligochaete bioac...
Athletic Training Educators' Knowledge, Comfort, and Perceived Importance of Evidence-Based Practice
ERIC Educational Resources Information Center
Welch, Cailee E.; Van Lunen, Bonnie L.; Walker, Stacy E.; Manspeaker, Sarah A.; Hankemeier, Dorice A.; Brown, Sara D.; Laursen, R. Mark; Onate, James A.
2011-01-01
Context: Before new strategies and effective techniques for implementation of evidence-based practice (EBP) into athletic training curricula can occur, it is crucial to recognize the current knowledge and understanding of EBP concepts among athletic training educators. Objective: To assess athletic training educators' current knowledge, comfort,…
Artificial intelligence and signal processing for infrastructure assessment
NASA Astrophysics Data System (ADS)
Assaleh, Khaled; Shanableh, Tamer; Yehia, Sherif
2015-04-01
The Ground Penetrating Radar (GPR) is being recognized as an effective nondestructive evaluation technique to improve the inspection process. However, data interpretation and the complexity of the results impose some limitations on the practicality of using this technique. This is mainly due to the need for a trained, experienced person to interpret the images obtained by the GPR system. In this paper, an algorithm to classify and assess the condition of infrastructure utilizing image processing and pattern recognition techniques is discussed. Features extracted from a dataset of images of defected and healthy slabs are used to train a computer-vision-based system, while another dataset is used to evaluate the proposed algorithm. Initial results show that the proposed algorithm is able to detect the existence of defects with about a 77% success rate.
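A hedged sketch of the general workflow such a system implies: extract simple texture-like features from labelled image patches, train a classifier, and report detection accuracy on held-out data. The features, the SVM classifier and the synthetic patches below are stand-ins, not the authors' algorithm.

```python
import numpy as np
from sklearn.model_selection import train_test_split
from sklearn.svm import SVC
from sklearn.metrics import accuracy_score

def image_features(img):
    """Toy feature vector for a GPR image patch: intensity statistics plus mean
    absolute gradients, stand-ins for the texture features a real system uses."""
    gy, gx = np.gradient(img.astype(float))
    return np.array([img.mean(), img.std(), np.abs(gx).mean(), np.abs(gy).mean()])

# Hypothetical labelled patches: 0 = healthy slab, 1 = defected slab
rng = np.random.default_rng(0)
patches = [rng.normal(0, 1, (32, 32)) + lbl * rng.normal(0, 3, (32, 32))
           for lbl in (0, 1) for _ in range(50)]
labels = np.array([0] * 50 + [1] * 50)
X = np.array([image_features(p) for p in patches])

X_tr, X_te, y_tr, y_te = train_test_split(X, labels, test_size=0.3, random_state=0)
clf = SVC(kernel="rbf").fit(X_tr, y_tr)
print("defect detection accuracy:", accuracy_score(y_te, clf.predict(X_te)))
```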
Surgery for obstructed defecation syndrome - is there an ideal technique?
Riss, Stefan; Stift, Anton
2015-01-01
Obstructive defecation syndrome (ODS) is a common disorder with a considerable impact on the quality of life of affected patients. Surgery for ODS remains a challenging topic. A great variety of operative techniques exist to treat patients with ODS. According to the surgeon's preference, the approach can be transanal, transvaginal, transperineal or transabdominal. All techniques have their advantages and disadvantages. Notably, high-level evidence-based studies are lacking in the literature, making accurate assessment difficult. Careful patient selection is crucial to achieve optimal functional results. It is mandatory not only to assess defecation disorders but also to evaluate overall pelvic floor symptoms, such as fecal incontinence and urinary disorders, in order to choose an appropriate and tailored strategy. Radiological investigation is essential but may not explain the complaints of every patient. PMID:25574075
Sánchez-Sánchez, M Luz; Belda-Lois, Juan-Manuel; Mena-Del Horno, Silvia; Viosca-Herrero, Enrique; Igual-Camacho, Celedonia; Gisbert-Morant, Beatriz
2018-05-05
A major goal in stroke rehabilitation is the establishment of more effective physical therapy techniques to recover postural stability. Functional Principal Component Analysis provides greater insight into recovery trends. However, when missing values exist, obtaining functional data presents some difficulties. The purpose of this study was to reveal an alternative technique for obtaining the Functional Principal Components without requiring the conversion to functional data beforehand and to investigate this methodology to determine the effect of specific physical therapy techniques in balance recovery trends in elderly subjects with hemiplegia post-stroke. A randomized controlled pilot trial was developed. Thirty inpatients post-stroke were included. Control and target groups were treated with the same conventional physical therapy protocol based on functional criteria, but specific techniques were added to the target group depending on the subjects' functional level. Postural stability during standing was quantified by posturography. The assessments were performed once a month from the moment the participants were able to stand up to six months post-stroke. The target group showed a significant improvement in postural control recovery trend six months after stroke that was not present in the control group. Some of the assessed parameters revealed significant differences between treatment groups (P < 0.05). The proposed methodology allows Functional Principal Component Analysis to be performed when data is scarce. Moreover, it allowed the dynamics of recovery of two different treatment groups to be determined, showing that the techniques added in the target group increased postural stability compared to the base protocol. Copyright © 2018 Elsevier Ltd. All rights reserved.
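One way to obtain discrete functional principal components without first converting each subject's record to a complete functional datum is to estimate the covariance surface from all pairwise-complete observations and take its eigenvectors. The sketch below assumes that approach; the estimator, function name and toy posturography scores are illustrative and not the authors' exact method.

```python
import numpy as np

def fpca_from_sparse(curves):
    """curves: 2-D array (subjects x time points) with np.nan where an assessment
    is missing. Returns eigenvalues and eigenvectors of the pairwise-complete
    covariance matrix, i.e. discrete functional principal components."""
    curves = np.asarray(curves, float)
    n_t = curves.shape[1]
    centered = curves - np.nanmean(curves, axis=0)
    cov = np.empty((n_t, n_t))
    for i in range(n_t):
        for j in range(n_t):
            both = ~np.isnan(centered[:, i]) & ~np.isnan(centered[:, j])
            cov[i, j] = np.mean(centered[both, i] * centered[both, j]) if both.any() else 0.0
    vals, vecs = np.linalg.eigh(cov)
    order = np.argsort(vals)[::-1]
    return vals[order], vecs[:, order]

# Hypothetical monthly postural-stability scores for 6 subjects, some visits missed
data = np.array([[0.2, 0.4, np.nan, 0.7, 0.8, 0.9],
                 [0.1, np.nan, 0.5, 0.6, np.nan, 0.8],
                 [0.3, 0.4, 0.5, np.nan, 0.7, 0.7],
                 [0.2, 0.3, 0.4, 0.5, 0.6, np.nan],
                 [np.nan, 0.2, 0.4, 0.5, 0.6, 0.7],
                 [0.1, 0.2, np.nan, 0.4, 0.5, 0.6]])
vals, vecs = fpca_from_sparse(data)
print("share of variance in first component:", round(float(vals[0] / vals.sum()), 3))
```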
Xiping Wang; James P. Wacker; Robert J. Ross; Brian K. Brashaw; Robert Vatalaro
2005-01-01
This paper describes an effort to develop a global dynamic testing technique for evaluating the overall stiffness of timber bridge superstructures. A forced vibration method was used to measure the natural frequency of single-span timber bridges in the laboratory and field. An analytical model based on simple beam theory was proposed to represent the relationship...
[Psychological debriefing and post-immediate psychotherapeutic intervention].
Prieto, Nathalie; Cheucle, Eric; Meylan, Françoise
2010-01-01
Psychological debriefing is a controversial treatment technique. Many such interventions exist, based on apparently indisputable conclusions, yet they assess only the individual traumatic effect and neglect the collective impact, which was the original reason for creating the technique. It is therefore essential to examine the way in which debriefings are conducted, their indications, their limits and the psychodynamic processes at play.
NASA Astrophysics Data System (ADS)
Hirota, Koji
We demonstrate a computationally efficient method for optical coherence elastography (OCE) based on the fringe washout effect in a spectral-domain OCT (SD-OCT) system. By sending short pulses of mechanical perturbation with ultrasound or shock waves during the acquisition of alternating depth profiles, we can extract a cross-sectional mechanical assessment of tissue in real time. This was achieved through a simple comparison of the intensity in adjacent depth profiles acquired during the perturbed and unperturbed states in order to quantify the degree of induced fringe washout. Although the results indicate that our OCE technique based on the fringe washout effect is sensitive enough to detect mechanical property changes in biological samples, there is some loss of sensitivity compared with previous techniques, traded for computational efficiency and minimal modification of both the hardware and software of the OCT system. A tissue phantom study was carried out with samples of various agar densities to characterize our OCE technique. Young's modulus measurements were obtained with atomic force microscopy (AFM) to correlate with our OCE assessment. Knee cartilage samples from monosodium iodoacetate (MIA) rat models were used to replicate cartilage damage in a human model. Our proposed OCE technique, along with intensity and AFM measurements, was applied to the MIA models to assess the damage. The results from both the phantom study and the MIA model study demonstrated the strong capability of the OCE technique to assess changes in mechanical properties. The correlation between the OCE measurements and the Young's modulus values showed that stiffer material produced a smaller fringe washout effect. This result is attributed to the axial motion underlying fringe washout: the displacement of the scatterers in stiffer samples in response to the external perturbation is smaller and therefore induces less fringe washout.
Characterizing Resting-State Brain Function Using Arterial Spin Labeling
Jann, Kay; Wang, Danny J.J.
2015-01-01
Arterial spin labeling (ASL) is an increasingly established magnetic resonance imaging (MRI) technique that is finding broader applications in studying the healthy and diseased brain. This review addresses the use of ASL to assess brain function in the resting state. Following a brief technical description, we discuss the use of ASL in the following main categories: (1) resting-state functional connectivity (FC) measurement: the use of ASL-based cerebral blood flow (CBF) measurements as an alternative to the blood oxygen level-dependent (BOLD) technique to assess resting-state FC; (2) the link between network CBF and FC measurements: the use of network CBF as a surrogate of the metabolic activity within corresponding networks; and (3) the study of resting-state dynamic CBF-BOLD coupling and cerebral metabolism: the use of dynamic CBF information obtained using ASL to assess dynamic CBF-BOLD coupling and oxidative metabolism in the resting state. In addition, we summarize some future challenges and interesting research directions for ASL, including slice-accelerated (multiband) imaging as well as the effects of motion and other physiological confounds on perfusion-based FC measurement. In summary, this work reviews the state-of-the-art of ASL and establishes it as an increasingly viable MRI technique with high translational value in studying resting-state brain function. PMID:26106930
Kumar, Sasi; Adiga, Kasturi Ramesh; George, Anice
2014-01-01
Old age is a period when people need physical, emotional, and psychological support. Depression is the most prevalent mental health problem among older adults; it contributes to increased medical morbidity and mortality, reduces quality of life and elevates health care costs. Therefore, early diagnosis and effective management are required to improve the quality of life of older adults suffering from depression. Interventions like Mindfulness-Based Stress Reduction (MBSR) are powerful relaxation techniques that provide a quick way to relieve depression and negative emotions by increasing mindfulness. The study was undertaken to assess the effectiveness of MBSR on depression among elderly people residing in residential homes in Bangalore. A quasi-experimental pretest-posttest control group design was used. There were two groups, experimental and control; each group had 30 participants selected from different residential homes by a non-probability convenience sampling technique. Pre-test depression and mindfulness were assessed before the first day of intervention. Experimental group participants received the MBSR intervention. Post-test depression and mindfulness were assessed at the end of the intervention programme for both groups. The study revealed a significant reduction in depression (p < 0.001) and an increase in mindfulness (p < 0.001) among the elderly in the experimental group who were subjected to the MBSR technique.
Hsu, Jia-Lien; Hung, Ping-Cheng; Lin, Hung-Yen; Hsieh, Chung-Ho
2015-04-01
Breast cancer is one of the most common causes of cancer mortality. Early detection through mammography screening could significantly reduce mortality from breast cancer. However, most screening methods consume a large amount of resources. We propose a computational model, based solely on personal health information, for breast cancer risk assessment. Our model can serve as a pre-screening program in low-cost settings. In our study, the data set, consisting of 3976 records, was collected from Taipei City Hospital between 2008/1/1 and 2008/12/31. Based on this dataset, we first apply sampling techniques and a dimension reduction method to preprocess the data. Then, we construct various kinds of classifiers (including basic classifiers, ensemble methods, and cost-sensitive methods) to predict the risk. The cost-sensitive method with a random forest classifier is able to achieve a recall (or sensitivity) of 100%. At a recall of 100%, the precision (positive predictive value, PPV) and specificity of the cost-sensitive method with the random forest classifier were 2.9% and 14.87%, respectively. In our study, we build a breast cancer risk assessment model using data mining techniques. Our model has the potential to serve as an assisting tool in breast cancer screening.
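A hedged sketch of cost-sensitive learning with a random forest, where a large class weight on the positive class pushes recall toward 100% at the expense of precision; the synthetic features, weights and split are illustrative, not the hospital dataset or the study's tuning.

```python
import numpy as np
from sklearn.ensemble import RandomForestClassifier
from sklearn.model_selection import train_test_split
from sklearn.metrics import recall_score, precision_score, confusion_matrix

# Hypothetical personal-health-information features and a rare positive class
rng = np.random.default_rng(1)
X = rng.normal(size=(4000, 10))
y = (X[:, 0] + 0.5 * X[:, 1] + rng.normal(scale=2.0, size=4000) > 2.5).astype(int)

X_tr, X_te, y_tr, y_te = train_test_split(X, y, stratify=y, random_state=1)

# Cost-sensitive learning via class weights: missing a cancer case is far more
# costly than a false alarm, so the positive class gets a much larger weight.
clf = RandomForestClassifier(n_estimators=200, class_weight={0: 1, 1: 50}, random_state=1)
clf.fit(X_tr, y_tr)
pred = clf.predict(X_te)

tn, fp, fn, tp = confusion_matrix(y_te, pred).ravel()
print("recall:", recall_score(y_te, pred),
      "precision:", precision_score(y_te, pred),
      "specificity:", tn / (tn + fp))
```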
ERIC Educational Resources Information Center
Suendermann-Oeft, David; Ramanarayanan, Vikram; Yu, Zhou; Qian, Yao; Evanini, Keelan; Lange, Patrick; Wang, Xinhao; Zechner, Klaus
2017-01-01
We present work in progress on a multimodal dialog system for English language assessment using a modular cloud-based architecture adhering to open industry standards. Among the modules being developed for the system, multiple modules heavily exploit machine learning techniques, including speech recognition, spoken language proficiency rating,…
Fabric-based Pressure Sensor Array for Decubitus Ulcer Monitoring
Chung, Philip; Rowe, Allison; Etemadi, Mozziyar; Lee, Hanmin; Roy, Shuvo
2015-01-01
Decubitus ulcers occur in an estimated 2.5 million Americans each year at an annual cost of $11 billion to the U.S. health system. Current screening and prevention techniques for assessing risk for decubitus ulcer formation and repositioning patients every 1–2 hours are labor-intensive and can be subjective. We propose use of a Bluetooth-enabled fabric-based pressure sensor array as a simple tool to objectively assess and continuously monitor decubitus ulcer risk. PMID:24111232
Pavement Performance: Approaches Using Predictive Analytics
DOT National Transportation Integrated Search
2018-03-23
Acceptable pavement condition is paramount to road safety. Using predictive analytics techniques, this project attempted to develop models that provide an assessment of pavement condition based on an array of indicators that include pavement distress,...
NASA Astrophysics Data System (ADS)
Meyer, Ryan M.; Komura, Ichiro; Kim, Kyung-cho; Zetterwall, Tommy; Cumblidge, Stephen E.; Prokofiev, Iouri
2016-02-01
In February 2012, the U.S. Nuclear Regulatory Commission (NRC) executed agreements with VTT Technical Research Centre of Finland, Nuclear Regulatory Authority of Japan (NRA, former JNES), Korea Institute of Nuclear Safety (KINS), Swedish Radiation Safety Authority (SSM), and Swiss Federal Nuclear Safety Inspectorate (ENSI) to establish the Program to Assess the Reliability of Emerging Nondestructive Techniques (PARENT). The goal of PARENT is to investigate the effectiveness of current emerging and prospective novel nondestructive examination procedures and techniques in finding flaws in nickel-alloy welds and base materials. This is done by conducting a series of open and blind international round-robin tests on a set of large-bore dissimilar metal welds (LBDMW), small-bore dissimilar metal welds (SBDMW), and bottom-mounted instrumentation (BMI) penetration weld test blocks. Blind testing is intended to study the reliability of more established techniques and included only qualified teams and procedures. Open testing is aimed at a more basic capability assessment of emerging and novel technologies. The range of techniques applied in open testing varied with respect to maturity and performance uncertainty, and they were applied to a variety of simulated flaws. This paper includes a brief overview of the PARENT blind and open testing techniques and test blocks and presents some of the blind testing results.
Noninvasive in vivo glucose sensing using an iris based technique
NASA Astrophysics Data System (ADS)
Webb, Anthony J.; Cameron, Brent D.
2011-03-01
Physiological glucose monitoring is an important aspect of the treatment of individuals afflicted with diabetes mellitus. Although invasive techniques for glucose monitoring are widely available, it would be very beneficial to make such measurements in a noninvasive manner. In this study, a New Zealand White (NZW) rabbit animal model was utilized to evaluate a developed iris-based imaging technique for the in vivo measurement of physiological glucose concentration. The animals were anesthetized with isoflurane and an insulin/dextrose protocol was used to control blood glucose concentration. To further help restrict eye movement, a developed ocular fixation device was used. During the experimental time frame, near-infrared illuminated iris images were acquired along with corresponding discrete blood glucose measurements taken with a handheld glucometer. Calibration was performed using an image-based Partial Least Squares (PLS) technique. Independent validation was also performed to assess model performance, along with Clarke Error Grid Analysis (CEGA). Initial validation results were promising and showed that a high percentage of the predicted glucose concentrations were within 20% of the reference values.
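A minimal sketch of image-based PLS calibration and validation, assuming each iris image is reduced to a feature vector that is regressed against reference glucometer readings; the synthetic data, feature construction and the simple within-20% criterion are assumptions standing in for the full CEGA analysis.

```python
import numpy as np
from sklearn.cross_decomposition import PLSRegression
from sklearn.model_selection import train_test_split

# Hypothetical near-infrared iris image features (e.g. mean intensities of image
# sub-regions) paired with reference glucometer readings (mg/dL)
rng = np.random.default_rng(2)
features = rng.normal(size=(120, 64))
glucose = 120.0 + 8.0 * features[:, :5].sum(axis=1) + rng.normal(scale=10.0, size=120)

X_tr, X_te, y_tr, y_te = train_test_split(features, glucose, random_state=2)
pls = PLSRegression(n_components=5).fit(X_tr, y_tr)
pred = pls.predict(X_te).ravel()

# Fraction of predictions within 20% of the reference value (a rough stand-in
# for the zone-style agreement quoted in the validation)
within_20pct = np.mean(np.abs(pred - y_te) <= 0.2 * np.abs(y_te))
print("within 20% of reference:", round(float(within_20pct), 3))
```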
Development of the re-emit technique for ICF foot symmetry tuning for indirect drive ignition on NIF
NASA Astrophysics Data System (ADS)
Dewald, Eduard; Milovich, Jose; Edwards, John; Thomas, Cliff; Kalantar, Dan; Meeker, Don; Jones, Ogden
2007-11-01
Tuning of the symmetry of the hohlraum radiation drive for the first 2 ns of the ICF pulse on NIF will be assessed by the re-emit technique [1], which measures the instantaneous x-ray drive asymmetry based on soft x-ray imaging of the re-emission of a high-Z sphere surrogate capsule. We will discuss the design of re-emit foot symmetry tuning measurements planned on NIF and their surrogacy for ignition experiments, including assessing the residual radiation asymmetry of the patches required for soft x-ray imaging. We will present the tuning strategy and expected accuracies based on calculations, analytical estimates and first results from scaled experiments performed at the Omega laser facility. [1] N. Delamater, G. Magelssen, A. Hauer, Phys. Rev. E 53, 5241 (1996).
Spectrum Efficiency Through Dynamic Spectrum Access Techniques (Briefing Charts)
2014-06-01
Briefing charts covering IP-based telemetry stations and their flow control approaches (volume-based, credit-based, and rate-based) with signaling via custom protocols or standards; responsibility for T&E infrastructure assessment within the Major Range and Test Facility Base (MRTFB) under DoD Directive 3200.11; and the Presidential Memorandum "Unleashing the Wireless Broadband Revolution," premised on the view that "we are now beginning the next transformation in..."
Simulation verification techniques study
NASA Technical Reports Server (NTRS)
Schoonmaker, P. B.; Wenglinski, T. H.
1975-01-01
The results of the simulation verification techniques study are summarized. The study consisted of two tasks: developing techniques for simulator hardware checkout and developing techniques for simulation performance verification (validation). The hardware verification task involved definition of simulation hardware (hardware units and integrated simulator configurations), a survey of current hardware self-test techniques, and definition of hardware and software techniques for checkout of simulator subsystems. The performance verification task included definition of simulation performance parameters (and critical performance parameters), definition of methods for establishing standards of performance (sources of reference data for validation), and definition of methods for validating performance. Both major tasks included definition of verification software and assessment of verification data base impact. An annotated bibliography of all documents generated during this study is provided.
Marchand, C; Gagnayre, R; d'Ivernois, J F
1996-01-01
There are very few examples of health training assessment in developing countries. Such an undertaking faces a number of difficulties concerning the problems inherent to assessment, the particular and unstable nature of the environment, and the problems associated with humanitarian action and development aid. It is difficult to choose between a formal and a natural approach. Indeed, a dual approach, combining quantitative and qualitative data seems best suited to a variety of cultural contexts of variable stability. Faced with these difficulties, a criteria-based, formative, quality-oriented assessment aimed at improving teaching and learning methods should be able to satisfy the needs of training professionals. We propose a training assessment guide based on an assessment model which aims to improve training techniques using comprehensive, descriptive and prescriptive approaches.
Characterization and delineation of caribou habitat on Unimak Island using remote sensing techniques
NASA Astrophysics Data System (ADS)
Atkinson, Brian M.
The assessment of herbivore habitat quality is traditionally based on quantifying the forages available to the animal across their home range through ground-based techniques. While these methods are highly accurate, they can be time-consuming and highly expensive, especially for herbivores that occupy vast spatial landscapes. The Unimak Island caribou herd has been decreasing over the last decade at rates that have prompted discussion of management intervention. Frequent inclement weather in this region of Alaska has provided little opportunity to study the caribou forage habitat on Unimak Island. The overall objectives of this study were two-fold: 1) to assess the feasibility of using high-resolution color and near-infrared aerial imagery to map the forage distribution of caribou habitat on Unimak Island, and 2) to assess the use of a new high-resolution multispectral satellite imagery platform, RapidEye, and the effect of its "red-edge" spectral band on vegetation classification accuracy. Maximum likelihood classification algorithms were used to create land cover maps from the aerial and satellite imagery. Accuracy assessments and transformed divergence values were produced to assess vegetative spectral information and classification accuracy. By using RapidEye and aerial digital imagery in a hierarchical supervised classification technique, we were able to produce a high-resolution land cover map of Unimak Island. We obtained an overall accuracy of 71.4 percent, which is comparable to other land cover maps using RapidEye imagery. The "red-edge" spectral band included in the RapidEye imagery provides additional spectral information that allows for a more accurate overall classification, raising overall accuracy by 5.2 percent.
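The accuracy-assessment step of a supervised classification reduces to building an error (confusion) matrix from reference and classified labels and reading the overall accuracy off its diagonal. A small sketch, with hypothetical tundra cover classes and labels, is given below.

```python
import numpy as np

def error_matrix(reference, classified, classes):
    """Confusion (error) matrix and overall accuracy for a land-cover map."""
    m = np.zeros((len(classes), len(classes)), dtype=int)
    for r, c in zip(reference, classified):
        m[classes.index(r), classes.index(c)] += 1
    overall = np.trace(m) / m.sum()
    return m, overall

classes = ["lichen", "shrub", "graminoid", "barren"]
reference  = ["lichen", "shrub", "shrub", "graminoid", "barren", "lichen", "graminoid"]
classified = ["lichen", "shrub", "lichen", "graminoid", "barren", "lichen", "shrub"]
m, overall = error_matrix(reference, classified, classes)
print(m, "\noverall accuracy:", round(overall, 3))
```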
NASA Astrophysics Data System (ADS)
Baltazart, Vincent; Moliard, Jean-Marc; Amhaz, Rabih; Wright, Dean; Jethwa, Manish
2015-04-01
Monitoring road surface conditions is an important issue in many countries. Several projects have looked into this issue in recent years, including TRIMM (2011-2014). The objective of such projects has been to detect surface distresses, such as cracking, raveling and water ponding, in order to plan effective road maintenance and to improve the sustainability of the pavement. The monitoring of cracking conventionally focuses on open cracks on the surface of the pavement, as opposed to reflective cracks embedded in the pavement materials. For monitoring surface condition, in situ human visual inspection has been gradually replaced by automatic image data collection at traffic speed. Off-line image processing techniques have been developed for monitoring surface condition in support of human visual control. Full automation of crack monitoring has been approached with caution, and depends on a proper manual assessment of performance. This work first presents some aspects of the current state of monitoring reported in the literature and in previous projects: imaging technology and image processing techniques. It then presents the two image processing techniques developed within the scope of the TRIMM project to automatically detect pavement cracking from images. The first technique is a heuristic approach (HA) based on a search for gradients within the image. It was originally developed to process pavement images from the French imaging device Aigle-RN. The second technique, the Minimal Path Selection (MPS) method, has been developed within ongoing PhD work at IFSTTAR. This new technique provides a fine and accurate segmentation of the crack pattern along with an estimate of the crack width. HA has been assessed against the field data collection provided by Yotta and TRL with the imaging device Tempest 2. The performance assessment was threefold: first against the reference data set of 130 km of pavement images over UK roads, second over a few selected short sections of contiguous pavement images, and finally over a few sample images as a case study. The performance of MPS has been assessed against an older image database. Pixel-based ground truth (PGT) was available to provide the most sensitive performance assessment. MPS has shown its ability to provide a very accurate cracking pattern without reducing the image resolution of the segmented images. It thus allows measurement of the crack width, behaves more robustly against image texture, and is better matched to low-contrast pavement images. A benchmarking of seven automatic segmentation techniques has been provided at both the pixel and the grid levels. The performance assessment includes three minimal path selection algorithms, namely MPS, Free Form Anisotropy (FFA) and a geodesic contour method with automatic selection of points of interest (GC-POI), as well as HA and two Markov-based methods. The MPS approach reached the best performance at the pixel level, while it matched the FFA approach at the grid level. Finally, the project has emphasized the need for a reliable ground truth data collection. Owing to its accuracy, MPS may serve as a reference benchmark for other methods providing automatic segmentation of pavement images at the pixel level and beyond. As a counterpart, MPS requires a reduction in computing time.
Keywords: cracking, automatic segmentation, image processing, pavement, surface distress, monitoring, DICE, performance
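A hedged sketch of the gradient/darkness heuristic that approaches like HA build on: flag pixels that are much darker than their local neighbourhood and clean up isolated responses. The filter size, threshold and morphological step below are illustrative and do not reproduce either HA or MPS.

```python
import numpy as np
from scipy import ndimage

def crack_candidates(gray, window=15, darkness_threshold=20.0):
    """Binary mask of candidate crack pixels in a grayscale pavement image.
    A pixel is flagged when it is substantially darker than its local mean,
    a crude stand-in for gradient-based heuristic crack detection."""
    gray = gray.astype(float)
    local_mean = ndimage.uniform_filter(gray, size=window)
    mask = (local_mean - gray) > darkness_threshold
    # keep only vertically connected runs so isolated speckle responses are removed
    return ndimage.binary_opening(mask, structure=np.ones((3, 1), dtype=bool))

# Synthetic pavement patch: textured background crossed by a thin dark crack
rng = np.random.default_rng(3)
img = 160.0 + 10.0 * rng.standard_normal((100, 100))
img[np.arange(100), 50 + np.arange(100) // 10] -= 80.0
print("flagged crack pixels:", int(crack_candidates(img).sum()))
```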
Statistical methodology: II. Reliability and validity assessment in study design, Part B.
Karras, D J
1997-02-01
Validity measures the correspondence between a test and other purported measures of the same or similar qualities. When a reference standard exists, a criterion-based validity coefficient can be calculated. If no such standard is available, the concepts of content and construct validity may be used, but quantitative analysis may not be possible. The Pearson and Spearman tests of correlation are often used to assess the correspondence between tests, but do not account for measurement biases and may yield misleading results. Techniques that measure intertest differences may be more meaningful in validity assessment, and the kappa statistic is useful for analyzing categorical variables. Questionnaires often can be designed to allow quantitative assessment of reliability and validity, although this may be difficult. Inclusion of homogeneous questions is necessary to assess reliability. Analysis is enhanced by using Likert scales or similar techniques that yield ordinal data. Validity assessment of questionnaires requires careful definition of the scope of the test and comparison with previously validated tools.
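The two agreement statistics mentioned, rank correlation for ordinal scores and the kappa statistic for categorical ratings, can be computed as in the sketch below; the paired scores and rater labels are hypothetical.

```python
from scipy.stats import spearmanr
from sklearn.metrics import cohen_kappa_score

# Hypothetical ordinal scores from a new test and an existing reference test
new_test  = [1, 2, 2, 3, 4, 4, 5, 5, 3, 2]
reference = [1, 1, 2, 3, 3, 4, 5, 4, 3, 2]
rho, p = spearmanr(new_test, reference)

# Hypothetical categorical judgements (e.g. "admit"/"discharge") from two raters
rater_a = ["admit", "admit", "discharge", "discharge", "admit", "discharge"]
rater_b = ["admit", "discharge", "discharge", "discharge", "admit", "discharge"]
kappa = cohen_kappa_score(rater_a, rater_b)

print(f"Spearman rho={rho:.2f} (p={p:.3f}), Cohen's kappa={kappa:.2f}")
```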
Pitfalls in the measurement of muscle mass: a need for a reference standard
Landi, Francesco; Cesari, Matteo; Fielding, Roger A.; Visser, Marjolein; Engelke, Klaus; Maggi, Stefania; Dennison, Elaine; Al‐Daghri, Nasser M.; Allepaerts, Sophie; Bauer, Jurgen; Bautmans, Ivan; Brandi, Maria Luisa; Bruyère, Olivier; Cederholm, Tommy; Cerreta, Francesca; Cherubini, Antonio; Cooper, Cyrus; Cruz‐Jentoft, Alphonso; McCloskey, Eugene; Dawson‐Hughes, Bess; Kaufman, Jean‐Marc; Laslop, Andrea; Petermans, Jean; Reginster, Jean‐Yves; Rizzoli, René; Robinson, Sian; Rolland, Yves; Rueda, Ricardo; Vellas, Bruno; Kanis, John A.
2018-01-01
Background: All proposed definitions of sarcopenia include the measurement of muscle mass, but the techniques and threshold values used vary. Indeed, the literature does not establish consensus on the best technique for measuring lean body mass. Thus, the objective measurement of sarcopenia is hampered by limitations intrinsic to assessment tools. The aim of this study was to review the methods to assess muscle mass and to reach consensus on the development of a reference standard. Methods: Literature reviews were performed by members of the European Society for Clinical and Economic Aspects of Osteoporosis and Osteoarthritis working group on frailty and sarcopenia. Face-to-face meetings were organized for the whole group to make amendments and discuss further recommendations. Results: A wide range of techniques can be used to assess muscle mass. Cost, availability, and ease of use can determine whether the techniques are better suited to clinical practice or are more useful for research. No one technique subserves all requirements, but dual energy X-ray absorptiometry could be considered as a reference standard (but not a gold standard) for measuring muscle lean body mass. Conclusions: Based on the feasibility, accuracy, safety, and low cost, dual energy X-ray absorptiometry can be considered as the reference standard for measuring muscle mass. PMID:29349935
Reproducibility of telomere length assessment: an international collaborative study.
Martin-Ruiz, Carmen M; Baird, Duncan; Roger, Laureline; Boukamp, Petra; Krunic, Damir; Cawthon, Richard; Dokter, Martin M; van der Harst, Pim; Bekaert, Sofie; de Meyer, Tim; Roos, Goran; Svenson, Ulrika; Codd, Veryan; Samani, Nilesh J; McGlynn, Liane; Shiels, Paul G; Pooley, Karen A; Dunning, Alison M; Cooper, Rachel; Wong, Andrew; Kingston, Andrew; von Zglinicki, Thomas
2015-10-01
Telomere length is a putative biomarker of ageing, morbidity and mortality. Its application is hampered by lack of widely applicable reference ranges and uncertainty regarding the present limits of measurement reproducibility within and between laboratories. We instigated an international collaborative study of telomere length assessment: 10 different laboratories, employing 3 different techniques [Southern blotting, single telomere length analysis (STELA) and real-time quantitative PCR (qPCR)] performed two rounds of fully blinded measurements on 10 human DNA samples per round to enable unbiased assessment of intra- and inter-batch variation between laboratories and techniques. Absolute results from different laboratories differed widely and could thus not be compared directly, but rankings of relative telomere lengths were highly correlated (correlation coefficients of 0.63-0.99). Intra-technique correlations were similar for Southern blotting and qPCR and were stronger than inter-technique ones. However, inter-laboratory coefficients of variation (CVs) averaged about 10% for Southern blotting and STELA and more than 20% for qPCR. This difference was compensated for by a higher dynamic range for the qPCR method as shown by equal variance after z-scoring. Technical variation per laboratory, measured as median of intra- and inter-batch CVs, ranged from 1.4% to 9.5%, with differences between laboratories only marginally significant (P = 0.06). Gel-based and PCR-based techniques were not different in accuracy. Intra- and inter-laboratory technical variation severely limits the usefulness of data pooling and excludes sharing of reference ranges between laboratories. We propose to establish a common set of physical telomere length standards to improve comparability of telomere length estimates between laboratories. © The Author 2014. Published by Oxford University Press on behalf of the International Epidemiological Association.
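A minimal sketch of the two summary statistics used in such comparisons: the coefficient of variation of repeated measurements of one sample, and z-scoring of each laboratory's results so that techniques with different dynamic ranges can be compared on relative rankings. All numbers below are hypothetical.

```python
import numpy as np

def cv_percent(values):
    """Coefficient of variation (%) of repeated measurements of one sample."""
    values = np.asarray(values, float)
    return 100.0 * values.std(ddof=1) / values.mean()

# Hypothetical duplicate measurements of the same DNA sample within one batch
print("intra-batch CVs:", round(cv_percent([7.1, 6.8]), 1), round(cv_percent([1.40, 1.12]), 1))

# z-scoring each laboratory's results over the shared samples removes scale and
# dynamic-range differences, so relative rankings become directly comparable
southern = np.array([7.1, 6.4, 8.0, 5.9, 6.8, 7.5, 6.1, 7.9, 6.6, 7.2])
qpcr = np.array([1.40, 1.05, 1.95, 0.80, 1.25, 1.60, 0.95, 1.90, 1.15, 1.45])
z_s = (southern - southern.mean()) / southern.std(ddof=1)
z_q = (qpcr - qpcr.mean()) / qpcr.std(ddof=1)
print("correlation of z-scored results:", round(float(np.corrcoef(z_s, z_q)[0, 1]), 2))
```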
Zientek, Michael L.; Oszczepalski, Sławomir; Parks, Heather L.; Bliss, James D.; Borg, Gregor; Box, Stephen E.; Denning, Paul; Hayes, Timothy S.; Spieth, Volker; Taylor, Cliff D.
2015-01-01
Using the three-part form of assessment, a mean of 126 Mt of undiscovered copper is predicted in 4 assessed permissive tracts. Seventy-five percent of the mean amount of undiscovered copper (96 Mt) is associated with a tract in southwest Poland. For this same permissive tract in Poland, Gaussian geostatistical simulation techniques indicate a mean of 62 Mt of copper based on copper surface-density data from drill holes.
Alizai, Hamza; Nardo, Lorenzo; Karampinos, Dimitrios C; Joseph, Gabby B; Yap, Samuel P; Baum, Thomas; Krug, Roland; Majumdar, Sharmila; Link, Thomas M
2012-07-01
The goal of this study was to compare the semi-quantitative Goutallier classification for fat infiltration with quantitative fat-fraction derived from a magnetic resonance imaging (MRI) chemical shift-based water/fat separation technique. Sixty-two women (age 61 ± 6 years), 27 of whom had diabetes, underwent MRI of the calf using a T1-weighted fast spin-echo sequence and a six-echo spoiled gradient-echo sequence at 3 T. Water/fat images and fat fraction maps were reconstructed using the IDEAL algorithm with T2* correction and a multi-peak model for the fat spectrum. Two radiologists scored fat infiltration on the T1-weighted images using the Goutallier classification in six muscle compartments. Spearman correlations between the Goutallier grades and the fat fraction were calculated; in addition, intra-observer and inter-observer agreement were calculated. A significant correlation between the clinical grading and the fat fraction values was found for all muscle compartments (P < 0.0001, R values ranging from 0.79 to 0.88). Goutallier grades 0-4 had a fat fraction ranging from 3.5 to 19%. Intra-observer and inter-observer agreement values of 0.83 and 0.81 were calculated for the semi-quantitative grading. Semi-quantitative grading of intramuscular fat and quantitative fat fraction were significantly correlated and both techniques had excellent reproducibility. However, the clinical grading was found to overestimate muscle fat. Fat infiltration of muscle commonly occurs in many metabolic and neuromuscular diseases. • Image-based semi-quantitative classifications for assessing fat infiltration are not well validated. • Quantitative MRI techniques provide an accurate assessment of muscle fat.
Estimation of shoreline position and change using airborne topographic lidar data
Stockdon, H.F.; Sallenger, A.H.; List, J.H.; Holman, R.A.
2002-01-01
A method has been developed for estimating shoreline position from airborne scanning laser data. This technique allows rapid estimation of objective, GPS-based shoreline positions over hundreds of kilometers of coast, essential for the assessment of large-scale coastal behavior. Shoreline position, defined as the cross-shore position of a vertical shoreline datum, is found by fitting a function to cross-shore profiles of laser altimetry data located in a vertical range around the datum and then evaluating the function at the specified datum. Error bars on horizontal position are directly calculated as the 95% confidence interval on the mean value based on the Student's t distribution of the errors of the regression. The technique was tested using lidar data collected with NASA's Airborne Topographic Mapper (ATM) in September 1997 on the Outer Banks of North Carolina. Estimated lidar-based shoreline position was compared to shoreline position as measured by a ground-based GPS vehicle survey system. The two methods agreed closely with a root mean square difference of 2.9 m. The mean 95% confidence interval for shoreline position was ?? 1.4 m. The technique has been applied to a study of shoreline change on Assateague Island, Maryland/Virginia, where three ATM data sets were used to assess the statistics of large-scale shoreline change caused by a major 'northeaster' winter storm. The accuracy of both the lidar system and the technique described provides measures of shoreline position and change that are ideal for studying storm-scale variability over large spatial scales.
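A hedged sketch of the profile-fitting step: fit a line to the lidar points lying in a vertical window around the datum, evaluate it at the datum to obtain the cross-shore shoreline position, and derive an uncertainty from the regression residuals. The horizontal-uncertainty formula below is a simplified stand-in for the paper's Student's-t confidence interval, and the profile is synthetic.

```python
import numpy as np
from scipy import stats

def shoreline_position(x, z, datum=0.8, window=0.5):
    """Cross-shore position where a fitted foreshore line crosses the datum.

    x, z   : cross-shore distance (m) and lidar elevation (m) along one profile
    datum  : vertical shoreline datum (m), e.g. mean high water
    window : only points with |z - datum| <= window are used in the fit
    Returns (position, 95% half-width); the horizontal uncertainty is taken as
    the vertical standard error of the fit divided by the foreshore slope, a
    simplified stand-in for the regression-based confidence interval.
    """
    sel = np.abs(z - datum) <= window
    slope, intercept, _, _, _ = stats.linregress(x[sel], z[sel])
    position = (datum - intercept) / slope
    resid = z[sel] - (slope * x[sel] + intercept)
    se_z = np.sqrt(np.sum(resid ** 2) / (sel.sum() - 2))
    t95 = stats.t.ppf(0.975, sel.sum() - 2)
    half_width = t95 * (se_z / abs(slope)) / np.sqrt(sel.sum())
    return position, half_width

# Synthetic foreshore profile: elevation decreasing seaward with small lidar noise
x = np.linspace(0, 100, 200)
z = 2.5 - 0.03 * x + np.random.default_rng(4).normal(scale=0.05, size=x.size)
print(shoreline_position(x, z))
```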
Porter, P Steven; Rao, S Trivikrama; Zurbenko, Igor G; Dunker, Alan M; Wolff, George T
2001-02-01
Assessment of regulatory programs aimed at improving ambient O3 air quality is of considerable interest to the scientific community and to policymakers. Trend detection, the identification of statistically significant long-term changes, and attribution, linking change to specific climatological and anthropogenic forcings, are instrumental to this assessment. Detection and attribution are difficult because changes in pollutant concentrations of interest to policymakers may be much smaller than natural variations due to weather and climate. In addition, there are considerable differences in reported trends seemingly based on similar statistical methods and databases. Differences arise from the variety of techniques used to reduce nontrend variation in time series, including mitigating the effects of meteorology and the variety of metrics used to track changes. In this paper, we review the trend assessment techniques being used in the air pollution field and discuss their strengths and limitations in discerning and attributing changes in O3 to emission control policies.
Developing integrated patient pathways using hybrid simulation
NASA Astrophysics Data System (ADS)
Zulkepli, Jafri; Eldabi, Tillal
2016-10-01
Integrated patient pathways span several care settings: healthcare, which includes emergency care and the inpatient ward; intermediate care, where patients stay for a maximum of two weeks while being assessed by an assessment team to find the most suitable care; and social care. The reason behind introducing intermediate care in Western countries was to reduce the number of patients, especially elderly patients, who remain in hospital. This type of care setting is being considered in some other countries, including Malaysia. Therefore, to assess the advantages of introducing this type of integrated healthcare setting, we suggest developing a model using simulation techniques. We argue that a single simulation technique is not sufficient to represent this type of patient pathway. We therefore suggest developing the model using hybrid techniques, i.e. System Dynamics (SD) and Discrete Event Simulation (DES). Based on the hybrid model results, we argue that they are viable as references for the decision-making process.
Computational Biology Methods for Characterization of Pluripotent Cells.
Araúzo-Bravo, Marcos J
2016-01-01
Pluripotent cells are a powerful tool for regenerative medicine and drug discovery. Several techniques have been developed to induce pluripotency or to extract pluripotent cells from different tissues and biological fluids. However, the characterization of pluripotency requires tedious, expensive, time-consuming, and not always reliable wet-lab experiments; thus, an easy, standard quality-control protocol for pluripotency assessment remains to be established. High-throughput techniques can help here, in particular gene expression microarrays, which have become a complementary technique for cellular characterization. Research has shown that transcriptomics comparison with a reference Embryonic Stem Cell (ESC) is a good approach to assess pluripotency. Under the premise that the best protocol is computer software source code, here I propose and explain line by line a software protocol coded in R-Bioconductor for pluripotency assessment based on the comparison of transcriptomics data of pluripotent cells with a reference ESC. I provide advice on experimental design, warn about possible pitfalls, and give guidance on the interpretation of results.
NASA Astrophysics Data System (ADS)
Sambeka, Yana; Nahadi, Sriyati, Siti
2017-05-01
The study aimed to obtain scientific information about the increase in students' concept mastery in project-based learning that used authentic assessment. The research was conducted in May 2016 at a junior high school in Bandung in the 2015/2016 academic year. The research method was a weak experiment with a one-group pretest-posttest design. The sample of 24 students was taken by random cluster sampling. Data were collected through a written test, an observation sheet, and a questionnaire sheet. The students' concept mastery test yielded an N-Gain of 0.236, in the low category. A paired-sample t-test showed that the implementation of authentic assessment in project-based learning significantly increased students' concept mastery (sig < 0.05).
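The reported N-Gain is presumably the conventional (Hake) normalized gain; the abstract does not state the formula, so the definition and the worked numbers below are assumptions given for orientation only.

```latex
% Assumed definition of the normalized gain (Hake):
\langle g \rangle \;=\; \frac{\%\,\mathrm{posttest} - \%\,\mathrm{pretest}}{100\% - \%\,\mathrm{pretest}}
% Illustrative arithmetic (scores invented): a pretest of 40% and a posttest of
% 54.2% give (54.2 - 40)/(100 - 40) \approx 0.24, i.e. the "low" band (g < 0.3).
```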
Wire Crimp Termination Verification Using Ultrasonic Inspection
NASA Technical Reports Server (NTRS)
Perey, Daniel F.; Cramer, K. Elliott; Yost, William T.
2007-01-01
The development of a new ultrasonic measurement technique to quantitatively assess wire crimp terminations is discussed. The amplitude change of a compressional ultrasonic wave propagating through the junction of a crimp termination and wire is shown to correlate with the results of a destructive pull test, which is a standard for assessing crimp wire junction quality. Various crimp junction pathologies such as undercrimping, missing wire strands, incomplete wire insertion, partial insulation removal, and incorrect wire gauge are ultrasonically tested, and their results are correlated with pull tests. Results show that the nondestructive ultrasonic measurement technique consistently (as evidenced with destructive testing) predicts good crimps when ultrasonic transmission is above a certain threshold amplitude level. A physics-based model, solved by finite element analysis, describes the compressional ultrasonic wave propagation through the junction during the crimping process. This model is in agreement within 6% of the ultrasonic measurements. A prototype instrument for applying this technique while wire crimps are installed is also presented. The instrument is based on a two-jaw type crimp tool suitable for butt-splice type connections. Finally, an approach for application to multipin indenter type crimps will be discussed.
Protection of Health Imagery by Region Based Lossless Reversible Watermarking Scheme
Priya, R. Lakshmi; Sadasivam, V.
2015-01-01
Providing authentication and integrity in medical images is a problem, and this work proposes a new blind, fragile, region-based lossless reversible watermarking technique to improve the trustworthiness of medical images. The proposed technique embeds the watermark using a reversible least significant bit embedding scheme. The scheme combines hashing, compression, and digital signature techniques to create a content-dependent watermark, making use of a compressed region of interest (ROI) for recovery of the ROI as reported in the literature. Experiments were carried out to prove the performance of the scheme, and its assessment reveals that the ROI is extracted intact and that the PSNR values obtained lead to the realization that the presented scheme offers greater protection for health imagery. PMID:26649328
NASA Technical Reports Server (NTRS)
Wildesen, S. E.; Phillips, E. P.
1981-01-01
Because of the size of the Pocomoke River Basin, the inaccessibility of certain areas, and study time constraints, several remote sensing techniques were used to collect base information on the river corridor (a 23.2 km channel) and on a 1.2 km wooded floodplain. This information provided an adequate understanding of the environment and its resources, thus enabling effective management options to be designed. The remote sensing techniques used for assessment included manual analysis of high-altitude color-infrared photography, computer-assisted analysis of LANDSAT-2 imagery, and the application of airborne oceanographic lidar for topographic mapping. Results show that each technique was valuable in providing the base data necessary for resource planning.
Mutel, Christopher L; Pfister, Stephan; Hellweg, Stefanie
2012-01-17
We describe a new methodology for performing regionalized life cycle assessment and systematically choosing the spatial scale of regionalized impact assessment methods. We extend standard matrix-based calculations to include matrices that describe the mapping from inventory to impact assessment spatial supports. Uncertainty in inventory spatial data is modeled using a discrete spatial distribution function, which in a case study is derived from empirical data. The minimization of global spatial autocorrelation is used to choose the optimal spatial scale of impact assessment methods. We demonstrate these techniques on electricity production in the United States, using regionalized impact assessment methods for air emissions and freshwater consumption. Case study results show important differences between site-generic and regionalized calculations, and provide specific guidance for future improvements of inventory data sets and impact assessment methods.
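A hedged sketch of where a mapping matrix enters the standard matrix-based LCA calculation, written in the usual Heijungs-style notation; the symbols and the placement of the mapping matrix are assumptions for orientation, not the authors' exact formulation.

```latex
% Standard (site-generic) matrix-based LCA:
%   A: technosphere matrix, f: functional-unit demand,
%   B: biosphere (emission) matrix, q: characterization factors
h \;=\; q^{\mathsf T} B A^{-1} f
% Regionalized sketch: M maps inventory flows from inventory spatial supports
% onto the spatial units of the impact-assessment method, and q_reg holds
% region-specific characterization factors:
h_{\mathrm{reg}} \;=\; q_{\mathrm{reg}}^{\mathsf T}\, M\, B A^{-1} f
```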
Chandrasekhar, Shalini; Prasad, Madu Ghanashyam; Radhakrishna, Ambati Naga; Saujanya, Kaniti; Raviteja, N V K; Deepthi, B; Ramakrishna, J
2018-01-01
The aim of this study was to evaluate the efficiency of four different obturating techniques in filling the radicular space in primary teeth. This clinical trial was carried out on 34 healthy, cooperative children (5-9 years) who had 63 carious primary teeth indicated for pulpectomy. The teeth were divided into four groups, such that in each group a total of 40 canals were allotted for obturation with the respective technique. The root canals of selected primary teeth were filled with Endoflas obturating material using either a bi-directional spiral (Group 1), incremental technique (Group 2), Pastinject (Group 3) or lentulo spiral (Group 4), according to the groups assigned. The effectiveness of the obturation techniques was assessed using postoperative radiographs. The assessment considered the depth of fill in the canal and the presence of any voids, using modified Coll and Sadrian criteria. The obtained data were analyzed using the ANOVA test and unpaired t-test. The bi-directional spiral and lentulo spiral were superior to the other techniques in providing optimally filled canals (P < 0.05). The bi-directional spiral was superior to the lentulo spiral in preventing overfill (P < 0.05). Based on the present study results, the bi-directional spiral can be recommended as an alternative obturating technique in primary teeth.
Nondestructive assessment of pore size in foam-based hybrid composite materials
NASA Astrophysics Data System (ADS)
Chen, M. Y.; Ko, R. T.
2012-05-01
In-situ non-destructive evaluation (NDE) during processing of high temperature polymer based hybrids offers great potential to gain close control and achieve the desired level of pore size, with low overall development cost. During the polymer curing cycle, close control over the evolution of volatiles would be beneficial to avoid the presence of pores or at least control their sizes. Traditional NDE methods cannot realistically be expected to evaluate individual pores in such components, as each pore evolves and grows during curing. However, NDE techniques offer the potential to detect and quantify the macroscopic response of many pores that are undesirable or intentionally introduced into these advanced materials. In this paper, preliminary results will be presented for nondestructive assessment of pore size in foam-based hybrid composite materials using ultrasonic techniques. Pore size was evaluated through the frequency content of the ultrasonic signal. The effects of pore size on the attenuation of ultrasound were studied. Feasibility of this method was demonstrated on two types of foams with various pore sizes.
A Bio-Inspired Herbal Tea Flavour Assessment Technique
Zakaria, Nur Zawatil Isqi; Masnan, Maz Jamilah; Zakaria, Ammar; Shakaff, Ali Yeon Md
2014-01-01
Herbal-based products are becoming a widespread production trend among manufacturers for the domestic and international markets. As production increases to meet market demand, it is crucial for the manufacturer to ensure that their products meet specific criteria and fulfil the intended quality determined by the quality controller. One well-known herbal-based product is herbal tea. This paper investigates bio-inspired flavour assessments in a data fusion framework involving an e-nose and e-tongue. The objectives are to attain good classification of different types and brands of herbal tea, classification of different flavour masking effects, and classification of different concentrations of herbal tea. Two data fusion levels were employed in this research: low-level data fusion and intermediate-level data fusion. Four classification approaches, LDA, SVM, KNN and PNN, were examined in search of the best classifier to achieve the research objectives. In order to evaluate the classifiers' performance, error estimators based on k-fold cross-validation and leave-one-out were applied. Classification based on GC-MS TIC data was also included as a comparison to the classification performance using the fusion approaches. Generally, KNN outperformed the other classification techniques for the three flavour assessments at both the low and intermediate data fusion levels. However, the classification results based on GC-MS TIC data varied. PMID:25010697
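A hedged sketch of low-level data fusion followed by KNN classification with k-fold cross-validation: raw e-nose and e-tongue feature vectors are concatenated before a single classifier is trained. The sensor counts, synthetic responses and the 10-fold setting are illustrative, not the study's instrumentation.

```python
import numpy as np
from sklearn.neighbors import KNeighborsClassifier
from sklearn.model_selection import cross_val_score
from sklearn.preprocessing import StandardScaler
from sklearn.pipeline import make_pipeline

rng = np.random.default_rng(5)
n_samples, n_classes = 90, 3
labels = np.repeat(np.arange(n_classes), n_samples // n_classes)

# Synthetic sensor responses: e-nose (8 gas sensors) and e-tongue (7 electrodes)
e_nose = rng.normal(size=(n_samples, 8)) + labels[:, None] * 0.8
e_tongue = rng.normal(size=(n_samples, 7)) + labels[:, None] * 0.5

# Low-level fusion: concatenate raw feature vectors before classification
fused = np.hstack([e_nose, e_tongue])

knn = make_pipeline(StandardScaler(), KNeighborsClassifier(n_neighbors=5))
scores = cross_val_score(knn, fused, labels, cv=10)   # 10-fold cross-validation
print("mean CV accuracy:", round(float(scores.mean()), 3))
```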
Decision Tree Approach for Soil Liquefaction Assessment
Gandomi, Amir H.; Fridline, Mark M.; Roke, David A.
2013-01-01
In the current study, the performances of some decision tree (DT) techniques are evaluated for postearthquake soil liquefaction assessment. A database containing 620 records of seismic parameters and soil properties is used in this study. Three decision tree techniques are used here in two different ways, considering statistical and engineering points of view, to develop decision rules. The DT results are compared to the logistic regression (LR) model. The results of this study indicate that the DTs not only successfully predict liquefaction but they can also outperform the LR model. The best DT models are interpreted and evaluated based on an engineering point of view. PMID:24489498
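A hedged sketch of the comparison the study describes, a decision tree versus logistic regression on a liquefaction-style dataset; the predictors, synthetic labels and hyperparameters below are invented placeholders, not the 620-record database.

```python
import numpy as np
from sklearn.tree import DecisionTreeClassifier
from sklearn.linear_model import LogisticRegression
from sklearn.model_selection import cross_val_score

rng = np.random.default_rng(6)
n = 620
# Hypothetical predictors: cyclic stress ratio, SPT blow count, fines content
csr = rng.uniform(0.05, 0.6, n)
spt = rng.uniform(2, 40, n)
fines = rng.uniform(0, 50, n)
X = np.column_stack([csr, spt, fines])
# Liquefaction occurs (1) when loading is high relative to resistance
liquefied = (csr * 60 - spt + rng.normal(scale=5, size=n) > 0).astype(int)

tree = DecisionTreeClassifier(max_depth=4, random_state=0)
logit = LogisticRegression(max_iter=1000)
print("decision tree :", round(float(cross_val_score(tree, X, liquefied, cv=5).mean()), 3))
print("logistic regr.:", round(float(cross_val_score(logit, X, liquefied, cv=5).mean()), 3))
```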
The Animism Controversy Revisited: A Probability Analysis
ERIC Educational Resources Information Center
Smeets, Paul M.
1973-01-01
Considers methodological issues surrounding the Piaget-Huang controversy. A probability model, based on the difference between the expected and observed animistic and deanimistic responses, is applied as an improved technique for the assessment of animism. (DP)
ERIC Educational Resources Information Center
Small, Jason W.; Lee, Jon; Frey, Andy J.; Seeley, John R.; Walker, Hill M.
2014-01-01
As specialized instructional support personnel begin learning and using motivational interviewing (MI) techniques in school-based settings, there is growing need for context-specific measures to assess initial MI skill development. In this article, we describe the iterative development and preliminary evaluation of two measures of MI skill adapted…
Considering the Efficacy of Web-Based Worked Examples in Introductory Chemistry
ERIC Educational Resources Information Center
Crippen, Kent J.; Earl, Boyd L.
2004-01-01
Theory suggests that studying worked examples and engaging in self-explanation will improve learning and problem solving. A growing body of evidence supports the use of web-based assessments for improving undergraduate performance in traditional large enrollment courses. This article describes a study designed to investigate these techniques in a…
ERIC Educational Resources Information Center
Ball, B. Hunter; Brewer, Gene A.
2018-01-01
The present study implemented an individual differences approach in conjunction with response time (RT) variability and distribution modeling techniques to better characterize the cognitive control dynamics underlying ongoing task cost (i.e., slowing) and cue detection in event-based prospective memory (PM). Three experiments assessed the relation…
2015-11-01
[Entry truncated; title and authors not captured. Recoverable fragment:] The satellite-based technique uses a difference in the passive microwave brightness temperatures measured by the Scanning Multi-channel Microwave Radiometer (SMMR) and the Special Sensor Microwave/Imager (SSM/I). (ERDC/CRREL report; acronym list in source included PLR, Division of Polar Programs.)
Lee, Bang Yeon; Kang, Su-Tae; Yun, Hae-Bum; Kim, Yun Yong
2016-01-12
The distribution of fiber orientation is an important factor in determining the mechanical properties of fiber-reinforced concrete. This study proposes a new image analysis technique for improving the evaluation accuracy of fiber orientation distribution in the sectional image of fiber-reinforced concrete. A series of tests on the accuracy of fiber detection and the estimation performance of fiber orientation was performed on artificial fiber images to assess the validity of the proposed technique. The validation test results showed that the proposed technique estimates the distribution of fiber orientation more accurately than the direct measurement of fiber orientation by image analysis.
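For orientation, the "direct measurement of fiber orientation by image analysis" that the proposed technique is compared against is commonly based on the ellipse a circular fiber leaves in a cut plane; a small sketch of that classical estimate is given below (an assumed baseline, not the authors' algorithm).

```python
import numpy as np

def fiber_inclination_from_ellipse(major_axis, minor_axis):
    """Classical ellipse-axis estimate of a fiber's out-of-plane inclination.
    A circular fiber of diameter d cut by a plane at angle theta appears as an
    ellipse with minor axis d and major axis d / cos(theta), so
    theta = arccos(minor / major). Returns the angle in degrees."""
    ratio = np.clip(np.asarray(minor_axis, dtype=float) /
                    np.asarray(major_axis, dtype=float), 0.0, 1.0)
    return np.degrees(np.arccos(ratio))

# Example: three fitted ellipses (axis lengths in pixels)
print(fiber_inclination_from_ellipse([12.0, 15.0, 24.0], [12.0, 12.0, 12.0]))
```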
A Method For The Verification Of Wire Crimp Compression Using Ultrasonic Inspection
NASA Technical Reports Server (NTRS)
Cramer, K. E.; Perey, Daniel F.; Yost, William T.
2010-01-01
The development of a new ultrasonic measurement technique to assess quantitatively wire crimp terminations is discussed. The amplitude change of a compressional ultrasonic wave propagating at right angles to the wire axis and through the junction of a crimp termination is shown to correlate with the results of a destructive pull test, which is a standard for assessing crimp wire junction quality. To demonstrate the technique, the case of incomplete compression of crimped connections is ultrasonically tested, and the results are correlated with pull tests. Results show that the nondestructive ultrasonic measurement technique consistently predicts good crimps when the ultrasonic transmission is above a certain threshold amplitude level. A quantitative measure of the quality of the crimped connection based on the ultrasonic energy transmitted is shown to respond accurately to crimp quality. A wave propagation model, solved by finite element analysis, describes the compressional ultrasonic wave propagation through the junction during the crimping process. This model is in agreement within 6% of the ultrasonic measurements. A prototype instrument for applying this technique while wire crimps are installed is also presented. The instrument is based on a two-jaw type crimp tool suitable for butt-splice type connections. A comparison of the results of two different instruments is presented and shows reproducibility between instruments within a 95% confidence bound.
Effective evaluation of privacy protection techniques in visible and thermal imagery
NASA Astrophysics Data System (ADS)
Nawaz, Tahir; Berg, Amanda; Ferryman, James; Ahlberg, Jörgen; Felsberg, Michael
2017-09-01
Privacy protection may be defined as replacing the original content of an image region with less intrusive content in which target appearance information has been modified so that the target is less recognizable. The development of privacy protection techniques needs to be complemented by an established objective evaluation method to facilitate their assessment and comparison. Generally, existing evaluation methods rely on subjective judgments or assume a specific target type in the image data and use target detection and recognition accuracies to assess privacy protection. An annotation-free evaluation method that is neither subjective nor assumes a specific target type is proposed. It assesses two key aspects of privacy protection: "protection" and "utility." Protection is quantified as an appearance similarity, and utility is measured as a structural similarity between the original and privacy-protected image regions. We performed extensive experimentation using six challenging datasets (comprising 12 video sequences), including a new dataset (with six sequences) that contains visible and thermal imagery. The new dataset is made available online for the community. We demonstrate the effectiveness of the proposed method by evaluating six image-based privacy protection techniques and also compare the proposed method against existing methods.
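A minimal sketch of the two evaluation axes as they are described above, using SSIM for utility and, as an assumption (the abstract does not give the exact formulation), one minus a histogram-intersection appearance similarity for protection; the images, bin count and the use of scikit-image are illustrative.

```python
import numpy as np
from skimage.metrics import structural_similarity

def utility_and_protection(original, protected):
    """Utility as SSIM between original and privacy-protected regions;
    protection reported as 1 - appearance similarity (histogram intersection),
    so that larger values mean stronger protection (assumed convention)."""
    utility = structural_similarity(original, protected,
                                    data_range=int(original.max()) - int(original.min()))
    h1, _ = np.histogram(original, bins=64, range=(0, 255), density=True)
    h2, _ = np.histogram(protected, bins=64, range=(0, 255), density=True)
    appearance_similarity = np.minimum(h1, h2).sum() / h1.sum()
    protection = 1.0 - appearance_similarity
    return utility, protection

rng = np.random.default_rng(0)
region = rng.integers(0, 256, size=(64, 64)).astype(np.uint8)
filtered = region // 4 * 4  # crude stand-in for a privacy filter
print(utility_and_protection(region, filtered))
```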
Canon, Abbey J; Lauterbach, Nicholas; Bates, Jessica; Skoland, Kristin; Thomas, Paul; Ellingson, Josh; Ruston, Chelsea; Breuer, Mary; Gerardy, Kimberlee; Hershberger, Nicole; Hayman, Kristen; Buckley, Alexis; Holtkamp, Derald; Karriker, Locke
2017-06-15
OBJECTIVE To develop and evaluate a pyramid training method for teaching techniques for collection of diagnostic samples from swine. DESIGN Experimental trial. SAMPLE 45 veterinary students. PROCEDURES Participants went through a preinstruction assessment to determine their familiarity with the equipment needed and techniques used to collect samples of blood, nasal secretions, feces, and oral fluid from pigs. Participants were then shown a series of videos illustrating the correct equipment and techniques for collecting samples and were provided hands-on pyramid-based instruction wherein a single swine veterinarian trained 2 or 3 participants on each of the techniques and each of those participants, in turn, trained additional participants. Additional assessments were performed after the instruction was completed. RESULTS Following the instruction phase, percentages of participants able to collect adequate samples of blood, nasal secretions, feces, and oral fluid increased, as did scores on a written quiz assessing participants' ability to identify the correct equipment, positioning, and procedures for collection of samples. CONCLUSIONS AND CLINICAL RELEVANCE Results suggested that the pyramid training method may be a feasible way to rapidly increase diagnostic sampling capacity during an emergency veterinary response to a swine disease outbreak.
Practice parameter on disaster preparedness.
Pfefferbaum, Betty; Shaw, Jon A
2013-11-01
This Practice Parameter identifies best approaches to the assessment and management of children and adolescents across all phases of a disaster. Delivered within a disaster system of care, many interventions are appropriate for implementation in the weeks and months after a disaster. These include psychological first aid, family outreach, psychoeducation, social support, screening, and anxiety reduction techniques. The clinician should assess and monitor risk and protective factors across all phases of a disaster. Schools are a natural site for conducting assessments and delivering services to children. Multimodal approaches using social support, psychoeducation, and cognitive behavioral techniques have the strongest evidence base. Psychopharmacologic interventions are not generally used but may be necessary as an adjunct to other interventions for children with severe reactions or coexisting psychiatric conditions. Copyright © 2013. Published by Elsevier Inc.
Blood oxygenation level-dependent MRI for assessment of renal oxygenation
Neugarten, Joel; Golestaneh, Ladan
2014-01-01
Blood oxygen level-dependent magnetic resonance imaging (BOLD MRI) has recently emerged as an important noninvasive technique to assess intrarenal oxygenation under physiologic and pathophysiologic conditions. Although this tool represents a major addition to our armamentarium of methodologies to investigate the role of hypoxia in the pathogenesis of acute kidney injury and progressive chronic kidney disease, numerous technical limitations confound interpretation of data derived from this approach. BOLD MRI has been utilized to assess intrarenal oxygenation in numerous experimental models of kidney disease and in human subjects with diabetic and nondiabetic chronic kidney disease, acute kidney injury, renal allograft rejection, contrast-associated nephropathy, and obstructive uropathy. However, confidence in conclusions based on data derived from BOLD MRI measurements will require continuing advances and technical refinements in the use of this technique. PMID:25473304
Technique for Reduction of Environmental Pollution from Construction Wastes
NASA Astrophysics Data System (ADS)
Bakaeva, N. V.; Klimenko, M. Y.
2017-11-01
The results of research on the negative impact of construction wastes on the urban environment and on the ecological safety of construction are described. The results are based on statistical data and on indicators calculated using environmental pollution assessment within the system for restoring the technical condition of urban buildings. The technique for reducing environmental pollution from construction wastes is scientifically grounded in an analytical summary of scientific and practical results on ensuring ecological safety during major overhauls and current repairs (reconstruction) of buildings and structures, and in the practical application of probability theory, system analysis and disperse system theory. Implementing the developed technique to reduce environmental pollution from construction wastes requires several stages, ranging from information collection to the formation of a system with optimum performance characteristics that is more resource-saving and energy-efficient for handling construction wastes from urban construction sites. The following tasks are solved in the study: collection of basic data on the accumulation of construction wastes; definition and comparison of technological combinations at each functional stage of the system intended to reduce the discharge of construction wastes into the environment; calculation of assessment criteria for resource saving and energy efficiency; and determination of the optimum working parameters of each implementation stage. Implementation of the technique in urban construction shows resource-saving criteria of 55.22% to 88.84% and a construction-waste recycling potential of 450 million tons of damaged construction elements (parts).
NASA Astrophysics Data System (ADS)
Daryanani, Aditya; Dangi, Shusil; Ben-Zikri, Yehuda Kfir; Linte, Cristian A.
2016-03-01
Magnetic Resonance Imaging (MRI) is a standard-of-care imaging modality for cardiac function assessment and guidance of cardiac interventions thanks to its high image quality and lack of exposure to ionizing radiation. Cardiac health parameters such as left ventricular volume, ejection fraction, myocardial mass, thickness, and strain can be assessed by segmenting the heart from cardiac MRI images. Furthermore, the segmented pre-operative anatomical heart models can be used to precisely identify regions of interest to be treated during minimally invasive therapy. Hence, the use of accurate and computationally efficient segmentation techniques is critical, especially for intra-procedural guidance applications that rely on the peri-operative segmentation of subject-specific datasets without delaying the procedure workflow. Atlas-based segmentation incorporates prior knowledge of the anatomy of interest from expertly annotated image datasets. Typically, the ground truth atlas label is propagated to a test image using a combination of global and local registration. The high computational cost of non-rigid registration motivated us to obtain an initial segmentation using global transformations based on an atlas of the left ventricle from a population of patient MRI images and to refine it using a well-developed technique based on graph cuts. Here we quantitatively compare the segmentations obtained from the global and global-plus-local atlases and refined using graph cut-based techniques against the expert segmentations according to several similarity metrics, including the Dice correlation coefficient, Jaccard coefficient, Hausdorff distance, and mean absolute distance error.
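The overlap metrics named in the abstract are standard; a small sketch for binary masks is given below (the masks are synthetic and this is not the study's evaluation code).

```python
import numpy as np
from scipy.spatial.distance import directed_hausdorff

def overlap_metrics(seg, ref):
    """Dice and Jaccard coefficients for two binary segmentation masks."""
    seg, ref = seg.astype(bool), ref.astype(bool)
    inter = np.logical_and(seg, ref).sum()
    dice = 2.0 * inter / (seg.sum() + ref.sum())
    jaccard = inter / np.logical_or(seg, ref).sum()
    return dice, jaccard

def hausdorff(seg, ref):
    """Symmetric Hausdorff distance between foreground pixel coordinates."""
    a = np.argwhere(seg)
    b = np.argwhere(ref)
    return max(directed_hausdorff(a, b)[0], directed_hausdorff(b, a)[0])

seg = np.zeros((64, 64), dtype=bool); seg[20:40, 20:40] = True
ref = np.zeros((64, 64), dtype=bool); ref[22:42, 18:38] = True
print(overlap_metrics(seg, ref), hausdorff(seg, ref))
```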
Drill hole logging with infrared spectroscopy
Calvin, W.M.; Solum, J.G.
2005-01-01
Infrared spectroscopy has been used to identify rocks and minerals for over 40 years. The technique is sensitive to primary silicates as well as alteration products. Minerals can be uniquely identified based on multiple absorption features at wavelengths from the visible to the thermal infrared. We are currently establishing methods and protocols in order to use the technique for rapid assessment of downhole lithology on samples obtained during drilling operations. Initial work performed includes spectral analysis of chip cuttings and core sections from drill sites around Desert Peak, NV. In this paper, we report on a survey of 10,000 feet of drill cuttings, at 100 foot intervals, from the San Andreas Fault Observatory at Depth (SAFOD). Data from Blue Mountain geothermal wells will also be acquired. We will describe the utility of the technique for rapid assessment of lithologic and mineralogic discrimination.
Performance tests and quality control of cathode ray tube displays.
Roehrig, H; Blume, H; Ji, T L; Browne, M
1990-08-01
Spatial resolution, noise, characteristic curve, and absolute luminance are the essential parameters that describe the physical image quality of a display. This paper presents simple procedures for assessing the performance of a cathode ray tube (CRT) in terms of these parameters, as well as easy set-up techniques. The procedures can be used in the environment where the CRT is used. The procedures are based on a digital representation of the Society of Motion Picture and Television Engineers pattern plus a few other simple digital patterns. Additionally, measurement techniques are discussed for estimating brightness uniformity, veiling glare, and distortion. Apart from the absolute luminance, all performance features can be assessed with an uncalibrated photodetector and the eyes of a human observer. The measurement techniques especially enable the user to perform comparisons of different display systems.
On the Assessment of Global Terrestrial Reference Frame Temporal Variations
NASA Astrophysics Data System (ADS)
Ampatzidis, Dimitrios; Koenig, Rolf; Zhu, Shengyuan
2015-04-01
Global Terrestrial Reference Frames (GTRFs) such as the International Terrestrial Reference Frame (ITRF) provide reliable 4-D position information (3-D coordinates and their evolution through time). The given 3-D velocities play a significant role in precise position acquisition and are estimated from long-term coordinate time series from the space-geodetic techniques DORIS, GNSS, SLR, and VLBI. A GTRF's temporal evolution is directly connected with its internal stability: the more intense and inhomogeneous the velocity field, the less stable the derived TRF. The quality of a GTRF is mainly assessed by comparing it to each individual technique's reference frame. For example, comparison of a GTRF to the SLR-only TRF gives a sense of the ITRF stability with respect to the geocenter and scale and their associated rates, while comparison of the ITRF to the VLBI-only TRF can be used for scale validation. However, to date there is no established methodology for the overall assessment (in terms of origin, orientation and scale) of the temporal evolution of GTRFs and their associated accuracy. We present a new alternative diagnostic tool for assessing GTRF temporal evolution based on the well-known time-dependent Helmert-type transformation formula (three shift rates, three rotation rates and a scale rate). The advantage of the new methodology relies on the fact that it uses the full velocity field of the TRF, and therefore all points, not just the ones common to different techniques. It also examines rates of origin, orientation and scale simultaneously. The methodology is presented and applied to the two existing GTRFs (the ITRF and the DTRF computed by DGFI), and the results are discussed. The results also allow a direct comparison of each GTRF's dynamic behavior. Furthermore, the correlations of the estimated parameters can provide useful information for the proposed GTRF assessment scheme.
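For reference, one common form of the rate (time-dependent) part of the Helmert transformation between two frames is sketched below; this is the standard seven-rate parameterization, not necessarily the exact convention used by the authors.

```latex
\dot{\mathbf{X}}_2 = \dot{\mathbf{X}}_1 + \dot{\mathbf{T}} + \dot{s}\,\mathbf{X}_1 + \dot{\mathbf{R}}\,\mathbf{X}_1,
\qquad
\dot{\mathbf{R}} =
\begin{pmatrix}
0 & -\dot{r}_z & \dot{r}_y \\
\dot{r}_z & 0 & -\dot{r}_x \\
-\dot{r}_y & \dot{r}_x & 0
\end{pmatrix}
```

Here \(\dot{\mathbf{T}} = (\dot{t}_x, \dot{t}_y, \dot{t}_z)^{\mathsf{T}}\) are the translation (origin) rates, \(\dot{s}\) is the scale rate, and \(\dot{r}_x, \dot{r}_y, \dot{r}_z\) are the rotation rates.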
ERIC Educational Resources Information Center
Ali, Gadacha
2007-01-01
This investigation aims to assess awareness of genre and writing skills among science students via an abstract writing task, with recall and follow-up protocols to monitor the students, and to characterize the relationship between the abstract and the base article. Abstract writing involves specific data selection techniques of activities involved…
Mee-Sook Kim; Ned B. Klopfenstein; Geral I. McDonald; Kathiravetpillai Arumuganathan
2001-01-01
For assessments of intraspecific mating using flow cytometry and fluorescence microscopy, two compatible basidiospore-derived isolates were selected from each of four parental basidiomata of North American Biological Species (NABS) X. The nuclear status in NABS X varied with basidiospore-derived isolates. Nuclei within basidiospore-derived isolates existed as haploids...
Assessment of Physical Activity and Energy Expenditure: An Overview of Objective Measures
Hills, Andrew P.; Mokhtar, Najat; Byrne, Nuala M.
2014-01-01
The ability to assess energy expenditure (EE) and estimate physical activity (PA) in free-living individuals is extremely important in the global context of non-communicable diseases including malnutrition, overnutrition (obesity), and diabetes. It is also important to appreciate that PA and EE are different constructs with PA defined as any bodily movement that results in EE and accordingly, energy is expended as a result of PA. However, total energy expenditure, best assessed using the criterion doubly labeled water (DLW) technique, includes components in addition to physical activity energy expenditure, namely resting energy expenditure and the thermic effect of food. Given the large number of assessment techniques currently used to estimate PA in humans, it is imperative to understand the relative merits of each. The goal of this review is to provide information on the utility and limitations of a range of objective measures of PA and their relationship with EE. The measures discussed include those based on EE or oxygen uptake including DLW, activity energy expenditure, physical activity level, and metabolic equivalent; those based on heart rate monitoring and motion sensors; and because of their widespread use, selected subjective measures. PMID:25988109
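The component structure mentioned in the abstract can be written compactly; the following is the commonly used partition (a sketch of the standard relationship, not an equation quoted from this review).

```latex
\mathrm{TEE} = \mathrm{REE} + \mathrm{TEF} + \mathrm{AEE},
\qquad
\mathrm{PAL} = \frac{\mathrm{TEE}}{\mathrm{REE}}
```

where TEE is total energy expenditure (criterion method: doubly labeled water), REE is resting energy expenditure, TEF is the thermic effect of food, AEE is activity energy expenditure, and PAL is the physical activity level.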
Integrated performance and reliability specification for digital avionics systems
NASA Technical Reports Server (NTRS)
Brehm, Eric W.; Goettge, Robert T.
1995-01-01
This paper describes an automated tool for performance and reliability assessment of digital avionics systems, called the Automated Design Tool Set (ADTS). ADTS is based on an integrated approach to design assessment that unifies traditional performance and reliability views of system designs and that addresses interdependencies between performance and reliability behavior via the exchange of parameters and results between mathematical models of each type. A multi-layer tool set architecture has been developed for ADTS that separates the concerns of system specification, model generation, and model solution. Performance and reliability models are generated automatically as a function of candidate system designs, and model results are expressed within the system specification. The layered approach helps deal with the inherent complexity of the design assessment process and preserves long-term flexibility to accommodate a wide range of models and solution techniques within the tool set structure. ADTS research and development to date has focused on the development of a language for specifying system designs as a basis for performance and reliability evaluation. A model generation and solution framework has also been developed for ADTS that will ultimately encompass an integrated set of analytic and simulation-based techniques for performance, reliability, and combined design assessment.
Shape-based human detection for threat assessment
NASA Astrophysics Data System (ADS)
Lee, Dah-Jye; Zhan, Pengcheng; Thomas, Aaron; Schoenberger, Robert B.
2004-07-01
Detection of intrusions for early threat assessment requires the capability of distinguishing whether the intrusion is a human, an animal, or another object. Most low-cost security systems use simple electronic motion detection sensors to monitor motion or the location of objects within the perimeter. Although cost effective, these systems suffer from high rates of false alarm, especially when monitoring open environments. Any moving object, including an animal, can falsely trigger the security system. Other security systems that utilize video equipment require human interpretation of the scene in order to make real-time threat assessments. A shape-based human detection technique has been developed for accurate early threat assessment in open and remote environments. Potential threats are isolated from the static background scene using differential motion analysis, and contours of the intruding objects are extracted for shape analysis. Contour points are simplified by removing redundant points connecting short and straight line segments and preserving only those with shape significance. Contours are represented in tangent space for comparison with shapes stored in a database. A power cepstrum technique has been developed to search for the best-matched contour in the database and to distinguish a human from other objects at different viewing angles and distances.
Extending the Distributed Lag Model framework to handle chemical mixtures.
Bello, Ghalib A; Arora, Manish; Austin, Christine; Horton, Megan K; Wright, Robert O; Gennings, Chris
2017-07-01
Distributed Lag Models (DLMs) are used in environmental health studies to analyze the time-delayed effect of an exposure on an outcome of interest. Given the increasing need for analytical tools for evaluation of the effects of exposure to multi-pollutant mixtures, this study attempts to extend the classical DLM framework to accommodate and evaluate multiple longitudinally observed exposures. We introduce 2 techniques for quantifying the time-varying mixture effect of multiple exposures on an outcome of interest. Lagged WQS, the first technique, is based on Weighted Quantile Sum (WQS) regression, a penalized regression method that estimates mixture effects using a weighted index. We also introduce Tree-based DLMs, a nonparametric alternative for assessment of lagged mixture effects. This technique is based on the Random Forest (RF) algorithm, a nonparametric, tree-based estimation technique that has shown excellent performance in a wide variety of domains. In a simulation study, we tested the feasibility of these techniques and evaluated their performance in comparison to standard methodology. Both methods exhibited relatively robust performance, accurately capturing pre-defined non-linear functional relationships in different simulation settings. Further, we applied these techniques to data on perinatal exposure to environmental metal toxicants, with the goal of evaluating the effects of exposure on neurodevelopment. Our methods identified critical neurodevelopmental windows showing significant sensitivity to metal mixtures. Copyright © 2017 Elsevier Inc. All rights reserved.
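As a hedged illustration of the nonparametric, tree-based flavour of lagged-exposure modelling the abstract describes, the sketch below regresses an outcome on a matrix of lagged exposures with a random forest and inspects the lag importances; the data, lag count and hyper-parameters are invented for illustration, and this is not the authors' Tree-based DLM implementation.

```python
import numpy as np
from sklearn.ensemble import RandomForestRegressor

# Hypothetical data: one exposure measured at 10 time lags for 200 subjects
rng = np.random.default_rng(0)
n, lags = 200, 10
exposure = rng.normal(size=(n, lags))  # columns = lag 0 ... lag 9
outcome = (0.8 * exposure[:, 3] - 0.5 * exposure[:, 4] ** 2
           + rng.normal(scale=0.3, size=n))

# Regress the outcome on the full lag matrix and read off lag importances
rf = RandomForestRegressor(n_estimators=500, random_state=0)
rf.fit(exposure, outcome)
for lag, imp in enumerate(rf.feature_importances_):
    print(f"lag {lag}: importance = {imp:.3f}")
```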
Limb length inequality: clinical implications for assessment and intervention.
Brady, Rebecca J; Dean, John B; Skinner, T Marc; Gross, Michael T
2003-05-01
The purpose of this paper is to review relevant literature concerning limb length inequalities in adults and to make recommendations for assessment and intervention based on the literature and our own clinical experience. Literature searches were conducted in the MEDLINE, PubMed, and CINAHL databases. Limb length inequality and common classification criteria are defined and etiological factors are presented. Common methods of detecting limb length inequality include direct (tape measure methods), indirect (pelvic leveling), and radiological techniques. Interventions include shoe inserts or external shoe lift therapy for mild cases. Surgery may be appropriate in severe cases. Little agreement exists regarding the prevalence of limb length inequality, the degree of limb length inequality that is considered clinically significant, and the reliability and validity of assessment methods. Based on correlational studies, the relationship between limb length inequality and orthopaedic pathologies is questionable. Stronger support for the link between low back pain (LBP) and limb length inequality is provided by intervention studies. Methods involving palpation of pelvic landmarks with block correction have the most support for clinical assessment of limb length inequality. Standing radiographs are suggested when clinical assessment methods are unsatisfactory. Clinicians should exercise caution when undertaking intervention strategies for limb length inequality of less than 5 mm when limb length inequality has been identified with clinical techniques. Recommendations are provided regarding intervention strategies.
Transforming the Cross Cultural Collaborative of Pierce County Through Assessment Capacity Building
Garza, Mary A.; Abatemarco, Diane J.; Gizzi, Cindan; Abegglen, Lynn M.; Johnson-Conley, Christina
2010-01-01
Background Underserved populations are underrepresented in public health initiatives such as tobacco control and in cancer clinical trials. Community involvement is crucial to interventions aimed at reducing health disparities, and local health departments increasingly are called upon to provide both leadership and funding. The Tacoma Pierce County Health Department (TPCHD), in conjunction with 13 key community-based organizations and healthcare systems, formed the Cross Cultural Collaborative of Pierce County (CCC) that successfully employs needs-assessment and evaluation techniques to identify community health initiatives. Methods Community leaders from six underserved populations of the CCC were trained in needs-assessments techniques. Assessments measured effectiveness of the collaborative process and community health initiatives by using key informant (n = 18) and group interviews (n = 3). Results The CCC, facilitated by its partnership with the TPCHD, built capacity and competence across community groups to successfully obtain two funded public health initiatives for six priority populations. Members expressed overall satisfaction with the training, organizational structure, and leadership. The CCC’s diversity, cultural competency, and sharing of resources were viewed both as a strength and a decision-making challenge. Conclusion Public health department leadership, collaboration, and evidence-based assessment and evaluation were key to demonstrating effectiveness of the interventions, ensuring the CCC’s sustainability. PMID:19077598
Landslide hazard assessment: recent trends and techniques.
Pardeshi, Sudhakar D; Autade, Sumant E; Pardeshi, Suchitra S
2013-01-01
Landslide hazard assessment is an important step towards landslide hazard and risk management. There are several methods of Landslide Hazard Zonation (LHZ), namely heuristic, semi-quantitative, quantitative, probabilistic and multi-criteria decision-making approaches. However, no single method is universally accepted for effective assessment of landslide hazards. In recent years, several attempts have been made to apply different methods of LHZ and to compare results in order to find the best-suited model. This paper presents a review of research on landslide hazard mapping published in recent years. Advanced multivariate techniques have proved effective in the spatial prediction of landslides with a high degree of accuracy. Physical process-based models also perform well in LHZ mapping, even in areas with a poor database. Multi-criteria decision-making approaches also play a significant role in determining the relative importance of landslide causative factors in the slope instability process. Remote sensing and Geographical Information Systems (GIS) are powerful tools to assess landslide hazards and have been used extensively in landslide research over the last decade. Aerial photographs and high-resolution satellite data are useful in detecting, mapping and monitoring landslide processes. GIS-based LHZ models help not only to map and monitor landslides but also to predict future slope failures. Advancements in geospatial technologies have opened the doors for detailed and accurate assessment of landslide hazards.
Methodology development for evaluation of selective-fidelity rotorcraft simulation
NASA Technical Reports Server (NTRS)
Lewis, William D.; Schrage, D. P.; Prasad, J. V. R.; Wolfe, Daniel
1992-01-01
This paper addresses the initial step toward the goal of establishing performance and handling qualities acceptance criteria for real-time rotorcraft simulators through a planned research effort to quantify the system capabilities of 'selective fidelity' simulators. Within this framework the simulator is classified based on the required task. The simulator is evaluated by separating the various subsystems (visual, motion, etc.) and applying corresponding fidelity constants based on the specific task. This methodology not only provides an assessment technique but also a means of determining the required levels of subsystem fidelity for a specific task.
NASA Astrophysics Data System (ADS)
Caffini, Matteo; Bergsland, Niels; Laganà, Marcella; Tavazzi, Eleonora; Tortorella, Paola; Rovaris, Marco; Baselli, Giuseppe
2014-03-01
Despite advances in the application of nonconventional MRI techniques in furthering the understanding of multiple sclerosis pathogenic mechanisms, there are still many unanswered questions, such as the relationship between gray and white matter damage. We applied a combination of advanced surface-based reconstruction and diffusion tensor imaging techniques to address this issue. We found significant relationships between white matter tract integrity indices and corresponding cortical structures. Our results suggest a direct link between damage in white and gray matter and contribute to the notion of gray matter loss relating to clinical disability.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Liu, Hu-Chen; Wu, Jing
Highlights: • Propose a VIKOR-based fuzzy MCDM technique for evaluating HCW disposal methods. • Linguistic variables are used to assess the ratings and weights for the criteria. • The OWA operator is utilized to aggregate individual opinions of decision makers. • A case study is given to illustrate the procedure of the proposed framework. - Abstract: Nowadays, selection of the appropriate treatment method in health-care waste (HCW) management has become a challenging task for municipal authorities, especially in developing countries. Assessment of HCW disposal alternatives can be regarded as a complicated multi-criteria decision making (MCDM) problem which requires consideration of multiple alternative solutions and conflicting tangible and intangible criteria. The objective of this paper is to present a new MCDM technique based on fuzzy set theory and the VIKOR method for evaluating HCW disposal methods. Linguistic variables are used by decision makers to assess the ratings and weights for the established criteria. The ordered weighted averaging (OWA) operator is utilized to aggregate individual opinions of decision makers into a group assessment. The computational procedure of the proposed framework is illustrated through a case study in Shanghai, one of the largest cities in China. The HCW treatment alternatives considered in this study include "incineration", "steam sterilization", "microwave" and "landfill". The results obtained using the proposed approach are analyzed in a comparative way.
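A small sketch of the OWA aggregation step mentioned in the highlights; the scores and rank weights are invented for illustration, and the fuzzy-VIKOR machinery around it is not shown.

```python
import numpy as np

def owa(values, weights):
    """Ordered weighted averaging: sort the scores in descending order and take
    a weighted sum, so the weights attach to rank positions rather than to
    particular decision makers."""
    values = np.sort(np.asarray(values, dtype=float))[::-1]
    weights = np.asarray(weights, dtype=float)
    assert len(weights) == len(values) and np.isclose(weights.sum(), 1.0)
    return float(np.dot(values, weights))

# Four decision makers rate one alternative on one criterion (0-10 scale)
print(owa([7.0, 9.0, 6.0, 8.0], [0.4, 0.3, 0.2, 0.1]))
```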
Three-dimensional assessment of scoliosis based on ultrasound data
NASA Astrophysics Data System (ADS)
Zhang, Junhua; Li, Hongjian; Yu, Bo
2015-12-01
In this study, an approach was proposed to assess the 3D scoliotic deformity based on ultrasound data. The 3D spine model was reconstructed by using a freehand 3D ultrasound imaging system. The geometric torsion was then calculated from the reconstructed spine model. A thoracic spine phantom set at a given pose was used in the experiment. The geometric torsion of the spine phantom calculated from the freehand ultrasound imaging system was 0.041 mm⁻¹, which was close to that calculated from the biplanar radiographs (0.025 mm⁻¹). Therefore, ultrasound is a promising technique for the 3D assessment of scoliosis.
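For context, geometric torsion in the usual differential-geometric sense measures how sharply a 3-D curve r(s) twists out of its osculating plane; one standard expression is shown below (the paper's discrete estimator applied to the reconstructed spinal curve may differ in detail).

```latex
\tau(s) = \frac{\left(\mathbf{r}'(s) \times \mathbf{r}''(s)\right) \cdot \mathbf{r}'''(s)}
               {\left\lVert \mathbf{r}'(s) \times \mathbf{r}''(s) \right\rVert^{2}}
```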
Validation of Regression-Based Myogenic Correction Techniques for Scalp and Source-Localized EEG
McMenamin, Brenton W.; Shackman, Alexander J.; Maxwell, Jeffrey S.; Greischar, Lawrence L.; Davidson, Richard J.
2008-01-01
EEG and EEG source-estimation are susceptible to electromyographic artifacts (EMG) generated by the cranial muscles. EMG can mask genuine effects or masquerade as a legitimate effect - even in low frequencies, such as alpha (8–13Hz). Although regression-based correction has been used previously, only cursory attempts at validation exist and the utility for source-localized data is unknown. To address this, EEG was recorded from 17 participants while neurogenic and myogenic activity were factorially varied. We assessed the sensitivity and specificity of four regression-based techniques: between-subjects, between-subjects using difference-scores, within-subjects condition-wise, and within-subject epoch-wise on the scalp and in data modeled using the LORETA algorithm. Although within-subject epoch-wise showed superior performance on the scalp, no technique succeeded in the source-space. Aside from validating the novel epoch-wise methods on the scalp, we highlight methods requiring further development. PMID:19298626
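One simple flavour of regression-based myogenic correction, shown purely as an orientation sketch: each epoch's EEG trace is regressed on a simultaneous EMG reference and the fitted contribution is subtracted. The authors' between-subject, condition-wise and epoch-wise variants differ in how the regression is set up, so the raw-trace version below is an assumption for illustration, not their pipeline.

```python
import numpy as np

def epochwise_emg_correction(eeg, emg):
    """Epoch-wise regression correction (illustrative): for each epoch, regress
    the EEG channel on the EMG reference and subtract the fitted contribution.
    eeg, emg: arrays of shape (n_epochs, n_samples)."""
    corrected = np.empty_like(eeg)
    for i in range(eeg.shape[0]):
        x = emg[i] - emg[i].mean()
        y = eeg[i] - eeg[i].mean()
        beta = np.dot(x, y) / np.dot(x, x)  # least-squares slope
        corrected[i] = eeg[i] - beta * emg[i]
    return corrected

rng = np.random.default_rng(0)
emg = rng.normal(size=(40, 512))
eeg = 0.6 * emg + rng.normal(size=(40, 512))  # EEG contaminated by EMG
clean = epochwise_emg_correction(eeg, emg)
```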
Fly ash based zeolitic pigments for application in anticorrosive paints
DOE Office of Scientific and Technical Information (OSTI.GOV)
Shaw, Ruchi, E-mail: shawruchi1@gmail.com; Tiwari, Sangeeta, E-mail: stiwari2@amity.edu
2016-04-13
The purpose of this work is to evaluate the utilization of waste fly ash in anticorrosive paints. Zeolite NaY was synthesized from waste fly ash and subsequently modified by exchanging its nominal cation Na+ with Mg2+ and Ca2+ ions. The metal-ion-exchanged zeolite was then used as an anticorrosive zeolitic pigment in paints. The prepared zeolite NaY was characterized using the X-ray diffraction technique and scanning electron microscopy. The size, shape and density of the prepared fly ash based pigments were determined by various techniques. The paints were prepared by using fly ash based zeolitic pigments in epoxy resin, and the percentages of pigments used in the paints were 2% and 5%. These paints were applied to mild steel panels, and the anticorrosive properties of the pigments were assessed by electrochemical impedance spectroscopy (EIS).
Towards a Quality Assessment Method for Learning Preference Profiles in Negotiation
NASA Astrophysics Data System (ADS)
Hindriks, Koen V.; Tykhonov, Dmytro
In automated negotiation, information gained about an opponent's preference profile by means of learning techniques may significantly improve an agent's negotiation performance. It therefore is useful to gain a better understanding of how various negotiation factors influence the quality of learning. The quality of learning techniques in negotiation is typically assessed indirectly by means of comparing the utility levels of agreed outcomes and other more global negotiation parameters. An evaluation of learning based on such general criteria, however, does not provide any insight into the influence of various aspects of negotiation on the quality of the learned model itself. The quality may depend on such aspects as the domain of negotiation, the structure of the preference profiles, the negotiation strategies used by the parties, and others. To gain a better understanding of the performance of proposed learning techniques in the context of negotiation, and to be able to assess the potential to improve the performance of such techniques, a more systematic assessment method is needed. In this paper we propose such a systematic method to analyse the quality of the information gained about opponent preferences by learning in single-instance negotiations. The method includes measures to assess the quality of a learned preference profile and proposes an experimental setup to analyse the influence of various negotiation aspects on the quality of learning. We apply the method to a Bayesian learning approach for learning an opponent's preference profile and discuss our findings.
Dai, Yunchao; Nasir, Mubasher; Zhang, Yulin; Gao, Jiakai; Lv, Yamin; Lv, Jialong
2018-01-01
Several predictive models and methods have been used for heavy metal bioavailability, but there is no universally accepted approach for evaluating the bioavailability of arsenic (As) in soil. The technique of diffusive gradients in thin films (DGT) is a promising tool, but there is considerable debate about its suitability. The DGT method was compared with traditional chemical extraction techniques (soil solution, NaHCO3, NH4Cl, HCl, and total As) for estimating As bioavailability in soil, based on a greenhouse experiment using Brassica chinensis grown in various soils from 15 provinces in China. In addition, we assessed whether these methods are independent of soil properties. The correlations between plant and soil As concentrations measured with traditional extraction techniques were pH and iron oxide (Feox) dependent, indicating that these methods are influenced by soil properties. In contrast, DGT measurements were independent of soil properties and also showed a better correlation coefficient than the other traditional techniques. Thus, the DGT technique is superior to traditional techniques and should be preferred for evaluating As bioavailability in different types of soils. Copyright © 2017 Elsevier Ltd. All rights reserved.
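For reference, the DGT-measured concentration is conventionally obtained from the mass of analyte accumulated by the binding gel via the standard DGT expression below; this is the textbook relationship, not an equation reproduced from the abstract.

```latex
C_{\mathrm{DGT}} = \frac{M\,\Delta g}{D\,A\,t}
```

where M is the accumulated analyte mass, Δg the thickness of the diffusive layer, D the diffusion coefficient of the analyte in the gel, A the exposure area, and t the deployment time.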
Hipol, Leilani J; Deacon, Brett J
2013-03-01
Despite the well-established effectiveness of exposure-based cognitive-behavioral therapy (CBT) in the treatment of anxiety disorders, therapists have been slow to adopt CBT into their clinical practice. The present study was conducted to examine the utilization of psychotherapy techniques for anxiety disorders among community practitioners in a rural setting in order to determine the current status of the dissemination of CBT. A sample of 51 licensed psychotherapists from various mental health professions was recruited from online practice listings in the state of Wyoming. Participants completed a survey assessing their use of various psychotherapy techniques in the past 12 months for clients with obsessive-compulsive disorder, post-traumatic stress disorder, panic disorder, and social phobia. Nearly all psychotherapists reported providing CBT, and techniques such as cognitive restructuring, arousal-reduction strategies, and mindfulness were used by the vast majority of respondents. Therapist-assisted exposure was rarely utilized, and providers who delivered exposure therapy often did so alongside other techniques of questionable compatibility with this approach. Non-evidence-based techniques were frequently used, particularly by self-proclaimed anxiety specialists. Our findings highlight the successes and failures of efforts to disseminate exposure-based CBT to community practitioners. Implications for clinical training and practice are discussed.
A Structured Approach to Teaching Applied Problem Solving through Technology Assessment.
ERIC Educational Resources Information Center
Fischbach, Fritz A.; Sell, Nancy J.
1986-01-01
Describes an approach to problem solving based on real-world problems. Discusses problem analysis and definitions, preparation of briefing documents, solution finding techniques (brainstorming and synectics), solution evaluation and judgment, and implementation. (JM)
Suitability of analytical methods to measure solubility for the purpose of nanoregulation.
Tantra, Ratna; Bouwmeester, Hans; Bolea, Eduardo; Rey-Castro, Carlos; David, Calin A; Dogné, Jean-Michel; Jarman, John; Laborda, Francisco; Laloy, Julie; Robinson, Kenneth N; Undas, Anna K; van der Zande, Meike
2016-01-01
Solubility is an important physicochemical parameter in nanoregulation. If a nanomaterial is completely soluble, then from a risk assessment point of view its disposal can be treated in much the same way as that of "ordinary" chemicals, which will simplify testing and characterisation regimes. This review assesses potential techniques for the measurement of nanomaterial solubility and evaluates their performance against a set of analytical criteria (based on satisfying the requirements governed by the cosmetic regulation as well as the need to quantify the concentration of free (hydrated) ions). Our findings show that no universal method exists. A complementary approach is thus recommended, comprising an atomic spectrometry-based method in conjunction with an electrochemical (or colorimetric) method. This article shows that although some techniques are more commonly used than others, a huge research gap remains related to the need to ensure data reliability.
Ultrasound elastography: principles, techniques, and clinical applications.
Dewall, Ryan J
2013-01-01
Ultrasound elastography comprises an emerging set of imaging modalities used to image tissue elasticity, often referred to as virtual palpation. These techniques have proven effective in detecting and assessing many different pathologies, because tissue mechanical changes often correlate with tissue pathological changes. This article reviews the principles of ultrasound elastography, many of the ultrasound-based techniques, and popular clinical applications. Originally, elastography was a technique that imaged tissue strain by comparing pre- and postcompression ultrasound images. However, new techniques have been developed that use different excitation methods such as external vibration or acoustic radiation force. Some techniques track transient phenomena such as shear waves to quantitatively measure tissue elasticity. Clinical use of elastography is increasing, with applications including lesion detection and classification, fibrosis staging, treatment monitoring, vascular imaging, and musculoskeletal applications.
Interpolation/extrapolation technique with application to hypervelocity impact of space debris
NASA Technical Reports Server (NTRS)
Rule, William K.
1992-01-01
A new technique for the interpolation/extrapolation of engineering data is described. The technique easily allows for the incorporation of additional independent variables, and the most suitable data in the data base is automatically used for each prediction. The technique provides diagnostics for assessing the reliability of the prediction. Two sets of predictions made for known 5-degree-of-freedom, 15-parameter functions using the new technique produced an average coefficient of determination of 0.949. Here, the technique is applied to the prediction of damage to the Space Station from hypervelocity impact of space debris. A new set of impact data is presented for this purpose. Reasonable predictions for bumper damage were obtained, but predictions of pressure wall and multilayer insulation damage were poor.
Kelly, Kassandra C; Jordan, Erin M; Joyner, A Barry; Burdette, G Trey; Buckley, Thomas A
2014-01-01
A cornerstone of the recent consensus statements on concussion is a multifaceted concussion-assessment program at baseline and postinjury and when tracking recovery. Earlier studies of athletic trainers' (ATs') practice patterns found limited use of multifaceted protocols; however, these authors typically grouped diverse athletic training settings together. To (1) describe the concussion-management practice patterns of National Collegiate Athletic Association (NCAA) Division I ATs, (2) compare these practice patterns to earlier studies, and (3) objectively characterize the clinical examination. Cross-sectional study. Online survey. A total of 610 ATs from NCAA Division I institutions, for a response rate of 34.4%. The survey had 3 subsections: demographic questions related to the participant's experiences, concussion-assessment practice patterns, and concussion-recovery and return-to-participation practice patterns. Specific practice-pattern questions addressed balance, cognitive and mental status, neuropsychological testing, and self-reported symptoms. Finally, specific components of the clinical examination were examined. We identified high rates of multifaceted assessments (i.e., assessments using at least 3 techniques) during testing at baseline (71.2%), acute concussion assessment (79.2%), and return to participation (66.9%). The specific techniques used are provided along with their adherence with evidence-based practice findings. Respondents endorsed a diverse array of clinical examination techniques that often overlapped objective concussion-assessment protocols or were likely used to rule out associated potential conditions. Respondents were cognizant of the Third International Consensus Statement, the National Athletic Trainers' Association position statement, and the revised NCAA Sports Medicine Handbook recommendations. Athletic trainers in NCAA Division I demonstrated widespread use of multifaceted concussion-assessment techniques and appeared compliant with recent consensus statements and the NCAA Sports Medicine Handbook.
Community-based approaches to strategic environmental assessment: Lessons from Costa Rica
DOE Office of Scientific and Technical Information (OSTI.GOV)
Sinclair, A. John; Sims, Laura; Spaling, Harry
This paper describes a community-based approach to strategic environmental assessment (SEA) using a case study of the Instituto Costarricense de Electricidad's (ICE) watershed management agricultural program (WMAP) in Costa Rica. The approach focused on four highly interactive workshops that used visioning, brainstorming and critical reflection exercises. Each workshop represented a critical step in the SEA process. Through this approach, communities in two rural watersheds assessed the environmental, social and economic impacts of a proposed second phase for WMAP. Lessons from this community-based approach to strategic environmental assessment include a recognition of participants learning what a participatory SEA is conceptually and methodologically; the role of interactive techniques for identifying positive and negative impacts of the proposed program and generating creative mitigation strategies; the effect of workshops in reducing power differentials among program participants (proponent, communities, government agencies); and the logistical importance of notice, timing and location for meaningful participation. The community-based approach to SEA offers considerable potential for assessing regional (watershed) development programs focused on sustainable resource-based livelihoods.
Murdoch, B E; Pitt, G; Theodoros, D G; Ward, E C
1999-01-01
The efficacy of traditional and physiological biofeedback methods for modifying abnormal speech breathing patterns was investigated in a child with persistent dysarthria following severe traumatic brain injury (TBI). An A-B-A-B single-subject experimental research design was utilized to provide the subject with two exclusive periods of therapy for speech breathing, based on traditional therapy techniques and physiological biofeedback methods, respectively. Traditional therapy techniques included establishing optimal posture for speech breathing, explanation of the movement of the respiratory muscles, and a hierarchy of non-speech and speech tasks focusing on establishing an appropriate level of sub-glottal air pressure, and improving the subject's control of inhalation and exhalation. The biofeedback phase of therapy utilized variable inductance plethysmography (or Respitrace) to provide real-time, continuous visual biofeedback of ribcage circumference during breathing. As in traditional therapy, a hierarchy of non-speech and speech tasks were devised to improve the subject's control of his respiratory pattern. Throughout the project, the subject's respiratory support for speech was assessed both instrumentally and perceptually. Instrumental assessment included kinematic and spirometric measures, and perceptual assessment included the Frenchay Dysarthria Assessment, Assessment of Intelligibility of Dysarthric Speech, and analysis of a speech sample. The results of the study demonstrated that real-time continuous visual biofeedback techniques for modifying speech breathing patterns were not only effective, but superior to the traditional therapy techniques for modifying abnormal speech breathing patterns in a child with persistent dysarthria following severe TBI. These results show that physiological biofeedback techniques are potentially useful clinical tools for the remediation of speech breathing impairment in the paediatric dysarthric population.
Knowledge-based system verification and validation
NASA Technical Reports Server (NTRS)
Johnson, Sally C.
1990-01-01
The objective of this task is to develop and evaluate a methodology for verification and validation (V&V) of knowledge-based systems (KBS) for space station applications with high reliability requirements. The approach consists of three interrelated tasks. The first task is to evaluate the effectiveness of various validation methods for space station applications. The second task is to recommend requirements for KBS V&V for Space Station Freedom (SSF). The third task is to recommend modifications to the SSF to support the development of KBS using effective software engineering and validation techniques. To accomplish the first task, three complementary techniques will be evaluated: (1) Sensitivity Analysis (Worcester Polytechnic Institute); (2) Formal Verification of Safety Properties (SRI International); and (3) Consistency and Completeness Checking (Lockheed AI Center). During FY89 and FY90, each contractor will independently demonstrate the use of his technique on the fault detection, isolation, and reconfiguration (FDIR) KBS of the manned maneuvering unit (MMU), a rule-based system implemented in LISP. During FY91, the application of each of the techniques to other knowledge representations and KBS architectures will be addressed. After evaluation of the results of the first task and examination of Space Station Freedom V&V requirements for conventional software, a comprehensive KBS V&V methodology will be developed and documented. Development of highly reliable KBSs cannot be accomplished without effective software engineering methods. Using the results of current in-house research to develop and assess software engineering methods for KBSs, as well as assessment of techniques being developed elsewhere, an effective software engineering methodology for space station KBSs will be developed, and modification of the SSF to support these tools and methods will be addressed.
Laser ultrasonic techniques for assessment of tooth structure
NASA Astrophysics Data System (ADS)
Blodgett, David W.; Baldwin, Kevin C.
2000-06-01
Dental health care and research workers require a means of imaging the structures within teeth in vivo. For example, there is a need to image the margins of a restoration for the detection of poor bonding or voids between the restorative material and the dentin. With conventional x-ray techniques, it is difficult to detect cracks and to visualize interfaces between hard media. This is due to the x-ray providing only a two-dimensional projection of the internal structure (i.e., a silhouette). In addition, a high-resolution imaging modality is needed to detect tooth decay in its early stages. If decay can be detected early enough, the process can be monitored and interventional procedures, such as fluoride washes and a controlled diet, can be initiated to help the tooth re-mineralize itself. Currently employed x-ray imaging is incapable of detecting decay at a stage early enough to avoid invasive cavity preparation followed by a restoration with a synthetic material. Other clinical applications include the visualization of periodontal defects, the localization of intraosseous lesions, and determining the degree of osseointegration between a dental implant and the surrounding bone. A means of assessing the internal structure of the tooth based on high-frequency, highly localized ultrasound (acoustic waves) generated by a laser pulse is discussed. Optical interferometric detection of ultrasound provides a complementary technique with a very small detection footprint. Initial results using laser-based ultrasound for assessment of dental structures are presented. Discussion centers on the adaptability of this technique to clinical applications.
NASA Astrophysics Data System (ADS)
McShane, Gareth; Farrow, Luke; Morgan, David; Glendell, Miriam; James, Mike; Quinton, John; Evans, Martin; Anderson, Karen; Rawlins, Barry; Quine, Timothy; Debell, Leon; Benaud, Pia; Jones, Lee; Kirkham, Matthew; Lark, Murray; Rickson, Jane; Brazier, Richard
2015-04-01
Quantifying soil loss through erosion processes at high resolution can be a time-consuming and costly undertaking. In this pilot study, 'a cost effective framework for monitoring soil erosion in England and Wales', funded by the UK Department for Environment, Food and Rural Affairs (Defra), we compare methods for collecting suitable topographic measurements via remote sensing. The aim is to enable efficient but detailed site-scale studies of erosion forms in inaccessible UK upland environments, to quantify dynamic processes, such as erosion and mass movement. The techniques assessed are terrestrial laser scanning (TLS), and unmanned aerial vehicle (UAV) photography and ground-based photography, both processed using structure-from-motion (SfM) 3D reconstruction software. Compared to other established techniques, such as expensive TLS, SfM offers a potentially low-cost alternative for the reconstruction of 3D high-resolution micro-topographic models from photographs taken with consumer-grade cameras. However, whilst an increasing number of research papers examine the relative merits of these novel versus more established survey techniques, no study to date has compared both ground-based and aerial SfM photogrammetry with TLS scanning across a range of scales (from m² to 16 ha). The evaluation of these novel low-cost techniques is particularly relevant in upland landscapes, where the remoteness and inaccessibility of field sites may render some of the more established survey techniques impractical. Volumetric estimates of soil loss are quantified using the digital surface models (DSMs) derived from the data from each technique and subtracted from a modelled pre-erosion surface, and the results from each technique are compared. The UAV was able to capture information over a wide area and across a range of altitudes and viewing angles. Combined with automated SfM-based processing, this technique was able to produce rapid orthophotos to support ground-based data acquisition, as well as a DSM for volume-loss measurement in larger features. However, the DSM of erosion features lacked the detail of those captured using the ground-based methods. Terrestrial laser scanning provided detailed, accurate, high-density measurements of the ground surface over long (100s of m) distances, but the size and weight of the instrument made it difficult to use in mountainous environments. In addition, deriving a reliable bare-earth digital terrain model (DTM) from TLS was at times problematic due to the presence of tall shrubby vegetation. Ground-based photography produced data sets comparable to terrestrial laser scanning and was the most useful for characterising small and difficult-to-view features. The relative advantages, limitations and cost-effectiveness of each approach at 5 upland sites across the UK are discussed.
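As an editor's illustration of the DSM-differencing step described above (not code from the study), the following minimal Python sketch subtracts a post-erosion surface model from a modelled pre-erosion surface on a co-registered grid; the array sizes, no-data convention and cell size are assumptions.

```python
# Minimal sketch: volumetric soil-loss estimate by differencing a post-erosion
# DSM from a modelled pre-erosion surface. Grids are assumed co-registered.
import numpy as np

def erosion_volume(pre_surface: np.ndarray, dsm: np.ndarray, cell_size_m: float) -> float:
    """Return eroded volume in cubic metres from two co-registered elevation grids."""
    diff = pre_surface - dsm                     # positive where material was lost
    diff = np.where(np.isnan(diff), 0.0, diff)   # ignore no-data cells
    diff = np.clip(diff, 0.0, None)              # keep only loss, not deposition
    return float(diff.sum() * cell_size_m ** 2)

# Example with synthetic 0.05 m grids: a 1 m x 1 m gully, 10 cm deep -> ~0.1 m^3
pre = np.zeros((100, 100))
post = pre.copy()
post[40:60, 40:60] -= 0.10
print(erosion_volume(pre, post, cell_size_m=0.05))
```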
Identification and assessment of hazardous compounds in drinking water.
Fawell, J K; Fielding, M
1985-12-01
The identification of organic chemicals in drinking water and their assessment in terms of potential hazardous effects are two very different but closely associated tasks. In relation to both continuous low-level background contamination and specific, often high-level, contamination due to pollution incidents, the identification of contaminants is a pre-requisite to evaluation of significant hazards. Even in the case of the rapidly developing short-term bio-assays which are applied to water to indicate a potential genotoxic hazard (for example Ames tests), identification of the active chemicals is becoming a major factor in the further assessment of the response. Techniques for the identification of low concentrations of organic chemicals in drinking water have developed remarkably since the early 1970s and methods based upon gas chromatography-mass spectrometry (GC-MS) have revolutionised qualitative analysis of water. Such techniques are limited to "volatile" chemicals and these usually constitute a small fraction of the total organic material in water. However, in recent years there have been promising developments in techniques for "non-volatile" chemicals in water. Such techniques include combined high-performance liquid chromatography-mass spectrometry (HPLC-MS) and a variety of MS methods, involving, for example, field desorption, fast atom bombardment and thermospray ionisation techniques. In the paper identification techniques in general are reviewed and likely future developments outlined. The assessment of hazards associated with chemicals identified in drinking and related waters usually centres upon toxicology - an applied science which involves numerous disciplines. The paper examines the toxicological information needed, the quality and deployment of such information and discusses future research needs. Application of short-term bio-assays to drinking water is a developing area and one which is closely involved with, and to some extent dependent on, powerful methods of identification. Recent developments are discussed.
Dohrenbusch, R
2009-06-01
Chronic pain accompanied by disability and handicap is a frequent symptom necessitating medical assessment. Current guidelines for the assessment of malingering suggest discrimination between explanatory demonstration, aggravation and simulation. However, this distinction has not clearly been put into operation and validated. The necessity of assessment strategies based on general principles of psychological assessment and testing is emphasized. Standardized and normalized psychological assessment methods and symptom validation techniques should be used in the assessment of subjects with chronic pain problems. An adaptive procedure for assessing the validity of complaints is suggested to minimize effort and costs.
Damage assessment in multilayered MEMS structures under thermal fatigue
NASA Astrophysics Data System (ADS)
Maligno, A. R.; Whalley, D. C.; Silberschmidt, V. V.
2011-07-01
This paper reports on the application of a Physics of Failure (PoF) methodology to assessing the reliability of a micro-electro-mechanical system (MEMS). Numerical simulations, based on the finite element method (FEM) with a sub-domain approach, were used to examine the damage onset due to temperature variations (e.g. yielding of metals, which may lead to thermal fatigue). In this work, remeshing techniques were employed in order to develop a damage tolerance approach based on the assumption that initial flaws exist in the multi-layered structure.
Internet-Based Delphi Research: Case Based Discussion
Donohoe, Holly M.; Stellefson, Michael L.
2013-01-01
The interactive capacity of the Internet offers benefits that are intimately linked with contemporary research innovation in the natural resource and environmental studies domains. However, e-research methodologies, such as the e-Delphi technique, have yet to undergo critical review. This study advances methodological discourse on the e-Delphi technique by critically assessing an e-Delphi case study. The analysis suggests that the benefits of using e-Delphi are noteworthy but the authors acknowledge that researchers are likely to face challenges that could potentially compromise research validity and reliability. To ensure that these issues are sufficiently considered when planning and designing an e-Delphi, important facets of the technique are discussed and recommendations are offered to help the environmental researcher avoid potential pitfalls associated with coordinating e-Delphi research. PMID:23288149
ERIC Educational Resources Information Center
Rybash, John M.; And Others
1975-01-01
This study used both verbal and videotape presentation techniques to assess the role of cognitive conflict in children's moral judgments. The results indicated that the children presented problems via videotape based their moral judgments on intentions, while verbal presentation increased the number of moral judgments based on damage. (JMB)
Nondestructive assessment of timber bridges using a vibration-based method
Xiping Wang; James P. Wacker; Robert J. Ross; Brian K. Brashaw
2005-01-01
This paper describes an effort to develop a global dynamic testing technique for evaluating the overall stiffness of timber bridge superstructures. A forced vibration method was used to measure the natural frequency of single-span timber bridges in the laboratory and field. An analytical model based on simple beam theory was proposed to represent the relationship...
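As an illustration of the kind of simple beam-theory relationship the abstract refers to (not the authors' model), the sketch below converts a measured first natural frequency of a single-span, simply supported superstructure into an apparent bending stiffness EI; the span length and mass per unit length are illustrative assumptions.

```python
# Minimal sketch: Euler-Bernoulli simply supported beam,
# f1 = (pi / (2 L^2)) * sqrt(EI / m)  =>  EI = m * (2 L^2 f1 / pi)^2
import math

def apparent_EI(f1_hz: float, span_m: float, mass_per_length_kg_m: float) -> float:
    """Apparent bending stiffness EI (N*m^2) from the first natural frequency."""
    return mass_per_length_kg_m * (2.0 * span_m ** 2 * f1_hz / math.pi) ** 2

# Illustrative numbers only: 10 m span, 600 kg/m, measured f1 = 8 Hz
print(f"{apparent_EI(f1_hz=8.0, span_m=10.0, mass_per_length_kg_m=600.0):.3e} N*m^2")
```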
ERIC Educational Resources Information Center
Wyman, Steven K.; And Others
This exploratory study establishes analytical tools (based on both technical criteria and user feedback) by which federal Web site administrators may assess the quality of their websites. The study combined qualitative and quantitative data collection techniques to achieve the following objectives: (1) identify and define key issues regarding…
ERIC Educational Resources Information Center
Yu, Jennifer W.; Wei, Xin; Wagner, Mary
2014-01-01
This study used propensity score techniques on data from the National Longitudinal Transition Study-2 to assess the causal relationship between speech and behavior-based support services and rates of social communication among high school students with Autism Spectrum Disorder (ASD). Findings indicate that receptive language problems were…
Theoretical Calculations of Atomic Data for Spectroscopy
NASA Technical Reports Server (NTRS)
Bautista, Manuel A.
2000-01-01
Several different approximations and techniques have been developed for the calculation of atomic structure, ionization, and excitation of atoms and ions. These techniques have been used to compute large amounts of spectroscopic data of various levels of accuracy. This paper presents a review of these theoretical methods to help non-experts in atomic physics to better understand the qualities and limitations of various data sources and to assess the reliability of spectral models based on those data.
Nuclear imaging and radiation therapy in canine and feline thyroid disease.
Feeney, Daniel A; Anderson, Kari L
2007-07-01
The indications, techniques, and expectations for radionuclide diagnostic studies on canine and feline thyroid glands are presented. In addition, the considerations surrounding radioiodine or external beam radiotherapy for benign and malignant thyroid disease are reviewed. The intent of this article is to familiarize primary care veterinarians with the utility and outcomes of the ionizing radiation-based diagnostic and therapeutic techniques for assessing and treating canine and feline thyroid disease.
NASA Astrophysics Data System (ADS)
Ding, Xuemei; Wang, Bingyuan; Liu, Dongyuan; Zhang, Yao; He, Jie; Zhao, Huijuan; Gao, Feng
2018-02-01
During the past two decades there has been a dramatic rise in the use of functional near-infrared spectroscopy (fNIRS) as a neuroimaging technique in cognitive neuroscience research. Diffuse optical tomography (DOT) and optical topography (OT) can be employed as optical imaging techniques for brain activity investigation. However, most current imagers with analogue detection are limited in sensitivity and dynamic range. Although photon-counting detection can significantly improve detection sensitivity, the intrinsic nature of sequential excitations reduces temporal resolution. To improve temporal resolution, sensitivity and dynamic range, we develop a multi-channel continuous-wave (CW) system for brain functional imaging based on a novel lock-in photon-counting technique. The system consists of 60 light-emitting diode (LED) sources at three wavelengths of 660 nm, 780 nm and 830 nm, which are modulated by current-stabilized square-wave signals at different frequencies, and 12 photomultiplier tubes (PMTs) operating with the lock-in photon-counting technique. This design combines the ultra-high sensitivity of the photon-counting technique with the parallelism of the digital lock-in technique. We can therefore acquire the diffused light intensity for all the source-detector pairs (SD-pairs) in parallel. The performance assessments of the system are conducted using phantom experiments, and demonstrate its excellent measurement linearity, negligible inter-channel crosstalk, strong noise robustness and high temporal resolution.
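To illustrate the lock-in photon-counting idea in general terms (an editor's sketch, not the instrument's firmware), the Python example below demodulates a simulated photon-count stream in which several sources are square-wave modulated at distinct frequencies; the bin width, frequencies and count rates are assumptions.

```python
# Minimal sketch: digital lock-in demodulation of a Poisson photon-count stream.
import numpy as np

rng = np.random.default_rng(0)
fs = 10_000.0                          # counting-bin rate (bins per second)
t = np.arange(int(fs * 1.0)) / fs      # 1 s of data
freqs = [170.0, 230.0, 310.0]          # modulation frequencies of three sources
rates = [400.0, 150.0, 60.0]           # peak detected count rate of each source (counts/s)

# Simulated detector stream: on/off-modulated sources plus constant background.
lam = 20.0 / fs                        # background counts per bin
for f, r in zip(freqs, rates):
    lam = lam + (r / fs) * (np.sign(np.sin(2 * np.pi * f * t)) + 1) / 2
counts = rng.poisson(lam)

# Lock-in step: correlate the count stream with a +/-1 reference at each frequency.
for f, r in zip(freqs, rates):
    ref = np.sign(np.sin(2 * np.pi * f * t))
    demod = 2.0 * np.mean(counts * ref) * fs   # recovers ~ the source's peak rate
    print(f"{f:5.0f} Hz: expected ~{r:6.1f}, recovered {demod:6.1f} counts/s")
```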
DOE Office of Scientific and Technical Information (OSTI.GOV)
Samei, Ehsan, E-mail: samei@duke.edu; Richard, Samuel
2015-01-15
Purpose: Different computed tomography (CT) reconstruction techniques offer different image quality attributes of resolution and noise, challenging the ability to compare their dose reduction potential against each other. The purpose of this study was to evaluate and compare the task-based imaging performance of CT systems to enable the assessment of the dose performance of a model-based iterative reconstruction (MBIR) to that of an adaptive statistical iterative reconstruction (ASIR) and a filtered back projection (FBP) technique. Methods: The ACR CT phantom (model 464) was imaged across a wide range of mA settings on a 64-slice CT scanner (GE Discovery CT750 HD, Waukesha, WI). Based on previous work, the resolution was evaluated in terms of a task-based modulation transfer function (MTF) using a circular-edge technique and images from the contrast inserts located in the ACR phantom. Noise performance was assessed in terms of the noise-power spectrum (NPS) measured from the uniform section of the phantom. The task-based MTF and NPS were combined with a task function to yield a task-based estimate of imaging performance, the detectability index (d′). The detectability index was computed as a function of dose for two imaging tasks corresponding to the detection of a relatively small and a relatively large feature (1.5 and 25 mm, respectively). The performance of MBIR in terms of the d′ was compared with that of ASIR and FBP to assess its dose reduction potential. Results: Results indicated that MBIR exhibits variable spatial resolution with respect to object contrast and noise while significantly reducing image noise. The NPS measurements for MBIR indicated a noise texture with a low-pass quality compared to the typical midpass noise found in FBP-based CT images. At comparable dose, the d′ for MBIR was higher than those of FBP and ASIR by at least 61% and 19% for the small feature and the large feature tasks, respectively. Compared to FBP and ASIR, MBIR indicated a 46%–84% dose reduction potential, depending on task, without compromising the modeled detection performance. Conclusions: The presented methodology based on ACR phantom measurements extends current possibilities for the assessment of CT image quality under the complex resolution and noise characteristics exhibited with statistical and iterative reconstruction algorithms. The findings further suggest that MBIR can potentially make better use of the projection data to reduce CT dose by approximately a factor of 2. Alternatively, if the dose is held unchanged, it can improve image quality by different levels for different tasks.
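As a generic illustration of how a task function, a task-based MTF and an NPS can be combined into a detectability index (a non-prewhitening-observer form is used here; the paper's exact formulation may differ, and the Gaussian shapes below are assumptions), consider the following sketch.

```python
# Minimal sketch: non-prewhitening detectability index d' from W, TTF and NPS
# sampled on a shared 2D spatial-frequency grid.
import numpy as np

def detectability_npw(W: np.ndarray, ttf: np.ndarray, nps: np.ndarray, df: float) -> float:
    """d' for a non-prewhitening observer; all arrays share one frequency grid."""
    signal = np.sum(W**2 * ttf**2) * df**2
    noise = np.sum(W**2 * ttf**2 * nps) * df**2
    return float(signal / np.sqrt(noise))

# Toy frequency grid and components (illustrative shapes only)
n, df = 256, 0.02                                  # grid size, cycles/mm spacing
u = (np.arange(n) - n // 2) * df
uu, vv = np.meshgrid(u, u)
rho = np.hypot(uu, vv)
W = np.exp(-(rho * 8.0) ** 2)                      # low-frequency (large-feature) task
ttf = np.exp(-(rho / 0.6) ** 2)                    # resolution falls off with frequency
nps = 5.0 * np.exp(-(rho / 0.4) ** 2) + 0.5        # low/mid-pass noise plus a floor

print(f"d' = {detectability_npw(W, ttf, nps, df):.2f}")
```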
Accuracy assessment of fluoroscopy-transesophageal echocardiography registration
NASA Astrophysics Data System (ADS)
Lang, Pencilla; Seslija, Petar; Bainbridge, Daniel; Guiraudon, Gerard M.; Jones, Doug L.; Chu, Michael W.; Holdsworth, David W.; Peters, Terry M.
2011-03-01
This study assesses the accuracy of a new transesophageal (TEE) ultrasound (US) fluoroscopy registration technique designed to guide percutaneous aortic valve replacement. In this minimally invasive procedure, a valve is inserted into the aortic annulus via a catheter. Navigation and positioning of the valve is guided primarily by intra-operative fluoroscopy. Poor anatomical visualization of the aortic root region can result in incorrect positioning, leading to heart valve embolization, obstruction of the coronary ostia and acute kidney injury. The use of TEE US images to augment intra-operative fluoroscopy provides significant improvements to image-guidance. Registration is achieved using an image-based TEE probe tracking technique and US calibration. TEE probe tracking is accomplished using a single-perspective pose estimation algorithm. Pose estimation from a single image allows registration to be achieved using only images collected in standard OR workflow. Accuracy of this registration technique is assessed using three models: a point target phantom, a cadaveric porcine heart with implanted fiducials, and in-vivo porcine images. Results demonstrate that registration can be achieved with an RMS error of less than 1.5mm, which is within the clinical accuracy requirements of 5mm. US-fluoroscopy registration based on single-perspective pose estimation demonstrates promise as a method for providing guidance to percutaneous aortic valve replacement procedures. Future work will focus on real-time implementation and a visualization system that can be used in the operating room.
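For readers unfamiliar with single-image pose estimation, the sketch below shows one common way such a problem can be solved with OpenCV's perspective-n-point routine; this is an assumed, illustrative workflow rather than the authors' implementation, and the fiducial layout, intrinsics and detected points are placeholders.

```python
# Minimal sketch: single-perspective pose estimation of a tracked probe from
# known 3D fiducial coordinates and their detected 2D image positions.
import numpy as np
import cv2

# Known 3D fiducial positions on the probe, in the probe's own frame (mm, coplanar).
object_pts = np.array([[0, 0, 0], [30, 0, 0], [30, 30, 0], [0, 30, 0]], dtype=np.float64)

# Their detected 2D centroids in the image (pixels) -- placeholder values.
image_pts = np.array([[512, 400], [640, 402], [642, 525], [510, 520]], dtype=np.float64)

# Idealized pinhole intrinsics for the imaging chain (pixels); distortion neglected.
K = np.array([[2200.0, 0.0, 512.0],
              [0.0, 2200.0, 512.0],
              [0.0, 0.0, 1.0]])

ok, rvec, tvec = cv2.solvePnP(object_pts, image_pts, K, None)
R, _ = cv2.Rodrigues(rvec)      # 3x3 rotation: probe frame -> imaging (camera) frame
print(ok, tvec.ravel())         # probe origin expressed in the imaging frame
```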
Briggs, Brandi N; Stender, Michael E; Muljadi, Patrick M; Donnelly, Meghan A; Winn, Virginia D; Ferguson, Virginia L
2015-06-25
Clinical practice requires improved techniques to assess human cervical tissue properties, especially at the internal os, or orifice, of the uterine cervix. Ultrasound elastography (UE) holds promise for non-invasively monitoring cervical stiffness throughout pregnancy. However, this technique provides qualitative strain images that cannot be linked to a material property (e.g., Young's modulus) without knowledge of the contact pressure under a rounded transvaginal transducer probe and correction for the resulting non-uniform strain dissipation. One technique to standardize elastogram images incorporates a material of known properties and uses one-dimensional, uniaxial Hooke's law to calculate Young's modulus within the compressed material half-space. However, this method does not account for strain dissipation and the strains that evolve in three-dimensional space. We demonstrate that an analytical approach based on 3D Hertzian contact mechanics provides a reasonable first approximation to correct for UE strain dissipation underneath a round transvaginal transducer probe and thus improves UE-derived estimates of tissue modulus. We validate the proposed analytical solution and evaluate sources of error using a finite element model. As compared to 1D uniaxial Hooke's law, the Hertzian contact-based solution yields significantly improved Young's modulus predictions in three homogeneous gelatin tissue phantoms possessing different moduli. We also demonstrate the feasibility of using this technique to image human cervical tissue, where UE-derived moduli estimations for the uterine cervix anterior lip agreed well with published, experimentally obtained values. Overall, UE with an attached reference standard and a Hertzian contact-based correction holds promise for improving quantitative estimates of cervical tissue modulus. Copyright © 2015 Elsevier Ltd. All rights reserved.
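To make the Hertzian correction idea concrete (a simplified, assumed version of the approach, not the paper's code), the sketch below uses the classical axial-stress decay under a spherical contact to rescale measured strains before applying Hooke's law against a reference layer of known modulus; the probe radius, depths and strains are illustrative numbers.

```python
# Minimal sketch: depth correction of elastography strains via Hertzian contact.
def axial_stress_ratio(z_mm: float, contact_radius_mm: float) -> float:
    """sigma_z(z)/p0 along the contact axis for Hertzian sphere-on-half-space contact."""
    return 1.0 / (1.0 + (z_mm / contact_radius_mm) ** 2)

def tissue_modulus(strain_tissue: float, strain_ref: float, E_ref_kpa: float,
                   z_tissue_mm: float, z_ref_mm: float, a_mm: float) -> float:
    """Correct each measured strain for depth-dependent stress, then apply Hooke's law."""
    stress_tissue = axial_stress_ratio(z_tissue_mm, a_mm)   # in units of peak pressure p0
    stress_ref = axial_stress_ratio(z_ref_mm, a_mm)
    # E_tissue / E_ref = (stress_tissue/strain_tissue) / (stress_ref/strain_ref)
    return E_ref_kpa * (stress_tissue / strain_tissue) * (strain_ref / stress_ref)

# Illustrative values only: reference layer at 2 mm depth, tissue at 15 mm depth
print(tissue_modulus(strain_tissue=0.04, strain_ref=0.02, E_ref_kpa=30.0,
                     z_tissue_mm=15.0, z_ref_mm=2.0, a_mm=8.0))
```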
Pitfalls in the measurement of muscle mass: a need for a reference standard.
Buckinx, Fanny; Landi, Francesco; Cesari, Matteo; Fielding, Roger A; Visser, Marjolein; Engelke, Klaus; Maggi, Stefania; Dennison, Elaine; Al-Daghri, Nasser M; Allepaerts, Sophie; Bauer, Jurgen; Bautmans, Ivan; Brandi, Maria Luisa; Bruyère, Olivier; Cederholm, Tommy; Cerreta, Francesca; Cherubini, Antonio; Cooper, Cyrus; Cruz-Jentoft, Alphonso; McCloskey, Eugene; Dawson-Hughes, Bess; Kaufman, Jean-Marc; Laslop, Andrea; Petermans, Jean; Reginster, Jean-Yves; Rizzoli, René; Robinson, Sian; Rolland, Yves; Rueda, Ricardo; Vellas, Bruno; Kanis, John A
2018-04-01
All proposed definitions of sarcopenia include the measurement of muscle mass, but the techniques and threshold values used vary. Indeed, the literature does not establish consensus on the best technique for measuring lean body mass. Thus, the objective measurement of sarcopenia is hampered by limitations intrinsic to assessment tools. The aim of this study was to review the methods to assess muscle mass and to reach consensus on the development of a reference standard. Literature reviews were performed by members of the European Society for Clinical and Economic Aspects of Osteoporosis and Osteoarthritis working group on frailty and sarcopenia. Face-to-face meetings were organized for the whole group to make amendments and discuss further recommendations. A wide range of techniques can be used to assess muscle mass. Cost, availability, and ease of use can determine whether the techniques are better suited to clinical practice or are more useful for research. No one technique subserves all requirements but dual energy X-ray absorptiometry could be considered as a reference standard (but not a gold standard) for measuring muscle lean body mass. Based on the feasibility, accuracy, safety, and low cost, dual energy X-ray absorptiometry can be considered as the reference standard for measuring muscle mass. © 2018 The Authors. Journal of Cachexia, Sarcopenia and Muscle published by John Wiley & Sons Ltd on behalf of the Society on Sarcopenia, Cachexia and Wasting Disorders.
Prediction of Software Reliability using Bio Inspired Soft Computing Techniques.
Diwaker, Chander; Tomar, Pradeep; Poonia, Ramesh C; Singh, Vijander
2018-04-10
Many models have been proposed for predicting software reliability, but each is restricted to particular methodologies and a limited number of parameters. A range of techniques and methodologies may be used for reliability prediction, so attention must be paid to which parameters are considered when estimating reliability: the reliability of a system may increase or decrease depending on the parameters selected, and the factors that strongly affect system reliability therefore need to be identified. Reusability is now widely used across research areas and is the basis of Component-Based Systems (CBS); cost, time and human effort can be saved by applying Component-Based Software Engineering (CBSE) concepts, and CBSE metrics may be used to assess which techniques are better suited to estimating system reliability. Soft computing is used for both small and large-scale problems where accurate results are difficult to obtain because of uncertainty or randomness, and there are many opportunities to apply soft computing techniques to medical problems: clinical medicine makes significant use of fuzzy logic and neural network methodology, while basic medical science most frequently uses combined neural-network and genetic-algorithm approaches, and medical scientists show strong interest in applying soft computing methodologies in genetics, physiology, radiology, cardiology and neurology. CBSE encourages users to reuse past and existing software when building new products, improving quality while saving time, memory space, and money. This paper focuses on the assessment of commonly used soft computing techniques such as the Genetic Algorithm (GA), Neural Network (NN), Fuzzy Logic, Support Vector Machine (SVM), Ant Colony Optimization (ACO), Particle Swarm Optimization (PSO), and Artificial Bee Colony (ABC). It presents how these soft computing techniques work and assesses their use for predicting reliability; the parameters considered when estimating and predicting reliability are also discussed. This study can be used in estimating and predicting the reliability of instruments used in medical systems, as well as in software, computer, and mechanical engineering, and the concepts can be applied to both software and hardware to predict reliability using CBSE.
On-the-spot damage detection methodology for highway bridges.
DOT National Transportation Integrated Search
2010-07-01
Vibration-based damage identification (VBDI) techniques have been developed in part to address the problems associated with an aging civil infrastructure. To assess the potential of VBDI as it applies to highway bridges in Iowa, three applications of...
High-altitude wind prediction and measurement technology assessment
DOT National Transportation Integrated Search
2009-06-30
The principles and operational characteristics of balloon and radar-based techniques for measuring upper air winds in support of launches and recoveries are presented. Though either a balloon or radar system could serve as a standalone system, the sa...
Estimating Sobol Sensitivity Indices Using Correlations
Sensitivity analysis is a crucial tool in the development and evaluation of complex mathematical models. Sobol's method is a variance-based global sensitivity analysis technique that has been applied to computational models to assess the relative importance of input parameters on...
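For context on variance-based Sobol indices, the sketch below estimates first-order indices for a toy model with a standard Saltelli/Jansen pick-freeze scheme; this is a generic Monte Carlo estimator, not necessarily the correlation-based estimator the abstract refers to, and the test function and sample size are illustrative.

```python
# Minimal sketch: first-order Sobol sensitivity indices via a pick-freeze estimator.
import numpy as np

def first_order_sobol(model, d: int, n: int, rng) -> np.ndarray:
    A = rng.random((n, d))
    B = rng.random((n, d))
    yA, yB = model(A), model(B)
    var_y = np.var(np.concatenate([yA, yB]), ddof=1)
    S = np.empty(d)
    for i in range(d):
        ABi = A.copy()
        ABi[:, i] = B[:, i]                        # resample only input X_i
        yABi = model(ABi)
        S[i] = np.mean(yB * (yABi - yA)) / var_y   # Saltelli (2010) first-order estimator
    return S

# Toy model: Y = X1 + 0.5*X2 + 0.1*X3 with uniform inputs (analytic S ~ [0.79, 0.20, 0.01])
model = lambda X: X[:, 0] + 0.5 * X[:, 1] + 0.1 * X[:, 2]
print(first_order_sobol(model, d=3, n=100_000, rng=np.random.default_rng(1)))
```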
X-ray Phase Contrast Allows Three Dimensional, Quantitative Imaging of Hydrogel Implants
Appel, Alyssa A.; Larson, Jeffery C.; Jiang, Bin; Zhong, Zhong; Anastasio, Mark A.; Brey, Eric M.
2015-01-01
Three dimensional imaging techniques are needed for the evaluation and assessment of biomaterials used for tissue engineering and drug delivery applications. Hydrogels are a particularly popular class of materials for medical applications but are difficult to image in tissue using most available imaging modalities. Imaging techniques based on X-ray Phase Contrast (XPC) have shown promise for tissue engineering applications due to their ability to provide image contrast based on multiple X-ray properties. In this manuscript, we investigate the use of XPC for imaging a model hydrogel and soft tissue structure. Porous fibrin-loaded poly(ethylene glycol) hydrogels were synthesized and implanted in a rodent subcutaneous model. Samples were explanted, imaged with an analyzer-based XPC technique, and processed and stained for histology for comparison. Both hydrogel and soft tissue structures could be identified in XPC images. Structure in adjacent skeletal muscle could be visualized and invading fibrovascular tissue could be quantified. There were no differences between invading tissue measurements from XPC and the gold-standard histology. These results provide evidence of the significant potential of techniques based on XPC for 3D imaging of hydrogel structure and local tissue response. PMID:26487123
X-ray Phase Contrast Allows Three Dimensional, Quantitative Imaging of Hydrogel Implants
Appel, Alyssa A.; Larson, Jeffrey C.; Jiang, Bin; ...
2015-10-20
Three dimensional imaging techniques are needed for the evaluation and assessment of biomaterials used for tissue engineering and drug delivery applications. Hydrogels are a particularly popular class of materials for medical applications but are difficult to image in tissue using most available imaging modalities. Imaging techniques based on X-ray Phase Contrast (XPC) have shown promise for tissue engineering applications due to their ability to provide image contrast based on multiple X-ray properties. In this manuscript, we describe results using XPC to image a model hydrogel and soft tissue structure. Porous fibrin-loaded poly(ethylene glycol) hydrogels were synthesized and implanted in a rodent subcutaneous model. Samples were explanted, imaged with an analyzer-based XPC technique, and processed and stained for histology for comparison. Both hydrogel and soft tissue structures could be identified in XPC images. Structure in adjacent skeletal muscle could be visualized and invading fibrovascular tissue could be quantified. In quantitative results, there were no differences between XPC and the gold-standard histological measurements. These results provide evidence of the significant potential of techniques based on XPC for 3D imaging of hydrogel structure and local tissue response.
Operator Performance Measures for Assessing Voice Communication Effectiveness
1989-07-01
…performance and workload assessment techniques have been based. Broadbent (1958) described a limited capacity filter model of human information processing. (Report contents cover auditory information processing, including auditory attention and auditory memory, and models of information processing, including capacity theories.)
Condition Assessment of a 2,500-Year-Old Mummy Coffin
Robert J. Ross; Turker Dundar
2012-01-01
This work was conducted to assess the condition of a 2,500-year-old mummy coffin. The coffin, part of a collection of funerary objects at the Nelson–Atkins Museum of Art in Kansas City, Missouri, is made of wood obtained from a sycamore fig tree, Ficus sycomorus. Visual and acoustic-based nondestructive testing techniques were used to inspect the...
Assessing MODIS-based Products and Techniques for Detecting Gypsy Moth Defoliation
NASA Technical Reports Server (NTRS)
Spruce, Joseph P.; Hargrove, William; Smoot, James C.; Prados, Don; McKellip, Rodney; Sader, Steven A.; Gasser, Jerry; May, George
2008-01-01
The project showed the potential of MODIS and VIIRS time series data for contributing defoliation detection products to the USFS forest threat early warning system. This study yielded the first satellite-based wall-to-wall 2001 gypsy moth defoliation map for the study area. Initial results led to follow-on work to map 2007 gypsy moth defoliation over the eastern United States (in progress). MODIS-based defoliation maps offer promise for aiding aerial sketch mapping, either in planning surveys or in adjusting acreage estimates of annual defoliation. More work still needs to be done to assess the potential of the technology for "nowcasts" of defoliation.
Advances in Testing Techniques for Digital Microfluidic Biochips
Shukla, Vineeta; Hussin, Fawnizu Azmadi; Hamid, Nor Hisham; Zain Ali, Noohul Basheer
2017-01-01
With the advancement of digital microfluidics technology, applications such as on-chip DNA analysis, point-of-care diagnosis and automated drug discovery are common nowadays. The use of Digital Microfluidics Biochips (DMFBs) in disease assessment and recognition of target molecules has become popular during the past few years. The reliability of these DMFBs is crucial when they are used in various medical applications. Errors found in these biochips are mainly due to the defects developed during droplet manipulation, chip degradation and inaccuracies in the bio-assay experiments. The recently proposed Micro-electrode-dot Array (MEDA)-based DMFBs involve both fluidic and electronic domains in the micro-electrode cell. Thus, the testing techniques for these biochips should be revised in order to ensure proper functionality. This paper describes recent advances in the testing technologies for digital microfluidics biochips, which would serve as a useful platform for developing revised/new testing techniques for MEDA-based biochips. Therefore, the relevance of these techniques with respect to testing of MEDA-based biochips is analyzed in order to exploit the full potential of these biochips. PMID:28749411
Advances in Testing Techniques for Digital Microfluidic Biochips.
Shukla, Vineeta; Hussin, Fawnizu Azmadi; Hamid, Nor Hisham; Zain Ali, Noohul Basheer
2017-07-27
With the advancement of digital microfluidics technology, applications such as on-chip DNA analysis, point-of-care diagnosis and automated drug discovery are common nowadays. The use of Digital Microfluidics Biochips (DMFBs) in disease assessment and recognition of target molecules has become popular during the past few years. The reliability of these DMFBs is crucial when they are used in various medical applications. Errors found in these biochips are mainly due to the defects developed during droplet manipulation, chip degradation and inaccuracies in the bio-assay experiments. The recently proposed Micro-electrode-dot Array (MEDA)-based DMFBs involve both fluidic and electronic domains in the micro-electrode cell. Thus, the testing techniques for these biochips should be revised in order to ensure proper functionality. This paper describes recent advances in the testing technologies for digital microfluidics biochips, which would serve as a useful platform for developing revised/new testing techniques for MEDA-based biochips. Therefore, the relevance of these techniques with respect to testing of MEDA-based biochips is analyzed in order to exploit the full potential of these biochips.
Improving the accuracy of effect-directed analysis: the role of bioavailability.
You, Jing; Li, Huizhen
2017-12-13
Aquatic ecosystems have been suffering from contamination by multiple stressors. Traditional chemical-based risk assessment usually fails to explain the toxicity contributions from contaminants that are not regularly monitored or that have an unknown identity. Diagnosing the causes of noted adverse outcomes in the environment is of great importance in ecological risk assessment and in this regard effect-directed analysis (EDA) has been designed to fulfill this purpose. The EDA approach is now increasingly used in aquatic risk assessment owing to its specialty in achieving effect-directed nontarget analysis; however, a lack of environmental relevance makes conventional EDA less favorable. In particular, ignoring the bioavailability in EDA may cause a biased and even erroneous identification of causative toxicants in a mixture. Taking bioavailability into consideration is therefore of great importance to improve the accuracy of EDA diagnosis. The present article reviews the current status and applications of EDA practices that incorporate bioavailability. The use of biological samples is the most obvious way to include bioavailability into EDA applications, but its development is limited due to the small sample size and lack of evidence for metabolizable compounds. Bioavailability/bioaccessibility-based extraction (bioaccessibility-directed and partitioning-based extraction) and passive-dosing techniques are recommended to be used to integrate bioavailability into EDA diagnosis in abiotic samples. Lastly, the future perspectives of expanding and standardizing the use of biological samples and bioavailability-based techniques in EDA are discussed.
Architectural Considerations of Fiber-Radio Millimeter-Wave Wireless Access Systems
NASA Astrophysics Data System (ADS)
Kitayama, Ken-Ichi
The architecture of fiber-radio mm-wave wireless access systems critically depends upon the optical mm-wave generation and transport techniques. Four optical mm-wave generation and transport techniques will be assessed: 1) optical self-heterodyning, 2) external modulation, 3) up- and downconversion, and 4) optical transceiver. Their advantages and disadvantages are discussed from the technical viewpoint. The economic assessment, focusing on the cost of a base station (BS), will suggest that the optical transceiver looks the most promising in the long run; in the near future, however, the external modulation will be cost-effective. The experimental results of 60 GHz testbeds using the external modulation will support the conclusion.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Clayton, Dwight A.; Santos-Villalobos, Hector J.; Baba, Justin S.
By the end of 1996, 109 Nuclear Power Plants were operating in the United States, producing 22% of the Nation’s electricity [1]. At present, more than two thirds of these power plants are more than 40 years old. The purpose of the U.S. Department of Energy Office of Nuclear Energy’s Light Water Reactor Sustainability (LWRS) Program is to develop technologies and other solutions that can improve the reliability, sustain the safety, and extend the operating lifetimes of nuclear power plants (NPPs) beyond 60 years [2]. The most important safety structures in an NPP are constructed of concrete. The structures generally do not allow for destructive evaluation and access is limited to one side of the concrete element. Therefore, there is a need for techniques and technologies that can assess the internal health of complex, reinforced concrete structures nondestructively. Previously, we documented the challenges associated with Non-Destructive Evaluation (NDE) of thick, reinforced concrete sections and prioritized conceptual designs of specimens that could be fabricated to represent NPP concrete structures [3]. Consequently, a 7-foot-tall by 7-foot-wide by 3-foot-4-inch-thick concrete specimen was constructed with 2.257-inch- and 1-inch-diameter rebar every 6 to 12 inches. In addition, defects were embedded in the specimen to assess the performance of existing and future NDE techniques. The defects were designed to give a mix of realistic and controlled defects for assessment of the measures needed to overcome the challenges with more heavily reinforced concrete structures. Information on the embedded defects is documented in [4]. We also documented the superiority of Frequency Banded Decomposition (FBD) Synthetic Aperture Focusing Technique (SAFT) over conventional SAFT when probing defects under deep concrete cover. Improvements include seeing an intensity corresponding to a defect that is either not visible at all in regular, full-frequency-content SAFT, or an improvement in contrast over conventional SAFT reconstructed images. This report documents our efforts on four fronts: 1) a comparative study between traditional SAFT and FBD SAFT for concrete specimens with and without Alkali-Silica Reaction (ASR) damage, 2) improvement of our Model-Based Iterative Reconstruction (MBIR) for thick reinforced concrete [5], 3) development of a universal framework for sharing, reconstruction, and visualization of ultrasound NDE datasets, and 4) application of machine learning techniques for automated detection of ASR inside concrete. Our comparative study between FBD and traditional SAFT reconstruction images shows a clear difference between images of ASR and non-ASR specimens. In particular, the left first harmonic shows an increased contrast and sensitivity to ASR damage. For MBIR, we show the superiority of model-based techniques over delay-and-sum techniques such as SAFT. Improvements include elimination of artifacts caused by direct arrival signals, and increased contrast and Signal to Noise Ratio. For the universal framework, we document a format for data storage based on the HDF5 file format, and also propose a modular Graphic User Interface (GUI) for easy customization of data conversion, reconstruction, and visualization routines. Finally, two techniques for automated ASR detection are presented.
The first technique is based on an analysis of the frequency content using a Hilbert Transform Indicator (HTI), and the second technique employs Artificial Neural Network (ANN) techniques for training and classification of ultrasound data into ASR-damaged and non-ASR-damaged classes. The ANN technique shows great potential, with classification accuracy above 95%. These approaches are extensible to the detection of additional defects and damage in thick, reinforced concrete.
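For readers unfamiliar with SAFT, the sketch below shows a textbook delay-and-sum focusing of pulse-echo A-scans onto an image grid (not ORNL's implementation); the geometry, wave speed and synthetic data are assumptions.

```python
# Minimal sketch: delay-and-sum SAFT over a line of transducer positions.
import numpy as np

def saft(traces: np.ndarray, x_tx: np.ndarray, fs: float, c: float,
         x_grid: np.ndarray, z_grid: np.ndarray) -> np.ndarray:
    """traces: (n_positions, n_samples) pulse-echo A-scans; returns focused image."""
    image = np.zeros((z_grid.size, x_grid.size))
    n_samples = traces.shape[1]
    for k, xk in enumerate(x_tx):
        dx = x_grid[None, :] - xk
        r = np.sqrt(dx**2 + z_grid[:, None] ** 2)    # pixel-to-transducer range
        idx = np.rint(2.0 * r / c * fs).astype(int)  # two-way travel time -> sample index
        valid = idx < n_samples
        image[valid] += traces[k, idx[valid]]
    return np.abs(image)

# Synthetic example: one point scatterer at (x, z) = (0.0 m, 0.15 m) in concrete
c, fs = 2600.0, 1.0e6                                # wave speed (m/s), sample rate (Hz)
x_tx = np.linspace(-0.2, 0.2, 41)
t = np.arange(1500) / fs
traces = np.array([np.exp(-((t - 2 * np.hypot(x, 0.15) / c) * 2e5) ** 2) for x in x_tx])
img = saft(traces, x_tx, fs, c, np.linspace(-0.2, 0.2, 81), np.linspace(0.05, 0.3, 126))
print(img.shape, np.unravel_index(img.argmax(), img.shape))  # peak near the scatterer
```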
Quality control and quality assurance plan for bridge channel-stability assessments in Massachusetts
Parker, Gene W.; Pinson, Harlow
1993-01-01
A quality control and quality assurance plan has been implemented as part of the Massachusetts bridge scour and channel-stability assessment program. This program is being conducted by the U.S. Geological Survey, Massachusetts-Rhode Island District, in cooperation with the Massachusetts Highway Department. Project personnel training, data-integrity verification, and new data-management technologies are being utilized in the channel-stability assessment process to improve current data-collection and management techniques. An automated data-collection procedure has been implemented to standardize channel-stability assessments on a regular basis within the State. An object-oriented data structure and new image management tools are used to produce a data base enabling management of multiple data object classes. Data will be reviewed by assessors and data base managers before being merged into a master bridge-scour data base, which includes automated data-verification routines.
A systematic mapping study of process mining
NASA Astrophysics Data System (ADS)
Maita, Ana Rocío Cárdenas; Martins, Lucas Corrêa; López Paz, Carlos Ramón; Rafferty, Laura; Hung, Patrick C. K.; Peres, Sarajane Marques; Fantinato, Marcelo
2018-05-01
This study systematically assesses the process mining scenario from 2005 to 2014. The analysis of 705 papers evidenced 'discovery' (71%) as the main type of process mining addressed and 'categorical prediction' (25%) as the main mining task solved. The most frequently applied traditional techniques are 'graph structure-based' ones (38%). Specifically concerning computational intelligence and machine learning techniques, we concluded that little attention has been given to them; the most applied are 'evolutionary computation' (9%) and 'decision tree' (6%), respectively. Process mining challenges, such as balancing among robustness, simplicity, accuracy and generalization, could benefit from greater use of such techniques.
On the comprehensibility and perceived privacy protection of indirect questioning techniques.
Hoffmann, Adrian; Waubert de Puiseau, Berenike; Schmidt, Alexander F; Musch, Jochen
2017-08-01
On surveys that assess sensitive personal attributes, indirect questioning aims at increasing respondents' willingness to answer truthfully by protecting confidentiality. However, the assumption that subjects understand questioning procedures fully and trust them to protect their privacy is rarely tested. In a scenario-based design, we compared four indirect questioning procedures in terms of their comprehensibility and perceived privacy protection. All indirect questioning techniques were found to be less comprehensible by respondents than a conventional direct question used for comparison. Less-educated respondents experienced more difficulties when confronted with any indirect questioning technique. Regardless of education, the crosswise model was found to be the most comprehensible among the four indirect methods. Indirect questioning in general was perceived to increase privacy protection in comparison to a direct question. Unexpectedly, comprehension and perceived privacy protection did not correlate. We recommend assessing these factors separately in future evaluations of indirect questioning.
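As background on how an indirect technique such as the crosswise model recovers a prevalence estimate (an editor's illustration using the standard estimator, not the paper's analysis; the counts are made up), see the sketch below.

```python
# Minimal sketch: crosswise-model prevalence estimate from "same/different" answers.
def crosswise_prevalence(n_same: int, n_total: int, p_neutral: float) -> float:
    """P(same) = pi*p + (1 - pi)*(1 - p)  =>  pi = (lambda + p - 1) / (2p - 1)."""
    lam = n_same / n_total
    return (lam + p_neutral - 1.0) / (2.0 * p_neutral - 1.0)

# Example: 270 of 400 respondents report "same"; the neutral question has p = 0.25
print(f"estimated prevalence: {crosswise_prevalence(270, 400, 0.25):.3f}")  # ~0.150
```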
NASA Astrophysics Data System (ADS)
Visalakshi, Talakokula; Bhalla, Suresh; Gupta, Ashok; Bhattacharjee, Bishwajit
2014-03-01
Reinforced concrete (RC) is an economical, versatile and successful construction material, as it can be moulded into a variety of shapes and finishes. In most cases, it is durable and strong, performing well throughout its service life. However, in some cases, it does not perform adequately due to various reasons, one of which is the corrosion of the embedded steel bars used as reinforcement. Although the electro-mechanical impedance (EMI) technique is well established for damage detection and quantification in civil, mechanical and aerospace structures, only a limited number of studies have reported its application to rebar corrosion detection in RC structures. This paper presents recent trends in corrosion assessment based on a model derived from the equivalent structural parameters extracted from the impedance spectrum of the concrete-rebar system using lead zirconate titanate (PZT) sensors via the EMI technique.
A community assessment of privacy preserving techniques for human genomes
2014-01-01
To answer the need for the rigorous protection of biomedical data, we organized the Critical Assessment of Data Privacy and Protection initiative as a community effort to evaluate privacy-preserving dissemination techniques for biomedical data. We focused on the challenge of sharing aggregate human genomic data (e.g., allele frequencies) in a way that preserves the privacy of the data donors, without undermining the utility of genome-wide association studies (GWAS) or impeding their dissemination. Specifically, we designed two problems for disseminating the raw data and the analysis outcome, respectively, based on publicly available data from HapMap and from the Personal Genome Project. A total of six teams participated in the challenges. The final results were presented at a workshop of the iDASH (integrating Data for Analysis, 'anonymization,' and SHaring) National Center for Biomedical Computing. We report the results of the challenge and our findings about the current genome privacy protection techniques. PMID:25521230
NASA Technical Reports Server (NTRS)
Walker, Carrie K.
1991-01-01
A technique has been developed for combining features of a systems architecture design and assessment tool and a software development tool. This technique reduces simulation development time and expands simulation detail. The Architecture Design and Assessment System (ADAS), developed at the Research Triangle Institute, is a set of computer-assisted engineering tools for the design and analysis of computer systems. The ADAS system is based on directed graph concepts and supports the synthesis and analysis of software algorithms mapped to candidate hardware implementations. Greater simulation detail is provided by the ADAS functional simulator. With the functional simulator, programs written in either Ada or C can be used to provide a detailed description of graph nodes. A Computer-Aided Software Engineering tool developed at the Charles Stark Draper Laboratory (CSDL CASE) automatically generates Ada or C code from engineering block diagram specifications designed with an interactive graphical interface. A technique to use the tools together has been developed, which further automates the design process.
A comparison of three observational techniques for assessing postural loads in industry.
Kee, Dohyung; Karwowski, Waldemar
2007-01-01
This study aims to compare 3 observational techniques for assessing postural load, namely, OWAS, RULA, and REBA. The comparison was based on the evaluation results generated by the classification techniques using 301 working postures. All postures were sampled from the iron and steel, electronics, automotive, and chemical industries, and a general hospital. While only about 21% of the 301 postures were classified at action category/level 3 or 4 by both OWAS and REBA, about 56% of the postures were classified into action level 3 or 4 by RULA. The inter-method reliability for postural load category between OWAS and RULA was just 29.2%, and the reliability between RULA and REBA was 48.2%. These results showed that, compared with RULA, OWAS and REBA generally underestimated postural loads for the analyzed postures, irrespective of industry, work type, and whether or not the body postures were in a balanced state.
A community assessment of privacy preserving techniques for human genomes.
Jiang, Xiaoqian; Zhao, Yongan; Wang, Xiaofeng; Malin, Bradley; Wang, Shuang; Ohno-Machado, Lucila; Tang, Haixu
2014-01-01
To answer the need for the rigorous protection of biomedical data, we organized the Critical Assessment of Data Privacy and Protection initiative as a community effort to evaluate privacy-preserving dissemination techniques for biomedical data. We focused on the challenge of sharing aggregate human genomic data (e.g., allele frequencies) in a way that preserves the privacy of the data donors, without undermining the utility of genome-wide association studies (GWAS) or impeding their dissemination. Specifically, we designed two problems for disseminating the raw data and the analysis outcome, respectively, based on publicly available data from HapMap and from the Personal Genome Project. A total of six teams participated in the challenges. The final results were presented at a workshop of the iDASH (integrating Data for Analysis, 'anonymization,' and SHaring) National Center for Biomedical Computing. We report the results of the challenge and our findings about the current genome privacy protection techniques.
Wire Crimp Connectors Verification using Ultrasonic Inspection
NASA Technical Reports Server (NTRS)
Cramer, K. Elliott; Perey, Daniel F.; Yost, William T.
2007-01-01
The development of a new ultrasonic measurement technique to quantitatively assess wire crimp connections is discussed. The amplitude change of a compressional ultrasonic wave propagating through the junction of a crimp connector and wire is shown to correlate with the results of a destructive pull test, which previously has been used to assess crimp wire junction quality. Various crimp junction pathologies (missing wire strands, incorrect wire gauge, incomplete wire insertion in the connector) are ultrasonically tested, and their results are correlated with pull tests. Results show that the ultrasonic measurement technique consistently (as evidenced by pull-testing data) predicts good crimps when ultrasonic transmission is above a certain threshold amplitude level. A physics-based model, solved by finite element analysis, describes the compressional ultrasonic wave propagation through the junction during the crimping process. This model agrees with the ultrasonic measurements to within 6%. A prototype instrument for applying the technique while wire crimps are installed is also presented.
Assessment method of digital Chinese dance movements based on virtual reality technology
NASA Astrophysics Data System (ADS)
Feng, Wei; Shao, Shuyuan; Wang, Shumin
2008-03-01
Virtual reality has played an increasing role in such areas as medicine, architecture, aviation, engineering science and advertising. However, in the art fields, virtual reality is still in its infancy in the representation of human movements. Based on the techniques of motion capture and the reuse of motion capture data in a virtual reality environment, this paper presents an assessment method to quantitatively evaluate dancers' basic Arm Position movements in Chinese traditional dance. The data for quantifying traits of dance motions are defined and measured on dances performed by an expert and two beginners. The results indicate that these data are useful for evaluating dance skill and distinctiveness, and that the assessment method for digital Chinese dance movements based on virtual reality technology is valid and feasible.
Advanced MRI Methods for Assessment of Chronic Liver Disease
Taouli, Bachir; Ehman, Richard L.; Reeder, Scott B.
2010-01-01
MRI plays an increasingly important role for assessment of patients with chronic liver disease. MRI has numerous advantages, including lack of ionizing radiation and the possibility of performing multiparametric imaging. With recent advances in technology, advanced MRI methods such as diffusion-, perfusion-weighted MRI, MR elastography, chemical shift based fat-water separation and MR spectroscopy can now be applied to liver imaging. We will review the respective roles of these techniques for assessment of chronic liver disease. PMID:19542391
NASA Technical Reports Server (NTRS)
Badhwar, G. D.; O'Neill, P. M.
2001-01-01
There is considerable interest in developing silicon-based telescopes because of their compactness and low power requirements. Three such telescopes have been flown on board the Space Shuttle to measure the linear energy transfer spectra of trapped, galactic cosmic ray, and solar energetic particles. Dosimeters based on single silicon detectors have also been flown on the Mir orbital station. A comparison of the absorbed dose and radiation quality factors calculated from these telescopes with that estimated from measurements made with a tissue equivalent proportional counter show differences which need to be fully understood if these telescopes are to be used for astronaut radiation risk assessments. Instrument performance is complicated by a variety of factors. A Monte Carlo-based technique was developed to model the behavior of both single element detectors in a proton beam, and the performance of a two-element, wide-angle telescope, in the trapped belt proton field inside the Space Shuttle. The technique is based on: (1) radiation transport intranuclear-evaporation model that takes into account the charge and angular distribution of target fragments, (2) Landau-Vavilov distribution of energy deposition allowing for electron escape, (3) true detector geometry of the telescope, (4) coincidence and discriminator settings, (5) spacecraft shielding geometry, and (6) the external space radiation environment, including albedo protons. The value of such detailed modeling and its implications in astronaut risk assessment is addressed. c2001 Elsevier Science B.V. All rights reserved.
Badhwar, G D; O'Neill, P M
2001-07-11
There is considerable interest in developing silicon-based telescopes because of their compactness and low power requirements. Three such telescopes have been flown on board the Space Shuttle to measure the linear energy transfer spectra of trapped, galactic cosmic ray, and solar energetic particles. Dosimeters based on single silicon detectors have also been flown on the Mir orbital station. A comparison of the absorbed dose and radiation quality factors calculated from these telescopes with that estimated from measurements made with a tissue equivalent proportional counter show differences which need to be fully understood if these telescopes are to be used for astronaut radiation risk assessments. Instrument performance is complicated by a variety of factors. A Monte Carlo-based technique was developed to model the behavior of both single element detectors in a proton beam, and the performance of a two-element, wide-angle telescope, in the trapped belt proton field inside the Space Shuttle. The technique is based on: (1) radiation transport intranuclear-evaporation model that takes into account the charge and angular distribution of target fragments, (2) Landau-Vavilov distribution of energy deposition allowing for electron escape, (3) true detector geometry of the telescope, (4) coincidence and discriminator settings, (5) spacecraft shielding geometry, and (6) the external space radiation environment, including albedo protons. The value of such detailed modeling and its implications in astronaut risk assessment is addressed. c2001 Elsevier Science B.V. All rights reserved.
Responding to the professionalism of learners and faculty in orthopaedic surgery.
Arnold, Louise
2006-08-01
Recent developments in assessing professionalism and remediating unprofessional behavior can curtail the inaction that often follows observations of negative as well as positive professionalism of learners and faculty. Developments include: longitudinal assessment models promoting professional behavior, not just penalizing lapses; clarity about the assessment's purpose; methods separating formative from summative assessment; conceptual and behavioral definitions of professionalism; techniques increasing the reliability and validity of quantitative and qualitative approaches to assessment such as 360-degree assessments, performance-based assessments, portfolios, and humanism connoisseurs; and systems-design providing infrastructure support for assessment. Models for remediation have been crafted, including: due process, a warning period and, if necessary, confrontation to initiate remediation of the physician who has acted unprofessionally. Principles for appropriate remediation stress matching the intervention to the cause of the professional lapse. Cognitive behavioral therapy, motivational interviewing, and continuous monitoring linked to behavioral contracts are effective remediation techniques. Mounting and maintaining robust systems for professionalism and remediating professional lapses are not easy tasks. They require a sea change in the fundamental goal of academic health care institutions: medical education must not only be a technical undertaking but also a moral process designed to build and sustain character in all its professional citizens.
Kopp, Sandra L; Smith, Hugh M
2011-01-01
Little is known about the use of Web-based education in regional anesthesia training. Benefits of Web-based education include the ability to standardize learning material quality and content, build appropriate learning progressions, use interactive multimedia technologies, and individualize delivery of course materials. The goals of this investigation were (1) to determine whether module design influences regional anesthesia knowledge acquisition, (2) to characterize learner preference patterns among anesthesia residents, and (3) to determine whether learner preferences play a role in knowledge acquisition. Direct comparison of knowledge assessments, learning styles, and learner preferences will be made between an interactive case-based and a traditional textbook-style module design. Forty-three Mayo Clinic anesthesiology residents completed 2 online modules, a knowledge pretest and posttest, an Index of Learning Styles assessment, and a participant satisfaction survey. Interscalene and lumbar plexus regional techniques were selected as the learning content for 4 Web modules constructed using the Blackboard Vista coursework application. One traditional textbook-style module and 1 interactive case-based module were designed for each of the interscalene and lumbar plexus techniques. Participants scored higher on the postmodule knowledge assessment for both the interscalene and lumbar plexus modules. Postmodule knowledge performance scores were independent of both module design (interactive case-based versus traditional textbook style) and learning style preferences. However, nearly all participants reported a preference for Web-based learning and believe that it should be used in anesthesia resident education. Participants did not feel that Web-based learning should replace the current lecture-based curriculum. All residents scored higher on the postmodule knowledge assessment, but this improvement was independent of the module design and individual learning styles. Although residents believe that online learning should be used in anesthesia training, the results of this study do not demonstrate improved learning or justify the time and expense of developing complex case-based training modules. While there may be practical benefits of Web-based education, educators in regional anesthesia should be cautious about developing curricula based on learner preference data.
NASA Astrophysics Data System (ADS)
Chernavskaia, Olga; Heuke, Sandro; Vieth, Michael; Friedrich, Oliver; Schürmann, Sebastian; Atreya, Raja; Stallmach, Andreas; Neurath, Markus F.; Waldner, Maximilian; Petersen, Iver; Schmitt, Michael; Bocklitz, Thomas; Popp, Jürgen
2016-07-01
Assessing disease activity is a prerequisite for adequate treatment of inflammatory bowel diseases (IBD) such as Crohn’s disease and ulcerative colitis. In addition to endoscopic mucosal healing, histologic remission poses a promising end-point of IBD therapy. However, evaluating histological remission carries a risk of complications due to the acquisition of biopsies and delays diagnosis because of tissue processing procedures. In this regard, non-linear multimodal imaging might serve as an unparalleled technique that allows the real-time evaluation of microscopic IBD activity in the endoscopy unit. In this study, tissue sections were investigated using the non-linear multimodal microscopy combination of coherent anti-Stokes Raman scattering (CARS), two-photon excited autofluorescence (TPEF) and second-harmonic generation (SHG). After the measurement, a gold-standard assessment of histological indexes was carried out based on a conventional H&E stain. Subsequently, various geometry- and intensity-related features were extracted from the multimodal images. An optimized feature set was utilized to predict histological index levels based on a linear classifier. The automated prediction shortens the time to diagnosis. Therefore, non-linear multimodal imaging may provide a real-time diagnosis of IBD activity suited to assist clinical decision making within the endoscopy unit.
NASA Astrophysics Data System (ADS)
Shepson, P. B.; Lavoie, T. N.; Kerlo, A. E.; Stirm, B. H.
2016-12-01
Understanding the contribution of anthropogenic activities to atmospheric greenhouse gas concentrations requires an accurate characterization of emission sources. Previously, we have reported the use of a novel aircraft-based mass balance measurement technique to quantify greenhouse gas emission rates from point and area sources, however, the accuracy of this approach has not been evaluated to date. Here, an assessment of method accuracy and precision was performed by conducting a series of six aircraft-based mass balance experiments at a power plant in southern Indiana and comparing the calculated CO2 emission rates to the reported hourly emission measurements made by continuous emissions monitoring systems (CEMS) installed directly in the exhaust stacks at the facility. For all flights, CO2 emissions were quantified before CEMS data were released online to ensure unbiased analysis. Additionally, we assess the uncertainties introduced to the final emission rate caused by our analysis method, which employs a statistical kriging model to interpolate and extrapolate the CO2 fluxes across the flight transects from the ground to the top of the boundary layer. Subsequently, using the results from these flights combined with the known emissions reported by the CEMS, we perform an inter-model comparison of alternative kriging methods to evaluate the performance of the kriging approach.
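The flux interpolation and integration step described above can be sketched briefly. Below is a minimal, illustrative Python example that stands in for the kriging step with a Gaussian-process regressor (closely related to ordinary kriging) on synthetic transect data; all numbers, grids and variable names are assumptions, not values from the study.

# Sketch of interpolating CO2 flux measurements across a downwind flight plane
# and integrating to an emission rate; synthetic data, illustrative only.
import numpy as np
from sklearn.gaussian_process import GaussianProcessRegressor
from sklearn.gaussian_process.kernels import RBF, WhiteKernel

rng = np.random.default_rng(0)
# Measured fluxes (kg CO2 m^-2 s^-1) at transect positions y (m) and altitudes z (m)
y_obs = rng.uniform(-2000, 2000, 120)
z_obs = rng.uniform(150, 1200, 120)
flux_obs = np.exp(-((y_obs / 800) ** 2 + ((z_obs - 400) / 300) ** 2)) * 1e-4

# GP regression (akin to ordinary kriging) fitted to the scattered observations
gp = GaussianProcessRegressor(kernel=RBF(length_scale=[500.0, 200.0]) + WhiteKernel(noise_level=1e-6),
                              normalize_y=True)
gp.fit(np.column_stack([y_obs, z_obs]), flux_obs)

# Interpolate/extrapolate onto a regular grid from the ground to the boundary-layer top
y_grid = np.linspace(-2500, 2500, 101)   # crosswind distance (m)
z_grid = np.linspace(0, 1500, 61)        # altitude (m), boundary-layer top assumed at 1500 m
Y, Z = np.meshgrid(y_grid, z_grid)
flux_grid = gp.predict(np.column_stack([Y.ravel(), Z.ravel()])).reshape(Y.shape)

# Emission rate = integral of flux over the downwind plane (kg CO2 s^-1)
dy = y_grid[1] - y_grid[0]
dz = z_grid[1] - z_grid[0]
emission_rate = flux_grid.sum() * dy * dz
print(f"Estimated emission rate: {emission_rate:.1f} kg/s")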
An analysis of a digital variant of the Trail Making Test using machine learning techniques.
Dahmen, Jessamyn; Cook, Diane; Fellows, Robert; Schmitter-Edgecombe, Maureen
2017-01-01
The goal of this work is to develop a digital version of a standard cognitive assessment, the Trail Making Test (TMT), and assess its utility. This paper introduces a novel digital version of the TMT and introduces a machine learning based approach to assess its capabilities. Using digital Trail Making Test (dTMT) data collected from (N = 54) older adult participants as feature sets, we use machine learning techniques to analyze the utility of the dTMT and evaluate the insights provided by the digital features. Predicted TMT scores correlate well with clinical digital test scores (r = 0.98) and paper time to completion scores (r = 0.65). Predicted TICS exhibited a small correlation with clinically derived TICS scores (r = 0.12 Part A, r = 0.10 Part B). Predicted FAB scores exhibited a small correlation with clinically derived FAB scores (r = 0.13 Part A, r = 0.29 for Part B). Digitally derived features were also used to predict diagnosis (AUC of 0.65). Our findings indicate that the dTMT is capable of measuring the same aspects of cognition as the paper-based TMT. Furthermore, the dTMT's additional data may be able to help monitor other cognitive processes not captured by the paper-based TMT alone.
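As a rough illustration of the abstract's feature-based prediction, the following Python sketch regresses a paper-test completion time on digital features and reports the correlation of predictions with observations; the features and data are invented, and the model choice (a random forest) is an assumption rather than the method used in the study.

# Sketch: predicting paper TMT completion time from digital TMT (dTMT) features
# and reporting the correlation; data and feature names are invented.
import numpy as np
from sklearn.ensemble import RandomForestRegressor
from sklearn.model_selection import cross_val_predict
from scipy.stats import pearsonr

rng = np.random.default_rng(1)
n = 54
# Hypothetical dTMT features: total time, pauses, pen lifts, mean inter-target interval
X = rng.normal(size=(n, 4))
y_paper_time = 40 + 10 * X[:, 0] + 3 * X[:, 1] + rng.normal(scale=4, size=n)

model = RandomForestRegressor(n_estimators=200, random_state=0)
y_pred = cross_val_predict(model, X, y_paper_time, cv=5)   # held-out predictions

r, p = pearsonr(y_pred, y_paper_time)
print(f"Predicted vs. observed completion time: r = {r:.2f} (p = {p:.3f})")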
Non-invasive diagnosis of advanced fibrosis and cirrhosis
Sharma, Suraj; Khalili, Korosh; Nguyen, Geoffrey Christopher
2014-01-01
Liver cirrhosis is a common and growing public health problem globally. The diagnosis of cirrhosis portends an increased risk of morbidity and mortality. Liver biopsy is considered the gold standard for diagnosis of cirrhosis and staging of fibrosis. However, despite its universal use, liver biopsy is an invasive and inaccurate gold standard with numerous drawbacks. In order to overcome the limitations of liver biopsy, a number of non-invasive techniques have been investigated for the assessment of cirrhosis. This review will focus on currently available non-invasive markers of cirrhosis. The evidence behind the use of these markers will be highlighted, along with an assessment of diagnostic accuracy and performance characteristics of each test. Non-invasive markers of cirrhosis can be radiologic or serum-based. Radiologic techniques based on ultrasound, magnetic resonance imaging and elastography have been used to assess liver fibrosis. Serum-based biomarkers of cirrhosis have also been developed. These are broadly classified into indirect and direct markers. Indirect biomarkers reflect liver function, which may decline with the onset of cirrhosis. Direct biomarkers reflect extracellular matrix turnover and include molecules involved in hepatic fibrogenesis. On the whole, radiologic and serum markers of fibrosis correlate well with biopsy scores, especially when used to exclude cirrhosis or fibrosis. This feature is clinically useful and avoids liver biopsy in many cases. PMID:25492996
Sheppard, Sean C; Hickling, Edward J; Earleywine, Mitch; Hoyt, Tim; Russo, Amanda R; Donati, Matthew R; Kip, Kevin E
2015-11-01
Stigma associated with disclosing military sexual trauma (MST) makes estimating an accurate base rate difficult. Anonymous assessment may help alleviate stigma. Although anonymous research has found higher rates of male MST, no study has evaluated whether providing anonymity sufficiently mitigates the impact of stigma on accurate reporting. This study used the unmatched count technique (UCT), a form of randomized response technique, to gain information about the accuracy of base rate estimates of male MST derived via anonymous assessment of Operation Enduring Freedom (OEF)/Operation Iraqi Freedom (OIF) combat veterans. A cross-sectional convenience sample of 180 OEF/OIF male combat veterans, recruited via online websites for military populations, provided data about history of MST via traditional anonymous self-report and the UCT. The UCT revealed a rate of male MST more than 15 times higher than the rate derived via traditional anonymous assessment (1.1% vs. 17.2%). These data suggest that anonymity does not adequately mitigate the impact of stigma on disclosure of male MST. Results, though preliminary, suggest that published rates of male MST may substantially underestimate the true rate of this problem. The UCT has significant potential to improve base rate estimation of sensitive behaviors in the military. (c) 2015 APA, all rights reserved.
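For readers unfamiliar with the unmatched count technique, the estimator itself is simple: the prevalence of the sensitive item is the difference between the mean item counts of the treatment and control lists. The following Python sketch illustrates this on simulated responses; list composition, sample sizes and the assumed prevalence are illustrative only.

# Sketch of the unmatched count technique (UCT) estimator: prevalence of the
# sensitive item = difference between mean item counts in the two groups.
# Data below are simulated for illustration only.
import numpy as np

rng = np.random.default_rng(2)
n_control, n_treatment = 90, 90
true_rate = 0.17                      # illustrative sensitive-behaviour prevalence

# Control list: 4 innocuous items; treatment list: same 4 items + the sensitive item
counts_control = rng.binomial(4, 0.5, n_control)
counts_treatment = rng.binomial(4, 0.5, n_treatment) + rng.binomial(1, true_rate, n_treatment)

estimate = counts_treatment.mean() - counts_control.mean()
# Standard error of a difference in independent means
se = np.sqrt(counts_treatment.var(ddof=1) / n_treatment +
             counts_control.var(ddof=1) / n_control)
print(f"UCT prevalence estimate: {estimate:.3f} (SE {se:.3f})")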
Hand hygiene technique quality evaluation in nursing and medicine students of two academic courses
Škodová, Manuela; Gimeno-Benítez, Alfredo; Martínez-Redondo, Elena; Morán-Cortés, Juan Francisco; Jiménez-Romano, Ramona; Gimeno-Ortiz, Alfredo
2015-01-01
Objective: as future health professionals, nursing and medical students' hands can function during internships as a transmission vehicle for hospital-acquired infections. Method: a descriptive study with nursing and medical degree students on the quality of the hand hygiene technique, which was assessed via a visual test using a hydroalcoholic solution marked with fluorescence and an ultraviolet lamp. Results: 546 students were assessed, 73.8% from medicine and 26.2% from nursing. The area of the hand with proper antiseptic distribution was the palm (92.9%); the areas not properly scrubbed were the thumbs (55.1%). Technique in both hands was rated very good for 24.7% of students, good for 29.8%, fair for 25.1%, and poor for 20.3%. The poorest results were observed among male students, nursing students, and first-year students. There were no significant differences between age groups. Conclusions: the hand hygiene technique is not applied efficiently. Education plays a key role in setting a good practice base in hand hygiene, in theoretical knowledge, and in skill development, as well as in reinforcing good practice. PMID:26444174
Respiratory assessment in critical care units.
Cox, C L; McGrath, A
1999-08-01
As healthcare delivery changes in critical care, nursing continues to extend its practice base. Nursing practice is expanding to incorporate skills once seen as the remit of the medical profession. Critical care nurses are equipping themselves with evidence-based knowledge and skills that can enhance the care they provide to their patients. Assessment of patients is a major role in nursing and, by expanding assessment techniques, nurses can ensure patients receive the care most appropriate to their needs. Nurses in critical care are well placed to perform a more detailed assessment which can help to focus nursing care. This article describes the step-by-step process of undertaking a full and comprehensive respiratory assessment in critical care settings. It identifies many of the problems that patients may have and the signs and symptoms that a nurse may note whilst undertaking the assessment and preparing to prescribe care.
Real-time surveillance system for marine environment based on HLIF LiDAR
NASA Astrophysics Data System (ADS)
Babichenko, Sergey; Sobolev, Innokenti; Aleksejev, Valeri; Sõro, Oliver
2017-10-01
The operational monitoring of the risk areas of marine environment requires cost-effective solutions. One of the options is the use of sensor networks based on fixed installations and moving platforms (coastal boats, supply-, cargo-, and passenger vessels). Such a network makes it possible to gather environmental data in time and space, with direct links to operational activities in the controlled area for further environmental risk assessment. Among the many remote sensing techniques, LiDAR (Light Detection And Ranging) based on Light Induced Fluorescence (LIF) provides a direct assessment of water quality variations caused by chemical pollution, colored dissolved organic matter, and phytoplankton composition. The Hyperspectral LIF (HLIF) LiDAR acquires comprehensive LIF spectra and analyses them by spectral pattern recognition technique to detect and classify the substances in water remotely. Combined use of HLIF LiDARs with a Real-Time Data Management System (RTDMS) provides an economically effective solution for regular monitoring in the controlled area. OCEAN VISUALS in cooperation with LDI INNOVATION has developed the Oil in Water Locator (OWL™) with RTDMS (OWL MAP™) based on the HLIF LiDAR technique. This is a novel technical solution for monitoring of the marine environment providing continuous unattended operations. OWL™ has been extensively tested on board various vessels in the North Sea, Norwegian Sea, Barents Sea, Baltic Sea and Caribbean Sea. This paper describes the technology features, the results of its operational use in 2014-2017, and an outlook for further technology development.
NASA Technical Reports Server (NTRS)
Padula, Santo, II
2009-01-01
The ability to sufficiently measure orbiter window defects to allow for window recertification has been an ongoing challenge for the orbiter vehicle program. The recent Columbia accident has forced even tighter constraints on the criteria that must be met in order to recertify windows for flight. As a result, new techniques are being investigated to improve the reliability, accuracy and resolution of the defect detection process. The methodology devised in this work, which is based on the utilization of a vertical scanning interferometric (VSI) tool, shows great promise for meeting the ever increasing requirements for defect detection. This methodology can potentially resolve the true defect depth 10-100 times more finely than the currently employed micrometer-based methodology. An added benefit is that it also produces a digital elevation map of the defect, thereby providing information about the defect morphology which can be utilized to ascertain the type of debris that induced the damage. However, in order to successfully implement such a tool, a greater understanding of the resolution capability and measurement repeatability must be obtained. This work focused on assessing the variability of the VSI-based measurement methodology and revealed that the VSI measurement tool was more repeatable and more precise than the current micrometer-based approach, even in situations where operator variation could affect the measurement. The analysis also showed that the VSI technique was relatively insensitive to the hardware and software settings employed, making the technique extremely robust and desirable.
Vital physical signals measurements using a webcam
NASA Astrophysics Data System (ADS)
Ouyang, Jianfei; Yan, Yonggang; Yao, Lifeng
2013-10-01
Non-contact and remote measurements of vital physical signals are important for reliable and comfortable physiological self-assessment. In this paper, we provide a new video-based methodology for remote and fast measurements of vital physical signals such as cardiac pulse and breathing rate. A webcam is used to track color video of a human face or wrist, and a photoplethysmography (PPG) technique is applied to perform the measurements of the vital signals. A novel sequential blind signal extraction methodology is applied to the color video under normal lighting conditions, based on correlation analysis between the green trace and the source signals. The approach was successfully applied to the measurement of vital signals under different illumination conditions, in which the target signal could still be extracted accurately. To assess its advantages, the measurement time was recorded for a large number of cases. The experimental results show that it takes less than 30 seconds to measure the vital physical signals using the presented technique. The study indicates that the proposed approach is feasible for the PPG technique and provides a way to study the relationship between signals from different regions of interest (ROIs) in future research.
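A minimal sketch of the kind of webcam PPG extraction described above is given below in Python: the green channel is averaged over a region of interest per frame, band-pass filtered around plausible heart rates, and the dominant spectral peak is read off. The frame source, region of interest and filter settings are assumptions, and the pipeline is a simplification of the paper's sequential blind signal extraction.

# Sketch of PPG-style pulse extraction from webcam video; synthetic frames are
# used here so the example is self-contained.
import numpy as np
from scipy.signal import butter, filtfilt

def heart_rate_from_green(frames, fps, roi):
    """frames: iterable of HxWx3 RGB arrays; roi: (row0, row1, col0, col1)."""
    r0, r1, c0, c1 = roi
    green = np.array([f[r0:r1, c0:c1, 1].mean() for f in frames])
    green = green - green.mean()

    # Band-pass 0.7-3.0 Hz (42-180 beats per minute)
    b, a = butter(3, [0.7, 3.0], btype="bandpass", fs=fps)
    pulse = filtfilt(b, a, green)

    spectrum = np.abs(np.fft.rfft(pulse)) ** 2
    freqs = np.fft.rfftfreq(len(pulse), d=1.0 / fps)
    band = (freqs >= 0.7) & (freqs <= 3.0)
    return 60.0 * freqs[band][np.argmax(spectrum[band])]   # beats per minute

# Example with synthetic frames: a 1.2 Hz (72 bpm) brightness oscillation
fps, seconds = 30, 20
t = np.arange(fps * seconds) / fps
frames = [np.full((120, 160, 3), 128.0) + 2.0 * np.sin(2 * np.pi * 1.2 * ti) for ti in t]
print(f"Estimated heart rate: {heart_rate_from_green(frames, fps, (30, 90, 40, 120)):.0f} bpm")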
Aligning Goals, Assessments, and Activities: An Approach to Teaching PCR and Gel Electrophoresis
Robertson, Amber L.; Batzli, Janet; Harris, Michelle; Miller, Sarah
2008-01-01
Polymerase chain reaction (PCR) and gel electrophoresis have become common techniques used in undergraduate molecular and cell biology labs. Although students enjoy learning these techniques, they often cannot fully comprehend and analyze the outcomes of their experiments because of a disconnect between concepts taught in lecture and experiments done in lab. Here we report the development and implementation of novel exercises that integrate the biological concepts of DNA structure and replication with the techniques of PCR and gel electrophoresis. Learning goals were defined based on concepts taught throughout the cell biology lab course and learning objectives specific to the PCR and gel electrophoresis lab. Exercises developed to promote critical thinking and target the underlying concepts of PCR, primer design, gel analysis, and troubleshooting were incorporated into an existing lab unit based on the detection of genetically modified organisms. Evaluative assessments for each exercise were aligned with the learning goals and used to measure student learning achievements. Our analysis found that the exercises were effective in enhancing student understanding of these concepts as shown by student performance across all learning goals. The new materials were particularly helpful in acquiring relevant knowledge, fostering critical-thinking skills, and uncovering prevalent misconceptions. PMID:18316813
Final evaluation report for the CAPITAL-ITS operational test and demonstration program
DOT National Transportation Integrated Search
1997-05-01
The CAPITAL project was undertaken to assess the viability of using cellular-based traffic probes as a wide area vehicular traffic surveillance technique. From the test, cellular technology demonstrated the technical potential to provide vehicle spee...
CAAT Altex workshop paper entitled "Towards Good Read-Across Practice (GRAP) Guidance"
Grouping of substances and utilizing read-across within those groups represents an important data gap filling technique for chemical safety assessments. Categories/analogue groups are typically developed based on structural similarity, and increasingly often, also on mechanistic ...
NASA Astrophysics Data System (ADS)
Chaisaowong, Kraisorn; Kraus, Thomas
2014-03-01
Pleural thickenings can be caused by asbestos exposure and may evolve into malignant pleural mesothelioma. Early diagnosis plays a key role in early treatment and thereby helps reduce morbidity, and the growth rate of a pleural thickening can in turn be essential evidence for an early diagnosis of pleural mesothelioma. The detection of pleural thickenings is today done by visual inspection of CT data, which is time-consuming and subject to the physician's subjective judgment. Computer-assisted diagnosis systems to automatically assess pleural mesothelioma have been reported worldwide. In this paper, an image analysis pipeline that automatically detects pleural thickenings and measures their volume is described. We first delineate automatically the pleural contour in the CT images. An adaptive surface-based smoothing technique is then applied to the pleural contours to identify all potential thickenings. A subsequent tissue-specific, topology-oriented detection step, based on a probabilistic Hounsfield unit model of pleural plaques, then identifies the genuine pleural thickenings among them. The assessment of the detected pleural thickenings is based on the volumetry of the 3D model, created by a mesh construction algorithm followed by a Laplace-Beltrami eigenfunction expansion surface smoothing technique. Finally, the spatiotemporal matching of pleural thickenings from consecutive CT data is carried out based on semi-automatic lung registration towards assessment of their growth rate. With these methods, a new computer-assisted diagnosis system is presented in order to assure a precise and reproducible assessment of pleural thickenings towards the diagnosis of pleural mesothelioma in its early stage.
Parrilla, Inma; del Olmo, David; Sijses, Laurien; Martinez-Alborcia, María J; Cuello, Cristina; Vazquez, Juan M; Martinez, Emilio A; Roca, Jordi
2012-05-01
The present study aimed to evaluate the ability of spermatozoa from individual boar ejaculates to withstand different semen-processing techniques. Eighteen sperm-rich ejaculate samples from six boars (three per boar) were diluted in Beltsville Thawing Solution and split into three aliquots. The aliquots were (1) further diluted to 3×10⁷ sperm/mL and stored as a liquid at 17°C for 72 h, (2) frozen-thawed (FT) at 1×10⁹ sperm/mL using standard 0.5-mL straw protocols, or (3) sex-sorted with subsequent liquid storage (at 17°C for 6 h) or FT (2×10⁷ sperm/mL using a standard 0.25-mL straw protocol). The sperm quality was evaluated based on total sperm motility (the CASA system), viability (plasma membrane integrity assessed using flow cytometry and the LIVE/DEAD Sperm Viability Kit), lipid peroxidation (assessed via indirect measurement of the generation of malondialdehyde (MDA) using the BIOXYTECH MDA-586 Assay Kit) and DNA fragmentation (sperm chromatin dispersion assessed using the Sperm-Sus-Halomax® test). Data were normalized to the values assessed for the fresh (for liquid-stored and FT samples) or the sorted semen samples (for liquid stored and the FT sorted spermatozoa). All of the four sperm-processing techniques affected sperm quality (P<0.01), regardless of the semen donor, with reduced percentages of motile and viable sperm and increased MDA generation and percentages of sperm with fragmented DNA. Significant (P<0.05) inter-boar (effect of boars within each semen-processing technique) and intra-boar (effect of semen-processing techniques within each boar) differences were evident for all of the sperm quality parameters assessed, indicating differences in the ability of spermatozoa from individual boars to withstand the semen-processing techniques. These results are the first evidence that ejaculate spermatozoa from individual boars can respond in a boar-dependent manner to different semen-processing techniques. Copyright © 2012 Elsevier B.V. All rights reserved.
Live fire testing requirements - Assessing the impact
DOE Office of Scientific and Technical Information (OSTI.GOV)
O'Bryon, J.F.
1992-08-01
Full-up live-fire testing (LFT) of aircraft configured for combat is evaluated in terms of the practical implications of the technique. LFT legislation requires the testing of tactical fighters, helicopters, and other aircraft when they are loaded with the flammables and explosives associated with combat. LFT permits the study of damage mechanisms and battle-damage repair techniques during the design phase, and probability-of-kill estimates and novel systems designs can be developed based on LFT data.
Factory approach can streamline patient accounting.
Rands, J; Muench, M
1991-08-01
Although they may seem fundamentally different, similarities exist between operations of factories and healthcare organizations' business offices. As a result, a patient accounting approach based on manufacturing firms' management techniques may help smooth healthcare business processes. Receivables performance management incorporates the Japanese techniques of "just-in-time" and total quality management to reduce unbilled accounts and information backlog and accelerate payment. A preliminary diagnostic assessment of a patient accounting process helps identify bottlenecks and set priorities for work flow.
Assessment of active methods for removal of LEO debris
NASA Astrophysics Data System (ADS)
Hakima, Houman; Emami, M. Reza
2018-03-01
This paper investigates the applicability of five active methods for removal of large low Earth orbit debris. The removal methods, namely net, laser, electrodynamic tether, ion beam shepherd, and robotic arm, are selected based on a set of high-level space mission constraints. Mission level criteria are then utilized to assess the performance of each redirection method in light of the results obtained from a Monte Carlo simulation. The simulation provides insight into the removal time, performance robustness, and propellant mass criteria for the targeted debris range. The remaining attributes are quantified based on the models provided in the literature, which take into account several important parameters pertaining to each removal method. The means of assigning attributes to each assessment criterion is discussed in detail. A systematic comparison is performed using two different assessment schemes: the Analytical Hierarchy Process and a utility-based approach. A third assessment technique, namely potential-loss analysis, is utilized to highlight the effect of risks in each removal method.
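As an illustration of the Analytical Hierarchy Process step mentioned above, the Python sketch below derives priority weights for three example criteria from a pairwise comparison matrix via its principal eigenvector and computes a consistency ratio; the criteria and comparison values are invented and do not reproduce the study's weighting.

# Sketch of an AHP weighting step: priority weights from a pairwise comparison
# matrix (Saaty's 1-9 scale) and a consistency check. Values are illustrative.
import numpy as np

criteria = ["removal time", "robustness", "propellant mass"]
A = np.array([[1.0, 3.0, 5.0],        # A[i, j] = importance of criterion i over j
              [1/3, 1.0, 2.0],
              [1/5, 1/2, 1.0]])

eigvals, eigvecs = np.linalg.eig(A)
k = np.argmax(eigvals.real)
weights = np.abs(eigvecs[:, k].real)
weights /= weights.sum()

# Consistency ratio (random index RI = 0.58 for a 3x3 matrix)
lambda_max = eigvals[k].real
ci = (lambda_max - len(A)) / (len(A) - 1)
cr = ci / 0.58
for name, w in zip(criteria, weights):
    print(f"{name}: {w:.3f}")
print(f"Consistency ratio: {cr:.3f} (acceptable if < 0.1)")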
Echocardiographic strain and strain-rate imaging: a new tool to study regional myocardial function.
D'hooge, Jan; Bijnens, Bart; Thoen, Jan; Van de Werf, Frans; Sutherland, George R; Suetens, Paul
2002-09-01
Ultrasonic imaging is the noninvasive clinical imaging modality of choice for diagnosing heart disease. At present, two-dimensional ultrasonic grayscale images provide a relatively cheap, fast, bedside method to study the morphology of the heart. Several methods have been proposed to assess myocardial function. These have been based on either grayscale or motion (velocity) information measured in real-time. However, the quantitative assessment of regional myocardial function remains an important goal in clinical cardiology. To do this, ultrasonic strain and strain-rate imaging have been introduced. In the clinical setting, these techniques currently only allow one component of the true three-dimensional deformation to be measured. Clinical, multidimensional strain (rate) information can currently thus only be obtained by combining data acquired using different transducer positions. Nevertheless, given the appropriate postprocessing, the clinical value of these techniques has already been shown. Moreover, multidimensional strain and strain-rate estimation of the heart in vivo by means of a single ultrasound acquisition has been shown to be feasible. In this paper, the new techniques of ultrasonic strain rate and strain imaging of the heart are reviewed in terms of definitions, data acquisition, strain-rate estimation, postprocessing, and parameter extraction. Their clinical validation and relevance will be discussed using clinical examples on relevant cardiac pathology. Based on these examples, suggestions are made for future developments of these techniques.
NASA Astrophysics Data System (ADS)
Irwandi, Irwandi; Fashbir; Daryono
2018-04-01
The Neo-Deterministic Seismic Hazard Assessment (NDSHA) method is a seismic hazard assessment method whose advantage lies in realistic physical simulation of the source, wave propagation, and geological-geophysical structure. This simulation is capable of generating synthetic seismograms at the observed sites. At the regional NDSHA scale, the calculation of strong ground motion is based on the 1D modal summation technique because it is computationally more efficient. In this article, we verify the synthetic seismogram calculations against field observations of the Pidie Jaya earthquake of 7 December 2016, which had a moment magnitude of M6.5. Those data were recorded by broadband seismometers installed by BMKG (Indonesian Agency for Meteorology, Climatology and Geophysics). The synthetic seismograms agree well with the observations at some stations, while other stations show discrepancies. Based on these observations, the 1D modal summation technique is well verified for regions of thin sediment (near the pre-Tertiary basement) but is less suitable for regions of thick sediment, because it excludes the amplification of seismic waves occurring within thick sediment. Another approach is therefore needed, e.g., the 2D finite-difference hybrid method, which is part of the local-scale NDSHA method.
Taiwo, Oluwadamilola O; Finegan, Donal P; Eastwood, David S; Fife, Julie L; Brown, Leon D; Darr, Jawwad A; Lee, Peter D; Brett, Daniel J L; Shearing, Paul R
2016-09-01
Lithium-ion battery performance is intrinsically linked to electrode microstructure. Quantitative measurement of key structural parameters of lithium-ion battery electrode microstructures will enable optimization as well as motivate systematic numerical studies for the improvement of battery performance. With the rapid development of 3-D imaging techniques, quantitative assessment of 3-D microstructures from 2-D image sections by stereological methods appears outmoded; however, in spite of the proliferation of tomographic imaging techniques, it remains significantly easier to obtain two-dimensional (2-D) data sets. In this study, stereological prediction and three-dimensional (3-D) analysis techniques for quantitative assessment of key geometric parameters for characterizing battery electrode microstructures are examined and compared. Lithium-ion battery electrodes were imaged using synchrotron-based X-ray tomographic microscopy. For each electrode sample investigated, stereological analysis was performed on reconstructed 2-D image sections generated from tomographic imaging, whereas direct 3-D analysis was performed on reconstructed image volumes. The analysis showed that geometric parameter estimation using 2-D image sections is bound to be associated with ambiguity and that volume-based 3-D characterization of nonconvex, irregular and interconnected particles can be used to more accurately quantify spatially-dependent parameters, such as tortuosity and pore-phase connectivity. © 2016 The Authors. Journal of Microscopy published by John Wiley & Sons Ltd on behalf of Royal Microscopical Society.
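To make the contrast with stereology concrete, direct 3-D analysis of a segmented electrode volume can be sketched as follows in Python: porosity by voxel counting and pore-phase connectivity by labelling connected components. The volume here is synthetic; a real analysis would operate on the tomographic reconstruction, and tortuosity estimation would require additional computation.

# Sketch of direct 3-D analysis of a segmented electrode volume: porosity and
# pore-phase connectivity. The volume is a synthetic placeholder.
import numpy as np
from scipy import ndimage

rng = np.random.default_rng(3)
# Binary volume: True = pore phase, False = solid phase (synthetic example)
volume = ndimage.gaussian_filter(rng.random((100, 100, 100)), sigma=2) > 0.5

porosity = volume.mean()

labels, n_components = ndimage.label(volume)        # 6-connectivity by default
sizes = np.bincount(labels.ravel())[1:]             # skip background label 0
connected_fraction = sizes.max() / volume.sum()     # share in the largest pore cluster

print(f"Porosity: {porosity:.3f}")
print(f"Pore clusters: {n_components}, largest cluster holds "
      f"{100 * connected_fraction:.1f}% of the pore phase")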
NASA Astrophysics Data System (ADS)
Viotti, Matias R.; Albertazzi, Armando; Staron, Peter; Pisa, Marcelo
2013-04-01
This paper presents a portable device for measuring residual stress fields outside the optical bench. The system combines the traditional hole drilling technique with Digital Speckle Pattern Interferometry. The novel feature of this device is its high degree of compactness, since a single base simultaneously supports the measurement module and the hole-drilling device. The portable device allows the measurement of non-uniform residual stresses in accordance with the ASTM standard. In the offshore oil and gas industry, friction hydro pillar processing (FHPP) stands out among alternative welding procedures and is now an important maintenance tool, since it can produce structural repairs without risk of explosions. In this process a hole is drilled and filled with a consumable rod of the same material. The rod, which can be cylindrical or conical, is rotated and pressed against the hole, leading to frictional heating. To assess the residual stress distribution generated by the weld in the rod as well as in the base material around the rod, welded samples were evaluated by neutron diffraction and by the hole drilling technique, and the two were compared. For the hole drilling technique, some layers were removed using electrical discharge machining (EDM) after the diffraction measurements in order to assess the bulk stress distribution. The results show good agreement between the techniques.
Hogan, Michael P; Pace, David E; Hapgood, Joanne; Boone, Darrell C
2006-11-01
Situation awareness (SA) is defined as the perception of elements in the environment within a volume of time and space, the comprehension of their meaning, and the projection of their status in the near future. This construct is vital to decision making in intense, dynamic environments. It has been used in aviation as it relates to pilot performance, but has not been applied to medical education. The most widely used objective tool for measuring trainee SA is the Situation Awareness Global Assessment Technique (SAGAT). The purpose of this study was to design and validate SAGAT for assessment of practical trauma skills, and to compare SAGAT results to traditional checklist style scoring. Using the Human Patient Simulator, we designed SAGAT for practical trauma skills assessment based on Advanced Trauma Life Support objectives. Sixteen subjects (four staff surgeons, four senior residents, four junior residents, and four medical students) participated in three scenarios each. They were assessed using SAGAT and traditional checklist assessment. A questionnaire was used to assess possible confounding factors in attaining SA and overall trainee satisfaction. SAGAT was found to show significant difference (analysis of variance; p < 0.001) in scores based on level of training lending statistical support to construct validity. SAGAT was likewise found to display reliability (Cronbach's alpha 0.767), and significant scoring correlation with traditional checklist performance measures (Pearson's coefficient 0.806). The questionnaire revealed no confounding factors and universal satisfaction with the human patient simulator and SAGAT. SAGAT is a valid, reliable assessment tool for trauma trainees in the dynamic clinical environment created by human patient simulation. Information provided by SAGAT could provide specific feedback, direct individualized teaching, and support curriculum change. Introduction of SAGAT could improve the current assessment model for practical trauma education.
Assessment methods for the evaluation of vitiligo.
Alghamdi, K M; Kumar, A; Taïeb, A; Ezzedine, K
2012-12-01
There is no standardized method for assessing vitiligo. In this article, we review the literature from 1981 to 2011 on different vitiligo assessment methods. We aim to classify the techniques available for vitiligo assessment as subjective, semi-objective or objective; microscopic or macroscopic; and as based on morphometry or colorimetry. Macroscopic morphological measurements include visual assessment, photography in natural or ultraviolet light, photography with computerized image analysis and tristimulus colorimetry or spectrophotometry. Non-invasive micromorphological methods include confocal laser microscopy (CLM). Subjective methods include clinical evaluation by a dermatologist and a vitiligo disease activity score. Semi-objective methods include the Vitiligo Area Scoring Index (VASI) and point-counting methods. Objective methods include software-based image analysis, tristimulus colorimetry, spectrophotometry and CLM. Morphometry is the measurement of the vitiliginous surface area, whereas colorimetry quantitatively analyses skin colour changes caused by erythema or pigment. Most methods involve morphometry, except for the chromameter method, which assesses colorimetry. Some image analysis software programs can assess both morphometry and colorimetry. The details of these programs (Corel Draw, Image Pro Plus, AutoCad and Photoshop) are discussed in the review. Reflectance confocal microscopy provides real-time images and has great potential for the non-invasive assessment of pigmentary lesions. In conclusion, there is no single best method for assessing vitiligo. This review revealed that VASI, the rule of nine and Wood's lamp are likely to be the best techniques available for assessing the degree of pigmentary lesions and measuring the extent and progression of vitiligo in the clinic and in clinical trials. © 2012 The Authors. Journal of the European Academy of Dermatology and Venereology © 2012 European Academy of Dermatology and Venereology.
Liang, Li; Zhao, Zhongzhen; Kang, Tingguo
2014-01-01
Background: The technique of microscopy has been applied for identification of Chinese materia medica (CMM) since decades. However, very few scientific publications report the combination of conventional microscopy and high performance liquid chromatography (HPLC) techniques for further application to quality assessment of CMM. Objective: The objective of this study is to analyze the quality of the dried root tuber of Polygonum multiflorum Thunb. (Heshouwu) and to establish the relationships between 2,3,5,4’-tetrahydroxystilbene-2-O-β-glucoside, combined anthraquinone (CAQ) and quantity of clusters of calcium oxalate. Materials and Methods: In this study, microscopy and HPLC techniques were applied to assess the quality of P. multiflorum Thunb., and SPSS software was used to establish the relationship between microscopic characteristics and chemical components. Results: The results showed close and direct correlations between the quantity of clusters of calcium oxalate in P. multiflorum Thunb. and the contents of 2,3,5,4’-tetrahydroxystilbene-2-O-β-glucoside and CAQ. From these results, it can be deduced that Polygoni Multiflori Radix with a higher quantity of clusters of calcium oxalate should be of better quality. Conclusion: The established method can be helpful for evaluating the quality of CMM based upon the identification and quantitation of chemical and ergastic substance of cells. PMID:25422540
Selecting a software development methodology. [of digital flight control systems
NASA Technical Reports Server (NTRS)
Jones, R. E.
1981-01-01
The state-of-the-art analytical techniques for the development and verification of digital flight control software are studied and a practical, designer-oriented development and verification methodology is produced. The effectiveness of the analytic techniques chosen for the development and verification methodology is assessed both technically and financially. Technical assessments analyze the error preventing and detecting capabilities of the chosen technique in all of the pertinent software development phases. Financial assessments describe the cost impact of using the techniques, specifically, the cost of implementing and applying the techniques as well as the realizable cost savings. Both the technical and financial assessments are quantitative where possible. In the case of techniques which cannot be quantitatively assessed, qualitative judgements are expressed about the effectiveness and cost of the techniques. The reasons why quantitative assessments are not possible are documented.
Klijn, Sven L; Weijenberg, Matty P; Lemmens, Paul; van den Brandt, Piet A; Lima Passos, Valéria
2017-10-01
Background and objective: Group-based trajectory modelling is a model-based clustering technique applied for the identification of latent patterns of temporal changes. Despite its manifold applications in clinical and health sciences, potential problems of the model selection procedure are often overlooked. The choice of the number of latent trajectories (class-enumeration), for instance, is to a large degree based on statistical criteria that are not fail-safe. Moreover, the process as a whole is not transparent. To facilitate class enumeration, we introduce a graphical summary display of several fit and model adequacy criteria, the fit-criteria assessment plot. Methods: An R-code that accepts universal data input is presented. The programme condenses relevant group-based trajectory modelling output information of model fit indices in automated graphical displays. Examples based on real and simulated data are provided to illustrate, assess and validate fit-criteria assessment plot's utility. Results: Fit-criteria assessment plot provides an overview of fit criteria on a single page, placing users in an informed position to make a decision. Fit-criteria assessment plot does not automatically select the most appropriate model but eases the model assessment procedure. Conclusions: Fit-criteria assessment plot is an exploratory, visualisation tool that can be employed to assist decisions in the initial and decisive phase of group-based trajectory modelling analysis. Considering group-based trajectory modelling's widespread resonance in medical and epidemiological sciences, a more comprehensive, easily interpretable and transparent display of the iterative process of class enumeration may foster group-based trajectory modelling's adequate use.
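A minimal Python analogue of the fit-criteria assessment plot idea is sketched below (the published tool is an R code): several class-enumeration criteria drawn on a single page. The criterion values are placeholders; in practice they would come from fitting group-based trajectory models with increasing numbers of classes.

# Sketch of a fit-criteria summary page for class enumeration; values are placeholders.
import matplotlib.pyplot as plt

n_classes  = [1, 2, 3, 4, 5, 6]
bic        = [5200, 4950, 4820, 4805, 4810, 4825]
aic        = [5150, 4890, 4750, 4725, 4720, 4730]
entropy    = [1.00, 0.86, 0.81, 0.74, 0.70, 0.66]
smallest_g = [100, 41, 22, 11, 6, 3]                 # % of sample in smallest class

fig, axes = plt.subplots(2, 2, figsize=(8, 6), sharex=True)
for ax, values, title in zip(axes.ravel(),
                             [bic, aic, entropy, smallest_g],
                             ["BIC", "AIC", "Entropy", "Smallest class (%)"]):
    ax.plot(n_classes, values, marker="o")
    ax.set_title(title)
    ax.set_xlabel("Number of latent classes")
fig.tight_layout()
fig.savefig("fit_criteria_assessment_plot.png", dpi=150)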
Robson, Philip M; Grant, Aaron K; Madhuranthakam, Ananth J; Lattanzi, Riccardo; Sodickson, Daniel K; McKenzie, Charles A
2008-10-01
Parallel imaging reconstructions result in spatially varying noise amplification characterized by the g-factor, precluding conventional measurements of noise from the final image. A simple Monte Carlo based method is proposed for all linear image reconstruction algorithms, which allows measurement of signal-to-noise ratio and g-factor and is demonstrated for SENSE and GRAPPA reconstructions for accelerated acquisitions that have not previously been amenable to such assessment. Only a simple "prescan" measurement of noise amplitude and correlation in the phased-array receiver, and a single accelerated image acquisition are required, allowing robust assessment of signal-to-noise ratio and g-factor. The "pseudo multiple replica" method has been rigorously validated in phantoms and in vivo, showing excellent agreement with true multiple replica and analytical methods. This method is universally applicable to the parallel imaging reconstruction techniques used in clinical applications and will allow pixel-by-pixel image noise measurements for all parallel imaging strategies, allowing quantitative comparison between arbitrary k-space trajectories, image reconstruction, or noise conditioning techniques. (c) 2008 Wiley-Liss, Inc.
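The pseudo multiple replica idea can be sketched compactly: synthetic noise drawn with the prescan covariance is added to the acquired data, the reconstruction is repeated many times, and the pixel-wise spread of the replicas gives the noise map. The Python sketch below uses a placeholder inverse-FFT/root-sum-of-squares reconstruction in place of a real SENSE or GRAPPA reconstruction; a g-factor map would additionally require the same procedure on an unaccelerated reference.

# Sketch of the pseudo multiple replica method; all data and the reconstruction
# are simplified placeholders.
import numpy as np

rng = np.random.default_rng(4)
n_coils, nx, ny, n_replicas = 8, 64, 64, 200

# Prescan noise covariance between receiver channels (placeholder values)
A = rng.normal(size=(n_coils, n_coils)) * 0.05
psi = A @ A.T + np.eye(n_coils) * 0.1
chol = np.linalg.cholesky(psi)

def reconstruct(kspace):
    """Placeholder reconstruction: inverse FFT per coil, root-sum-of-squares combine.
    In practice this would be the SENSE/GRAPPA reconstruction under assessment."""
    images = np.fft.ifft2(kspace, axes=(-2, -1))
    return np.sqrt((np.abs(images) ** 2).sum(axis=0))

# Single accelerated acquisition (placeholder k-space data) and its reconstruction
kspace = rng.normal(size=(n_coils, nx, ny)) + 1j * rng.normal(size=(n_coils, nx, ny))
signal_image = reconstruct(kspace)

# Monte Carlo replicas: add synthetic, correlated noise to the same data,
# reconstruct each replica, and measure the pixel-wise spread
replicas = []
for _ in range(n_replicas):
    white = rng.normal(size=(n_coils, nx * ny)) + 1j * rng.normal(size=(n_coils, nx * ny))
    noise = 0.1 * (chol @ white).reshape(n_coils, nx, ny)
    replicas.append(reconstruct(kspace + noise))
noise_map = np.std(replicas, axis=0)        # pixel-by-pixel noise amplitude

snr_map = signal_image / noise_map          # pixel-by-pixel SNR estimate
print(f"Median SNR: {np.median(snr_map):.1f}")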
Ground-based deep-space LADAR for satellite detection: A parametric study
NASA Astrophysics Data System (ADS)
Davey, Kevin F.
1989-12-01
The minimum performance requirements of a ground-based infrared LADAR designed to detect deep-space satellites are determined, and a candidate sensor design based on current technology is presented. The research examines LADAR techniques and detection methods to determine the optimum LADAR configuration, and then assesses the effects of atmospheric transmission, background radiance, and turbulence across the infrared region to find the optimum laser wavelengths. Diffraction theory is then used in a parametric analysis of the transmitted laser beam and received signal, using a Cassegrainian telescope design and heterodyne detection. The effects of beam truncation and obscuration, heterodyne misalignment, off-boresight detection, and image-pixel geometry are also included in the analysis. The derived equations are then used to assess the feasibility of several candidate designs under a wide range of detection conditions including daylight operation through cirrus. The results show that successful detection is theoretically possible under most conditions by transmitting a high power frequency modulated pulse train from an isotopic 13CO2 laser radiating at 11.17 micrometers, and utilizing post-detection integration and pulse compression techniques.
Nondestructive assessment of single-span timber bridges using a vibration-based method
Xiping Wang; James P. Wacker; Angus M. Morison; John W. Forsman; John R. Erickson; Robert J. Ross
2005-01-01
This paper describes an effort to develop a global dynamic testing technique for evaluating the overall stiffness of timber bridge superstructures. A forced vibration method was used to measure the natural frequency of single-span timber bridges in the laboratory and field. An analytical model based on simple beam theory was proposed to represent the relationship...
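For orientation, the simple beam relationship underlying such a method links the first natural frequency to the global flexural stiffness; a minimal Python sketch, with purely illustrative numbers rather than values from the tested bridges, is:

# Simply supported beam: f1 = (pi / (2 L^2)) * sqrt(EI / m)  =>  EI = (2 f1 L^2 / pi)^2 * m
import math

L = 10.0           # span (m), illustrative
m_per_len = 250.0  # mass per unit length (kg/m), illustrative
f1 = 12.0          # measured first natural frequency (Hz), illustrative

EI = (2.0 * f1 * L**2 / math.pi) ** 2 * m_per_len
print(f"Estimated flexural stiffness EI: {EI / 1e6:.1f} MN*m^2")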
ERIC Educational Resources Information Center
Fauzi, Ahmad; Bundu, Patta; Tahmir, Suradi
2016-01-01
The bridge simulator is a fundamental and vital tool for ensuring that seamen or seafarers possess the required standardized competence. By using the bridge simulator technique, a reality-based study can easily be presented and delivered to students on an ongoing basis in their classroom or study place. Afterwards, the validity…
Proof of Concept for an Approach to a Finer Resolution Inventory
Chris J. Cieszewski; Kim Iles; Roger C. Lowe; Michal Zasada
2005-01-01
This report presents a proof of concept for a statistical framework to develop a timely, accurate, and unbiased fiber supply assessment in the State of Georgia, U.S.A. The proposed approach is based on using various data sources and modeling techniques to calibrate satellite image-based statewide stand lists, which provide initial estimates for a State inventory on a...
2010-01-01
In clinical neurology, a comprehensive understanding of consciousness has been regarded as an abstract concept - best left to philosophers. However, times are changing and the need to clinically assess consciousness is increasingly becoming a real-world, practical challenge. Current methods for evaluating altered levels of consciousness are highly reliant on either behavioural measures or anatomical imaging. While these methods have some utility, estimates of misdiagnosis are worrisome (as high as 43%) - clearly this is a major clinical problem. The solution must involve objective, physiologically based measures that do not rely on behaviour. This paper reviews recent advances in physiologically based measures that enable better evaluation of consciousness states (coma, vegetative state, minimally conscious state, and locked in syndrome). Based on the evidence to-date, electroencephalographic and neuroimaging based assessments of consciousness provide valuable information for evaluation of residual function, formation of differential diagnoses, and estimation of prognosis. PMID:20113490
Portable Electronic Nose Based on Electrochemical Sensors for Food Quality Assessment
Dymerski, Tomasz; Gębicki, Jacek; Namieśnik, Jacek
2017-01-01
The steady increase in global consumption puts a strain on agriculture and might lead to a decrease in food quality. Currently used techniques of food analysis are often labour-intensive and time-consuming and require extensive sample preparation. For that reason, there is a demand for novel methods that could be used for rapid food quality assessment. A technique based on the use of an array of chemical sensors for holistic analysis of the sample’s headspace is called electronic olfaction. In this article, a prototype of a portable, modular electronic nose intended for food analysis is described. Using the SVM method, it was possible to classify samples of poultry meat based on shelf-life with 100% accuracy, and also samples of rapeseed oil based on the degree of thermal degradation with 100% accuracy. The prototype was also used to detect adulterations of extra virgin olive oil with rapeseed oil with 82% overall accuracy. Due to the modular design, the prototype offers the advantages of solutions targeted for analysis of specific food products, at the same time retaining the flexibility of application. Furthermore, its portability allows the device to be used at different stages of the production and distribution process. PMID:29186754
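The SVM classification step reported above can be sketched in a few lines of Python; the sensor responses, class labels and hyperparameters below are synthetic stand-ins rather than the prototype's data.

# Sketch of SVM classification of electronic-nose sensor-array responses by
# shelf-life class; data are synthetic.
import numpy as np
from sklearn.model_selection import cross_val_score
from sklearn.pipeline import make_pipeline
from sklearn.preprocessing import StandardScaler
from sklearn.svm import SVC

rng = np.random.default_rng(5)
n_samples, n_sensors = 120, 8
# Three shelf-life classes (0 = fresh, 1 = intermediate, 2 = spoiled)
y = np.repeat([0, 1, 2], n_samples // 3)
X = rng.normal(size=(n_samples, n_sensors)) + y[:, None] * 1.5

clf = make_pipeline(StandardScaler(), SVC(kernel="rbf", C=10.0, gamma="scale"))
scores = cross_val_score(clf, X, y, cv=5)
print(f"Cross-validated accuracy: {scores.mean():.2f} +/- {scores.std():.2f}")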
Holistic Analysis Enhances the Description of Metabolic Complexity in Dietary Natural Products
Kulakowski, Daniel; Lankin, David C; McAlpine, James B; Chen, Shao-Nong
2016-01-01
In the field of food and nutrition, complex natural products (NPs) are typically obtained from cells/tissues of diverse organisms such as plants, mushrooms, and animals. Among them, edible fruits, grains, and vegetables represent most of the human diet. Because of an important dietary dependence, the comprehensive metabolomic analysis of dietary NPs, performed holistically via the assessment of as many metabolites as possible, constitutes a fundamental building block for understanding the human diet. Both mass spectrometry (MS) and nuclear magnetic resonance (NMR) are important complementary analytic techniques, covering a wide range of metabolites at different concentrations. Particularly, 1-dimensional 1H-NMR offers an unbiased overview of all metabolites present in a sample without prior knowledge of its composition, thereby leading to an untargeted analysis. In the past decade, NMR-based metabolomics in plant and food analyses has evolved considerably. The scope of the present review, covering literature of the past 5 y, is to address the relevance of 1H-NMR–based metabolomics in food plant studies, including a comparison with MS-based techniques. Major applications of NMR-based metabolomics for the quality control of dietary NPs and assessment of their nutritional values are presented. PMID:27180381
Pike, James Russell; Xie, Bin; Tan, Nasya; Sabado-Liwag, Melanie Dee; Orne, Annette; Toilolo, Tupou; Cen, Steven; May, Vanessa; Lee, Cevadne; Pang, Victor Kaiwi; Rainer, Michelle A; Vaivao, Dorothy Etimani S; Lepule, Jonathan Tana; Tanjasiri, Sora Park; Palmer, Paula Healani
2016-01-07
Recent prevalence data indicate that Pacific Islanders living in the United States have disproportionately high smoking rates when compared to the general populace. However, little is known about the factors contributing to tobacco use in this at-risk population. Moreover, few studies have attempted to determine these factors utilizing technology-based assessment techniques. The objective was to develop a customized Internet-based Ecological Momentary Assessment (EMA) system capable of measuring cigarette use among Pacific Islanders in Southern California. This system integrated the ubiquity of text messaging, the ease of use associated with mobile phone apps, the enhanced functionality offered by Internet-based Cell phone-optimized Assessment Techniques (ICAT), and the high survey completion rates exhibited by EMA studies that used electronic diaries. These features were tested in a feasibility study designed to assess whether Pacific Islanders would respond to this method of measurement and whether the data gathered would lead to novel insights regarding the intrapersonal, social, and ecological factors associated with cigarette use. 20 young adult smokers in Southern California who self-identified as Pacific Islanders were recruited by 5 community-based organizations to take part in a 7-day EMA study. Participants selected six consecutive two-hour time blocks per day during which they would be willing to receive a text message linking them to an online survey formatted for Web-enabled mobile phones. Both automated reminders and community coaches were used to facilitate survey completion. 720 surveys were completed from 840 survey time blocks, representing a completion rate of 86%. After adjusting for gender, age, and nicotine dependence, feeling happy (P<.001) or wanting a cigarette while drinking alcohol (P<.001) were positively associated with cigarette use. Being at home (P=.02) or being around people who are not smoking (P=.01) were negatively associated with cigarette use. The results of the feasibility study indicate that customized systems can be used to conduct technology-based assessments of tobacco use among Pacific Islanders. Such systems can foster high levels of survey completion and may lead to novel insights for future research and interventions.
Lehotsky, Á; Szilágyi, L; Bánsághi, S; Szerémy, P; Wéber, G; Haidegger, T
2017-09-01
Ultraviolet spectrum markers are widely used for hand hygiene quality assessment, although their microbiological validation has not been established. A microbiology-based assessment of the procedure was conducted. Twenty-five artificial hand models underwent initial full contamination followed by disinfection with a UV-dyed hand-rub solution; digital imaging under UV light, microbiological sampling and cultivation, and digital imaging of the cultivated flora were then performed. Paired images of each hand model were registered by a software tool, then the UV-marked regions were compared with the pathogen-free sites pixel by pixel. Statistical evaluation revealed that the method indicates correctly disinfected areas with 95.05% sensitivity and 98.01% specificity. Copyright © 2017 The Healthcare Infection Society. Published by Elsevier Ltd. All rights reserved.
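The pixel-by-pixel comparison of the registered image pairs reduces to counting agreements between two binary masks. A minimal Python sketch, using synthetic placeholder masks rather than the study's images, is:

# Sensitivity/specificity of a UV-marked (disinfected) mask against a
# pathogen-free mask after registration; masks here are synthetic placeholders.
import numpy as np

rng = np.random.default_rng(6)
pathogen_free = rng.random((200, 300)) > 0.3        # ground truth: True = pathogen-free
# UV mask mostly agrees with the ground truth, with some disagreement injected
uv_marked = pathogen_free ^ (rng.random((200, 300)) > 0.96)

tp = np.sum(uv_marked & pathogen_free)
tn = np.sum(~uv_marked & ~pathogen_free)
fp = np.sum(uv_marked & ~pathogen_free)
fn = np.sum(~uv_marked & pathogen_free)

sensitivity = tp / (tp + fn)   # UV-marked where truly disinfected
specificity = tn / (tn + fp)   # unmarked where pathogens remained
print(f"Sensitivity: {sensitivity:.3f}, specificity: {specificity:.3f}")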
DOE Office of Scientific and Technical Information (OSTI.GOV)
Trexler, D.T.; Flynn, T.; Koenig, B.A.
1982-01-01
Geological, geophysical and geochemical surveys were used in conjunction with temperature gradient hole drilling to assess the geothermal resources in Pumpernickel Valley and Carlin, Nevada. This program is based on a statewide assessment of geothermal resources that was completed in 1979. The exploration techniques are based on previous federally-funded assessment programs that were completed in six other areas in Nevada and include: literature search and compilation of existing data, geologic reconnaissance, chemical sampling of thermal and non-thermal fluids, interpretation of satellite imagery, interpretation of low-sun angle aerial photographs, two-meter depth temperature probe survey, gravity survey, seismic survey, soil-mercury survey, and temperature gradient drilling.
Farnebo, S; Winbladh, A; Zettersten, E K; Sandström, P; Gullstrand, P; Samuelsson, A; Theodorson, E; Sjöberg, F
2010-01-01
Delayed detection of ischemia is one of the most feared postoperative complications. Early detection of impaired blood flow and close monitoring of the organ-specific metabolic status may therefore be critical for the surgical outcome. Urea clearance is a new technique for continuous monitoring of alterations in blood flow and metabolic markers with acceptable temporal characteristics. We compare this new microdialysis technique with the established microdialysis ethanol technique to assess hepatic blood flow. Six pigs were used in a liver ischemia/reperfusion injury model. Microdialysis catheters were placed in liver segment IV and all circulation was stopped for 80 min, followed by reperfusion for 220 min. Urea and ethanol clearance was calculated from the dialysate and correlated with metabolic changes. A laser Doppler probe was used as reference of restoration of blood flow. Both urea and ethanol clearance reproducibly depicted changes in liver blood flow in relation to metabolic changes and laser Doppler measurements. The two techniques highly correlated both overall and during the reperfusion phase (r = 0.8) and the changes were paralleled by altered perfusion as recorded by laser Doppler. Copyright © 2010 S. Karger AG, Basel.
Computer-based System for the Virtual-Endoscopic Guidance of Bronchoscopy.
Helferty, J P; Sherbondy, A J; Kiraly, A P; Higgins, W E
2007-11-01
The standard procedure for diagnosing lung cancer involves two stages: three-dimensional (3D) computed-tomography (CT) image assessment, followed by interventional bronchoscopy. In general, the physician has no link between the 3D CT image assessment results and the follow-on bronchoscopy. Thus, the physician essentially performs bronchoscopic biopsy of suspect cancer sites blindly. We have devised a computer-based system that greatly augments the physician's vision during bronchoscopy. The system uses techniques from computer graphics and computer vision to enable detailed 3D CT procedure planning and follow-on image-guided bronchoscopy. The procedure plan is directly linked to the bronchoscope procedure, through a live registration and fusion of the 3D CT data and bronchoscopic video. During a procedure, the system provides many visual tools, fused CT-video data, and quantitative distance measures; this gives the physician considerable visual feedback on how to maneuver the bronchoscope and where to insert the biopsy needle. Central to the system is a CT-video registration technique, based on normalized mutual information. Several sets of results verify the efficacy of the registration technique. In addition, we present a series of test results for the complete system for phantoms, animals, and human lung-cancer patients. The results indicate that not only is the variation in skill level between different physicians greatly reduced by the system over the standard procedure, but that biopsy effectiveness increases.
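Central to the guidance system is CT-video registration driven by normalized mutual information. The sketch below shows a common histogram-based way to compute such a similarity score between two grayscale images; the bin count, the particular NMI definition (H(A)+H(B))/H(A,B), and the NumPy implementation are assumptions for illustration, not the authors' implementation.

```python
import numpy as np

def normalized_mutual_information(img_a, img_b, bins=64):
    """NMI between two grayscale images of equal shape.

    Uses the common definition NMI = (H(A) + H(B)) / H(A, B); higher values
    indicate better alignment. A registration loop would maximise this score
    over candidate bronchoscope poses.
    """
    joint, _, _ = np.histogram2d(img_a.ravel(), img_b.ravel(), bins=bins)
    pxy = joint / joint.sum()
    px = pxy.sum(axis=1)
    py = pxy.sum(axis=0)

    def entropy(p):
        p = p[p > 0]
        return -np.sum(p * np.log(p))

    return (entropy(px) + entropy(py)) / entropy(pxy.ravel())
```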
Stanisic, Veselin; Andjelkovic, Igor; Vlaovic, Darko; Babic, Igor; Kocev, Nikola; Nikolic, Bosko; Milicevic, Miroslav
2013-10-01
Predicting technical difficulties in laparoscopic cholecystectomy (LC) in a small regional hospital increases the efficacy, cost-benefit and safety of the procedure. The aim of the study was to assess whether it is possible to accurately predict a difficult LC (DLC) in a small regional hospital based only on the routinely available clinical work-up parameters (patient history, ultrasound examination and blood chemistry) and their combinations. A prospective cohort of 369 consecutive patients operated on by the same surgeon was analyzed. The conversion rate was 2.7% (10 patients). DLC was registered in 55 patients (14.90%). Various data mining techniques were applied and assessed. Seven significant predictors of DLC were identified: i) shrunken (fibrotic) gallbladder (GB); ii) ultrasound (US) GB wall thickness >4 mm; iii) >5 attacks of pain lasting >5 hours; iv) WBC >10×10⁹/L; v) pericholecystic fluid; vi) urine amylase >380 IU/L; and vii) BMI >30 kg/m². A Bayesian network was selected as the best classifier with an accuracy of 94.57%, specificity 0.98, sensitivity 0.77, AUC 0.96 and F-measure 0.81. It is possible to predict a DLC with high accuracy using data mining techniques, based on routine preoperative clinical parameters and their combinations. Use of sophisticated diagnostic equipment is not necessary.
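The reported performance figures follow directly from the classifier's confusion matrix on the study data. A minimal sketch of how accuracy, sensitivity, specificity, and F-measure are derived from such counts is shown below; the example counts in the usage line are hypothetical and are not the study's confusion matrix.

```python
def classifier_metrics(tp, fp, tn, fn):
    """Standard metrics for a binary 'difficult LC' classifier."""
    accuracy = (tp + tn) / (tp + tn + fp + fn)
    sensitivity = tp / (tp + fn)           # recall on difficult cases
    specificity = tn / (tn + fp)
    precision = tp / (tp + fp)
    f_measure = 2 * precision * sensitivity / (precision + sensitivity)
    return accuracy, sensitivity, specificity, f_measure

# Hypothetical counts for illustration only (not the study's confusion matrix):
print(classifier_metrics(tp=42, fp=7, tn=300, fn=13))
```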
Robert J. Ross; Susan W. Willits; William Von Segen; Terry Black; Brian K. Brashaw; Roy F. Pellerin
1999-01-01
Longitudinal stress wave nondestructive evaluation (NDE) techniques have been used in a variety of applications in the forest products industry. Recently, it has been shown that they can significantly aid in the assessment of log quality, particularly when they are used to predict performance of structural lumber obtained from a log. The purpose of the research...
DOE Office of Scientific and Technical Information (OSTI.GOV)
Spangler, Lee; Cunningham, Alfred; Lageson, David
2011-03-31
ZERT has made major contributions to five main areas of sequestration science: improvement of computational tools; measurement and monitoring techniques to verify storage and track migration of CO₂; development of a comprehensive performance and risk assessment framework; fundamental geophysical, geochemical and hydrological investigations of CO₂ storage; and investigation of innovative, bio-based mitigation strategies.
NASA Technical Reports Server (NTRS)
Alvarado, U. R. (Editor)
1980-01-01
The adequacy of current technology, in terms of the stage of maturity of sensing, support systems, and information extraction, was assessed relative to oil spills, waste pollution, and inputs to pollution trajectory models. Needs for advanced techniques are defined, and the characteristics of a future satellite system are determined based on the requirements of U.S. agencies involved in pollution monitoring.
Waste-to-energy: A review of life cycle assessment and its extension methods.
Zhou, Zhaozhi; Tang, Yuanjun; Chi, Yong; Ni, Mingjiang; Buekens, Alfons
2018-01-01
This article provides a comprehensive review of evaluation tools based on life cycle thinking, as applied to waste-to-energy. Typically, life cycle assessment is adopted to assess environmental burdens associated with waste-to-energy initiatives. Based on this framework, several extension methods have been developed to focus on specific aspects: exergetic life cycle assessment for reducing resource depletion, life cycle costing for evaluating the economic burden, and social life cycle assessment for recording the social impacts. Additionally, the environment-energy-economy model integrates the life cycle assessment and life cycle costing methods and judges these three features simultaneously for sustainable waste-to-energy conversion. Life cycle assessment of waste-to-energy is well developed, with concrete data inventories and sensitivity analyses, although data and model uncertainty are unavoidable. Compared with life cycle assessment, only a few evaluations of waste-to-energy techniques have been conducted using the extension methods, and their methodology and application need further development. Finally, this article succinctly summarises some recommendations for further research.
Prpich, George; Coulon, Frédéric; Anthony, Edward J
2016-09-01
Interest in the development of shale gas resources using hydraulic fracturing techniques is increasing worldwide despite concerns about the environmental risks associated with this activity. In the United Kingdom (UK), early attempts to hydraulically fracture a shale gas well resulted in a seismic event that led to the suspension of all hydraulic fracturing operations. In response to this occurrence, UK regulators have requested that future shale gas operations that use hydraulic fracturing should be accompanied by a high-level environmental risk assessment (ERA). Completion of an ERA can demonstrate competency, communicate understanding, and ultimately build trust that environmental risks are being managed properly; however, this assessment requires a scientific evidence base. In this paper we discuss how the ERA became a preferred assessment technique to understand the risks related to shale gas development in the UK, and how it can be used to communicate information between stakeholders. We also provide a review of the evidence base that describes the environmental risks related to shale gas operations, which could be used to support an ERA. Finally, we conclude with an update of the current environmental risks associated with shale gas development in the UK and present recommendations for further research. Copyright © 2015 Elsevier B.V. All rights reserved.
Amato, Elvio D; Simpson, Stuart L; Jarolimek, Chad V; Jolley, Dianne F
2014-04-15
Many sediment quality assessment frameworks incorporate contaminant bioavailability as a critical factor regulating toxicity in aquatic ecosystems. However, current approaches do not always adequately predict metal bioavailability to organisms living in the oxidized sediment surface layers. The deployment of the diffusive gradients in thin films (DGT) probes in sediments allows labile metals present in pore waters and weakly bound to the particulate phase to be assessed in a time-integrated manner in situ. In this study, relationships between DGT-labile metal fluxes within 5 mm of the sediment-water interface and lethal and sublethal effects on the amphipod Melita plumulosa were assessed in a range of contaminated estuarine sediments during 10-day laboratory-based bioassays. To account for the differing toxicities of metals, DGT fluxes were normalized to water (WQG) or sediment quality guidelines or toxicity thresholds specific for the amphipod. The better dose-response relationship appeared to be the one based on WQG-normalized DGT fluxes, which successfully predicted toxicity despite the wide range of metals and large variations in sediment properties. The study indicated that the labile fraction of metals measured by DGT is useful for predicting metal toxicity to benthic invertebrates, supporting the applicability of this technique as a rapid monitoring tool for sediment quality assessments.
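The normalization step described above can be read as a toxic-unit style scaling: each metal's DGT-labile flux is divided by the corresponding water quality guideline (or amphipod-specific threshold) before the metals are combined into a single exposure metric. The sketch below shows one simple way such a metric could be formed; the summation across metals and the example values are illustrative assumptions, not the authors' dose metric.

```python
def wqg_normalized_exposure(dgt_fluxes, guidelines):
    """Sum of guideline-normalized DGT-labile metal fluxes.

    dgt_fluxes : dict of metal -> measured DGT flux near the sediment-water interface
    guidelines : dict of metal -> water quality guideline used for normalization
    """
    return sum(dgt_fluxes[m] / guidelines[m] for m in dgt_fluxes)

# Hypothetical values for illustration only:
print(wqg_normalized_exposure({"Cu": 3.2, "Zn": 12.0, "Pb": 0.8},
                              {"Cu": 1.4, "Zn": 8.0, "Pb": 3.4}))
```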
Paqué, Frank; Zehnder, Matthias; De-Deus, Gustavo
2011-10-01
A preparation technique with only a single instrument was proposed on the basis of the reciprocating movement of the F2 ProTaper instrument. The present study was designed to quantitatively assess canal preparation outcomes achieved by this technique. Twenty-five extracted human mandibular first molars with 2 separate mesial root canals were selected. Canals were randomly assigned to 1 of the 2 experimental groups: group 1, conventional rotary preparation using the full ProTaper sequence, and group 2, reciprocating instrumentation with a single ProTaper F2 instrument. Specimens were scanned initially and after root canal preparation with an isotropic resolution of 20 μm by using a micro-computed tomography system. The following parameters were assessed: changes in dentin volume, percentage of shaped canal walls, and degree of canal transportation. In addition, the time required to reach working length with the F2 instrument was recorded. Preoperatively, there were no differences regarding root canal curvature and volume between experimental groups. Overall, instrumentation led to enlarged canal shapes with no evidence of preparation errors. There were no statistical differences between the 2 preparation techniques in the anatomical parameters assessed (P > .01), except for a significantly higher canal transportation caused by the reciprocating file in the coronal canal third. On the other hand, preparation was faster by using the single-file technique (P < .01). Shaping outcomes with the single-file F2 ProTaper technique and the conventional ProTaper full-sequence rotary approach were similar. However, the single-file F2 ProTaper technique was markedly faster in reaching working length. Copyright © 2011 American Association of Endodontists. Published by Elsevier Inc. All rights reserved.
Gauterin, Eckhard; Kammerer, Philipp; Kühn, Martin; Schulte, Horst
2016-05-01
Advanced model-based control of wind turbines requires knowledge of the states and the wind speed. This paper benchmarks a nonlinear Takagi-Sugeno observer for wind speed estimation with enhanced Kalman Filter techniques: The performance and robustness towards model-structure uncertainties of the Takagi-Sugeno observer, a Linear, Extended and Unscented Kalman Filter are assessed. Hence the Takagi-Sugeno observer and enhanced Kalman Filter techniques are compared based on reduced-order models of a reference wind turbine with different modelling details. The objective is the systematic comparison with different design assumptions and requirements and the numerical evaluation of the reconstruction quality of the wind speed. Exemplified by a feedforward loop employing the reconstructed wind speed, the benefit of wind speed estimation within wind turbine control is illustrated. Copyright © 2015 ISA. Published by Elsevier Ltd. All rights reserved.
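For readers unfamiliar with the Kalman-filter baselines used in this benchmark, a single predict/update cycle of a generic linear Kalman filter is sketched below; the matrices and dimensions are left abstract, and nothing here corresponds to the reference turbine models used in the paper.

```python
import numpy as np

def kalman_step(x, P, z, A, C, Q, R):
    """One predict/update cycle of a linear Kalman filter.

    x, P : prior state estimate and covariance
    z    : new measurement vector (e.g. rotor speed)
    A, C : state transition and measurement matrices
    Q, R : process and measurement noise covariances
    """
    # Predict
    x_pred = A @ x
    P_pred = A @ P @ A.T + Q
    # Update
    S = C @ P_pred @ C.T + R
    K = P_pred @ C.T @ np.linalg.inv(S)
    x_new = x_pred + K @ (z - C @ x_pred)
    P_new = (np.eye(len(x)) - K @ C) @ P_pred
    return x_new, P_new
```

The Extended and Unscented Kalman Filters mentioned in the abstract replace the fixed A and C matrices with linearizations or sigma-point propagation of a nonlinear model, while the predict/update structure stays the same.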
Jayakumar, Arunkumar; Singamneni, Sarat; Ramos, Maximiano; Al-Jumaily, Ahmed M; Pethaiah, Sethu Sundar
2017-07-14
The conventional gas diffusion layer (GDL) of polymer electrolyte membrane (PEM) fuel cells incorporates a carbon-based substrate, which suffers from electrochemical oxidation as well as mechanical degradation, resulting in reduced durability and performance. In addition, it involves a complex manufacturing process to produce it. The proposed technique aims to resolve both these issues by an advanced 3D printing technique, namely selective laser sintering (SLS). In the proposed work, polyamide (PA) is used as the base powder and titanium metal powder is added at an optimised level to enhance the electrical conductivity, thermal, and mechanical properties. The application of selective laser sintering to fabricate a robust gas diffusion substrate for PEM fuel cell applications is quite novel and is attempted here for the first time.
Deng, Yang; Liu, Yang; Chen, Suren
2017-01-01
Despite the recent developments in structural health monitoring, there remain great challenges for accurately, conveniently, and economically assessing the in-service performance of the main cables for long-span suspension bridges. A long-term structural health monitoring technique is developed to measure the tension force with a conventional sensing technology and further provide the in-service performance assessment strategy of the main cable. The monitoring system adopts conventional vibrating strings transducers to monitor the tension forces of separate cable strands of the main cable in the anchor span. The performance evaluation of the main cable is conducted based on the collected health monitoring data: (1) the measured strand forces are used to derive the overall tension force of a main cable, which is further translated into load bearing capacity assessment using the concept of safety factor; and (2) the proposed technique can also evaluate the uniformity of tension forces from different cable strands. The assessment of uniformity of strand forces of a main cable offers critical information in terms of potential risks of partial damage and performance deterioration of the main cable. The results suggest the proposed low-cost monitoring system is an option to provide approximate estimation of tension forces of main cables for suspension bridges. With the long-term monitoring data, the proposed monitoring-based evaluation methods can further provide critical information to assess the safety and serviceability performance of main cables. PMID:28621743
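The two assessments described above reduce to simple aggregations of the monitored strand forces: the summed force is compared against the cable's capacity to form a safety factor, and uniformity can be summarized, for example, by the coefficient of variation across strands. A minimal sketch under those assumptions follows; the capacity input and the use of the coefficient of variation are illustrative choices, not the authors' exact formulation.

```python
import numpy as np

def cable_assessment(strand_forces_kN, design_capacity_kN):
    """Aggregate strand-level tension measurements for one main cable.

    strand_forces_kN   : array of measured tension per anchored strand
    design_capacity_kN : assumed ultimate capacity of the whole cable
    """
    total_force = np.sum(strand_forces_kN)
    safety_factor = design_capacity_kN / total_force
    uniformity_cv = np.std(strand_forces_kN) / np.mean(strand_forces_kN)
    return total_force, safety_factor, uniformity_cv
```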
Di Lascio, Nicole; Bruno, Rosa Maria; Stea, Francesco; Bianchini, Elisabetta; Gemignani, Vincenzo; Ghiadoni, Lorenzo; Faita, Francesco
2014-01-01
Carotid pulse wave velocity (PWV) is considered as a surrogate marker for carotid stiffness and its assessment is increasingly being used in clinical practice. However, at the moment, its estimation needs specific equipment and a moderate level of technical expertise; moreover, it is based on a mathematical model. The aim of this study was to validate a new system for non-invasive and model-free carotid PWV assessment based on accelerometric sensors by comparison with currently used techniques. Accelerometric PWV (accPWV) values were obtained in 97 volunteers free of cardiovascular disease (age 24-85 years) and compared with standard ultrasound-based carotid stiffness parameters, such as carotid PWV (cPWV), relative distension (relD) and distensibility coefficient (DC). Moreover, the comparison between accPWV measurements and carotid-femoral PWV (cfPWV) was performed. Accelerometric PWV evaluations showed a significant correlation with cPWV measurements (R = 0.67), relD values (R = 0.66) and DC assessments (R = 0.64). These values were also significantly correlated with cfPWV evaluations (R = 0.46). In addition, the first attempt success rate was equal to 76.8 %. The accelerometric system allows a simple and quick local carotid stiffness evaluation and the values obtained with this system are significantly correlated with known carotid stiffness biomarkers. Therefore, the presented device could provide a concrete opportunity for an easy carotid stiffness evaluation even in clinical practice.
ERIC Educational Resources Information Center
French, Ron
2001-01-01
Profiles four elementary and secondary schools from around the country that have created successful staff development programs to address an array of diversity issues common in schools nationwide. The efforts included emphasizing assessment and nonverbal techniques; pairing teachers and students on technology-based projects; emphasizing motivation…
Fecal Indicator Bacteria and Environmental Observations: Validation of Virtual Beach
Contamination of recreational waters by fecal material is often assessed using indicator bacteria such as enterococci. Enumeration based on culturing methods can take up to 48 hours to complete, limiting the accuracy of water quality evaluations. Molecular microbial techniques em...
Microbiology--Safety Considerations.
ERIC Educational Resources Information Center
Hoffmann, Sheryl K.
This paper discusses the risk assessment associated with microbiology instruction based on grade level, general control measures, appropriate activities for middle school and high school students, the preparation and sterilization of equipment, and safe handling techniques. Appended are instructions and figures on making wire loops and the…
Biocompatibility of Resin-based Dental Materials
Moharamzadeh, Keyvan; Brook, Ian M.; Van Noort, Richard
2009-01-01
Oral and mucosal adverse reactions to resin-based dental materials have been reported. Numerous studies have examined the biocompatibility of restorative dental materials and their components, and a wide range of test systems for the evaluation of the biological effects of these materials have been developed. This article reviews the biological aspects of resin-based dental materials and discusses the conventional as well as the new techniques used for biocompatibility assessment of dental materials.
Tenax extraction as a simple approach to improve environmental risk assessments.
Harwood, Amanda D; Nutile, Samuel A; Landrum, Peter F; Lydy, Michael J
2015-07-01
It is well documented that using exhaustive chemical extractions is not an effective means of assessing exposure of hydrophobic organic compounds in sediments and that bioavailability-based techniques are an improvement over traditional methods. One technique that has shown special promise as a method for assessing the bioavailability of hydrophobic organic compounds in sediment is the use of Tenax-extractable concentrations. A 6-h or 24-h single-point Tenax-extractable concentration correlates to both bioaccumulation and toxicity. This method has demonstrated effectiveness for several hydrophobic organic compounds in various organisms under both field and laboratory conditions. In addition, a Tenax bioaccumulation model was developed for multiple compounds relating 24-h Tenax-extractable concentrations to oligochaete tissue concentrations exposed in both the laboratory and field. This model has demonstrated predictive capacity for additional compounds and species. Use of Tenax-extractable concentrations to estimate exposure is rapid, simple, straightforward, and relatively inexpensive, as well as accurate. Therefore, this method would be an invaluable tool if implemented in risk assessments. © 2015 SETAC.
Evidence-Based Medicine: Liposuction.
Chia, Christopher T; Neinstein, Ryan M; Theodorou, Spero J
2017-01-01
After studying this article, the participant should be able to: 1. Review the appropriate indications and techniques for suction-assisted lipectomy body contouring surgery. 2. Accurately calculate the patient limits of lidocaine for safe dosing during the tumescent infiltration phase of liposuction. 3. Determine preoperatively possible "red flags" or symptoms and signs in the patient history and physical examination that may indicate a heightened risk profile for a liposuction procedure. 4. Provide an introduction to adjunctive techniques to liposuction such as energy-assisted liposuction and to determine whether or not the reader may decide to add them to his or her practice. With increased focus on one's aesthetic appearance, liposuction has become the most popular cosmetic procedure in the world since its introduction in the 1980s. As it has become more refined with experience, safety, patient selection, preoperative assessment, fluid management, proper technique, and overall care of the patient have been emphasized and improved. For the present article, a systematic review of the relevant literature regarding patient workup, tumescent fluid techniques, medication overview, and operative technique was conducted with a practical approach that the reader will possibly find clinically applicable. Recent trends regarding energy-assisted liposuction and body contouring local anesthesia use are addressed. Deep venous thromboembolism prophylaxis is mentioned, as are other common and less common possible complications. The article provides a literature-supported overview on liposuction techniques with an emphasis on preoperative assessment, medicines used, operative technique, and outcomes.
Fungal Diversity in Tomato Rhizosphere Soil under Conventional and Desert Farming Systems
Kazerooni, Elham A.; Maharachchikumbura, Sajeewa S. N.; Rethinasamy, Velazhahan; Al-Mahrouqi, Hamed; Al-Sadi, Abdullah M.
2017-01-01
This study examined fungal diversity and composition in conventional (CM) and desert farming (DE) systems in Oman. Fungal diversity in the rhizosphere of tomato was assessed using 454-pyrosequencing and culture-based techniques. Both techniques produced variable results in terms of fungal diversity, with 25% of the fungal classes shared between the two techniques. In addition, pyrosequencing recovered more taxa compared to direct plating. These findings could be attributed to the ability of pyrosequencing to recover taxa that cannot grow or are slow growing on culture media. Both techniques showed that fungal diversity in the conventional farm was comparable to that in the desert farm. However, the composition of fungal classes and taxa in the two farming systems was different. Pyrosequencing revealed that Microsporidetes and Dothideomycetes are the two most common fungal classes in CM and DE, respectively. However, the culture-based technique revealed that Eurotiomycetes was the most abundant class in both farming systems, and some classes, such as Microsporidetes, were not detected by the culture-based technique. Although some plant pathogens (e.g., Pythium or Fusarium) were detected in the rhizosphere of tomato, the majority of fungal species in the rhizosphere of tomato were saprophytes. Our study shows that the cultivation system may have an impact on fungal diversity. The factors which affected fungal diversity in both farms are discussed. PMID:28824590
Miller, Brian W.; Van der Meeren, Anne; Tazrart, Anissa; Angulo, Jaime F.; Griffiths, Nina M.
2017-01-01
This work presents a comparison of three autoradiography techniques for imaging biological samples contaminated with actinides: emulsion-based, plastic-based autoradiography and a quantitative digital technique, the iQID camera, based on the numerical analysis of light from a scintillator screen. In radiation toxicology it has been important to develop means of imaging actinide distribution in tissues as these radionuclides may be heterogeneously distributed within and between tissues after internal contamination. Actinide distribution determines which cells are exposed to alpha radiation and is thus potentially critical for assessing absorbed dose. The comparison was carried out by generating autoradiographs of the same biological samples contaminated with actinides with the three autoradiography techniques. These samples were cell preparations or tissue sections collected from animals contaminated with different physico-chemical forms of actinides. The autoradiograph characteristics and the performances of the techniques were evaluated and discussed mainly in terms of acquisition process, activity distribution patterns, spatial resolution and feasibility of activity quantification. The obtained autoradiographs presented similar actinide distribution at low magnification. Out of the three techniques, emulsion autoradiography is the only one to provide a highly-resolved image of the actinide distribution inherently superimposed on the biological sample. Emulsion autoradiography is hence best interpreted at higher magnifications. However, this technique is destructive for the biological sample. Both emulsion- and plastic-based autoradiography record alpha tracks and thus enabled the differentiation between ionized forms of actinides and oxide particles. This feature can help in the evaluation of decorporation therapy efficacy. The most recent technique, the iQID camera, presents several additional features: real-time imaging, separate imaging of alpha particles and gamma rays, and alpha activity quantification. The comparison of these three autoradiography techniques showed that they are complementary and the choice of the technique depends on the purpose of the imaging experiment. PMID:29023595
López-Rodríguez, Patricia; Escot-Bocanegra, David; Poyatos-Martínez, David; Weinmann, Frank
2016-01-01
The trend in the last few decades is that current unmanned aerial vehicles are completely made of composite materials rather than metallic, such as carbon-fiber or fiberglass composites. From the electromagnetic point of view, this fact forces engineers and scientists to assess how these materials may affect their radar response or their electronics in terms of electromagnetic compatibility. In order to evaluate this, electromagnetic characterization of different composite materials has become a need. Several techniques exist to perform this characterization, all of them based on the utilization of different sensors for measuring different parameters. In this paper, an implementation of the metal-backed free-space technique, based on the employment of antenna probes, is utilized for the characterization of composite materials that belong to an actual drone. Their extracted properties are compared with those given by a commercial solution, an open-ended coaxial probe (OECP). The discrepancies found between both techniques along with a further evaluation of the methodologies, including measurements with a split-cavity resonator, conclude that the implemented free-space technique provides more reliable results for this kind of composites than the OECP technique. PMID:27347966
NASA Technical Reports Server (NTRS)
Powers, William O.
1987-01-01
A study of reduced chromium content in a nickel base superalloy via element substitution and rapid solidification processing was performed. The two elements used as partial substitutes for chromium were Si and Zr. The microstructure of conventionally solidified materials was characterized using microscopy techniques. These alloys were rapidly solidified using the chill block melt spinning technique, and the rapidly solidified microstructures were characterized using electron microscopy. The rapidly solidified microstructures were also assessed following heat treatments at 1033 and 1272 K. Rapidly solidified material of three alloys was reduced to particulate form and consolidated using hot isostatic pressing (HIP). The consolidated materials were also characterized using microscopy techniques. In order to evaluate the relative strengths of the consolidated alloys, compression tests were performed at room temperature and 1033 K on samples of as-HIPed and HIPed plus solution treated material. Yield strength, porosity, and oxidation resistance characteristics are given and compared.
NASA Astrophysics Data System (ADS)
Tong, Minh Q.; Hasan, M. Monirul; Gregory, Patrick D.; Shah, Jasmine; Park, B. Hyle; Hirota, Koji; Liu, Junze; Choi, Andy; Low, Karen; Nam, Jin
2017-02-01
We demonstrate a computationally efficient optical coherence elastography (OCE) method based on fringe washout. By introducing ultrasound in alternating depth profiles, we can obtain information on the mechanical properties of a sample within the acquisition of a single image. This can be achieved by simply comparing the intensity in adjacent depth profiles in order to quantify the degree of fringe washout. Phantom agar samples with various densities were measured and quantified by our OCE technique, and a correlation with Young's modulus measurements by atomic force microscopy (AFM) was observed. Knee cartilage samples from monoiodoacetate-induced arthritis (MIA) rat models were used to replicate cartilage damage, and our proposed OCE technique was applied alongside intensity and birefringence analyses and AFM measurements. The results indicate that the correlations of our OCE technique with polarization-sensitive OCT, AFM Young's modulus measurements, and histology were promising. Our OCE is applicable to any existing OCT system and is demonstrated to be computationally efficient.
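The fringe-washout comparison can be pictured as an intensity ratio between adjacent depth profiles acquired with and without ultrasound excitation: compliant regions move more during acquisition, wash out more, and lose more signal. The sketch below is only a schematic reading of that idea, with invented array names and a simple ratio metric; it is not the authors' algorithm.

```python
import numpy as np

def fringe_washout_index(aline_us_on, aline_us_off, eps=1e-12):
    """Relative intensity loss between two adjacent OCT depth profiles.

    aline_us_on  : intensity A-line acquired while ultrasound is applied
    aline_us_off : adjacent A-line acquired without ultrasound
    Returns a value near 0 for rigid samples (little washout) and closer to 1
    for compliant samples (strong washout). Purely illustrative.
    """
    ratio = np.mean(aline_us_on) / (np.mean(aline_us_off) + eps)
    return 1.0 - ratio
```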
Assessing uncertainties in superficial water provision by different bootstrap-based techniques
NASA Astrophysics Data System (ADS)
Rodrigues, Dulce B. B.; Gupta, Hoshin V.; Mendiondo, Eduardo Mario
2014-05-01
An assessment of water security can incorporate several water-related concepts, characterizing the interactions between societal needs, ecosystem functioning, and hydro-climatic conditions. The superficial freshwater provision level depends on the methods chosen for 'Environmental Flow Requirement' estimations, which integrate the sources of uncertainty in the understanding of how water-related threats to aquatic ecosystem security arise. Here, we develop an uncertainty assessment of superficial freshwater provision based on different bootstrap techniques (non-parametric resampling with replacement). To illustrate this approach, we use an agricultural basin (291 km2) within the Cantareira water supply system in Brazil monitored by one daily streamflow gage (24-year period). The original streamflow time series has been randomly resampled for different numbers of replicates and sample sizes (N = 500; ...; 1000) and applied to the conventional bootstrap approach and variations of this method, such as the 'nearest neighbor bootstrap' and the 'moving blocks bootstrap'. We have analyzed the impact of the sampling uncertainty on five Environmental Flow Requirement methods, based on: flow duration curves or probability of exceedance (Q90%, Q75% and Q50%); the 7-day 10-year low-flow statistic (Q7,10); and the presumptive standard (80% of the natural monthly mean flow). The bootstrap technique has also been used to compare those 'Environmental Flow Requirement' (EFR) methods among themselves, considering the difference between the bootstrap estimates and the "true" EFR characteristic, which has been computed by averaging the EFR values of the five methods and using the entire streamflow record at the monitoring station. This study evaluates the bootstrapping strategies, the representativeness of streamflow series for EFR estimates and their confidence intervals, in addition to an overview of the performance differences between the EFR methods. The uncertainties arising during the EFR methods assessment will be propagated through water security indicators referring to water scarcity and vulnerability, seeking to provide meaningful support to end-users and water managers facing the incorporation of uncertainties in the decision making process.
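As a concrete illustration of the conventional bootstrap applied to one Environmental Flow Requirement method, the sketch below resamples a daily streamflow record with replacement and forms a percentile confidence interval for Q90, the flow exceeded 90% of the time. The replicate count, sample size handling, and 95% percentile interval are assumptions for illustration and do not reproduce the study's exact configuration.

```python
import numpy as np

def bootstrap_q90(daily_flow, n_boot=1000, sample_size=None, seed=0):
    """Percentile bootstrap confidence interval for the Q90 low-flow statistic.

    Q90 is the flow exceeded 90% of the time, i.e. the 10th percentile of the
    flow record. Resampling with replacement mimics sampling uncertainty in
    the observed series (serial dependence is ignored in this simple sketch;
    the block and nearest-neighbor variants address that).
    """
    rng = np.random.default_rng(seed)
    n = sample_size or len(daily_flow)
    q90_samples = [
        np.percentile(rng.choice(daily_flow, size=n, replace=True), 10)
        for _ in range(n_boot)
    ]
    return np.percentile(q90_samples, [2.5, 97.5])
```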
Ito, Masako
The structural properties of bone include the micro- or nano-structural properties of trabecular and cortical bone, and macroscopic geometry. Radiological techniques are useful for analyzing bone structural properties; multi-detector row CT (MDCT) or high-resolution peripheral QCT (HR-pQCT) is available to analyze human bone in vivo. For the analysis of hip geometry, CT-based hip structure analysis (HSA) is available as well as DXA-based HSA. These structural parameters are related to biomechanical properties, and these assessment tools provide information on pathological changes or the effects of anti-osteoporotic agents on bone.
[Do ablative treatments modify the management of kidney tumors in the elderly?].
Long, J-A; Neuzillet, Y; Poissonnier, L; Lang, H; Paparel, P; Escudier, B; Rioux-Leclercq, N; Correas, J-M; Mejean, A; Baumert, H; Soulié, M; Patard, J-J
2009-11-01
The development of ablative techniques in renal oncology has profoundly changed treatment of small renal tumors. The objective of this review of the literature was to assess the arguments for treating localized kidney tumors with these techniques in the elderly patient. The two techniques retained because of their recognized use, for all approaches, are radiofrequency and cryotherapy. The data in the literature report more frequent local recurrence with these techniques than with surgical excision and an advantage to cryotherapy over radiofrequency. There seems to be no difference in terms of metastatic progression. Morbidity is not insignificant, with major complications in slightly less than 10% of cases. Given the need to consider small tumors (<4 cm), the advantage in terms of life expectancy is challenged by series studying active monitoring of the oldest patients who present co-morbidities. At present, the indications should therefore be measured and based on a general assessment of the patient, with particular consideration of the existing co-morbidities so as not to treat a patient while imposing undue complications. (c) 2009 Elsevier Masson SAS. All rights reserved.
Mali, Matilda; Dell'Anna, Maria Michela; Mastrorilli, Piero; Damiani, Leonardo; Ungaro, Nicola; Belviso, Claudia; Fiore, Saverio
2015-11-01
Sediment contamination by metals poses significant risks to coastal ecosystems and is considered to be problematic for dredging operations. The determination of the background values of metal and metalloid distribution based on site-specific variability is fundamental in assessing pollution levels in harbour sediments. The novelty of the present work consists of addressing the scope and limitation of analysing port sediments through the use of conventional statistical techniques (such as: linear regression analysis, construction of cumulative frequency curves and the iterative 2σ technique), that are commonly employed for assessing Regional Geochemical Background (RGB) values in coastal sediments. This study ascertained that although the tout court use of such techniques in determining the RGB values in harbour sediments seems appropriate (the chemical-physical parameters of port sediments fit well with statistical equations), it should nevertheless be avoided because it may be misleading and can mask key aspects of the study area that can only be revealed by further investigations, such as mineralogical and multivariate statistical analyses. Copyright © 2015 Elsevier Ltd. All rights reserved.
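Of the conventional techniques listed above, the iterative 2σ technique is the most algorithmic: compute the mean and standard deviation of the concentrations, discard values outside mean ± 2σ, and repeat until no further values are removed, taking the surviving mean as the background estimate. A generic sketch under that reading is given below; it is not the authors' implementation.

```python
import numpy as np

def iterative_2sigma_background(concentrations, max_iter=50):
    """Estimate a geochemical background value by iterative 2-sigma trimming."""
    data = np.asarray(concentrations, dtype=float)
    for _ in range(max_iter):
        mean, sd = data.mean(), data.std()
        kept = data[np.abs(data - mean) <= 2 * sd]
        if len(kept) == len(data):   # converged: nothing removed this pass
            break
        data = kept
    return data.mean(), data.std(), len(data)
```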
Srivastava, Kshama; Soin, Seepika; Sapra, B K; Ratna, P; Datta, D
2017-11-01
The occupational exposure incurred by radiation workers due to external radiation is estimated using a personal dosemeter placed on the human body during the monitoring period. In certain situations, it is required to determine whether the dosemeter alone was exposed accidentally/intentionally in a radiation field (static exposure) or was exposed while being worn by a worker moving in his workplace (dynamic exposure). The present thermoluminescent (TL) based personnel monitoring systems are not capable of distinguishing between the above stated (static and dynamic) exposure conditions. The feasibility of a new methodology developed using a charge coupled device-based imaging technique for identification of the static/dynamic exposure of CaSO4:Dy based TL detectors for low energy photons has been investigated. The techniques for the qualitative and the quantitative assessments of the exposure conditions are presented in this paper. © The Author 2017. Published by Oxford University Press. All rights reserved. For Permissions, please email: journals.permissions@oup.com.
NASA Technical Reports Server (NTRS)
Kaufman, Y. J.; Tanre, D.; Dubovik, O.; Karnieli, A.; Remer, L. A.; Einaudi, Franco (Technical Monitor)
2000-01-01
The ability of dust to absorb solar radiation and heat the atmosphere is one of the main uncertainties in climate modeling and the prediction of climate change. Dust absorption is not well known due to limitations of in situ measurements. New techniques to measure dust absorption are needed in order to assess the impact of dust on climate. Here we report two new independent remote sensing techniques that provide sensitive measurements of dust absorption. Both are based on remote sensing. One uses satellite spectral measurements, the second uses ground based sky measurements from the AERONET network. Both techniques demonstrate that Saharan dust absorption of solar radiation is several times smaller than the current international standards. Dust cooling of the earth system in the solar spectrum is therefore significantly stronger than recent calculations indicate. We shall also address the issue of the effects of dust non-sphericity on the aerosol optical properties.
Training and certification in endobronchial ultrasound-guided transbronchial needle aspiration
Konge, Lars; Nayahangan, Leizl Joy; Clementsen, Paul Frost
2017-01-01
Endobronchial ultrasound-guided transbronchial needle aspiration (EBUS-TBNA) plays a key role in the staging of lung cancer, which is crucial for allocation to surgical treatment. EBUS-TBNA is a complicated procedure, and simulation-based training is helpful in the first part of the long learning curve prior to performing the procedure on actual patients. New trainees should follow a structured training programme consisting of training on simulators to proficiency, as assessed with a validated test, followed by supervised practice on patients. Simulation-based training is superior to the traditional apprenticeship model and is recommended in the newest guidelines. EBUS-TBNA and oesophageal ultrasound-guided fine needle aspiration (EUS-FNA or EUS-B-FNA) are complementary to each other, and the combined techniques are superior to either technique alone. It is logical to learn and to perform the two techniques in combination; however, for lung cancer staging only EBUS-TBNA simulators currently exist, although simulation-based training in EUS will hopefully become possible in the future. PMID:28840013
Life-assessment technique for nuclear power plant cables
NASA Astrophysics Data System (ADS)
Bartoníček, B.; Hnát, V.; Plaček, V.
1998-06-01
The condition of polymer-based cable material can best be characterized by measuring the elongation at break of its insulating materials. However, it is often not possible to take sufficiently large samples for measurement with a tensile testing machine. The problem has been conveniently solved by utilizing the differential scanning calorimetry technique. From the tested cable, several microsamples are taken and the oxidation induction time (OIT) is determined. For each cable which is subject to the assessment of its lifetime, the correlation of OIT with elongation at break and the correlation of elongation at break with the cable service time have to be established. A reliable assessment of the cable lifetime depends on the accuracy of these correlations. Consequently, synergistic effects that are well known at this time - dose rate effects and effects resulting from the different sequence of applying radiation and elevated temperature - must be taken into account.
NASA Technical Reports Server (NTRS)
Kenny, R. Jeremy; Casiano, Matthew; Fischbach, Sean; Hulka, James R.
2012-01-01
Liquid rocket engine combustion stability assessments are traditionally broken into three categories: dynamic stability, spontaneous stability, and rough combustion. This work focuses on comparing the spontaneous stability and rough combustion assessments for several liquid engine programs. The techniques used are those developed at Marshall Space Flight Center (MSFC) for the J-2X Workhorse Gas Generator program. Stability assessment data from the Integrated Powerhead Demonstrator (IPD), FASTRAC, and Common Extensible Cryogenic Engine (CECE) programs are compared against previously processed J-2X Gas Generator data. Prior metrics for spontaneous stability assessments are updated based on the compilation of all data sets.
A Novel Rules Based Approach for Estimating Software Birthmark
Binti Alias, Norma; Anwar, Sajid
2015-01-01
A software birthmark is an inherent characteristic of a program that can be used to detect software theft. Comparing the birthmarks of two programs can tell us whether one is a copy of the other. Software theft and piracy are rapidly increasing problems involving the copying, stealing, and misuse of software without proper permission, contrary to the terms of the license agreement. The estimation of a birthmark can play a key role in understanding its effectiveness. In this paper, a new technique is presented to evaluate and estimate a software birthmark based on the two most sought-after properties of birthmarks, namely credibility and resilience. For this purpose, soft computing concepts such as probabilistic and fuzzy computing are taken into account, and fuzzy logic is used to estimate the properties of the birthmark. The proposed fuzzy rule-based technique is validated through a case study, and the results show that the technique is successful in assessing the specified properties of the birthmark, its resilience and credibility. This, in turn, shows how much effort will be required to detect the originality of the software based on its birthmark. PMID:25945363
NASA Astrophysics Data System (ADS)
Lo, Men-Tzung; Hu, Kun; Liu, Yanhui; Peng, C.-K.; Novak, Vera
2008-12-01
Quantification of nonlinear interactions between two nonstationary signals presents a computational challenge in different research fields, especially for assessments of physiological systems. Traditional approaches that are based on theories of stationary signals cannot resolve nonstationarity-related issues and, thus, cannot reliably assess nonlinear interactions in physiological systems. In this review we discuss a new technique called the multimodal pressure flow (MMPF) method that utilizes Hilbert-Huang transformation to quantify the interaction between nonstationary cerebral blood flow velocity (BFV) and blood pressure (BP) for the assessment of dynamic cerebral autoregulation (CA). CA is an important mechanism responsible for controlling cerebral blood flow in response to fluctuations in systemic BP within a few heart-beats. The MMPF analysis decomposes BP and BFV signals into multiple empirical modes adaptively so that the fluctuations caused by a specific physiologic process can be represented in a corresponding empirical mode. Using this technique, we showed that dynamic CA can be characterized by specific phase delays between the decomposed BP and BFV oscillations, and that the phase shifts are significantly reduced in hypertensive, diabetic, and stroke subjects with impaired CA. Additionally, the new technique can reliably assess CA using both induced BP/BFV oscillations during clinical tests and spontaneous BP/BFV fluctuations during resting conditions.
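For a single pair of already-decomposed BP and BFV modes, the phase-delay idea can be illustrated by extracting instantaneous phases with the Hilbert transform and averaging the wrapped phase difference. The sketch below uses SciPy's hilbert for that step; the averaging is a simplification for illustration and is not a reimplementation of the MMPF procedure.

```python
import numpy as np
from scipy.signal import hilbert

def mean_phase_shift(bp_mode, bfv_mode):
    """Mean instantaneous phase difference (radians) between two oscillatory modes.

    bp_mode, bfv_mode : same-length arrays, e.g. one empirical mode of blood
    pressure and the corresponding mode of blood flow velocity.
    """
    phase_bp = np.angle(hilbert(bp_mode))
    phase_bfv = np.angle(hilbert(bfv_mode))
    diff = np.angle(np.exp(1j * (phase_bfv - phase_bp)))  # wrap to [-pi, pi]
    return np.mean(diff)
```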
Self-Care Behaviors in Heart Failure.
Cavalcante, Agueda Maria Ruiz Zimmer; Lopes, Camila Takao; Brunori, Evelise Fadini Reis; Swanson, Elizabeth; Moorhead, Sue Ann; Bachion, Maria Márcia; de Barros, Alba Lucia Bottura Leite
2017-05-18
To identify the self-care behaviors and the instruments, techniques, and parameters used for the assessment of self-care behaviors in people with heart failure, and to compare these behaviors with the indicators of the Nursing Outcomes Classification outcome, Self Management: Cardiac Disease. An integrative literature review was performed in Lilacs, Medline, CINAHL, and Cochrane, including publications from 2009 to 2015. One thousand six hundred ninety-one articles were retrieved from the search, of which 165 were selected for analysis. Ten self-care behaviors and several different assessment instruments, techniques, and parameters were identified. The addition and removal of some indicators are proposed based on this review. The data provide a substrate for the development of conceptual and operational definitions of the indicators, making the outcome more applicable for use in clinical practice. © 2017 NANDA International, Inc.
Assessment of blood-brain barrier penetration: in silico, in vitro and in vivo.
Feng, Meihua Rose
2002-12-01
The amount of drug achieved and maintained in the brain after systemic administration is determined by the agent's permeability at blood-brain barrier (BBB), potential involvement of transport systems, and the distribution, metabolism and elimination properties. Passive diffusion permeability may be predicted by an in silico method based on a molecule's structure property. In vitro cell culture is another useful tool for the assessment of passive permeability and BBB transports (e.g. PGP, MRP). In situ or in vivo techniques like carotid artery single injection or perfusion, brain microdialysis, autoradiography, and others are used at various stages of drug discovery and development to estimate CNS penetration and PK/PD correlation. Each technique has its own application with specific advantages and limitations.
Plasticity models of material variability based on uncertainty quantification techniques
DOE Office of Scientific and Technical Information (OSTI.GOV)
Jones, Reese E.; Rizzi, Francesco; Boyce, Brad
The advent of fabrication techniques like additive manufacturing has focused attention on the considerable variability of material response due to defects and other micro-structural aspects. This variability motivates the development of an enhanced design methodology that incorporates inherent material variability to provide robust predictions of performance. In this work, we develop plasticity models capable of representing the distribution of mechanical responses observed in experiments using traditional plasticity models of the mean response and recently developed uncertainty quantification (UQ) techniques. Lastly, we demonstrate that the new method provides predictive realizations that are superior to more traditional ones, and how these UQ techniques can be used in model selection and assessing the quality of calibrated physical parameters.
Cox, Kieran D; Black, Morgan J; Filip, Natalia; Miller, Matthew R; Mohns, Kayla; Mortimor, James; Freitas, Thaise R; Greiter Loerzer, Raquel; Gerwing, Travis G; Juanes, Francis; Dudas, Sarah E
2017-12-01
Diversity estimates play a key role in ecological assessments. Species richness and abundance are commonly used to generate complex diversity indices that are dependent on the quality of these estimates. As such, there is a long-standing interest in the development of monitoring techniques, their ability to adequately assess species diversity, and the implications for generated indices. To determine the ability of substratum community assessment methods to capture species diversity, we evaluated four methods: photo quadrat, point intercept, random subsampling, and full quadrat assessments. Species density, abundance, richness, Shannon diversity, and Simpson diversity were then calculated for each method. We then conducted a method validation at a subset of locations to serve as an indication for how well each method captured the totality of the diversity present. Density, richness, Shannon diversity, and Simpson diversity estimates varied between methods, despite assessments occurring at the same locations, with photo quadrats detecting the lowest estimates and full quadrat assessments the highest. Abundance estimates were consistent among methods. Sample-based rarefaction and extrapolation curves indicated that differences between Hill numbers (richness, Shannon diversity, and Simpson diversity) were significant in the majority of cases, and coverage-based rarefaction and extrapolation curves confirmed that these dissimilarities were due to differences between the methods, not the sample completeness. Method validation highlighted the inability of the tested methods to capture the totality of the diversity present, while further supporting the notion of extrapolating abundances. Our results highlight the need for consistency across research methods, the advantages of utilizing multiple diversity indices, and potential concerns and considerations when comparing data from multiple sources.
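The indices compared in the study follow directly from abundance counts recorded by each method. A minimal sketch of species richness, Shannon diversity, and Simpson diversity (the familiar index forms of the Hill-number orders mentioned above) is given below; the example counts are hypothetical and the code is a generic illustration, not the analysis pipeline used in the paper.

```python
import numpy as np

def diversity_indices(abundances):
    """Richness, Shannon diversity (H'), and Simpson diversity (1 - D)."""
    counts = np.asarray([a for a in abundances if a > 0], dtype=float)
    p = counts / counts.sum()
    richness = len(counts)
    shannon = -np.sum(p * np.log(p))
    simpson = 1.0 - np.sum(p ** 2)
    return richness, shannon, simpson

# Hypothetical quadrat counts for illustration:
print(diversity_indices([12, 5, 3, 1, 1]))
```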
Simultaneous computation of jet turbulence and noise
NASA Technical Reports Server (NTRS)
Berman, C. H.; Ramos, J. I.
1989-01-01
The existing flow computation methods, wave computation techniques, and theories based on noise source models are reviewed in order to assess the capabilities of numerical techniques to compute jet turbulence noise and understand the physical mechanisms governing it over a range of subsonic and supersonic nozzle exit conditions. In particular, attention is given to (1) methods for extrapolating near field information, obtained from flow computations, to the acoustic far field and (2) the numerical solution of the time-dependent Lilley equation.
Identification of Terrestrial Reflectance From Remote Sensing
NASA Technical Reports Server (NTRS)
Alter-Gartenberg, Rachel; Nolf, Scott R.; Stacy, Kathryn (Technical Monitor)
2000-01-01
Correcting for atmospheric effects is an essential part of surface-reflectance recovery from radiance measurements. Model-based atmospheric correction techniques enable an accurate identification and classification of terrestrial reflectances from multi-spectral imagery. Successful and efficient removal of atmospheric effects from remote-sensing data is a key factor in the success of Earth observation missions. This report assesses the performance, robustness and sensitivity of two atmospheric-correction and reflectance-recovery techniques as part of an end-to-end simulation of hyper-spectral acquisition, identification and classification.
Silicon ribbon technology assessment 1978-1986 - A computer-assisted analysis using PECAN
NASA Technical Reports Server (NTRS)
Kran, A.
1978-01-01
The paper presents a 1978-1986 economic outlook for silicon ribbon technology based on the capillary action shaping technique. The outlook is presented within the framework of two sets of scenarios, which develop strategy for approaching the 1986 national energy capacity cost objective of $0.50/WE peak. The PECAN (Photovoltaic Energy Conversion Analysis) simulation technique is used to develop a 1986 sheet material price ($50/sq m) which apparently can be attained without further scientific breakthrough.
Delgado San Martin, J A; Worthington, P; Yates, J W T
2015-04-01
Subcutaneous tumour xenograft volumes are generally measured using callipers. This method is susceptible to inter- and intra-observer variability and systematic inaccuracies. Non-invasive 3D measurement using ultrasound and magnetic resonance imaging (MRI) has been considered, but requires immobilization of the animal. An infrared-based 3D time-of-flight (3DToF) camera was used to acquire a depth map of tumour-bearing mice. A semi-automatic algorithm based on parametric surfaces was applied to estimate tumour volume. Four clay mouse models and 18 tumour-bearing mice were assessed using callipers (applying both prolate spheroid and ellipsoid models) and 3DToF methods, and validated using tumour weight. Inter-experimentalist variability could be up to 25% in the calliper method. Experimental results demonstrated good consistency and relatively low error rates for the 3DToF method, in contrast to biased overestimation using callipers. Accuracy is currently limited by camera performance; however, we anticipate that the next generation of 3DToF cameras will be able to support the development of a practical system. Here, we describe an initial proof of concept for a non-invasive, non-immobilized, morphology-independent, economical and potentially more precise tumour volume assessment technique. This affordable technique should maximize the data points per animal, reduce the number of animals required in experiments, and reduce their distress. © The Author(s) 2014 Reprints and permissions: sagepub.co.uk/journalsPermissions.nav.
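The two calliper models mentioned above are simple geometric formulas: the prolate spheroid model uses tumour length and width only, while the ellipsoid model also uses height. A sketch of both, under the common π/6 convention, is shown below; the abstract does not restate the exact constants used in the study, so treat them as assumptions.

```python
import math

def prolate_spheroid_volume(length_mm, width_mm):
    """Calliper volume, prolate spheroid model: V = (pi/6) * L * W^2."""
    return math.pi / 6 * length_mm * width_mm ** 2

def ellipsoid_volume(length_mm, width_mm, height_mm):
    """Calliper volume, ellipsoid model: V = (pi/6) * L * W * H."""
    return math.pi / 6 * length_mm * width_mm * height_mm
```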
Microalgae harvesting techniques: A review.
Singh, Gulab; Patidar, S K
2018-07-01
Microalgae, with a wide range of commercial applications, have attracted a lot of attention from researchers in the last few decades. However, microalgae utilization is not economically sustainable due to the high cost of harvesting. A wide range of solid-liquid separation techniques is available for microalgae harvesting. The techniques include coagulation and flocculation, flotation, centrifugation and filtration, or a combination of various techniques. Despite the importance of harvesting to the economics and energy balance, there is no universal harvesting technique for microalgae. Therefore, this review focuses on assessing the technical, economic and application potential of various harvesting techniques so as to allow selection of an appropriate technology for cost-effective harvesting of microalgae from their culture medium. Various harvesting and concentrating techniques for microalgae were reviewed to suggest an order of suitability of the techniques for four main microalgae applications, i.e. biofuel, human and animal food, high-value products, and water quality restoration. To decide the order of suitability, a comparative analysis of the various harvesting techniques based on six common criteria (i.e. biomass quality, cost, biomass quantity, processing time, species specificity and toxicity) has been carried out. Based on the ranking of the various techniques against these criteria and the preferred order of criteria for each application, the order of suitability of harvesting techniques for the various applications has been decided. Among the various harvesting techniques, coagulation and flocculation, centrifugation and filtration were found to be the most suitable for the considered applications. These techniques may be used alone or in combination to increase the harvesting efficiency. Copyright © 2018 Elsevier Ltd. All rights reserved.
Synchrophasor-Assisted Prediction of Stability/Instability of a Power System
NASA Astrophysics Data System (ADS)
Saha Roy, Biman Kumar; Sinha, Avinash Kumar; Pradhan, Ashok Kumar
2013-05-01
This paper presents a technique for real-time prediction of stability/instability of a power system based on synchrophasor measurements obtained from phasor measurement units (PMUs) at generator buses. For stability assessment, the technique makes use of system severity indices developed from the bus voltage magnitudes obtained from PMUs and the generator electrical power. Generator power is computed from system information together with the voltage and current phasors provided by the PMUs. System instability is predicted when the indices exceed a threshold value. A case study on the New England 10-generator, 39-bus system is used to validate the performance of the technique.
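As a rough illustration of the phasor arithmetic involved, the sketch below computes active power from PMU voltage and current phasors and flags an alarm when an index crosses a threshold; the function names, per-unit convention and threshold logic are assumptions, not the paper's severity-index formulation.

```python
import numpy as np

def generator_power_mw(v_mag_pu, v_ang_rad, i_mag_pu, i_ang_rad, s_base_mva=100.0):
    """Active power at a generator bus from PMU voltage/current phasors.
    A generic phasor calculation, not the paper's exact index."""
    v = v_mag_pu * np.exp(1j * v_ang_rad)
    i = i_mag_pu * np.exp(1j * i_ang_rad)
    return (v * np.conj(i)).real * s_base_mva

def severity_alarm(index_value, threshold):
    # Instability is flagged when the severity index exceeds its threshold
    return index_value > threshold

# Illustrative phasors and index value
print(round(generator_power_mw(1.02, 0.10, 0.95, -0.05), 1))  # ~95.8 MW
print(severity_alarm(0.82, threshold=0.75))                   # True
```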
Development and use of the incremental twitch subtraction MUNE method in mice.
Hegedus, Janka; Jones, Kelvin E; Gordon, Tessa
2009-01-01
We have used a technique to estimate the number of functioning motor units (MUNE) innervating a muscle in mice based on twitch tension. The MUNE technique was verified by modeling twitch tensions from isolated ventral root stimulation. Analysis by twitch tensions allowed us to identify motor unit fiber types. The MUNE technique was used to compare normal mice with transgenic superoxide dismutase-1 mutation (G94A) mice to assess the time course of motor unit loss with respect to fiber type. Motor unit loss was found to occur well in advance of behavioral changes and the degree of reinnervation is dependent upon motor unit fiber types.
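A minimal sketch of the incremental estimate, assuming the standard formulation in which the maximal twitch is divided by the mean increment between successive stimulus steps; the staircase values and units are hypothetical.

```python
import numpy as np

def incremental_mune(twitch_plateau_g, staircase_twitches_g):
    """Motor unit number estimate (MUNE) by incremental twitch subtraction:
    the mean increment between successive graded-stimulation steps
    approximates the average single-motor-unit twitch, and
    MUNE = maximal twitch / mean increment. Illustrative values only."""
    increments = np.diff(np.asarray(staircase_twitches_g, dtype=float))
    mean_single_unit_twitch = increments.mean()
    return twitch_plateau_g / mean_single_unit_twitch

# Hypothetical staircase of twitch tensions (g) from graded stimulation
steps = [0.0, 0.8, 1.7, 2.4, 3.3]
print(round(incremental_mune(60.0, steps)))  # ~73 motor units
```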
Multiple-choice examinations: adopting an evidence-based approach to exam technique.
Hammond, E J; McIndoe, A K; Sansome, A J; Spargo, P M
1998-11-01
Negatively marked multiple-choice questions (MCQs) are part of the assessment process in both the Primary and Final examinations for the fellowship of the Royal College of Anaesthetists. It is said that candidates who guess will lose marks in the MCQ paper. We studied candidates attending a pre-examination revision course and have shown that an evaluation of examination technique is an important part of an individual's preparation. All candidates benefited substantially from backing their educated guesses while only 3 out of 27 lost marks from backing their wild guesses. Failure to appreciate the relationship between knowledge and technique may significantly affect a candidate's performance in the examination.
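The arithmetic underlying this finding can be made explicit; the sketch below assumes the conventional +1/-1 marking of a negatively marked item (an assumption, not stated in the abstract) and shows why educated guesses gain marks on average while wild guesses break even.

```python
def expected_mark(p_correct, reward=1.0, penalty=1.0):
    # Expected mark for one negatively marked item:
    # E = p*reward - (1 - p)*penalty, positive whenever p > penalty/(reward + penalty)
    return p_correct * reward - (1.0 - p_correct) * penalty

print(expected_mark(0.5))  # 0.0 -> a wild guess breaks even on average
print(expected_mark(0.7))  # 0.4 -> an educated guess gains marks on average
```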
RTI in Middle School Classrooms: Proven Tools and Strategies
ERIC Educational Resources Information Center
Esteves, Kelli J.; Whitten, Elizabeth
2014-01-01
"RTI in Middle School Classrooms" provides practical, research-based instructional techniques and interventions--geared especially to middle school teachers and administrators--that target and address specific needs of individual students. Response to intervention allows educators to assess and meet the needs of struggling students…
Applying a Continuous Quality Improvement Model To Assess Institutional Effectiveness.
ERIC Educational Resources Information Center
Roberts, Keith
This handbook outlines techniques and processes for improving institutional effectiveness and ensuring continuous quality improvement, based on strategic planning activities at Wisconsin's Milwaukee Area Technical College (MATC). First, institutional effectiveness is defined and 17 core indicators of effectiveness developed by the Wisconsin…
Money Sense Makes a Difference.
ERIC Educational Resources Information Center
Varcoe, Karen P.; Wright, Joan
1990-01-01
Assesses the degree to which clients completing the Money Sense program adopted its family resource management techniques. Finds that, among 190 low income clients from rural California counties and military bases, there were significant positive changes in food shopping and money management behaviors and significant decreases in financial…
An Approach to Goal-Statement Evaluation
ERIC Educational Resources Information Center
Reiner, John R.; Robinson, Donald W.
1969-01-01
"The results of this study support the proposition that the application of environmental assessment techniques based on CUES items provides information which can help evaluate the formal goals of an institution in terms of the degree to which the institutional environment is facilitative of those goals. (Author)
Impact of sampling techniques on measured stormwater quality data for small streams
USDA-ARS?s Scientific Manuscript database
Science-based sampling methodologies are needed to enhance water quality characterization for developing Total Maximum Daily Loads (TMDLs), setting appropriate water quality standards, and managing nonpoint source pollution. Storm event sampling, which is vital for adequate assessment of water qual...
EPA's Office of Research and Development (ORD) develops innovative methods for use in environmental monitoring and assessment by scientists in Regions, states, and Tribes. Molecular-biology-based methods are not yet established in the environmental monitoring "tool box". SRI (Sci...
Quantitative Prediction of Systemic Toxicity Points of Departure (OpenTox USA 2017)
Human health risk assessment associated with environmental chemical exposure is limited by the tens of thousands of chemicals with little or no experimental in vivo toxicity data. Data gap filling techniques, such as quantitative models based on chemical structure information, are c...
Augmented assessment as a means to augmented reality.
Bergeron, Bryan
2006-01-01
Rigorous scientific assessment of educational technologies typically lags behind the availability of the technologies by years because of the lack of validated instruments and benchmarks. Even when the appropriate assessment instruments are available, they may not be applied because of time and monetary constraints. Work in augmented reality, instrumented mannequins, serious gaming, and similar promising educational technologies that have not undergone timely, rigorous evaluation highlights the need for assessment methodologies that address the limitations of traditional approaches. The most promising augmented assessment solutions incorporate elements of rapid prototyping used in the software industry, simulation-based assessment techniques modeled after methods used in bioinformatics, and object-oriented analysis methods borrowed from object-oriented programming.
EMG Processing Based Measures of Fatigue Assessment during Manual Lifting.
Shair, E F; Ahmad, S A; Marhaban, M H; Mohd Tamrin, S B; Abdullah, A R
2017-01-01
Manual lifting is one of the common practices used in industry to transport or move objects to a desired place. Nowadays, even though mechanized equipment is widely available, manual lifting is still considered an essential way to perform material handling tasks. Improper lifting strategies may contribute to musculoskeletal disorders (MSDs), in which overexertion is the largest contributing factor. To address this problem, the electromyography (EMG) signal is used to monitor the workers' muscle condition and to find the maximum lifting load, lifting height and number of repetitions that workers can handle before experiencing fatigue, so as to avoid overexertion. Past researchers have introduced several EMG processing techniques and different EMG features that represent fatigue indices in the time, frequency, and time-frequency domains. The impact of EMG-processing-based measures on fatigue assessment during manual lifting is reviewed in this paper. It is believed that this paper will greatly benefit researchers who need a bird's-eye view of the biosignal processing techniques currently available, thus helping them determine the best possible techniques for lifting applications.
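As one concrete example of a frequency-domain fatigue index of the kind reviewed here, the sketch below computes the median frequency of an EMG epoch with Welch's method; the sampling rate, epoch length and synthetic signal are illustrative, not taken from any particular study.

```python
import numpy as np
from scipy.signal import welch

def median_frequency(emg, fs=1000.0):
    """Median frequency (MDF) of an EMG epoch, a common frequency-domain
    fatigue index: a downward MDF shift over successive lifts suggests
    developing muscle fatigue. A generic sketch, not a specific protocol."""
    f, psd = welch(emg, fs=fs, nperseg=min(1024, len(emg)))
    cumulative = np.cumsum(psd)
    return f[np.searchsorted(cumulative, cumulative[-1] / 2.0)]

# Illustrative synthetic epoch (4 s at 1 kHz)
rng = np.random.default_rng(0)
print(median_frequency(rng.standard_normal(4000), fs=1000.0))
```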
Chemical composition analysis and authentication of whisky.
Wiśniewska, Paulina; Dymerski, Tomasz; Wardencki, Waldemar; Namieśnik, Jacek
2015-08-30
Whisky (whiskey) is one of the most popular spirit-based drinks, made from malted or saccharified grains, which should mature for at least 3 years in wooden barrels. The high popularity of such products usually brings a potential risk of adulteration. Thus, authenticity assessment is one of the key elements of food product marketing. Authentication of whisky is based on comparing the composition of this alcohol with that of other spirit drinks. The present review summarizes information on the comparison of whisky with other alcoholic beverages, the identification of the type of whisky, the assessment of its quality and, finally, the authentication of whisky. The article also presents the various techniques used for analyzing whisky, such as gas and liquid chromatography with different types of detectors (FID, AED, UV-Vis), electronic nose, atomic absorption spectroscopy and mass spectrometry. In some cases the application of chemometric methods is also described, namely PCA, DFA, LDA, ANOVA, SIMCA, PNN, k-NN and CA, as well as preparation techniques such as SPME or SPE. © 2014 Society of Chemical Industry.
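Principal component analysis (PCA) is the first chemometric method listed; a minimal, generic sketch of projecting composition data onto two components is shown below, with a purely synthetic data matrix and no connection to the datasets reviewed.

```python
import numpy as np
from sklearn.decomposition import PCA

# Rows = samples (e.g. whiskies and other spirits), columns = measured
# compounds/features; this matrix is purely illustrative.
X = np.random.default_rng(1).random((20, 8))

pca = PCA(n_components=2)
scores = pca.fit_transform(X)          # sample coordinates in PC space
print(pca.explained_variance_ratio_)   # variance captured by PC1 and PC2
```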
Kushniruk, A W; Patel, C; Patel, V L; Cimino, J J
2001-04-01
The World Wide Web provides an unprecedented opportunity for widespread access to health-care applications by both patients and providers. The development of new methods for assessing the effectiveness and usability of these systems is becoming a critical issue. This paper describes the distance evaluation (i.e. 'televaluation') of emerging Web-based information technologies. In health informatics evaluation, there is a need for application of new ideas and methods from the fields of cognitive science and usability engineering. A framework is presented for conducting evaluations of health-care information technologies that integrates a number of methods, ranging from deployment of on-line questionnaires (and Web-based forms) to remote video-based usability testing of user interactions with clinical information systems. Examples illustrating application of these techniques are presented for the assessment of a patient clinical information system (PatCIS), as well as an evaluation of use of Web-based clinical guidelines. Issues in designing, prototyping and iteratively refining evaluation components are discussed, along with description of a 'virtual' usability laboratory.
Physics-Based Fragment Acceleration Modeling for Pressurized Tank Burst Risk Assessments
NASA Technical Reports Server (NTRS)
Manning, Ted A.; Lawrence, Scott L.
2014-01-01
As part of comprehensive efforts to develop physics-based risk assessment techniques for space systems at NASA, coupled computational fluid and rigid body dynamic simulations were carried out to investigate the flow mechanisms that accelerate tank fragments in bursting pressurized vessels. Simulations of several configurations were compared to analyses based on the industry-standard Baker explosion model, and were used to formulate an improved version of the model. The standard model, which neglects an external fluid, was found to agree best with simulation results only in configurations where the internal-to-external pressure ratio is very high and fragment curvature is small. The improved model introduces terms that accommodate an external fluid and better account for variations based on circumferential fragment count. Physics-based analysis was critical in increasing the model's range of applicability. The improved tank burst model can be used to produce more accurate risk assessments of space vehicle failure modes that involve high-speed debris, such as exploding propellant tanks and bursting rocket engines.
Characterization of Model-Based Reasoning Strategies for Use in IVHM Architectures
NASA Technical Reports Server (NTRS)
Poll, Scott; Iverson, David; Patterson-Hine, Ann
2003-01-01
Open architectures are gaining popularity for Integrated Vehicle Health Management (IVHM) applications due to the diversity of subsystem health monitoring strategies in use and the need to integrate a variety of techniques at the system health management level. The basic concept of an open architecture suggests that whatever monitoring or reasoning strategy a subsystem wishes to deploy, the system architecture will support the needs of that subsystem and will be capable of transmitting subsystem health status across subsystem boundaries and up to the system level for system-wide fault identification and diagnosis. There is a need to understand the capabilities of various reasoning engines and how they, coupled with intelligent monitoring techniques, can support fault detection and system level fault management. Researchers in IVHM at NASA Ames Research Center are supporting the development of an IVHM system for liquefying-fuel hybrid rockets. In the initial stage of this project, a few readily available reasoning engines were studied to assess candidate technologies for application in next generation launch systems. Three tools representing the spectrum of model-based reasoning approaches, from a quantitative simulation based approach to a graph-based fault propagation technique, were applied to model the behavior of the Hybrid Combustion Facility testbed at Ames. This paper summarizes the characterization of the modeling process for each of the techniques.
Assessment of Intervertebral Disc Degeneration Based on Quantitative MRI Analysis: an in vivo study
Grunert, Peter; Hudson, Katherine D.; Macielak, Michael R.; Aronowitz, Eric; Borde, Brandon H.; Alimi, Marjan; Njoku, Innocent; Ballon, Douglas; Tsiouris, Apostolos John; Bonassar, Lawrence J.; Härtl, Roger
2015-01-01
Study design: Animal experimental study. Objective: To evaluate a novel quantitative imaging technique for assessing disc degeneration. Summary of Background Data: T2-relaxation time (T2-RT) measurements have been used to quantitatively assess disc degeneration. T2 values correlate with the water content of intervertebral disc tissue and thereby allow for the indirect measurement of nucleus pulposus (NP) hydration. Methods: We developed an algorithm to subtract out MRI voxels not representing NP tissue based on T2-RT values. Filtered NP voxels were used to measure nuclear size by their number and nuclear hydration by their mean T2-RT. This technique was applied to 24 rat-tail intervertebral discs (IVDs), which had been punctured with an 18-gauge needle according to different techniques to induce varying degrees of degeneration. NP voxel count and average T2-RT were used as parameters to assess the degeneration process at 1 and 3 months post puncture. NP voxel counts were evaluated against X-ray disc height measurements and qualitative MRI studies based on the Pfirrmann grading system. Tails were collected for histology to correlate NP voxel counts to histological disc degeneration grades and to NP cross-sectional area measurements. Results: NP voxel count measurements showed strong correlations with qualitative MRI analyses (R2=0.79, p<0.0001), histological degeneration grades (R2=0.902, p<0.0001) and histological NP cross-sectional area measurements (R2=0.887, p<0.0001). In contrast to NP voxel counts, the mean T2-RT for each punctured group remained constant between months 1 and 3. The mean T2-RTs for the punctured groups did not differ significantly from those of healthy IVDs (63.55 ms ±5.88 ms at month 1 and 62.61 ms ±5.02 ms) at either time point. Conclusion: The NP voxel count proved to be a valid parameter to quantitatively assess disc degeneration in a needle puncture model. The mean NP T2-RT does not change significantly in needle-puncture-induced degenerated IVDs. IVDs can be segmented into different tissue components according to their innate T2-RT. PMID:24384655
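A minimal sketch of the voxel-filtering idea follows, assuming a simple T2 window rather than the study's calibrated algorithm; the threshold values and array shapes are placeholders.

```python
import numpy as np

def np_voxel_metrics(t2_map_ms, t2_min_ms, t2_max_ms):
    """Keep only voxels whose T2 relaxation time falls inside the range
    assumed to represent nucleus pulposus tissue, then report the two
    parameters used above: voxel count (nuclear size) and mean T2 (hydration).
    The window limits are placeholders, not the study's cut-offs."""
    mask = (t2_map_ms >= t2_min_ms) & (t2_map_ms <= t2_max_ms)
    voxels = t2_map_ms[mask]
    mean_t2 = float(voxels.mean()) if voxels.size else float("nan")
    return voxels.size, mean_t2

# Illustrative 3D T2 map (ms)
t2_map = np.random.default_rng(2).uniform(20.0, 90.0, size=(16, 16, 4))
count, mean_t2 = np_voxel_metrics(t2_map, 55.0, 90.0)
print(count, round(mean_t2, 1))
```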
Assessment of Medical Risks and Optimization of their Management using Integrated Medical Model
NASA Technical Reports Server (NTRS)
Fitts, Mary A.; Madurai, Siram; Butler, Doug; Kerstman, Eric; Risin, Diana
2008-01-01
The Integrated Medical Model (IMM) Project is a software-based technique that will identify and quantify the medical needs and health risks of exploration crew members during space flight and evaluate the effectiveness of potential mitigation strategies. The IMM Project employs an evidence-based approach that will quantify the probability and consequences of defined in-flight medical risks, mitigation strategies, and tactics to optimize crew member health. Using stochastic techniques, the IMM will ultimately inform decision makers at both programmatic and institutional levels and will enable objective assessment of crew health and optimization of mission success, using data from relevant cohort populations and from the astronaut population. The objectives of the project include: 1) identification and documentation of conditions that may occur during exploration missions (Baseline Medical Conditions List [BMCL]), 2) assessment of the likelihood of conditions in the BMCL occurring during exploration missions (incidence rate), 3) determination of the risk associated with these conditions, quantified in terms of end states (Loss of Crew, Loss of Mission, Evacuation), 4) optimization of in-flight hardware mass, volume, power, bandwidth and cost for a given level of risk or uncertainty, and 5) validation of the methodologies used.
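To indicate what a stochastic roll-up of per-condition incidence rates into an end-state probability might look like, here is a minimal Monte Carlo sketch; the rates, conditional probabilities and function names are illustrative assumptions, not values or methods from the IMM.

```python
import numpy as np

def p_end_state(incidence_per_mission, p_end_state_given_event, n_sims=100_000, seed=0):
    """Sample how many times each condition occurs in a mission (Poisson),
    then whether any occurrence leads to the end state (e.g. evacuation).
    All inputs are illustrative stand-ins."""
    rng = np.random.default_rng(seed)
    rates = np.asarray(incidence_per_mission, dtype=float)
    p_es = np.asarray(p_end_state_given_event, dtype=float)
    events = rng.poisson(rates, size=(n_sims, rates.size))
    p_no_es = (1.0 - p_es) ** events        # per-condition chance of avoiding the end state
    return 1.0 - p_no_es.prod(axis=1).mean()

# Three hypothetical conditions with hypothetical rates/probabilities
print(round(p_end_state([0.2, 0.05, 0.5], [0.01, 0.20, 0.002]), 4))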
NASA Astrophysics Data System (ADS)
Themistocleous, Kyriacos; Neocleous, Kyriacos; Pilakoutas, Kypros; Hadjimitsis, Diofantos G.
2014-08-01
The predominant approach for conducting road condition surveys and analyses is still largely based on extensive field observations. However, visual assessment alone cannot identify the actual extent and severity of damage. New non-invasive, cost-effective, non-destructive testing (NDT) and remote sensing technologies can be used to monitor road pavements across their life cycle, including remotely sensed aerial and satellite visual and thermal image data, Unmanned Aerial Vehicles (UAVs), spectroscopy and Ground Penetrating Radar (GPR). These non-contact techniques can be used to obtain surface and sub-surface information about damage in road pavements, including crack depth and in-depth structural failure. Thus, a smart and cost-effective methodology is required that integrates several of these non-destructive, non-contact techniques for damage assessment and monitoring at different levels. This paper presents an overview of how an integration of the above technologies can be used to conduct detailed road condition surveys. The proposed approach can also be used to predict future road maintenance needs; this information is valuable to strategic decision-making tools that optimize maintenance based on resources and environmental issues.
Otlewska, Anna; Adamiak, Justyna; Gutarowska, Beata
2014-01-01
As a result of their unpredictable ability to adapt to varying environmental conditions, microorganisms inhabit different types of biological niches on Earth. Owing to the key role of microorganisms in many biogeochemical processes, trends in modern microbiology emphasize the need to know and understand the structure and function of complex microbial communities. This is particularly important if the strategy relates to microbial communities that cause biodeterioration of materials that constitute our cultural heritage. Until recently, the detection and identification of microorganisms inhabiting objects of cultural value was based only on cultivation-dependent methods. In spite of many advantages, these methods provide limited information because they identify only viable organisms capable of growth under standard laboratory conditions. However, in order to carry out proper conservation and renovation, it is necessary to know the complete composition of microbial communities and their activity. This paper presents and characterizes modern techniques such as genetic fingerprinting and clone library construction for the assessment of microbial diversity based on molecular biology. Molecular methods represent a favourable alternative to culture-dependent methods and make it possible to assess the biodiversity of microorganisms inhabiting technical materials and cultural heritage objects.
NASA Astrophysics Data System (ADS)
Miller, D. J.; Zhang, Z.; Ackerman, A. S.; Platnick, S. E.; Cornet, C.
2016-12-01
A remote sensing cloud retrieval simulator, created by coupling an LES cloud model with vector radiative transfer (RT) models, is an ideal framework for assessing cloud remote sensing techniques. This simulator serves as a tool for understanding bi-spectral and polarimetric retrievals by comparing them directly to LES cloud properties (retrieval closure comparison) and for comparing the retrieval techniques to one another. Our simulator utilizes the DHARMA LES [Ackerman et al., 2004] with cloud properties based on marine boundary layer (MBL) clouds observed during the DYCOMS-II and ATEX field campaigns. The cloud reflectances are produced by vectorized RT models based on polarized doubling-adding and Monte Carlo techniques (PDA, MCPOL). Retrievals are performed using techniques as similar as possible to those implemented on their corresponding well-known instruments; polarimetric retrievals are based on techniques implemented for polarimeters (POLDER, AirMSPI, and RSP) and bi-spectral retrievals are performed using the Nakajima-King LUT method utilized on a number of spectral instruments (MODIS and VIIRS). Retrieval comparisons focus on cloud droplet effective radius (re), effective variance (ve), and cloud optical thickness (τ). This work explores the sensitivities of these two retrieval techniques to various observational limitations, such as spatial resolution/cloud inhomogeneity, the impact of 3D radiative effects, and angular resolution requirements. With future remote sensing missions like NASA's Aerosols/Clouds/Ecosystems (ACE) planning to feature advanced polarimetric instruments, it is important to understand how these retrieval techniques compare to one another. The cloud retrieval simulator we have developed allows us to probe these important questions in a realistically relevant test bed.
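A minimal sketch of the bi-spectral lookup-table idea is given below: given a visible and a shortwave-infrared reflectance, find the (τ, re) grid point whose precomputed reflectance pair is closest. The toy LUT here is synthetic and merely stands in for the output of a radiative transfer model.

```python
import numpy as np

def bispectral_retrieval(r_vis_obs, r_swir_obs, tau_grid, re_grid, lut_vis, lut_swir):
    """Nakajima-King-style lookup: pick the (tau, re) grid point whose
    precomputed VIS/SWIR reflectance pair is closest to the observation.
    LUT arrays have shape [n_tau, n_re]; everything here is illustrative."""
    cost = (lut_vis - r_vis_obs) ** 2 + (lut_swir - r_swir_obs) ** 2
    i, j = np.unravel_index(np.argmin(cost), cost.shape)
    return tau_grid[i], re_grid[j]

# Toy grids and a toy LUT (VIS mostly sensitive to tau, SWIR to re)
tau = np.linspace(1, 50, 25)
re = np.linspace(4, 30, 14)
lut_v = (1 - np.exp(-0.1 * tau))[:, None] * np.ones(re.size)[None, :]
lut_s = (1 - np.exp(-0.1 * tau))[:, None] * (10.0 / re)[None, :]
print(bispectral_retrieval(0.6, 0.4, tau, re, lut_v, lut_s))
```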
Flipping the Physical Examination: Web-Based Instruction and Live Assessment of Bedside Technique.
Williams, Dustyn E; Thornton, John W
2016-01-01
Physicians' skill in teaching the physical examination has decreased, with newer faculty underperforming compared to their seniors. Improved methods of instruction with an emphasis on the physical examination are necessary both to improve the quality of medical education and to alleviate the teaching burden of faculty physicians. We developed a curriculum that combines web-based instruction with real-life practice and features individualized feedback. This innovative medical education model should allow the physical examination to be taught and assessed in an effective manner. The model is under study at Baton Rouge General Medical Center. Our goals are to limit faculty burden, maximize student involvement as learners and evaluators, and effectively develop students' critical skills in performing bedside assessments.
Proceedings of the international meeting on thermal nuclear reactor safety. Vol. 1
DOE Office of Scientific and Technical Information (OSTI.GOV)
None
Separate abstracts are included for each of the papers presented concerning current issues in nuclear power plant safety; national programs in nuclear power plant safety; radiological source terms; probabilistic risk assessment methods and techniques; non-LOCA and small-break LOCA transients; safety goals; pressurized thermal shock; applications of reliability and risk methods to probabilistic risk assessment; human factors and man-machine interface; and databases and special applications.
Cao, Renzhi; Bhattacharya, Debswapna; Adhikari, Badri; Li, Jilong; Cheng, Jianlin
2015-01-01
Model evaluation and selection is an important step and a big challenge in template-based protein structure prediction. Individual model quality assessment methods designed for recognizing some specific properties of protein structures often fail to consistently select good models from a model pool because of their limitations. Therefore, combining multiple complementary quality assessment methods is useful for improving model ranking and consequently tertiary structure prediction. Here, we report the performance and analysis of our human tertiary structure predictor (MULTICOM), based on the massive integration of 14 diverse, complementary quality assessment methods, which was successfully benchmarked in the 11th Critical Assessment of Techniques for Protein Structure Prediction (CASP11). The predictions of MULTICOM for 39 template-based domains were rigorously assessed by six scoring metrics covering global topology of the Cα trace, local all-atom fitness, side chain quality, and physical reasonableness of the model. The results show that the massive integration of complementary, diverse single-model and multi-model quality assessment methods can effectively leverage the strength of single-model methods in distinguishing quality variation among similar good models and the advantage of multi-model quality assessment methods in identifying reasonable average-quality models. The overall excellent performance of the MULTICOM predictor demonstrates that integrating a large number of model quality assessment methods in conjunction with model clustering is a useful approach to improve the accuracy, diversity, and consequently robustness of template-based protein structure prediction. PMID:26369671
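As an illustration of one simple way such scores can be combined (not MULTICOM's actual integration scheme), the sketch below averages per-method z-scores to produce a consensus ranking of models; the score matrix is hypothetical.

```python
import numpy as np

def consensus_rank(scores):
    """Combine several quality assessment methods by averaging per-method
    z-scores, then rank models best-first. 'scores' is an [n_models, n_methods]
    array where larger means better; a generic consensus scheme."""
    scores = np.asarray(scores, dtype=float)
    z = (scores - scores.mean(axis=0)) / scores.std(axis=0)
    combined = z.mean(axis=1)
    return np.argsort(-combined), combined

# Hypothetical scores for three models from three QA methods
order, combined = consensus_rank([[0.80, 0.70, 0.65],
                                  [0.60, 0.72, 0.70],
                                  [0.40, 0.30, 0.35]])
print(order)  # model indices, best first
```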
Reliability of System Identification Techniques to Assess Standing Balance in Healthy Elderly
Maier, Andrea B.; Aarts, Ronald G. K. M.; van Gerven, Joop M. A.; Arendzen, J. Hans; Schouten, Alfred C.; Meskers, Carel G. M.; van der Kooij, Herman
2016-01-01
Objectives: System identification techniques have the potential to assess the contribution of the underlying systems involved in standing balance by applying well-known disturbances. We investigated the reliability of standing balance parameters obtained with multivariate closed-loop system identification techniques. Methods: In twelve healthy elderly participants, balance tests were performed twice a day on three days. Body sway was measured during two minutes of standing with eyes closed, and the Balance test Room (BalRoom) was used to apply four disturbances simultaneously: two sensory disturbances, to the proprioceptive and the visual system, and two mechanical disturbances, applied at the leg and trunk segments. Using system identification techniques, sensitivity functions of the sensory disturbances and the neuromuscular controller were estimated. Based on generalizability theory (G theory), systematic errors and sources of variability were assessed using linear mixed models, and reliability was assessed by computing indices of dependability (ID), the standard error of measurement (SEM) and the minimal detectable change (MDC). Results: A systematic error was found between the first and second trials in the sensitivity functions. No systematic error was found in the neuromuscular controller or body sway. The reliability of 15 of 25 parameters and of body sway was moderate to excellent when the results of two trials on three days were averaged. It was predicted that, to reach excellent reliability on a single day for 7 of the 25 parameters, at least seven trials must be averaged. Conclusion: This study shows that system identification techniques are a promising method to assess the underlying systems involved in standing balance in elderly people. However, most of the parameters do not appear to be reliable unless a large number of trials are collected across multiple days. To reach excellent reliability in one third of the parameters, a training session for participants is needed and at least seven trials of two minutes must be performed on one day. PMID:26953694
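For reference, the SEM and MDC mentioned above are conventionally computed as follows; this sketch uses the generic formulas, with the index of dependability standing in for the reliability coefficient, and the input values are illustrative.

```python
import math

def sem_and_mdc95(sd_between_trials, reliability_index):
    """Standard reliability formulas: SEM = SD * sqrt(1 - reliability) and
    MDC95 = 1.96 * sqrt(2) * SEM. The G-theory index of dependability plays
    the role of the reliability coefficient here; inputs are illustrative."""
    sem = sd_between_trials * math.sqrt(1.0 - reliability_index)
    mdc95 = 1.96 * math.sqrt(2.0) * sem
    return sem, mdc95

print(sem_and_mdc95(sd_between_trials=0.12, reliability_index=0.80))
```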
Studies of industrial emissions by accelerator-based techniques: A review of applications at CEDAD
NASA Astrophysics Data System (ADS)
Calcagnile, L.; Quarta, G.
2012-04-01
Different research activities are in progress at the Centre for Dating and Diagnostics (CEDAD), University of Salento, in the field of environmental monitoring, exploiting the potential offered by the different experimental beam lines implemented on the 3 MV Tandetron accelerator and dedicated to AMS (Accelerator Mass Spectrometry) radiocarbon dating and IBA (Ion Beam Analysis). An overview of these activities is presented, showing how accelerator-based analytical techniques can be a powerful tool for monitoring anthropogenic carbon dioxide emissions from industrial sources and for assessing the biogenic content of SRF (Solid Recovered Fuel) burned in WTE (Waste to Energy) plants.
Pabon, Peter; Ternström, Sten; Lamarche, Anick
2011-06-01
To describe a method for the unified description, statistical modeling, and comparison of voice range profile (VRP) contours, even from diverse sources. A morphologic modeling technique based on Fourier descriptors (FDs) is applied to the VRP contour. The technique, which essentially involves resampling the curve of the contour, is assessed and compared to density-based VRP averaging methods that use the overlap count. VRP contours can be usefully described and compared using FDs. The method also permits the visualization of the local covariation along the contour average. For example, the FD-based analysis shows that the population variance for ensembles of VRP contours is usually smallest at the upper left part of the VRP. To illustrate the method's advantages and possible further applications, graphs are given that compare the averaged contours from different authors and recording devices--for normal, trained, and untrained male and female voices as well as for child voices. The proposed technique allows any VRP shape to be brought to the same uniform base, on which VRP contours or contour elements from a variety of sources may be placed within the same graph for comparison and statistical analysis.
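A generic illustration of contour resampling followed by Fourier description is sketched below; the paper's VRP-specific normalisation and preprocessing are not reproduced, and the example contour is synthetic.

```python
import numpy as np

def fourier_descriptors(x, y, n_points=128, n_coeffs=10):
    """Resample a closed contour to a uniform number of points, treat it as a
    complex signal z = x + iy, and keep the low-order FFT coefficients as a
    compact shape description. A generic FD sketch."""
    t = np.linspace(0.0, 1.0, len(x), endpoint=False)
    t_new = np.linspace(0.0, 1.0, n_points, endpoint=False)
    z = np.interp(t_new, t, x) + 1j * np.interp(t_new, t, y)
    coeffs = np.fft.fft(z) / n_points
    return coeffs[:n_coeffs]

# Illustrative elliptical "contour" (e.g. frequency vs. level axes)
theta = np.linspace(0, 2 * np.pi, 200, endpoint=False)
fd = fourier_descriptors(40 + 30 * np.cos(theta), 70 + 20 * np.sin(theta))
print(np.round(np.abs(fd[:4]), 2))
```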
Friesen, Megan R; Beggs, Jacqueline R; Gaskett, Anne C
2017-08-01
Sensory-based conservation harnesses species' natural communication and signalling behaviours to mitigate threats to wild populations. To evaluate this emerging field, we assess how sensory-based manipulations, sensory mode, and target taxa affect success. To facilitate broader, cross-species application of successful techniques, we test which behavioural and life-history traits correlate with positive conservation outcomes. We focus on seabirds, one of the world's most rapidly declining groups, whose philopatry, activity patterns, foraging, mate choice, and parental care behaviours all involve reliance on, and therefore strong selection for, sophisticated sensory physiology and accurate assessment of intra- and inter-species signals and cues in several sensory modes. We review the use of auditory, olfactory, and visual methods, especially for attracting seabirds to newly restored habitat or deterring birds from fishing boats and equipment. We found that more sensory-based conservation has been attempted with Procellariiformes (tube-nosed seabirds) and Charadriiformes (e.g. terns and gulls) than other orders, and that successful outcomes are more likely for Procellariiformes. Evolutionary and behavioural traits are likely to facilitate sensory-based techniques, such as social attraction to suitable habitat, across seabird species. More broadly, successful application of sensory-based conservation to other at-risk animal groups is likely to be associated with these behavioural and life-history traits: coloniality, philopatry, nocturnal, migratory, long-distance foraging, parental care, and pair bonds/monogamy. © 2016 Cambridge Philosophical Society.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Kumar, Prashant; Bansod, Baban K.S.
2015-02-15
Groundwater vulnerability maps are useful for decision making in land use planning and water resource management. This paper reviews the various groundwater vulnerability assessment models developed across the world. Each model has been evaluated in terms of its pros and cons and the environmental conditions of its application. The paper further discusses the validation techniques used for the vulnerability maps generated by the various models. Implicit challenges associated with the development of groundwater vulnerability assessment models have also been identified, with scientific consideration given to parameter relations and their selection. - Highlights: • Various index-based groundwater vulnerability assessment models have been discussed. • A comparative analysis of the models and their applicability in different hydrogeological settings has been presented. • Research problems underlying the vulnerability assessment models are also reported in this review paper.
Yardley, Lucy; Dennison, Laura; Coker, Rebecca; Webley, Frances; Middleton, Karen; Barnett, Jane; Beattie, Angela; Evans, Maggie; Smith, Peter; Little, Paul
2010-04-01
Lessons in the Alexander Technique and exercise prescription proved effective for managing low back pain in primary care in a clinical trial. To understand trial participants' expectations and experiences of the Alexander Technique and exercise prescription, a questionnaire assessing attitudes to the intervention, based on the Theory of Planned Behaviour, was completed at baseline and 3-month follow-up by 183 people assigned to lessons in the Alexander Technique and 176 people assigned to exercise prescription. Semi-structured interviews to assess the beliefs contributing to attitudes to the intervention were carried out at baseline with 14 people assigned to lessons in the Alexander Technique and 16 assigned to exercise prescription, and at follow-up with 15 members of the baseline sample. Questionnaire responses indicated that attitudes to both interventions were positive at baseline but became more positive at follow-up only in those assigned to lessons in the Alexander Technique. Thematic analysis of the interviews suggested that at follow-up many patients who had learned the Alexander Technique felt they could manage back pain better. Whereas many obstacles to exercising were reported, few barriers to learning the Alexander Technique were described, since it 'made sense', could be practiced while carrying out everyday activities or relaxing, and the teachers provided personal advice and support. Using the Alexander Technique was viewed as effective by most patients. Acceptability may have been superior to exercise because of a convincing rationale, social support, and a better perceived fit with the patient's particular symptoms and lifestyle.