Do Scaffolded Supports between Aspects of Problem Solving Enhance Assessment Usability?
ERIC Educational Resources Information Center
McCoy, Jan D.; Braun-Monegan, Jenelle; Bettesworth, Leanne; Tindal, Gerald
2015-01-01
While problem solving as an instructional technique is widely advocated, educators are often challenged in effectively assessing student skill in this area. Students failing to solve a problem might fail in any of several aspects of the effort. The purpose of this research was to validate a scaffolded technique for assessing problem solving in…
Lexicographic goal programming and assessment tools for a combinatorial production problem.
DOT National Transportation Integrated Search
2008-01-01
NP-complete combinatorial problems often necessitate the use of near-optimal solution techniques, including heuristics and metaheuristics. The addition of multiple optimization criteria can further complicate comparison of these solution technique...
Turturro, A; Hart, R W
1987-01-01
A better understanding of chemically induced cancer has led to appreciation of similarities to problems addressed by risk management of radiation-induced toxicity. Techniques developed for cancer risk assessment of toxic substances can be generalized to toxic agents. A recent problem-solving approach for risk management of toxic substances developed for the U.S. Department of Health and Human Services is discussed, along with the role of risk assessment and how uncertainty should be treated within the context of this approach. Finally, two different methods, research into the assumptions underlying risk assessment and the modification of risk assessment/risk management documents, are used to illustrate how the technique can be applied.
Alternative Assessment Techniques.
ERIC Educational Resources Information Center
Lowenthal, Barbara
1988-01-01
Maintaining the precision necessary for administering norm-referenced tests can be a problem for the special education teacher who is trained to assist the student. Criterion-referenced tests, observations, and interviews are presented as effective alternative assessment techniques. (JDD)
Increasing accuracy in the assessment of motion sickness: A construct methodology
NASA Technical Reports Server (NTRS)
Stout, Cynthia S.; Cowings, Patricia S.
1993-01-01
The purpose is to introduce a new methodology that should improve the accuracy of the assessment of motion sickness. This construct methodology utilizes both subjective reports of motion sickness and objective measures of physiological correlates to assess motion sickness. Current techniques and methods used in the framework of a construct methodology are inadequate. Current assessment techniques for diagnosing motion sickness and space motion sickness are reviewed, and attention is called to the problems with the current methods. Further, principles of psychophysiology that when applied will probably resolve some of these problems are described in detail.
Fostering and Assessing Creativity in Technology Education
ERIC Educational Resources Information Center
Buelin-Biesecker, Jennifer Katherine
2012-01-01
This study compared the creative outcomes in student work resulting from two pedagogical approaches to creative problem solving activities. A secondary goal was to validate the Consensual Assessment Technique (CAT) as a means of assessing creativity. Linear models for problem solving and design processes serve as the current paradigm in classroom…
A Structured Approach to Teaching Applied Problem Solving through Technology Assessment.
ERIC Educational Resources Information Center
Fischbach, Fritz A.; Sell, Nancy J.
1986-01-01
Describes an approach to problem solving based on real-world problems. Discusses problem analysis and definitions, preparation of briefing documents, solution finding techniques (brainstorming and synectics), solution evaluation and judgment, and implementation. (JM)
Using Reflection Documents to Assess Student Learning
ERIC Educational Resources Information Center
Powell, Larkin A.
2009-01-01
Traditional assessment methods such as tests and essays may not be adequate to evaluate students' ability to solve problems and think critically. I developed a qualitative assessment technique for a junior-level Wildlife Management Techniques course that incorporated written responses in a pre- and post-course reflection exercise. I provided the…
Attitudes Toward Patient Management Problems as a Self-Assessment Technique in Dermatology
ERIC Educational Resources Information Center
Ramsay, David L.; And Others
1977-01-01
Patient management problems were found to be favorable methods of self-assessment by an overwhelming majority of practicing dermatologists and those in training, regardless of the type of practice or the number of years in practice. (LBH)
Investigating gender differences in alcohol problems: a latent trait modeling approach.
Nichol, Penny E; Krueger, Robert F; Iacono, William G
2007-05-01
Inconsistent results have been found in research investigating gender differences in alcohol problems. Previous studies of gender differences used a wide range of methodological techniques, as well as limited assortments of alcohol problems. Parents (1,348 men and 1,402 women) of twins enrolled in the Minnesota Twin Family Study answered questions about a wide range of alcohol problems. A latent trait modeling technique was used to evaluate gender differences in the probability of endorsement at the problem level and for the overall 105-problem scale. Of the 34 problems that showed significant gender differences, 29 were more likely to be endorsed by men than women with equivalent overall alcohol problem levels. These male-oriented symptoms included measures of heavy drinking, duration of drinking, tolerance, and acting out behaviors. Nineteen symptoms were denoted for removal to create a scale that favored neither gender in assessment. Significant gender differences were found in approximately one-third of the symptoms assessed and in the overall scale. Further examination of the nature of gender differences in alcohol problem symptoms should be undertaken to investigate whether a gender-neutral scale should be created or if men and women should be assessed with separate criteria for alcohol dependence and abuse.
Interest and limitations of projective techniques in the assessment of personality disorders.
Petot, J M
2000-06-01
Assessing personality disorders (PD) remains a difficult task because of persistent problems linked to concurrent validity of existing instruments, which are all structured interviews or self-report inventories. It has been advocated that indirect methods, projective techniques in particular, can strengthen PD assessment methods. The thematic apperception test (TAT) may be a significant adjuvant method of PD assessment.
The Management of NASA Employee Health Problem; Status 1971
NASA Technical Reports Server (NTRS)
Arnoldi, L. B.
1971-01-01
A system for assessing employee health problems is introduced. The automated billing system is based on an input format that includes the cost of medical services by user and measures, in dollars, the portion of resources spent on preventive techniques versus therapeutic techniques. The system is capable of printing long-term medical histories of any employee.
Risk prioritisation using the analytic hierarchy process
NASA Astrophysics Data System (ADS)
Sum, Rabihah Md.
2015-12-01
This study demonstrated how to use the Analytic Hierarchy Process (AHP) to prioritise the risks of an insurance company. AHP is a technique for structuring complex problems by arranging elements of the problems in a hierarchy, assigning numerical values to subjective judgements on the relative importance of the elements, and synthesizing the judgements to determine which elements have the highest priority. The study is motivated by the wide application of AHP as a prioritisation technique in complex problems. It aims to show that AHP is able to minimise some limitations of the risk assessment technique based on likelihood and impact. The study shows AHP is able to provide a consistency check on subjective judgements, organise a large number of risks into a structured framework, assist risk managers in making explicit risk trade-offs, and provide an easy-to-understand and systematic risk assessment process.
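The AHP workflow the abstract describes (pairwise judgements on Saaty's scale, priority synthesis, consistency check) can be sketched as follows. The risk categories and matrix values are hypothetical, and the row-geometric-mean weights are a standard approximation to the principal-eigenvector priorities:

```python
import math

# Minimal AHP sketch; risk names and judgements are hypothetical.
# Pairwise comparison matrix on Saaty's 1-9 scale:
# entry A[i][j] = how much more important risk i is than risk j.
risks = ["underwriting", "market", "operational"]
A = [
    [1.0, 3.0, 5.0],
    [1 / 3, 1.0, 3.0],
    [1 / 5, 1 / 3, 1.0],
]

def ahp_weights(A):
    """Approximate the principal eigenvector by row geometric means."""
    n = len(A)
    gms = [math.prod(row) ** (1.0 / n) for row in A]
    total = sum(gms)
    return [g / total for g in gms]

def consistency_ratio(A, w):
    """CR = CI / RI; judgements are conventionally accepted when CR < 0.10."""
    n = len(A)
    # lambda_max estimated as the average of (A w)_i / w_i
    Aw = [sum(A[i][j] * w[j] for j in range(n)) for i in range(n)]
    lam = sum(Aw[i] / w[i] for i in range(n)) / n
    ci = (lam - n) / (n - 1)
    ri = {3: 0.58, 4: 0.90, 5: 1.12}[n]  # Saaty's random index for size n
    return ci / ri

w = ahp_weights(A)          # priority weights, highest first here
cr = consistency_ratio(A, w)
```

For this matrix the consistency ratio falls well under the conventional 0.10 threshold, so the (hypothetical) judgements would be accepted.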
Mathematics Competency for Beginning Chemistry Students Through Dimensional Analysis.
Pursell, David P; Forlemu, Neville Y; Anagho, Leonard E
2017-01-01
Mathematics competency in nursing education and practice may be addressed by an instructional variation of the traditional dimensional analysis technique typically presented in beginning chemistry courses. The authors studied 73 beginning chemistry students using the typical dimensional analysis technique and the variation technique. Student quantitative problem-solving performance was evaluated. Students using the variation technique scored significantly better (18.3 of 20 points, p < .0001) on the final examination quantitative titration problem than those who used the typical technique (10.9 of 20 points). American Chemical Society examination scores and in-house assessment indicate that better performing beginning chemistry students were more likely to use the variation technique rather than the typical technique. The variation technique may be useful as an alternative instructional approach to enhance beginning chemistry students' mathematics competency and problem-solving ability in both education and practice. [J Nurs Educ. 2017;56(1):22-26.]
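The traditional dimensional analysis technique the study builds on chains conversion factors so that units cancel step by step. A minimal sketch on a hypothetical titration problem (the article's actual examination problem is not given):

```python
# Dimensional-analysis sketch: chain conversion factors so units cancel.
# Hypothetical problem: how many mL of 0.100 M NaOH are needed to
# neutralize 25.0 mL of 0.0800 M HCl (1:1 stoichiometry)?
mL_HCl = 25.0

# chain: mL HCl -> L HCl -> mol HCl -> mol NaOH -> L NaOH -> mL NaOH
mL_NaOH = (mL_HCl
           * (1 / 1000)     # L HCl per mL HCl
           * 0.0800         # mol HCl per L HCl
           * (1 / 1)        # mol NaOH per mol HCl (1:1 reaction)
           * (1 / 0.100)    # L NaOH per mol NaOH
           * 1000)          # mL NaOH per L NaOH
# mL_NaOH comes out to 20.0 mL
```

Writing each factor as "units wanted per units had" is exactly the cancellation discipline dimensional analysis teaches; the "variation" technique the article evaluates presumably rearranges how these factors are presented, which the abstract does not detail.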
Optimization Techniques for Design Problems in Selected Areas in WSNs: A Tutorial
Ibrahim, Ahmed; Alfa, Attahiru
2017-01-01
This paper is intended to serve as an overview of, and mostly a tutorial on, the optimization techniques used in several key design aspects that have been considered in the literature of wireless sensor networks (WSNs). It targets researchers who are new to mathematical optimization tools and wish to apply them to WSN design problems. We hence divide the paper into two main parts. One part is dedicated to introducing optimization theory and giving an overview of some of its techniques that could be helpful in design problems in WSNs. In the second part, we present a number of design aspects that we came across in the WSN literature in which mathematical optimization methods have been used in the design. For each design aspect, a key paper is selected, and for each we explain the formulation techniques and the solution methods implemented. We also provide in-depth analyses and assessments of the problem formulations, the corresponding solution techniques, and the experimental procedures in some of these papers. The analyses and assessments, which are provided in the form of comments, are meant to reflect the points that we believe should be taken into account when using optimization as a tool for design purposes. PMID:28763039
The role of optimization in the next generation of computer-based design tools
NASA Technical Reports Server (NTRS)
Rogan, J. Edward
1989-01-01
There is a close relationship between design optimization and the emerging new generation of computer-based tools for engineering design. With some notable exceptions, the development of these new tools has not taken full advantage of recent advances in numerical design optimization theory and practice. Recent work in the field of design process architecture has included an assessment of the impact of next-generation computer-based design tools on the design process. These results are summarized, and insights into the role of optimization in a design process based on these next-generation tools are presented. An example problem has been worked out to illustrate the application of this technique. The example problem - layout of an aircraft main landing gear - is one that is simple enough to be solved by many other techniques. Although the mathematical relationships describing the objective function and constraints for the landing gear layout problem can be written explicitly and are quite straightforward, an approximation technique has been used in the solution of this problem that can just as easily be applied to integrate supportability or producibility assessments using theory of measurement techniques into the design decision-making process.
Applications of remote sensing to estuarine problems. [estuaries of Chesapeake Bay
NASA Technical Reports Server (NTRS)
Munday, J. C., Jr.
1975-01-01
A variety of siting problems for the estuaries of the lower Chesapeake Bay have been solved with cost beneficial remote sensing techniques. Principal techniques used were repetitive 1:30,000 color photography of dye emitting buoys to map circulation patterns, and investigation of water color boundaries via color and color infrared imagery to scales of 1:120,000. Problems solved included sewage outfall siting, shoreline preservation and enhancement, oil pollution risk assessment, and protection of shellfish beds from dredge operations.
Directed Bee Colony Optimization Algorithm to Solve the Nurse Rostering Problem.
Rajeswari, M; Amudhavel, J; Pothula, Sujatha; Dhavachelvan, P
2017-01-01
The Nurse Rostering Problem (NRP) is an NP-hard combinatorial scheduling problem of assigning a set of nurses to shifts per day while considering both hard and soft constraints. A novel metaheuristic technique is required for solving the NRP. This work proposes a metaheuristic called the Directed Bee Colony Optimization Algorithm, using the Modified Nelder-Mead Method, for solving the NRP. To solve the NRP, the authors used a multiobjective mathematical programming model and proposed a methodology for the adaptation of a Multiobjective Directed Bee Colony Optimization (MODBCO). MODBCO is used successfully for solving the multiobjective problem of optimizing scheduling problems. MODBCO integrates deterministic local search, a multiagent particle system environment, and the honey bee decision-making process. The performance of the algorithm is assessed using the standard dataset INRC2010, which reflects many real-world cases that vary in size and complexity. The experimental analysis uses statistical tools to show the uniqueness of the algorithm on the assessment criteria.
Davidson, Judy E
2009-03-01
The purpose of this article is to provide examples of learning activities to be used as formative (interim) evaluation of an in-hospital orientation or cross-training program. Examples are provided in the form of vignettes that have been derived from strategies described in the literature as classroom assessment techniques. Although these classroom assessment techniques were originally designed for classroom experiences, they are proposed as methods for preceptors to stimulate the development of higher-order thinking such as synthesizing information, solving problems, and learning how to learn.
Rule-governed Approaches to Physics--Newton's Third Law.
ERIC Educational Resources Information Center
Maloney, David P.
1984-01-01
Describes an approach to assessing the use of rules in solving problems related to Newton's third law of motion. Discusses the problems used, method of questioning, scoring of problem sets, and a general overview of the use of the technique in aiding the teacher in dealing with student's conceptual levels. (JM)
The Westminster Eighth Grade World Problems Course (Pilot Project).
ERIC Educational Resources Information Center
Barth, James P.; And Others
The rationale, objectives, and social studies units are provided in this curriculum guide for grade 8. Focus is upon students' assessing, hypothesizing, and synthesizing the world's critical problems. Teaching techniques are process-education oriented, emphasizing inquiry training, problem solving, and inductive learning in an attempt to prepare…
ERIC Educational Resources Information Center
Otani, Akira
1989-01-01
Examines several basic hypnotherapeutic techniques (rapport building, problem assessment, resistance management, and behavior change) based on Milton H. Erickson's hypnotherapeutic principles that can be translated into the general framework of counseling. (Author/CM)
NASA Technical Reports Server (NTRS)
Rado, B. Q.
1975-01-01
Automatic classification techniques are described in relation to future information and natural resource planning systems with emphasis on application to Georgia resource management problems. The concept, design, and purpose of Georgia's statewide Resource Assessment Program is reviewed along with participation in a workshop at the Earth Resources Laboratory. Potential areas of application discussed include: agriculture, forestry, water resources, environmental planning, and geology.
Methods for trend analysis: Examples with problem/failure data
NASA Technical Reports Server (NTRS)
Church, Curtis K.
1989-01-01
Statistics play an important role in quality control and reliability. Consequently, the NASA standard Trend Analysis Techniques recommends a variety of statistical methodologies that can be applied to time series data. The major goal of this working handbook, using data from the MSFC Problem Assessment System, is to illustrate some of the techniques in the NASA standard and some different techniques, and to identify patterns in the data. Techniques used for trend estimation are: regression (exponential, power, reciprocal, straight line) and Kendall's rank correlation coefficient. The important details of a statistical strategy for estimating a trend component are covered in the examples. However, careful analysis and interpretation are necessary because of small samples and frequent zero problem reports in a given time period. Further investigations to deal with these issues are being conducted.
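As one illustration of the techniques named, Kendall's rank correlation between the time index and the series gives a simple distribution-free trend indicator: tau near +1 suggests an increasing trend in problem reports, near -1 a decreasing one. The report counts below are hypothetical, and this simplified tau omits the tie corrections a full implementation would apply:

```python
# Mann-Kendall-style trend check via Kendall's rank correlation
# between the time index and the observed series.
def kendall_tau(y):
    """Simplified Kendall's tau of series y against its time index
    (ties contribute to neither count)."""
    n = len(y)
    concordant = discordant = 0
    for i in range(n):
        for j in range(i + 1, n):
            if y[j] > y[i]:
                concordant += 1
            elif y[j] < y[i]:
                discordant += 1
    return (concordant - discordant) / (n * (n - 1) / 2)

reports = [9, 7, 8, 6, 5, 5, 3, 2]  # hypothetical problem reports per period
tau = kendall_tau(reports)          # strongly negative: reports trending down
```

The handbook's caution applies here too: with few periods and many zero counts, tau is noisy and a significance test on it matters as much as its sign.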
Demonstration Assessment: Measuring Conceptual Understanding and Critical Thinking with Rubrics.
ERIC Educational Resources Information Center
Radford, David L.; And Others
1995-01-01
Presents the science demonstration assessment as an authentic-assessment technique to assess whether students understand basic science concepts and can use them to solve problems. Uses rubrics to prepare students for the assessment and to assign final grades. Provides examples of science demonstration assessments and the scoring of rubrics in the…
Review of the Draw a Person: Screening Procedure for Emotional Disturbance.
ERIC Educational Resources Information Center
Trevisan, Michael S.
1996-01-01
The Draw a Person: Screening Procedures for Emotional Disturbance (DAP:SPED) is a projective technique used in the initial assessment of children suffering from emotional problems, and unlike most projective techniques, features sound psychometric development. (Author)
ERIC Educational Resources Information Center
Floyd, Randy G.; Phaneuf, Robin L.; Wilczynski, Susan M.
2005-01-01
Indirect assessment instruments used during functional behavioral assessment, such as rating scales, interviews, and self-report instruments, represent the least intrusive techniques for acquiring information about the function of problem behavior. This article provides criteria for examining the measurement properties of these instruments…
Feeding At-Risk Infants and Toddlers.
ERIC Educational Resources Information Center
Jaffe, Mata B.
1989-01-01
Speech-language pathologists working with infants or toddlers with feeding problems should obtain a feeding history, conduct an assessment of feeding practices, set appropriate preliminary and long-range goals, and investigate treatment options and appropriate feeding techniques. Feeding techniques for premature, neurologically impaired, Down…
NASA Astrophysics Data System (ADS)
Darma, I. K.
2018-01-01
This research aimed at determining: 1) the difference in mathematical problem-solving ability between students facilitated with a problem-based learning model and a conventional learning model, 2) the difference in mathematical problem-solving ability between students facilitated with an authentic assessment model and a conventional assessment model, and 3) the interaction effect between the learning model and the assessment model on mathematical problem solving. The research was conducted in Bali State Polytechnic, using a 2x2 factorial experimental design. The sample consisted of 110 students. The data were collected using a theoretically and empirically validated test. Instruments were validated using Aiken's content-validity technique and item analysis, and the data were then analyzed using ANOVA. The analysis shows that students facilitated with the problem-based learning and authentic assessment models obtained the highest average scores, in both concept understanding and mathematical problem solving. The hypothesis tests show that, significantly: 1) there is a difference in mathematical problem-solving ability between students facilitated with the problem-based learning model and the conventional learning model, 2) there is a difference in mathematical problem-solving ability between students facilitated with the authentic assessment model and the conventional assessment model, and 3) there is an interaction effect between the learning model and the assessment model on mathematical problem solving. To improve the effectiveness of mathematics learning, the combination of the problem-based learning model and the authentic assessment model can be considered as one of the learning models in class.
NASA Technical Reports Server (NTRS)
Harriss, R. C.
1980-01-01
Application of remote sensing techniques to the solution of geochemical problems is considered, with emphasis on the 'carbon cycle'. The problem of carbon dioxide sinks and the areal extent of coral reefs are treated. In order to assess the problems cited, it is suggested that remote sensing techniques be utilized to: (1) monitor globally the carbonate and bicarbonate concentrations in surface waters of the world ocean; (2) monitor the freshwater and oceanic biomass and associated dissolved organic carbon; (3) inventory the coral reef areas and types and the associated oceanographic climatic conditions; and (4) measure the heavy metal fluxes from forested and vegetated areas, from volcanos, from different types of crustal rocks, from soils, and from sea surfaces.
Assessment of computer-related health problems among post-graduate nursing students.
Khan, Shaheen Akhtar; Sharma, Veena
2013-01-01
The study was conducted to assess computer-related health problems among post-graduate nursing students and to develop a Self Instructional Module for prevention of computer-related health problems in a selected university situated in Delhi. A descriptive survey with a correlational design was adopted. A total of 97 samples were selected from different faculties of Jamia Hamdard by multistage sampling with a systematic random sampling technique. Among post-graduate students, the majority of sample subjects had average compliance with computer-related ergonomics principles. As regards computer-related health problems, the majority of post-graduate students had moderate computer-related health problems. The Self Instructional Module developed for prevention of computer-related health problems was found to be acceptable by the post-graduate students.
Assessing non-uniqueness: An algebraic approach
DOE Office of Scientific and Technical Information (OSTI.GOV)
Vasco, Don W.
Geophysical inverse problems are endowed with a rich mathematical structure. When discretized, most differential and integral equations of interest are algebraic (polynomial) in form. Techniques from algebraic geometry and computational algebra provide a means to address questions of existence and uniqueness for both linear and non-linear inverse problems. In a sense, the methods extend ideas which have proven fruitful in treating linear inverse problems.
Secondary Breast Augmentation.
Brown, Mitchell H; Somogyi, Ron B; Aggarwal, Shagun
2016-07-01
After studying this article, the participant should be able to: 1. Assess common clinical problems in the secondary breast augmentation patient. 2. Describe a treatment plan to correct the most common complications of breast augmentation. 3. Provide surgical and nonsurgical options for managing complications of breast augmentation. 4. Decrease the incidence of future complications through accurate assessment, preoperative planning, and precise surgical technique. Breast augmentation has been increasing steadily in popularity over the past three decades. Many of these patients present with secondary problems or complications following their primary breast augmentation. Two of the most common complications are capsular contracture and implant malposition. Familiarity and comfort with the assessment and management of these complications is necessary for all plastic surgeons. An up-to-date understanding of current devices and techniques may decrease the need to manage future complications from the current cohort of breast augmentation patients.
Problem-based learning in laboratory medicine resident education: a satisfaction survey.
Lepiller, Quentin; Solis, Morgane; Velay, Aurélie; Gantner, Pierre; Sueur, Charlotte; Stoll-Keller, Françoise; Barth, Heidi; Fafi-Kremer, Samira
2017-04-01
Theoretical knowledge in biology and medicine plays a substantial role in laboratory medicine resident education. In this study, we assessed the contribution of problem-based learning (PBL) to improve the training of laboratory medicine residents during their internship in the department of virology, Strasbourg University Hospital, France. We compared the residents' satisfaction regarding an educational program based on PBL and a program based on lectures and presentations. PBL induced a high level of satisfaction (100%) among residents compared to lectures and presentations (53%). The main advantages of this technique were to create a situational interest regarding virological problems, to boost the residents' motivation and to help them identify the most relevant learning objectives in virology. However, it appears pertinent to educate the residents in appropriate bibliographic research techniques prior to PBL use and to monitor their learning by regular formative assessment sessions.
Alternative approaches to forestry research evaluation: an assessment.
Pamela J. Jakes; Earl C. Leatherberry
1986-01-01
Reviews research evaluation techniques in a variety of fields and assesses the usefulness of various approaches or combinations of approaches for forestry research evaluation. Presents an evaluation framework that will help users develop an approach suitable for their specific problem.
An information diffusion technique to assess integrated hazard risks.
Huang, Chongfu; Huang, Yundong
2018-02-01
An integrated risk is a scene in the future associated with some adverse incident caused by multiple hazards. An integrated probability risk is the expected value of disaster. Due to the difficulty of assessing an integrated probability risk with a small sample, weighting methods and copulas are employed to avoid this obstacle. To resolve the problem, in this paper, we develop the information diffusion technique to construct a joint probability distribution and a vulnerability surface. Then, an integrated risk can be directly assessed by using a small sample. A case of an integrated risk caused by flood and earthquake is given to show how the suggested technique is used to assess the integrated risk of annual property loss.
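A minimal one-dimensional sketch of the normal information diffusion idea: each small-sample observation spreads its information over a grid of monitoring points via a Gaussian kernel, and normalizing gives a smooth estimated distribution from which an expected loss can be read off. The loss values, grid, and bandwidth h below are hypothetical, and a full treatment (as in the paper) would extend this to a joint distribution over multiple hazards:

```python
import math

# Normal information diffusion sketch for small-sample risk estimation.
def diffuse(sample, points, h):
    """Return an estimated probability mass at each monitoring point."""
    # information gained at each monitoring point from all observations
    q = [sum(math.exp(-((u - x) ** 2) / (2 * h * h)) for x in sample)
         for u in points]
    total = sum(q)
    return [qi / total for qi in q]  # normalize to a probability distribution

losses = [1.2, 2.5, 2.8, 4.1]            # small sample of annual losses
grid = [i * 0.5 for i in range(13)]      # monitoring points 0.0 .. 6.0
p = diffuse(losses, grid, h=0.6)
expected_loss = sum(u * pi for u, pi in zip(grid, p))  # integrated risk estimate
```

Because the kernel spreads each observation across neighbouring points, the estimate degrades gracefully as the sample shrinks, which is the motivation for preferring diffusion over a raw histogram here.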
Environment effects from SRB exhaust effluents: Technique development and preliminary assessment
NASA Technical Reports Server (NTRS)
Goldford, A. I.; Adelfang, S. I.; Hickey, J. S.; Smith, S. R.; Welty, R. P.; White, G. L.
1977-01-01
Techniques to determine the environmental effects of the space shuttle SRB (Solid Rocket Booster) exhaust effluents are used to perform a preliminary climatological assessment. An exhaust effluent chemistry study was performed and the exhaust effluent species were determined. A reasonable exhaust particle size distribution is constructed for use in nozzle analyses and for the deposition model. The preliminary assessment is used to identify problems associated with the full-scale assessment; therefore, these preliminary air quality results are used with caution in drawing conclusions regarding the environmental effects of the space shuttle exhaust effluents.
DOT National Transportation Integrated Search
1971-06-01
A study was conducted in which performance on a non-verbal problem-solving task was correlated with the Otis Quick Scoring Mental Ability Test and the Raven Progressive Matrices Test. The problem-solving task, called 'code-lock', required the subjec...
Improving Problem-Solving Techniques for Students in Low-Performing Schools
ERIC Educational Resources Information Center
Hobbs, Robert Maurice
2012-01-01
Teachers can use culturally relevant pedagogical strategies and technologies as emerging tools to improve students' problem-solving skills. The purpose of this study was to investigate and assess the effectiveness of culturally specific computer-based instructional tasks on ninth-grade African American mathematics students. This study tried to…
Screening and Assessment of Young Children.
ERIC Educational Resources Information Center
Friedlander, Bernard Z.
Most language development hazards in infancy and early childhood fall into the categories of auditory impairment, central integrative dysfunction, inadequate environmental support, and peripheral expressive impairment. Existing knowledge and techniques are inadequate to meet the screening and assessment problems of central integrative dysfunction,…
ERIC Educational Resources Information Center
Chromy, James R.
This study addressed statistical techniques that might ameliorate some of the sampling problems currently facing states with small populations participating in State National Assessment of Educational Progress (NAEP) assessments. The study explored how the application of finite population correction factors to the between-school component of…
Downscaling Indicators of Forest Habitat Structure from National Assessments
Kurt H. Riitters
2005-01-01
Downscaling is an important problem because consistent large-area assessments of forest habitat structure, while feasible, are feasible only with relatively coarse data and indicators. Techniques are needed to enable more detailed and local interpretations of the national statistics. Using the results of national assessments from land-cover maps, this paper...
Measuring Physical Activity in the Elderly: Some Implications for Nutrition.
ERIC Educational Resources Information Center
Shephard, Roy J.
1990-01-01
Measurement of physical activity patterns is discussed in terms of data obtained by attitude assessment, activity questionnaires, personal monitoring devices, and fitness assessment. Problems of each technique are described. Application of activity measures to the estimation of total dietary needs is discussed. (SK)
Changes in Teachers' Adaptive Expertise in an Engineering Professional Development Course
ERIC Educational Resources Information Center
Martin, Taylor; Peacock, Stephanie Baker; Ko, Pat; Rudolph, Jennifer J.
2015-01-01
Although the consensus seems to be that high-school-level introductory engineering courses should focus on design, this creates a problem for teacher training. Traditionally, math and science teachers are trained to teach and assess factual knowledge and closed-ended problem-solving techniques specific to a particular discipline, which is unsuited…
Preliminary assessment of aerial photography techniques for canvasback population analysis
Munro, R.E.; Trauger, D.L.
1976-01-01
Recent intensive research on the canvasback has focused attention on the need for more precise estimates of population parameters. During the 1972-75 period, various types of aerial photographic equipment were evaluated to determine the problems and potentials of employing these techniques in appraisals of canvasback populations. The equipment and procedures available for automated analysis of aerial photographic imagery were also investigated. Serious technical problems remain to be resolved, but some promising results were obtained. Final conclusions about the feasibility of operational implementation await a more rigorous analysis of the data collected.
Imbalanced Learning for Functional State Assessment
NASA Technical Reports Server (NTRS)
Li, Feng; McKenzie, Frederick; Li, Jiang; Zhang, Guangfan; Xu, Roger; Richey, Carl; Schnell, Tom
2011-01-01
This paper presents results of several imbalanced learning techniques applied to operator functional state assessment, where the data are highly imbalanced, i.e., some functional states (majority classes) have many more training samples than other states (minority classes). Conventional machine learning techniques tend to classify all data samples into the majority classes and perform poorly on the minority classes. In this study, we implemented five imbalanced learning techniques, including random under-sampling, random over-sampling, the synthetic minority over-sampling technique (SMOTE), borderline-SMOTE and adaptive synthetic sampling (ADASYN), to address this problem. Experimental results on a benchmark driving test dataset show that accuracies for minority classes could be improved dramatically at the cost of slight performance degradation for the majority classes.
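As a concrete illustration, the core SMOTE idea named in the abstract can be sketched in pure NumPy: synthetic minority samples are generated by interpolating between a minority sample and one of its nearest minority-class neighbours. This is a minimal sketch with toy data, not the study's implementation (which would use a full library such as imbalanced-learn).

```python
import numpy as np

def smote(X_min, n_new, k=3, rng=None):
    """Minimal SMOTE sketch: synthesise n_new minority samples by
    interpolating each picked sample toward one of its k nearest
    minority-class neighbours."""
    rng = np.random.default_rng(rng)
    X_min = np.asarray(X_min, dtype=float)
    # pairwise distances within the minority class
    d = np.linalg.norm(X_min[:, None, :] - X_min[None, :, :], axis=2)
    np.fill_diagonal(d, np.inf)                # never pick yourself
    nn = np.argsort(d, axis=1)[:, :k]          # k nearest neighbours
    base = rng.integers(0, len(X_min), n_new)  # samples to grow from
    nbr = nn[base, rng.integers(0, k, n_new)]  # one random neighbour each
    gap = rng.random((n_new, 1))               # interpolation fraction
    return X_min[base] + gap * (X_min[nbr] - X_min[base])

# Toy 2-D minority class; synthetic points fall between existing ones
X_min = np.array([[0.0, 0.0], [1.0, 0.0], [0.0, 1.0], [1.0, 1.0]])
X_syn = smote(X_min, n_new=6, k=2, rng=0)
print(X_syn.shape)  # (6, 2)
```

Borderline-SMOTE and ADASYN refine the same scheme by concentrating synthesis near the class boundary or where minority density is lowest.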
NASA Technical Reports Server (NTRS)
Yunis, Isam S.; Carney, Kelly S.
1993-01-01
A new aerospace application of structural reliability techniques is presented, where the applied forces depend on many probabilistic variables. This application is the plume impingement loading of the Space Station Freedom Photovoltaic Arrays. When the space shuttle berths with Space Station Freedom it must brake and maneuver towards the berthing point using its primary jets. The jet exhaust, or plume, may cause high loads on the photovoltaic arrays. The many parameters governing this problem are highly uncertain and random. An approach, using techniques from structural reliability, as opposed to the accepted deterministic methods, is presented which assesses the probability of failure of the array mast due to plume impingement loading. A Monte Carlo simulation of the berthing approach is used to determine the probability distribution of the loading. A probability distribution is also determined for the strength of the array. Structural reliability techniques are then used to assess the array mast design. These techniques are found to be superior to the standard deterministic dynamic transient analysis, for this class of problem. The results show that the probability of failure of the current array mast design, during its 15 year life, is minute.
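The probability-of-failure calculation described above can be sketched with a toy Monte Carlo simulation: sample the load and strength distributions and count how often load exceeds strength. The distributions and parameters below are purely illustrative, not those of the Freedom array study.

```python
import numpy as np

rng = np.random.default_rng(42)
n = 100_000

# Hypothetical distributions: plume-impingement load (lognormal) and
# array-mast strength (normal); parameters are illustrative only
load = rng.lognormal(mean=1.0, sigma=0.4, size=n)
strength = rng.normal(loc=10.0, scale=1.0, size=n)

# Failure whenever the sampled load exceeds the sampled strength
p_fail = np.mean(load > strength)
print(f"estimated probability of failure: {p_fail:.2e}")
```

In the actual study, the load distribution itself came from a Monte Carlo simulation of the berthing approach rather than a closed-form lognormal.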
The contamination of the subsurface environment by dense non-aqueous phase liquids (DNAPL) is a wide-spread problem that poses a significant threat to soil and groundwater quality. Implementing different remediation techniques can lead to the removal of a high fraction of the DNA...
Using Multilevel Modeling in Language Assessment Research: A Conceptual Introduction
ERIC Educational Resources Information Center
Barkaoui, Khaled
2013-01-01
This article critiques traditional single-level statistical approaches (e.g., multiple regression analysis) to examining relationships between language test scores and variables in the assessment setting. It highlights the conceptual, methodological, and statistical problems associated with these techniques in dealing with multilevel or nested…
Assessing the Value of Structured Analytic Techniques in the U.S. Intelligence Community
2016-01-01
Analytic Techniques, and Why Do Analysts Use Them? SATs are methods of organizing and stimulating thinking about intelligence problems. These methods... thinking; and imaginative thinking techniques encourage new perspectives, insights, and alternative scenarios. Among the many SATs in use today, the...more transparent, so that other analysts and customers can better understand how the judgments were reached. SATs also facilitate group involvement
Some Viable Techniques for Assessing and Counselling Cognitive Processing Weakness
ERIC Educational Resources Information Center
Haruna, Abubakar Sadiq
2016-01-01
Cognitive Processing weakness (CPW) is a psychological problem that impedes students' ability to learn effectively in a normal school setting. Such weakness may include; auditory, visual, conceptual, sequential, speed and attention processing. This paper therefore examines the basic assessment or diagnostic approaches such as Diagnosis by…
Assessing Institutional Ineffectiveness: A Strategy for Improvement.
ERIC Educational Resources Information Center
Cameron, Kim S.
1984-01-01
Based on the theory that institutional change and improvement are motivated more by knowledge of problems than by knowledge of successes, a fault tree analysis technique using Boolean logic for assessing institutional ineffectiveness by determining weaknesses in the system is presented. Advantages and disadvantages of focusing on weakness rather…
Assessment and mitigation of power quality problems for PUSPATI TRIGA Reactor (RTP)
NASA Astrophysics Data System (ADS)
Zakaria, Mohd Fazli; Ramachandaramurthy, Vigna K.
2017-01-01
Electrical power systems are exposed to different types of power quality disturbances. Investigation and monitoring of power quality are necessary to maintain accurate operation of sensitive equipment, especially in nuclear installations. This paper discusses the power quality problems observed at the electrical sources of the PUSPATI TRIGA Reactor (RTP). Assessment of power quality requires the identification of any anomalous behavior on a power system that adversely affects the normal operation of electrical or electronic equipment. A power quality assessment involves gathering data, analyzing the data with reference to power quality standards and, if problems exist, recommending mitigation techniques. Field power quality data were collected by a power quality recorder and analyzed with reference to power quality standards. Normally, electrical power is supplied to the RTP via two sources to ensure good reliability, each designed to carry the full load. The assessment of power quality during reactor operation was performed for both electrical sources. Several disturbances, such as voltage harmonics and flicker, exceeded the thresholds. To reduce these disturbances, mitigation techniques have been proposed: installing passive harmonic filters to reduce harmonic distortion, a dynamic voltage restorer (DVR) to reduce voltage disturbances, and isolation of all sensitive and critical loads.
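The voltage-harmonics assessment mentioned above typically reduces to computing total harmonic distortion (THD) from a recorded waveform: the ratio of the RMS of the harmonics to the fundamental. A minimal FFT-based sketch on a synthetic 50 Hz waveform with an injected 5% third harmonic (not RTP data):

```python
import numpy as np

def thd(signal, fs, f0, n_harmonics=10):
    """Total harmonic distortion sketch: ratio of the RMS of harmonics
    2..n to the fundamental, read off an FFT of whole cycles."""
    spec = np.abs(np.fft.rfft(signal))
    freqs = np.fft.rfftfreq(len(signal), 1.0 / fs)
    def mag(f):  # magnitude at the bin closest to frequency f
        return spec[np.argmin(np.abs(freqs - f))]
    fund = mag(f0)
    harm = np.sqrt(sum(mag(k * f0) ** 2 for k in range(2, n_harmonics + 1)))
    return harm / fund

fs, f0 = 5000, 50                       # 5 kHz sampling, 50 Hz mains
t = np.arange(0, 0.2, 1.0 / fs)        # ten whole cycles
v = np.sin(2*np.pi*f0*t) + 0.05*np.sin(2*np.pi*3*f0*t)  # 5% 3rd harmonic
print(f"THD = {thd(v, fs, f0):.1%}")   # close to 5%
```

Standards such as IEEE 519 then give the thresholds against which a measured THD is judged.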
Test Information. Using the Essay as an Assessment Technique. Set 77. Number One. Item 13.
ERIC Educational Resources Information Center
Cowie, Colin
Certain testing procedures will overcome some of the problems associated with the use of essay tests. Essay tests may not validly indicate achievement because the questions included in the test may not fairly represent instructional content. Reliability may be a problem because of variations in examinee response in different situations, in test…
Benchmark Problems Used to Assess Computational Aeroacoustics Codes
NASA Technical Reports Server (NTRS)
Dahl, Milo D.; Envia, Edmane
2005-01-01
The field of computational aeroacoustics (CAA) encompasses numerical techniques for calculating all aspects of sound generation and propagation in air directly from fundamental governing equations. Aeroacoustic problems typically involve flow-generated noise, with and without the presence of a solid surface, and the propagation of the sound to a receiver far away from the noise source. It is a challenge to obtain accurate numerical solutions to these problems. The NASA Glenn Research Center has been at the forefront in developing and promoting the development of CAA techniques and methodologies for computing the noise generated by aircraft propulsion systems. To assess the technological advancement of CAA, Glenn, in cooperation with the Ohio Aerospace Institute and the AeroAcoustics Research Consortium, organized and hosted the Fourth CAA Workshop on Benchmark Problems. Participants from industry and academia from both the United States and abroad joined to present and discuss solutions to benchmark problems. These demonstrated technical progress ranging from the basic challenges to accurate CAA calculations to the solution of CAA problems of increasing complexity and difficulty. The results are documented in the proceedings of the workshop. Problems were solved in five categories. In three of the five categories, exact solutions were available for comparison with CAA results. A fourth category of problems representing sound generation from either a single airfoil or a blade row interacting with a gust (i.e., problems relevant to fan noise) had approximate analytical or completely numerical solutions. The fifth category of problems involved sound generation in a viscous flow. In this case, the CAA results were compared with experimental data.
NASA Astrophysics Data System (ADS)
Rahmawati; Rustaman, Nuryani Y.; Hamidah, Ida; Rusdiana, Dadi
2017-02-01
The aim of this study was to explore an assessment strategy that can measure the problem solving skills of pre-service teachers, based on their cognitive style, in a basic physics course. The sample consisted of 95 persons (male = 15, female = 75). This exploratory study used observation techniques including interview, questionnaire, and test. The results indicated that the lecturer used only a paper-pencil test strategy to measure pre-service teachers' achievement, together with a conventional learning strategy; that is, the lecturer did not measure pre-service teachers' thinking processes in learning, such as problem solving skills. One internal factor that can influence problem solving skills is cognitive style. Field Dependent (FD) and Field Independent (FI) are two cognitive styles, measured here using the Group Embedded Figures Test (GEFT). The results showed that 82% of the pre-service teachers had an FD cognitive style and only 18% had an FI cognitive style. These findings became the foundation for developing a problem solving assessment model to measure pre-service teachers' problem solving skills and processes in a basic physics course.
Marchand, C; Gagnayre, R; d'Ivernois, J F
1996-01-01
There are very few examples of health training assessment in developing countries. Such an undertaking faces a number of difficulties concerning the problems inherent to assessment, the particular and unstable nature of the environment, and the problems associated with humanitarian action and development aid. It is difficult to choose between a formal and a natural approach. Indeed, a dual approach, combining quantitative and qualitative data seems best suited to a variety of cultural contexts of variable stability. Faced with these difficulties, a criteria-based, formative, quality-oriented assessment aimed at improving teaching and learning methods should be able to satisfy the needs of training professionals. We propose a training assessment guide based on an assessment model which aims to improve training techniques using comprehensive, descriptive and prescriptive approaches.
Student Recruitment: A Market Research Primer.
ERIC Educational Resources Information Center
Moore, Richard W.
1987-01-01
Illustrates how proprietary schools and community colleges have used market research techniques to identify marketing problems, determine student characteristics, measure market penetration, understand market position vis-a-vis the competition, and assess applicants' perceptions. (AYC)
Digital Photography as a Tool to Measure School Cafeteria Consumption
ERIC Educational Resources Information Center
Swanson, Mark
2008-01-01
Background: Assessing actual consumption of school cafeteria meals presents challenges, given recall problems of children, the cost of direct observation, and the time constraints in the school cafeteria setting. This study assesses the use of digital photography as a technique to measure what elementary-aged students select and actually consume…
Physiological correlates of mental workload
NASA Technical Reports Server (NTRS)
Zacharias, G. L.
1980-01-01
A literature review was conducted to assess the basis of and techniques for physiological assessment of mental workload. The studies reviewed had shortcomings involving one or more of the following basic problems: (1) physiologic arousal can easily be driven by nonworkload factors, confounding any proposed metric; (2) the profound absence of underlying physiologic models has promoted a multiplicity of seemingly arbitrary signal processing techniques; (3) the unspecified multidimensional nature of physiological "state" has given rise to a broad spectrum of competing, noncommensurate metrics; and (4) the lack of an adequate definition of workload compels physiologic correlations to suffer either from the vagueness of implicit workload measures or from the variance of explicit subjective assessments. Using specific studies as examples, two basic signal processing/data reduction techniques in current use, time and ensemble averaging, are discussed.
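The ensemble-averaging technique named at the end can be sketched in a few lines: averaging the same epoch across many trials suppresses uncorrelated noise by roughly 1/sqrt(n trials), recovering a repeated response that time-averaging a single trial would wash out. All signals below are synthetic.

```python
import numpy as np

rng = np.random.default_rng(0)
fs, dur, n_trials = 200, 1.0, 50
t = np.arange(0, dur, 1.0 / fs)

# Hypothetical evoked response buried in noise, repeated over trials
response = 5.0 * np.exp(-((t - 0.3) ** 2) / 0.002)
trials = response + rng.normal(0, 4.0, size=(n_trials, len(t)))

time_avg = trials.mean(axis=1)      # one scalar per trial: loses the waveform
ensemble_avg = trials.mean(axis=0)  # waveform averaged across trials

# Ensemble averaging suppresses uncorrelated noise by ~1/sqrt(n_trials),
# so the response peak near t = 0.3 s stands out again
print(np.argmax(ensemble_avg) / fs)
```

The trade-off the review points to is that ensemble averaging presumes a stable, stimulus-locked response, an assumption physiological workload signals often violate.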
Meyer, Frans J C; Davidson, David B; Jakobus, Ulrich; Stuchly, Maria A
2003-02-01
A hybrid finite-element method (FEM)/method of moments (MoM) technique is employed for specific absorption rate (SAR) calculations in a human phantom in the near field of a typical group special mobile (GSM) base-station antenna. The MoM is used to model the metallic surfaces and wires of the base-station antenna, and the FEM is used to model the heterogeneous human phantom. The advantages of each of these frequency domain techniques are, thus, exploited, leading to a highly efficient and robust numerical method for addressing this type of bioelectromagnetic problem. The basic mathematical formulation of the hybrid technique is presented. This is followed by a discussion of important implementation details, in particular the linear algebra routines for sparse, complex FEM matrices combined with dense MoM matrices. The implementation is validated by comparing results to MoM (surface equivalence principle implementation) and finite-difference time-domain (FDTD) solutions of human exposure problems. A comparison of the computational efficiency of the different techniques is presented. The FEM/MoM implementation is then used for whole-body and critical-organ SAR calculations in a phantom at different positions in the near field of a base-station antenna. This problem cannot, in general, be solved using the MoM or FDTD due to computational limitations. This paper shows that the specific hybrid FEM/MoM implementation is an efficient numerical tool for accurate assessment of human exposure in the near field of base-station antennas.
Assessing Mission Impact of Cyberattacks: Report of the NATO IST-128 Workshop
2015-12-01
simulation) perspective. This would be natural, considering that the cybersecurity problem is highly adversarial in nature. Because it involves intelligent...be formulated as a partial information game; artificial intelligence techniques might help here. Yet another style of problem formulation that...computational information processing for weapons, intelligence, communication, and logistics systems continues to increase the vulnerability of
Relevance Vector Machine Learning for Neonate Pain Intensity Assessment Using Digital Imaging
Gholami, Behnood; Tannenbaum, Allen R.
2011-01-01
Pain assessment in patients who are unable to verbally communicate is a challenging problem. The fundamental limitations in pain assessment in neonates stem from subjective assessment criteria, rather than quantifiable and measurable data. This often results in poor quality and inconsistent treatment of patient pain management. Recent advancements in pattern recognition techniques using relevance vector machine (RVM) learning techniques can assist medical staff in assessing pain by constantly monitoring the patient and providing the clinician with quantifiable data for pain management. The RVM classification technique is a Bayesian extension of the support vector machine (SVM) algorithm, which achieves comparable performance to SVM while providing posterior probabilities for class memberships and a sparser model. If classes represent “pure” facial expressions (i.e., extreme expressions that an observer can identify with a high degree of confidence), then the posterior probability of the membership of some intermediate facial expression to a class can provide an estimate of the intensity of such an expression. In this paper, we use the RVM classification technique to distinguish pain from nonpain in neonates as well as assess their pain intensity levels. We also correlate our results with the pain intensity assessed by expert and nonexpert human examiners. PMID:20172803
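The intensity idea above, using the posterior probability of the "pain" class as a proxy for the intensity of an intermediate expression, can be sketched without a full RVM. The sketch substitutes a hand-set logistic model for the trained classifier; the weights and feature vectors are purely hypothetical.

```python
import numpy as np

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

# Hedged sketch of the intensity idea (not an RVM): once a probabilistic
# classifier separates "pure" pain from non-pain expressions, the posterior
# p(pain | x) of an intermediate expression can serve as its intensity.
w, b = np.array([2.0, -1.5]), 0.3      # hypothetical trained weights

def pain_intensity(features):
    return sigmoid(features @ w + b)

neutral = np.array([0.0, 1.0])         # clearly non-pain features
grimace = np.array([1.5, 0.2])         # clearly pain features
blend = (neutral + grimace) / 2        # intermediate expression
print(pain_intensity(neutral), pain_intensity(blend), pain_intensity(grimace))
```

An RVM yields exactly such posterior probabilities (with a sparser model than an SVM), which is why the paper can read them off directly as intensity estimates.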
Denys Yemshanov; Frank H. Koch; Mark Ducey; Klaus Koehler
2013-01-01
Geographic mapping of risks is a useful analytical step in ecological risk assessments and in particular, in analyses aimed to estimate risks associated with introductions of invasive organisms. In this paper, we approach invasive species risk mapping as a portfolio allocation problem and apply techniques from decision theory to build an invasion risk map that combines...
NASA Technical Reports Server (NTRS)
1977-01-01
The use of computers for aircraft control, flight simulation, and inertial navigation is explored. The man-machine relation problem in aviation is addressed. Simple and self-adapting autopilots are described and the assets and liabilities of digital navigation techniques are assessed.
NetCoDer: A Retransmission Mechanism for WSNs Based on Cooperative Relays and Network Coding
Valle, Odilson T.; Montez, Carlos; Medeiros de Araujo, Gustavo; Vasques, Francisco; Moraes, Ricardo
2016-01-01
Some of the most difficult problems to deal with when using Wireless Sensor Networks (WSNs) are related to the unreliable nature of communication channels. In this context, the use of cooperative diversity techniques and the application of network coding concepts may be promising solutions to improve the communication reliability. In this paper, we propose the NetCoDer scheme to address this problem. Its design is based on merging cooperative diversity techniques and network coding concepts. We evaluate the effectiveness of the NetCoDer scheme through both an experimental setup with real WSN nodes and a simulation assessment, comparing NetCoDer performance against state-of-the-art TDMA-based (Time Division Multiple Access) retransmission techniques: BlockACK, Master/Slave and Redundant TDMA. The obtained results highlight that the proposed NetCoDer scheme clearly improves the network performance when compared with other retransmission techniques. PMID:27258280
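The network coding ingredient of schemes like NetCoDer can be illustrated with the classic two-receiver XOR example: a relay that overheard both packets retransmits a single coded packet, from which each receiver recovers the one it missed. A minimal sketch (packet contents are made up):

```python
def xor(a: bytes, b: bytes) -> bytes:
    """Byte-wise XOR of two equal-length packets."""
    return bytes(x ^ y for x, y in zip(a, b))

# Hypothetical scenario: a relay overhears packets p1 (missed by node B)
# and p2 (missed by node A) and retransmits one coded packet instead of
# resending both originals.
p1 = b"sensor-reading-17"
p2 = b"sensor-reading-42"
coded = xor(p1, p2)

# Each receiver recovers the packet it is missing from the one it holds
recovered_at_A = xor(coded, p1)   # A already holds p1 -> obtains p2
recovered_at_B = xor(coded, p2)   # B already holds p2 -> obtains p1
print(recovered_at_A == p2 and recovered_at_B == p1)  # True
```

One coded transmission thus replaces two retransmissions, which is the reliability-per-slot gain the paper measures against BlockACK and the TDMA baselines.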
[Quality assessment in anesthesia].
Kupperwasser, B
1996-01-01
Quality assessment (assurance/improvement) is the set of methods used to measure and improve the delivered care and the department's performance against pre-established criteria or standards. The four stages of the self-maintained quality assessment cycle are: problem identification, problem analysis, problem correction and evaluation of corrective actions. Quality assessment is a measurable entity for which it is necessary to define and calibrate measurement parameters (indicators) from available data gathered in the hospital anaesthesia environment. Problem identification comes from the accumulation of indicators. There are four types of quality indicators: structure, process, outcome and sentinel indicators. The latter signal a quality defect, are independent of outcomes, are easier to analyse by statistical methods and are closely related to processes and the main targets of quality improvement. The three types of methods used to analyse the problems (indicators) are: peer review, quantitative methods and risk management techniques. Peer review is performed by qualified anaesthesiologists. To improve its validity, the review process should be made explicit and conclusions based on standards of practice and literature references. The quantitative methods are statistical analyses applied to the collected data and presented in a graphic format (histogram, Pareto diagram, control charts). The risk management techniques include: a) critical incident analysis, establishing an objective relationship between a 'critical' event and the associated human behaviours; b) system accident analysis, based on the fact that accidents continue to occur despite safety systems and sophisticated technologies, which checks all the process components leading to the unpredictable outcome and not just the human factors; c) cause-effect diagrams, which facilitate problem analysis by reducing its causes to four fundamental components (persons, regulations, equipment, process).
Definition and implementation of corrective measures, based on the findings of the two previous stages, constitute the third step of the evaluation cycle. The Hawthorne effect is an improvement in outcomes observed before the implementation of any corrective actions. Verification of the implemented actions is the final and mandatory step closing the evaluation cycle.
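The Pareto diagram named among the quantitative methods ranks problem causes by frequency and reports cumulative shares, showing which few causes account for most quality defects. A minimal sketch with invented incident tallies:

```python
from collections import Counter

# Hypothetical sentinel-indicator tallies collected over one quarter
incidents = Counter({
    "drug labelling error": 34, "equipment check missed": 21,
    "recovery delay": 9, "documentation gap": 52,
    "protocol deviation": 14,
})

# Pareto analysis: rank causes and report the cumulative share, so the
# few causes that dominate the defects stand out at the top of the list
total = sum(incidents.values())
cum = 0
for cause, n in incidents.most_common():
    cum += n
    print(f"{cause:24s} {n:3d}  cumulative {cum / total:5.1%}")
```

Plotted as bars with the cumulative line overlaid, this is the Pareto diagram; the first two or three rows typically carry most of the cumulative share and become the targets for corrective measures.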
C3I Rapid Prototype Investigation.
1986-01-01
feasibility of applying rapid prototyping techniques to Air Force C3I system developments. This report presents the technical progress during the...computer functions. The cost to use each in terms of hardware, software, analysis, and needed further developments was assessed. Prototyping approaches were...acquirer, and developer are the basis for problems in C3I system developments. These problems destabilize the requirements determination process
Diane M. Gercke; Susan A. Stewart
2006-01-01
In 2005, eight U.S. Forest Service and Bureau of Land Management interdisciplinary teams participated in a test of strategic placement of treatments (SPOTS) techniques to maximize the effectiveness of fuel treatments in reducing problem fire behavior, adverse fire effects, and suppression costs. This interagency approach to standardizing the assessment of risks and...
ERIC Educational Resources Information Center
Erwin, T. Dary
Rating scales are a typical method for evaluating a student's performance in outcomes assessment. The analysis of the quality of information from rating scales poses special measurement problems when researchers work with faculty in their development. Generalizability measurement theory offers a set of techniques for estimating errors or…
Tracking variations in the alpha activity in an electroencephalogram
NASA Technical Reports Server (NTRS)
Prabhu, K. S.
1971-01-01
The problem of tracking Alpha voltage variations in an electroencephalogram is discussed. This problem is important in encephalographic studies of sleep and effects of different stimuli on the brain. Very often the Alpha voltage is tracked by passing the EEG signal through a bandpass filter centered at the Alpha frequency, which hopefully will filter out unwanted noise from the Alpha activity. Some alternative digital techniques are suggested and their performance is compared with the standard technique. These digital techniques can be used in an environment where an electroencephalograph is interfaced with a small digital computer via an A/D convertor. They have the advantage that statistical statements about their variability can sometimes be made so that the effect sought can be assessed correctly in the presence of random fluctuations.
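A crude digital counterpart to the bandpass filter described above is to zero every spectral component outside the 8-12 Hz alpha band in the FFT domain and invert the transform. This sketch uses a synthetic two-second "EEG" trace, not real data:

```python
import numpy as np

def bandpass_fft(x, fs, lo=8.0, hi=12.0):
    """Crude FFT-domain band-pass sketch: zero every spectral component
    outside the alpha band (8-12 Hz) and invert the transform."""
    spec = np.fft.rfft(x)
    freqs = np.fft.rfftfreq(len(x), 1.0 / fs)
    spec[(freqs < lo) | (freqs > hi)] = 0.0
    return np.fft.irfft(spec, n=len(x))

fs = 100
t = np.arange(0, 2.0, 1.0 / fs)
# Hypothetical EEG: 10 Hz alpha plus slow drift and fast noise
eeg = np.sin(2*np.pi*10*t) + 0.8*np.sin(2*np.pi*2*t) + 0.5*np.sin(2*np.pi*30*t)
alpha = bandpass_fft(eeg, fs)

# Tracked alpha amplitude via RMS of the filtered signal
print(round(np.sqrt(np.mean(alpha**2)), 2))  # ~0.71 for a unit sine
```

Tracking alpha variations over time would apply this per window; the digital techniques the paper proposes have the advantage that their statistical variability can sometimes be characterized explicitly.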
Experiments with a decision-theoretic scheduler
NASA Technical Reports Server (NTRS)
Hansson, Othar; Holt, Gerhard; Mayer, Andrew
1992-01-01
This paper describes DTS, a decision-theoretic scheduler designed to employ state-of-the-art probabilistic inference technology to speed the search for efficient solutions to constraint-satisfaction problems. Our approach involves assessing the performance of heuristic control strategies that are normally hard-coded into scheduling systems, and using probabilistic inference to aggregate this information in light of features of a given problem. BPS, the Bayesian Problem-Solver, introduced a similar approach to solving single-agent and adversarial graph search problems, yielding orders-of-magnitude improvement over traditional techniques. Initial efforts suggest that similar improvements will be realizable when applied to typical constraint-satisfaction scheduling problems.
Problem gambling in the workplace, characteristics of employees seeking help.
Hawley, Carolyn E; Glenn, Margaret K; Diaz, Sebastian
2007-01-01
Few rigorous research studies exist to define the impact problem gambling may have on the workforce and the workplace. This study is an initial attempt to address this void by exploring the vocational patterns and demographics of callers with self-reported gambling problems to a state helpline. It uses Chi-squared Automatic Interaction Detection (CHAID) analysis to assess 1072 working-age callers with gambling-related problems. The goal of this exploratory investigation is to determine whether the issue of problem gambling in the workplace warrants further research and, potentially, the design of interventions. Discussion centers on the use of the information for development of employer-based prevention and intervention efforts.
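CHAID builds its tree by repeatedly applying a chi-squared independence test to pick the predictor that best splits the outcome. The core statistic can be sketched as follows; the contingency table is invented, not the helpline data:

```python
import numpy as np

def chi_squared(table):
    """Pearson chi-squared statistic on a contingency table, the test
    CHAID applies repeatedly to pick the predictor that best splits
    the outcome (here: problem severity by employment status)."""
    table = np.asarray(table, dtype=float)
    row = table.sum(axis=1, keepdims=True)
    col = table.sum(axis=0, keepdims=True)
    expected = row * col / table.sum()     # independence model
    return ((table - expected) ** 2 / expected).sum()

# Hypothetical counts: rows = employed / unemployed,
# columns = moderate / severe gambling problem
table = [[60, 40],
         [30, 70]]
print(round(chi_squared(table), 2))  # 18.18
```

CHAID computes this statistic for every candidate predictor, merges categories that do not differ significantly, and splits on the predictor with the smallest adjusted p-value.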
Assessing clutter reduction in parallel coordinates using image processing techniques
NASA Astrophysics Data System (ADS)
Alhamaydh, Heba; Alzoubi, Hussein; Almasaeid, Hisham
2018-01-01
Information visualization has emerged as an important research field for multidimensional data and correlation analysis in recent years. Parallel coordinates (PCs) are a popular technique for visualizing high-dimensional data. A problem with the PC technique is that it suffers from crowding, a clutter that hides important data and obscures information. Earlier research has sought to reduce clutter without loss of data content. We introduce the use of image processing techniques as an approach for assessing the performance of clutter reduction techniques in PCs. We use histogram analysis as our first measure, where the mean feature of the color histograms of the possible alternative orderings of coordinates for the PC images is calculated and compared. The second measure is the contrast feature extracted from the texture of PC images based on gray-level co-occurrence matrices. The results show that the best PC image is the one with the minimal mean value of the color histogram feature and the maximal contrast value of the texture feature. In addition to its simplicity, the proposed assessment method has the advantage of objectively assessing alternative orderings of PC visualizations.
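The two measures described, the mean of the grey-level histogram and the contrast of a grey-level co-occurrence matrix (GLCM), can be sketched for an 8-bit image in plain NumPy. The input below is random noise standing in for a rendered PC plot:

```python
import numpy as np

def histogram_mean(img, bins=16):
    """Mean of the grey-level histogram used as the first clutter measure."""
    hist, edges = np.histogram(img, bins=bins, range=(0, 256))
    centers = (edges[:-1] + edges[1:]) / 2.0
    return (hist * centers).sum() / hist.sum()

def glcm_contrast(img, levels=8):
    """Contrast from a horizontal grey-level co-occurrence matrix."""
    q = (img.astype(float) * levels / 256.0).astype(int).clip(0, levels - 1)
    glcm = np.zeros((levels, levels))
    for a, b in zip(q[:, :-1].ravel(), q[:, 1:].ravel()):
        glcm[a, b] += 1          # tally horizontal neighbour pairs
    glcm /= glcm.sum()
    i, j = np.indices(glcm.shape)
    return ((i - j) ** 2 * glcm).sum()

# Hypothetical rendered PC plot as an 8-bit grey image
rng = np.random.default_rng(1)
img = rng.integers(0, 256, size=(64, 64))
print(histogram_mean(img), glcm_contrast(img))
```

Comparing these two numbers across renderings of alternative coordinate orderings, and picking the ordering with the smallest histogram mean and largest contrast, is the assessment procedure the paper proposes.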
Problems faced and coping strategies used by adolescents with mentally ill parents in Delhi.
George, Shoba; Shaiju, Bindu; Sharma, Veena
2012-01-01
The present study was conducted to assess the problems faced by adolescents whose parents suffer from major mental illness, at selected mental health institutes of Delhi. The objectives also included assessment of the coping strategies the adolescents used in dealing with these problems. The Stuart Stress Adaptation Model of Psychiatric Nursing Care was used as the conceptual framework. A descriptive survey approach with a cross-sectional design was used. A structured interview schedule was prepared, and a purposive non-probability sampling technique was employed to interview 50 adolescents whose parents suffer from major mental illness. The data gathered were analysed and interpreted using both descriptive and inferential statistics. The study showed that the majority of the adolescents had moderate problems as a result of their parent's mental illness. Area-wise analysis revealed that the greatest problems were in family relationships and support, and that the majority of the adolescents used maladaptive coping strategies. A set of guidelines on effective coping strategies was disseminated to these adolescents.
On-the-spot damage detection methodology for highway bridges.
DOT National Transportation Integrated Search
2010-07-01
Vibration-based damage identification (VBDI) techniques have been developed in part to address the problems associated with an aging civil infrastructure. To assess the potential of VBDI as it applies to highway bridges in Iowa, three applications of...
Zimmermann, Johannes; Löffler-Stastka, Henriette; Huber, Dorothea; Klug, Günther; Alhabbo, Sarah; Bock, Astrid; Benecke, Cord
2015-01-01
Empirical evidence for the effectiveness of long-term psychodynamic psychotherapy (LTPP) in patients with mood disorders is growing. However, it is unclear whether the effectiveness of LTPP is due to distinctive features of psychodynamic/psychoanalytic techniques or to a higher number of sessions. We tested these rival hypotheses in a quasi-experimental study comparing psychoanalytic therapy (i.e., high-dose LTPP) with psychodynamic therapy (i.e., low-dose LTPP) and cognitive-behavioural therapy (CBT) for depression. Analyses were based on a subsample of 77 subjects, with 27 receiving psychoanalytic therapy, 26 receiving psychodynamic therapy and 24 receiving CBT. Depressive symptoms, interpersonal problems and introject affiliation were assessed prior to treatment, after treatment and at the 1-, 2- and 3-year follow-ups. Psychoanalytic techniques were assessed from three audiotaped middle sessions per treatment using the Psychotherapy Process Q-Set. Subjects receiving psychoanalytic therapy reported having fewer interpersonal problems, treated themselves in a more affiliative way directly after treatment and tended to improve in depressive symptoms and interpersonal problems during follow-up as compared with patients receiving psychodynamic therapy and/or CBT. Multilevel mediation analyses suggested that post-treatment differences in interpersonal problems and introject affiliation were mediated by the higher number of sessions, and follow-up differences in depressive symptoms were mediated by the more pronounced application of psychoanalytic techniques. We also found some evidence for indirect treatment effects via psychoanalytic techniques on changes in introject affiliation during follow-up. These results provide support for the prediction that both a high dose and the application of psychoanalytic techniques facilitate therapeutic change in patients with major depression. 
Psychoanalytic therapy is an effective treatment for major depression, especially in the long run. The differential effectiveness of psychoanalytic therapy cannot be fully explained by its higher dose. Distinctive features of psychoanalytic technique (e.g., focusing on patients' dreams, fantasies, sexual experiences or childhood memories) may play an important role in establishing sustained therapeutic change. Copyright © 2014 John Wiley & Sons, Ltd.
Flight Studies of Problems Pertinent to High-Speed Operation of Jet Transports
NASA Technical Reports Server (NTRS)
Butchart, Stanley P.; Fischel, Jack; Tremant, Robert A.; Robinson, Glenn H.
1959-01-01
A flight investigation was made to assess the potential operational problems of jet transports in the transonic cruise range. In this study a large multiengine jet airplane having geometric characteristics fairly representative of the jet transport was used; however, in order to ensure general applicability of the results, the aerodynamic characteristics of the test airplane were varied to simulate a variety of jet-transport airplanes. Some of the specific areas investigated include: (1) an overall evaluation of longitudinal stability and control characteristics at transonic speeds, with an assessment of pitch-up characteristics, (2) the effect of buffeting on airplane operational speeds and maneuvering, (3) the desirable lateral-directional damping characteristics, (4) the desirable lateral-control characteristics, (5) an assessment of over-speed and speed-spread requirements, including the upset maneuver, and (6) an assessment of techniques and airplane characteristics for rapid descent and slow-down. The results presented include pilots' evaluation of the various problem areas and specific recommendations for possible improvement of jet-transport operations in the cruising speed range.
Scattering features for lung cancer detection in fibered confocal fluorescence microscopy images.
Rakotomamonjy, Alain; Petitjean, Caroline; Salaün, Mathieu; Thiberville, Luc
2014-06-01
To assess the feasibility of lung cancer diagnosis using the fibered confocal fluorescence microscopy (FCFM) imaging technique and scattering features for pattern recognition. FCFM is a new medical imaging technique whose diagnostic value has yet to be established. This paper addresses the problem of lung cancer detection using FCFM images and, as a first contribution, assesses the feasibility of computer-aided diagnosis through these images. Towards this aim, we have built a pattern recognition scheme which involves a feature extraction stage and a classification stage. The second contribution relies on the features used for discrimination. Indeed, we have employed the so-called scattering transform for extracting discriminative features, which are robust to small deformations in the images. We have also compared and combined these features with classical yet powerful features like local binary patterns (LBP) and their variants denoted as local quinary patterns (LQP). We show that scattering features yielded better recognition performance than classical features like LBP and their LQP variants for the FCFM image classification problems. Another finding is that LBP-based and scattering-based features provide complementary discriminative information and, in some situations, we empirically establish that performance can be improved when jointly using LBP, LQP and scattering features. In this work we analyze the joint capability of FCFM images and scattering features for lung cancer diagnosis. The proposed method achieves a good recognition rate for such a diagnosis problem. It also performs well when used in conjunction with other features for other classical medical imaging classification problems. Copyright © 2014 Elsevier B.V. All rights reserved.
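Local binary patterns, one of the baseline features compared above, are simple to compute: each pixel is encoded by thresholding its eight neighbours against its own value and packing the results into a byte. The sketch below is an illustrative NumPy implementation of the basic radius-1 LBP histogram, not the authors' pipeline; function names are ours.

```python
import numpy as np

def lbp_image(img):
    """Basic 8-neighbour local binary pattern (LBP) code for each
    interior pixel of a 2-D grayscale image."""
    img = np.asarray(img, dtype=float)
    c = img[1:-1, 1:-1]  # centre pixels
    # 8 neighbours, ordered clockwise from the top-left corner
    offsets = [(-1, -1), (-1, 0), (-1, 1), (0, 1),
               (1, 1), (1, 0), (1, -1), (0, -1)]
    codes = np.zeros_like(c, dtype=np.uint8)
    for bit, (dy, dx) in enumerate(offsets):
        nb = img[1 + dy:img.shape[0] - 1 + dy,
                 1 + dx:img.shape[1] - 1 + dx]
        codes |= ((nb >= c).astype(np.uint8) << bit)
    return codes

def lbp_histogram(img):
    """Normalised 256-bin LBP histogram, usable as a texture feature vector."""
    codes = lbp_image(img)
    hist = np.bincount(codes.ravel(), minlength=256).astype(float)
    return hist / hist.sum()
```

In practice the histogram would be fed to a classifier (e.g. an SVM) alongside the scattering features.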
ERIC Educational Resources Information Center
Vazquez, Lorna Thomas
2008-01-01
This article describes the A, E, I, O, U technique, designed to help teachers ensure that teaching and learning are not mutually exclusive in the classroom. Most teachers would agree that motivating average teenagers to communicate how they got an answer or justify their problem-solving strategies can be as difficult as teaching a dog to whistle.…
Clinical ophthalmic ultrasound improvements
NASA Technical Reports Server (NTRS)
Garrison, J. B.; Piro, P. A.
1981-01-01
The use of digital synthetic-aperture techniques to obtain high-resolution ultrasound images of the eye and orbit was proposed. The parameters of the switched-array configuration were established to reduce data collection time to a few milliseconds, thereby avoiding problems caused by motion of the eye itself. An assessment of the effects of eye motion on the performance of the system was obtained. The principles of synthetic-aperture techniques are discussed. Likely applications are considered.
Cellular automatons applied to gas dynamic problems
NASA Technical Reports Server (NTRS)
Long, Lyle N.; Coopersmith, Robert M.; Mclachlan, B. G.
1987-01-01
This paper compares the results of a relatively new computational fluid dynamics method, cellular automatons, with experimental data and analytical results. This technique has been shown to qualitatively predict fluidlike behavior; however, there have been few published comparisons with experiment or other theories. Comparisons are made for a one-dimensional supersonic piston problem, Stokes first problem, and the flow past a normal flat plate. These comparisons are used to assess the ability of the method to accurately model fluid dynamic behavior and to point out its limitations. Reasonable results were obtained for all three test cases, but the fundamental limitations of cellular automatons are numerous. It may be misleading, at this time, to say that cellular automatons are a computationally efficient technique. Other methods, based on continuum or kinetic theory, would also be very efficient if as little of the physics were included.
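Lattice-gas cellular automata of this kind evolve boolean particle occupation numbers through alternating collision and streaming steps. As a minimal sketch, here is the classic HPP model on a periodic square lattice (an assumption for illustration, not necessarily the scheme used in the paper); its collision rule conserves particle number exactly.

```python
import numpy as np

def hpp_step(state):
    """One update of the HPP lattice-gas automaton on a periodic 2-D grid.
    state has shape (4, H, W); channels 0..3 hold boolean occupation
    numbers for particles moving N, E, S, W."""
    n, e, s, w = state
    # Collision: exactly one head-on pair (N+S alone, or E+W alone)
    # is rotated 90 degrees; all other configurations pass through.
    ns_pair = n & s & ~e & ~w
    ew_pair = e & w & ~n & ~s
    n2 = (n & ~ns_pair) | ew_pair
    s2 = (s & ~ns_pair) | ew_pair
    e2 = (e & ~ew_pair) | ns_pair
    w2 = (w & ~ew_pair) | ns_pair
    # Streaming: each particle moves one cell in its direction.
    return np.stack([
        np.roll(n2, -1, axis=0),   # north: up one row
        np.roll(e2, 1, axis=1),    # east: right one column
        np.roll(s2, 1, axis=0),    # south
        np.roll(w2, -1, axis=1),   # west
    ])
```

Macroscopic quantities such as density are then obtained by averaging occupation numbers over blocks of cells.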
Behavioral management of the hyperactive child.
Murray, M E
1980-09-01
This paper provides a general outline of the principles of behavioral management of the hyperactive child. The use of stimulant medications and special considerations in school are briefly discussed and then suggestions for initial parent counseling, family assessment, and the analysis of specific behavior problems are reviewed. Techniques of behavioral management are presented for the younger hyperactive child (3 to 7 years) and for the older hyperactive child (8 to 13 years). Among management techniques discussed are positive reinforcement, extinction procedures, and punishment through isolation. The problems involved in the use of corporal punishment are outlined as well as specific guidelines for parents as to when and how corporal punishment can be used effectively. A step-by-step summary of how to employ a token economy to manage both home and school behavior problems in older hyperactive children is presented.
Nowlin, Jon O.; Brown, W.M.; Smith, L.H.; Hoffman, R.J.
1980-01-01
The objectives of the Geological Survey's river-quality assessment in the Truckee and Carson River basins in California and Nevada are to identify the significant resource management problems; to develop techniques to assess the problems; and to effectively communicate results to responsible managers. Six major elements of the assessment to be completed by October 1981 are (1) a detailing of the legal, institutional, and structural development of water resources in the basins and the current problems and conflicts; (2) a compilation and synthesis of the physical hydrology of the basins; (3) development of a special workshop approach to involve local management in the direction and results of the study; (4) development of a comprehensive streamflow model encompassing both basins to provide a quantitative hydrologic framework for water-quality analysis; (5) development of a water-quality transport model for selected constituents and characteristics on selected reaches of the Truckee River; and (6) a detailed examination of selected fish habitats for specified reaches of the Truckee River. Progress will be periodically reported in reports, maps, computer data files, mathematical models, a bibliography, and public presentations. In building a basic framework to develop techniques, the basins were viewed as a single hydrologic unit because of interconnecting diversion structures. The framework comprises 13 hydrographic subunits to facilitate modeling and sampling. Several significant issues beyond the scope of the assessment were considered as supplementary proposals: water-quality loadings in the Truckee and Carson Rivers, urban runoff in Reno and management alternatives, and a model of limnological processes in Lahontan Reservoir. (USGS)
Trail resource impacts and an examination of alternative assessment techniques
Marion, J.L.; Leung, Y.-F.
2001-01-01
Trails are a primary recreation resource facility on which recreation activities are performed. They provide safe access to non-roaded areas, support recreational opportunities such as hiking, biking, and wildlife observation, and protect natural resources by concentrating visitor traffic on resistant treads. However, increasing recreational use, coupled with poorly designed and/or maintained trails, has led to a variety of resource impacts. Trail managers require objective information on trails and their conditions to monitor trends, direct trail maintenance efforts, and evaluate the need for visitor management and resource protection actions. This paper reviews trail impacts and different types of trail assessments, including inventory, maintenance, and condition assessment approaches. Two assessment methods, point sampling and problem assessment, are compared empirically from separate assessments of a 15-mile segment of the Appalachian Trail in Great Smoky Mountains National Park. Results indicate that point sampling and problem assessment methods yield distinctly different types of quantitative information. The point sampling method provides more accurate and precise measures of trail characteristics that are continuous or frequent (e.g., tread width or exposed soil). The problem assessment method is a preferred approach for monitoring trail characteristics that can be easily predefined or are infrequent (e.g., excessive width or secondary treads), particularly when information on the location of specific trail impact problems is needed. The advantages and limitations of these two assessment methods are examined in relation to various management and research information needs. The choice and utility of these assessment methods are also discussed.
The impact of environmental factors on the performance of millimeter wave seekers in smart munitions
NASA Astrophysics Data System (ADS)
Hager, R.
1987-08-01
An assessment has been made of the degradation in performance of horizontal-glide smart munitions incorporating millimeter wave seekers operating in three types of environments. Atmospheric effects are shown to degrade performance appreciably only in very severe weather conditions. Electromagnetic line-of-sight masking due to foliage (forest canopy and tree-lined roads) will limit submunition usage and may be a potential problem. The most serious problem involves the confident detection of military vehicles in the presence of land clutter. Standard signal processing techniques involving signal amplitude and signal averaging are not likely to be adequate for detection. Observations regarding more sophisticated techniques and the current state of research are included.
Wavelet-sparsity based regularization over time in the inverse problem of electrocardiography.
Cluitmans, Matthijs J M; Karel, Joël M H; Bonizzi, Pietro; Volders, Paul G A; Westra, Ronald L; Peeters, Ralf L M
2013-01-01
Noninvasive, detailed assessment of electrical cardiac activity at the level of the heart surface has the potential to revolutionize diagnostics and therapy of cardiac pathologies. Due to the requirement of noninvasiveness, body-surface potentials are measured and have to be projected back to the heart surface, yielding an ill-posed inverse problem. Ill-posedness means that solutions to this problem are non-unique, resulting in a problem of choice. In the current paper, it is proposed to restrict this choice by requiring that the time series of reconstructed heart-surface potentials is sparse in the wavelet domain. A local search technique is introduced that pursues a sparse solution, using an orthogonal wavelet transform. Epicardial potentials reconstructed from this method are compared to those from existing methods, and validated with actual intracardiac recordings. The new technique improves the reconstructions in terms of smoothness and recovers physiologically meaningful details. Additionally, reconstruction of activation timing seems to be improved when pursuing sparsity of the reconstructed signals in the wavelet domain.
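Sparsity in the wavelet domain is typically pursued by shrinking small coefficients toward zero. A minimal sketch, using a one-level orthonormal Haar transform and soft thresholding (the proximal operator of the l1 norm); this illustrates only the sparsifying step, not the paper's full local search over the inverse problem, and all names are ours.

```python
import numpy as np

def haar_fwd(x):
    """One level of the orthonormal Haar transform of an even-length signal."""
    x = np.asarray(x, dtype=float)
    a = (x[0::2] + x[1::2]) / np.sqrt(2)   # approximation coefficients
    d = (x[0::2] - x[1::2]) / np.sqrt(2)   # detail coefficients
    return a, d

def haar_inv(a, d):
    """Inverse of haar_fwd: perfect reconstruction of the signal."""
    x = np.empty(2 * a.size)
    x[0::2] = (a + d) / np.sqrt(2)
    x[1::2] = (a - d) / np.sqrt(2)
    return x

def soft_threshold(c, lam):
    """Shrink coefficients toward zero; small ones become exactly zero."""
    return np.sign(c) * np.maximum(np.abs(c) - lam, 0.0)

def sparsify(x, lam):
    """Promote wavelet-domain sparsity by thresholding the detail band."""
    a, d = haar_fwd(x)
    return haar_inv(a, soft_threshold(d, lam))
```

In a full reconstruction scheme this shrinkage would be applied repeatedly inside the regularized inversion loop.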
Child Psychotherapy, Child Analysis, and Medication: A Flexible, Integrative Approach.
Whitman, Laura
2015-01-01
For children with moderate to severe emotional or behavioral problems, the current approach in child psychiatry is to make an assessment for the use of both psychotherapy and medication. This paper describes integration of antidepressants and stimulants with psychoanalytically oriented techniques.
SESSION: EMERGING POLLUTANT ASSESSMENT TECHNIQUES TITLE: BACTERIAL SOURCE TRACKING
Fecal contamination of surface waters used for recreation, drinking water and aquaculture is a continuous environmental problem and poses significant human health risks. An alarming proportion of United States rivers/streams (39%), lakes (45%), and estuaries (51%) are not safe f...
Geospatial methods and data analysis for assessing distribution of grazing livestock
USDA-ARS?s Scientific Manuscript database
Free-ranging livestock research must begin with a well-conceived problem statement and employ appropriate data acquisition tools and analytical techniques to accomplish the research objective. These requirements are especially critical in addressing animal distribution. Tools and statistics used t...
DOT National Transportation Integrated Search
2014-09-01
Corrosion of steel-reinforced concrete bridges is a serious problem facing the WVDOT. This paper provides an overview of techniques for evaluating the condition of reinforced concrete bridge elements; methods for modeling the remaining service li...
Assessment of Navy Alcohol and Drug Abuse Education and Training Curricula, Revision Requirements
1986-02-01
Topics include the following: Introduction to Psychology, Adolescent Psychology, Maslow's Hierarchy, Abnormal Psychology, Defense Mechanisms, Anxiety, abnormal psychological development and behavior, techniques of psychological assessment and treatment, and the application of these skills in a variety of settings. The Chief of Naval Operations has taken a firm, constructive approach to drug and alcohol abuse problems in the Navy.
Operator Performance Measures for Assessing Voice Communication Effectiveness
1989-07-01
Performance and workload assessment techniques have been based on models of human information processing; Broadbent (1958) described a limited-capacity filter model of human information processing. Topics covered include auditory information processing (auditory attention, auditory memory), models of information processing (capacity theories), learning, attention, language specialization, decision making, and problem solving.
ERIC Educational Resources Information Center
Krain, Matthew
2016-01-01
This study revisits case learning's effects on student engagement and assesses student learning as a result of the use of case studies and problem-based learning. The author replicates a previous study that used indirect assessment techniques to get at case learning's impact, and then extends the analysis using a pre- and post-test experimental…
Decision-theoretic control of EUVE telescope scheduling
NASA Technical Reports Server (NTRS)
Hansson, Othar; Mayer, Andrew
1993-01-01
This paper describes a decision theoretic scheduler (DTS) designed to employ state-of-the-art probabilistic inference technology to speed the search for efficient solutions to constraint-satisfaction problems. Our approach involves assessing the performance of heuristic control strategies that are normally hard-coded into scheduling systems and using probabilistic inference to aggregate this information in light of the features of a given problem. The Bayesian Problem-Solver (BPS) introduced a similar approach to solving single-agent and adversarial graph-search problems, yielding orders-of-magnitude improvements over traditional techniques. Initial efforts suggest that similar improvements will be realizable when applied to typical constraint-satisfaction scheduling problems.
Skin blotting: a noninvasive technique for evaluating physiological skin status.
Minematsu, Takeo; Horii, Motoko; Oe, Makoto; Sugama, Junko; Mugita, Yuko; Huang, Lijuan; Nakagami, Gojiro; Sanada, Hiromi
2014-06-01
The skin performs important structural and physiological functions, and skin assessment represents an important step in identifying skin problems. Although noninvasive techniques for assessing skin status exist, no such techniques for monitoring its physiological status are available. This study aimed to develop a novel skin-assessment technique known as skin blotting, based on the leakage of secreted proteins from inside the skin following overhydration in mice. The applicability of this technique was further investigated in a clinical setting. Skin blotting involves 2 steps: collecting proteins by attaching a damp nitrocellulose membrane to the surface of the skin, and immunostaining the collected proteins. The authors implanted fluorescein-conjugated dextran (F-DEX)-containing agarose gels into mice and detected the tissue distribution of F-DEX under different blotting conditions. They also analyzed the correlations between inflammatory cytokine secretion and leakage following ultraviolet irradiation in mice and in relation to body mass index in humans. The F-DEX in mice was distributed in the deeper and shallower layers of skin and leaked through the transfollicular and transepidermal routes, respectively. Ultraviolet irradiation induced tumor necrosis factor secretion in the epidermis in mice, which was detected by skin blotting, whereas follicular tumor necrosis factor was associated with body mass index in obese human subjects. These results support the applicability of skin blotting for skin assessment. Skin blotting represents a noninvasive technique for assessing skin physiology and has potential as a predictive and diagnostic tool for skin disorders.
NASA Technical Reports Server (NTRS)
Noor, Ahmed K. (Editor); Venneri, Samuel L. (Editor)
1993-01-01
Various papers on flight vehicle materials, structures, and dynamics are presented. Individual topics addressed include: general modeling methods, component modeling techniques, time-domain computational techniques, dynamics of articulated structures, structural dynamics in rotating systems, structural dynamics in rotorcraft, damping in structures, structural acoustics, structural design for control, structural modeling for control, control strategies for structures, system identification, overall assessment of needs and benefits in structural dynamics and controlled structures. Also discussed are: experimental aeroelasticity in wind tunnels, aeroservoelasticity, nonlinear aeroelasticity, aeroelasticity problems in turbomachines, rotary-wing aeroelasticity with application to VTOL vehicles, computational aeroelasticity, structural dynamic testing and instrumentation.
A parallel graded-mesh FDTD algorithm for human-antenna interaction problems.
Catarinucci, Luca; Tarricone, Luciano
2009-01-01
The finite difference time domain method (FDTD) is frequently used for the numerical solution of a wide variety of electromagnetic (EM) problems and, among them, those concerning human exposure to EM fields. In many practical cases related to the assessment of occupational EM exposure, large simulation domains are modeled and high space resolution adopted, so that strong memory and central processing unit power requirements have to be satisfied. To cope with this computational effort, the use of parallel computing is a winning approach; alternatively, subgridding techniques are often implemented. However, the simultaneous use of subgridding schemes and parallel algorithms is very new. In this paper, an easy-to-implement and highly efficient parallel graded-mesh (GM) FDTD scheme is proposed and applied to human-antenna interaction problems, demonstrating its appropriateness in dealing with complex occupational tasks and showing its capability to guarantee the advantages of a traditional subgridding technique without affecting the parallel FDTD performance.
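The core of any FDTD solver is the leapfrog update of staggered electric and magnetic field components. A minimal one-dimensional, uniform-mesh sketch in normalised units follows; the paper's actual contribution (a parallel graded mesh) is omitted, and the grid size, source position and Courant factor of 0.5 are illustrative assumptions.

```python
import numpy as np

def fdtd_1d(steps, n=200, src=100):
    """Minimal 1-D FDTD (Yee) loop in free space, normalised units.
    Ez and Hy are staggered in space and time; a Gaussian hard source
    drives Ez at index src.  The Courant factor 0.5 keeps it stable."""
    ez = np.zeros(n)
    hy = np.zeros(n - 1)
    for t in range(steps):
        hy += 0.5 * (ez[1:] - ez[:-1])            # update H from curl of E
        ez[1:-1] += 0.5 * (hy[1:] - hy[:-1])      # update E from curl of H
        ez[src] = np.exp(-((t - 30) / 10.0) ** 2)  # Gaussian hard source
    return ez
```

A graded mesh replaces the uniform spacing with cell-dependent coefficients; a parallel version would exchange boundary field strips between subdomains each step.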
New tools for investigating student learning in upper-division electrostatics
NASA Astrophysics Data System (ADS)
Wilcox, Bethany R.
Student learning in upper-division physics courses is a growing area of research in the field of Physics Education. Developing effective new curricular materials and pedagogical techniques to improve student learning in upper-division courses requires knowledge of both what material students struggle with and what curricular approaches help to overcome these struggles. To facilitate the course transformation process for one specific content area --- upper-division electrostatics --- this thesis presents two new methodological tools: (1) an analytical framework designed to investigate students' struggles with the advanced physics content and mathematically sophisticated tools/techniques required at the junior and senior level, and (2) a new multiple-response conceptual assessment designed to measure student learning and assess the effectiveness of different curricular approaches. We first describe the development and theoretical grounding of a new analytical framework designed to characterize how students use mathematical tools and techniques during physics problem solving. We apply this framework to investigate student difficulties with three specific mathematical tools used in upper-division electrostatics: multivariable integration in the context of Coulomb's law, the Dirac delta function in the context of expressing volume charge densities, and separation of variables as a technique to solve Laplace's equation. We find a number of common themes in students' difficulties around these mathematical tools including: recognizing when a particular mathematical tool is appropriate for a given physics problem, mapping between the specific physical context and the formal mathematical structures, and reflecting spontaneously on the solution to a physics problem to gain physical insight or ensure consistency with expected results. 
We then describe the development of a novel, multiple-response version of an existing conceptual assessment in upper-division electrostatics courses. The goal of this new version is to provide an easily-graded electrostatics assessment that can potentially be implemented to investigate student learning on a large scale. We show that student performance on the new multiple-response version exhibits a significant degree of consistency with performance on the free-response version, and that it continues to provide significant insight into student reasoning and student difficulties. Moreover, we demonstrate that the new assessment is both valid and reliable using data from upper-division physics students at multiple institutions. Overall, the work described in this thesis represents a significant contribution to the methodological tools available to researchers and instructors interested in improving student learning at the upper-division level.
Organization Development in Mental Health Services.
ERIC Educational Resources Information Center
Glaser, Edward M.; Backer, Thomas E.
1979-01-01
The term "organization development" (OD) encompasses techniques developed to facilitate communication and collaborative problem solving in work groups. This discussion focuses on defining OD, describing its current use in mental health and human service organizations, and assessing potential payoffs and disadvantages of implementing OD programs in…
DOE Office of Scientific and Technical Information (OSTI.GOV)
Webster, Anthony J.; CCFE, Culham Science Centre, Abingdon OX14 3DB
2014-11-15
The generic question is considered: How can we determine the probability that an otherwise quasi-random event was triggered by an external influence? A specific problem is the quantification of the success of techniques to trigger, and hence control, edge-localised plasma instabilities (ELMs) in magnetically confined fusion (MCF) experiments. The development of such techniques is essential to ensure tolerable heat loads on components in large MCF fusion devices, and is necessary for their development into economically successful power plants. Bayesian probability theory is used to rigorously formulate the problem and to provide a formal solution. Accurate but pragmatic methods are developed to estimate triggering probabilities, and are illustrated with experimental data. These allow results from experiments to be quantitatively assessed, and rigorously quantified conclusions to be formed. Example applications include assessing whether triggering of ELMs is a statistical or deterministic process, and the establishment of thresholds to ensure that ELMs are reliably triggered.
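The Bayesian formulation can be illustrated with a toy model: a trigger that succeeds with probability q against a Poisson background of spontaneous events. All function names and the independence assumptions below are ours for illustration, not the paper's method.

```python
import math

def spontaneous_prob(rate, window):
    """Probability of at least one spontaneous (Poisson) event in the window."""
    return 1.0 - math.exp(-rate * window)

def posterior_triggered(q, rate, window):
    """Bayes: given that an event was seen inside the post-trigger window,
    the probability that the trigger fired (rather than the event being a
    purely spontaneous coincidence).  Assumes trigger success (prob. q) is
    independent of the Poisson background."""
    p_s = spontaneous_prob(rate, window)
    p_event = q + p_s - q * p_s        # trigger fired OR spontaneous event
    return q / p_event if p_event > 0 else 0.0

def estimate_trigger_prob(n_attempts, n_events, rate, window):
    """Estimate q from counts: the observed event fraction equals
    q + (1-q)*p_s; solve for q and clip to [0, 1]."""
    p_s = spontaneous_prob(rate, window)
    f = n_events / n_attempts
    q = (f - p_s) / (1.0 - p_s)
    return min(max(q, 0.0), 1.0)
```

Comparing the estimated q against the spontaneous-coincidence rate is what lets one judge whether triggering is statistical or effectively deterministic.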
The mobile image quality survey game
NASA Astrophysics Data System (ADS)
Rasmussen, D. René
2012-01-01
In this paper we discuss human assessment of the quality of photographic still images that are degraded in various ways relative to an original, for example due to compression or noise. In particular, we examine and present results from a technique where observers view images on a mobile device, perform pairwise comparisons, identify defects in the images, and interact with the display to indicate the location of the defects. The technique measures the response time and accuracy of the responses. By posing the survey in a form similar to a game, providing performance feedback to the observer, the technique attempts to increase the engagement of the observers and to avoid exhausting them, a factor that is often a problem for subjective surveys. The results are compared with the known physical magnitudes of the defects and with results from similar web-based surveys. The strengths and weaknesses of the technique are discussed. Possible extensions of the technique to video quality assessment are also discussed.
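Pairwise-comparison data of this kind are commonly converted to a quality scale with a Bradley-Terry model; the sketch below uses a simple fixed-point iteration and is an assumed post-processing step, not necessarily the scaling used in this survey.

```python
def bradley_terry(wins, n_items, iters=200):
    """Estimate relative quality scores from pairwise-comparison counts
    with a Bradley-Terry fixed-point (MM) iteration.  wins[i][j] is the
    number of times item i was preferred over item j."""
    p = [1.0] * n_items
    for _ in range(iters):
        new = []
        for i in range(n_items):
            w_i = sum(wins[i][j] for j in range(n_items) if j != i)
            denom = sum((wins[i][j] + wins[j][i]) / (p[i] + p[j])
                        for j in range(n_items) if j != i)
            new.append(w_i / denom if denom else p[i])
        s = sum(new)
        p = [v * n_items / s for v in new]   # normalise to fix the scale
    return p
```

Higher scores correspond to images judged less degraded; ties in the data simply flatten the estimated scale.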
The transition of new technology to solve today's problems
DOE Office of Scientific and Technical Information (OSTI.GOV)
Kamin, R.A.; Martin, C.J.; Turner, L.M.
1995-05-01
Extensive research has been conducted in the development of methods to predict the degradation of F-44 in storage. The Low Pressure Reactor (LPR) has greatly enhanced the stability prediction capabilities necessary to make informed decisions concerning aviation fuel in storage. This technique has in the past been primarily used for research purposes. The Naval Air Warfare Center, Aircraft Division, Trenton, NJ, has used this technique successfully to assist the Defense Fuel Supply Center, Cameron Station, Alexandria, VA, in stability assessments of F-44. The High Performance Liquid Chromatography/Electrochemical Detector (HPLC/EC) antioxidant determination technique has also aided in making stability predictions by establishing the amount of inhibitor currently in the product. This paper will address two case studies in which the above new technology was used to ensure the rapid detection and diagnosis of today's field and logistic problems.
FEM Techniques for High Stress Detection in Accelerated Fatigue Simulation
NASA Astrophysics Data System (ADS)
Veltri, M.
2016-09-01
This work presents the theory and a numerical validation study in support of a novel method for a priori identification of fatigue critical regions, with the aim to accelerate durability design in large FEM problems. The investigation is placed in the context of modern full-body structural durability analysis, where a computationally intensive dynamic solution could be required to identify areas with potential for fatigue damage initiation. The early detection of fatigue critical areas can drive a simplification of the problem size, leading to appreciable improvements in solution time and model handling while allowing processing of the critical areas in higher detail. The proposed technique is applied to a real life industrial case in a comparative assessment with established practices. Synthetic damage prediction quantification and visualization techniques allow for a quick and efficient comparison between methods, outlining potential application benefits and boundaries.
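A priori screening for fatigue-critical regions can be as simple as ranking elements by their peak stress over the load history and keeping the top tail of the distribution. The sketch below is an illustrative criterion of that kind, not the method proposed in the paper; the percentile cutoff is an assumed parameter.

```python
import numpy as np

def fatigue_critical_elements(stress, percentile=95.0):
    """Flag candidate fatigue-critical elements: those whose peak absolute
    stress over the load history exceeds a chosen percentile of the
    model-wide peak-stress distribution.
    stress has shape (n_elements, n_timesteps)."""
    peak = np.abs(stress).max(axis=1)          # worst stress per element
    cutoff = np.percentile(peak, percentile)   # model-wide threshold
    return np.nonzero(peak >= cutoff)[0]       # indices of critical elements
```

The flagged subset can then be re-meshed or re-solved in higher detail while the rest of the model is simplified.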
Talking Physics: Two Case Studies on Short Answers and Self-explanation in Learning Physics
NASA Astrophysics Data System (ADS)
Badeau, Ryan C.
This thesis explores two case studies into the use of short answers and self-explanation to improve student learning in physics. The first set of experiments focuses on the role of short answer questions in the context of computer-based instruction. Through a series of six experiments, we compare and evaluate the performance of computer-assessed short answer questions versus multiple choice for training conceptual topics in physics, controlling for feedback between the two formats. In addition to finding overall similar improvements on subsequent student performance and retention, we identify unique differences in how students interact with the treatments in terms of time spent on feedback and performance on follow-up short answer assessment. In addition, we identify interactions between the level of interactivity of the training, question format, and student attitudinal ratings of each respective training. The second case study focuses on the use of worked examples in the context of multi-concept physics problems - which we call "synthesis problems." For this part of the thesis, four experiments were designed to evaluate the effectiveness of two instructional methods employing worked examples on student performance with synthesis problems; these instructional techniques, analogical comparison and self-explanation, have previously been studied primarily in the context of single-concept problems. As such, the work presented here represents a novel focus on extending these two techniques to this class of more complicated physics problem. Across the four experiments, both self-explanation and certain kinds of analogical comparison of worked examples significantly improved student performance on a target synthesis problem, with distinct improvements in recognition of the relevant concepts. More specifically, analogical comparison significantly improved student performance when the comparisons were invoked between worked synthesis examples. 
In contrast, similar comparisons between corresponding pairs of worked single-concept examples did not significantly improve performance. On a more complicated synthesis problem, self-explanation was significantly more effective than analogical comparison, potentially due to differences in how successfully students encoded the full structure of the worked examples. Finally, we find that the two techniques can be combined for additional benefit, with the trade-off of slightly more time-on-task.
NASA Astrophysics Data System (ADS)
Lollino, Piernicola; Andriani, Gioacchino Francesco; Fazio, Nunzio Luciano; Perrotti, Michele
2016-04-01
Strain-softening under low confinement stress, i.e. the drop of strength that occurs in the post-failure stage, represents a key factor of the stress-strain behavior of rocks. However, this feature of the rock behavior is generally underestimated or even neglected in the assessment of boundary value problems of intact soft rock masses. This is typically the case when the stability of intact rock masses is treated by means of limit equilibrium or finite element analyses, for which rigid-plastic or elastic perfectly-plastic constitutive models, generally implementing peak strength conditions of the rock, are respectively used. In fact, the aforementioned numerical techniques are characterized by intrinsic limitations that do not allow material brittleness to be accounted for, whether because of the method assumptions or due to numerical stability problems, as in the case of the finite element method, unless sophisticated regularization techniques are implemented. However, for those problems that concern the stability of intact soft rock masses at low stress levels, as for example the stability of shallow underground caves or that of rock slopes, the brittle stress-strain response of rock in the post-failure stage cannot be disregarded due to the risk of overestimation of the stability factor. This work is aimed at highlighting the role of post-peak brittleness of soft rocks in the analysis of specific ideal problems by means of a hybrid finite-discrete element technique (FDEM) that allows the brittle stress-strain behavior of the rock to be simulated properly. In particular, the stability of two ideal cases, represented by a shallow underground rectangular cave and a vertical cliff, has been analyzed by implementing a post-peak brittle behavior of the rock, and a comparison with a non-brittle response of the rock mass is also explored.
To this purpose, the mechanical behavior of a soft calcarenite belonging to the Calcarenite di Gravina formation, which outcrops extensively in Puglia (Southern Italy), and the features of its post-peak behavior as measured in the laboratory, have been used as a reference, while the geometrical features typical of underground cavities and rock cliffs observed in Southern Italy have been adopted for the simulations. The numerical results indicate that accounting for rock post-peak brittleness has a strong impact on the assessment of stability compared with perfectly plastic assumptions, and they highlight the need for numerical techniques, such as the FDEM approach, that properly take this important aspect of rock behavior into account.
Use of refractometry and colorimetry as field methods to rapidly assess antimalarial drug quality.
Green, Michael D; Nettey, Henry; Villalva Rojas, Ofelia; Pamanivong, Chansapha; Khounsaknalath, Lamphet; Grande Ortiz, Miguel; Newton, Paul N; Fernández, Facundo M; Vongsack, Latsamy; Manolin, Ot
2007-01-04
The proliferation of counterfeit and poor-quality drugs is a major public health problem, especially in developing countries that lack adequate resources to effectively monitor their prevalence. Simple and affordable field methods provide a practical means of rapidly monitoring drug quality in circumstances where more advanced techniques are not available. Therefore, we evaluated refractometry, colorimetry, and a technique combining both processes as simple and accurate field assays to rapidly test the quality of four commonly available antimalarial drugs: artesunate, chloroquine, quinine, and sulfadoxine. Method bias, sensitivity, specificity, and accuracy relative to high-performance liquid chromatographic (HPLC) analysis of drugs collected in the Lao PDR were assessed for each technique. The HPLC method for each drug was evaluated in terms of assay variability and accuracy. The accuracy of the combined method ranged from 0.96 to 1.00 for artesunate tablets, chloroquine injectables, quinine capsules, and sulfadoxine tablets, while the accuracy was 0.78 for enterically coated chloroquine tablets. These techniques provide a generally accurate, yet simple and affordable, means to assess drug quality in resource-poor settings.
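As a brief illustration of how such field assays are scored against an HPLC reference, the reported sensitivity, specificity, and accuracy can be computed from confusion-matrix counts. The counts below are hypothetical, chosen only to show the arithmetic; they are not the study's data:

```python
def diagnostic_metrics(tp, fp, tn, fn):
    """Sensitivity, specificity, and accuracy from confusion counts,
    treating HPLC as the reference ('gold standard') method."""
    sensitivity = tp / (tp + fn)            # poor-quality samples correctly flagged
    specificity = tn / (tn + fp)            # good-quality samples correctly passed
    accuracy = (tp + tn) / (tp + fp + tn + fn)
    return sensitivity, specificity, accuracy

# Hypothetical field-test results for 100 samples scored against HPLC
sens, spec, acc = diagnostic_metrics(tp=48, fp=1, tn=49, fn=2)
print(round(sens, 2), round(spec, 2), round(acc, 2))  # → 0.96 0.98 0.97
```

An accuracy near 1.00, as reported for most formulations above, means the field assay and HPLC agree on nearly every sample.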
Problem-based learning biotechnology courses in chemical engineering.
Glatz, Charles E; Gonzalez, Ramon; Huba, Mary E; Mallapragada, Surya K; Narasimhan, Balaji; Reilly, Peter J; Saunders, Kevin P; Shanks, Jacqueline V
2006-01-01
We have developed a series of upper undergraduate/graduate lecture and laboratory courses on biotechnological topics to supplement existing biochemical engineering, bioseparations, and biomedical engineering lecture courses. The laboratory courses are based on problem-based learning techniques, featuring two- and three-person teams, journaling, and performance rubrics for guidance and assessment. Participants initially found the courses difficult, since they had little experience with problem-based learning. To increase enrollment, we are combining the laboratory courses into 2-credit groupings and allowing students to substitute one of them for the second of our 2-credit chemical engineering unit operations laboratory courses.
Teachers Inservice Education for Learning Problems
ERIC Educational Resources Information Center
Smith, Stanley A.
1975-01-01
Describes workshop designed to aid kindergarten and first grade teachers in assessing a child's educational disability, deciding what to do about it and when to use referral. Concentrated on skills for evaluating the child's motor ability, intelligence level and emotional development. Also taught behavior modification techniques and remediation…
Dohrenbusch, R
2009-06-01
Chronic pain accompanied by disability and handicap is a frequent symptom necessitating medical assessment. Current guidelines for the assessment of malingering suggest discriminating between explanatory demonstration, aggravation and simulation. However, this distinction has not been clearly operationalized and validated. The necessity of assessment strategies based on general principles of psychological assessment and testing is emphasized. Standardized and normalized psychological assessment methods and symptom validation techniques should be used in the assessment of subjects with chronic pain problems. An adaptive procedure for assessing the validity of complaints is suggested to minimize effort and costs.
[Statistical prediction methods in violence risk assessment and its application].
Liu, Yuan-Yuan; Hu, Jun-Mei; Yang, Min; Li, Xiao-Song
2013-06-01
How to improve violence risk assessment is an urgent global problem. As a necessary part of risk assessment, statistical methods have remarkable impact. In this study, prediction methods in violence risk assessment are reviewed from a statistical perspective: logistic regression as an example of a multivariate statistical model, the decision tree model as an example of a data mining technique, and the neural network model as an example of artificial intelligence technology. This study provides a basis for further research on violence risk assessment.
Rinkel, Rico N; Verdonck-de Leeuw, Irma M; Doornaert, Patricia; Buter, Jan; de Bree, Remco; Langendijk, Johannes A; Aaronson, Neil K; Leemans, C René
2016-07-01
The objective of this study was to assess swallowing and speech outcomes after chemoradiation therapy for head and neck cancer, based on the patient-reported outcome measures Swallowing Quality of Life Questionnaire (SWAL-QOL) and Speech Handicap Index (SHI), both validated in Dutch and provided with cut-off scores. This cross-sectional study was conducted at the Department of Otolaryngology/Head and Neck Surgery of a University Medical Center and included sixty patients, 6 months to 5 years after chemoradiation for head and neck squamous cell carcinoma. Associations were tested between the outcome measures and independent variables (age, gender, tumor stage and site, radiotherapy technique, time since treatment, comorbidity, and food intake). Fifty-two patients returned the SWAL-QOL and 47 the SHI (response rates of 87 and 78 %, respectively). Swallowing and speech problems were present in 79 and 55 %, respectively. Normal oral intake was reported by 45 %, while 35 % had a soft diet and 20 % tube feeding. Patients with a soft diet or tube feeding reported more swallowing problems than patients with normal oral intake. Tumor subsite was significantly associated with swallowing outcome (fewer problems in larynx/hypopharynx than in oral cavity/oropharynx). Radiation technique was significantly associated with psychosocial speech problems (fewer problems in patients treated with IMRT). Swallowing and, to a lesser extent, speech problems in daily life are frequently present after chemoradiation therapy for head and neck cancer. Future prospective studies will give more insight into the course of speech and swallowing problems after chemoradiation and into the efficacy of new radiation techniques and of swallowing and speech rehabilitation programs.
Assessment of numerical techniques for unsteady flow calculations
NASA Technical Reports Server (NTRS)
Hsieh, Kwang-Chung
1989-01-01
The characteristics of unsteady flow motions have long been a serious concern in the study of various fluid dynamic and combustion problems. With the advancement of computer resources, numerical approaches to these problems have become feasible. The objective of this paper is to assess the accuracy of several numerical schemes for unsteady flow calculations. In the present study, Fourier error analysis is performed for various numerical schemes based on a two-dimensional wave equation. Four methods selected through the error analysis are then adopted for further assessment. Model problems include unsteady quasi-one-dimensional inviscid flows, two-dimensional wave propagation, and unsteady two-dimensional inviscid flows. Comparison between numerical and exact solutions shows that, although the second-order upwind scheme captures the unsteady flow and wave motions quite well, it is more dissipative than the sixth-order central difference scheme. Among the numerical approaches tested in this paper, the best-performing combination is the Runge-Kutta method for time integration with sixth-order central differencing for spatial discretization.
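The Fourier (von Neumann) error analysis mentioned above can be sketched for a simple one-dimensional case. The first-order upwind scheme and the linear advection model equation used here are illustrative stand-ins for the paper's schemes and its two-dimensional wave equation; the principle, computing the amplification factor G(θ) per Fourier mode, is the same:

```python
import cmath
import math

def upwind_amplification(nu, theta):
    """|G(theta)| for the first-order upwind scheme applied to u_t + c u_x = 0,
    where G = 1 - nu*(1 - exp(-i*theta)), nu is the Courant number, and theta
    is the phase angle of the Fourier mode. |G| < 1 indicates numerical
    dissipation; a non-dissipative scheme would have |G| = 1 for all theta."""
    return abs(1.0 - nu * (1.0 - cmath.exp(-1j * theta)))

nu = 0.5
thetas = [k * math.pi / 4 for k in range(5)]  # 0 .. pi
print([round(upwind_amplification(nu, t), 3) for t in thetas])
# → [1.0, 0.924, 0.707, 0.383, 0.0]
```

The decay of |G| toward zero at high phase angles is exactly the excess dissipation the abstract attributes to low-order upwinding relative to high-order central differencing.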
Tucker, Phebe; Pfefferbaum, Betty; Doughty, Debby E; Jones, Dan E; Jordan, Fred B; Nixon, Sara Jo
2002-10-01
Posttraumatic stress and depressive symptoms were assessed in 51 body handlers after Oklahoma City's 1995 terrorist bombing. Although many handlers were inexperienced and knew someone killed, symptoms were low postdisaster and decreased significantly after 1 year. Higher symptomatology and seeking mental health treatment correlated with increases in alcohol use and new physical problems but not with demographics, exposure, or experience. Four respondents with the highest posttraumatic stress symptoms at both time points reported high physical and alcohol use problems and mental health treatment use, suggesting that these should be carefully assessed in body handlers postdisaster. Coping techniques are described, as well as possible reasons for unexpected resilience in the majority.
Establishment of a center of excellence for applied mathematical and statistical research
NASA Technical Reports Server (NTRS)
Woodward, W. A.; Gray, H. L.
1983-01-01
The state of the art was assessed with regard to efforts in support of the crop production estimation problem, and alternative generic proportion estimation techniques were investigated. Topics covered include modeling the greenness profile (the Badhwar model), parameter estimation using mixture models such as CLASSY, and minimum distance estimation as an alternative to maximum likelihood estimation. Approaches to the problem of obtaining proportion estimates when the underlying distributions are asymmetric are examined, including the properties of the Weibull distribution.
Assessment of remote sensing technologies to discover and characterize waste sites
DOE Office of Scientific and Technical Information (OSTI.GOV)
NONE
1992-03-11
This report presents details about waste management practices that are being developed using remote sensing techniques to characterize DOE waste sites. Once the sites and problems have been located and characterized and an achievable restoration and remediation program has been established, efforts to reclaim the environment will begin. Special problems to be considered are: concentrated waste forms in tanks and pits; soil and ground water contamination; ground safety hazards for workers; and requirements for long-term monitoring.
Errors Using Observational Methods for Ergonomics Assessment in Real Practice.
Diego-Mas, Jose-Antonio; Alcaide-Marzal, Jorge; Poveda-Bautista, Rocio
2017-12-01
The degree to which practitioners correctly use observational methods for musculoskeletal disorder risk assessment was evaluated. Ergonomics assessment is a key issue for the prevention and reduction of work-related musculoskeletal disorders in workplaces. Observational assessment methods appear to be better matched to the needs of practitioners than direct measurement methods, and for this reason they are the most widely used techniques in real work situations. Despite the simplicity of observational methods, those responsible for assessing risks using these techniques should have some experience and know-how in order to use them correctly. We analyzed 442 risk assessments of actual jobs carried out by 290 professionals from 20 countries to determine their reliability. The results show that approximately 30% of the assessments performed by practitioners contained errors; in 13% of the assessments, the errors were severe and completely invalidated the results of the evaluation. Thus, despite the simplicity of observational methods, approximately 1 out of 3 assessments conducted by practitioners in actual work situations does not adequately evaluate the level of potential musculoskeletal disorder risk. This study reveals a problem suggesting that greater effort is needed to ensure that practitioners possess better knowledge of the techniques used to assess work-related musculoskeletal disorder risks, and that laws and regulations should be stricter as regards the qualifications and skills required of professionals.
Helping Students Overcome Depression and Anxiety: A Practical Guide. Second Edition
ERIC Educational Resources Information Center
Merrell, Kenneth W.
2008-01-01
This guide provides expert information and clear-cut strategies for assessing and treating internalizing problems in school settings. More than 40 specific psychoeducational and psychosocial intervention techniques are detailed, with a focus on approaches that are evidence based, broadly applicable, and easy to implement. Including 26…
Feedback Improvement in Automatic Program Evaluation Systems
ERIC Educational Resources Information Center
Skupas, Bronius
2010-01-01
Automatic program evaluation is a way to assess source program files. These techniques are used in learning management environments, programming exams and contest systems. However, use of automated program evaluation encounters problems: some evaluations are not clear to the students, and the system messages do not show reasons for lost points.…
Sampling for Contaminants in Ecological Systems
ERIC Educational Resources Information Center
Eberhardt, L. Lee; And Others
1976-01-01
This paper is concerned with problems in assessing the behavior of trace substances introduced into natural systems; sampling models of five classes that might be used in the study of contaminants are reviewed. Adaptation of an industrial experimentation method and of techniques used in economic geology to ecological sampling is recommended.…
NASA Technical Reports Server (NTRS)
Mavris, Dimitri N.
1998-01-01
Over the past few years, modern aircraft design has experienced a paradigm shift from designing for performance to designing for affordability. This report presents a probabilistic approach that allows traditional deterministic design methods to be extended to account for disciplinary, economic, and technological uncertainty. The probabilistic approach was facilitated by the Fast Probability Integration (FPI) technique, which allows the designer to gather valuable information about the vehicle's behavior in the design space and is efficient for assessing multi-attribute, multi-constraint problems in a realistic fashion. For illustration, the technique is applied to assess the economic and technological uncertainty associated with a Very Large Transport (VLT) aircraft concept. The FPI technique is used to determine the cumulative probability distributions of the design space, as bounded by economic objectives and performance constraints. These distributions were compared to established targets for a comparable large-capacity aircraft, similar in size to the Boeing 747-400. The conventional baseline configuration design space was determined to be infeasible and only marginally viable, motivating the infusion of advanced technologies, including reductions in drag, specific fuel consumption, wing weight, and Research, Development, Testing, and Evaluation costs. The resulting system design space was qualitatively assessed with technology metric "k" factors. The infusion of technologies shifted the VLT design into regions of feasibility and greater viability. The study also demonstrated a method by which the impact of new technologies may be assessed in a more system-focused approach.
Practical Team-Based Learning from Planning to Implementation
Bell, Edward; Eng, Marty; Fuentes, David G.; Helms, Kristen L.; Maki, Erik D.; Vyas, Deepti
2015-01-01
Team-based learning (TBL) helps instructors develop an active teaching approach for the classroom through group work. The TBL infrastructure engages students in the learning process through the Readiness Assessment Process, problem-solving through team discussions, and peer feedback to ensure accountability. This manuscript describes the benefits and barriers of TBL and, in a user-friendly manner, the tools necessary for developing, implementing, and critically evaluating the technique within coursework. Specifically, the manuscript describes the processes underpinning effective TBL development, preparation, implementation, assessment, and evaluation, as well as practical techniques and advice from the authors' classroom experiences. The paper also highlights published articles in the area of TBL in education, with a focus on pharmacy education. PMID:26889061
The operational use of Landsat for lake quality assessment
NASA Technical Reports Server (NTRS)
Scarpace, F. L.; Fisher, L. T.
1980-01-01
A cooperative program between the Wisconsin Department of Natural Resources and the University of Wisconsin for the assessment, with Landsat data, of the trophic status of all the significant inland lakes in Wisconsin is described. The analysis technique is a semiautomatic data acquisition and handling system which, in conjunction with an analytical categorization scheme, can be used for classifying inland lakes into one of seven categories of eutrophication and one of four problem types.
Sadowski, Lukasz
2013-01-01
In recent years, the corrosion of steel reinforcement has become a major problem in the construction industry. Therefore, much attention has been given to developing methods of predicting the service life of reinforced concrete structures. The progress of corrosion cannot be assessed visually until a crack or a delamination appears. The corrosion process can, however, be tracked using several electrochemical techniques, most commonly the half-cell potential measurement technique; it is generally accepted that this should be supplemented with other techniques. Hence, a methodology is proposed for assessing the probability of corrosion in concrete slabs by combining two methods: the half-cell potential method and the concrete resistivity method. An assessment of the probability of corrosion in reinforced concrete structures carried out using the proposed methodology is presented. 200 mm thick, 750 mm × 750 mm reinforced concrete slab specimens were investigated, and the potential Ecorr and the concrete resistivity ρ were measured at each point of the applied grid. The experimental results indicate that the proposed methodology can be successfully used to assess the probability of corrosion in concrete structures.
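A minimal sketch of combining the two measurements into a per-point corrosion-risk label follows. The thresholds are the widely quoted ASTM C876 potential bands (vs. a Cu/CuSO4 electrode) and common resistivity rules of thumb, not necessarily the probability mapping used in the paper:

```python
def corrosion_risk(potential_mv, resistivity_kohm_cm):
    """Classify one grid point from half-cell potential (mV vs Cu/CuSO4)
    and concrete resistivity (kOhm*cm). Illustrative thresholds only."""
    if potential_mv > -200:
        p_band = 0   # ASTM C876: corrosion unlikely (<10 % probability)
    elif potential_mv >= -350:
        p_band = 1   # uncertain range
    else:
        p_band = 2   # corrosion likely (>90 % probability)
    if resistivity_kohm_cm > 20:
        r_band = 0   # high resistivity: low expected corrosion rate
    elif resistivity_kohm_cm >= 10:
        r_band = 1
    else:
        r_band = 2
    # Conservative combination: take the worse of the two indications.
    return ("low", "moderate", "high")[max(p_band, r_band)]

# Three hypothetical grid points: (potential in mV, resistivity in kOhm*cm)
grid = [(-150, 25), (-300, 12), (-420, 6)]
print([corrosion_risk(e, r) for e, r in grid])  # → ['low', 'moderate', 'high']
```

Combining the two readings conservatively reflects the abstract's point that half-cell potentials alone are generally accepted as insufficient.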
Computationally efficient stochastic optimization using multiple realizations
NASA Astrophysics Data System (ADS)
Bayer, P.; Bürger, C. M.; Finkel, M.
2008-02-01
This study is concerned with computationally efficient methods for solving stochastic optimization problems involving multiple equally probable realizations of uncertain parameters. A new and straightforward technique is introduced that is based on dynamically ordering the stack of realizations during the search procedure. The rationale is that a small number of critical realizations govern the output of a reliability-based objective function. Several variants of this "stack ordering" approach are tested on a problem typical of designing a water supply well field, and the results are statistically assessed in terms of optimality and nominal reliability. The study demonstrates that simply ordering a stack of 500 realizations while applying an evolutionary search algorithm can save about half of the model runs without compromising the optimization procedure. More advanced variants of stack ordering can, if properly configured, save more than 97% of the computational effort that would be required if the entire set of realizations were considered. These findings are promising for similar problems of water management and reliability-based design in general, and particularly for non-convex problems that require heuristic search techniques.
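The stack-ordering idea can be sketched as follows. The toy `simulate` model, the demand numbers, and the early-stopping rule are illustrative assumptions standing in for the study's groundwater model, but they show the mechanism: failing ("critical") realizations migrate to the front of the stack, so poor candidate designs are rejected after only a few model runs:

```python
def reliable_fraction(design, realizations, stack, target=0.9):
    """Evaluate a design over an ordered stack of realization indices.

    Realizations that fail the design are moved to the front of the stack
    ('stack ordering'), and evaluation stops early once the reliability
    target can no longer be met -- saving model runs on poor designs.
    Returns (feasible, number_of_model_runs).
    """
    n = len(stack)
    allowed_failures = int((1.0 - target) * n)
    failures = []
    runs = 0
    for idx in list(stack):
        runs += 1
        if not simulate(design, realizations[idx]):   # one "model run"
            failures.append(idx)
            if len(failures) > allowed_failures:
                break                                  # target unreachable
    # Critical (failing) realizations lead the stack for the next design.
    stack[:] = failures + [i for i in stack if i not in failures]
    return len(failures) <= allowed_failures, runs

# Toy stand-in for the groundwater model: a realization "fails" when its
# hypothetical demand exceeds the design's pumping capacity.
def simulate(design, realization):
    return design >= realization

realizations = [0.2, 0.9, 0.4, 0.95, 0.3]
stack = list(range(len(realizations)))
print(reliable_fraction(0.5, realizations, stack, target=0.9))  # → (False, 2)
print(stack[0])  # a critical realization now leads the stack → 1
```

In a real search loop, each candidate design produced by the evolutionary algorithm would reuse the reordered stack, which is where the reported savings in model runs come from.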
NASA Astrophysics Data System (ADS)
Hafezalkotob, Arian; Hafezalkotob, Ashkan
2017-06-01
A target-based MADM method covers beneficial and non-beneficial attributes as well as target values for some attributes. Such techniques are the most comprehensive forms of MADM approaches, and they can also be applied to traditional decision-making problems in which only beneficial and non-beneficial attributes exist. In many practical selection problems some attributes have given target values, and the values of the decision matrix and of the target-based attributes may be provided as intervals. Some target-based decision-making methods have recently been developed; however, a research gap exists in the area of MADM techniques with target-based attributes under uncertainty of information. We extend the MULTIMOORA method to solve practical material selection problems in which material properties and their target values are given as interval numbers. We employ various concepts of interval computation to reduce the degeneration of uncertain data: we use interval arithmetic and introduce an innovative formula for the distance between interval numbers to create an interval target-based normalization technique. Furthermore, we use a pairwise preference matrix based on the concept of the degree of preference of interval numbers to calculate the maximum, minimum, and ranking of these numbers. Two decision-making problems regarding biomaterials selection for hip and knee prostheses are discussed. Preference-degree-based ranking lists for the subordinate parts of the extended MULTIMOORA method are generated by calculating the relative degrees of preference for the arranged assessment values of the biomaterials. The resulting rankings are compared with the outcomes of other target-based models in the literature.
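The degree-of-preference comparison of interval numbers can be sketched with one common possibility-degree formula; this formula, and the made-up material names and interval scores below, are assumptions for illustration, and the paper's exact interval-distance and preference formulas may differ:

```python
def preference_degree(a, b):
    """Degree to which interval a = (a_lo, a_hi) is preferred to b = (b_lo, b_hi).

    A common possibility-degree formula: values near 1 mean a dominates b,
    0.5 means the two intervals are indistinguishable.
    """
    a_lo, a_hi = a
    b_lo, b_hi = b
    width = (a_hi - a_lo) + (b_hi - b_lo)
    if width == 0:  # both intervals degenerate to points
        return 1.0 if a_lo > b_lo else (0.5 if a_lo == b_lo else 0.0)
    p = (max(0.0, a_hi - b_lo) - max(0.0, a_lo - b_hi)) / width
    return min(1.0, max(0.0, p))

# Ranking three hypothetical interval-valued material scores by summed
# preference over all rivals (higher total = preferred):
scores = {"Ti-alloy": (0.6, 0.8), "CoCr": (0.5, 0.7), "UHMWPE": (0.2, 0.4)}
order = sorted(scores,
               key=lambda m: sum(preference_degree(scores[m], scores[n])
                                 for n in scores if n != m),
               reverse=True)
print(order)  # → ['Ti-alloy', 'CoCr', 'UHMWPE']
```

Overlapping intervals such as (0.6, 0.8) and (0.5, 0.7) yield a graded preference (here 0.75) rather than a hard dominance verdict, which is the point of preference-degree ranking under uncertainty.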
Prediction of Software Reliability using Bio Inspired Soft Computing Techniques.
Diwaker, Chander; Tomar, Pradeep; Poonia, Ramesh C; Singh, Vijander
2018-04-10
Many models have been developed for predicting software reliability, but they are restricted to particular methodologies and to a limited number of parameters. A number of techniques and methodologies may be used for reliability prediction, and parameter selection deserves particular attention, since the reliability of a system may increase or decrease depending on the parameters chosen; there is thus a need to identify the factors that most heavily affect system reliability. Reusability is now widely used across research areas and is the basis of Component-Based Systems (CBS): cost, time, and human effort can be saved using Component-Based Software Engineering (CBSE) concepts, and CBSE metrics may be used to assess which techniques are most suitable for estimating system reliability. Soft computing is used for small as well as large-scale problems where it is difficult to find accurate results due to uncertainty or randomness. Several possibilities are available for applying soft computing techniques to medicine-related problems: clinical medicine makes significant use of fuzzy logic and neural network methodologies, while basic medical science most frequently employs combined neural network and genetic algorithm approaches, and medical scientists have shown considerable interest in applying soft computing methodologies in the genetics, physiology, radiology, cardiology, and neurology disciplines. CBSE encourages users to reuse past and existing software when building new products, providing quality while saving time, memory space, and money. This paper focuses on the assessment of commonly used soft computing techniques such as the Genetic Algorithm (GA), Neural Network (NN), Fuzzy Logic, Support Vector Machine (SVM), Ant Colony Optimization (ACO), Particle Swarm Optimization (PSO), and Artificial Bee Colony (ABC).
It presents the workings of these soft computing techniques and assesses their use in predicting reliability; the parameters considered in reliability estimation and prediction are also discussed. This study can be used in estimating and predicting the reliability of various instruments used in medical systems, software engineering, computer engineering, and mechanical engineering. These concepts can be applied to both software and hardware to predict reliability using CBSE.
Klegeris, Andis; Bahniwal, Manpreet; Hurren, Heather
2013-01-01
Problem-based learning (PBL) was originally introduced in medical education programs as a form of small-group learning, but its use has now spread to large undergraduate classrooms in various other disciplines. Introduction of new teaching techniques, including PBL-based methods, needs to be justified by demonstrating the benefits of such techniques over classical teaching styles. Previously, we demonstrated that introduction of tutor-less PBL in a large third-year biochemistry undergraduate class increased student satisfaction and attendance. The current study assessed the generic problem-solving abilities of students from the same class at the beginning and end of the term, and compared student scores with similar data obtained in three classes not using PBL. Two generic problem-solving tests of equal difficulty were administered such that students took different tests at the beginning and the end of the term. Blinded marking showed a statistically significant 13% increase in the test scores of the biochemistry students exposed to PBL, while no trend toward significant change in scores was observed in any of the control groups not using PBL. Our study is among the first to demonstrate that use of tutor-less PBL in a large classroom leads to statistically significant improvement in generic problem-solving skills of students. PMID:23463230
NASA Technical Reports Server (NTRS)
Wilson, V. E.
1980-01-01
Alternate concepts and design approaches were developed for suction panels, and techniques were defined for integrating these panel designs into a complete LFC 200R wing. The design concepts and approaches were analyzed to assure that they would meet the strength, stability, and internal volume requirements, and cost and weight comparisons of the concepts were made. Problems of integrating the concepts into a complete aircraft system were addressed: methods were developed for making both chordwise and spanwise splices, fuel-tight joints, and internal duct installations. Manufacturing problems such as slot alignment, tapered slot spacing, production methods, and repair techniques were addressed. An assessment of the program was used to develop recommendations for additional research on the development of SPF/DB for LFC structure.
Chiu, Mary; Pauley, Tim; Wesson, Virginia; Pushpakumar, Dunstan; Sadavoy, Joel
2015-06-01
The value of care provided by informal carers in Canada is estimated at $26 billion annually (Hollander et al., 2009). However, carers' needs are often overlooked, limiting their capacity to provide care. Problem-solving therapy (PST), a structured approach to problem solving (PS) and a core principle of the Reitman Centre CARERS Program, has been shown to alleviate emotional distress and improve carers' competence (Chiu et al., 2013). This study evaluated the effectiveness of a problem-solving techniques-based intervention, adapted from PST methods, in enhancing carers' physical and emotional capacity to care for relatives with dementia living in the community. 56 carers were equally allocated to the intervention group or a control arm. Carers in the intervention group received three 1 hr visits by a care coordinator (CC) who had been given advanced training in the PS techniques-based intervention. Coping, mastery, competence, burden, and perceived stress of the carers were evaluated at baseline and post-intervention using standardized assessment tools, and an intention-to-treat analysis utilizing repeated measures ANOVA was performed on the data. Post-intervention measures completion rates were 82% and 92% for the intervention and control groups, respectively. Carers in the intervention group showed significantly improved task-oriented coping, mastery, and competence and significantly reduced emotion-oriented coping, burden, and stress (p < 0.01-0.001); control carers showed no change. PS techniques, when learned and delivered by CCs as a tool to coach carers in their day-to-day caregiving, improve carers' caregiving competence and coping and reduce burden and perceived stress. This may reduce dependence on primary, psychiatric, and institutional care.
Results provide evidence that establishing effective partnerships between inter-professional clinicians in academic clinical health science centers, and community agencies can extend the reach of the expertise of specialized health care institutions.
Corneal markers of diabetic neuropathy.
Pritchard, Nicola; Edwards, Katie; Shahidi, Ayda M; Sampson, Geoff P; Russell, Anthony W; Malik, Rayaz A; Efron, Nathan
2011-01-01
Diabetic neuropathy is a significant clinical problem that currently has no effective therapy, and in advanced cases, leads to foot ulceration and lower limb amputation. The accurate detection, characterization and quantification of this condition are important in order to define at-risk patients, anticipate deterioration, monitor progression, and assess new therapies. This review evaluates novel corneal methods of assessing diabetic neuropathy. Two new noninvasive corneal markers have emerged, and in cross-sectional studies have demonstrated their ability to stratify the severity of this disease. Corneal confocal microscopy allows quantification of corneal nerve parameters and noncontact corneal esthesiometry, the functional correlate of corneal structure, assesses the sensitivity of the cornea. Both these techniques are quick to perform, produce little or no discomfort for the patient, and are suitable for clinical settings. Each has advantages and disadvantages over traditional techniques for assessing diabetic neuropathy. Application of these new corneal markers for longitudinal evaluation of diabetic neuropathy has the potential to reduce dependence on more invasive, costly, and time-consuming assessments, such as skin biopsy.
Problem: The Leetown Science Center and ~ 500 acre research facility operated by the U.S. Geological Survey (USGS) Biological Resources Division (BRD) In West Virginia investigates the health and habitats of aquatic species. Large quantities of good quality cold water are needed ...
Considering the Efficacy of Web-Based Worked Examples in Introductory Chemistry
ERIC Educational Resources Information Center
Crippen, Kent J.; Earl, Boyd L.
2004-01-01
Theory suggests that studying worked examples and engaging in self-explanation will improve learning and problem solving. A growing body of evidence supports the use of web-based assessments for improving undergraduate performance in traditional large enrollment courses. This article describes a study designed to investigate these techniques in a…
Volunteer Effectiveness in Counseling Chronically Depressed Women Outpatients.
ERIC Educational Resources Information Center
Waite, John; And Others
A group of depressed women outpatients who were attending the outpatient clinic of a Midwestern state hospital were assigned women volunteers who had been trained in either problem solving and Rogerian relationship techniques or cognitive-behavioral therapy. Volunteers met with patients at least one hour per week. Patients were assessed on various…
Improving Articulation and Transfer Relationships. New Directions for Community Colleges, Number 39.
ERIC Educational Resources Information Center
Kintzer, Frederick C., Ed.
1982-01-01
With the intent of revitalizing the study of educational articulation and transfer, this collection of essays describes and assesses the current status of transfer education, points to particular problems and concerns, and highlights specific techniques, activities, and practices. The volume includes "The Transfer Function--One of Many,"…
A Meta-Analytic Investigation of Fiedler's Contingency Model of Leadership Effectiveness.
ERIC Educational Resources Information Center
Strube, Michael J.; Garcia, Joseph E.
According to Fiedler's Contingency Model of Leadership Effectiveness, group performance is a function of the leader-situation interaction. A review of past validations has found several problems associated with the model. Meta-analytic techniques were applied to the Contingency Model in order to assess the validation evidence quantitatively. The…
A community assessment of privacy preserving techniques for human genomes
2014-01-01
To answer the need for the rigorous protection of biomedical data, we organized the Critical Assessment of Data Privacy and Protection initiative as a community effort to evaluate privacy-preserving dissemination techniques for biomedical data. We focused on the challenge of sharing aggregate human genomic data (e.g., allele frequencies) in a way that preserves the privacy of the data donors, without undermining the utility of genome-wide association studies (GWAS) or impeding their dissemination. Specifically, we designed two problems for disseminating the raw data and the analysis outcome, respectively, based on publicly available data from HapMap and from the Personal Genome Project. A total of six teams participated in the challenges. The final results were presented at a workshop of the iDASH (integrating Data for Analysis, 'anonymization,' and SHaring) National Center for Biomedical Computing. We report the results of the challenge and our findings about the current genome privacy protection techniques. PMID:25521230
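The aggregate-sharing problem above can be illustrated with a minimal sketch of one widely used protection technique: epsilon-differential privacy, implemented by adding Laplace noise to published allele frequencies. This is a generic illustration, not the method used by any iDASH team; the function names, parameter values, and data below are all assumptions.

```python
import math
import random

def laplace_noise(scale, rng):
    # inverse-CDF sampling from Laplace(0, scale) using only the stdlib
    u = rng.random() - 0.5          # uniform in [-0.5, 0.5)
    return -scale * math.copysign(1.0, u) * math.log(1.0 - 2.0 * abs(u))

def privatize_frequencies(freqs, n_donors, epsilon, seed=0):
    """Add Laplace noise to allele frequencies before publication.

    One donor changes a frequency by at most 1/n_donors, so the query
    sensitivity is 1/n_donors and the noise scale is sensitivity/epsilon.
    """
    rng = random.Random(seed)
    scale = 1.0 / (n_donors * epsilon)
    noisy = [f + laplace_noise(scale, rng) for f in freqs]
    # clip back into the valid [0, 1] range for a frequency
    return [min(1.0, max(0.0, f)) for f in noisy]

published = privatize_frequencies([0.1, 0.5, 0.9], n_donors=1000, epsilon=1.0)
```

The tension the challenge measured is visible in `epsilon`: smaller values give stronger donor privacy but noisier frequencies, degrading GWAS utility.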
Nanomedicine: Techniques, Potentials, and Ethical Implications
Ebbesen, Mette; Jensen, Thomas G.
2006-01-01
Nanotechnology is concerned with materials and systems whose structures and components exhibit novel physical, chemical, and biological properties due to their nanoscale size. This paper focuses on what is known as nanomedicine, referring to the application of nanotechnology to medicine. We consider the use and potentials of emerging nanoscience techniques in medicine such as nanosurgery, tissue engineering, and targeted drug delivery, and we discuss the ethical questions that these techniques raise. The ethical considerations involved in nanomedicine are related to risk assessment in general, somatic-cell versus germline-cell therapy, the enhancement of human capabilities, research into human embryonic stem cells and the toxicity, uncontrolled function and self-assembly of nanoparticles. The ethical considerations associated with the application of nanotechnology to medicine have not been greatly discussed. This paper aims to balance clear ethical discussion and sound science and so provide nanotechnologists and biotechnologists with tools to assess ethical problems in nanomedicine. PMID:17489016
A community assessment of privacy preserving techniques for human genomes.
Jiang, Xiaoqian; Zhao, Yongan; Wang, Xiaofeng; Malin, Bradley; Wang, Shuang; Ohno-Machado, Lucila; Tang, Haixu
2014-01-01
Radioactive nondestructive test method
NASA Technical Reports Server (NTRS)
Obrien, J. R.; Pullen, K. E.
1971-01-01
Various radioisotope techniques were used as diagnostic tools for determining the performance of spacecraft propulsion feed system elements. Applications were studied in four tasks. The first two required experimental testing involving the propellant liquid oxygen difluoride (OF2): the neutron activation analysis of dissolved or suspended metals, and the use of radioactive tracers to evaluate the probability of constrictions in passive components (orifices and filters) becoming clogged by matter dissolved or suspended in the OF2. The other tasks were an appraisal of the applicability of radioisotope techniques to problems arising from the exposure of components to liquid/gas combinations, and an assessment of the applicability of the techniques to other propellants.
Chromatography in the detection and characterization of illegal pharmaceutical preparations.
Deconinck, Eric; Sacré, Pierre-Yves; Courselle, Patricia; De Beer, Jacques O
2013-09-01
Counterfeit and illegal pharmaceutical products are a growing worldwide problem, and their detection and characterization constitute a major challenge for analytical laboratories. Spectroscopic techniques such as infrared spectroscopy and Raman spectroscopy have long been the first methods of choice for detecting counterfeits and illegal preparations, but owing to the evolution of the seized products and the necessity of risk assessment, chromatographic methods are becoming more important in this domain. This review gives a general overview of the techniques described in the literature for characterizing counterfeit and illegal pharmaceutical preparations, focusing on the role of chromatographic techniques with different detection tools.
Safety Guided Design of Crew Return Vehicle in Concept Design Phase Using STAMP/STPA
NASA Astrophysics Data System (ADS)
Nakao, H.; Katahira, M.; Miyamoto, Y.; Leveson, N.
2012-01-01
In the concept development and design phase of a new space system, such as a Crew Return Vehicle, designers tend to focus on how to implement new technology. Designers also weigh the difficulty of using the new technology, trade off several candidate system designs, and then choose an optimal design from among the candidates. Safety should be a key aspect driving optimal concept design. However, in past concept design activities, safety analyses such as FTA have not been used to drive the design, because such techniques focus on component failure, and component failure cannot be considered in the concept design phase. A solution to these problems is to apply a new hazard analysis technique called STAMP/STPA. STAMP/STPA treats safety as a control problem rather than a failure problem and identifies hazardous scenarios and their causes. Because defining control flow is essential in the concept design phase, STAMP/STPA can be a useful tool for assessing the safety of candidate systems and can form part of the rationale for choosing a design as the baseline of the system. In this paper, we describe a case study of safety-guided concept design applying STPA, the new hazard analysis technique, together with a model-based specification technique, to a Crew Return Vehicle design, and we evaluate the benefits of using STAMP/STPA in the concept development phase.
Tang, Peter
2017-12-01
In situ ulnar nerve release has been gaining popularity as a simple, effective, and low-morbidity procedure for the treatment of cubital tunnel syndrome. One concern with the technique is how to manage an unstable ulnar nerve after release. It is unclear how much nerve subluxation will lead to problems, and, surprisingly, there is no grading system for assessing ulnar nerve instability. I propose such a grading system, as well as a new technique to stabilize the unstable ulnar nerve. The blocking flap technique consists of raising a rectangular flap off the flexor/pronator fascia and attaching it to the posterior subcutaneous flap so that it blocks the nerve from subluxation/dislocation.
Fan noise prediction assessment
NASA Technical Reports Server (NTRS)
Bent, Paul H.
1995-01-01
This report evaluates two techniques for predicting the fan noise radiation from engine nacelles. The first is a relatively computationally intensive finite element technique. The code is named ARC, an abbreviation of Acoustic Radiation Code, and was developed by Eversman. This is actually a suite of software that first generates a grid around the nacelle, then solves for the potential flowfield, and finally solves the acoustic radiation problem. The second approach is an analytical technique requiring minimal computational effort. This is termed the cutoff ratio technique and was developed by Rice. Details of the duct geometry, such as the hub-to-tip ratio and the Mach number of the flow in the duct, as well as the modal content of the duct noise, are required for proper prediction.
Recommendations for benefit-risk assessment methodologies and visual representations.
Hughes, Diana; Waddingham, Ed; Mt-Isa, Shahrul; Goginsky, Alesia; Chan, Edmond; Downey, Gerald F; Hallgreen, Christine E; Hockley, Kimberley S; Juhaeri, Juhaeri; Lieftucht, Alfons; Metcalf, Marilyn A; Noel, Rebecca A; Phillips, Lawrence D; Ashby, Deborah; Micaleff, Alain
2016-03-01
The purpose of this study is to draw on the practical experience from the PROTECT BR case studies and make recommendations regarding the application of a number of methodologies and visual representations for benefit-risk assessment. Eight case studies based on the benefit-risk balance of real medicines were used to test various methodologies that had been identified from the literature as having potential applications in benefit-risk assessment. Recommendations were drawn up based on the results of the case studies. A general pathway through the case studies was evident, with various classes of methodologies having roles to play at different stages. Descriptive and quantitative frameworks were widely used throughout to structure problems, with other methods such as metrics, estimation techniques and elicitation techniques providing ways to incorporate technical or numerical data from various sources. Similarly, tree diagrams and effects tables were universally adopted, with other visualisations available to suit specific methodologies or tasks as required. Every assessment was found to follow five broad stages: (i) Planning, (ii) Evidence gathering and data preparation, (iii) Analysis, (iv) Exploration and (v) Conclusion and dissemination. Adopting formal, structured approaches to benefit-risk assessment was feasible in real-world problems and facilitated clear, transparent decision-making. Prior to this work, no extensive practical application and appraisal of methodologies had been conducted using real-world case examples, leaving users with limited knowledge of their usefulness in the real world. The practical guidance provided here takes us one step closer to a harmonised approach to benefit-risk assessment from multiple perspectives. Copyright © 2016 John Wiley & Sons, Ltd.
Analysis of the high-temperature particulate collection problem
DOE Office of Scientific and Technical Information (OSTI.GOV)
Razgaitis, R.
1977-10-01
Particulate agglomeration and separation at high temperatures and pressures are examined, with particular emphasis on the unique features of the direct-cycle application of fluidized-bed combustion. The basic long-range mechanisms of aerosol separation are examined, and the effects of high temperature and high pressure on usable collection techniques are assessed. Primary emphasis is placed on those avenues that are not currently attracting widespread research. The high-temperature, particulate-collection problem is surveyed, together with the peculiar requirements associated with operation of turbines with particulate-bearing gas streams. 238 references.
Successes and surprises with computer-extended series
NASA Astrophysics Data System (ADS)
van Dyke, M.
A promising alternative to purely numerical solution of flow problems is the seminumerical technique of extending a perturbation series to high order by delegating the mounting arithmetic to a computer. It is noted, however, that since the method is still under development, several erroneous conclusions have been published. First, three clear successes of this method are described. It is then shown how a failure to carefully assess results has in two cases led to false conclusions. Finally, two problems are discussed that yield surprising results not yet accepted by all other investigators.
Considering social and environmental concerns as reservoir operating objectives
NASA Astrophysics Data System (ADS)
Tilmant, A.; Georis, B.; Doulliez, P.
2003-04-01
Sustainability principles are now widely recognized as key criteria for water resource development schemes, such as hydroelectric and multipurpose reservoirs. Development decisions no longer rely solely on economic grounds, but also consider environmental and social concerns through so-called environmental and social impact assessments. The objective of this paper is to show that environmental and social concerns can also be addressed in the management (operation) of existing or projected reservoir schemes. By adequately exploiting the results of environmental and social impact assessments, or by carrying out surveys of water users, experts, and managers, efficient (Pareto-optimal) reservoir operating rules can be derived using flexible mathematical programming techniques. By reformulating the problem as a multistage flexible constraint satisfaction problem, incommensurable and subjective operating objectives can contribute, along with classical economic objectives, to the determination of optimal release decisions. Employed in a simulation mode, the results can be used to assess the long-term impacts of various operating rules on the social well-being of affected populations as well as on the integrity of the environment. The methodology is illustrated with a reservoir reallocation problem in Chile.
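The flavor of a flexible (soft) constraint formulation can be sketched with a toy single-period release decision, where each operating objective contributes a weighted violation penalty and the release minimizing total penalty is chosen. All targets, weights, and units below are invented for illustration; the paper's multistage formulation is far richer.

```python
# Toy soft-constraint reservoir release decision. Economic, environmental,
# and social objectives each become a penalty term; subjective objectives
# enter through their weights. Numbers are invented, not from the paper.

def total_penalty(release_m3s):
    econ = abs(release_m3s - 80.0)          # deviation from hydropower target
    env = max(0.0, 30.0 - release_m3s)      # shortfall below ecological minimum flow
    social = max(0.0, release_m3s - 100.0)  # flood-risk excess above social ceiling
    return 1.0 * econ + 5.0 * env + 10.0 * social

# brute-force search over candidate releases (0..120 m^3/s)
best_release = min(range(0, 121), key=total_penalty)
print(best_release)  # -> 80 (meets the power target without violating flow limits)
```

Varying the weights traces out different Pareto-efficient compromises between the incommensurable objectives, which is how subjective priorities can shape release decisions.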
A novel optical investigation technique for railroad track inspection and assessment
NASA Astrophysics Data System (ADS)
Sabato, Alessandro; Beale, Christopher H.; Niezrecki, Christopher
2017-04-01
Track failures due to cross-tie degradation or loss of ballast support may result in problems ranging from simple service interruptions to derailments. Structural Health Monitoring (SHM) of railway track is important for safety and for reducing downtime and maintenance costs. Existing track inspection technologies are insufficient, and novel, cost-effective methods for assessing track health are needed. Advances in recent years in camera technology, optical sensors, and image-processing algorithms have made machine vision, Structure from Motion (SfM), and three-dimensional (3D) Digital Image Correlation (DIC) systems extremely appealing techniques for extracting structural deformations and geometry profiles. Optically based, non-contact measurement techniques may therefore be used for assessing surface defects, rail and tie deflection profiles, and ballast condition. In this study, two camera-based measurement systems are proposed for cross-tie and ballast condition assessment and track examination. The first consists of four pairs of cameras installed on the underside of a rail car to detect the induced deformation and displacement along the whole length of a track cross tie using 3D DIC measurement techniques. The second consists of another set of cameras using SfM techniques to obtain a 3D rendering of the infrastructure from a series of two-dimensional (2D) images and qualitatively evaluate the state of the track. The feasibility of the proposed optical systems is evaluated through extensive laboratory tests, demonstrating their ability to measure the parameters of interest (e.g., a cross tie's full-field displacement, vertical deflection, and shape) for assessment and SHM of railroad track.
Checklists for powder inhaler technique: a review and recommendations.
Basheti, Iman A; Bosnic-Anticevich, Sinthia Z; Armour, Carol L; Reddel, Helen K
2014-07-01
Turbuhaler and Diskus are commonly used powder inhaler devices for patients with respiratory disease. Their effectiveness is limited in part by a patient's ability to use them correctly. This has led to numerous studies being conducted over the last decade to assess the correct use of these devices by patients and health care professionals. These studies have generally used device-specific checklists to assess technique, this being the most feasible and accessible method for assessment. However, divergence between the checklists and scoring systems for the same device in different studies makes direct comparison of results difficult and at times inappropriate. Little evidence is available to assess the relative importance of different criteria; however, brief patient training based on specific inhaler technique checklists leads to significant improvement in asthma outcomes. This paper reviews common checklists and scoring systems used for Turbuhaler and Diskus, discusses the problem of heterogeneity between different checklists, and finally recommends suitable checklists and scoring systems for these devices based on the literature and previous findings. Only when similar checklists are used across different research studies will accurate comparisons and meta-analysis be possible. Copyright © 2014 by Daedalus Enterprises.
Cooperative analysis expert situation assessment research
NASA Technical Reports Server (NTRS)
Mccown, Michael G.
1987-01-01
For the past few decades, Rome Air Development Center (RADC) has been conducting research in Artificial Intelligence (AI). When the recent advances in hardware technology made many AI techniques practical, the Intelligence and Reconnaissance Directorate of RADC initiated an applications program entitled Knowledge Based Intelligence Systems (KBIS). The goal of the program is the development of a generic Intelligent Analyst System, an open machine with the framework for intelligence analysis, natural language processing, and man-machine interface techniques, needing only the specific problem domain knowledge to be operationally useful. The development of KBIS is described.
An assessment of transient hydraulics phenomena and its characterization
NASA Technical Reports Server (NTRS)
Mortimer, R. W.
1974-01-01
A systematic search of the open literature was performed to identify the causes, effects, and characterization (modelling and solution techniques) of transient hydraulics phenomena. The governing partial differential equations found to be used most often in the literature are presented. Detailed survey sheets are shown which record, for each paper, the type of hydraulics problem, the cause, the modelling, the solution technique utilized, and the experimental verification used. References and source documents are listed, and the purpose and accomplishments of the study are discussed.
Applied Remote Sensing Program (ARSP)
NASA Technical Reports Server (NTRS)
Johnson, J. D.; Foster, K. E.; Mouat, D. A.; Miller, D. A.; Conn, J. S.
1976-01-01
The activities and accomplishments of the Applied Remote Sensing Program during FY 1975-1976 are reported. The principal objective of the program continues to be the design of projects having specific decision-making impacts. These projects apply remote sensing techniques in cooperation and collaboration with local, state, and federal agencies whose responsibilities lie in planning, zoning, and environmental monitoring and/or assessment. The end result of the projects is the use of remote sensing techniques in problem solving by the involved agencies.
Balancing techniques for high-speed flexible rotors
NASA Technical Reports Server (NTRS)
Smalley, A. J.
1978-01-01
Ideal and non-ideal conditions for multiplane balancing are addressed. Methodology and procedures for identifying optimum balancing configurations and for assessing, quantitatively, the penalties associated with non-optimum configurations were developed and demonstrated. The problems introduced when vibration sensors are supported on flexible mounts were assessed experimentally, and the effects of flexural asymmetry in the rotor on balancing were investigated. A general purpose method for predicting the threshold of instability of an asymmetric rotor was developed, and its predictions are compared with measurements under different degrees of asymmetry.
How to use: the neonatal neurological examination.
Wusthoff, Courtney J
2013-08-01
The neurological exam can be a challenging part of a newborn's full evaluation. At the same time, the neonatal neurological exam is a useful tool in identifying babies needing closer evaluation for potential problems. The Dubowitz assessment is a standardised approach to the neonatal neurological exam designed for use by paediatricians in routine practice. Evidence has validated this technique and delineated its utility as a screening exam in various populations. This paper reviews clinical application of the Dubowitz assessment of the newborn.
Life-assessment technique for nuclear power plant cables
NASA Astrophysics Data System (ADS)
Bartoníček, B.; Hnát, V.; Plaček, V.
1998-06-01
The condition of polymer-based cable material can be best characterized by measuring the elongation at break of its insulating materials. However, it is often not possible to take samples large enough for measurement with a tensile testing machine. The problem has been conveniently solved by utilizing the differential scanning calorimetry technique: several microsamples are taken from the tested cable and the oxidation induction time (OIT) is determined. For each cable subject to lifetime assessment, OIT must be correlated with elongation at break, and elongation at break with the cable service time. A reliable assessment of the cable lifetime depends on the accuracy of these correlations. Consequently, well-known synergistic effects, namely dose rate effects and effects resulting from the different sequence of applying radiation and elevated temperature, must be taken into account.
2013-01-01
In recent years, the corrosion of steel reinforcement has become a major problem in the construction industry. Therefore, much attention has been given to developing methods of predicting the service life of reinforced concrete structures. The progress of corrosion cannot be visually assessed until a crack or a delamination appears. The corrosion process can be tracked using several electrochemical techniques. Most commonly the half-cell potential measurement technique is used for this purpose. However, it is generally accepted that it should be supplemented with other techniques. Hence, a methodology for assessing the probability of corrosion in concrete slabs by means of a combination of two methods, that is, the half-cell potential method and the concrete resistivity method, is proposed. An assessment of the probability of corrosion in reinforced concrete structures carried out using the proposed methodology is presented. 200 mm thick, 750 mm × 750 mm reinforced concrete slab specimens were investigated. The corrosion potential Ecorr and the concrete resistivity ρ were measured at each point of the applied grid. The experimental results indicate that the proposed methodology can be successfully used to assess the probability of corrosion in concrete structures. PMID:23766706
ERIC Educational Resources Information Center
Wessel, Dorothy
A 10-week classroom intervention program was implemented to facilitate the fine-motor development of eight first-grade children assessed as being deficient in motor skills. The program was divided according to five deficits to be remediated: visual motor, visual discrimination, visual sequencing, visual figure-ground, and visual memory. Each area…
ERIC Educational Resources Information Center
Habib, Masooma
2010-01-01
Teacher absenteeism is a persistent problem in Pakistani government schools. Under a new policy, teachers hired in Pakistani schools after 2002 are employed on fixed-term contracts that are renewed, in part, based on low absenteeism. This study uses qualitative analysis techniques to assess the impact of contractual hiring on teacher absenteeism…
ERIC Educational Resources Information Center
Rybash, John M.; And Others
1975-01-01
This study used both verbal and videotape presentation techniques to assess the role of cognitive conflict in children's moral judgments. The results indicated that children presented with the problems via videotape based their moral judgments on intentions, while verbal presentation increased the number of moral judgments based on damage. (JMB)
Some Problems of Computer-Aided Testing and "Interview-Like Tests"
ERIC Educational Resources Information Center
Smoline, D.V.
2008-01-01
Computer-based testing is an effective tool for teachers, intended to optimize course goals and assessment techniques in a comparatively short time. However, this is accomplished only if we deal with high-quality tests. It is strange, but despite the 100-year history of Testing Theory (see, Anastasi, A., Urbina, S. (1997). Psychological testing.…
Apollo Photograph Evaluation (APE) programming manual
NASA Technical Reports Server (NTRS)
Kim, I. J.
1974-01-01
This document describes the programming techniques used to implement the equations of the Apollo Photograph Evaluation (APE) program on the UNIVAC 1108 computer and contains detailed descriptions of the program structure, a User's Guide section to provide the necessary information for proper operation of the program, and information for the assessment of the program's adaptability to future problems.
A Stimulating Approach To Teaching, Learning and Assessing Finite Element Methods: A Case Study.
ERIC Educational Resources Information Center
Karadelis, J. N.
1998-01-01
Examines the benefits of introducing finite element methods into the curriculum of undergraduate courses. Analyzes the structure of the computer-assisted-design module and the extent to which it fulfills its main objectives. Discusses the efficiency of modern teaching and learning techniques used to develop skills for solving engineering problems;…
A Comparison of Equality in Computer Algebra and Correctness in Mathematical Pedagogy (II)
ERIC Educational Resources Information Center
Bradford, Russell; Davenport, James H.; Sangwin, Chris
2010-01-01
A perennial problem in computer-aided assessment is that "a right answer", pedagogically speaking, is not the same thing as "a mathematically correct expression", as verified by a computer algebra system, or indeed other techniques such as random evaluation. Paper I in this series considered the difference in cases where there was "the right…
ERIC Educational Resources Information Center
Yu, Jennifer W.; Wei, Xin; Wagner, Mary
2014-01-01
This study used propensity score techniques on data from the National Longitudinal Transition Study-2 to assess the causal relationship between speech and behavior-based support services and rates of social communication among high school students with Autism Spectrum Disorder (ASD). Findings indicate that receptive language problems were…
ERIC Educational Resources Information Center
Okurut, Jeje Moses
2018-01-01
The impact of the automatic promotion practice on students dropping out of Uganda's primary education was assessed using propensity scores within a difference-in-differences analysis. This analysis strategy was instrumental in addressing the selection bias problem, as well as biases arising from common trends over time and permanent latent…
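The differencing-and-weighting arithmetic behind such an analysis can be sketched in a few lines. All names and numbers below are invented for illustration; in the real study the weights would come from estimated propensity scores, and the outcomes from Ugandan survey data, neither of which is reproduced here.

```python
# Minimal propensity-weighted difference-in-differences (DiD) sketch.
# The DiD estimate subtracts the control group's pre-to-post change
# (the common trend) from the treated group's change.

def did_estimate(records):
    """records: dicts with 'group' ('treat'/'ctrl'), 'period' ('pre'/'post'),
    'outcome' (e.g., dropout rate), and a propensity-based 'weight'."""
    def wmean(group, period):
        sel = [r for r in records if r["group"] == group and r["period"] == period]
        w = sum(r["weight"] for r in sel)
        return sum(r["outcome"] * r["weight"] for r in sel) / w
    return ((wmean("treat", "post") - wmean("treat", "pre"))
            - (wmean("ctrl", "post") - wmean("ctrl", "pre")))

# toy dropout rates before/after the policy (weights set to 1.0 here)
data = [
    {"group": "treat", "period": "pre",  "outcome": 0.30, "weight": 1.0},
    {"group": "treat", "period": "pre",  "outcome": 0.34, "weight": 1.0},
    {"group": "treat", "period": "post", "outcome": 0.20, "weight": 1.0},
    {"group": "treat", "period": "post", "outcome": 0.24, "weight": 1.0},
    {"group": "ctrl",  "period": "pre",  "outcome": 0.28, "weight": 1.0},
    {"group": "ctrl",  "period": "pre",  "outcome": 0.32, "weight": 1.0},
    {"group": "ctrl",  "period": "post", "outcome": 0.26, "weight": 1.0},
    {"group": "ctrl",  "period": "post", "outcome": 0.30, "weight": 1.0},
]
effect = did_estimate(data)  # treated change minus the common trend
```

With these toy numbers the treated group's dropout falls by 0.10 while the control's falls by 0.02, so the estimated policy effect is a 0.08 reduction net of the shared trend.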
Application of computational aero-acoustics to real world problems
NASA Technical Reports Server (NTRS)
Hardin, Jay C.
1996-01-01
The application of computational aeroacoustics (CAA) to real problems is discussed, with the analysis aimed at assessing the applicability of the various techniques. Applications are currently limited by the inability of computational resources to resolve the large range of scales involved in high-Reynolds-number flows; possible simplifications are discussed. Problems remain to be solved in the efficient use of the power of parallel computers and in the development of turbulence modeling schemes. The goal of CAA is stated as the implementation of acoustic design studies on a computer terminal with reasonable run times.
Bai, Sunhye; Repetti, Rena L
2018-04-01
Examining emotion reactivity and recovery following minor problems in daily life can deepen our understanding of how stress affects child mental health. This study assessed children's immediate and delayed emotion responses to daily problems at school, and examined their correlations with psychological symptoms. On 5 consecutive weekdays, 83 fifth graders (M = 10.91 years, SD = 0.53, 51% female) completed brief diary forms 5 times per day, providing repeated ratings of school problems and emotions. They also completed a one-time questionnaire about symptoms of depression, and parents and teachers rated child internalizing and externalizing problems. Using multilevel modeling techniques, we assessed within-person daily associations between school problems and negative and positive emotion at school and again at bedtime. On days when children experienced more school problems, they reported more negative emotion and less positive emotion at school, and at bedtime. There were reliable individual differences in emotion reactivity and recovery. Individual-level indices of emotion responses derived from multilevel models were correlated with child psychological symptoms. Children who showed more negative emotion reactivity reported more depressive symptoms. Multiple informants described fewer internalizing problems among children who showed better recovery by bedtime, even after controlling for children's average levels of exposure to school problems. Diary methods can extend our understanding of the links between daily stress, emotions and child mental health. Recovery following stressful events may be an important target of research and intervention for child internalizing problems.
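The core within-person step of such a diary analysis, centering each child's daily scores on that child's own mean before pooling, can be sketched in miniature. This is a simplified stand-in for the study's multilevel models (it yields only a pooled fixed slope, not per-child random effects), and the data and names are invented.

```python
# Within-person (day-level) association between daily school problems and
# same-day negative emotion. Person-mean centering removes stable
# between-child differences, as the random intercept does in a multilevel model.

def within_person_slope(diary):
    """diary: child_id -> list of (school_problems, negative_emotion) days."""
    xs, ys = [], []
    for days in diary.values():
        mean_p = sum(p for p, _ in days) / len(days)
        mean_e = sum(e for _, e in days) / len(days)
        xs.extend(p - mean_p for p, _ in days)   # centered problems
        ys.extend(e - mean_e for _, e in days)   # centered emotion
    # pooled least-squares slope of centered emotion on centered problems
    return sum(x * y for x, y in zip(xs, ys)) / sum(x * x for x in xs)

# invented diary data: within each child, worse days bring more negative emotion
diary = {
    "child_A": [(0, 1), (2, 3)],
    "child_B": [(1, 2), (3, 6)],
}
slope = within_person_slope(diary)  # -> 1.5
```

A positive slope here corresponds to the reported reactivity finding: on days with more school problems, a child reports more negative emotion than on that child's typical day.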
NASA Technical Reports Server (NTRS)
Davis, D. D.; Rodgers, M. O.; Fischer, S. D.; Heaps, W. S.
1981-01-01
Theoretical calculations are presented which estimate the possible magnitude of the O3/H2O derived OH interference signal resulting from the use of the laser-induced fluorescence technique in measuring natural levels of tropospheric OH. Critical to this new assessment has been the measurement of the nascent OH quantum state distribution resulting from the reaction O(1D) + H2O yields 2OH, and an assessment of the subsequent rotational relaxation of the OH species when formed in high k levels.
Using Electronic Noses to Detect Tumors During Neurosurgery
NASA Technical Reports Server (NTRS)
Homer, Margie L.; Ryan, Margaret A.; Lara, Liana M.; Kateb, Babak; Chen, Mike
2008-01-01
It has been proposed to develop special-purpose electronic noses and algorithms for processing the digitized outputs of the electronic noses for determining whether tissue exposed during neurosurgery is cancerous. At present, visual inspection by a surgeon is the only available intraoperative technique for detecting cancerous tissue. Implementation of the proposal would help to satisfy a desire, expressed by some neurosurgeons, for an intraoperative technique for determining whether all of a brain tumor has been removed. The electronic-nose technique could complement multimodal imaging techniques, which have also been proposed as means of detecting cancerous tissue. There are also other potential applications of the electronic-nose technique in general diagnosis of abnormal tissue. In preliminary experiments performed to assess the viability of the proposal, the problem of distinguishing between different types of cultured cells was substituted for the problem of distinguishing between normal and abnormal specimens of the same type of tissue. The figure presents data from one experiment, illustrating differences between patterns that could be used to distinguish between two types of cultured cancer cells. Further development can be expected to include studies directed toward answering questions concerning not only the possibility of distinguishing among various types of normal and abnormal tissue but also distinguishing between tissues of interest and other odorous substances that may be present in medical settings.
Bennema, Anne N; Schendelaar, Pamela; Seggers, Jorien; Haadsma, Maaike L; Heineman, Maas Jan; Hadders-Algra, Mijna
2016-03-01
General movement (GM) assessment is a well-established tool to predict cerebral palsy in high-risk infants. Little is known about the predictive value of GM assessment in low-risk populations. To assess the predictive value of GM quality in early infancy for the development of the clinically relevant form of minor neurological dysfunction (complex MND) and behavioral problems at preschool age. Prospective cohort study. A total of 216 members of the prospective Groningen Assisted Reproductive Techniques (ART) cohort study were included in this study. ART did not affect neurodevelopmental outcome of these relatively low-risk infants born to subfertile parents. GM quality was determined at 2 weeks and 3 months. At 18 months and 4 years, the Hempel neurological examination was used to assess MND. At 4 years, parents completed the Child Behavior Checklist; this yielded the total problem score (TPS), internalizing problem score (IPS), and externalizing problem score (EPS). Predictive values of definitely abnormal (DA) and mildly abnormal (MA) GMs were calculated. DA GMs at 2 weeks were associated with complex MND at 18 months and atypical TPS and IPS at 4 years (all p<0.05). Sensitivity and positive predictive value of DA GMs at 2 weeks were rather low (13%-60%); specificity and negative predictive value were excellent (92%-99%). DA GMs at 3 months occurred too infrequently to calculate predictive values. MA GMs were not associated with outcome. GM quality as a single predictor of complex MND and behavioral problems at preschool age has limited clinical value in children at low risk for developmental disorders. Copyright © 2016 Elsevier Ireland Ltd. All rights reserved.
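The sensitivity, specificity, and predictive values reported above all follow from a single 2x2 table of test results against outcomes. A minimal sketch of those calculations; the counts below are hypothetical, not the cohort's data:

```python
# Sensitivity, specificity, and predictive values from a 2x2 table,
# as used to evaluate GM quality as a predictor (counts are hypothetical).

def predictive_values(tp, fp, fn, tn):
    """Return sensitivity, specificity, PPV, and NPV for a 2x2 table."""
    sensitivity = tp / (tp + fn)   # abnormal GMs among affected children
    specificity = tn / (tn + fp)   # normal GMs among unaffected children
    ppv = tp / (tp + fp)           # affected among those with abnormal GMs
    npv = tn / (tn + fn)           # unaffected among those with normal GMs
    return sensitivity, specificity, ppv, npv

# Hypothetical counts for a low-risk cohort: few true positives and many
# true negatives give low sensitivity/PPV but high specificity/NPV,
# the same pattern the study reports.
sens, spec, ppv, npv = predictive_values(tp=3, fp=5, fn=7, tn=200)
```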
NASA Technical Reports Server (NTRS)
Houston, A. G.; Feiveson, A. H.; Chhikara, R. S.; Hsu, E. M. (Principal Investigator)
1979-01-01
A statistical methodology was developed to check the accuracy of the products of the experimental operations throughout crop growth and to determine whether the procedures are adequate to accomplish the desired accuracy and reliability goals. It has allowed the identification and isolation of key problems in wheat area yield estimation, some of which have been corrected and some of which remain to be resolved. The major unresolved problem in accuracy assessment is that of precisely estimating the bias of the LACIE production estimator. Topics covered include: (1) evaluation techniques; (2) variance and bias estimation for the wheat production estimate; (3) the 90/90 evaluation; (4) comparison of the LACIE estimate with reference standards; and (5) first and second order error source investigations.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Nealey, S.M.; Liebow, E.B.
1988-03-01
The US Department of Energy sponsored a one-day workshop to discuss the complex dimensions of risk judgment formation and the assessment of social and economic effects of risk perceptions related to the permanent underground storage of highly radioactive waste from commercial nuclear power plants. Affected parties have publicly expressed concerns about potentially significant risk-related effects of this approach to waste management. A selective review of relevant literature in psychology, decision analysis, economics, sociology, and anthropology was completed, along with an examination of decision analysis techniques that might assist in developing suitable responses to public risk-related concerns. The workshop was organized as a forum in which a set of distinguished experts could exchange ideas and observations about the problems of characterizing the effects of risk judgments. Several issues and themes emerged from the exchange: problems with probabilistic risk assessment techniques are evident; experts and laypersons view risk differently, which leads to higher levels of public concern than experts feel are justified; experts, risk managers, and decision-makers sometimes err in assessing risk and in dealing with the public; credibility and trust are important contributing factors in the formation of risk judgments; social and economic consequences of perceived risk should be properly anticipated; improvements can be made in informing the public about risk; the role of the public in risk assessment, risk management, and decisions about risk should be reconsidered; and mitigation and compensation are central to resolving conflicts arising from divergent risk judgments.
Alternative Constraint Handling Technique for Four-Bar Linkage Path Generation
NASA Astrophysics Data System (ADS)
Sleesongsom, S.; Bureerat, S.
2018-03-01
This paper extends a concept for path generation from our previous work by adding a new constraint handling technique. The proposed approach was originally designed for problems without prescribed timing, avoiding the timing constraint altogether, while the remaining constraints are handled with a new constraint handling technique, a form of penalty method. In a comparative study, path generation optimisation problems are solved using self-adaptive population size teaching-learning-based optimization (SAP-TLBO) and the original TLBO. Two traditional path generation test problems are used to evaluate the proposed technique. The results show that the new technique can be applied to path generation problems without prescribed timing and gives better results than the previous technique. Furthermore, SAP-TLBO outperforms the original TLBO.
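The penalty idea referred to above can be sketched generically: an infeasible design's objective is inflated in proportion to its total constraint violation, so an unconstrained optimizer such as TLBO can be applied unchanged. This is a minimal static-penalty sketch, not the paper's exact formulation; the penalty weight is an assumption:

```python
# Generic static-penalty constraint handling: wrap an objective f(x) so that
# any constraint violation (g(x) <= 0 expected) adds a large cost.

def penalized_objective(f, constraints, weight=1e3):
    """Wrap objective f with penalties for g(x) <= 0 constraints."""
    def wrapped(x):
        violation = sum(max(0.0, g(x)) for g in constraints)
        return f(x) + weight * violation
    return wrapped

# Toy example: minimize x^2 subject to x >= 1, written as 1 - x <= 0.
f = lambda x: x * x
g = [lambda x: 1.0 - x]
obj = penalized_objective(f, g)
# A feasible point is scored by f alone; an infeasible point is penalized.
```

Any population-based optimizer can then minimize `obj` directly, without constraint-aware operators.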
Swanson, Helen; Power, Kevin; Collin, Paula; Deas, Suzanne; Paterson, Gillian; Grierson, David; Yellowlees, Alex; Park, Katy; Taylor, Louise
2010-01-01
Parental relationships and maladaptive problem solving have been associated with anorexic symptomatology. This study investigates the relationship between perceived parental bonding, social problem solving and eating psychopathology. Forty three female inpatients with anorexia nervosa and 76 student controls were assessed using the Parental Bonding Instrument, the Social Problem Solving Inventory and the Eating Disorders Examination or the Eating Disorders Examination-Questionnaire. The anorexic group reported significantly lower levels of parental care than the student control group and used more negative and avoidance style coping. In the anorexic group, disordered eating was significantly correlated with low maternal care and high control. Maternal bonding was found to mediate the relationship between avoidance style coping and eating pathology. Findings suggest a relationship between maternal bonding, the use of maladaptive problem solving techniques and eating disorder pathology in inpatients with anorexia nervosa.
Assessment of BCG vaccination in India
1957-01-01
A second assessment of the mass BCG-vaccination campaign in India is described in this report. Data were collected to corroborate the findings of the first assessment and to study certain aspects of the problems they posed. Sample retesting of children vaccinated in the mass campaign reveals a higher and less variable allergy than that reported from the preliminary assessment work. The results indicate that a uniform and reasonably high level of allergy has been induced in Indian schoolchildren vaccinated in the campaign period assessed and that deficiencies in the tuberculin test by which the allergy was measured rather than defects of vaccine or vaccination technique were responsible for the disappointing variability initially reported. Testing of unvaccinated village populations in Madras and Mysore confirms previous observations that low-grade, non-specific tuberculin sensitivity is widely prevalent in South India, making it virtually impossible to separate the infected from the uninfected with the tuberculin tests in use today. The development of new techniques for use in areas where the low-grade, non-specific sensitivity is widespread is discussed. PMID:13489464
Simulation and statistics: Like rhythm and song
NASA Astrophysics Data System (ADS)
Othman, Abdul Rahman
2013-04-01
Simulation has been introduced to solve problems in the form of systems. This technique can overcome two kinds of problem. First, a problem that has an analytical solution, but for which running a physical experiment would be too costly in money or lives. Second, a problem that has no analytical solution at all. In the field of statistical inference the second kind is often encountered. With the advent of high-speed computing devices, a statistician can now use resampling techniques such as the bootstrap and permutation tests to form a pseudo sampling distribution that leads to the solution of a problem that cannot be solved analytically. This paper discusses how Monte Carlo simulation was, and still is, used to verify analytical solutions in inference. It also discusses resampling techniques as simulation techniques. Common misunderstandings about these two techniques are examined, and successful uses of both are explained.
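The bootstrap described above can be sketched in a few lines: resample the observed data with replacement many times, compute the statistic on each pseudo-sample, and read a confidence interval off the resulting pseudo sampling distribution. A minimal stdlib sketch; the sample data, replication count, and seed are illustrative:

```python
# Percentile-bootstrap confidence interval for an arbitrary statistic.
import random

def bootstrap_ci(data, stat, n_boot=2000, alpha=0.05, seed=42):
    rng = random.Random(seed)
    # Pseudo sampling distribution: the statistic over resampled datasets.
    reps = sorted(
        stat([rng.choice(data) for _ in range(len(data))])
        for _ in range(n_boot)
    )
    lo = reps[int((alpha / 2) * n_boot)]
    hi = reps[int((1 - alpha / 2) * n_boot) - 1]
    return lo, hi

data = [2.1, 2.5, 2.8, 3.0, 3.2, 3.6, 3.9, 4.4]
mean = lambda xs: sum(xs) / len(xs)
lo, hi = bootstrap_ci(data, mean)   # 95% interval for the mean
```

The same skeleton handles statistics with no closed-form sampling distribution (medians, ratios, correlations) by swapping in a different `stat`.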
Bravo, Adrian J; Kelley, Michelle L; Hollis, Brittany F
2017-10-01
This study examined how work stressors were associated with sleep quality and alcohol-related problems among U.S. Navy members over the course of deployment. Participants were 101 U.S. Navy members assigned to an Arleigh Burke-class destroyer who experienced an 8-month deployment after Operation Enduring Freedom/Operation Iraqi Freedom. Approximately 6 weeks prior to deployment, 6 weeks after deployment, and 6 months into reintegration, participants completed measures that assessed work stressors, sleep quality, and alcohol-related problems. A piecewise latent growth model was estimated in which the structural paths assessed whether work stressors influenced sleep quality or its growth over time, and in turn whether sleep quality influenced the alcohol-related problems intercept or its growth over time. A significant indirect effect was found such that increases in work stressors from pre- to postdeployment predicted decreases in sleep quality, which in turn were associated with increases in alcohol-related problems from pre- to postdeployment. These effects were maintained from postdeployment through the 6-month reintegration. Findings suggest that work stressors may have important implications for sleep quality and alcohol-related problems. Positive methods of addressing stress and techniques to improve sleep quality are needed, as both may be associated with alcohol-related problems among current Navy members. Copyright © 2016 John Wiley & Sons, Ltd.
Optimization technique for problems with an inequality constraint
NASA Technical Reports Server (NTRS)
Russell, K. J.
1972-01-01
A general technique uses a modified version of an existing method termed the pattern search technique. A new procedure, called the parallel move strategy, permits the pattern search technique to be used with problems involving a constraint.
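The pattern search family this report builds on can be illustrated with a basic compass search (this sketch does not reproduce the report's parallel move strategy): poll each coordinate direction, accept any improving move, and halve the step when no move improves.

```python
# Compass-style pattern search: a derivative-free local search that polls
# +/- step along each coordinate and shrinks the step on failure.

def pattern_search(f, x0, step=1.0, tol=1e-6, max_iter=10_000):
    x = list(x0)
    for _ in range(max_iter):
        improved = False
        for i in range(len(x)):
            for d in (step, -step):
                trial = x[:]
                trial[i] += d
                if f(trial) < f(x):
                    x, improved = trial, True
        if not improved:
            step /= 2.0          # no direction helped: refine the mesh
            if step < tol:
                break
    return x

# Toy quadratic with minimum at (1, -2).
f = lambda v: (v[0] - 1.0) ** 2 + (v[1] + 2.0) ** 2
x = pattern_search(f, [0.0, 0.0])
```

A constraint could be folded into `f` via a penalty term, which is one common (though not the report's) way to adapt pattern search to constrained problems.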
Lopez-Teros, Veronica; Chileshe, Justin; Idohou-Dossou, Nicole; Fajarwati, Tetra; Medoua Nama, Gabriel; Newton, Sam; Vinod Kumar, Malavika; Wang, Zhixu; Wasantwisut, Emorn; Hunt, Janet R
2014-01-01
Inadequate vitamin A (VA) nutrition continues to be a major problem worldwide, and many interventions being implemented to improve VA status in various populations need to be evaluated. The interpretation of results after an intervention depends greatly on the method selected to assess VA status. To evaluate the effect of an intervention on VA status, researchers in Cameroon, India, Indonesia, Mexico, Senegal and Zambia have used serum retinol as an indicator, and have not always found improvement in response to supplementation. One problem is that homeostatic control of serum retinol may mask positive effects of treatment, in that changes in concentration are observed only when status is moderately to severely depleted or excessive. Because VA is stored mainly in the liver, measurement of hepatic VA stores is the gold standard for assessing VA status. Dose response tests, such as the relative dose response (RDR) and the modified relative dose response (MRDR), allow a qualitative assessment of VA liver stores. On the other hand, the vitamin A-labeled isotope dilution (VALID) technique (using 13C- or 2H-labeled retinyl acetate) serves as an indirect method to quantitatively estimate total body and liver VA stores. Countries including Cameroon, China, Ghana, Mexico, Thailand and Zambia are now applying the VALID method to sensitively assess changes in VA status during interventions, or to estimate a population's dietary requirement for VA. Transition to more sensitive biochemical indicators of VA status, such as the VALID technique, is needed to effectively assess interventions in populations where mild to moderate VA deficiency is more prevalent than severe deficiency.
A constrained reconstruction technique of hyperelasticity parameters for breast cancer assessment
NASA Astrophysics Data System (ADS)
Mehrabian, Hatef; Campbell, Gordon; Samani, Abbas
2010-12-01
In breast elastography, breast tissue usually undergoes large compression, resulting in significant geometric and structural changes. This implies that breast elastography is associated with tissue nonlinear behavior. In this study, an elastography technique is presented and an inverse problem formulation is proposed to reconstruct parameters characterizing tissue hyperelasticity. Such parameters can potentially be used for tumor classification. This technique can also have other important clinical applications, such as measuring normal tissue hyperelastic parameters in vivo; such parameters are essential in planning and conducting computer-aided interventional procedures. The proposed parameter reconstruction technique uses a constrained iterative inversion and can be viewed as an inverse problem. To solve this problem, we used a nonlinear finite element model corresponding to its forward problem. In this research, we applied Veronda-Westmann, Yeoh and polynomial models to model tissue hyperelasticity. To validate the proposed technique, we conducted studies involving numerical and tissue-mimicking phantoms. The numerical phantom consisted of a hemisphere connected to a cylinder, while we constructed the tissue-mimicking phantom from polyvinyl alcohol using freeze-thaw cycles, a process that yields nonlinear mechanical behavior. Both phantoms consisted of three types of soft tissue, mimicking adipose tissue, fibroglandular tissue and a tumor. The results of the simulations and experiments show the feasibility of accurate reconstruction of tumor tissue hyperelastic parameters using the proposed method. In the numerical phantom, all hyperelastic parameters corresponding to the three models were reconstructed with less than 2% error. With the tissue-mimicking phantom, we were able to reconstruct the ratios of the hyperelastic parameters reasonably accurately.
Compared to the uniaxial test results, the average errors of the parameter ratios reconstructed for the inclusion relative to the middle and external layers were 13% and 9.6%, respectively. Given that the parameter ratios of abnormal tissues to normal ones range from three times to more than ten times, this accuracy is sufficient for tumor classification.
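The iterative inversion described above pairs a forward model with repeated parameter updates until predicted and measured responses match. A toy one-parameter analogue; the forward model, stretch values, and "measurements" below are stand-ins, not the paper's finite element formulation:

```python
# Toy iterative inversion: recover a hyperelastic-like stiffness parameter mu
# from synthetic stress "measurements" via Gauss-Newton updates.

def forward(mu, stretches):
    # Neo-Hookean-like uniaxial stress: sigma = mu * (L^2 - 1/L)
    return [mu * (L * L - 1.0 / L) for L in stretches]

def invert(observed, stretches, mu0=1.0, iters=50):
    mu = mu0
    for _ in range(iters):
        pred = forward(mu, stretches)
        r = [p - o for p, o in zip(pred, observed)]   # residuals
        j = [L * L - 1.0 / L for L in stretches]      # d(pred)/d(mu)
        num = sum(ji * ri for ji, ri in zip(j, r))
        den = sum(ji * ji for ji in j)
        mu -= num / den                               # Gauss-Newton step
    return mu

stretches = [1.1, 1.2, 1.3, 1.4]
observed = forward(3.5, stretches)   # synthetic data generated with mu = 3.5
mu_hat = invert(observed, stretches)
```

The real method replaces `forward` with a nonlinear finite element solve and updates several parameters under constraints, but the fit-predict-update loop is the same shape.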
Johnson, Nathan T; Dhroso, Andi; Hughes, Katelyn J; Korkin, Dmitry
2018-06-25
The extent to which genes are expressed in the cell can be defined, simplistically, as a function of one or more factors of the environment, lifestyle, and genetics. RNA sequencing (RNA-Seq) is becoming a prevalent approach to quantifying gene expression and is expected to yield better insights into a number of biological and biomedical questions than DNA microarrays. Most importantly, RNA-Seq allows expression to be quantified at both the gene and alternative-splicing isoform levels. However, leveraging RNA-Seq data requires the development of new data mining and analytics methods. Supervised machine learning methods are commonly used for biological data analysis and have recently gained attention for their application to RNA-Seq data. In this work, we assess the utility of supervised learning methods trained on RNA-Seq data for a diverse range of biological classification tasks. We hypothesize that isoform-level expression data are more informative for biological classification tasks than gene-level expression data. Our large-scale assessment uses multiple datasets, organisms, lab groups, and RNA-Seq analysis pipelines. Overall, we performed and assessed 61 biological classification problems that leverage three independent RNA-Seq datasets and include over 2,000 samples from multiple organisms, lab groups, and RNA-Seq analyses. These 61 problems include predictions of the tissue type, sex, or age of the sample, healthy or cancerous phenotypes, and the pathological tumor stage for samples from cancerous tissue. For each classification problem, the performance of three normalization techniques and six machine learning classifiers was explored. We find that for every single classification problem, the isoform-based classifiers outperform or are comparable with gene-expression-based methods.
The top-performing supervised learning techniques reached a near perfect classification accuracy, demonstrating the utility of supervised learning for RNA-Seq based data analysis. Published by Cold Spring Harbor Laboratory Press for the RNA Society.
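The gene- versus isoform-level comparison can be illustrated with a deliberately simple classifier: isoform features are summed to gene level, and the same method is trained on both representations. In this toy example (the numbers are invented, and nearest-centroid stands in for the six classifiers actually tested), the two classes differ only in isoform usage, so the gene-level view is blind to them:

```python
# Nearest-centroid classification on isoform-level vs gene-level features.

def centroid(rows):
    n = len(rows)
    return [sum(r[i] for r in rows) / n for i in range(len(rows[0]))]

def nearest_centroid(train, labels, x):
    cents = {lab: centroid([r for r, l in zip(train, labels) if l == lab])
             for lab in set(labels)}
    dist = lambda a, b: sum((ai - bi) ** 2 for ai, bi in zip(a, b))
    return min(cents, key=lambda lab: dist(cents[lab], x))

# Two isoforms of one gene; classes A and B swap isoform usage but share
# the same total expression, so gene-level sums are identical.
iso_train = [[8, 2], [7, 3], [2, 8], [3, 7]]
labels = ["A", "A", "B", "B"]
gene_train = [[a + b] for a, b in iso_train]   # every row collapses to [10]

iso_pred = nearest_centroid(iso_train, labels, [9, 1])
```

Only the isoform-level representation separates the classes here, which is the intuition behind the paper's hypothesis (though real RNA-Seq signals are far noisier).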
Graff, Mario; Poli, Riccardo; Flores, Juan J
2013-01-01
Modeling the behavior of algorithms is the realm of evolutionary algorithm theory. From a practitioner's point of view, theory must provide some guidelines regarding which algorithm and parameters to use in order to solve a particular problem. Unfortunately, most theoretical models of evolutionary algorithms are difficult to apply to realistic situations. However, in recent work (Graff and Poli, 2008, 2010), where we developed a method to practically estimate the performance of evolutionary program-induction algorithms (EPAs), we started addressing this issue. The method was quite general; however, it suffered from some limitations: it required the identification of a set of reference problems, it required hand-picking a distance measure in each particular domain, and the resulting models were opaque, typically being linear combinations of 100 features or more. In this paper, we propose a significant improvement of this technique that overcomes all three limitations of our previous method. We achieve this through the use of a novel set of features for assessing problem difficulty for EPAs which are very general, essentially based on the notion of finite difference. To show the capabilities of our technique and to compare it with our previous performance models, we create models for the same two important classes of problems used in our previous work: symbolic regression on rational functions and Boolean function induction. We model a variety of EPAs. The comparison showed that for the majority of the algorithms and problem classes, the new method produced much simpler and more accurate models than before. To further illustrate the practicality of the technique and its generality (beyond EPAs), we have also used it to predict the performance of both autoregressive models and EPAs on the problem of wind speed forecasting, obtaining simpler and more accurate models that outperform our previous performance models in all cases.
The Noggin Factor in Survey Research: Developing New Techniques for Assessing Nonresponse Bias.
ERIC Educational Resources Information Center
Clark, Sheldon B.
The primary objective of this paper is to encourage survey researchers not to become overly reliant on the literature for generic solutions to non-response bias problems. In addition, the paper recounts an example of how a non-traditional approach was used to maximize the usefulness of data collected under unusual constraints and with an a priori…
Techniques of fisheries management: water quality assessment with stream insects
A. Dennis Lemly
2000-01-01
Nutrient enrichment of streams is a long-standing problem that continues to have substantial local and regional consequences. For example, water quality of streams in the southern Appalachian Mountains of the U.S. can be seriously degraded by organic nutrients leached from animal wastes if cattle or other livestock are allowed to graze in the riparian zone. Local...
ERIC Educational Resources Information Center
Gliddon, C. M.; Rosengren, R. J.
2012-01-01
This article describes a 13-week laboratory course called Human Toxicology taught at the University of Otago, New Zealand. This course used a guided inquiry based laboratory coupled with formative assessment and collaborative learning to develop in undergraduate students the skills of problem solving/critical thinking, data interpretation and…
ERIC Educational Resources Information Center
Douglass, Helen
2016-01-01
This column presents ideas and techniques to enhance science teaching. In today's classrooms, teachers face numerous challenges. They are preparing students for jobs and careers that are not even conceived of yet. Assessments are being used to address students' college and career readiness and to promote critical thinking and problem solving.…
Jonathan M. Cohen; Jean C. Mangun; Mae A. Davenport; Andrew D. Carver
2008-01-01
Diverse public opinions, competing management goals, and polarized interest groups combine with problems of scale to create a complex management arena for managers in the Central Hardwood Forest region. A mixed-methods approach that incorporated quantitative analysis of data from a photo evaluation-attitude scale survey instrument was used to assess attitudes toward...
ERIC Educational Resources Information Center
Polanin, Joshua R.; Espelage, Dorothy L.
2015-01-01
School bullying and delinquent behaviors are persistent and pervasive problems for schools, and have lasting effects for all individuals involved (Copeland et al., "JAMA Psychiatry" 70:419-426, 2013; Espelage et al., "J Res Adolesc" 24(2):337-349, 2013a). As a result, policymakers and practitioners have attempted to thwart…
ERIC Educational Resources Information Center
Aybek, Birsel; Aslan, Serkan
2016-01-01
Problem Statement: Various research have been conducted investigating the quality and quantity of textbooks such as wording, content, design, visuality, physical properties, activities, methods and techniques, questions and experiments, events, misconceptions, organizations, pictures, text selection, end of unit questions and assessments, indexes…
MPATHav: A software prototype for multiobjective routing in transportation risk assessment
DOE Office of Scientific and Technical Information (OSTI.GOV)
Ganter, J.H.; Smith, J.D.
Most routing problems depend on several important variables: transport distance, population exposure, accident rate, mandated roads (e.g., HM-164 regulations), and proximity to emergency response resources are typical. These variables may need to be minimized or maximized, and often are weighted. "Objectives" to be satisfied by the analysis are thus created. The resulting problems can be approached by combining spatial analysis techniques from geographic information systems (GIS) with multiobjective analysis techniques from the field of operations research (OR); we call this hybrid "multiobjective spatial analysis" (MOSA). MOSA can be used to discover, display, and compare a range of solutions that satisfy a set of objectives to varying degrees. For instance, a suite of solutions may include: one solution that provides short transport distances, but at a cost of high exposure; another solution that provides low exposure, but long distances; and a range of solutions between these two extremes.
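The weighted, multi-variable routing described above can be sketched as a weighted-sum scalarization over edge cost vectors followed by an ordinary shortest-path search; sweeping the weights traces out the range of compromise routes. The graph, edge costs, and objective names below are hypothetical:

```python
# Weighted-sum multiobjective routing: each edge carries a cost vector
# (distance, exposure, ...); a scalar cost is formed with weights and the
# route is found with Dijkstra's algorithm.
import heapq

def route(graph, src, dst, weights):
    # graph: {node: [(neighbor, (cost1, cost2, ...)), ...]}
    scalar = lambda costs: sum(w * c for w, c in zip(weights, costs))
    dist, prev = {src: 0.0}, {}
    heap = [(0.0, src)]
    while heap:
        d, u = heapq.heappop(heap)
        if u == dst:
            break
        if d > dist.get(u, float("inf")):
            continue                      # stale heap entry
        for v, costs in graph.get(u, []):
            nd = d + scalar(costs)
            if nd < dist.get(v, float("inf")):
                dist[v], prev[v] = nd, u
                heapq.heappush(heap, (nd, v))
    path, node = [dst], dst
    while node != src:
        node = prev[node]
        path.append(node)
    return path[::-1]

# Edges carry (distance, population exposure); route from A to D.
g = {"A": [("B", (1, 9)), ("C", (4, 1))],
     "B": [("D", (1, 9))],
     "C": [("D", (4, 1))]}
short = route(g, "A", "D", (1.0, 0.0))   # minimize distance only
safe = route(g, "A", "D", (0.0, 1.0))    # minimize exposure only
```

Intermediate weight settings would yield the "range of solutions between these two extremes" the abstract mentions.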
Collaborative learning in radiologic science education.
Yates, Jennifer L
2006-01-01
Radiologic science is a complex health profession, requiring the competent use of technology as well as the ability to function as part of a team, think critically, exercise independent judgment, solve problems creatively and communicate effectively. This article presents a review of literature in support of the relevance of collaborative learning to radiologic science education. In addition, strategies for effective design, facilitation and authentic assessment of activities are provided for educators wishing to incorporate collaborative techniques into their program curriculum. The connection between the benefits of collaborative learning and necessary workplace skills, particularly in the areas of critical thinking, creative problem solving and communication skills, suggests that collaborative learning techniques may be particularly useful in the education of future radiologic technologists. This article summarizes research identifying the benefits of collaborative learning for adult education and identifying the link between these benefits and the necessary characteristics of medical imaging technologists.
Quantitative ultrasonic evaluation of concrete structures using one-sided access
NASA Astrophysics Data System (ADS)
Khazanovich, Lev; Hoegh, Kyle
2016-02-01
Nondestructive diagnostics of concrete structures is an important and challenging problem. The recent introduction of ultrasonic dry point contact transducer arrays offers opportunities for quantitative assessment of the subsurface condition of concrete structures, including the detection of defects and inclusions. The methods described in this paper are developed for signal interpretation of shear wave impulse response time histories from multiple fixed-distance transducer pairs in a self-contained ultrasonic linear array. This included generalizing Kirchhoff migration-based synthetic aperture focusing technique (SAFT) reconstruction methods to handle the spatially diverse transducer pair locations, creating expanded virtual arrays with associated reconstruction methods, and creating automated reconstruction interpretation methods for reinforcement detection and stochastic flaw detection. Interpretation of the reconstruction techniques developed in this study was validated using the results of laboratory and field forensic studies. Applicability of the developed methods to solving practical engineering problems was demonstrated.
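The core idea behind SAFT reconstruction, delay-and-sum focusing, can be sketched schematically: for each image point, each transducer pair's trace is sampled at the travel time predicted for that point, so echoes from true scatterers add coherently. The geometry, wave speed, sampling rate, and synthetic traces below are toy values, not the paper's system:

```python
# Schematic delay-and-sum (SAFT-style) focusing for a linear array.
import math

def saft_pixel(point, pairs, traces, c, fs):
    """Focused amplitude at `point` from (tx, rx) positions and traces."""
    total = 0.0
    for (tx, rx), trace in zip(pairs, traces):
        t = (math.dist(tx, point) + math.dist(point, rx)) / c
        idx = int(round(t * fs))          # sample index at predicted delay
        if 0 <= idx < len(trace):
            total += trace[idx]
    return total

# Synthesize impulse traces for a point scatterer 5 cm deep.
c, fs = 2500.0, 1e6                       # shear speed (m/s), sampling (Hz)
scatterer = (0.0, 0.05)
pairs = [((x, 0.0), (x + 0.03, 0.0)) for x in (-0.06, -0.03, 0.0, 0.03)]
traces = []
for tx, rx in pairs:
    t = (math.dist(tx, scatterer) + math.dist(scatterer, rx)) / c
    trace = [0.0] * 200
    trace[int(round(t * fs))] = 1.0       # unit echo at the true delay
    traces.append(trace)

at_scatterer = saft_pixel(scatterer, pairs, traces, c, fs)
elsewhere = saft_pixel((0.04, 0.02), pairs, traces, c, fs)
```

Evaluating `saft_pixel` over a grid of points yields the reconstruction image; the paper's contribution lies in generalizing this to spatially diverse pair locations and automating the interpretation.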
Asphaltene dispersants as demulsification aids
DOE Office of Scientific and Technical Information (OSTI.GOV)
Manek, M.B.
1995-11-01
Destabilization of petroleum asphaltenes may cause a multitude of problems in crude oil recovery and production. One major problem is their agglomeration at the water-oil interface of crude oil emulsions. Once agglomeration occurs, destabilized asphaltenes can form a thick pad in the dehydration equipment, which significantly reduces the demulsification rate. Certain polymeric dispersants increase asphaltene solubilization in hydrocarbon media and, when used in conjunction with emulsion breakers, facilitate the demulsification process. Two case studies are presented that demonstrate how asphaltene dispersants can efficiently inhibit pad formation and help reduce demulsifier dosage. Criteria for dispersant application and selection are discussed, which include the application of a novel laboratory technique to assess asphaltene stabilization in the crude oil. The technique monitors asphaltene agglomeration during titration with an incompatible solvent (precipitant). The method was used to evaluate stabilization of asphaltenes in the crude oil and to screen asphaltene dispersants.
Gilbert, Luiram R.; Lohra, Parul; Mandlik, V.B.; Rath, S.K.; Jha, A.K.
2012-01-01
Background Esthetics represents an inseparable part of today's oral therapy, and several procedures have been proposed to preserve or enhance it. Gingival recessions may cause hypersensitivity, impaired esthetics and root caries. Keeping in mind the patient's desire for improved esthetics and other related problems, every effort should be made to achieve complete root coverage. Methods Different modalities have been introduced to treat gingival recession, including displaced flaps, free gingival grafts, connective tissue grafts, different types of barrier membranes and combinations of different techniques. The aim of this study was to compare the commonly used techniques for gingival recession coverage and evaluate the results obtained. 73 subjects were selected for the present study; they were randomly divided into four groups and followed at baseline and 180 days, when the following parameters were recorded: (a) gingival recession depth (RD); (b) pocket depth (PD); (c) clinical attachment level (CAL); and (d) width of attached gingiva (WAG). Results This study showed a statistically significant reduction of gingival recession, with concomitant attachment gain, following treatment with all tested surgical techniques. However, the SCTG with CAF technique showed the highest percentage gain in coverage of recession depth as well as gain in keratinized gingiva. Similar results were obtained with CAF alone. The use of GTR and the other techniques showed less predictable coverage and gain in keratinized gingiva. Conclusion Connective tissue grafts were statistically significantly superior to guided tissue regeneration for improvement in gingival recession reduction. PMID:25609865
Impact Hazard Monitoring: Theory and Implementation
NASA Astrophysics Data System (ADS)
Farnocchia, Davide
2015-08-01
Impact monitoring is a crucial component of the mitigation or elimination of the hazard posed by asteroid impacts. Once an asteroid is discovered, it is important to achieve early detection and an accurate assessment of the risk posed by future Earth encounters. Here we review the most standard impact monitoring techniques. Linear methods are the fastest approach, but their applicability regime is limited because of the chaotic dynamics of near-Earth asteroids, whose orbits are often scattered by planetary encounters. Among nonlinear methods, Monte Carlo algorithms are the most reliable ones. However, the large number of near-Earth asteroids and the computational load required to detect low probability impact events make Monte Carlo approaches impractical in the framework of monitoring all near-Earth asteroids. In the last 15 years, the Line of Variations (LOV) method has been the most successful technique, as it strikes a remarkable compromise between computational efficiency and the capability of detecting low probability events deep in the nonlinear regime. As a matter of fact, the LOV method is the engine of JPL's Sentry and the University of Pisa's NEODyS, the two fully automated impact monitoring systems that routinely search for potential impactors among known near-Earth asteroids. We also present some more recent techniques developed to deal with the new challenges arising in the impact hazard assessment problem. In particular, we describe how to use keyhole maps to go beyond strongly scattering encounters and push the impact prediction horizon forward in time. In these cases asteroids usually have a very well constrained orbit, and we often need to account for the action of nongravitational perturbations, especially the Yarkovsky effect. Finally, we discuss the short-term hazard assessment problem for newly discovered asteroids, when only a short observed arc is available.
The limited amount of observational data generally leads to severe degeneracies in the orbit estimation process. We overcome these degeneracies by employing ranging techniques, which scan the poorly constrained space of topocentric range and range rate.
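The Monte Carlo approach discussed above can be caricatured in a few lines: sample encounter positions in the target plane from the (assumed Gaussian) orbit uncertainty and count the fraction falling inside the Earth's capture radius. All numbers below are illustrative; real monitoring propagates full orbits through planetary encounters rather than sampling a static plane:

```python
# Toy Monte Carlo impact probability: fraction of sampled target-plane
# positions that fall inside the capture radius (all units are Earth radii).
import random

def impact_probability(mean, sigma, r_earth, n=100_000, seed=1):
    rng = random.Random(seed)
    hits = sum(
        1 for _ in range(n)
        if (rng.gauss(mean[0], sigma[0]) ** 2 +
            rng.gauss(mean[1], sigma[1]) ** 2) < r_earth ** 2
    )
    return hits / n

# Nominal miss distance of 2 Earth radii with large uncertainty: a small
# but nonzero impact probability remains.
p = impact_probability(mean=(2.0, 0.0), sigma=(1.0, 1.0), r_earth=1.0)
```

The abstract's point follows directly: resolving a probability of, say, one in a million this way requires many millions of samples per asteroid, which is why the LOV method's one-dimensional sampling is preferred for routine monitoring.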
Diego-Mas, Jose-Antonio; Poveda-Bautista, Rocio; Garzon-Leal, Diana-Carolina
2015-01-01
Most observational methods for musculoskeletal disorder risk assessment have been developed by researchers to be applied in specific situations, and practitioners can find them difficult to use in real-work conditions. The main objective of this study was to identify the factors that influence how useful the observational techniques are perceived to be by practitioners, and to what extent these factors influence that perception. A survey was conducted among practitioners regarding the problems normally encountered when implementing these methods, as well as the perceived overall utility of these techniques. The results show that practitioners place particular importance on the support the methods provide in making decisions regarding changes in work systems and on how applicable they are to different types of jobs. The results of this study can serve as a guide to researchers for the development of new assessment techniques that are more useful and applicable in real-work situations.
Assessment of knowledge transfer in the context of biomechanics
NASA Astrophysics Data System (ADS)
Hutchison, Randolph E.
The dynamic act of knowledge transfer, or the connection of a student's prior knowledge to features of a new problem, could be considered one of the primary goals of education. Yet studies highlight more instances of failure than success. This dissertation focuses on how knowledge transfer takes place during individual problem solving, in classroom settings and during group work. Through the lens of dynamic transfer, or how students connect prior knowledge to problem features, this qualitative study focuses on a methodology to assess transfer in the context of biomechanics. The first phase of this work investigates how a pedagogical technique based on situated cognition theory affects students' ability to transfer knowledge gained in a biomechanics class to later experiences both in and out of the classroom. A post-class focus group examined events the students remembered from the class, what they learned from them, and how they connected them to later relevant experiences inside and outside the classroom. These results were triangulated with conceptual gains evaluated through concept inventories and pre- and post- content tests. Based on these results, the next two phases of the project take a more in-depth look at dynamic knowledge transfer during independent problem-solving and group project interactions, respectively. By categorizing prior knowledge (Source Tools), problem features (Target Tools) and the connections between them, results from the second phase of this study showed that within individual problem solving, source tools were almost exclusively derived from "propagated sources," i.e. those based on an authoritative source. This differs from findings in the third phase of the project, in which a mixture of "propagated" sources and "fabricated" sources, i.e. those based on student experiences, were identified within the group project work. 
This methodology is effective at assessing knowledge transfer in the context of biomechanics: it can identify differing patterns in how different students apply prior knowledge and make new connections between prior knowledge and current problem features in different learning situations. Implications for the use of this methodology include providing insight not only into students' prior knowledge, but also into how they connect this prior knowledge to problem features (i.e. dynamic knowledge transfer). It also allows the identification of instances in which external input from other students or the instructor prompted knowledge transfer to take place. This dynamic knowledge transfer lens makes it possible to address gaps in student understanding, and permits further investigation of techniques that increase instances of successful knowledge transfer.
A comparative review of optical surface contamination assessment techniques
NASA Technical Reports Server (NTRS)
Heaney, James B.
1987-01-01
This paper reviews the relative sensitivities and practicalities of the common surface analytical methods used to detect and identify unwelcome adsorbates on optical surfaces. The compared methods include visual inspection, simple reflectometry and transmissometry, ellipsometry, infrared absorption and attenuated total reflectance spectroscopy (ATR), Auger electron spectroscopy (AES), scanning electron microscopy (SEM), secondary ion mass spectrometry (SIMS), and mass accretion determined by quartz crystal microbalance (QCM). The discussion is biased toward those methods that apply optical thin film analytical techniques to spacecraft optical contamination problems. Examples are cited from both ground-based and in-orbit experiments.
Flow-induced Vibration of SSME Main Injector Liquid-oxygen Posts
NASA Technical Reports Server (NTRS)
Chen, S. S.; Jendrzejczyk, J. A.; Wambsganss, M. W.
1985-01-01
The liquid-oxygen (LOX) posts are exposed to hot hydrogen flowing over the tubes on its way to the combustion chamber. Fatigue cracking of some LOX posts was observed after test firing of the SSMEs. A current design modification consists of attaching impingement shields to the LOX posts in the outer row. The modification alleviated the vibration/fatigue problem of the LOX posts, but resulted in an increased pressure drop that ultimately shortened the life expectancy of other components. A fundamental study of LOX post vibration was initiated to understand the flow-induced vibration problem and to develop techniques to avoid detrimental vibrational effects, with the overall objective of improving engine life. This effort, including an assessment of the problem, scoping calculations and experiments, and a work plan for an integrated theoretical/experimental study of the problem, is summarized.
The development and nature of problem-solving among first-semester calculus students
NASA Astrophysics Data System (ADS)
Dawkins, Paul Christian; Mendoza Epperson, James A.
2014-08-01
This study investigates interactions between calculus learning and problem-solving in the context of two first-semester undergraduate calculus courses in the USA. We assessed students' problem-solving abilities in a common US calculus course design that included traditional lecture and assessment with problem-solving-oriented labs. We investigate this blended instruction as a local representative of the US calculus reform movements that helped foster it. These reform movements tended to emphasize problem-solving as well as multiple mathematical registers and quantitative modelling. Our statistical analysis reveals the influence of the blended traditional/reform calculus instruction on students' ability to solve calculus-related, non-routine problems through repeated measures over the semester. The calculus instruction in this study significantly improved students' performance on non-routine problems, though performance improved more regarding strategies and accuracy than it did for drawing conclusions and providing justifications. We identified problem-solving behaviours that characterized top performance or attrition in the course. Top-performing students displayed greater algebraic proficiency, calculus skills, and more general heuristics than their peers, but overused algebraic techniques even when they proved cumbersome or inappropriate. Students who subsequently withdrew from calculus often lacked algebraic fluency and understanding of the graphical register. The majority of participants, when given a choice, relied upon less sophisticated trial-and-error approaches in the numerical register and rarely used the graphical register, contrary to the goals of US calculus reform. We provide explanations for these patterns in students' problem-solving performance in view of both their preparation for university calculus and the courses' assessment structure, which preferentially rewarded algebraic reasoning. 
While instruction improved students' problem-solving performance, we observe that current instruction requires ongoing refinement to help students develop multi-register fluency and the ability to model quantitatively, as is called for in current US standards for mathematical instruction.
Improved Vote Aggregation Techniques for the Geo-Wiki Cropland Capture Crowdsourcing Game
NASA Astrophysics Data System (ADS)
Baklanov, Artem; Fritz, Steffen; Khachay, Michael; Nurmukhametov, Oleg; Salk, Carl; See, Linda; Shchepashchenko, Dmitry
2016-04-01
Crowdsourcing is a new approach for solving data processing problems for which conventional methods appear to be inaccurate, expensive, or time-consuming. Nowadays, the development of new crowdsourcing techniques is mostly motivated by so-called Big Data problems, including problems of assessment and clustering for large datasets obtained in aerospace imaging, remote sensing, and even in social network analysis. By involving volunteers from all over the world, the Geo-Wiki project tackles problems of environmental monitoring with applications to flood resilience, biomass data analysis and classification of land cover. For example, the Cropland Capture Game, which is a gamified version of Geo-Wiki, was developed to aid in the mapping of cultivated land, and was used to gather 4.5 million image classifications from the Earth's surface. More recently, the Picture Pile game, which is a more generalized version of Cropland Capture, aims to identify tree loss over time from pairs of very high resolution satellite images. Despite recent progress in image analysis, the solution to these problems is hard to automate, since human experts still outperform the majority of machine learning algorithms and artificial systems on certain image recognition tasks in this field. The replacement of rare and expensive experts by a team of distributed volunteers seems promising, but this approach leads to challenging questions, such as: how can individual opinions be aggregated optimally, how can confidence bounds be obtained, and how can the unreliability of volunteers be dealt with? In this paper, on the basis of several known machine learning techniques, we propose a technical approach to improving the overall performance of the majority-voting decision rule used in the Cropland Capture Game. The proposed approach increases the estimated consistency with expert opinion from 77% to 86%.
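One standard way to improve on plain majority voting is to weight each volunteer by an estimate of their reliability. The sketch below is a generic illustration of that idea, not the specific algorithm used in the Cropland Capture study; the function names, the default trust value, and the example reliabilities are assumptions.

```python
import math

def weighted_majority(votes, reliability):
    """Aggregate binary crowd votes with per-volunteer log-odds weights.

    votes: list of (volunteer_id, label) pairs, label in {0, 1}.
    reliability: dict mapping volunteer_id -> estimated accuracy in (0.5, 1).
    With independent voters, log-odds weighting is the optimal linear rule,
    so a few highly reliable volunteers can outvote many unreliable ones.
    """
    score = 0.0
    for voter, label in votes:
        p = reliability.get(voter, 0.6)   # mild default trust for unseen voters
        w = math.log(p / (1.0 - p))       # log-odds weight
        score += w if label == 1 else -w
    return 1 if score > 0 else 0

votes = [("a", 1), ("b", 1), ("c", 0)]
reliability = {"a": 0.6, "b": 0.6, "c": 0.95}
decision = weighted_majority(votes, reliability)
```

In this toy example the single high-reliability dissenter ("c") outweighs the two weak agreeing voters, which a simple majority vote would not capture.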
NASA Astrophysics Data System (ADS)
Yatsenko, Vitaliy; Falchenko, Iurii; Fedorchuk, Viktor; Petrushynets, Lidiia
2016-07-01
This report focuses on the results of the EU project "Superlight-weight thermal protection system for space application (LIGHT-TPS)". The bottom line is an analysis of the influence of the free space environment on the superlight-weight thermal protection system (TPS). The report presents new methods based on the following models: synergetic, physical, and computational. It concentrates on four approaches. The first concerns the synergetic approach. The synergetic approach to the solution of problems of self-controlled synthesis of structures and creation of self-organizing technologies is considered in connection with the super-problem of creating materials with new functional properties. Synergetics methods and mathematical design are considered in relation to actual problems of materials science. The second approach describes how optimization methods can be used to determine material microstructures with optimized or targeted properties. This technique enables one to find unexpected microstructures with exotic behavior (e.g., negative thermal expansion coefficients). The third approach concerns the dynamic probabilistic risk analysis of TPS elements with complex damage characterizations, using a physical model of the TPS system and a predictable level of ionizing radiation and space weather. Focus is given mainly to the TPS model, mathematical models for dynamic probabilistic risk assessment, and software for the modeling and prediction of the influence of the free space environment. The probabilistic risk assessment method for the TPS is presented considering some deterministic and stochastic factors. The last approach concerns results of experimental research on the temperature distribution over the surface of a 150 x 150 x 20 mm honeycomb sandwich panel during diffusion welding in vacuum.
Equipment that aligns temperature fields in the product in order to form welded joints of equal strength is also considered. Many tasks in computational materials science can be posed as optimization problems. One such task is the generation of realizations of materials with specified but limited microstructural information: an intriguing inverse problem of both fundamental and practical importance. Computational models based upon the theories of molecular dynamics or quantum mechanics would enable the prediction and modification of fundamental materials properties. This problem is solved using deterministic and stochastic optimization techniques. The main optimization approaches in the frame of the EU project "Superlight-weight thermal protection system for space application" are discussed. An optimization approach to alloys for obtaining materials with required properties using modeling techniques and experimental data will also be considered. This report is supported by the EU project "Superlight-weight thermal protection system for space application (LIGHT-TPS)".
Fractals and Spatial Methods for Mining Remote Sensing Imagery
NASA Technical Reports Server (NTRS)
Lam, Nina; Emerson, Charles; Quattrochi, Dale
2003-01-01
The rapid increase in digital remote sensing and GIS data raises a critical problem -- how can such an enormous amount of data be handled and analyzed so that useful information can be derived quickly? Efficient handling and analysis of large spatial data sets is central to environmental research, particularly in global change studies that employ time series. Advances in large-scale environmental monitoring and modeling require not only high-quality data, but also reliable tools to analyze the various types of data. A major difficulty facing geographers and environmental scientists in environmental assessment and monitoring is that spatial analytical tools are not easily accessible. Although many spatial techniques have been described recently in the literature, they are typically presented in an analytical form and are difficult to transform to a numerical algorithm. Moreover, these spatial techniques are not necessarily designed for remote sensing and GIS applications, and research must be conducted to examine their applicability and effectiveness in different types of environmental applications. This poses a chicken-and-egg problem: on one hand we need more research to examine the usability of the newer techniques and tools, yet on the other hand, this type of research is difficult to conduct if the tools to be explored are not accessible. Another problem fundamental to environmental research is the set of issues related to spatial scale. The scale issue is especially acute in the context of global change studies because of the need to integrate remote-sensing and other spatial data that are collected at different scales and resolutions. Extrapolation of results across broad spatial scales remains the most difficult problem in global environmental research.
There is a need for basic characterization of the effects of scale on image data, and the techniques used to measure these effects must be developed and implemented to allow for a multiple scale assessment of the data before any useful process-oriented modeling involving scale-dependent data can be conducted. Through the support of research grants from NASA, we have developed a software module called ICAMS (Image Characterization And Modeling System) to address the need to develop innovative spatial techniques and make them available to the broader scientific communities. ICAMS provides new spatial techniques, such as fractal analysis, geostatistical functions, and multiscale analysis that are not easily available in commercial GIS/image processing software. By bundling newer spatial methods in a user-friendly software module, researchers can begin to test and experiment with the new spatial analysis methods and they can gauge scale effects using a variety of remote sensing imagery. In the following, we describe briefly the development of ICAMS and present application examples.
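Fractal analysis of the kind ICAMS provides is commonly implemented by box counting: count the boxes occupied by a feature at several box sizes and fit the slope of log-count against log of inverse box size. The sketch below is a minimal stdlib-only illustration of that idea, not ICAMS code; the function name and grid sizes are assumptions.

```python
import math

def box_count_dimension(points, sizes=(1, 2, 4, 8)):
    """Estimate the fractal dimension of a 2-D point set by box counting.

    For each box size s, counts the occupied boxes N(s), then fits the
    slope of log N(s) versus log(1/s) by ordinary least squares.
    """
    xs, ys = [], []
    for s in sizes:
        boxes = {(int(px // s), int(py // s)) for px, py in points}
        xs.append(math.log(1.0 / s))
        ys.append(math.log(len(boxes)))
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    slope = sum((x - mx) * (y - my) for x, y in zip(xs, ys)) \
        / sum((x - mx) ** 2 for x in xs)
    return slope

# Sanity checks on synthetic rasters: a filled 64x64 square should come
# out near dimension 2, and a straight line near dimension 1.
filled = [(i, j) for i in range(64) for j in range(64)]
line = [(i, 0) for i in range(64)]
d_filled = box_count_dimension(filled)
d_line = box_count_dimension(line)
```

On real imagery the estimated dimension falls between these integer extremes and varies with measurement scale, which is exactly the scale-dependence the abstract argues must be characterized before process-oriented modeling.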
A free response test of interpersonal effectiveness.
Getter, H; Nowinski, J K
1981-06-01
Development of the Interpersonal Problem Solving Assessment Technique (IPSAT), College form, is described. Guided by Rotter's Social Learning Theory and by problem-solving and assertiveness research, a semi-structured free response format was designed to assess components of interpersonal effectiveness. The instrument yields patterns of self-reported behaviors in six classes of problematic social situations. A detailed manual enabled reliable scoring of the following response categories: effectiveness, avoidance, appropriateness, dependency, and solution productivity. Scores were not materially affected by sex, verbal ability, or social desirability response sets. Correlations with the College Self-Expression Scale, the Edwards Personal Preference Schedule, and the Lanyon Psychological Screening Inventory provided initial evidence of validity. Comparison of mean IPSAT scores of 23 psychotherapy clients with those of 78 normative subjects showed that clients report less interpersonal effectiveness and more avoidance than controls. Implications for the utility of the IPSAT are discussed.
Protection of Health Imagery by Region Based Lossless Reversible Watermarking Scheme
Priya, R. Lakshmi; Sadasivam, V.
2015-01-01
Providing authentication and integrity in medical images is a problem, and this work proposes a new blind, fragile, region-based lossless reversible watermarking technique to improve the trustworthiness of medical images. The proposed technique embeds the watermark using a reversible least-significant-bit embedding scheme. The scheme combines hashing, compression, and digital signature techniques to create a content-dependent watermark, making use of the compressed region of interest (ROI) for recovery of the ROI as reported in the literature. Experiments were carried out to prove the performance of the scheme, and their assessment reveals that the ROI is extracted intact and that the PSNR values obtained indicate that the presented scheme offers greater protection for health imagery. PMID:26649328
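The core reversibility idea behind least-significant-bit (LSB) watermark embedding can be shown in a few lines. The sketch below is a deliberately simplified illustration, not the paper's scheme: real reversible schemes compress the displaced LSBs (or the ROI) into the payload itself, whereas here the original LSBs are kept as explicit side information for clarity, and the pixel values are made up.

```python
def embed_lsb(pixels, bits):
    """Embed watermark bits into pixel LSBs.

    Returns the watermarked pixels plus the displaced original LSBs,
    so the exact cover image can later be restored bit-for-bit.
    """
    assert len(bits) <= len(pixels)
    original_lsbs = [p & 1 for p in pixels[:len(bits)]]
    marked = [(p & ~1) | b for p, b in zip(pixels, bits)] + pixels[len(bits):]
    return marked, original_lsbs

def extract_and_restore(marked, original_lsbs):
    """Recover the watermark bits and losslessly restore the cover pixels."""
    n = len(original_lsbs)
    bits = [p & 1 for p in marked[:n]]
    restored = [(p & ~1) | l
                for p, l in zip(marked[:n], original_lsbs)] + marked[n:]
    return bits, restored

cover = [120, 121, 130, 131, 140]
watermark = [1, 0, 1]
marked, side_info = embed_lsb(cover, watermark)
bits, restored = extract_and_restore(marked, side_info)
```

Because each embedded bit changes a pixel by at most 1, distortion is minimal (high PSNR), and keeping the displaced LSBs is what makes the scheme lossless for the protected region.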
Role of Square Flap in Post Burn Axillary Contractures.
Karki, Durga; Narayan, Ravi Prakash
2017-09-01
Post-burn contractures are a commonly encountered problem and many techniques have been described in their treatment. Z-plasties are the commonest local flap procedure done for linear bands with adjacent healthy tissue. Our aim was to assess the use of square flap technique in axillary contractures. Ten patients with type I and II axillary contractures underwent release by the square flap technique. All cases were followed up for at least one year and analysed for range of motion and aesthetic outcome. All cases achieved full range of movement postoperatively with no recurrence during follow up period and a good cosmetic outcome. Square flap was shown to be a reliable technique for mild to moderate axillary contractures of the anterior or posterior axillary folds even when there is significant adjacent scarring of chest wall or back of types I and II.
Primal-dual techniques for online algorithms and mechanisms
NASA Astrophysics Data System (ADS)
Liaghat, Vahid
An offline algorithm is one that knows the entire input in advance. An online algorithm, however, processes its input in a serial fashion. In contrast to offline algorithms, an online algorithm works in a local fashion and has to make irrevocable decisions without having the entire input. Online algorithms are often not optimal since their irrevocable decisions may turn out to be inefficient after receiving the rest of the input. For a given online problem, the goal is to design algorithms which are competitive against the offline optimal solutions. In a classical offline scenario, it is often common to see a dual analysis of problems that can be formulated as a linear or convex program. Primal-dual and dual-fitting techniques have been successfully applied to many such problems. Unfortunately, the usual tricks fall short in an online setting, since an online algorithm must make decisions without knowing even the whole program. In this thesis, we study the competitive analysis of fundamental problems in the literature, such as different variants of online matching and online Steiner connectivity, via online dual techniques. Although there are many generic tools for solving an optimization problem in the offline paradigm, much less is known for tackling online problems. The main focus of this work is to design generic techniques for solving integral linear optimization problems where the solution space is restricted via a set of linear constraints. A general family of these problems are online packing/covering problems. Our work shows that for several seemingly unrelated problems, primal-dual techniques can be successfully applied as a unifying approach for analyzing these problems. We believe this leads to generic algorithmic frameworks for solving online problems. In the first part of the thesis, we show the effectiveness of our techniques in the stochastic settings and their applications in Bayesian mechanism design.
In particular, we introduce new techniques for solving a fundamental linear optimization problem, namely, the stochastic generalized assignment problem (GAP). This packing problem generalizes various problems such as online matching, ad allocation, bin packing, etc. We furthermore show applications of such results in mechanism design by introducing Prophet Secretary, a novel Bayesian model for online auctions. In the second part of the thesis, we focus on covering problems. We develop the framework of "Disk Painting" for a general class of network design problems that can be characterized by proper functions. This class generalizes the node-weighted and edge-weighted variants of several well-known Steiner connectivity problems. We furthermore design a generic technique for solving the prize-collecting variants of these problems when there exists a dual analysis for the non-prize-collecting counterparts. Hence, we solve the online prize-collecting variants of several network design problems for the first time. Finally, we focus on designing techniques for online problems with mixed packing/covering constraints. We initiate the study of degree-bounded graph optimization problems in the online setting by designing an online algorithm with a tight competitive ratio for the degree-bounded Steiner forest problem. We hope these techniques establish a starting point for the analysis of the important class of online degree-bounded optimization on graphs.
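The flavor of competitive analysis described above is easiest to see on the textbook warm-up problem, ski rental (rent for 1 per day or buy once): the break-even online rule is the standard first example motivating primal-dual online analysis. This sketch is a generic illustration, not taken from the thesis; the function names and costs are assumptions.

```python
def ski_rental_online(days_skied, buy_cost, rent_cost=1):
    """Break-even rule: rent until the total rent paid would reach the
    purchase price, then buy. This online rule is 2-competitive: it
    never pays more than twice the offline optimum, for any input."""
    spent = 0
    for _day in range(days_skied):
        if spent + rent_cost >= buy_cost:
            return spent + buy_cost   # buy now; no further rent is paid
        spent += rent_cost
    return spent                      # season ended while still renting

def offline_optimum(days_skied, buy_cost, rent_cost=1):
    """With full knowledge of the input, pick the cheaper of the two."""
    return min(days_skied * rent_cost, buy_cost)

# Worst case for the online rule: the season ends right after buying.
online_cost = ski_rental_online(20, 10)    # pays 9 rent + 10 purchase = 19
optimal_cost = offline_optimum(20, 10)     # would simply buy for 10
```

The irrevocable decision here is the purchase: made without knowing how many days remain, it can be wasteful in hindsight, yet the competitive ratio bounds the damage at a factor of 2.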
NASA Astrophysics Data System (ADS)
Yang, Jing; Reichert, Peter; Abbaspour, Karim C.; Yang, Hong
2007-07-01
Calibration of hydrologic models is very difficult because of measurement errors in input and response, errors in model structure, and the large number of non-identifiable parameters of distributed models. The difficulties even increase in arid regions with high seasonal variation of precipitation, where the modelled residuals often exhibit high heteroscedasticity and autocorrelation. On the other hand, support of water management by hydrologic models is important in arid regions, particularly if there is increasing water demand due to urbanization. The use and assessment of model results for this purpose require a careful calibration and uncertainty analysis. Extending earlier work in this field, we developed a procedure to overcome (i) the problem of non-identifiability of distributed parameters by introducing aggregate parameters and using Bayesian inference, (ii) the problem of heteroscedasticity of errors by combining a Box-Cox transformation of results and data with seasonally dependent error variances, (iii) the problems of autocorrelated errors, missing data and outlier omission with a continuous-time autoregressive error model, and (iv) the problem of the seasonal variation of error correlations with seasonally dependent characteristic correlation times. The technique was tested with the calibration of the hydrologic sub-model of the Soil and Water Assessment Tool (SWAT) in the Chaohe Basin in North China. The results demonstrated the good performance of this approach to uncertainty analysis, particularly with respect to the fulfilment of statistical assumptions of the error model. A comparison with an independent error model and with error models that only considered a subset of the suggested techniques clearly showed the superiority of the approach based on all the features (i)-(iv) mentioned above.
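The Box-Cox transformation used in step (ii) is a standard one-parameter power transform applied to both model output and observations so that residuals on the transformed scale have roughly constant variance. A minimal sketch of the transform and its inverse follows; the function names are assumptions, and the lambda value in the example is arbitrary rather than taken from the SWAT calibration.

```python
import math

def box_cox(y, lam):
    """Box-Cox transform for y > 0:
    (y**lam - 1)/lam for lam != 0, and log(y) in the limit lam -> 0.
    Small lam compresses large values, damping heteroscedastic errors."""
    if lam == 0:
        return math.log(y)
    return (y ** lam - 1.0) / lam

def box_cox_inverse(z, lam):
    """Inverse transform, used to map results back to the original scale."""
    if lam == 0:
        return math.exp(z)
    return (lam * z + 1.0) ** (1.0 / lam)

# Round trip on an arbitrary discharge value with lambda = 0.3
z = box_cox(5.0, 0.3)
y_back = box_cox_inverse(z, 0.3)
```

In a calibration setting, the likelihood is evaluated on the transformed residuals (here combined with seasonally dependent variances, per the abstract), and predictions are back-transformed for reporting.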
Braeken, Anna P B M; Lechner, Lilian; van Gils, Francis C J M; Houben, Ruud M A; Eekers, Daniëlle; Ambergen, Ton; Kempen, Gertrudis I J M
2009-06-09
The Screening Inventory of Psychosocial Problems (SIPP) is a short, validated self-reported questionnaire to identify psychosocial problems in Dutch cancer patients. The one-page 24-item questionnaire assesses physical complaints, psychological complaints and social and sexual problems. Very little is known about the effects of using the SIPP in consultation settings. Our study aims are to test the hypotheses that using the SIPP (a) may contribute to adequate referral to relevant psychosocial caregivers, (b) should facilitate communication between radiotherapists and cancer patients about psychosocial distress and (c) may prevent underdiagnosis of early symptoms reflecting psychosocial problems. This paper presents the design of a cluster randomised controlled trial (CRCT) evaluating the effectiveness of using the SIPP in cancer patients treated with radiotherapy. A CRCT is developed using a Solomon four-group design (two intervention and two control groups) to evaluate the effects of using the SIPP. Radiotherapists, instead of cancer patients, are randomly allocated to the experimental or control groups. Within these groups, all included cancer patients are randomised into two subgroups: with and without pre-measurement. Self-reported assessments are conducted at four times: a pre-test at baseline before the first consultation and a post-test directly following the first consultation, and three and 12 months after baseline measurement. The primary outcome measures are the number and types of referrals of cancer patients with psychosocial problems to relevant (psychosocial) caregivers. The secondary outcome measures are patients' satisfaction with the radiotherapist-patient communication, psychosocial distress and quality of life. Furthermore, a process evaluation will be carried out. 
Data of the effect-evaluation will be analysed according to the intention-to-treat principle, and data regarding the types of referrals to health care providers and patient satisfaction with the radiotherapist-patient communication will be analysed by means of descriptive techniques. The process evaluation data will also be analysed by means of descriptive techniques. Using the SIPP may prevent underdiagnosis of early symptoms reflecting psychosocial problems, should facilitate communication between physicians and patients about psychosocial distress, and may contribute to adequate referral to relevant (psychosocial) caregivers. NCT00859768.
Quantification of Liver Iron with MRI: State of the Art and Remaining Challenges
Hernando, Diego; Levin, Yakir S; Sirlin, Claude B; Reeder, Scott B
2015-01-01
Liver iron overload is the histological hallmark of hereditary hemochromatosis and transfusional hemosiderosis, and can also occur in chronic hepatopathies. Iron overload can result in liver damage, with the eventual development of cirrhosis, liver failure and hepatocellular carcinoma. Assessment of liver iron levels is necessary for detection and quantitative staging of iron overload, and monitoring of iron-reducing treatments. This article discusses the need for non-invasive assessment of liver iron, and reviews qualitative and quantitative methods with a particular emphasis on MRI. Specific MRI methods for liver iron quantification include signal intensity ratio as well as R2 and R2* relaxometry techniques. Methods that are in clinical use, as well as their limitations, are described. Remaining challenges, unsolved problems, and emerging techniques to provide improved characterization of liver iron deposition are discussed. PMID:24585403
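R2* relaxometry, one of the quantitative methods reviewed above, typically fits a mono-exponential decay S(TE) = S0·exp(−R2*·TE) to multi-echo gradient-echo magnitudes. The sketch below is a simplified log-linear fit on noiseless synthetic data; real clinical methods must additionally handle the noise floor and fat signal, and the echo times, signal values, and units (R2* per ms here, rather than the clinically reported s⁻¹) are illustrative assumptions.

```python
import math

def estimate_r2star(echo_times_ms, signals):
    """Mono-exponential R2* fit via log-linear least squares.

    Model: S(TE) = S0 * exp(-R2* * TE), so log S is linear in TE and
    R2* is minus the fitted slope. Higher liver iron shortens T2*
    and therefore raises R2*.
    """
    xs = echo_times_ms
    ys = [math.log(s) for s in signals]
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    slope = sum((x - mx) * (y - my) for x, y in zip(xs, ys)) \
        / sum((x - mx) ** 2 for x in xs)
    return -slope

# Synthetic 4-echo acquisition with true R2* = 0.05 /ms (T2* = 20 ms)
tes = [1.0, 3.0, 5.0, 7.0]
signals = [100.0 * math.exp(-0.05 * te) for te in tes]
r2star = estimate_r2star(tes, signals)
```

On noiseless data the fit recovers the true decay rate exactly, which is why the log-linear form is a convenient baseline before adding the corrections the article discusses.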
NASA Astrophysics Data System (ADS)
Volkov, D.
2017-12-01
We introduce an algorithm for the simultaneous reconstruction of faults and slip fields on those faults. We define a regularized functional to be minimized for the reconstruction. We prove that the minimum of that functional converges to the unique solution of the related fault inverse problem. Due to inherent uncertainties in measurements, rather than seeking a deterministic solution to the fault inverse problem, we consider a Bayesian approach. The advantage of such an approach is that we obtain a way of quantifying uncertainties as part of our final answer. On the downside, this Bayesian approach leads to a very large computation. To contend with the size of this computation we developed an algorithm for the numerical solution to the stochastic minimization problem which can be easily implemented on a parallel multi-core platform and we discuss techniques to save on computational time. After showing how this algorithm performs on simulated data and assessing the effect of noise, we apply it to measured data. The data was recorded during a slow slip event in Guerrero, Mexico.
NASA Astrophysics Data System (ADS)
Metcalfe, C.; Bennett, E.; Chappell, M.; Steevens, J.; Depledge, M.; Goss, G.; Goudey, S.; Kaczmar, S.; O'Brien, N.; Picado, A.; Ramadan, A. B.
Traditional risk assessment procedures are inadequate for predicting the ecological risks associated with the release of nanomaterials (NM) into the environment. The root of the problem lies in an inadequate application of solid phase chemical principles (e.g. particle size, shape, functionality) for the risk assessment of NMs. Thus, the "solubility" paradigm used to evaluate the risks associated with other classes of contaminants must be replaced by a "dispersivity" paradigm for evaluating the risks associated with NM. The pace of development of NM will exceed the capacity to conduct adequate risk assessments using current methods and approaches. Each NM product will be available in a variety of size classes and with different surface functionalizations; probably requiring multiple risk assessments for each NM. The "SMARTEN" approach to risk assessment involves having risk assessors play a more proactive role in evaluating all aspects of the NM life cycle and in making decisions to develop lower risk NM products. Improved problem formulation could come from considering the chemical, physical and biological properties of NMs. New effects assessment techniques are needed to evaluate cellular binding and uptake potential, such as biological assays for binding to macromolecules or organelles, phagocytic activity, and active/passive uptake processes. Tests should be developed to evaluate biological effects with multiple species across a range of trophic levels. Despite our best efforts to assess the risks associated with NM, previous experience indicates that some NM products will enter the environment and cause biological effects. Therefore, risk assessors should support programs for reconnaissance and surveillance to detect the impacts of NM before irreversible damage occurs. 
New analytical tools are needed for surveillance, including sensors for detecting NMs, passive sampling systems, and improved methods for separation and characterization of NMs in environmental matrices, as well as biomarker techniques to evaluate exposure to NMs. Risk assessors should use this information to refine data quality, determine future risk assessment objectives and to communicate interim conclusions to a wide group of stakeholders.
Risk assessment techniques with applicability in marine engineering
NASA Astrophysics Data System (ADS)
Rudenko, E.; Panaitescu, F. V.; Panaitescu, M.
2015-11-01
Nowadays risk management is a carefully planned process. The task of risk management is organically woven into the general problem of increasing the efficiency of business. A passive attitude to risk and mere awareness of its existence are being replaced by active management techniques. Risk assessment is one of the most important stages of risk management, since for risk management it is necessary first to analyze and evaluate risk. There are many definitions of this notion, but in the general case risk assessment refers to the systematic process of identifying the factors and types of risk and their quantitative assessment; i.e., risk analysis methodology combines mutually complementary quantitative and qualitative approaches. Purpose of the work: In this paper we consider Fault Tree Analysis (FTA) as a risk assessment technique. The objectives are: understand the purpose of FTA, understand and apply the rules of Boolean algebra, analyse a simple system using FTA, and review FTA advantages and disadvantages. Research and methodology: The main purpose is to help identify potential causes of system failures before the failures actually occur, and to evaluate the probability of the Top event. The steps of this analysis are: examination of the system from the top down, the use of symbols to represent events, the use of mathematical tools for critical areas, and the use of fault tree logic diagrams to identify the cause of the Top event. Results: The study yields the critical areas, fault tree logic diagrams, and the probability of the Top event. These results can be used for risk assessment analyses.
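The Boolean algebra step above reduces, for independent basic events, to two standard gate formulas: an AND gate multiplies the event probabilities, and an OR gate gives one minus the product of the complements. The sketch below applies them to a hypothetical two-pump/one-valve tree; the component names and probabilities are made up for illustration, not from the paper.

```python
def and_gate(probs):
    """Top probability of an AND gate over independent basic events:
    product of the p_i (all inputs must fail)."""
    result = 1.0
    for p in probs:
        result *= p
    return result

def or_gate(probs):
    """Top probability of an OR gate over independent basic events:
    1 - product of (1 - p_i) (any single input failing suffices)."""
    result = 1.0
    for p in probs:
        result *= (1.0 - p)
    return 1.0 - result

# Toy fault tree: the Top event occurs if both redundant pumps fail
# (AND gate) or the single valve fails (OR gate at the top).
p_pump_a, p_pump_b, p_valve = 0.01, 0.02, 0.001
p_top = or_gate([and_gate([p_pump_a, p_pump_b]), p_valve])
```

Evaluating gates bottom-up like this gives the Top event probability directly once the tree and the basic-event probabilities are fixed; dependent events or repeated basic events require the fuller Boolean reduction (minimal cut sets) described in the FTA literature.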
NASA Technical Reports Server (NTRS)
Oren, J. A.
1981-01-01
Candidate techniques for thermal management of unmanned modules docked to a large 250 kW platform were evaluated. Both automatically deployed and space constructed radiator systems were studied to identify characteristics and potential problems. Radiator coating requirements and current state-of-the-art were identified. An assessment of the technology needs was made and advancements were recommended.
Mace, Georgina M; Gittleman, John L; Purvis, Andy
2003-06-13
Phylogenies provide new ways to measure biodiversity, to assess conservation priorities, and to quantify the evolutionary history in any set of species. Methodological problems and a lack of knowledge about most species have so far hampered their use. In the future, as techniques improve and more data become accessible, we will have an expanded set of conservation options, including ways to prioritize outcomes from evolutionary and ecological processes.
Khiami, F; Di Schino, M; Sariali, E; Cao, D; Rolland, E; Catonné, Y
2013-09-01
The Bosworth technique is old but still widely used. It poses problems in precisely determining the length of the Achilles tendon and involves a volume effect in the turndown area. A new reconstruction technique is assessed, based on free sural triceps aponeurosis transfer without turndown, associated with a tendon-shortening suture. Twenty-three patients were assessed by AOFAS score and clinical examination (plus MRI in 14 cases) at a mean 24.5 months' follow-up. Mean age was 52.1 years. Mean pre-operative AOFAS score was 63.6/100. Mean postoperative AOFAS score was 96.1. Mean graft length was 7.5 cm. Surgical revision was required for one case of postoperative infection. Twelve patients resumed leisure sports at their previous level by a mean 9.4 ± 2 months; three competitive sportsmen resumed sport at their previous level by a mean 7.6 months. None were dissatisfied or disappointed with their operation. MRI performed at 1 year found increased tendon volume without abnormality in 57% of cases; 43% showed abnormal images. Functional results were comparable to literature reports. It can be difficult to determine Achilles tendon length for the Bosworth technique: this is made easier by conserving a fibrous support of a length determined with reference to the healthy side. The technique avoids aponeurosis turndown, and thus avoids the problem of the plasty volume effect. The two cases of cutaneous complication occurred in the two most elderly patients, raising the question of the indications for reconstructive surgery in the elderly. The abnormalities found on MRI concerned scar tissue remodeling in patients with good or excellent clinical results. Level IV, retrospective study. Copyright © 2013 Elsevier Masson SAS. All rights reserved.
Assessment of bone health in children with disabilities.
Kecskemethy, Heidi H; Harcke, H Theodore
2014-01-01
Evaluating the bone health of children with disabilities is challenging and requires consideration of many factors in clinical decision-making. Feeding problems and growth deficits, immobility or inability to bear weight, the effects of medications, and the nature of the underlying disease can all directly affect a child's overall bone health. Familiarity with the tools available to assess bone health is important for practitioners. The most commonly used method to assess bone density, dual energy x-ray absorptiometry, can be performed effectively when one appreciates the techniques that make scanning patients with disabilities possible. There are specific techniques that are especially useful for measuring bone density in children with disabilities; standard body sites are not always obtainable. Clinical condition and treatment must be considered when interpreting dual energy x-ray absorptiometry scans. Serial measurements have been shown to be effective in monitoring change in bone content and in providing information on which to base decisions regarding medical treatment.
Stage fright: its experience as a problem and coping with it.
Studer, Regina; Gomez, Patrick; Hildebrandt, Horst; Arial, Marc; Danuser, Brigitta
2011-10-01
This questionnaire survey of 190 university music students assessed negative feelings of music performance anxiety (MPA) before performing, the experience of stage fright as a problem, and how closely they are associated with each other. The study further investigated whether the experience of stage fright as a problem and negative feelings of MPA predict the coping behavior of the music students. Rarely addressed coping issues were assessed, i.e., self-perceived effectiveness of different coping strategies, knowledge of possible risks and acceptance of substance-based coping strategies, and need for more support. The results show that one-third of the students experienced stage fright as a problem and that this was only moderately correlated with negative feelings of MPA. The experience of stage fright as a problem significantly predicted the frequency of use and the acceptance of medication as a coping strategy. Breathing exercises and self-control techniques were rated as effective as medication. Finally, students expressed a strong need to receive more support (65%) and more information (84%) concerning stage fright. Stage fright was experienced as a problem and perceived as having negative career consequences by a considerable percentage of the surveyed students. In addition to a desire for more help and support, the students expressed an openness and willingness to seriously discuss and address the topic of stage fright. This provides a necessary and promising basis for optimal career preparation and, hence, an opportunity to prevent occupational problems in professional musicians.
Assessment of distraction from erotic stimuli by nonerotic interference.
Anderson, Alex B; Hamilton, Lisa Dawn
2015-01-01
Distraction from erotic cues during sexual encounters is a major contributor to sexual difficulties in men and women. Being able to assess distraction in studies of sexual arousal will help clarify underlying contributions to sexual problems. The current study aimed to identify the most accurate assessment of distraction from erotic cues in healthy men (n = 29) and women (n = 38). Participants were assigned to a no distraction, low distraction, or high distraction condition. Distraction was induced using an auditory distraction task presented during the viewing of an erotic video. Attention to erotic cues was assessed using three methods: a written quiz, a visual quiz, and a self-reported distraction measure. Genital and psychological sexual responses were also measured. Self-reported distraction and written quiz scores most accurately represented the level of distraction present, while self-reported distraction also corresponded with a decrease in genital arousal. Findings support the usefulness of self-report measures in conjunction with a brief quiz on the erotic material as the most accurate and sensitive ways to simply measure experimentally-induced distraction. Insight into distraction assessment techniques will enable evaluation of naturally occurring distraction in patients suffering from sexual problems.
A boundary element alternating method for two-dimensional mixed-mode fracture problems
NASA Technical Reports Server (NTRS)
Raju, I. S.; Krishnamurthy, T.
1992-01-01
A boundary element alternating method, denoted herein as BEAM, is presented for two-dimensional fracture problems. This is an iterative method that alternates between two solutions. An analytical solution for arbitrary polynomial normal and tangential pressure distributions applied to the crack faces of an embedded crack in an infinite plate is used as the fundamental solution in the alternating method. A boundary element method for an uncracked finite plate is the second solution. For problems of edge cracks, a technique of utilizing finite elements with BEAM is presented to overcome the inherent singularity in boundary element stress calculation near the boundaries. Several computational aspects that make the algorithm efficient are presented. Finally, the BEAM is applied to a variety of two-dimensional crack problems with different configurations and loadings to assess the validity of the method. The method gives accurate stress intensity factors with minimal computing effort.
What if? On alternative conceptual models and the problem of their implementation
NASA Astrophysics Data System (ADS)
Neuberg, Jurgen
2015-04-01
Seismic and other monitoring techniques rely on a set of conceptual models on the basis of which data sets can be interpreted. To do this at an operational level in volcano observatories, these models need to be tested and ready for interpretation in a timely manner. Once a model is established, the scientists in charge of advising stakeholders and decision makers often stick firmly to it, to avoid confusing non-experts with alternative versions of the interpretation. This talk gives an overview of widely accepted conceptual models used to interpret seismic and deformation data, and highlights, in a few case studies, some of the problems that arise. Aspects covered include knowledge transfer between research institutions and observatories, data sharing, the problem of the uptake of advice, and some hidden problems which turn out to be much more critical in assessing volcanic hazard than the actual data interpretation.
NASA Astrophysics Data System (ADS)
Aminot, A.
1996-09-01
An essential prerequisite for quality assurance of the colorimetric determination of nutrients in seawater is the use of suitable photometric equipment. Based on a knowledge of the optical characteristics of a particular system and the absorption coefficient of the analyte, a statistical approach can be used to predict the limit of detection and the limit of quantitation for a given determinand. The microplate technique, widely used for bioassays, is applicable to colorimetric analysis in general, and its use for the determination of nutrients in seawater has been suggested. This paper reports a theoretical assessment of its capabilities in this context and a practical check on its performance, taking the determination of nitrite in seawater as typical. The conclusion is that short optical path length and insufficient repeatability of the absorbance measurement render it unsuitable for the determination of the low concentrations generally encountered in marine work, with the possible exception of nitrate. The perceived advantage of high-speed analysis is a secondary consideration in the overall process of determining nutrients, and the microplate technique's small scale of operation is a definite disadvantage as this increases the risk of exposure to contamination problems, in comparison with conventional techniques.
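The statistical prediction of detection and quantitation limits mentioned above often reduces, in practice, to the common 3-sigma/10-sigma convention; a minimal sketch follows. The convention and the numerical values are generic illustrations, not the paper's exact statistical treatment.

```python
def limit_of_detection(blank_sd, slope):
    """Common 3-sigma convention: LOD = 3 * s_blank / sensitivity (slope)."""
    return 3.0 * blank_sd / slope

def limit_of_quantitation(blank_sd, slope):
    """Common 10-sigma convention: LOQ = 10 * s_blank / sensitivity."""
    return 10.0 * blank_sd / slope

# Hypothetical nitrite calibration: standard deviation of blank
# absorbance readings and calibration slope in absorbance per (umol/L).
s_blank = 0.0005   # absorbance units
slope = 0.05       # absorbance units per umol/L
print(limit_of_detection(s_blank, slope))     # ~0.03 umol/L
print(limit_of_quantitation(s_blank, slope))  # ~0.1 umol/L
```

The sketch also makes the paper's conclusion concrete: a shorter optical path lowers the effective slope, which directly inflates the achievable detection limit of the microplate technique.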
Postoperative pain management in the postanesthesia care unit: an update
Luo, Jie; Min, Su
2017-01-01
Acute postoperative pain remains a major problem, resulting in multiple undesirable outcomes if inadequately controlled. Most surgical patients spend their immediate postoperative period in the postanesthesia care unit (PACU), where pain management remains unsatisfactory, requires improvement, and affects further recovery. Recent studies on postoperative pain management in the PACU were reviewed for advances in assessment and treatment. More objective assessments of pain that are independent of patients' participation may be appropriate in the PACU, including photoplethysmography-derived parameters, the analgesia nociception index, skin conductance, and pupillometry, although further studies are needed to confirm their utility. Multimodal analgesia with different analgesics and techniques has been widely used. On the theoretical basis of preventing central sensitization, preventive analgesia is increasingly common. New opioids are being developed to minimize the adverse effects of traditional opioids. More intravenous nonopioid analgesics and adjuncts (such as dexmedetomidine and dexamethasone) are being introduced for their opioid-sparing effects. Current evidence suggests that regional analgesic techniques are effective in reducing pain and the length of stay in the PACU. As available alternatives to epidural analgesia, perineural and infiltrative techniques including wound infiltration, transversus abdominis plane block, local infiltration analgesia, and intraperitoneal administration have played an increasingly important role owing to their effectiveness and safety. PMID:29180895
Distal Tracheal Resection and Reconstruction: State of the Art and Lessons Learned.
Mathisen, Douglas
2018-05-01
Tracheal disease is an infrequent problem requiring surgery. A high index of suspicion is necessary to correctly diagnose the problems. Primary concerns are safe control and assessment of the airway, familiarity with the principles of airway surgery, preserving tracheal blood supply, and avoiding anastomotic tension. A precise, reproducible anastomotic technique must be mastered. Operation requires close cooperation with a knowledgeable anesthesia team. The surgeon must understand how to achieve the least possible tension on the anastomosis. It is advisable to examine the airway before discharge to check for normal healing and airway patency. Copyright © 2018 Elsevier Inc. All rights reserved.
Novel sensing technology in fall risk assessment in older adults: a systematic review.
Sun, Ruopeng; Sosnoff, Jacob J
2018-01-16
Falls are a major health problem for older adults, with significant physical and psychological consequences. A first step of successful fall prevention is to identify those at risk of falling. Recent advancements in sensing technology offer the possibility of objective, low-cost and easy-to-implement fall risk assessment. The objective of this systematic review is to assess the current state of sensing technology in providing objective fall risk assessment in older adults. A systematic review was conducted in accordance with the Preferred Reporting Items for Systematic Reviews and Meta-Analyses (PRISMA) statement. Twenty-two studies out of 855 articles were systematically identified and included in this review. Pertinent methodological features (sensing technique, assessment activities, outcome variables, and fall discrimination/prediction models) were extracted from each article. Four major sensing technologies (inertial sensors, video/depth cameras, pressure sensing platforms and laser sensing) were reported to provide accurate fall risk diagnostics in older adults. Steady-state walking, static/dynamic balance, and functional mobility were used as the assessment activities. A diverse range of diagnostic accuracies across studies (47.9%-100%) was reported, owing to variation in measured kinematic/kinetic parameters and modelling techniques. A wide range of sensor technologies has been utilized in fall risk assessment in older adults. Overall, these devices have the potential to provide an accurate, inexpensive, and easy-to-implement fall risk assessment. However, the variation in measured parameters, assessment tools, sensor sites, movement tasks, and modelling techniques precludes a firm conclusion on their ability to predict future falls. Future work is needed to determine a clinically meaningful and easy-to-interpret fall risk diagnosis utilizing sensing technology. Additionally, the gap between functional evaluation and users' experience with technology should be addressed.
Experimental evaluation of certification trails using abstract data type validation
NASA Technical Reports Server (NTRS)
Wilson, Dwight S.; Sullivan, Gregory F.; Masson, Gerald M.
1993-01-01
Certification trails are a recently introduced and promising approach to fault detection and fault tolerance. Recent experimental work reveals many cases in which a certification-trail approach allows for significantly faster program execution time than a basic time-redundancy approach. Algorithms for answer validation of abstract data types allow a certification trail approach to be used for a wide variety of problems. An attempt to assess the performance of algorithms utilizing certification trails on abstract data types is reported. Specifically, this method was applied to the following problems: heapsort, Huffman tree, shortest path, and skyline. Previous results used certification trails specific to a particular problem and implementation. The approach allows certification trails to be localized to 'data structure modules,' making the use of this technique transparent to the user of such modules.
An Island Grouping Genetic Algorithm for Fuzzy Partitioning Problems
Salcedo-Sanz, S.; Del Ser, J.; Geem, Z. W.
2014-01-01
This paper presents a novel fuzzy clustering technique based on grouping genetic algorithms (GGAs), which are a class of evolutionary algorithms especially modified to tackle grouping problems. Our approach hinges on a GGA devised for fuzzy clustering by means of a novel encoding of individuals (containing elements and clusters sections), a new fitness function (a superior modification of the Davies-Bouldin index), specially tailored crossover and mutation operators, and the use of a scheme based on a local search and a parallelization process, inspired by an island-based model of evolution. The overall performance of our approach has been assessed over a number of synthetic and real fuzzy clustering problems with different objective functions and distance measures, from which it is concluded that the proposed approach shows excellent performance in all cases. PMID:24977235
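As a point of reference for the fitness function mentioned above, the unmodified Davies-Bouldin index can be sketched for one-dimensional data. This toy version uses mean absolute deviation as the cluster scatter measure and is only an illustration, not the paper's modified criterion.

```python
def davies_bouldin(clusters):
    """Davies-Bouldin index for 1-D clusters (lower is better).
    clusters: list of lists of numbers, one inner list per cluster."""
    centroids = [sum(c) / len(c) for c in clusters]
    scatter = [sum(abs(x - m) for x in c) / len(c)
               for c, m in zip(clusters, centroids)]
    k = len(clusters)
    total = 0.0
    for i in range(k):
        # worst-case similarity of cluster i to any other cluster
        total += max((scatter[i] + scatter[j]) / abs(centroids[i] - centroids[j])
                     for j in range(k) if j != i)
    return total / k

tight = [[0.0, 1.0], [10.0, 11.0]]   # compact, well-separated clusters
print(davies_bouldin(tight))  # small value: good partition
```

Because lower values indicate more compact, better-separated clusters, a GGA fitness built on this index rewards individuals whose grouping sections encode such partitions.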
NASA Astrophysics Data System (ADS)
Sidery, T.; Aylott, B.; Christensen, N.; Farr, B.; Farr, W.; Feroz, F.; Gair, J.; Grover, K.; Graff, P.; Hanna, C.; Kalogera, V.; Mandel, I.; O'Shaughnessy, R.; Pitkin, M.; Price, L.; Raymond, V.; Röver, C.; Singer, L.; van der Sluys, M.; Smith, R. J. E.; Vecchio, A.; Veitch, J.; Vitale, S.
2014-04-01
The problem of reconstructing the sky position of compact binary coalescences detected via gravitational waves is a central one for future observations with the ground-based network of gravitational-wave laser interferometers, such as Advanced LIGO and Advanced Virgo. Different techniques for sky localization have been independently developed. They can be divided in two broad categories: fully coherent Bayesian techniques, which are high latency and aimed at in-depth studies of all the parameters of a source, including sky position, and "triangulation-based" techniques, which exploit the data products from the search stage of the analysis to provide an almost real-time approximation of the posterior probability density function of the sky location of a detection candidate. These techniques have previously been applied to data collected during the last science runs of gravitational-wave detectors operating in the so-called initial configuration. Here, we develop and analyze methods for assessing the self consistency of parameter estimation methods and carrying out fair comparisons between different algorithms, addressing issues of efficiency and optimality. These methods are general, and can be applied to parameter estimation problems other than sky localization. We apply these methods to two existing sky localization techniques representing the two above-mentioned categories, using a set of simulated inspiral-only signals from compact binary systems with a total mass of ≤20M⊙ and nonspinning components. We compare the relative advantages and costs of the two techniques and show that sky location uncertainties are on average a factor ≈20 smaller for fully coherent techniques than for the specific variant of the triangulation-based technique used during the last science runs, at the expense of a factor ≈1000 longer processing time.
A technique for solving constraint satisfaction problems using Prolog's definite clause grammars
NASA Technical Reports Server (NTRS)
Nachtsheim, Philip R.
1988-01-01
A new technique for solving constraint satisfaction problems using Prolog's definite clause grammars is presented. It exploits the fact that the grammar rule notation can be viewed as a state exchange notation. The novel feature of the technique is that it can perform informed as well as blind search. It provides the Prolog programmer with a new technique for application to a wide range of design, scheduling, and planning problems.
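The abstract describes encoding search in Prolog's DCG rules; the underlying search itself is ordinary backtracking over variable assignments. A language-neutral sketch of that search (in Python rather than Prolog, with a hypothetical map-colouring problem) might look like:

```python
def backtrack(variables, domains, consistent, assignment=None):
    """Plain depth-first backtracking over a constraint satisfaction
    problem (a generic sketch of blind search; the paper's 'informed'
    variant would add heuristic variable/value ordering)."""
    if assignment is None:
        assignment = {}
    if len(assignment) == len(variables):
        return dict(assignment)
    var = next(v for v in variables if v not in assignment)
    for value in domains[var]:
        assignment[var] = value
        if consistent(assignment):
            result = backtrack(variables, domains, consistent, assignment)
            if result is not None:
                return result
        del assignment[var]
    return None

# Toy problem: colour three mutually adjacent regions with three colours.
adjacent = [("A", "B"), ("B", "C"), ("A", "C")]

def no_conflict(assignment):
    return all(assignment[x] != assignment[y]
               for x, y in adjacent if x in assignment and y in assignment)

solution = backtrack(["A", "B", "C"],
                     {v: ["red", "green", "blue"] for v in "ABC"},
                     no_conflict)
print(solution)  # three distinct colours
```

In the DCG formulation, the grammar rules play the role of the state-transition steps above, with Prolog's built-in backtracking supplying the search.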
High Temperature Corrosion Problem of Boiler Components in presence of Sulfur and Alkali based Fuels
NASA Astrophysics Data System (ADS)
Ghosh, Debashis; Mitra, Swapan Kumar
2011-04-01
Material degradation and ageing are of particular concern for fossil fuel fired power plant components. New techniques and approaches have been explored in recent years for residual life assessment of aged components and of material degradation due to different damage mechanisms such as creep, fatigue, corrosion and erosion. Apart from creep, the high temperature corrosion problem in a fossil fuel fired boiler is a matter of great concern if the fuel contains sulfur, chlorine, sodium, potassium, vanadium, etc. This paper discusses material degradation due to high temperature corrosion in different critical components of the boiler, such as water wall, superheater and reheater tubes, and also remedial measures to avoid premature failure. This paper also highlights the Residual Life Assessment (RLA) methodology for different critical boiler components based on high temperature fireside corrosion.
The "battle of gold" under the light of green economics: a case study from Greece
NASA Astrophysics Data System (ADS)
Damigos, D.; Kaliampakos, D.
2006-05-01
Mining firms stimulate local and national economies but this comes at a certain cost. In the light of increasing public concern, external costs of environmental degradation and social disruption are no longer of pure academic interest. The assessment of mining projects on the grounds of sustainable development is critical in order to decide whether the exploitation of mineral resources is socially desirable. In practice, few steps have been taken towards this end. In this paper, a case study is illustrated that provides the means for evaluating the social worthiness of mining projects. The analysis, which is the first of its kind in Greece, deals with a major problem of the mining industry: the gold debate on the grounds of green economics. The assessment is based on the social cost benefit approach. Well-established techniques (e.g. benefit transfer) and innovative approaches have been adopted to overcome various practical problems.
NASA Astrophysics Data System (ADS)
Omoragbon, Amen
Although the Aerospace and Defense (A&D) industry is a significant contributor to the United States' economy, national prestige and national security, it experiences significant cost and schedule overruns. This problem is related to the differences between technology acquisition assessments and aerospace vehicle conceptual design. Acquisition assessments evaluate broad sets of alternatives with mostly qualitative techniques, while conceptual design tools evaluate narrow sets of alternatives with multidisciplinary tools. In order for these two fields to communicate effectively, a common platform for both concerns is desired. This research is an original contribution to a three-part solution to this problem. It discusses the decomposition step of an innovative technology and sizing tool generation framework. It identifies complex multidisciplinary system definitions as a bridge between acquisition and conceptual design. It establishes complex multidisciplinary building blocks that can be used to build synthesis systems as well as technology portfolios. It also describes a graphical user interface designed to aid in the decomposition process. Finally, it demonstrates an application of the methodology to a relevant acquisition and conceptual design problem posed by the US Air Force.
Giddens, Jean
2006-03-01
Rapid changes in health care have underscored the need for reform in health professions education, including nursing education. One of many problems cited in the nursing and other health sciences education literature is overcrowded curricula; therefore, an evaluation of content is necessary. The purpose of this study was to determine whether differences exist in the frequency with which physical examination techniques are performed by associate and baccalaureate degree prepared nurses. Participants completed a survey on their performance of various physical examination techniques. A Mann-Whitney test showed no differences between the two groups in terms of the frequency of techniques performed. A small negative correlation was found between frequency and years of experience in the nutrition assessment category. A comparison of the physical examination content covered in baccalaureate and associate degree nursing programs is needed to further understand these findings.
Single-incision Laparoscopic Surgery (SILS) in general surgery: a review of current practice.
Froghi, Farid; Sodergren, Mikael Hans; Darzi, Ara; Paraskeva, Paraskevas
2010-08-01
Single-incision laparoscopic surgery (SILS) aims to eliminate multiple port incisions. Although general operative principles of SILS are similar to conventional laparoscopic surgery, operative techniques are not standardized. This review aims to evaluate the current use of SILS published in the literature by examining the types of operations performed, techniques employed, and relevant complications and morbidity. This review considered a total of 94 studies reporting 1889 patients evaluating 17 different general surgical operations. There were 8 different access techniques reported using conventional laparoscopic instruments and specifically designed SILS ports. There is extensive heterogeneity associated with operating methods and in particular ways of overcoming problems with retraction and instrumentation. Published complications, morbidity, and hospital length of stay are comparable to conventional laparoscopy. Although SILS provides excellent cosmetic results and morbidity seems similar to conventional laparoscopy, larger randomized controlled trials are needed to assess the safety and efficacy of this novel technique.
Aeronautical Decision Making for Helicopter Pilots
1987-02-01
ileOV through a self-assessment inventory and provides detailed explanations of pre-flight and in-flight stress management techniques. The assumption...of their knowledge to the improvement of helicopter safety. We also wish to thank Mr. John Christy who did such a fine job of illustrating the...one's knowledge, skills, and experience. A judgmental decision always involves a problem or choice, an unknown element, and usually a time constraint
Pacific Basin conference on hazardous waste: Proceedings
DOE Office of Scientific and Technical Information (OSTI.GOV)
NONE
This conference was held November 4--8, 1996 in Kuala Lumpur, Malaysia. The purpose of this conference was to provide a multidisciplinary forum for exchange of state-of-the-art information on the problems of hazardous waste. Topics of discussion deal with pollution prevention, waste treatment technology, health and ecosystem effects research, analysis and assessment, and regulatory management techniques. Individual papers have been processed separately for inclusion in the appropriate data bases.
Wilcox, Rand; Carlson, Mike; Azen, Stan; Clark, Florence
2013-03-01
Recently, there have been major advances in statistical techniques for assessing central tendency and measures of association. The practical utility of modern methods has been documented extensively in the statistics literature, but they remain underused and relatively unknown in clinical trials. Our objective was to address this issue. STUDY DESIGN AND PURPOSE: The first purpose was to review common problems associated with standard methodologies (low power, lack of control over type I errors, and incorrect assessments of the strength of the association). The second purpose was to summarize some modern methods that can be used to circumvent such problems. The third purpose was to illustrate the practical utility of modern robust methods using data from the Well Elderly 2 randomized controlled trial. In multiple instances, robust methods uncovered differences among groups and associations among variables that were not detected by classic techniques. In particular, the results demonstrated that details of the nature and strength of the association were sometimes overlooked when using ordinary least squares regression and Pearson correlation. Modern robust methods can make a practical difference in detecting and describing differences between groups and associations between variables. Such procedures should be applied more frequently when analyzing trial-based data. Copyright © 2013 Elsevier Inc. All rights reserved.
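One of the simplest of the modern robust estimators of central tendency alluded to above is the trimmed mean. The sketch below is a generic illustration with made-up numbers, not the trial's actual analysis; it shows how a single outlier that distorts the ordinary mean is neutralized.

```python
def trimmed_mean(data, proportion=0.2):
    """20% trimmed mean: drop the lowest and highest 20% of the sorted
    sample before averaging, limiting the influence of outliers."""
    xs = sorted(data)
    g = int(proportion * len(xs))
    kept = xs[g:len(xs) - g] if g else xs
    return sum(kept) / len(kept)

sample = [2, 3, 3, 4, 4, 5, 5, 6, 6, 40]   # one gross outlier
print(sum(sample) / len(sample))           # ordinary mean, pulled to 7.8
print(trimmed_mean(sample))                # trimmed mean stays at 4.5
```

Robust methods of this kind underlie the power and type I error advantages discussed in the abstract: the estimator's sampling distribution is far less sensitive to heavy tails than the ordinary mean's.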
Application of neural networks and sensitivity analysis to improved prediction of trauma survival.
Hunter, A; Kennedy, L; Henry, J; Ferguson, I
2000-05-01
The performance of trauma departments is widely audited by applying predictive models that assess probability of survival, and examining the rate of unexpected survivals and deaths. Although the TRISS methodology, a logistic regression modelling technique, is still the de facto standard, it is known that neural network models perform better. A key issue when applying neural network models is the selection of input variables. This paper proposes a novel form of sensitivity analysis, which is simpler to apply than existing techniques, and can be used for both numeric and nominal input variables. The technique is applied to the audit survival problem, and used to analyse the TRISS variables. The conclusions discuss the implications for the design of further improved scoring schemes and predictive models.
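The paper's specific sensitivity analysis is not reproduced here, but the general one-at-a-time idea, perturbing each input and ranking inputs by the resulting change in model output, can be sketched as follows. The surrogate "model" is a hypothetical stand-in for a trained network, not the TRISS model.

```python
def ota_sensitivity(model, x, delta=0.1):
    """One-at-a-time sensitivity: perturb each numeric input by delta
    and record the absolute change in the model's output."""
    base = model(x)
    scores = []
    for i in range(len(x)):
        perturbed = list(x)
        perturbed[i] += delta
        scores.append(abs(model(perturbed) - base))
    return scores

# Hypothetical linear surrogate: the second input is six times as
# influential as the first, so its sensitivity score should dominate.
model = lambda x: 0.5 * x[0] + 3.0 * x[1]
scores = ota_sensitivity(model, [1.0, 1.0])
print(scores)  # second score dominates
```

The technique proposed in the paper also handles nominal input variables; this sketch covers only the numeric case.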
NASA Technical Reports Server (NTRS)
Stricklin, J. A.; Haisler, W. E.; Von Riesemann, W. A.
1972-01-01
This paper presents an assessment of the solution procedures available for the analysis of inelastic and/or large deflection structural behavior. A literature survey is given which summarizes the contributions of other researchers in the analysis of structural problems exhibiting material nonlinearities and combined geometric-material nonlinearities. Attention is focused on evaluating the available computational and solution techniques. Each of the solution techniques is developed from a common equation of equilibrium in terms of pseudo forces. The solution procedures are applied to circular plates and shells of revolution in an attempt to compare and evaluate each with respect to computational accuracy, economy, and efficiency. Based on the numerical studies, observations and comments are made with regard to the accuracy and economy of each solution technique.
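A minimal one-degree-of-freedom analogue of the pseudo-force formulation, with assumed toy stiffness values: the nonlinear term is moved to the right-hand side as a pseudo force and the linear problem is re-solved iteratively until equilibrium:

```python
def solve_pseudo_force(F=1.0, k0=10.0, k3=4.0, tol=1e-10, max_iter=200):
    """Nonlinear spring k0*u + k3*u**3 = F, iterated in the pseudo-force
    form k0*u = F - k3*u**3 (illustrative numbers, not from the paper)."""
    u = 0.0
    for _ in range(max_iter):
        u_new = (F - k3 * u ** 3) / k0   # linear solve with pseudo force
        if abs(u_new - u) < tol:
            return u_new
        u = u_new
    raise RuntimeError("pseudo-force iteration did not converge")

u = solve_pseudo_force()
print(f"displacement u = {u:.6f}")
# residual check: k0*u + k3*u**3 should equal the applied load F
```

In the structural setting the scalar stiffness becomes the linear stiffness matrix and the cubic term a vector of pseudo forces, but the fixed-point structure of the iteration is the same.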
Molinari, Filippo; Meiburger, Kristen M; Suri, Jasjit
2011-01-01
The evaluation of the carotid artery wall is fundamental for the assessment of cardiovascular risk. This paper presents the general architecture of an automatic strategy, which segments the lumen-intima and media-adventitia borders, classified under a class of Patented AtheroEdge™ systems (Global Biomedical Technologies, Inc, CA, USA). Guidelines to produce accurate and repeatable measurements of the intima-media thickness are provided, and the problem of choosing among the different distance metrics one can adopt is addressed. We compared the results of a completely automatic algorithm that we developed with those of a semi-automatic algorithm, and showed final segmentation results for both techniques. The overall rationale is to provide user-independent high-performance techniques suitable for screening and remote monitoring.
NASA Astrophysics Data System (ADS)
Astley, R. J.; Sugimoto, R.; Mustafi, P.
2011-08-01
Novel techniques are presented to reduce noise from turbofan aircraft engines by optimising the acoustic treatment in engine ducts. The application of Computational Aero-Acoustics (CAA) to predict acoustic propagation and absorption in turbofan ducts is reviewed and a critical assessment of performance indicates that validated and accurate techniques are now available for realistic engine predictions. A procedure for integrating CAA methods with state of the art optimisation techniques is proposed in the remainder of the article. This is achieved by embedding advanced computational methods for noise prediction within automated and semi-automated optimisation schemes. Two different strategies are described and applied to realistic nacelle geometries and fan sources to demonstrate the feasibility of this approach for industry scale problems.
Wales, Andrew; Breslin, Mark; Davies, Robert
2006-09-10
Salmonella infection of laying flocks in the UK is predominantly a problem of the persistent contamination of layer houses and associated wildlife vectors by Salmonella Enteritidis. Methods for its control and elimination include effective cleaning and disinfection of layer houses between flocks, and it is important to be able to measure the success of such decontamination. A method for the environmental detection and semi-quantitative enumeration of salmonellae was used and compared with a standard qualitative method, in 12 Salmonella-contaminated caged layer houses before and after cleaning and disinfection. The quantitative technique proved to have comparable sensitivity to the standard method, and additionally provided insights into the numerical Salmonella challenge that replacement flocks would encounter. Elimination of S. Enteritidis was not achieved in any of the premises examined: in some, substantial reductions in the prevalence and numbers of salmonellae were demonstrated, whilst in others an increase in contamination was observed after cleaning and disinfection. Particular problems with feeders and wildlife vectors were highlighted. The use of a quantitative method assisted the identification of problem areas, such as those with a high initial bacterial load or those experiencing only a modest reduction in bacterial count following decontamination.
NASA Astrophysics Data System (ADS)
Wooh, Shi-Chang; Azar, Lawrence
1999-01-01
The degradation of civil infrastructure has placed a focus on effective nondestructive evaluation techniques to correctly assess the condition of existing concrete structures. Conventional high frequency ultrasonic responses are severely affected by scattering and material attenuation, resulting in weak and confusing signal returns. Therefore, low frequency ultrasonic transducers, which avoid this problem of wave attenuation, are commonly used for concrete, albeit with limited capabilities. The focus of this research is to ascertain some benefits and limitations of a low frequency ultrasonic phased array transducer. In this paper, we investigate a novel low-frequency ultrasonic phased array and report the results of experimental feasibility tests for practical condition assessment of concrete structures.
NASA Technical Reports Server (NTRS)
Fear, J. S.
1983-01-01
An assessment is made of the results of Phase 1 screening testing of current and advanced combustion system concepts using several broadened-properties fuels. The severity of each of several fuels-properties effects on combustor performance or liner life is discussed, as well as design techniques with the potential to offset these adverse effects. The selection of concepts to be pursued in Phase 2 refinement testing is described. This selection takes into account the relative costs and complexities of the concepts, the current outlook on pollutant emissions control, and practical operational problems.
Klegeris, Andis; Hurren, Heather
2011-12-01
Problem-based learning (PBL) can be described as a learning environment where the problem drives the learning. This technique usually involves learning in small groups, which are supervised by tutors. It is becoming evident that PBL in a small-group setting has a robust positive effect on student learning and skills, including better problem-solving skills and an increase in overall motivation. However, very little research has been done on the educational benefits of PBL in a large classroom setting. Here, we describe a PBL approach (using tutorless groups) that was introduced as a supplement to standard didactic lectures in University of British Columbia Okanagan undergraduate biochemistry classes consisting of 45-85 students. PBL was chosen as an effective method to assist students in learning biochemical and physiological processes. By monitoring student attendance and using informal and formal surveys, we demonstrated that PBL has a significant positive impact on student motivation to attend and participate in the course work. Student responses indicated that PBL is superior to traditional lecture format with regard to the understanding of course content and retention of information. We also demonstrated that student problem-solving skills are significantly improved, but additional controlled studies are needed to determine how much PBL exercises contribute to this improvement. These preliminary data indicated several positive outcomes of using PBL in a large classroom setting, although further studies aimed at assessing student learning are needed to further justify implementation of this technique in courses delivered to large undergraduate classes.
Simulating and assessing boson sampling experiments with phase-space representations
NASA Astrophysics Data System (ADS)
Opanchuk, Bogdan; Rosales-Zárate, Laura; Reid, Margaret D.; Drummond, Peter D.
2018-04-01
The search for new, application-specific quantum computers designed to outperform any classical computer is driven by the ending of Moore's law and the quantum advantages potentially obtainable. Photonic networks are promising examples, with experimental demonstrations and potential for obtaining a quantum computer to solve problems believed classically impossible. This introduces a challenge: how does one design or understand such photonic networks? One must be able to calculate observables using general methods capable of treating arbitrary inputs, dissipation, and noise. We develop complex phase-space software for simulating these photonic networks, and apply this to boson sampling experiments. Our techniques give sampling errors orders of magnitude lower than experimental correlation measurements for the same number of samples. We show that these techniques remove systematic errors in previous algorithms for estimating correlations, with large improvements in errors in some cases. In addition, we obtain a scalable channel-combination strategy for assessment of boson sampling devices.
Cell proliferation assessment in oncology.
Hofstädter, F; Knüchel, R; Rüschoff, J
1995-01-01
A review of the current knowledge on cell cycle control and the techniques used to assess proliferation of normal and neoplastic cells was the focus of a workshop in Regensburg, Germany, held under the joint auspices of the Graduiertenkolleg: Therapieforschung Onkologie and the Committee on AgNOR Quantification. An overview of the recently discovered group of cyclins and their specific kinases, and of other proliferation-associated antigens, such as Ki67, PCNA and topoisomerase II alpha, was given. The topics continued with a reappraisal of modern imaging and flow-cytometric techniques. An update of the relation of AgNORs to cellular proliferation and differentiation was the link to presentations on clinical data, problems and strategies for standardization, as well as guidelines to establish the prognostic value of marker molecules. These lectures were supported by posters. Bringing together researchers from life sciences, technically oriented workers, pathologists, and clinicians resulted in a lively and constructive discussion, which is briefly summarized in the Concluding remarks.
Shallow Reflection Method for Water-Filled Void Detection and Characterization
NASA Astrophysics Data System (ADS)
Zahari, M. N. H.; Madun, A.; Dahlan, S. H.; Joret, A.; Hazreek, Z. A. M.; Mohammad, A. H.; Izzaty, R. A.
2018-04-01
Shallow investigation is crucial in characterizing the subsurface voids commonly encountered in civil engineering, and one technique commonly used is the seismic-reflection method. An assessment of the effectiveness of such an approach is critical to determine whether the quality of the works meets the prescribed requirements. Conventional quality testing suffers limitations including limited coverage (both area and depth) and problems with resolution quality. Traditionally, quality assurance measurements use laboratory and in-situ invasive and destructive tests. However, geophysical approaches, which are typically non-invasive and non-destructive, offer a method by which improvement of detection can be measured in a cost-effective way. Of these, seismic reflection has proved useful for assessing void characteristics. This paper evaluates the application of the shallow seismic-reflection method in characterizing the properties of a water-filled void at 0.34 m depth, specifically for void detection and characterization using 2-dimensional tomography.
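The quoted target depth follows from the basic two-way travel-time relation of reflection seismics; a one-line sketch with an assumed near-surface velocity (illustrative values, not the authors' survey parameters):

```python
# Two-way travel-time to reflector depth, the basic relation behind
# shallow seismic-reflection void detection (velocity value is assumed).
v = 400.0      # m/s, hypothetical P-wave velocity in loose soil
t = 0.0017     # s, picked two-way travel time
depth = v * t / 2.0
print(f"reflector depth = {depth:.2f} m")
```

A reflector at 0.34 m depth thus corresponds to a two-way time of well under 2 ms at such velocities, which is why high temporal resolution matters for shallow targets.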
Mechanical systems readiness assessment and performance monitoring study
NASA Technical Reports Server (NTRS)
1972-01-01
The problem of mechanical devices which lack the real-time readiness assessment and performance monitoring capability required for future space missions is studied. The results of a test program to establish the feasibility of implementing structure-borne acoustics, a nondestructive test technique, are described. The program included the monitoring of operational acoustic signatures of five separate mechanical components, each possessing distinct sound characteristics. Acoustic signatures were established for normal operation of each component. Critical failure modes were then inserted into the test components, and faulted acoustic signatures obtained. Predominant features of the sound signature were related back to operational events occurring within the components both for normal and failure mode operations. All of these steps can be automated. The structure-borne acoustics technique lends itself to reducing checkout time, simplifying maintenance procedures, and reducing manual involvement in the checkout, operation, maintenance, and fault diagnosis of mechanical systems.
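A toy sketch of the signature-comparison idea, using synthetic tonal signatures rather than real component recordings: an inserted fault adds a new spectral line, which an automated comparison against the normal-operation baseline flags:

```python
import numpy as np

fs = 8000                           # Hz, sampling rate
t = np.arange(0, 1.0, 1 / fs)       # 1 s record
rng = np.random.default_rng(0)

def signature(freqs, amps):
    """Toy structure-borne acoustic signature: tones plus broadband noise."""
    s = sum(a * np.sin(2 * np.pi * f * t) for f, a in zip(freqs, amps))
    return s + 0.1 * rng.normal(size=t.size)

normal = signature([120, 600], [1.0, 0.5])               # baseline operation
faulted = signature([120, 600, 1800], [1.0, 0.5, 0.8])   # fault adds a tone

spec_n = np.abs(np.fft.rfft(normal))
spec_f = np.abs(np.fft.rfft(faulted))
# Automated check: flag frequencies whose energy departs from the baseline
deviation = np.abs(spec_f - spec_n) / spec_n.max()
fault_bins = np.fft.rfftfreq(t.size, 1 / fs)[deviation > 0.3]
print("anomalous frequencies (Hz):", fault_bins)
```

In practice the baseline would be an ensemble of normal-operation signatures and the threshold set from their observed variability.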
Zulu, Rodah M; Byrne, Nuala M; Munthali, Grace K; Chipeta, James; Handema, Ray; Musonda, Mofu; Hills, Andrew P
2011-09-21
Zambia is a sub-Saharan country with one of the highest prevalence rates of HIV, currently estimated at 14%. Poor nutritional status due to both protein-energy and micronutrient malnutrition has worsened this situation. In an attempt to address this combined problem, the government has instigated a number of strategies, including the provision of antiretroviral (ARV) treatment coupled with the promotion of good nutrition. High-energy protein supplement (HEPS) is particularly promoted; however, the impact of this food supplement on the nutritional status of people living with HIV/AIDS (PLHA) beyond weight gain has not been assessed. Techniques for the assessment of nutritional status utilising objective measures of body composition are not commonly available in Zambia. The aim of this study is therefore to assess the impact of a food supplement on nutritional status using a comprehensive anthropometric protocol including measures of skinfold thickness and circumferences, plus the criterion deuterium dilution technique to assess total body water (TBW) and derive fat-free mass (FFM) and fat mass (FM). This community-based controlled and longitudinal study aims to recruit 200 HIV-infected females commencing ARV treatment at two clinics in Lusaka, Zambia. Data will be collected at four time points: baseline, 4-month, 8-month and 12-month follow-up visits. Outcome measures to be assessed include body height and weight, body mass index (BMI), body composition, CD4, viral load and micronutrient status. This protocol describes a study that will provide a longitudinal assessment of the impact of a food supplement on the nutritional status of HIV-infected females initiating ARVs using a range of anthropometric and body composition assessment techniques. Pan African Clinical Trial Registry PACTR201108000303396.
Orme-Zavaleta, Jennifer; Munns, Wayne R
2008-01-01
Environmental and public health policy continues to evolve in response to new and complex social, economic and environmental drivers. Globalization and centralization of commerce, evolving patterns of land use (e.g., urbanization, deforestation), and technological advances in such areas as manufacturing and development of genetically modified foods have created new and complex classes of stressors and risks (e.g., climate change, emergent and opportunist disease, sprawl, genomic change). In recognition of these changes, environmental risk assessment and its use are changing from stressor-endpoint specific assessments used in command and control types of decisions to an integrated approach for application in community-based decisions. As a result, the process of risk assessment and supporting risk analyses are evolving to characterize the human-environment relationship. Integrating risk paradigms combine the process of risk estimation for humans, biota, and natural resources into one assessment to improve the information used in environmental decisions (Suter et al. 2003b). A benefit to this approach includes a broader, system-wide evaluation that considers the interacting effects of stressors on humans and the environment, as well as the interactions between these entities. To improve our understanding of the linkages within complex systems, risk assessors will need to rely on a suite of techniques for conducting rigorous analyses characterizing the exposure and effects relationships between stressors and biological receptors. Many of the analytical techniques routinely employed are narrowly focused and unable to address the complexities of an integrated assessment. In this paper, we describe an approach to integrated risk assessment, and discuss qualitative community modeling and Probabilistic Relational Modeling techniques that address these limitations and evaluate their potential for use in an integrated risk assessment of cyanobacteria.
ERIC Educational Resources Information Center
Hale, Norman; Lindelow, John
Chapter 12 in a volume on school leadership cites the work of several authorities concerning problem-solving or decision-making techniques based on the belief that group problem-solving effort is preferable to individual effort. The first technique, force-field analysis, is described as a means of dissecting complex problems into…
Jakeman, Anthony J.; Jakeman, John Davis
2018-03-14
Uncertainty pervades the representation of systems in the water–environment–agriculture cross-sector. Successful methods to address uncertainties have largely focused on standard mathematical formulations of biophysical processes in a single sector, such as partial or ordinary differential equations. More attention to integrated models of such systems is warranted. Model components representing the different sectors of an integrated model can have less standard, and different, formulations to one another, as well as different levels of epistemic knowledge and data informativeness. Thus, uncertainty is not only pervasive but also crosses boundaries and propagates between system components. Uncertainty assessment (UA) cries out for more eclectic treatment in these circumstances, some of it being more qualitative and empirical. Here in this paper, we discuss the various sources of uncertainty in such a cross-sectoral setting and ways to assess and manage them. We have outlined a fast-growing set of methodologies, particularly in the computational mathematics literature on uncertainty quantification (UQ), that seem highly pertinent for uncertainty assessment. There appears to be considerable scope for advancing UA by integrating relevant UQ techniques into cross-sectoral problem applications. Of course this will entail considerable collaboration between domain specialists who often take first ownership of the problem and computational methods experts.
Engineering Risk Assessment of Space Thruster Challenge Problem
NASA Technical Reports Server (NTRS)
Mathias, Donovan L.; Mattenberger, Christopher J.; Go, Susie
2014-01-01
The Engineering Risk Assessment (ERA) team at NASA Ames Research Center utilizes dynamic models with linked physics-of-failure analyses to produce quantitative risk assessments of space exploration missions. This paper applies the ERA approach to the baseline and extended versions of the PSAM Space Thruster Challenge Problem, which investigates mission risk for a deep space ion propulsion system with time-varying thruster requirements and operations schedules. The dynamic mission is modeled using a combination of discrete and continuous-time reliability elements within the commercially available GoldSim software. Loss-of-mission (LOM) probability results are generated via Monte Carlo sampling performed by the integrated model. Model convergence studies are presented to illustrate the sensitivity of integrated LOM results to the number of Monte Carlo trials. A deterministic risk model was also built for the three baseline and extended missions using the Ames Reliability Tool (ART), and results are compared to the simulation results to evaluate the relative importance of mission dynamics. The ART model did a reasonable job of matching the simulation models for the baseline case, while a hybrid approach using offline dynamic models was required for the extended missions. This study highlighted that state-of-the-art techniques can adequately adapt to a range of dynamic problems.
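A heavily simplified sketch of Monte Carlo loss-of-mission estimation and its convergence with trial count, using made-up exponential thruster lifetimes rather than the PSAM challenge-problem models:

```python
import numpy as np

rng = np.random.default_rng(42)

def mission_fails(rng, thruster_mtbf=8000.0,
                  burn_hours=(2000.0, 1500.0, 2500.0)):
    """One Monte Carlo trial of a toy ion-propulsion mission: the mission
    is lost if cumulative burn time exceeds the sampled thruster lifetime.
    (Illustrative numbers, not the challenge-problem data.)"""
    lifetime = rng.exponential(thruster_mtbf)
    return sum(burn_hours) > lifetime

def lom_probability(n_trials):
    fails = sum(mission_fails(rng) for _ in range(n_trials))
    return fails / n_trials

# Convergence study: the LOM estimate stabilises as trials increase
for n in (100, 1000, 10000):
    print(n, lom_probability(n))
```

As in the paper's convergence studies, the spread of repeated estimates shrinks roughly with the square root of the number of trials, which sets the sample size needed for a given confidence in the LOM figure.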
3D gravity inversion and uncertainty assessment of basement relief via Particle Swarm Optimization
NASA Astrophysics Data System (ADS)
Pallero, J. L. G.; Fernández-Martínez, J. L.; Bonvalot, S.; Fudym, O.
2017-04-01
Nonlinear gravity inversion in sedimentary basins is a classical problem in applied geophysics. Although a 2D approximation is widely used, 3D models have been also proposed to better take into account the basin geometry. A common nonlinear approach to this 3D problem consists in modeling the basin as a set of right rectangular prisms with prescribed density contrast, whose depths are the unknowns. Then, the problem is iteratively solved via local optimization techniques from an initial model computed using some simplifications or being estimated using prior geophysical models. Nevertheless, this kind of approach is highly dependent on the prior information that is used, and lacks a correct solution appraisal (nonlinear uncertainty analysis). In this paper, we use the family of global Particle Swarm Optimization (PSO) optimizers for the 3D gravity inversion and model appraisal of the solution that is adopted for basement relief estimation in sedimentary basins. Synthetic and real cases are illustrated, showing that robust results are obtained. Therefore, PSO seems to be a very good alternative for 3D gravity inversion and uncertainty assessment of basement relief when used in a sampling-while-optimizing approach. That way important geological questions can be answered probabilistically in order to perform risk assessment in the decisions that are made.
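A bare-bones PSO in the spirit of the approach described, minimizing a placeholder quadratic misfit instead of a real gravity forward model; the final swarm can be retained for the kind of sampling-while-optimizing appraisal the paper suggests:

```python
import numpy as np

rng = np.random.default_rng(1)

def misfit(depths):
    """Toy misfit between observed and modelled gravity: a quadratic
    around an assumed 'true' basement-depth vector (hypothetical)."""
    true = np.array([1.2, 2.0, 1.5, 0.8])   # prism depths in km
    return np.sum((depths - true) ** 2)

def pso(f, dim, n_particles=30, iters=200, w=0.7, c1=1.5, c2=1.5):
    x = rng.uniform(0.0, 3.0, (n_particles, dim))   # initial depths
    v = np.zeros_like(x)
    pbest, pbest_f = x.copy(), np.array([f(p) for p in x])
    gbest = pbest[pbest_f.argmin()].copy()
    for _ in range(iters):
        r1, r2 = rng.random((2, n_particles, dim))
        v = w * v + c1 * r1 * (pbest - x) + c2 * r2 * (gbest - x)
        x = x + v
        fx = np.array([f(p) for p in x])
        better = fx < pbest_f
        pbest[better], pbest_f[better] = x[better], fx[better]
        gbest = pbest[pbest_f.argmin()].copy()
    # The final swarm doubles as samples for uncertainty appraisal
    return gbest, x

best, swarm = pso(misfit, dim=4)
print("recovered depths:", np.round(best, 2))
```

In a real inversion the quadratic would be replaced by the prism-model forward gravity calculation, and the retained swarm members with low misfit would map the equivalent-model region of depth space.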
Houston, Eric; Tatum, Alexander K.; Guy, Arryn; Mikrut, Cassandra; Yoder, Wren
2016-01-01
Objective: Poor treatment adherence is a major problem among individuals with chronic illness. Research indicates that adherence is worsened when accompanied by depressive symptoms. In this preliminary study, we aimed to describe how a patient-centered approach could be employed to aid patients with depressive symptoms in following their treatment regimens. Methods: The sample consisted of 14 patients undergoing antiretroviral therapy (ART) for HIV who reported clinically-significant depressive symptoms. Participant ratings of 23 treatment-related statements were examined using two assessment and analytic techniques. Interviews were conducted with participants to determine their views of information based on the technique. Results: Results indicate that while participants with optimal adherence focused on views of treatment associated with side effects to a greater extent than participants with poor adherence, they tended to relate these side effects to sources of intrinsic motivation. Conclusion: The study provides examples of how practitioners could employ the assessment techniques outlined to better understand how patients think about treatment and aid them in effectively framing their health-related goals. PMID:26755463
NASA Astrophysics Data System (ADS)
Huang, Min-Wei; Lo, Pei-Yu; Cheng, Kuo-Sheng
2010-12-01
Military personnel on the move are exposed to solar radiation, and sunburn is a major problem which can cause lost workdays and lead to disciplinary action. This study was designed to identify correlation parameters for evaluating in vivo doses and epidermal changes following sunburn inflammation. Several noninvasive bioengineering techniques have made objective evaluations possible. Two areas, 20 mm in diameter, on the volar forearms of healthy volunteers were irradiated with UVB at 100 mJ/cm² and 200 mJ/cm², respectively. The skin changes were recorded by several monitoring techniques before and 24 hours after UV exposure. Our results showed that the chromameter a* value provides more reliable information and can be adopted with a mathematical model in predicting the minimal erythema dose (MED), which was lower than visual assessment by 10 mJ/cm² (Pearson correlation). A more objective measure for evaluation of MED was established for the prediction of photosensitive subjects and the prevention of sunburn risks.
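The dose-response idea behind MED prediction can be sketched as a linear fit of a redness measure against UVB dose; all numbers below are hypothetical, including the erythema threshold:

```python
import numpy as np

# Hypothetical dose-response data: chromameter redness change vs UVB dose
doses = np.array([50.0, 100.0, 150.0, 200.0])   # mJ/cm^2
redness = np.array([0.4, 1.9, 3.5, 5.0])        # Δa*-style values (made up)

slope, intercept = np.polyfit(doses, redness, 1)
threshold = 1.5                                  # assumed erythema threshold
med = (threshold - intercept) / slope            # dose where fit crosses it
print(f"predicted MED ≈ {med:.0f} mJ/cm^2")
```

The instrumental fit can resolve the threshold crossing between tested doses, which is how an objective measure can report a lower MED than a visual reading limited to discrete dose steps.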
Stapleton, Tadhg; Connelly, Deirdre
2010-01-01
Practice in the area of predriving assessment for people with stroke varies, and research findings are not always easily transferred into the clinical setting, particularly when such assessment is not conducted within a dedicated driver assessment programme. This article explores the clinical predriving assessment practices and recommendations of a group of Irish occupational therapists for people with stroke. A consensus meeting of occupational therapists was facilitated using a nominal group technique (NGT) to identify specific components of cognition, perception, and executive function that may influence fitness to return to driving and should be assessed prior to referral for on-road evaluation. Standardised assessments for use in predriving assessment were recommended. Thirteen occupational therapists took part, and consensus was reached on the need to assess speed of processing; perceptual components of spatial awareness, depth perception, and visual inattention; and executive components of planning, problem solving, judgment, and self-awareness. Consensus emerged for the use of the following standardised tests: Behavioural Assessment of Dysexecutive Syndrome (BADS), Test of Everyday Attention (TEA), Brain Injury Visual Assessment Battery for Adults (biVABA), Rivermead Perceptual Assessment Battery (RPAB), and Motor Free Visual Perceptual Test (MVPT). Tests were recommended that gave an indication of the patient's underlying component skills in the areas of cognition, perception, and executive function considered important for driving. Further research is needed in this area to develop clinical practice guidelines for occupational therapists for the assessment of fitness to return to driving after stroke.
Automated parameterization of intermolecular pair potentials using global optimization techniques
NASA Astrophysics Data System (ADS)
Krämer, Andreas; Hülsmann, Marco; Köddermann, Thorsten; Reith, Dirk
2014-12-01
In this work, different global optimization techniques are assessed for the automated development of molecular force fields, as used in molecular dynamics and Monte Carlo simulations. The quest of finding suitable force field parameters is treated as a mathematical minimization problem. Intricate problem characteristics such as extremely costly and even abortive simulations, noisy simulation results, and especially multiple local minima naturally lead to the use of sophisticated global optimization algorithms. Five diverse algorithms (pure random search, recursive random search, CMA-ES, differential evolution, and taboo search) are compared to our own tailor-made solution named CoSMoS. CoSMoS is an automated workflow. It models the parameters' influence on the simulation observables to detect a globally optimal set of parameters. It is shown how and why this approach is superior to other algorithms. Applied to suitable test functions and simulations for phosgene, CoSMoS effectively reduces the number of required simulations and real time for the optimization task.
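Treating parameterization as a mathematical minimization problem can be illustrated by pitting a pure-random-search baseline against differential evolution (two of the five algorithm families the paper compares) on a standard test function rather than a costly force-field objective:

```python
import numpy as np
from scipy.optimize import differential_evolution, rosen

rng = np.random.default_rng(0)
bounds = [(-2.0, 2.0)] * 3   # stand-in for force-field parameter ranges

# Pure random search baseline: sample uniformly, keep the best
samples = rng.uniform(-2.0, 2.0, size=(2000, 3))
rs_best = min(rosen(s) for s in samples)

# Differential evolution on the same budget class of problem
de = differential_evolution(rosen, bounds, seed=0, maxiter=200, tol=1e-8)

print(f"random search best misfit: {rs_best:.3f}")
print(f"differential evolution:    {de.fun:.6f}")
```

With real molecular simulations each function evaluation is expensive and noisy, which is exactly why the paper's surrogate-modelling workflow (CoSMoS) aims to cut the number of evaluations the optimizer needs.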
Intracranial Pressure Monitoring: Invasive versus Non-Invasive Methods—A Review
Raboel, P. H.; Bartek, J.; Andresen, M.; Bellander, B. M.; Romner, B.
2012-01-01
Monitoring of intracranial pressure (ICP) has been used for decades in the fields of neurosurgery and neurology. There are multiple techniques: invasive as well as noninvasive. This paper aims to provide an overview of the advantages and disadvantages of the most common and well-known methods as well as assess whether noninvasive techniques (transcranial Doppler, tympanic membrane displacement, optic nerve sheath diameter, CT scan/MRI and fundoscopy) can be used as reliable alternatives to the invasive techniques (ventriculostomy and microtransducers). Ventriculostomy is considered the gold standard in terms of accurate measurement of pressure, although microtransducers generally are just as accurate. Both invasive techniques are associated with a minor risk of complications such as hemorrhage and infection. Furthermore, zero drift is a problem with selected microtransducers. The non-invasive techniques are without the invasive methods' risk of complication, but fail to measure ICP accurately enough to be used as routine alternatives to invasive measurement. We conclude that invasive measurement is currently the only option for accurate measurement of ICP. PMID:22720148
Hosseini, Seyed Kianoosh; Ghalamkari, Marziyeh; Yousefshahi, Fardin; Mireskandari, Seyed Mohammad; Rezaei Hamami, Mohsen
2013-10-28
Cardiopulmonary-cerebral resuscitation (CPCR) training is essential for all hospital workers, especially junior residents who might become the manager of the resuscitation team. In our center, the traditional CPCR knowledge training curriculum for junior residents up to 5 years ago was lecture-based and had some shortcomings. This study aimed to evaluate the effect of a problem-based method on residents' CPCR knowledge and skills as well as their evaluation of their CPCR trainers. This study, conducted at Tehran University of Medical Sciences, included 290 first-year residents in 2009-2010, who were trained via a problem-based method (the problem-based group), and 160 first-year residents in 2003-2004, who were trained via a lecture-based method (the lecture-based group). Other educational techniques and facilities were similar. The participants self-evaluated their own CPCR knowledge and skills before and after the workshop and also assessed their trainers' efficacy after the workshop by completing special questionnaires. The problem-based group had higher self-assessment scores of CPCR knowledge and skills after the workshop: the mean scores in the problem-based vs. lecture-based groups were 32.36 ± 19.23 vs. 22.33 ± 20.35 for knowledge (p = 0.003) and 10.13 ± 7.17 vs. 8.19 ± 8.45 for skills (p = 0.043). The residents' evaluation of their trainers was similar between the two study groups (p = 0.193), with mean scores of 15.90 ± 2.59 and 15.46 ± 2.90 in the problem-based and lecture-based groups, respectively. The problem-based method increased our residents' self-evaluation scores of their own CPCR knowledge and skills.
Dunlosky, John; Rawson, Katherine A; Marsh, Elizabeth J; Nathan, Mitchell J; Willingham, Daniel T
2013-01-01
Many students are being left behind by an educational system that some people believe is in crisis. Improving educational outcomes will require efforts on many fronts, but a central premise of this monograph is that one part of a solution involves helping students to better regulate their learning through the use of effective learning techniques. Fortunately, cognitive and educational psychologists have been developing and evaluating easy-to-use learning techniques that could help students achieve their learning goals. In this monograph, we discuss 10 learning techniques in detail and offer recommendations about their relative utility. We selected techniques that were expected to be relatively easy to use and hence could be adopted by many students. Also, some techniques (e.g., highlighting and rereading) were selected because students report relying heavily on them, which makes it especially important to examine how well they work. The techniques include elaborative interrogation, self-explanation, summarization, highlighting (or underlining), the keyword mnemonic, imagery use for text learning, rereading, practice testing, distributed practice, and interleaved practice. To offer recommendations about the relative utility of these techniques, we evaluated whether their benefits generalize across four categories of variables: learning conditions, student characteristics, materials, and criterion tasks. Learning conditions include aspects of the learning environment in which the technique is implemented, such as whether a student studies alone or with a group. Student characteristics include variables such as age, ability, and level of prior knowledge. Materials vary from simple concepts to mathematical problems to complicated science texts. Criterion tasks include different outcome measures that are relevant to student achievement, such as those tapping memory, problem solving, and comprehension. 
We attempted to provide thorough reviews for each technique, so this monograph is rather lengthy. However, we also wrote the monograph in a modular fashion, so it is easy to use. In particular, each review is divided into the following sections: (1) general description of the technique and why it should work; (2) how general are the effects of this technique?, covering (2a) learning conditions, (2b) student characteristics, (2c) materials, and (2d) criterion tasks; (3) effects in representative educational contexts; (4) issues for implementation; and (5) overall assessment. The review for each technique can be read independently of the others, and particular variables of interest can be easily compared across techniques. To foreshadow our final recommendations, the techniques vary widely with respect to their generalizability and promise for improving student learning. Practice testing and distributed practice received high utility assessments because they benefit learners of different ages and abilities and have been shown to boost students' performance across many criterion tasks and even in educational contexts. Elaborative interrogation, self-explanation, and interleaved practice received moderate utility assessments. The benefits of these techniques do generalize across some variables, yet despite their promise, they fell short of a high utility assessment because the evidence for their efficacy is limited. For instance, elaborative interrogation and self-explanation have not been adequately evaluated in educational contexts, and the benefits of interleaving have just begun to be systematically explored, so the ultimate effectiveness of these techniques is currently unknown. Nevertheless, the techniques that received moderate-utility ratings show enough promise for us to recommend their use in appropriate situations, which we describe in detail within the review of each technique. 
Five techniques received a low utility assessment: summarization, highlighting, the keyword mnemonic, imagery use for text learning, and rereading. These techniques were rated as low utility for numerous reasons. Summarization and imagery use for text learning have been shown to help some students on some criterion tasks, yet the conditions under which these techniques produce benefits are limited, and much research is still needed to fully explore their overall effectiveness. The keyword mnemonic is difficult to implement in some contexts, and it appears to benefit students for a limited number of materials and for short retention intervals. Most students report rereading and highlighting, yet these techniques do not consistently boost students' performance, so other techniques should be used in their place (e.g., practice testing instead of rereading). Our hope is that this monograph will foster improvements in student learning, not only by showcasing which learning techniques are likely to have the most generalizable effects but also by encouraging researchers to continue investigating the most promising techniques. Accordingly, in our closing remarks, we discuss some issues for how these techniques could be implemented by teachers and students, and we highlight directions for future research. © The Author(s) 2013.
A retrospective photometric study of 82 published reports of mastopexy and breast reduction.
Swanson, Eric
2011-12-01
Numerous publications claim to improve breast projection and upper pole fullness after mastopexy or breast reduction. Fascial sutures and "autoaugmentation" with local flaps are advocated. However, there is no objective evidence that these efforts are effective. The author has proposed a measuring system to quantitate results. Not only is this system useful for assessing one's own results, but it may also be used to assess and compare results in published studies. Eighty-two international publications on mastopexies and breast reductions were analyzed. The studies were grouped by technique: inverted-T (superior/medial, central, and inferior pedicles), vertical, periareolar, inframammary, lateral, and "other." Measurements were made using the definitions and terminology reported separately and included breast projection, upper pole projection, lower pole level, nipple level, breast convexity, breast parenchymal ratio, and lower pole ratio. Areola shape was assessed. Breast projection and upper pole projection were not increased significantly by any of the mastopexy/reduction procedures or by the use of fascial sutures or autoaugmentation techniques. Nipple overelevation was common (41.9 percent). The incidence of the teardrop areola deformity (53.8 percent) was significantly higher (p < 0.001) in patients treated with the open technique of nipple placement. There was no significant difference in results when compared by follow-up times, resection weights, year of publication, or geographic region. Existing mastopexy/reduction techniques do not significantly increase breast projection or upper pole projection. Fascial sutures and autoaugmentation techniques are ineffective. Nipple overelevation and the teardrop areola deformity are common problems and should be avoided.
Concept mapping improves academic performance in problem solving questions in biochemistry subject.
Baig, Mukhtiar; Tariq, Saba; Rehman, Rehana; Ali, Sobia; Gazzaz, Zohair J
2016-01-01
To assess the effectiveness of concept mapping (CM) on the academic performance of medical students in problem-solving as well as in declarative knowledge questions, and their perception regarding CM. The present analytical and questionnaire-based study was carried out at Bahria University Medical and Dental College (BUMDC), Karachi, Pakistan. Students were assessed with problem-solving questions (A-type MCQs) and declarative knowledge questions (short essay questions); 50% of the questions were from topics learned by CM. Students also completed a 10-item, 3-point Likert scale questionnaire about their perception of the effectiveness of the CM approach, and two open-ended questions were asked. There was a significant difference in the marks obtained on problem-solving questions covering topics learned by CM as compared with topics taught by traditional lectures (p<0.001), while no significant difference was observed in marks on declarative knowledge questions (p=0.704). Analysis of students' perceptions showed that the majority perceived CM as a helpful and enjoyable technique. In the open-ended questions, the majority of students commented positively on the effectiveness of CM. Our results indicate that CM improves academic performance in problem solving but not in declarative knowledge questions. Students' perception of the effectiveness of CM was overwhelmingly positive.
NASA Astrophysics Data System (ADS)
Terrell, Rosalind Stephanie
2001-12-01
Because paper-and-pencil testing provides limited knowledge about what students know about chemical phenomena, we have developed video-based demonstrations to broaden measurement of student learning. For example, students might be shown a video demonstrating equilibrium shifts. Two methods for viewing equilibrium shifts are changing the concentration of the reactants and changing the temperature of the system. The students are required to combine the data collected from the video and their knowledge of chemistry to determine which way the equilibrium shifts. Video-based demonstrations are important techniques for measuring student learning because they require students to apply conceptual knowledge learned in class to a specific chemical problem. This study explores how video-based demonstration assessment tasks affect problem-solving processes, test anxiety, chemistry anxiety and achievement in general chemistry students. Several instruments were used to determine students' knowledge about chemistry, students' test and chemistry anxiety before and after treatment. Think-aloud interviews were conducted to determine students' problem-solving processes after treatment. The treatment group was compared to a control group and a group watching video demonstrations. After treatment students' anxiety increased and achievement decreased. There were also no significant differences found in students' problem-solving processes following treatment. These negative findings may be attributed to several factors that will be explored in this study.
Performance of Grey Wolf Optimizer on large scale problems
NASA Astrophysics Data System (ADS)
Gupta, Shubham; Deep, Kusum
2017-01-01
Numerous nature-inspired optimization techniques have been proposed in the literature for solving nonlinear continuous optimization problems, including real-life problems to which conventional techniques cannot be applied. The Grey Wolf Optimizer is one such technique, and it has been gaining popularity over the last two years. The objective of this paper is to investigate the performance of the Grey Wolf Optimization Algorithm on large scale optimization problems. The algorithm is implemented on 5 common scalable problems appearing in the literature, namely the Sphere, Rosenbrock, Rastrigin, Ackley and Griewank functions. The dimensions of these problems are varied from 50 to 1000. The results indicate that the Grey Wolf Optimizer is a powerful nature-inspired optimization algorithm for large scale problems, except on Rosenbrock, which is a unimodal function.
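The five scalable test functions named in the abstract have standard closed forms in the global optimization literature; the following is a sketch of common textbook definitions (shift and constant conventions vary slightly between papers):

```python
import math

def sphere(x):
    # Global minimum 0 at the origin.
    return sum(v * v for v in x)

def rosenbrock(x):
    # Global minimum 0 at (1, ..., 1); narrow curved valley.
    return sum(100 * (x[i + 1] - x[i] ** 2) ** 2 + (1 - x[i]) ** 2
               for i in range(len(x) - 1))

def rastrigin(x):
    # Highly multimodal; global minimum 0 at the origin.
    return 10 * len(x) + sum(v * v - 10 * math.cos(2 * math.pi * v) for v in x)

def ackley(x):
    # Nearly flat outer region with a deep central hole; minimum 0 at the origin.
    n = len(x)
    s1 = sum(v * v for v in x) / n
    s2 = sum(math.cos(2 * math.pi * v) for v in x) / n
    return -20 * math.exp(-0.2 * math.sqrt(s1)) - math.exp(s2) + 20 + math.e

def griewank(x):
    # Many regularly distributed local minima; global minimum 0 at the origin.
    s = sum(v * v for v in x) / 4000
    p = math.prod(math.cos(v / math.sqrt(i + 1)) for i, v in enumerate(x))
    return s - p + 1

# Each function scales to any dimension, e.g. the 1000 used in the paper:
x0 = [0.0] * 1000
```

Because each definition is a sum (or product) over coordinates, evaluating them in dimension 1000 is as straightforward as in dimension 50, which is what makes them convenient for large-scale benchmarking.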
Smart Training, Smart Learning: The Role of Cooperative Learning in Training for Youth Services.
ERIC Educational Resources Information Center
Doll, Carol A.
1997-01-01
Examines cooperative learning in youth services and adult education. Discusses characteristics of cooperative learning techniques; specific cooperative learning techniques (brainstorming, mini-lecture, roundtable technique, send-a-problem problem solving, talking chips technique, and three-step interview); and the role of the trainer. (AEF)
Gumz, Antje; Treese, Barbara; Marx, Christopher; Strauss, Bernhard; Wendt, Hanna
2015-01-01
Language is one of the most important “tools” of psychotherapists. The working mechanisms of verbal therapeutic techniques, however, are still marginally understood. In part, this is due to the lack of a generally acknowledged typology as well as a gold standard for the assessment of verbal techniques, which limits the possibility of conducting studies focusing on this topic. The present study reviews measures used in clinical research which assess directly observable dimensions of verbal interventions in a reliable manner. All measures were evaluated with respect to their theoretical foundation, research goals, assessment modes, and various psychometric properties. A systematic search in databases (PubMed, PsycInfo, PsycArticles, PSYNDEX, Web of Science, Embase) followed by an additional “snowballing” search covering the years 1940–2013 yielded n = 179 publications eligible for review. Within these publications, 34 measures were identified showing great heterogeneity regarding the aspects under study. Only two measures reached the highest psychometric standards and can be recommended for clinical use without any reservation. Central problems include deficiencies in the systematization of techniques as well as their partly ambiguous and inconsistent definitions. To promote this field of research, it will be important to achieve a consensus concerning the terminology, conceptions and measures of verbal interventions. PMID:26617543
Santiago-Moreno, Julian; Esteso, Milagros Cristina; Villaverde-Morcillo, Silvia; Toledano-Díaz, Adolfo; Castaño, Cristina; Velázquez, Rosario; López-Sebastián, Antonio; Goya, Agustín López; Martínez, Javier Gimeno
2016-01-01
Postcopulatory sexual selection through sperm competition may be an important evolutionary force affecting many reproductive traits, including sperm morphometrics. Environmental factors such as pollutants, pesticides, and climate change may affect different sperm traits, and thus reproduction, in sensitive bird species. Many sperm-handling processes used in assisted reproductive techniques may also affect the size of sperm cells. The accurately measured dimensions of sperm cell structures (especially the head) can thus be used as indicators of environmental influences, in improving our understanding of reproductive and evolutionary strategies, and for optimizing assisted reproductive techniques (e.g., sperm cryopreservation) for use with birds. Computer-assisted sperm morphometry analysis (CASA-Morph) provides an accurate and reliable method for assessing sperm morphometry, reducing the problem of subjectivity associated with human visual assessment. Computerized systems have been standardized for use with semen from different mammalian species. Avian spermatozoa, however, are filiform, limiting their analysis with such systems, which were developed to examine the approximately spherical heads of mammalian sperm cells. To help overcome this, the standardization of staining techniques to be used in computer-assessed light microscopical methods is a priority. The present review discusses these points and describes the sperm morphometric characteristics of several wild and domestic bird species. PMID:27678467
NASA Astrophysics Data System (ADS)
Casas, Leslie; Treuillet, Sylvie; Valencia, Braulio; Llanos, Alejandro; Castañeda, Benjamín.
2015-01-01
Chronic wounds are a major problem worldwide which mainly affects the geriatric population and patients with limited mobility. In tropical countries, Cutaneous Leishmaniasis (CL) is also a cause of chronic wounds, being endemic in 75% of Peru. The monitoring of these wounds therefore represents a big challenge due to the remote location of the patients. This paper aims to develop a low-cost, user-friendly technique to obtain a 3D reconstruction of chronic wounds oriented to clinical monitoring and assessment. The video is taken using a commercial hand-held video camera without the need for a rig. The algorithm has been specially designed for skin wounds, whose particular texture characteristics and undefined edges defeat the techniques used in regular SFM applications. In addition, the technique has been developed using open source libraries. The estimated 3D point cloud allows the computation of metrics such as volume, depth, and superficial area, which have recently been used by CL specialists with good results in clinical assessment. Initial results on cork phantoms and CL wounds show an average distance error of less than 1 mm when compared against models obtained with an industrial 3D laser scanner.
Kumar, Sasi; Adiga, Kasturi Ramesh; George, Anice
2014-01-01
Old age is a period when people need physical, emotional, and psychological support. Depression is the most prevalent mental health problem among older adults; it contributes to increased medical morbidity and mortality, reduces quality of life, and elevates health care costs. Early diagnosis and effective management are therefore required to improve the quality of life of older adults suffering from depression. Interventions such as Mindfulness-Based Stress Reduction (MBSR) are powerful relaxation techniques that can relieve depression and negative emotions by increasing mindfulness. This study was undertaken to assess the effectiveness of MBSR on depression among elderly people residing in residential homes in Bangalore. A quasi-experimental pre-test post-test control group research design was used, with two groups, experimental and control, each comprising 30 participants selected from different residential homes by a non-probability convenience sampling technique. Pre-test depression and mindfulness were assessed before the first day of intervention. Experimental group participants received the MBSR intervention. Post-test depression and mindfulness were assessed at the end of the intervention programme for both groups. The study revealed a significant reduction in depression (p < 0.001) and an increase in mindfulness (p < 0.001) among the elderly in the experimental group who were subjected to the MBSR technique.
Cancer and the cardiopulmonary system
DOE Office of Scientific and Technical Information (OSTI.GOV)
Khalil, A.M.; Ewer, M.S.
1984-01-01
This volume addresses the problems induced in cardiopulmonary function by certain advanced diagnostic techniques and treatment modalities for cancer, reviews the cardiopulmonary changes resulting from cancer itself, and assesses the limitations to surgical and nonsurgical management of diverse neoplastic conditions. Information on the effects of various tumors on cardiopulmonary function and on the spectrum of adverse cardiopulmonary reactions caused by chemotherapy and radiation therapy is provided, with specific practical guidance on diagnosis and treatment.
Conservation in the energy industry
NASA Technical Reports Server (NTRS)
1975-01-01
The basic energy supply and utilization problems faced by the United States are described. Actions which might alleviate the domestic shortfall of petroleum and natural gas are described and analyzed, and their overall impacts are assessed. Specific actions include coal gasification, in situ shale oil production, improved oil and gas recovery, importation of liquefied natural gas, and deregulation of natural gas prices. These actions are weighed against each other as alternative techniques for alleviating or overcoming existing shortfalls.
Retort braze bonding of borsic/aluminum composite sheet to titanium
NASA Technical Reports Server (NTRS)
Webb, B. A.; Dolowy, J. F., Jr.
1975-01-01
Braze bonding studies between Borsic/aluminum composite and titanium sheet were conducted to establish acceptable brazing techniques and to assess potential joint efficiencies. Excellent braze joints were produced which exhibited joint strengths exceeding 117 MPa (17,000 psi) and which retained up to 2/3 of this strength at 589 K (600 F). Noticeable composite strength degradation resulting from the required high temperature braze cycle was found to be a problem.
Robert B. Thomas
1990-01-01
Using a previously treated basin as a control in subsequent paired watershed studies requires the control to be stable. Basin stability can be assessed in many ways, some of which are investigated for the South Fork of Caspar Creek in northern California. This basin is recovering from logging and road building in the early 1970s. Three storm-based discharge...
The Battlefield Commander’s Assistant Project: Research in Terrain Reasoning
1987-05-22
... order dissemination. In order to restrict the survey problem to a manageable level, we made the a priori decision to focus on activities related to ... models. Manages tools for: commander, tactical explanations, situation assessment, plans, plan and plan options, query/edit capabilities ... from our work on the Air Land Battle Management Study (Stachnick 87), which was tasked to compare AI planning techniques with the requirements of ...
Automated Decision Making and Problem Solving. Volume 1: Executive Summary
NASA Technical Reports Server (NTRS)
Heer, E.
1981-01-01
The May 1980 conference is summarized. Related topics in artificial intelligence, operations research, and control theory were explored. Existing techniques were assessed, trends of development were determined, and the potential for application in NASA automation technology programs was identified. Formal presentations were made by experts in the three disciplines, and a workshop was held in which current technology in automation and possible NASA interfaces with the academic community to advance this technology were discussed.
NASA Technical Reports Server (NTRS)
Yelle, Roger V.; Wallace, Lloyd
1989-01-01
A versatile and efficient technique for the solution of the resonance line scattering problem with frequency redistribution in planetary atmospheres is introduced. Similar to the doubling approach commonly used in monochromatic scattering problems, the technique has been extended to include the frequency dependence of the radiation field. Methods for solving problems with external or internal sources and coupled spectral lines are presented, along with comparison of some sample calculations with results from Monte Carlo and Feautrier techniques. The doubling technique has also been applied to the solution of resonance line scattering problems where the R-parallel redistribution function is appropriate, both neglecting and including polarization as developed by Yelle and Wallace (1989). With the constraint that the atmosphere is illuminated from the zenith, the only difficulty of consequence is that of performing precise frequency integrations over the line profiles. With that problem solved, it is no longer necessary to use the Monte Carlo method to solve this class of problem.
Solving real decay and conservation problems of building materials by ultrasounds technique
NASA Astrophysics Data System (ADS)
Alvarez de Buergo, Monica; Fort, Rafael; Gomez-Heras, Miguel; Vazquez-Calvo, Carmen
2010-05-01
In this study a variety of case studies and different building materials in which ultrasound velocity played a significant role are presented, whether to characterize building materials, to measure deterioration, to assess conservation techniques, or for preventive purposes. Regarding material properties, ultrasound velocity provides useful indices such as the quality index (useful when selecting replacement materials, materials for new constructions, or materials for sculptures); the alteration index (closely related to pores, voids, and fissures); mechanical strength (assessing its reduction when materials are affected by several decay processes, fire being one of them); and anisotropy indices, which strongly condition the decay of elements and materials in buildings and sculptures, and which themselves vary as decay progresses. The technique is also a tool for detecting and locating elements inside structures, such as metallic ones, and for detecting and locating discontinuities inside elements, both for consolidation purposes and in cases of structural movement, which is quite common nowadays. Using specific software, ultrasound results can be plotted as iso-areas, which allows areas or zones of structures with the highest short-term risk of detachment to be delimited in order to plan the most adequate interventions. Not new, either, is the use of ultrasonics to assess consolidation products and to determine the degree of decay of materials subjected to artificial ageing. Much more innovative is the fact that ultrasonic measurement can also help to distinguish different building periods within the same building, and even to determine an element's lifetime. 
The results obtained by this non-destructive, portable technique that will be presented in this session correspond both to real case studies (results that helped to solve a real problem), some involving emblematic monuments of Spain (the Royal Palace of Madrid and other monuments belonging to the Spanish National Heritage or Trust, as well as archaeological structures and sculptures), and to laboratory research aimed at understanding processes and identifying the best means of preservation. In some of the cases, other techniques were used as complementary methods, such as sclerometry, magnetometry, and IR thermography. Acknowledgements: to the MATERNAS (0505/MAT/0094) and GEOMATERIALES (2009-1629) research programmes, funded by the Regional Government of Madrid, and to the CONSOLIDER-INGENIO programme (CSD2007-0058), funded by the Spanish Ministry of Education and Science.
NASA Astrophysics Data System (ADS)
Manzanares-Filho, N.; Albuquerque, R. B. F.; Sousa, B. S.; Santos, L. G. C.
2018-06-01
This article presents a comparative study of some versions of the controlled random search algorithm (CRSA) in global optimization problems. The basic CRSA, originally proposed by Price in 1977 and improved by Ali et al. in 1997, is taken as a starting point. Then, some new modifications are proposed to improve the efficiency and reliability of this global optimization technique. The performance of the algorithms is assessed using traditional benchmark test problems commonly invoked in the literature. This comparative study points out the key features of the modified algorithm. Finally, a comparison is also made in a practical engineering application, namely the inverse aerofoil shape design.
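Price's basic CRSA maintains a population of sample points and repeatedly tries to replace the worst point with a reflection through the centroid of a randomly chosen simplex. A minimal sketch under those assumptions (illustrative only; it includes neither the 1997 improvements of Ali et al. nor the modifications proposed in the article):

```python
import random

def crs(f, bounds, pop_size=50, max_iter=5000):
    """Minimal Price-style controlled random search minimizer (illustrative only)."""
    n = len(bounds)
    pts = [[random.uniform(lo, hi) for lo, hi in bounds] for _ in range(pop_size)]
    vals = [f(p) for p in pts]
    for _ in range(max_iter):
        worst = max(range(pop_size), key=vals.__getitem__)
        # Choose n+1 distinct points; reflect the last through the centroid
        # of the first n, as in Price's original scheme.
        sample = random.sample(range(pop_size), n + 1)
        centroid = [sum(pts[i][j] for i in sample[:n]) / n for j in range(n)]
        trial = [2 * centroid[j] - pts[sample[n]][j] for j in range(n)]
        if all(lo <= t <= hi for t, (lo, hi) in zip(trial, bounds)):
            ft = f(trial)
            if ft < vals[worst]:  # only accept trials that improve on the worst point
                pts[worst], vals[worst] = trial, ft
    best = min(range(pop_size), key=vals.__getitem__)
    return pts[best], vals[best]

# Toy objective: a 2-D sphere function.
x_best, f_best = crs(lambda x: sum(v * v for v in x), [(-2, 2)] * 2)
```

The population gradually contracts toward the best basin, which is the behaviour the modified variants in the article aim to make faster and more reliable.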
Increasing mathematical problem-solving performance through relaxation training
NASA Astrophysics Data System (ADS)
Sharp, Conni; Coltharp, Hazel; Hurford, David; Cole, Amykay
2000-04-01
Two intact classes of 30 undergraduate students enrolled in the same general education mathematics course were each administered the IPSP Mathematics Problem Solving Test and the Mathematics Anxiety Rating Scale at the beginning and end of the semester. Both groups experienced the same syllabus, lectures, course requirements, and assessment techniques; however, one group received relaxation training during an initial class meeting and during the first 5 to 7 minutes of each subsequent class. The group which had received relaxation training had significantly lower mathematics anxiety and significantly higher mathematics performance at the end of the course. The results suggest that relaxation training may be a useful tool for treating anxiety in undergraduate general education mathematics students.
Past, present and future of spike sorting techniques
Rey, Hernan Gonzalo; Pedreira, Carlos; Quian Quiroga, Rodrigo
2015-01-01
Spike sorting is a crucial step to extract information from extracellular recordings. With new recording opportunities provided by the development of new electrodes that allow monitoring hundreds of neurons simultaneously, the scenario for the new generation of algorithms is both exciting and challenging. However, this will require a new approach to the problem and the development of a common reference framework to quickly assess the performance of new algorithms. In this work, we review the basic concepts of spike sorting, including the requirements for different applications, together with the problems faced by presently available algorithms. We conclude by proposing a roadmap stressing the crucial points to be addressed to support the neuroscientific research of the near future. PMID:25931392
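The first stage of the spike-sorting pipelines reviewed here is spike detection on the filtered extracellular trace. One widely used rule sets the threshold at a multiple of the robust noise estimate median(|x|)/0.6745; the following is a minimal illustrative sketch on synthetic data (the constants and function name are conventional choices for illustration, not prescriptions from this review):

```python
import random

def detect_spikes(signal, k=4.0, refractory=30):
    """Flag threshold crossings; threshold = k * median(|x|) / 0.6745 (robust noise std)."""
    abs_sorted = sorted(abs(v) for v in signal)
    noise_std = abs_sorted[len(abs_sorted) // 2] / 0.6745
    thr = k * noise_std
    spikes, last = [], -refractory
    for i, v in enumerate(signal):
        # Enforce a refractory gap so one spike is not counted multiple times.
        if abs(v) > thr and i - last >= refractory:
            spikes.append(i)
            last = i
    return spikes, thr

# Synthetic trace: unit-variance Gaussian noise with three large injected "spikes".
random.seed(0)
trace = [random.gauss(0.0, 1.0) for _ in range(3000)]
for t in (500, 1500, 2500):
    trace[t] += 12.0
spikes, thr = detect_spikes(trace)
```

Detected spike waveforms would then be extracted around each crossing and passed to the feature-extraction and clustering stages that the review discusses.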
Study of dispersed small wind systems interconnected with a utility distribution system
NASA Astrophysics Data System (ADS)
Curtice, D.; Patton, J.; Bohn, J.; Sechan, N.
1980-03-01
Operating problems for various penetrations of small wind systems connected to the distribution system of a utility are defined. Protection equipment, safety hazards, feeder voltage regulation, line losses, and voltage flicker problems are studied, assuming different small wind systems connected to an existing distribution system. To identify hardware deficiencies, possible solutions provided by off-the-shelf hardware and equipment are assessed. Results of the study indicate that existing techniques are inadequate for detecting isolated operation of a small wind system. Potential safety hazards posed by small wind systems are adequately handled by present work procedures, although these procedures require a disconnect device at small wind systems using synchronous generators or self-commutated inverters.
Respiratory assessment in critical care units.
Cox, C L; McGrath, A
1999-08-01
As healthcare delivery changes in critical care, nursing continues to extend its practice base. Nursing practice is expanding to incorporate skills once seen as the remit of the medical profession. Critical care nurses are equipping themselves with evidence-based knowledge and skills that can enhance the care they provide to their patients. Assessment of patients is a major role in nursing and, by expanding assessment techniques, nurses can ensure patients receive the care most appropriate to their needs. Nurses in critical care are well placed to perform a more detailed assessment which can help to focus nursing care. This article describes the step-by-step process of undertaking a full and comprehensive respiratory assessment in critical care settings. It identifies many of the problems that patients may have and the signs and symptoms that a nurse may note whilst undertaking the assessment and preparing to prescribe care.
The Retrospective Iterated Analysis Scheme for Nonlinear Chaotic Dynamics
NASA Technical Reports Server (NTRS)
Todling, Ricardo
2002-01-01
Atmospheric data assimilation is the name scientists give to the techniques of blending atmospheric observations with atmospheric model results to obtain an accurate idea of what the atmosphere looks like at any given time. Because two pieces of information are used, observations and model results, the outcome of a data assimilation procedure should be better than what one would get by using either piece of information alone. There are a number of different mathematical techniques that fall under the data assimilation umbrella. In theory, most of these techniques accomplish about the same thing. In practice, however, slight differences in the approaches amount to faster algorithms in some cases, more economical algorithms in others, and even better overall results in yet other cases because of practical uncertainties not accounted for by theory. Therefore, the key is to find the most adequate data assimilation procedure for the problem at hand. In our Data Assimilation group we have been doing extensive research to try to find just such a procedure. One promising possibility is what we call the retrospective iterated analysis (RIA) scheme. This procedure has recently been implemented and studied in the context of a very large data assimilation system built to help predict and study weather and climate. Although the results from that study suggest that the RIA scheme produces quite reasonable results, a complete evaluation of the scheme is very difficult due to the complexity of that problem. The present work steps back a little and studies the behavior of the RIA scheme in the context of a small problem. The problem is small enough to allow full assessment of the quality of the RIA scheme, but it still has some of the complexity found in nature, namely, its chaotic-type behavior. We find that the RIA scheme performs very well for this small but still complex problem, a result that supports our earlier studies.
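The blending step described in this abstract can be illustrated with the scalar analysis update common to many assimilation schemes. This is a generic sketch of variance-weighted blending, not the RIA scheme itself, and the numbers are invented:

```python
# Scalar "analysis" update used in many assimilation schemes: blend a model
# forecast with an observation, weighting each by its error variance.
def analysis_update(forecast, obs, var_f, var_o):
    """Return the blended estimate and its (reduced) error variance."""
    gain = var_f / (var_f + var_o)   # weight on the observation increment
    estimate = forecast + gain * (obs - forecast)
    var_a = (1.0 - gain) * var_f     # analysis variance is always <= var_f
    return estimate, var_a

# Equal error variances give a simple average of forecast and observation.
est, var_a = analysis_update(forecast=10.0, obs=12.0, var_f=1.0, var_o=1.0)
```

The blended estimate always has smaller error variance than the forecast alone, which is the sense in which assimilation "should be better than using either piece of information alone".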
Manning, Todd G; Papa, Nathan; Perera, Marlon; McGrath, Shannon; Christidis, Daniel; Khan, Munad; O'Beirne, Richard; Campbell, Nicholas; Bolton, Damien; Lawrentschuk, Nathan
2018-03-01
Laparoscopic lens fogging (LLF) hampers vision and impedes operative efficiency. Attempts to reduce LLF have led to the development of various anti-fogging fluids and warming devices. Limited literature exists directly comparing these techniques. We constructed a model peritoneum to simulate LLF and to compare the efficacy of various anti-fogging techniques. Intraperitoneal space was simulated using a suction bag suspended within an 8 L container of water. LLF was induced by varying the temperature and humidity within the model peritoneum. Various anti-fogging techniques were assessed including scope warmers, FRED™, Resoclear™, chlorhexidine, betadine and immersion in heated saline. These products were trialled with and without the use of a disposable scope warmer. Vision scores were evaluated by the same investigator for all tests and rated according to a predetermined scale. Fogging was assessed for each product or technique 30 times and a mean vision rating was recorded. All products tested imparted some benefit, but FRED™ performed better than all other techniques. Betadine and Resoclear™ performed no better than the use of a scope warmer alone. Immersion in saline prior to insertion resulted in decreased vision ratings. The robotic scope did not result in LLF within the model. In standard laparoscopes, the most superior preventative measure was FRED™ utilised on a pre-warmed scope. Despite improvements in LLF with other products, FRED™ was better than all other techniques. The robotic laparoscope performed superiorly regarding LLF compared to the standard laparoscope.
Optimization techniques applied to spectrum management for communications satellites
NASA Astrophysics Data System (ADS)
Ottey, H. R.; Sullivan, T. M.; Zusman, F. S.
This paper describes user requirements, algorithms and software design features for the application of optimization techniques to the management of the geostationary orbit/spectrum resource. Relevant problems include parameter sensitivity analyses, frequency and orbit position assignment coordination, and orbit position allotment planning. It is shown how integer and nonlinear programming as well as heuristic search techniques can be used to solve these problems. Formalized mathematical objective functions that define the problems are presented. Constraint functions that impart the necessary solution bounds are described. A versatile program structure is outlined, which would allow problems to be solved in stages while varying the problem space, solution resolution, objective function and constraints.
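The frequency-assignment coordination mentioned above is often attacked with the heuristic search techniques the abstract names. As a toy illustration only, a greedy graph-colouring heuristic for channel assignment might look like the following; the satellite names and interference graph are invented for the example and are not from the paper:

```python
# Toy greedy heuristic for a frequency-assignment problem: systems that
# interfere with each other (edges) must not share a channel.
def assign_channels(satellites, interferes):
    """interferes: set of frozensets {a, b} that must get distinct channels."""
    assignment = {}
    for sat in satellites:
        used = {assignment[other] for other in assignment
                if frozenset((sat, other)) in interferes}
        ch = 0
        while ch in used:            # smallest channel not used by a neighbour
            ch += 1
        assignment[sat] = ch
    return assignment

sats = ["A", "B", "C"]
conflicts = {frozenset(("A", "B")), frozenset(("B", "C"))}
plan = assign_channels(sats, conflicts)   # A and C can share a channel
```

Integer-programming formulations of the same problem replace this greedy pass with exact optimization over binary assignment variables, at much higher computational cost.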
Heat Transfer Search Algorithm for Non-convex Economic Dispatch Problems
NASA Astrophysics Data System (ADS)
Hazra, Abhik; Das, Saborni; Basu, Mousumi
2018-06-01
This paper presents the Heat Transfer Search (HTS) algorithm for the non-linear economic dispatch problem. The HTS algorithm is based on the laws of thermodynamics and heat transfer. The proficiency of the suggested technique has been demonstrated on three dissimilar complicated economic dispatch problems with valve point effect; prohibited operating zones; and multiple fuels with valve point effect. Test results acquired from the suggested technique for the economic dispatch problem have been compared with those acquired from other reported evolutionary techniques. It has been observed that the suggested HTS algorithm produces superior solutions.
NASA Technical Reports Server (NTRS)
Gillespie, V. G.; Kelly, R. O.
1974-01-01
The problems encountered and special techniques and procedures developed on the Skylab program are described along with the experiences and practical benefits obtained for dissemination and use on future programs. Three major topics are discussed: electrical problems, mechanical problems, and special techniques. Special techniques and procedures are identified that were either developed or refined during the Skylab program. These techniques and procedures came from all manufacturing and test phases of the Skylab program and include both flight and GSE items from component level to sophisticated spaceflight systems.
Sexual function in women in rural Tamil Nadu: disease, dysfunction, distress and norms.
Viswanathan, Shonima; Prasad, Jasmine; Jacob, K S; Kuruvilla, Anju
2014-01-01
We examined the nature, prevalence and explanatory models of sexual concerns and dysfunction among women in rural Tamil Nadu. Married women between 18 and 65 years of age, from randomly selected villages in Kaniyambadi block, Vellore district, Tamil Nadu, were chosen by a stratified sampling technique. Sexual functioning was assessed using the Female Sexual Function Index (FSFI). The modified Short Explanatory Model Interview (SEMI) was used to assess beliefs about sexual concerns and the General Health Questionnaire-12 (GHQ-12) was used to screen for common mental disorders. Sociodemographic variables and other risk factors were also assessed. Most of the women contacted (277; 98.2%) agreed to participate in the study. The prevalence of sexual dysfunction, based on the cut-off score on the FSFI, was 64.3%. However, only a minority of women considered it a problem (4.7%), expressed dissatisfaction (5.8%) or sought medical help (2.5%). The most common explanatory models offered for sexual problems included an unhappy marriage, stress and physical problems. Factors associated with a lower FSFI score included older age, illiteracy and medical illness, as well as sexual and marital factors such as menopause, poor quality of marital relationship, history of physical abuse and lack of privacy. The diagnosis of female sexual dysfunction needs to be nuanced and based on the broader personal and social context. Our findings argue that there is a need to use models that employ personal, local and contextual standards in assessing complex behaviours such as sexual function. Copyright 2014, NMJI.
Hypothesis driven assessment of an NMR curriculum
NASA Astrophysics Data System (ADS)
Cossey, Kimberly
The goal of this project was to develop a battery of assessments to evaluate an undergraduate NMR curriculum at Penn State University. As a chemical education project, we sought to approach the problem of curriculum assessment from a scientific perspective, while remaining grounded in the education research literature and practices. We chose the phrase hypothesis driven assessment to convey this process of relating the scientific method to the study of educational methods, modules, and curricula. We began from a hypothesis, that deeper understanding of one particular analytical technique (NMR) will increase undergraduate students' abilities to solve chemical problems. We designed an experiment to investigate this hypothesis, and data collected were analyzed and interpreted in light of the hypothesis and several related research questions. The expansion of the NMR curriculum at Penn State was funded through the NSF's Course, Curriculum, and Laboratory Improvement (CCLI) program, and assessment was required. The goal of this project, as stated in the grant proposal, was to provide NMR content in greater depth by integrating NMR modules throughout the curriculum in physical chemistry, instrumental, and organic chemistry laboratory courses. Hands-on contact with the NMR spectrometer and NMR data and repeated exposure of the analytical technique within different contexts (courses) were unique factors of this curriculum. Therefore, we maintained a focus on these aspects throughout the evaluation process. The most challenging and time-consuming aspect of any assessment is the development of testing instruments and methods to provide useful data. After key variables were defined, testing instruments were designed to measure these variables based on educational literature (Chapter 2). 
The primary variables measured in this assessment were: depth of understanding of NMR, basic NMR knowledge, problem solving skills (HETCOR problem), confidence for skills used in class (within the hands-on NMR modules), confidence for NMR tasks (not practiced), and confidence for general science tasks. Detailed discussion of the instruments, testing methods and experimental design used in this assessment are provided (Chapter 3). All data were analyzed quantitatively using methods adapted from the educational literature (Chapter 4). Data were analyzed and the descriptive statistics, independent t-tests between the experimental and control groups, and correlation statistics were calculated for each variable. In addition, for those variables included on the pretest, dependent t-tests between pretest and posttest scores were also calculated. The results of study 1 and study 2 were used to draw conclusions based on the hypothesis and research questions proposed in this work (Chapter 4). Data collected in this assessment were used to answer the following research questions: (1) Primary research question: Is depth of understanding of NMR linked to problem solving skills? (2) Are the NMR modules working as intended? Do they promote depth of understanding of NMR? (a) Will students who complete NMR modules have a greater depth of understanding of NMR than students who do not complete the modules? (b) Is depth of understanding increasing over the course of the experiment? (3) Is confidence an intermediary between depth of understanding and problem solving skills? Is it linked to both variables? (4) What levels of confidence are affected by the NMR modules? (a) Will confidence for the NMR class skills used in the modules themselves be greater for those who have completed the modules? (b) Will confidence for NMR tasks not practiced in the course be affected? (c) Will confidence for general science tasks be affected? 
(d) Are different levels of confidence (class skills, NMR tasks, general science tasks) linked to each other? Results from this NMR curriculum assessment could also have implications outside of the courses studied, and so there is potential to impact the chemical education community (section 5.2.1). In addition to providing reliable testing instruments/measures that could be used outside the university, the results of this research contribute to the study of problem solving in chemistry, learner characteristics within the context of chemical education studies, and NMR specific educational evaluations. Valuable information was gathered through the current method of evaluation for the NMR curriculum. However, improvements could be made to the existing assessment, and an alternate assessment that could supplement the information found in this study has been proposed (Chapter 5).
Risk management in the competitive electric power industry
NASA Astrophysics Data System (ADS)
Dahlgren, Robert William
From 1990 until the present day, the electric power industry has experienced dramatic changes worldwide. This recent evolution of the power industry has included the creation and multiple iterations of competitive wholesale markets in many different forms. The creation of these competitive markets has resulted in increased short-term volatility of power prices. Vertically integrated utilities emerged from years of regulatory controls and now face the need to perform risk assessment. The goal of this dissertation is to provide background and details of the evolution of market structures, combined with examples of how to apply price risk assessment techniques such as Value-at-Risk (VaR). In Chapter 1, the history and evolution of three selected regional markets, PJM, California, and England and Wales, is presented. A summary of the commonalities and differences is presented to provide an overview of the rate of transformation of the industry in recent years. The broad area of risk management in the power industry is also explored through a State-of-the-Art Literature Survey. In Chapter 2, an illustration of risk assessment applied to power trading is presented. The techniques of Value-at-Risk and Conditional Value-at-Risk are introduced and applied to a common scenario. The advantages and limitations of the techniques are compared through observation of their results against the common example. Volatility in the California power markets is presented in Chapter 3. This analysis explores the California markets in the summer of 2000, including the application of VaR analysis to the extreme volatility observed during this period. In Chapter 4, CVaR is applied to the same California historical data used in Chapter 3. In addition, the unique application of minimizing the risk of a power portfolio by minimizing CVaR is presented. The application relies on recent research into CVaR whereby the portfolio optimization problem can be reduced to a Linear Programming problem.
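The VaR and CVaR measures this dissertation applies can be sketched with a historical-simulation estimate. This is a minimal illustration with made-up loss data and a simple quantile rule, not the dissertation's implementation:

```python
# Historical-simulation VaR and CVaR at confidence level alpha.
# Convention here: losses are positive numbers; VaR is the alpha-quantile
# loss and CVaR is the mean loss in the tail at or beyond VaR.
def var_cvar(losses, alpha=0.95):
    ordered = sorted(losses)
    idx = int(alpha * len(ordered))              # index of the quantile loss
    var = ordered[min(idx, len(ordered) - 1)]
    tail = [x for x in ordered if x >= var]
    cvar = sum(tail) / len(tail)                 # expected shortfall
    return var, cvar

daily_losses = [-2, -1, 0, 1, 1, 2, 3, 4, 5, 10]    # ten invented daily losses
v, cv = var_cvar(daily_losses, alpha=0.8)           # VaR 5, CVaR 7.5 here
```

Because CVaR averages the whole tail, it is sensitive to extreme observations that VaR alone ignores, which is one reason it is preferred for portfolio optimization (and, as the abstract notes, it admits a linear-programming formulation).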
Parameterized Algorithmics for Finding Exact Solutions of NP-Hard Biological Problems.
Hüffner, Falk; Komusiewicz, Christian; Niedermeier, Rolf; Wernicke, Sebastian
2017-01-01
Fixed-parameter algorithms are designed to efficiently find optimal solutions to some computationally hard (NP-hard) problems by identifying and exploiting "small" problem-specific parameters. We survey practical techniques to develop such algorithms. Each technique is introduced and supported by case studies of applications to biological problems, with additional pointers to experimental results.
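One of the standard techniques surveyed in such work, the bounded search tree, can be sketched for Vertex Cover, the textbook fixed-parameter example. This is a minimal sketch for illustration, not code from the article:

```python
# Bounded-search-tree FPT algorithm for Vertex Cover: any edge must have an
# endpoint in the cover, so branch on both choices. The recursion depth is
# bounded by the parameter k, giving O(2^k * poly) time.
def vertex_cover(edges, k):
    """Return True iff the edge set has a vertex cover of size <= k."""
    if not edges:
        return True
    if k == 0:
        return False
    u, v = edges[0]
    # Branch 1: put u in the cover; Branch 2: put v in the cover.
    rest_u = [(a, b) for (a, b) in edges if a != u and b != u]
    rest_v = [(a, b) for (a, b) in edges if a != v and b != v]
    return vertex_cover(rest_u, k - 1) or vertex_cover(rest_v, k - 1)

triangle = [(1, 2), (2, 3), (1, 3)]   # needs 2 vertices, not 1
```

The key point of the parameterized view is that the exponential part depends only on the "small" parameter k, not on the instance size.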
Modeling of tool path for the CNC sheet cutting machines
NASA Astrophysics Data System (ADS)
Petunin, Aleksandr A.
2015-11-01
In the paper the problem of tool path optimization for CNC (Computer Numerical Control) cutting machines is considered. A classification of the cutting techniques is offered. We also propose a new classification of tool path problems. The tasks of cost minimization and time minimization for the standard cutting technique (Continuous Cutting Problem, CCP) and for one of the non-standard cutting techniques (Segment Continuous Cutting Problem, SCCP) are formalized. We show that these optimization tasks can be interpreted as a discrete optimization problem (the generalized travelling salesman problem with additional constraints, GTSP). Formalization of some constraints for these tasks is described. To solve the GTSP we propose using the mathematical model of Prof. Chentsov, based on the concept of a megalopolis and dynamic programming.
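The discrete optimization underlying such tool-path problems can be illustrated with a minimal Held-Karp dynamic program for plain TSP. This sketch ignores the additional constraints and the megalopolis model of the paper and uses an invented distance matrix:

```python
from itertools import combinations

# Held-Karp dynamic program for the travelling salesman problem:
# best[(S, j)] = cheapest path from city 0 that visits exactly the set S
# and ends at city j. Exponential in n, but exact for small instances.
def tsp_cost(dist):
    n = len(dist)
    best = {(frozenset([j]), j): dist[0][j] for j in range(1, n)}
    for size in range(2, n):
        for S in combinations(range(1, n), size):
            fs = frozenset(S)
            for j in S:
                best[(fs, j)] = min(best[(fs - {j}, i)] + dist[i][j]
                                    for i in S if i != j)
    full = frozenset(range(1, n))
    return min(best[(full, j)] + dist[j][0] for j in range(1, n))

# Four cutting points arranged in a cycle; the optimal closed tour costs 4.
square = [[0, 1, 2, 1],
          [1, 0, 1, 2],
          [2, 1, 0, 1],
          [1, 2, 1, 0]]
```

In the GTSP generalization each "city" becomes a cluster of entry points (a megalopolis in Chentsov's terminology), and exactly one point per cluster must be visited.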
Assessing the impact of transcriptomics, proteomics and metabolomics on fungal phytopathology.
Tan, Kar-Chun; Ipcho, Simon V S; Trengove, Robert D; Oliver, Richard P; Solomon, Peter S
2009-09-01
SUMMARY Peer-reviewed literature is today littered with exciting new tools and techniques that are being used in all areas of biology and medicine. Transcriptomics, proteomics and, more recently, metabolomics are three of these techniques that have impacted on fungal plant pathology. Used individually, each of these techniques can generate a plethora of data that could occupy a laboratory for years. When used in combination, they have the potential to comprehensively dissect a system at the transcriptional and translational level. Transcriptomics, or quantitative gene expression profiling, is arguably the most familiar to researchers in the field of fungal plant pathology. Microarrays have been the primary technique for the last decade, but others are now emerging. Proteomics has also been exploited by the fungal phytopathogen community, but perhaps not to its potential. A lack of genome sequence information has frustrated proteomics researchers and has largely contributed to this technique not fulfilling its potential. The coming of the genome sequencing era has partially alleviated this problem. Metabolomics is the most recent of these techniques to emerge and is concerned with the non-targeted profiling of all metabolites in a given system. Metabolomics studies on fungal plant pathogens are only just beginning to appear, although its potential to dissect many facets of the pathogen and disease will see its popularity increase quickly. This review assesses the impact of transcriptomics, proteomics and metabolomics on fungal plant pathology over the last decade and discusses their futures. Each of the techniques is described briefly with further reading recommended. Key examples highlighting the application of these technologies to fungal plant pathogens are also reviewed.
Evaluation of a transfinite element numerical solution method for nonlinear heat transfer problems
NASA Technical Reports Server (NTRS)
Cerro, J. A.; Scotti, S. J.
1991-01-01
Laplace transform techniques have been widely used to solve linear, transient field problems. A transform-based algorithm enables calculation of the response at selected times of interest without the need for stepping in time as required by conventional time integration schemes. The elimination of time stepping can substantially reduce computer time when transform techniques are implemented in a numerical finite element program. The coupling of transform techniques with spatial discretization techniques such as the finite element method has resulted in what are known as transfinite element methods. Recently attempts have been made to extend the transfinite element method to solve nonlinear, transient field problems. This paper examines the theoretical basis and numerical implementation of one such algorithm, applied to nonlinear heat transfer problems. The problem is linearized and solved by requiring a numerical iteration at selected times of interest. While shown to be acceptable for weakly nonlinear problems, this algorithm is ineffective as a general nonlinear solution method.
Investigation of finite element: ABC methods for electromagnetic field simulation. Ph.D. Thesis
NASA Technical Reports Server (NTRS)
Chatterjee, A.; Volakis, John L.; Nguyen, J.
1994-01-01
The mechanics of wave propagation in the presence of obstacles is of great interest in many branches of engineering and applied mathematics like electromagnetics, fluid dynamics, geophysics, seismology, etc. Such problems can be broadly classified into two categories: the bounded domain or the closed problem and the unbounded domain or the open problem. Analytical techniques have been derived for the simpler problems; however, the need to model complicated geometrical features, complex material coatings and fillings, and to adapt the model to changing design parameters have inevitably tilted the balance in favor of numerical techniques. The modeling of closed problems presents difficulties primarily in proper meshing of the interior region. However, problems in unbounded domains pose a unique challenge to computation, since the exterior region is inappropriate for direct implementation of numerical techniques. A large number of solutions have been proposed but only a few have stood the test of time and experiment. The goal of this thesis is to develop an efficient and reliable partial differential equation technique to model large three dimensional scattering problems in electromagnetics.
Twelve hour reproducibility of choroidal blood flow parameters in healthy subjects
Polska, E; Polak, K; Luksch, A; Fuchsjager-Mayrl, G; Petternel, V; Findl, O; Schmetterer, L
2004-01-01
Aims/background: To investigate the reproducibility and potential diurnal variation of choroidal blood flow parameters in healthy subjects over a period of 12 hours. Methods: The choroidal blood flow parameters of 16 healthy non-smoking subjects were measured at five time points during the day (8:00, 11:00, 14:00, 17:00, and 20:00). Outcome parameters were pulsatile ocular blood flow as assessed by pneumotonometry, fundus pulsation amplitude as assessed by laser interferometry, blood velocities in the ophthalmic and posterior ciliary arteries as assessed by colour Doppler imaging, and choroidal blood flow, volume, and velocity as assessed by fundus camera based laser Doppler flowmetry. The coefficient of variation and the maximum change from baseline in an individual were calculated for each outcome parameter. Results: None of the techniques used found a diurnal variation in choroidal blood flow. Coefficients of variation were between 2.9% and 13.6% for all outcome parameters. The maximum change from baseline in an individual was much higher, ranging from 11.2% to 58.8%. Conclusions: These data indicate that in healthy subjects the selected techniques provide adequate reproducibility to be used in clinical studies. Variability may, however, be considerably higher in older subjects or subjects with ocular disease. The higher individual differences in flow parameter readings limit the use of the techniques in clinical practice. To overcome problems with measurement validity, a clinical trial should include as many choroidal blood flow outcome parameters as possible to check for consistency. PMID:15031172
Feedback control for fuel-optimal descents using singular perturbation techniques
NASA Technical Reports Server (NTRS)
Price, D. B.
1984-01-01
In response to rising fuel costs and reduced profit margins for the airline companies, the optimization of the paths flown by transport aircraft has been considered. It was found that application of optimal control theory to the considered problem can result in savings in fuel, time, and direct operating costs. The best solution to the aircraft trajectory problem is an onboard real-time feedback control law. The present paper presents a technique which shows promise of becoming a part of a complete solution. The application of singular perturbation techniques to the problem is discussed, taking into account the benefits and some problems associated with them. A different technique for handling the descent part of a trajectory is also discussed.
Using the nursing process to implement a Y2K computer application.
Hobbs, C F; Hardinge, T T
2000-01-01
Because of the coming year 2000, the need was assessed to upgrade the order entry system at many hospitals. At Somerset Medical Center, a training team divided the transition into phases and used a modified version of the nursing process to implement the new program. The entire process required fewer than 6 months and was relatively problem-free. This successful transition was aided by the nursing process, training team, and innovative educational techniques.
User interface issues in supporting human-computer integrated scheduling
NASA Technical Reports Server (NTRS)
Cooper, Lynne P.; Biefeld, Eric W.
1991-01-01
Explored here are the user interface problems encountered with the Operations Missions Planner (OMP) project at the Jet Propulsion Laboratory (JPL). OMP uses a unique iterative approach to planning that places additional requirements on the user interface, particularly to support system development and maintenance. These requirements are necessary to support the concepts of heuristically controlled search, in-progress assessment, and iterative refinement of the schedule. The techniques used to address the OMP interface needs are given.
Systematic procedure for designing processes with multiple environmental objectives.
Kim, Ki-Joo; Smith, Raymond L
2005-04-01
Evaluation of multiple objectives is very important in designing environmentally benign processes. It requires a systematic procedure for solving multiobjective decision-making problems due to the complex nature of the problems, the need for complex assessments, and the complicated analysis of multidimensional results. In this paper, a novel systematic procedure is presented for designing processes with multiple environmental objectives. This procedure has four steps: initialization, screening, evaluation, and visualization. The first two steps are used for systematic problem formulation based on mass and energy estimation and order of magnitude analysis. In the third step, an efficient parallel multiobjective steady-state genetic algorithm is applied to design environmentally benign and economically viable processes and to provide more accurate and uniform Pareto optimal solutions. In the last step a new visualization technique for illustrating multiple objectives and their design parameters on the same diagram is developed. Through these integrated steps the decision-maker can easily determine design alternatives with respect to his or her preferences. Most importantly, this technique is independent of the number of objectives and design parameters. As a case study, acetic acid recovery from aqueous waste mixtures is investigated by minimizing eight potential environmental impacts and maximizing total profit. After applying the systematic procedure, the most preferred design alternatives and their design parameters are easily identified.
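The evaluation step of such multiobjective procedures rests on Pareto dominance: a design is kept only if no other design is at least as good on every objective and strictly better on one. A minimal dominance filter, with invented design points and all objectives minimized, might look like this (it is a sketch of the concept, not the parallel genetic algorithm of the paper):

```python
# Filter a list of candidate designs down to the Pareto-optimal set.
# Each point is a tuple of objective values; all objectives are minimized.
def pareto_front(points):
    def dominates(a, b):
        # a dominates b: no worse on every objective, better on at least one
        return all(x <= y for x, y in zip(a, b)) and a != b
    return [p for p in points
            if not any(dominates(q, p) for q in points)]

designs = [(1.0, 5.0), (2.0, 3.0), (4.0, 1.0), (3.0, 4.0)]   # (cost, impact)
front = pareto_front(designs)   # (3.0, 4.0) is dominated by (2.0, 3.0)
```

The decision-maker then chooses among the surviving non-dominated alternatives according to his or her preferences, which is exactly the trade-off the visualization step of the procedure is meant to support.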
NASA Astrophysics Data System (ADS)
Rees, S. J.; Jones, Bryan F.
1992-11-01
Once feature extraction has occurred in a processed image, the recognition problem becomes one of defining a set of features which maps sufficiently well onto one of the defined shape/object models to permit a claimed recognition. This process is usually handled by aggregating features until a large enough weighting is obtained to claim membership, or an adequate number of located features are matched to the reference set. A requirement has existed for an operator or measure capable of a more direct assessment of membership/occupancy between feature sets, particularly where the feature sets may be defective representations. Such feature set errors may be caused by noise, by overlapping of objects, and by partial obscuration of features. These problems occur at the point of acquisition: repairing the data would then assume a priori knowledge of the solution. The technique described in this paper offers a set theoretical measure for partial occupancy defined in terms of the set of minimum additions to permit full occupancy and the set of locations of occupancy if such additions are made. As is shown, this technique permits recognition of partial feature sets with quantifiable degrees of uncertainty. A solution to the problems of obscuration and overlapping is therefore available.
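The core idea, scoring membership by the minimum set of additions needed for full occupancy, can be sketched as follows. The feature names are hypothetical and this simplifies the paper's set-theoretical measure to its bare outline:

```python
# Score how well an observed (possibly defective) feature set occupies a
# reference model: the minimum additions are the model features not seen,
# and the occupancy fraction quantifies the uncertainty of the match.
def partial_occupancy(observed, model):
    """Return (missing features, occupancy fraction) of observed in model."""
    missing = model - observed           # minimum additions for full occupancy
    occupancy = 1.0 - len(missing) / len(model)
    return missing, occupancy

model = {"corner_a", "corner_b", "edge_ab", "hole"}
seen = {"corner_a", "edge_ab", "hole"}   # one feature obscured or lost to noise
missing, score = partial_occupancy(seen, model)   # {"corner_b"}, 0.75
```

A recognition is then claimed when the occupancy fraction exceeds a threshold, with the missing set making the residual uncertainty explicit rather than repairing the data with a priori knowledge of the solution.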
Numerical investigation of internal high-speed viscous flows using a parabolic technique
NASA Technical Reports Server (NTRS)
Anderson, O. L.; Power, G. D.
1985-01-01
A feasibility study has been conducted to assess the applicability of an existing parabolic analysis (ADD-Axisymmetric Diffuser Duct), developed previously for subsonic viscous internal flows, to mixed supersonic/subsonic flows with heat addition simulating a SCRAMJET combustor. A study was conducted with the ADD code modified to include additional convection effects in the normal momentum equation when supersonic expansion and compression waves are present. A set of test problems with weak shock and expansion waves have been analyzed with this modified ADD method and stable and accurate solutions were demonstrated provided the streamwise step size was maintained at levels larger than the boundary layer displacement thickness. Calculations made with further reductions in step size encountered departure solutions consistent with strong interaction theory. Calculations were also performed for a flow field with a flame front in which a specific heat release was imposed to simulate a SCRAMJET combustor. In this case the flame front generated relatively thick shear layers which aggravated the departure solution problem. Qualitatively correct results were obtained for these cases using a marching technique with the convective terms in the normal momentum equation suppressed. It is concluded from the present study that for the class of problems where strong viscous/inviscid interactions are present a global iteration procedure is required.
Lakhan, Ram
2014-01-01
Background: Management of behavioral problems in children with intellectual disabilities (ID) is a great concern in resource-poor areas in India. This study attempted to analyze the efficacy of behavioral intervention provided in resource-poor settings. Objective: This study aimed to examine the outcome of behavioral management provided to children with ID in a poor rural region in India. Materials and Methods: We analyzed data from 104 children between 3 and 18 years old who received interventions for behavioral problems in a clinical or a community setting. The behavioral assessment scale for Indian children with mental retardation (BASIC-MR) was used to quantify the study subjects’ behavioral problems before and after we applied behavioral management techniques (baseline and post-intervention, respectively). The baseline and post-intervention scores were analyzed using the following statistical techniques: the Wilcoxon matched-pairs signed-rank test for the efficacy of the intervention, and χ2 for group differences. Results: The study demonstrated behavioral improvements across all behavior domains (P < 0.05). Levels of improvement varied for children with different severities of ID (P = 0.001) and between children who did and did not have multiple disabilities (P = 0.011). Conclusion: The outcome of this behavioral management study suggests that behavioral intervention can be effectively provided to children with ID in poor areas. PMID:24574557
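The Wilcoxon matched-pairs signed-rank comparison of baseline and post-intervention scores can be sketched in plain Python. This uses the normal approximation for the p-value, and the example scores are toy values, not the study's data.

```python
import math

def wilcoxon_signed_rank(before, after):
    """Wilcoxon matched-pairs signed-rank test (normal approximation).

    Returns (W_plus, p): W_plus is the sum of ranks of positive
    differences (after - before); zero differences are dropped and
    tied absolute differences receive average ranks.
    """
    diffs = [b - a for a, b in zip(before, after) if b - a != 0]
    n = len(diffs)
    if n == 0:
        return 0.0, 1.0
    order = sorted(range(n), key=lambda i: abs(diffs[i]))
    ranks = [0.0] * n
    i = 0
    while i < n:
        j = i
        while j + 1 < n and abs(diffs[order[j + 1]]) == abs(diffs[order[i]]):
            j += 1
        avg_rank = (i + j) / 2 + 1      # 1-based average rank of positions i..j
        for k in range(i, j + 1):
            ranks[order[k]] = avg_rank
        i = j + 1
    w_plus = sum(r for d, r in zip(diffs, ranks) if d > 0)
    mu = n * (n + 1) / 4
    sigma = math.sqrt(n * (n + 1) * (2 * n + 1) / 24)
    z = (w_plus - mu) / sigma
    p = 2 * (1 - 0.5 * (1 + math.erf(abs(z) / math.sqrt(2))))
    return w_plus, p

# toy baseline/post-intervention scores (not the study's data)
w, p = wilcoxon_signed_rank([5, 6, 7, 8], [3, 4, 6, 9])
```

For small samples an exact permutation distribution would be preferable to the normal approximation; library implementations such as SciPy's handle that choice automatically.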
Molecular detection of pathogens in water--the pros and cons of molecular techniques.
Girones, Rosina; Ferrús, Maria Antonia; Alonso, José Luis; Rodriguez-Manzano, Jesus; Calgua, Byron; Corrêa, Adriana de Abreu; Hundesa, Ayalkibet; Carratala, Anna; Bofill-Mas, Sílvia
2010-08-01
Pollution of water by sewage and run-off from farms produces a serious public health problem in many countries. Viruses, along with bacteria and protozoa present in the intestine or in urine, are shed and transported through the sewer system. Even in highly industrialized countries, pathogens, including viruses, are prevalent throughout the environment. Molecular methods are used to monitor viral, bacterial, and protozoan pathogens, and to track pathogen- and source-specific markers in the environment. Molecular techniques, specifically polymerase chain reaction-based methods, provide sensitive, rapid, and quantitative analytical tools with which to study such pathogens, including new or emerging strains. These techniques are used to evaluate the microbiological quality of food and water, and to assess the efficiency of virus removal in drinking and wastewater treatment plants. The range of methods available for the application of molecular techniques has increased, and the costs involved have fallen. These developments have allowed the potential standardization and automation of certain techniques. In some cases they facilitate the identification, genotyping, enumeration, viability assessment, and source-tracking of human and animal contamination. Additionally, recent improvements in detection technologies have allowed the simultaneous detection of multiple targets in a single assay. However, the molecular techniques available today and those under development require further refinement in order to be standardized and applicable to a diversity of matrices. Water disinfection treatments may have an effect on the viability of pathogens, and the numbers obtained by molecular techniques may overestimate the quantification of infectious microorganisms. The pros and cons of molecular techniques for the detection and quantification of pathogens in water are discussed. (c) 2010 Elsevier Ltd. All rights reserved.
Qualitative profiles of disability.
Annicchiarico, Roberta; Gibert, Karina; Cortés, Ulises; Campana, Fabio; Caltagirone, Carlo
2004-01-01
This study identified profiles of functional disability (FD) paralleled by increasing levels of disability. We assessed 96 subjects using the World Health Organization Disability Assessment Schedule II (WHODAS II). Clustering Based on Rules (ClBR) (a hybrid technique of Statistics and Artificial Intelligence) was used in the analysis. Four groups of subjects with different profiles of FD were ordered according to an increasing degree of disability: "Low," self-dependent subjects with no physical or emotional problems; "Intermediate I," subjects with low or moderate physical and emotional disability, with high perception of disability; "Intermediate II," subjects with moderate or severe disability concerning only physical problems related to self-dependency, without emotional problems; and "High," subjects with the highest degree of disability, both physical and emotional. The order of the four classes is paralleled by a significant difference (P < 0.001) in the WHODAS II standardized global score. In this paper, a new ontology for the knowledge of FD, based on the use of ClBR, is proposed. The definition of four classes, qualitatively different and with an increasing degree of FD, helps to appropriately place each patient in a group of individuals with a similar profile of disability and to propose standardized treatments for these groups.
Absolute order-of-magnitude reasoning applied to a social multi-criteria evaluation framework
NASA Astrophysics Data System (ADS)
Afsordegan, A.; Sánchez, M.; Agell, N.; Aguado, J. C.; Gamboa, G.
2016-03-01
A social multi-criteria evaluation framework for solving a real-case problem of selecting a wind farm location in the regions of Urgell and Conca de Barberá in Catalonia (northeast of Spain) is studied. This paper applies a qualitative multi-criteria decision analysis approach, based on linguistic-label assessment, that is able to address uncertainty and deal with different levels of precision. The method is based on qualitative reasoning as an artificial intelligence technique for assessing and ranking multi-attribute alternatives with linguistic labels in order to handle uncertainty. It is suitable for problems in the social framework, such as energy planning, which require the construction of a dialogue process among many social actors with a high level of complexity and uncertainty. The method is compared with an existing approach, which had been applied previously to the wind farm location problem. This approach, consisting of an outranking method, is based on Condorcet's original method. The results obtained by both approaches are analysed and their performance in the selection of the wind farm location is compared with respect to aggregation procedures. Although the results show that both methods lead to similar alternative rankings, the study highlights both their advantages and drawbacks.
DOT National Transportation Integrated Search
1976-12-01
The seminar on "Construction Problems, Techniques and Solutions" held at the First Chicago Center in Chicago on October 20-22, 1975, was organized to focus on anticipated construction problems of the Chicago Central Area Transit Project to include un...
NASA Technical Reports Server (NTRS)
Goodrich, Charles H.; Kurien, James; Clancy, Daniel (Technical Monitor)
2001-01-01
We present some diagnosis and control problems that are difficult to solve with discrete or purely qualitative techniques. We analyze the nature of the problems, classify them and explain why they are frequently encountered in systems with closed loop control. This paper illustrates the problem with several examples drawn from industrial and aerospace applications and presents detailed information on one important application: In-Situ Resource Utilization (ISRU) on Mars. The model for an ISRU plant is analyzed showing where qualitative techniques are inadequate to identify certain failure modes and to maintain control of the system in degraded environments. We show why the solution to the problem will result in significantly more robust and reliable control systems. Finally, we illustrate requirements for a solution to the problem by means of examples.
Land use allocation model considering climate change impact
NASA Astrophysics Data System (ADS)
Lee, D. K.; Yoon, E. J.; Song, Y. I.
2017-12-01
In Korea, climate change adaptation plans are being developed for each administrative district based on impact assessments constructed in various fields. These climate change impact assessments are superimposed on the actual space, which causes problems in land use allocation because the spatial distributions of individual impacts may differ from each other. This implies that trade-offs between climate change impacts can occur depending on the composition of land use. Moreover, the actual space is complexly intertwined with various factors such as required area, legal regulations, and socioeconomic values, so land use allocation in consideration of climate change can be a very difficult problem to solve (Liu et al. 2012; Porta et al. 2013). Optimization techniques can generate sufficiently good alternatives for land use allocation at the strategic level once a fitness function relating impact to land use composition is derived. It has also been noted that a land use optimization model is more effective than a scenario-based prediction model in achieving the objectives of problem solving (Zhang et al. 2014). Therefore, in this study we developed a quantitative tool, MOGA (Multi Objective Genetic Algorithm), which can generate comprehensive land use allocations considering various climate change impacts, and applied it to Gangwon-do in Korea. Genetic Algorithms (GAs) are the most popular optimization technique for addressing multiple objectives in land use allocation. A GA also allows for immediate feedback to stakeholders because it can run a number of experiments with different parameter values. It is expected that land use decision makers and planners can formulate a detailed spatial plan or perform additional analysis based on the results of the optimization model. Acknowledgments: This work was supported by the Korea Ministry of Environment (MOE) as "Climate Change Correspondence Program (Project number: 2014001310006)"
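The core loop of a genetic algorithm for land use allocation can be sketched as follows. This is a deliberately tiny, single-objective caricature with made-up value/impact data for eight cells, not the study's multi-objective MOGA.

```python
import random

random.seed(42)

# Toy land-use allocation: each cell gets use 0 (conserve) or 1 (develop).
# VALUE rewards development; IMPACT penalises developing climate-vulnerable
# cells. Both arrays are invented illustration data, not from the study.
VALUE  = [5, 3, 8, 2, 7, 4, 6, 1]
IMPACT = [9, 1, 2, 8, 3, 7, 1, 6]

def fitness(plan):
    # Development value minus climate-change impact penalty for developed cells.
    return sum((v - w) * p for v, w, p in zip(VALUE, IMPACT, plan))

def evolve(pop_size=30, generations=60, mut=0.1):
    n = len(VALUE)
    pop = [[random.randint(0, 1) for _ in range(n)] for _ in range(pop_size)]
    for _ in range(generations):
        pop.sort(key=fitness, reverse=True)
        parents = pop[: pop_size // 2]          # elitist selection: keep top half
        children = []
        while len(children) < pop_size - len(parents):
            a, b = random.sample(parents, 2)
            cut = random.randrange(1, n)         # one-point crossover
            child = a[:cut] + b[cut:]
            for i in range(n):                   # bit-flip mutation
                if random.random() < mut:
                    child[i] = 1 - child[i]
            children.append(child)
        pop = parents + children
    return max(pop, key=fitness)

best = evolve()
```

A genuinely multi-objective GA (as in the abstract) would replace the scalar `fitness` with Pareto ranking over several impact objectives, but the select/crossover/mutate loop is the same.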
Digression and Value Concatenation to Enable Privacy-Preserving Regression.
Li, Xiao-Bai; Sarkar, Sumit
2014-09-01
Regression techniques can be used not only for legitimate data analysis, but also to infer private information about individuals. In this paper, we demonstrate that regression trees, a popular data-analysis and data-mining technique, can be used to effectively reveal individuals' sensitive data. This problem, which we call a "regression attack," has not been addressed in the data privacy literature, and existing privacy-preserving techniques are not appropriate in coping with this problem. We propose a new approach to counter regression attacks. To protect against privacy disclosure, our approach introduces a novel measure, called digression, which assesses the sensitive value disclosure risk in the process of building a regression tree model. Specifically, we develop an algorithm that uses the measure for pruning the tree to limit disclosure of sensitive data. We also propose a dynamic value-concatenation method for anonymizing data, which better preserves data utility than a user-defined generalization scheme commonly used in existing approaches. Our approach can be used for anonymizing both numeric and categorical data. An experimental study is conducted using real-world financial, economic and healthcare data. The results of the experiments demonstrate that the proposed approach is very effective in protecting data privacy while preserving data quality for research and analysis.
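How a regression tree can disclose a sensitive value is easy to see with a one-split "stump": a leaf that isolates a single record predicts that individual's exact value. The data, threshold search and names below are toy assumptions; the paper's digression measure itself is not reproduced here.

```python
# Illustrative regression "attack" with a one-split tree (stump).

def best_split(xs, ys):
    """Find the threshold on x minimising the summed squared error
    of the two leaf means (standard regression-tree split criterion)."""
    def sse(vals):
        if not vals:
            return 0.0
        m = sum(vals) / len(vals)
        return sum((v - m) ** 2 for v in vals)
    best = (float("inf"), None)
    for thr in sorted(set(xs))[1:]:
        left = [y for x, y in zip(xs, ys) if x < thr]
        right = [y for x, y in zip(xs, ys) if x >= thr]
        err = sse(left) + sse(right)
        if err < best[0]:
            best = (err, thr)
    return best[1]

# age (public attribute) vs. salary (sensitive); one outlier age
# isolates one person in a leaf of its own
age = [25, 30, 35, 40, 70]
salary = [40, 42, 41, 43, 150]
t = best_split(age, salary)
# The right leaf {age >= t} holds a single record, so its leaf mean
# reveals that individual's exact salary: a disclosure risk.
leaf = [s for a, s in zip(age, salary) if a >= t]
```

Pruning such single-record leaves, guided by a disclosure-risk measure, is the kind of intervention the paper's digression-based algorithm formalises.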
Public health applications of remote sensing of the environment, an evaluation
NASA Technical Reports Server (NTRS)
1972-01-01
The available techniques in the field of remote sensing (including aerial photography, infrared detection, radar, etc.) were examined, and their applications to a number of problems in the wide field of public health were determined. The specific areas of public health examined included: air pollution, water pollution, communicable disease, and the combined problems of urban growth and the effect of disasters on human communities. The assessment of the possible applications of remote sensing to these problems was made primarily by examination of the available literature in each field, and by interviews with health authorities, physicists, biologists, and other interested workers. Three types of programs employing remote sensors were outlined in the air pollution field: (1) proving the ability of sensors to monitor pollutants at three levels of interest - point source, ambient levels in cities, and global patterns; (2) detection of effects of pollutants on the environment at local and global levels; and (3) routine monitoring.
A Tool to Teach Communication Skills to Pharmacy Students
2008-01-01
Objective To develop a tool to teach pharmacy students assertive communication skills to use when talking with physicians over the telephone. Design As an assignment for their Communication Skills and Counseling course, students were asked to write a script involving a patient care issue or problem covering 3 different communication styles that could be used when contacting a prescriber by telephone: passive, aggressive, and assertive. Students worked in groups to write and act out the scripts for the class. Assessment Eight scripts were developed by students and rated by peers and faculty members. The script that received the highest ratings was used in the development of a multimedia educational CD. Conclusion The development of hypothetical scripts describing a drug therapy problem and illustrating the types of interactions between physicians and pharmacists while discussing the problem allowed pharmacy students to explore different communication techniques and improve their communication skills. PMID:18698394
An Integrated Planning Representation Using Macros, Abstractions, and Cases
NASA Technical Reports Server (NTRS)
Baltes, Jacky; MacDonald, Bruce
1992-01-01
Planning will be an essential part of future autonomous robots and integrated intelligent systems. This paper focuses on learning problem solving knowledge in planning systems. The system is based on a common representation for macros, abstractions, and cases. Therefore, it is able to exploit both classical and case-based techniques. The general operators in a successful plan derivation would be assessed for their potential usefulness, and some stored. The feasibility of this approach was studied through the implementation of a learning system for abstraction. New macros are motivated by trying to improve the operator set. One heuristic used to improve the operator set is generating operators with more general preconditions than existing ones. This heuristic leads naturally to abstraction hierarchies. This investigation showed promising results on the Towers of Hanoi problem. The paper concludes by describing methods for learning other problem solving knowledge. This knowledge can be represented by allowing operators at different levels of abstraction in a refinement.
Aircraft Engine Gas Path Diagnostic Methods: Public Benchmarking Results
NASA Technical Reports Server (NTRS)
Simon, Donald L.; Borguet, Sebastien; Leonard, Olivier; Zhang, Xiaodong (Frank)
2013-01-01
Recent technology reviews have identified the need for objective assessments of aircraft engine health management (EHM) technologies. To help address this issue, a gas path diagnostic benchmark problem has been created and made publicly available. This software tool, referred to as the Propulsion Diagnostic Method Evaluation Strategy (ProDiMES), has been constructed based on feedback provided by the aircraft EHM community. It provides a standard benchmark problem enabling users to develop, evaluate and compare diagnostic methods. This paper will present an overview of ProDiMES along with a description of four gas path diagnostic methods developed and applied to the problem. These methods, which include analytical and empirical diagnostic techniques, will be described and associated blind-test-case metric results will be presented and compared. Lessons learned along with recommendations for improving the public benchmarking processes will also be presented and discussed.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Kumar, Prashant (Academy of Scientific and Innovative Research—CSIO, Chandigarh 160030); Bansod, Baban K.S.
2015-02-15
Groundwater vulnerability maps are useful for decision making in land use planning and water resource management. This paper reviews the various groundwater vulnerability assessment models developed across the world. Each model has been evaluated in terms of its pros and cons and the environmental conditions of its application. The paper further discusses the validation techniques used for the generated vulnerability maps by various models. Implicit challenges associated with the development of the groundwater vulnerability assessment models have also been identified with scientific considerations to the parameter relations and their selections. - Highlights: • Various index-based groundwater vulnerability assessment models have been discussed. • A comparative analysis of the models and its applicability in different hydrogeological settings has been discussed. • Research problems of underlying vulnerability assessment models are also reported in this review paper.
NASA Astrophysics Data System (ADS)
Randle, K.; Al-Jundi, J.; Mamas, C. J. V.; Sokhi, R. S.; Earwaker, L. G.
1993-06-01
Our work on heavy metals in the estuarine environment has involved the use of two multielement techniques: neutron activation analysis (NAA) and proton-induced X-ray emission (PIXE) analysis. As PIXE is essentially a surface analytical technique, problems may arise due to sample inhomogeneity and surface roughness. In order to assess the contribution of these effects we have compared the results from PIXE analysis with those from a technique which analyzes a larger bulk sample rather than just the surface. An obvious method was NAA. A series of sediment samples containing particles of variable diameter were compared. Pellets containing a few mg of sediment were prepared from each sample and analyzed by the PIXE technique using both an absolute and a comparative method. For INAA the rest of the sample was then irradiated with thermal neutrons and element concentrations determined from analyses of the subsequent gamma-ray spectrum. Results from the two methods are discussed.
Two variants of minimum discarded fill ordering
DOE Office of Scientific and Technical Information (OSTI.GOV)
D'Azevedo, E.F.; Forsyth, P.A.; Tang, Wei-Pai
1991-01-01
It is well known that the ordering of the unknowns can have a significant effect on the convergence of Preconditioned Conjugate Gradient (PCG) methods. There has been considerable experimental work on the effects of ordering for regular finite difference problems. In many cases, good results have been obtained with preconditioners based on diagonal, spiral or natural row orderings. However, for finite element problems having unstructured grids or grids generated by a local refinement approach, it is difficult to define many of the orderings for more regular problems. A recently proposed Minimum Discarded Fill (MDF) ordering technique is effective in finding high quality Incomplete LU (ILU) preconditioners, especially for problems arising from unstructured finite element grids. Testing indicates this algorithm can identify a rather complicated physical structure in an anisotropic problem and orders the unknowns in the "preferred" direction. The MDF technique may be viewed as the numerical analogue of the minimum deficiency algorithm in sparse matrix technology. At any stage of the partial elimination, the MDF technique chooses the next pivot node so as to minimize the amount of discarded fill. In this work, two efficient variants of the MDF technique are explored to produce cost-effective high-order ILU preconditioners. The Threshold MDF orderings combine MDF ideas with drop tolerance techniques to identify the sparsity pattern in the ILU preconditioners. These techniques identify an ordering that encourages fast decay of the entries in the ILU factorization. The Minimum Update Matrix (MUM) ordering technique is a simplification of the MDF ordering and is closely related to the minimum degree algorithm. The MUM ordering is especially suited for large problems arising from the Navier-Stokes equations. Some interesting pictures of the orderings are presented using a visualization tool. 22 refs., 4 figs., 7 tabs.
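The minimum-deficiency idea that MDF numerically refines can be sketched symbolically: at each step, eliminate the node whose elimination creates the fewest new graph edges (fill). This toy works on the adjacency graph only and ignores the matrix values that MDF actually weighs.

```python
# Symbolic minimum-fill (minimum-deficiency) ordering on an adjacency graph.
# MDF itself minimises *numerical* discarded fill, which this sketch omits.

def min_fill_ordering(adj):
    """Order nodes by repeatedly eliminating the node creating least fill."""
    adj = {v: set(nbrs) for v, nbrs in adj.items()}   # work on a copy
    order = []
    while adj:
        def fill(v):
            # count neighbour pairs of v that are not yet connected
            nbrs = list(adj[v])
            return sum(1 for i in range(len(nbrs))
                         for j in range(i + 1, len(nbrs))
                         if nbrs[j] not in adj[nbrs[i]])
        v = min(adj, key=lambda u: (fill(u), u))      # tie-break on label
        order.append(v)
        # eliminate v: connect its neighbours pairwise, then remove v
        nbrs = adj.pop(v)
        for u in nbrs:
            adj[u].discard(v)
            adj[u] |= (nbrs - {u, v})
    return order

# star graph: eliminating the hub 'a' first would create 3 fill edges,
# so the zero-fill leaves are ordered ahead of it
adj = {"a": {"b", "c", "d"}, "b": {"a"}, "c": {"a"}, "d": {"a"}}
order = min_fill_ordering(adj)
```

In an ILU setting the same greedy loop would score candidate pivots by the magnitude of the fill entries that would be discarded, rather than by their count.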
Stable Sparse Classifiers Identify qEEG Signatures that Predict Learning Disabilities (NOS) Severity
Bosch-Bayard, Jorge; Galán-García, Lídice; Fernandez, Thalia; Lirio, Rolando B.; Bringas-Vega, Maria L.; Roca-Stappung, Milene; Ricardo-Garcell, Josefina; Harmony, Thalía; Valdes-Sosa, Pedro A.
2018-01-01
In this paper, we present a novel methodology to solve the classification problem, based on sparse (data-driven) regressions, combined with techniques for ensuring stability, especially useful for high-dimensional datasets and small samples number. The sensitivity and specificity of the classifiers are assessed by a stable ROC procedure, which uses a non-parametric algorithm for estimating the area under the ROC curve. This method allows assessing the performance of the classification by the ROC technique, when more than two groups are involved in the classification problem, i.e., when the gold standard is not binary. We apply this methodology to the EEG spectral signatures to find biomarkers that allow discriminating between (and predicting pertinence to) different subgroups of children diagnosed as Not Otherwise Specified Learning Disabilities (LD-NOS) disorder. Children with LD-NOS have notable learning difficulties, which affect education but are not able to be put into some specific category as reading (Dyslexia), Mathematics (Dyscalculia), or Writing (Dysgraphia). By using the EEG spectra, we aim to identify EEG patterns that may be related to specific learning disabilities in an individual case. This could be useful to develop subject-based methods of therapy, based on information provided by the EEG. Here we study 85 LD-NOS children, divided in three subgroups previously selected by a clustering technique over the scores of cognitive tests. The classification equation produced stable marginal areas under the ROC of 0.71 for discrimination between Group 1 vs. Group 2; 0.91 for Group 1 vs. Group 3; and 0.75 for Group 2 vs. Group1. A discussion of the EEG characteristics of each group related to the cognitive scores is also presented. PMID:29379411
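The non-parametric estimate of the area under the ROC curve reduces to the normalised Mann-Whitney statistic. A minimal sketch follows; the function name and tie handling are this sketch's conventions, not necessarily the authors' exact algorithm.

```python
def auc(neg_scores, pos_scores):
    """Non-parametric AUC: the probability that a positive case scores
    above a negative case, with ties counted as 1/2 (equivalent to the
    Mann-Whitney U statistic divided by n_neg * n_pos)."""
    wins = 0.0
    for p in pos_scores:
        for n in neg_scores:
            if p > n:
                wins += 1.0
            elif p == n:
                wins += 0.5
    return wins / (len(pos_scores) * len(neg_scores))

# perfect separation gives 1.0; identical score distributions give 0.5
perfect = auc([1, 2], [3, 4])
chance = auc([1, 2], [1, 2])
```

When more than two groups are involved, as in the LD-NOS subgroups above, marginal AUCs (e.g. Group 1 vs. Group 2) can be obtained by applying the same estimator one pair of groups at a time.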
Hogue, Aaron; Dauber, Sarah
2013-04-01
This study describes a multimethod evaluation of treatment fidelity to the family therapy (FT) approach demonstrated by front-line therapists in a community behavioral health clinic that utilized FT as its routine standard of care. Study cases (N=50) were adolescents with conduct and/or substance use problems randomly assigned to routine family therapy (RFT) or to a treatment-as-usual clinic not aligned with the FT approach (TAU). Observational analyses showed that RFT therapists consistently achieved a level of adherence to core FT techniques comparable to the adherence benchmark established during an efficacy trial of a research-based FT. Analyses of therapist-report measures found that compared to TAU, RFT demonstrated strong adherence to FT and differentiation from three other evidence-based practices: cognitive-behavioral therapy, motivational interviewing, and drug counseling. Implications for rigorous fidelity assessments of evidence-based practices in usual care settings are discussed. Copyright © 2012 Elsevier Ltd. All rights reserved.
Integrated Behavior Change Techniques for Problem Drinkers in the Community
ERIC Educational Resources Information Center
Vogler, Roger E.; And Others
1977-01-01
Problem drinkers in the community were subjects in a study that evaluated the therapeutic potential of learning techniques in changing abusive drinking patterns and achieving moderation. The authors conclude that moderation is a more attainable and feasible goal for problem drinkers than for chronic alcoholics. (Author)
Solving search problems by strongly simulating quantum circuits
Johnson, T. H.; Biamonte, J. D.; Clark, S. R.; Jaksch, D.
2013-01-01
Simulating quantum circuits using classical computers lets us analyse the inner workings of quantum algorithms. The most complete type of simulation, strong simulation, is believed to be generally inefficient. Nevertheless, several efficient strong simulation techniques are known for restricted families of quantum circuits and we develop an additional technique in this article. Further, we show that strong simulation algorithms perform another fundamental task: solving search problems. Efficient strong simulation techniques allow solutions to a class of search problems to be counted and found efficiently. This enhances the utility of strong simulation methods, known or yet to be discovered, and extends the class of search problems known to be efficiently simulable. Relating strong simulation to search problems also bounds the computational power of efficiently strongly simulable circuits; if they could solve all problems in P this would imply that all problems in NP and #P could be solved in polynomial time. PMID:23390585
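Strong simulation in its most literal form tracks the full statevector, so any individual amplitude can be read off exactly. The sketch below is a minimal two-qubit illustration; the gate layout and helper names are this sketch's own, and efficient strong-simulation techniques for restricted circuit families avoid storing the exponential vector.

```python
import math

# Minimal "strong" simulation: hold all 2**n amplitudes explicitly.
H = [[1 / math.sqrt(2), 1 / math.sqrt(2)],
     [1 / math.sqrt(2), -1 / math.sqrt(2)]]

def apply_1q(state, gate, target, n):
    """Apply a single-qubit gate to `target` (qubit 0 = most significant bit)."""
    shift = n - 1 - target
    new = [0j] * (1 << n)
    for idx, amp in enumerate(state):
        if amp == 0:
            continue
        bit = (idx >> shift) & 1
        for out_bit in (0, 1):
            out_idx = (idx & ~(1 << shift)) | (out_bit << shift)
            new[out_idx] += gate[out_bit][bit] * amp
    return new

n = 2
state = [0j] * (1 << n)
state[0] = 1 + 0j                 # start in |00>
for q in range(n):                # Hadamard on every qubit -> uniform state
    state = apply_1q(state, H, q, n)
amp_00 = state[0]                 # any basis amplitude is available exactly
```

Having every amplitude available is what lets a strong simulator count and locate satisfying basis states, which is the link to search problems the abstract draws.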
Hinkson, Larry; Suermann, Mia Amelie; Hinkson, Susan; Henrich, Wolfgang
2017-08-01
The primary objective is to assess the reduction in manual removal of placenta with the Windmill technique of placenta delivery in patients with retained placenta. The Windmill technique involves the application of continuous 360° umbilical cord traction and rotation in such a manner as to be perpendicular to the direction of the birth canal at the level of the introitus. This rotation through 360° is repeated slowly with movement akin to the motion of the blades of a windmill. We performed a 3-year retrospective case-control study at the Charité University Hospital in Berlin. Patients with a retained placenta more than 30 min following failed traditional interventions were consented and offered the Windmill technique of placenta delivery. Study cases were compared to controls where an operative manual removal of placenta was performed. Patients with suspected placenta implantation problems, uterine atony, bleeding due to vaginal tract injury and coagulation disturbances were excluded. Over the study period 14 patients were recruited to the study arm and 17 patients were in the control group. With the Windmill technique for retained placenta, 86% (12/14, p < 0.001) of patients avoided invasive operative manual removal of the placenta in theatre. There was a statistically significant reduction in mean blood loss (429 ml vs 724 ml, p = 0.001) and in mean postoperative fall in hemoglobin values (1.3 g/dl vs 2.5 g/dl, p = 0.04). There were also reductions in the time to delivery of the placenta, in antibiotic prophylaxis and in the use of general anesthesia. The Windmill technique for the delivery of the retained placenta is a simple, safe, effective and easy-to-teach technique that reduces invasive operative manual removal of the placenta, postpartum blood loss and delay in placenta delivery. This innovative technique can also be a lifesaving intervention, especially in areas with limited or no access to operative facilities. Copyright © 2017 Elsevier B.V. All rights reserved.
A practical method to assess model sensitivity and parameter uncertainty in C cycle models
NASA Astrophysics Data System (ADS)
Delahaies, Sylvain; Roulstone, Ian; Nichols, Nancy
2015-04-01
The carbon cycle combines multiple spatial and temporal scales, from minutes to hours for the chemical processes occurring in plant cells, to several hundred years for the exchange between the atmosphere and the deep ocean, and finally to millennia for the formation of fossil fuels. Together with our knowledge of the transformation processes involved in the carbon cycle, many Earth Observation systems are now available to help improve models and predictions using inverse modelling techniques. A generic inverse problem consists in finding an n-dimensional state vector x such that h(x) = y, for a given N-dimensional observation vector y, including random noise, and a given model h. The problem is well posed if the three following conditions hold: 1) there exists a solution, 2) the solution is unique and 3) the solution depends continuously on the input data. If at least one of these conditions is violated, the problem is said to be ill-posed. The inverse problem is often ill-posed; in that case a regularization method is required to replace the original problem with a well posed problem, and a solution strategy then amounts to 1) constructing a solution x, 2) assessing the validity of the solution, 3) characterizing its uncertainty. The data assimilation linked ecosystem carbon (DALEC) model is a simple box model simulating the carbon budget allocation for terrestrial ecosystems. Intercomparison experiments have demonstrated the relative merit of various inverse modelling strategies (MCMC, ENKF) to estimate model parameters and initial carbon stocks for DALEC using eddy covariance measurements of net ecosystem exchange of CO2 and leaf area index observations. Most results agreed on the fact that parameters and initial stocks directly related to fast processes were best estimated with narrow confidence intervals, whereas those related to slow processes were poorly estimated with very large uncertainties.
While other studies have tried to overcome this difficulty by adding complementary data streams or by considering longer observation windows, no systematic analysis has been carried out so far to explain the large differences among results. We consider adjoint-based methods to investigate inverse problems using DALEC and various data streams. Using resolution matrices we study the nature of the inverse problems (solution existence, uniqueness and stability) and show how standard regularization techniques affect resolution and stability properties. Instead of using standard prior information as a penalty term in the cost function to regularize the problems, we constrain the parameter space using ecological balance conditions and inequality constraints. The efficiency and rapidity of this approach allow us to compute ensembles of solutions to the inverse problems, from which we can establish the robustness of the variational method and obtain non-Gaussian posterior distributions for the model parameters and initial carbon stocks.
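The contrast between well- and poorly-constrained parameters can be caricatured in the scalar case, where Tikhonov regularization (a standard textbook choice, not necessarily the authors' method) trades a little bias for stability when the forward map is weak. All numbers below are made up for illustration.

```python
# Scalar caricature of an ill-posed inverse problem h(x) = a*x = y.
# A small |a| plays the role of a "slow process": the naive inverse
# y/a amplifies observation noise by 1/a.

def naive_inverse(a, y):
    return y / a

def tikhonov(a, y, lam):
    # x minimising (a*x - y)**2 + lam * x**2 (closed form in 1-D)
    return a * y / (a * a + lam)

a, x_true = 1e-3, 2.0            # weak forward map, true parameter
noise = 1e-2                     # observation noise
y_obs = a * x_true + noise
x_naive = naive_inverse(a, y_obs)        # error ~ noise / a = 10
x_reg = tikhonov(a, y_obs, lam=1e-5)     # biased toward 0, but stable
```

The regularized estimate is biased, but its error stays bounded as the noise grows, which is exactly the trade-off conditions 1)-3) above formalise.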
Grafting Technique to Eliminate Rootstock Suckering of Grafted Tomatoes
USDA-ARS?s Scientific Manuscript database
Vegetable grafting has been proposed as a technique for avoiding disease problems in tomatoes in open field production. In this study we investigated the current use of grafting in an open field scenario and found a serious problem with the grafting techniques. In the Fall of 2007, commercially pr...
Perry, Jonathan; Linsley, Sue
2006-05-01
Nominal group technique is a semi-quantitative/qualitative evaluative methodology. It has been used in health care education for generating ideas to develop curricula and find solutions to problems in programme delivery. This paper aims to describe the use of nominal group technique and present the data from nominal group evaluations of a developing module which used novel approaches to the teaching and assessment of interpersonal skills. Evaluations took place over 3 years. Thirty-six students took part in annual groups. Analysis of the data produced the following themes based on items generated in the groups: role play, marking, course content, teaching style and user involvement. Findings indicate that students valued the role play, feedback from service users and emphasis on engagement and collaboration elements of the module. The areas which participants found difficult and desired change included anxiety during experiential practice, the "snap shot" nature of assessment and the use of specific interventions. Indications are also given regarding the impact of changes made by teaching staff over the 3 year evaluation period. The findings support themes within the existing literature on the teaching of interpersonal skills and may to some extent point the way toward best practice in this area. The paper discusses these findings and their implications for nurse education.
NASA Astrophysics Data System (ADS)
Karimi, Milad; Moradlou, Fridoun; Hajipour, Mojtaba
2018-10-01
This paper is concerned with a backward heat conduction problem with a time-dependent thermal diffusivity factor in an infinite "strip". This problem is severely ill-posed, owing to the unbounded amplification of high-frequency components. A new regularization method based on the Meyer wavelet technique is developed to solve the considered problem. Using the Meyer wavelet technique, some new stable estimates of Hölder and logarithmic type are proposed which are optimal in the sense given by Tautenhahn. The stability and convergence rate of the proposed regularization technique are proved. The good performance and high accuracy of this technique are demonstrated through various one- and two-dimensional examples. Numerical simulations and some comparative results are presented.
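The ill-posedness mechanism, and the general idea of damping high frequencies before back-propagating, can be illustrated with a much cruder Fourier-cutoff regularization than the paper's Meyer-wavelet method (the domain, diffusivity, and cutoff `kmax` below are assumptions; constant diffusivity is used for simplicity):

```python
import numpy as np

# 1D periodic backward heat sketch: recover u(x,0) from u(x,T).
# Backward solution amplifies Fourier modes by exp(k^2 T), which blows up
# for high k; regularize by discarding modes above a cutoff kmax.
N, L, T = 256, 2 * np.pi, 0.1
x = np.linspace(0, L, N, endpoint=False)
u0 = np.sin(x) + 0.5 * np.sin(3 * x)          # "true" initial temperature

k = np.fft.fftfreq(N, d=L / N) * 2 * np.pi    # integer wavenumbers here
uT = np.fft.ifft(np.fft.fft(u0) * np.exp(-k**2 * T)).real  # forward heat flow

kmax = 5                                       # regularization cutoff (assumed)
amp = np.zeros(N)
mask = np.abs(k) <= kmax
amp[mask] = np.exp(k[mask]**2 * T)             # stable backward amplification
u0_rec = np.fft.ifft(np.fft.fft(uT) * amp).real

err = np.max(np.abs(u0_rec - u0))
```

Because the test signal only contains modes 1 and 3, the cutoff loses nothing and recovery is essentially exact; with noisy data the cutoff (or, in the paper, the wavelet scale) trades bias against the exponential noise amplification.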
Mobile internet and technology for optical teaching reform in higher education
NASA Astrophysics Data System (ADS)
Zhou, Muchun; Zhao, Qi; Chen, Yanru
2017-08-01
Optical education currently suffers from problems such as insufficient flexibility, individuality, and adaptability for students who need information and instruction. The development of mobile internet and related technologies provides support for solving these problems. The basic characteristics, advantages, and development of these techniques as used in education are presented in this paper. Mobile internet is introduced to reform the classroom teaching of optical courses. Mobile network tool selection, teaching resource construction, and reform of teaching methods are discussed. Academic records and sampling surveys are used to assess students' intention to adopt mobile internet and its effect on their learning; the results show that high-quality optical education can be offered by adopting mobile internet and related technologies alongside traditional instruction.
The Design and Implementation of a Network Teaching Platform Based on .NET
NASA Astrophysics Data System (ADS)
Yanna, Ren
This paper addresses the problem that students under the traditional teaching model have poor practical operation skills, and studies in depth the network teaching platforms of domestic colleges and universities, proposing a design concept for a .NET + C# + SQL course platform and designing the overall structure, function modules, and back-end database of the platform. The paper emphatically expounds the use of MD5 hashing to address data security problems, and the assessment of student learning using ADO.NET database access technology together with a mathematical scoring formula. The example shows that a network teaching platform developed with Web application technology has higher safety and availability, and thus improves students' practical operation skills.
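The platform itself is C#/.NET, but the MD5-based credential storage the abstract mentions can be sketched in a few lines (a Python analogue for illustration; the salt value is an assumption, and note that plain MD5 is considered cryptographically broken today, so modern systems prefer bcrypt or Argon2):

```python
import hashlib

def md5_hex(password: str, salt: str = "") -> str:
    """Return the hex MD5 digest of salt + password."""
    return hashlib.md5((salt + password).encode("utf-8")).hexdigest()

# Store only the digest; at login, recompute and compare.
stored = md5_hex("s3cret", salt="a1b2")
assert stored == md5_hex("s3cret", salt="a1b2")   # correct password matches
assert stored != md5_hex("wrong", salt="a1b2")    # wrong password does not
```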
An iterative hyperelastic parameters reconstruction for breast cancer assessment
NASA Astrophysics Data System (ADS)
Mehrabian, Hatef; Samani, Abbas
2008-03-01
In breast elastography, breast tissues usually undergo large compressions, resulting in significant geometric and structural changes and consequently nonlinear mechanical behavior. In this study, an elastography technique is presented in which parameters characterizing tissue nonlinear behavior are reconstructed. Such parameters can be used for tumor tissue classification. To model the nonlinear behavior, tissues are treated as hyperelastic materials. The proposed technique uses a constrained iterative inversion method to reconstruct the tissue hyperelastic parameters. The reconstruction technique uses a nonlinear finite element (FE) model for solving the forward problem. In this research, we applied the Yeoh and polynomial models to describe the tissue hyperelasticity. To mimic the breast geometry, we used a computational phantom comprising a hemisphere connected to a cylinder. This phantom consists of two types of soft tissue, mimicking adipose and fibroglandular tissues, and a tumor. Simulation results show the feasibility of the proposed method in reconstructing the hyperelastic parameters of the tumor tissue.
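For reference, the standard three-term Yeoh strain-energy density used in such hyperelastic formulations is (the number of terms and parameter values used in the study are not stated in the abstract):

```latex
W = \sum_{i=1}^{3} C_{i0}\,(I_1 - 3)^i
```

where $I_1$ is the first invariant of the left Cauchy-Green deformation tensor and the $C_{i0}$ are the material parameters that the inversion seeks to reconstruct.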
Failed medial patellofemoral ligament reconstruction: Causes and surgical strategies
Sanchis-Alfonso, Vicente; Montesinos-Berry, Erik; Ramirez-Fuentes, Cristina; Leal-Blanquet, Joan; Gelber, Pablo E; Monllau, Joan Carles
2017-01-01
Patellar instability is a common clinical problem encountered by orthopedic surgeons specializing in the knee. For patients with chronic lateral patellar instability, the standard surgical approach is to stabilize the patella through a medial patellofemoral ligament (MPFL) reconstruction. Foreseeably, an increasing number of revision surgeries of the reconstructed MPFL will be seen in upcoming years. In this paper, the causes of failed MPFL reconstruction are analyzed: (1) incorrect surgical indication or inappropriate surgical technique/patient selection; (2) a technical error; and (3) an incorrect assessment of the concomitant risk factors for instability. An understanding of the anatomy and biomechanics of the MPFL and cautiousness with the imaging techniques while favoring clinical over radiological findings and the use of common sense to determine the adequate surgical technique for each particular case, are critical to minimizing MPFL surgery failure. Additionally, our approach to dealing with failure after primary MPFL reconstruction is also presented. PMID:28251062
Writing memorable geophysical papers: The need for proper author coalitions
NASA Astrophysics Data System (ADS)
Baker, Daniel N.
A primary function of Eos is to serve the geophysical community. It does this by publishing meeting announcements, book reviews, advertisements for jobs, scientific news items, and the like. Recent articles have helped the membership assess the stage of their careers (Eos, 60, 1024, 1979), informed them of the advantages of having names near the beginning of the alphabet (Eos, 59, 118, 1978), and helped them maximize information transfer during scientific meetings (Eos, 62, 179, 1981). However, no one has dealt with the very difficult problem of making papers memorable. Some techniques, such as long author lists, are now passé. Everyone is doing it. Other techniques, such as writing a very short paper or a humorous paper, are beyond the ken of most AGU members. Fortunately, there remains one technique that can be used by a surprisingly large number of AGU members.
Tomographic reconstruction of tokamak plasma light emission using wavelet-vaguelette decomposition
NASA Astrophysics Data System (ADS)
Schneider, Kai; Nguyen van Yen, Romain; Fedorczak, Nicolas; Brochard, Frederic; Bonhomme, Gerard; Farge, Marie; Monier-Garbet, Pascale
2012-10-01
Images acquired by cameras installed in tokamaks are difficult to interpret because the three-dimensional structure of the plasma is flattened in a non-trivial way. Nevertheless, taking advantage of the slow variation of the fluctuations along magnetic field lines, the optical transformation may be approximated by a generalized Abel transform, for which we proposed in Nguyen van yen et al., Nucl. Fus., 52 (2012) 013005, an inversion technique based on the wavelet-vaguelette decomposition. After validation of the new method using an academic test case and numerical data obtained with the Tokam 2D code, we present an application to an experimental movie obtained in the tokamak Tore Supra. A comparison with a classical regularization technique for ill-posed inverse problems, the singular value decomposition, allows us to assess the efficiency. The superiority of the wavelet-vaguelette technique is reflected in preserving local features, such as blobs and fronts, in the denoised emissivity map.
Dionne-Odom, J Nicholas; Lyons, Kathleen D; Akyar, Imatullah; Bakitas, Marie A
2016-01-01
Family caregivers of persons with advanced cancer often take on responsibilities that present daunting and complex problems. Serious problems that go unresolved may be burdensome and result in negative outcomes for caregivers' psychological and physical health and affect the quality of care delivered to the care recipients with cancer, especially at the end of life. Formal problem-solving training approaches have been developed over the past several decades to assist individuals with managing problems faced in daily life. Several of these problem-solving principles and techniques were incorporated into ENABLE (Educate, Nurture, Advise, Before Life End), an "early" palliative care telehealth intervention for individuals diagnosed with advanced cancer and their family caregivers. A hypothetical case resembling the situations of actual caregiver participants in ENABLE that exemplifies the complex problems that caregivers face is presented, followed by presentation of an overview of ENABLE's problem-solving key principles, techniques, and steps in problem-solving support. Though more research is needed to formally test the use of problem-solving support in social work practice, social workers can easily incorporate these techniques into everyday practice.
Dynamic programming and graph algorithms in computer vision.
Felzenszwalb, Pedro F; Zabih, Ramin
2011-04-01
Optimization is a powerful paradigm for expressing and solving problems in a wide range of areas, and has been successfully applied to many vision problems. Discrete optimization techniques are especially interesting since, by carefully exploiting problem structure, they often provide nontrivial guarantees concerning solution quality. In this paper, we review dynamic programming and graph algorithms, and discuss representative examples of how these discrete optimization techniques have been applied to some classical vision problems. We focus on the low-level vision problem of stereo, the mid-level problem of interactive object segmentation, and the high-level problem of model-based recognition.
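The low-level stereo problem mentioned above is the classic setting for scanline dynamic programming: left and right pixels along one epipolar line are aligned under a per-pixel match cost and a fixed occlusion penalty. A minimal sketch follows (illustrative only, not a specific formulation from the paper; the occlusion penalty value is an assumption):

```python
OCC = 2.0  # occlusion penalty (assumed value)

def dp_stereo_scanline(left, right):
    """Minimum alignment cost of two intensity scanlines with occlusions."""
    n, m = len(left), len(right)
    INF = float("inf")
    cost = [[INF] * (m + 1) for _ in range(n + 1)]
    cost[0][0] = 0.0
    for i in range(n + 1):
        for j in range(m + 1):
            if i > 0 and j > 0:   # match left[i-1] with right[j-1]
                match = abs(left[i - 1] - right[j - 1])
                cost[i][j] = min(cost[i][j], cost[i - 1][j - 1] + match)
            if i > 0:             # left pixel occluded
                cost[i][j] = min(cost[i][j], cost[i - 1][j] + OCC)
            if j > 0:             # right pixel occluded
                cost[i][j] = min(cost[i][j], cost[i][j - 1] + OCC)
    return cost[n][m]

# Identical scanlines align perfectly at zero cost:
print(dp_stereo_scanline([10, 20, 30], [10, 20, 30]))  # -> 0.0
```

Backtracking through the table (omitted here) recovers the disparity assignment; the guarantee the paper highlights is that DP finds the exact optimum of this per-scanline objective.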
Analysing attitude data through ridit schemes.
El-rouby, M G
1994-12-02
The attitudes of individuals and populations on various issues are usually assessed through sample surveys. Responses to survey questions are then scaled and combined into a meaningful whole which defines the measured attitude. The applied scales may be of nominal, ordinal, interval, or ratio nature depending upon the degree of sophistication the researcher wants to introduce into the measurement. This paper discusses methods of analysis for categorical variables of the type used in attitude and human behavior research, and recommends adoption of ridit analysis, a technique which has been successfully applied to epidemiological, clinical investigation, laboratory, and microbiological data. The ridit methodology is described after reviewing some general attitude scaling methods and problems of analysis related to them. The ridit method is then applied to a recent study conducted to assess health care service quality in North Carolina. This technique is conceptually and computationally more simple than other conventional statistical methods, and is also distribution-free. Basic requirements and limitations on its use are indicated.
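The ridit computation itself is simple: the ridit of category k, relative to a reference distribution, is the proportion of the reference below k plus half the proportion in k. A sketch using a hypothetical 5-point Likert reference distribution (the paper's North Carolina data are not reproduced here):

```python
def ridits(reference_counts):
    """Ridit of each ordered category relative to the reference distribution."""
    total = sum(reference_counts)
    r, below = [], 0.0
    for c in reference_counts:
        r.append((below + c / 2.0) / total)   # proportion below + half within
        below += c
    return r

# Hypothetical reference counts for categories 1..5:
ref = [10, 20, 40, 20, 10]
print(ridits(ref))  # -> [0.05, 0.2, 0.5, 0.8, 0.95]
```

By construction the mean ridit of the reference group is 0.5, so a comparison group's mean ridit is directly interpretable as the probability that a random member of that group scores higher than a random member of the reference group.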
Volcanic hazards and their mitigation: progress and problems
Tilling, R.I.
1989-01-01
A review of hazards mitigation approaches and techniques indicates that significant advances have been made in hazards assessment, volcano monitoring, and eruption forecasting. For example, the remarkable accuracy of the predictions of dome-building events at Mount St. Helens since June 1980 is unprecedented. Yet a predictive capability for more voluminous and explosive eruptions still has not been achieved. Studies of magma-induced seismicity and ground deformation continue to provide the most systematic and reliable data for early detection of precursors to eruptions and shallow intrusions. In addition, some other geophysical monitoring techniques and geochemical methods have been refined and are being more widely applied and tested. Comparison of the four major volcanic disasters of the 1980s (Mount St. Helens, U.S.A. (1980); El Chichon, Mexico (1982); Galunggung, Indonesia (1982); and Nevado del Ruiz, Colombia (1985)) illustrates the importance of predisaster geoscience studies, volcanic hazards assessments, volcano monitoring, contingency planning, and effective communications between scientists and authorities. -from Author
EMG Processing Based Measures of Fatigue Assessment during Manual Lifting.
Shair, E F; Ahmad, S A; Marhaban, M H; Mohd Tamrin, S B; Abdullah, A R
2017-01-01
Manual lifting is one of the common practices used in industry to transport or move objects to a desired place. Nowadays, even though mechanized equipment is widely available, manual lifting is still considered an essential way to perform material handling tasks. Improper lifting strategies may contribute to musculoskeletal disorders (MSDs), to which overexertion is the largest contributing factor. To overcome this problem, electromyography (EMG) signals are used to monitor workers' muscle condition and to find the maximum lifting load, lifting height, and number of repetitions that workers are able to handle before experiencing fatigue, so as to avoid overexertion. Past researchers have introduced several EMG processing techniques and different EMG features that represent fatigue indices in the time, frequency, and time-frequency domains. The impact of EMG processing based measures in fatigue assessment during manual lifting is reviewed in this paper. It is believed that this paper will greatly benefit researchers who need a bird's eye view of the biosignal processing techniques currently available, helping them determine the best possible techniques for lifting applications.
Classification of older adults with/without a fall history using machine learning methods.
Lin Zhang; Ou Ma; Fabre, Jennifer M; Wood, Robert H; Garcia, Stephanie U; Ivey, Kayla M; McCann, Evan D
2015-01-01
Falling is a serious problem in an aging society, such that assessment of the risk of falls for individuals is imperative for the research and practice of falls prevention. This paper introduces an application of several machine learning methods for training a classifier capable of classifying individual older adults into a high-risk group and a low-risk group (distinguished by whether or not the members of the group have a recent history of falls). Using a 3D motion capture system, significant gait features related to falls risk are extracted. By training on these features, classification hypotheses are obtained based on machine learning techniques (K-nearest-neighbour, naive Bayes, logistic regression, neural network, and support vector machine). Training and test accuracies, with sensitivity and specificity, of each of these techniques are assessed. The feature adjustment and tuning of the machine learning algorithms are discussed. The outcome of the study will benefit the prediction and prevention of falls.
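The simplest of the listed classifiers, K-nearest-neighbour, can be sketched directly on toy gait features (the feature names, values, and k=3 below are assumptions for illustration, not the study's motion-capture data):

```python
from collections import Counter

def knn_predict(train, k, query):
    """Majority label among the k training points nearest to query."""
    # train: list of (feature_vector, label) pairs
    nearest = sorted(train, key=lambda p: sum((a - b) ** 2 for a, b in zip(p[0], query)))
    votes = Counter(label for _, label in nearest[:k])
    return votes.most_common(1)[0][0]

# Hypothetical features: (gait speed m/s, step-width variability cm)
train = [((1.2, 2.0), "low_risk"), ((1.1, 2.5), "low_risk"),
         ((0.6, 5.0), "high_risk"), ((0.7, 4.5), "high_risk"),
         ((1.0, 3.0), "low_risk")]
print(knn_predict(train, 3, (0.65, 4.8)))  # -> high_risk
```

In practice the study's pipeline would also standardize features and tune k via cross-validation, which is what the "feature adjustment and tuning" in the abstract refers to.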
Traumatic brain injury: future assessment tools and treatment prospects
Flanagan, Steven R; Cantor, Joshua B; Ashman, Teresa A
2008-01-01
Traumatic brain injury (TBI) is widespread and leads to death and disability in millions of individuals around the world each year. Overall incidence and prevalence of TBI are likely to increase in absolute terms in the future. Tackling the problem of treating TBI successfully will require improvements in the understanding of normal cerebral anatomy, physiology, and function throughout the lifespan, as well as the pathological and recuperative responses that result from trauma. New treatment approaches and combinations will need to be targeted to the heterogeneous needs of TBI populations. This article explores and evaluates the research evidence in areas that will likely lead to a reduction in TBI-related morbidity and improved outcomes. These include emerging assessment instruments and techniques in areas of structural/chemical and functional neuroimaging and neuropsychology, advances in the realms of cell-based therapies and genetics, promising cognitive rehabilitation techniques including cognitive remediation and the use of electronic technologies including assistive devices and virtual reality, and the emerging field of complementary and alternative medicine. PMID:19183780
PSO/ACO algorithm-based risk assessment of human neural tube defects in Heshun County, China.
Liao, Yi Lan; Wang, Jin Feng; Wu, Ji Lei; Wang, Jiao Jiao; Zheng, Xiao Ying
2012-10-01
To develop a new technique for assessing the risk of birth defects, which are a major cause of infant mortality and disability in many parts of the world. The region of interest in this study was Heshun County, the county in China with the highest rate of neural tube defects (NTDs). A hybrid particle swarm optimization/ant colony optimization (PSO/ACO) algorithm was used to quantify the probability of NTDs occurring at villages with no births. The hybrid PSO/ACO algorithm is a form of artificial intelligence adapted for hierarchical classification. It is a powerful technique for modeling complex problems involving impacts of causes. The algorithm was easy to apply, with the accuracy of the results being 69.5%±7.02% at the 95% confidence level. The proposed method is simple to apply, has acceptable fault tolerance, and greatly enhances the accuracy of calculations. Copyright © 2012 The Editorial Board of Biomedical and Environmental Sciences. Published by Elsevier B.V. All rights reserved.
A hybrid nonlinear programming method for design optimization
NASA Technical Reports Server (NTRS)
Rajan, S. D.
1986-01-01
Solutions to engineering design problems formulated as nonlinear programming (NLP) problems usually require the use of more than one optimization technique. Moreover, the interaction between the user (analysis/synthesis) program and the NLP system can lead to interface, scaling, or convergence problems. An NLP solution system is presented that seeks to solve these problems by providing a programming system to ease the user-system interface. A simple set of rules is used to select an optimization technique or to switch from one technique to another in an attempt to detect, diagnose, and solve some potential problems. Numerical examples involving finite element based optimal design of space trusses and rotor bearing systems are used to illustrate the applicability of the proposed methodology.
Statistical methods for convergence detection of multi-objective evolutionary algorithms.
Trautmann, H; Wagner, T; Naujoks, B; Preuss, M; Mehnen, J
2009-01-01
In this paper, two approaches for estimating the generation in which a multi-objective evolutionary algorithm (MOEA) shows statistically significant signs of convergence are introduced. A set-based perspective is taken where convergence is measured by performance indicators. The proposed techniques fulfill the requirements of proper statistical assessment on the one hand and efficient optimisation for real-world problems on the other hand. The first approach accounts for the stochastic nature of the MOEA by repeating the optimisation runs for increasing generation numbers and analysing the performance indicators using statistical tools. This technique results in a very robust offline procedure. Moreover, an online convergence detection method is introduced as well. This method automatically stops the MOEA when either the variance of the performance indicators falls below a specified threshold or a stagnation of their overall trend is detected. Both methods are analysed and compared for two MOEAs and on different classes of benchmark functions. It is shown that the methods successfully operate on all stated problems, needing fewer function evaluations while preserving good approximation quality at the same time.
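The variance-based half of the online stopping rule can be sketched in a few lines: halt the run when a performance indicator's variance over a sliding window of recent generations drops below a threshold (the window size and threshold below are assumed values, and the indicator history is synthetic):

```python
from statistics import pvariance

def converged(indicator_history, window=5, threshold=1e-4):
    """True once the indicator's variance over the last `window` generations
    falls below `threshold` (window size and threshold are assumptions)."""
    if len(indicator_history) < window:
        return False
    return pvariance(indicator_history[-window:]) < threshold

# Synthetic hypervolume-like indicator values over generations:
history = []
for gen, hv in enumerate([0.2, 0.5, 0.7, 0.80, 0.805, 0.806, 0.806, 0.806]):
    history.append(hv)
    if converged(history):
        print(f"stop at generation {gen}")
        break
```

The paper's full method combines this with a trend-stagnation test, since a slowly but steadily improving indicator can have small windowed variance without having converged.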
NASA Astrophysics Data System (ADS)
Fahimi, Farzad; Yaseen, Zaher Mundher; El-shafie, Ahmed
2017-05-01
Since the middle of the twentieth century, artificial intelligence (AI) models have been used widely in engineering and science problems. Water resource variable modeling and prediction are among the most challenging issues in water engineering. The artificial neural network (ANN) is a common approach used to tackle this problem through viable and efficient models. Numerous ANN models have been successfully developed to achieve more accurate results. In the current review, different ANN models in water resource applications and hydrological variable predictions are reviewed and outlined. In addition, recent hybrid models and their structures, input preprocessing, and optimization techniques are discussed, and the results are compared with similar previous studies. Moreover, to achieve a comprehensive view of the literature, many articles that applied ANN models together with other techniques are included. Consequently, coupling procedures, model evaluation, and performance comparison of hybrid models with conventional ANN models are assessed, as well as the taxonomy and structure of hybrid ANN models. Finally, current challenges and recommendations for future research are indicated and new hybrid approaches are proposed.
Application of Central Upwind Scheme for Solving Special Relativistic Hydrodynamic Equations
Yousaf, Muhammad; Ghaffar, Tayabia; Qamar, Shamsul
2015-01-01
The accurate modeling of various features in high energy astrophysical scenarios requires the solution of the Einstein equations together with those of special relativistic hydrodynamics (SRHD). Such models are more complicated than the non-relativistic ones due to the nonlinear relations between the conserved and state variables. A high-resolution shock-capturing central upwind scheme is implemented to solve the given set of equations. The proposed technique uses the precise information of local propagation speeds to avoid the excessive numerical diffusion. The second order accuracy of the scheme is obtained with the use of MUSCL-type initial reconstruction and Runge-Kutta time stepping method. After a discussion of the equations solved and of the techniques employed, a series of one and two-dimensional test problems are carried out. To validate the method and assess its accuracy, the staggered central and the kinetic flux-vector splitting schemes are also applied to the same model. The scheme is robust and efficient. Its results are comparable to those obtained from the sophisticated algorithms, even in the case of highly relativistic two-dimensional test problems. PMID:26070067
Development and demonstration of an on-board mission planner for helicopters
NASA Technical Reports Server (NTRS)
Deutsch, Owen L.; Desai, Mukund
1988-01-01
Mission management tasks can be distributed within a planning hierarchy, where each level of the hierarchy addresses a scope of action, an associated time scale or planning horizon, and requirements for plan generation response time. The current work is focused on the far-field planning subproblem, with a scope and planning horizon encompassing the entire mission and with a required response time of about two minutes. The far-field planning problem is posed as a constrained optimization problem, and algorithms and structural organizations are proposed for the solution. Algorithms are implemented in a developmental environment, and performance is assessed with respect to optimality and feasibility for the intended application and in comparison with alternative algorithms. This is done for the three major components of far-field planning: goal planning, waypoint path planning, and timeline management. It appears feasible to meet performance requirements on a 10 MIPS flyable processor (dedicated to far-field planning) using a heuristically guided simulated annealing technique for the goal planner, a modified A* search for the waypoint path planner, and a speed scheduling technique developed for this project.
Proceedings of the Sixth NASA Langley Formal Methods (LFM) Workshop
NASA Technical Reports Server (NTRS)
Rozier, Kristin Yvonne (Editor)
2008-01-01
Today's verification techniques are hard-pressed to scale with the ever-increasing complexity of safety critical systems. Within the field of aeronautics alone, we find the need for verification of algorithms for separation assurance, air traffic control, auto-pilot, Unmanned Aerial Vehicles (UAVs), adaptive avionics, automated decision authority, and much more. Recent advances in formal methods have made verifying more of these problems realistic. Thus we need to continually re-assess what we can solve now and identify the next barriers to overcome. Only through an exchange of ideas between theoreticians and practitioners from academia to industry can we extend formal methods for the verification of ever more challenging problem domains. This volume contains the extended abstracts of the talks presented at LFM 2008: The Sixth NASA Langley Formal Methods Workshop held on April 30 - May 2, 2008 in Newport News, Virginia, USA. The topics of interest that were listed in the call for abstracts were: advances in formal verification techniques; formal models of distributed computing; planning and scheduling; automated air traffic management; fault tolerance; hybrid systems/hybrid automata; embedded systems; safety critical applications; safety cases; accident/safety analysis.
Development of parallel algorithms for electrical power management in space applications
NASA Technical Reports Server (NTRS)
Berry, Frederick C.
1989-01-01
The application of parallel techniques for electrical power system analysis is discussed. The Newton-Raphson method of load flow analysis was used along with the decomposition-coordination technique to perform load flow analysis. The decomposition-coordination technique enables tasks to be performed in parallel by partitioning the electrical power system into independent local problems. Each independent local problem represents a portion of the total electrical power system on which a load flow analysis can be performed. The load flow analysis is performed on these partitioned elements by using the Newton-Raphson load flow method. These independent local problems produce results for voltage and power which can then be passed to the coordinator portion of the solution procedure. The coordinator problem uses the results of the local problems to determine whether any correction is needed on the local problems. The coordinator problem is also solved by an iterative method much like the local problems, again using the Newton-Raphson method. Therefore, each iteration at the coordination level results in new values for the local problems, and the local problems must be solved again along with the coordinator problem until convergence conditions are met.
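The Newton-Raphson step at the heart of each local problem can be sketched on a tiny nonlinear system (illustrative only; the actual load-flow mismatch equations and the decomposition-coordination layering are not reproduced, and the test system below is hypothetical):

```python
def newton_2d(f, jac, x, iters=20, tol=1e-10):
    """Newton-Raphson for a 2-equation system: x <- x - J(x)^{-1} f(x)."""
    for _ in range(iters):
        f1, f2 = f(x)
        if abs(f1) < tol and abs(f2) < tol:
            break
        (a, b), (c, d) = jac(x)
        det = a * d - b * c
        dx1 = (d * f1 - b * f2) / det   # solve the 2x2 system J dx = f
        dx2 = (a * f2 - c * f1) / det
        x = (x[0] - dx1, x[1] - dx2)
    return x

# Hypothetical mismatch equations with a known root at (1, 2):
f = lambda x: (x[0] ** 2 + x[1] - 3.0, x[0] + x[1] ** 2 - 5.0)
jac = lambda x: ((2 * x[0], 1.0), (1.0, 2 * x[1]))
root = newton_2d(f, jac, (0.5, 1.5))
print(root)
```

In the full scheme, each partition runs such an iteration on its own mismatch equations in parallel, and the coordinator applies the same method to the boundary variables linking the partitions.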
Refined genetic algorithm -- Economic dispatch example
DOE Office of Scientific and Technical Information (OSTI.GOV)
Sheble, G.B.; Brittig, K.
1995-02-01
A genetic-based algorithm is used to solve an economic dispatch (ED) problem. The algorithm utilizes payoff information of prospective solutions to evaluate optimality. Thus, the constraints of classical Lagrangian techniques on unit curves are eliminated. Using an economic dispatch problem as a basis for comparison, several different techniques which enhance program efficiency and accuracy, such as mutation prediction, elitism, interval approximation and penalty factors, are explored. Two unique genetic algorithms are also compared. The results are verified for a sample problem using a classical technique.
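A toy genetic algorithm for a two-unit dispatch illustrates the flavor of the approach (the cost curves, demand, limits, and GA settings are all assumptions; of the enhancements listed above, only elitism and a penalty factor are sketched here):

```python
import random
random.seed(1)

DEMAND = 400.0  # MW shared by two units; p2 = DEMAND - p1 enforces the balance

def cost(p1):
    """Total fuel cost of a dispatch, with a penalty factor for limit violations."""
    p2 = DEMAND - p1
    c = 0.004 * p1**2 + 5.0 * p1 + 0.006 * p2**2 + 4.0 * p2  # assumed quadratics
    penalty = 1e3 * (max(0.0, p1 - 350.0) + max(0.0, -p1))   # unit 1 limits 0..350
    return c + penalty

pop = [random.uniform(0, 350) for _ in range(30)]
for _ in range(200):
    pop.sort(key=cost)
    elite = pop[:5]                                  # elitism: keep the best
    children = []
    while len(children) < 25:
        a, b = random.sample(elite + pop[5:15], 2)   # select among top solutions
        child = 0.5 * (a + b) + random.gauss(0, 5)   # blend crossover + mutation
        children.append(min(350.0, max(0.0, child)))
    pop = elite + children

best = min(pop, key=cost)
print(best)
```

For these assumed quadratics the Lagrangian (equal incremental cost) optimum is p1 = 190 MW, so the GA's answer can be checked against the classical technique, as the abstract describes.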
Espín Balbino, Jaime; Brosa Riestra, Max; Oliva Moreno, Juan; Trapero-Bertran, Marta
2015-01-01
Economic evaluation of health care interventions has become a support tool for decision making on the pricing and reimbursement of new health interventions. The increasingly extensive application of these techniques has led to the identification of particular situations in which, for various reasons, it may be reasonable to take special considerations into account when applying the general principles of economic evaluation. In this article, which closes a series of three, we discuss, using the Metaplan technique, the economic evaluation of health interventions in special situations, such as rare diseases and end-of-life treatments, as well as the consideration of externalities in assessments, and we finally point out some research areas for addressing the main problems identified in these fields.
NASA Technical Reports Server (NTRS)
Hughes, T. H.; Dillion, A. C., III; White, J. R., Jr.; Drummond, S. E., Jr.; Hooks, W. G.
1975-01-01
Because of the volume of coal produced by strip mining, the proximity of mining operations, and the diversity of mining methods (e.g. contour stripping, area stripping, multiple seam stripping, and augering, as well as underground mining), the Warrior Coal Basin seemed best suited for initial studies on the physical impact of strip mining in Alabama. Two test sites, (Cordova and Searles) representative of the various strip mining techniques and environmental problems, were chosen for intensive studies of the correlation between remote sensing and ground truth data. Efforts were eventually concentrated in the Searles Area, since it is more accessible and offers a better opportunity for study of erosional and depositional processes than the Cordova Area.
[Current status and future perspectives of hepatocyte transplantation].
Pareja, Eugenia; Cortés, Miriam; Gómez-Lechón, M José; Maupoey, Javier; San Juan, Fernando; López, Rafael; Mir, Jose
2014-02-01
The imbalance between the number of potential beneficiaries and available organs has prompted the search for new therapeutic alternatives, such as hepatocyte transplantation (HT). Even though this is a treatment option for these patients, the lack of unanimity of criteria regarding indications and technique, differing cryopreservation protocols, and the different methodologies used to assess the response to this therapy highlight the need for a Consensus Conference to standardize criteria and consider future strategies to improve the technique and optimize the results. Our aim is to review and update the current state of hepatocyte transplantation, emphasizing future research that attempts to solve the problems and improve the results of this treatment. Copyright © 2013 AEC. Published by Elsevier España. All rights reserved.
An analytical and experimental evaluation of a Fresnel lens solar concentrator
NASA Technical Reports Server (NTRS)
Hastings, L. J.; Allums, S. A.; Cosby, R. M.
1976-01-01
An analytical and experimental evaluation of line-focusing Fresnel lenses with application potential in the 200 to 370 C range is reported. Analytical techniques were formulated to assess the solar transmission and imaging properties of a grooves-down lens. Experimentation was based on a 56 cm wide, f/1.0 lens. A sun-tracking heliostat provided a nonmoving solar source. Measured data indicated more spreading at the profile base than analytically predicted, resulting in a peak concentration 18 percent lower than the computed peak of 57. The measured and computed transmittances were 85 and 87 percent, respectively. Preliminary testing with a subsequent lens indicated that modified manufacturing techniques corrected the profile spreading problem and should enable improved analytical-experimental correlation.
An analytical and experimental evaluation of the plano-cylindrical Fresnel lens solar concentrator
NASA Technical Reports Server (NTRS)
Hastings, L. J.; Allums, S. L.; Cosby, R. M.
1976-01-01
Plastic Fresnel lenses for solar concentration are attractive because of potential for low-cost mass production. An analytical and experimental evaluation of line-focusing Fresnel lenses with application potential in the 200 to 370 C range is reported. Analytical techniques were formulated to assess the solar transmission and imaging properties of a grooves-down lens. Experimentation was based primarily on a 56 cm-wide lens with f-number 1.0. A sun-tracking heliostat provided a non-moving solar source. Measured data indicated more spreading at the profile base than analytically predicted. The measured and computed transmittances were 85 and 87% respectively. Preliminary testing with a second lens (1.85 m) indicated that modified manufacturing techniques corrected the profile spreading problem.
Iterative methods for tomography problems: implementation to a cross-well tomography problem
NASA Astrophysics Data System (ADS)
Karadeniz, M. F.; Weber, G. W.
2018-01-01
The velocity distribution between two boreholes is reconstructed by cross-well tomography, which is commonly used in geology. In this paper, the iterative methods Kaczmarz’s algorithm, the algebraic reconstruction technique (ART), and the simultaneous iterative reconstruction technique (SIRT) are applied to a specific cross-well tomography problem. The convergence of these methods and their CPU times for the cross-well tomography problem are compared. Furthermore, the three methods are compared for different tolerance values.
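The row-action solvers named above can be illustrated on a toy consistent system. The matrix below is an invented stand-in for a ray-path length matrix, and the slowness vector is assumed, not data from the paper; ART is essentially Kaczmarz's algorithm with a relaxation parameter, so only Kaczmarz and SIRT are shown:

```python
import numpy as np

def kaczmarz(A, b, iterations=2000):
    """Kaczmarz's algorithm: cyclically project the iterate onto the
    hyperplane defined by one row of the system at a time."""
    x = np.zeros(A.shape[1])
    for k in range(iterations):
        i = k % A.shape[0]
        a = A[i]
        x = x + (b[i] - a @ x) / (a @ a) * a
    return x

def sirt(A, b, iterations=2000, relax=0.5):
    """SIRT: compute all row corrections simultaneously and apply
    their relaxed average in a single update."""
    x = np.zeros(A.shape[1])
    row_norms = np.sum(A * A, axis=1)
    for _ in range(iterations):
        x = x + relax * (A.T @ ((b - A @ x) / row_norms)) / A.shape[0]
    return x

# Toy 'cross-well' system: 4 rays through 3 cells (assumed, consistent setup)
A = np.array([[1.0, 0.5, 0.0],
              [0.2, 1.0, 0.3],
              [0.0, 0.4, 1.0],
              [0.6, 0.0, 0.8]])
x_true = np.array([2.0, 1.0, 3.0])   # assumed cell slownesses
b = A @ x_true                        # synthetic travel times
```

For a consistent system both iterations converge to the same solution; the per-iteration cost and convergence speed differences are what the paper's CPU-time comparison measures.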
Multiobjective Resource-Constrained Project Scheduling with a Time-Varying Number of Tasks
Abello, Manuel Blanco
2014-01-01
In resource-constrained project scheduling (RCPS) problems, ongoing tasks are restricted to utilizing a fixed number of resources. This paper investigates a dynamic version of the RCPS problem where the number of tasks varies in time. Our previous work investigated a technique called mapping of task IDs for centroid-based approach with random immigrants (McBAR), which was used to solve the dynamic problem. However, the solution-searching ability of McBAR was investigated over only a few instances of the dynamic problem. As a consequence, only a small number of characteristics of McBAR under the dynamics of the RCPS problem were found. Further, only a few techniques were compared to McBAR with respect to its solution-searching ability for solving the dynamic problem. In this paper, (a) the significance of the subalgorithms of McBAR is investigated by comparing McBAR to several other techniques; and (b) the scope of investigation in the previous work is extended. In particular, McBAR is compared to a technique called the Estimation of Distribution Algorithm (EDA). As with McBAR, EDA is applied to solve the dynamic problem, an application that is unique in the literature. PMID:24883398
NASA Astrophysics Data System (ADS)
Proux, Denys; Segond, Frédérique; Gerbier, Solweig; Metzger, Marie Hélène
Hospital-acquired infections (HAI) are a real burden for doctors and risk surveillance experts. The impact on patients' health and the related healthcare cost is very significant and a major concern even for rich countries. Furthermore, the data required to evaluate the threat are generally not available to experts, which prevents fast reaction. However, recent advances in computational intelligence techniques, such as information extraction, risk pattern detection in documents, and decision support systems, now make it possible to address this problem.
NASA Astrophysics Data System (ADS)
Pelikan, Erich; Vogelsang, Frank; Tolxdorff, Thomas
1996-04-01
The texture-based segmentation of x-ray images of focal bone lesions using topological maps is introduced. Texture characteristics are described by image-point correlation of feature images to feature vectors. For the segmentation, the topological map is labeled using an improved labeling strategy. Results of the technique are demonstrated on original and synthetic x-ray images and quantified with the aid of quality measures. In addition, a classifier-specific contribution analysis is applied for assessing the feature space.
NASA Technical Reports Server (NTRS)
Langland, R. A.; Stephens, P. L.; Pihos, G. G.
1980-01-01
The techniques used for ingesting SEASAT-A SASS wind retrievals into the existing operational software are described. The intent is to assess the impact of SEASAT data on the marine wind fields produced by the global marine wind/sea level pressure analysis. This analysis is performed on a 2.5 deg latitude/longitude global grid which executes at three-hourly time increments. Wind fields with and without SASS winds are being compared. The problems of data volume reduction and aliased wind retrieval ambiguity are treated.
A survey of particle contamination in electronic devices
NASA Technical Reports Server (NTRS)
Adolphsen, J. W.; Kagdis, W. A.; Timmins, A. R.
1976-01-01
The experiences of a number of National Aeronautics and Space Administration (NASA) and Space and Missile Systems Organization (SAMSO) contractors with particle contamination are given, along with the methods used for its prevention and detection. The survey evaluates the bases for the different schemes, assesses their effectiveness, and identifies the problems associated with each. It recommends specific short-range tests or approaches appropriate to individual part-type categories and recommends that specific tasks be initiated to refine techniques and to resolve technical and application facets of promising solutions.
Management. A continuing bibliography with indexes. [March 1980
NASA Technical Reports Server (NTRS)
1980-01-01
This bibliography cites 604 reports, articles, and other documents introduced into the NASA scientific and technical information system in 1979 covering the management of research and development, contracts, production, logistics, personnel, safety, reliability and quality control. Program, project, and systems management; management policy, philosophy, tools, and techniques; decision making processes for managers; technology assessment; management of urban problems; and information for managers on Federal resources, expenditures, financing, and budgeting are also covered. Abstracts are provided as well as subject, personal author, and corporate source indexes.
Three-Dimensional Profiles Using a Spherical Cutting Bit: Problem Solving in Practice
ERIC Educational Resources Information Center
Ollerton, Richard L.; Iskov, Grant H.; Shannon, Anthony G.
2002-01-01
An engineering problem concerned with relating the coordinates of the centre of a spherical cutting tool to the actual cutting surface leads to a potentially rich example of problem-solving techniques. Basic calculus, Lagrange multipliers and vector calculus techniques are employed to produce solutions that may be compared to better understand…
Holbrook, Jane
2010-01-01
Objective. To assess pharmacy students' attitudes towards a blended-learning pharmacokinetics course. Design. Narrated visual presentations and animations that illustrated kinetic processes and guided students through the use of software programs used for calculations were created. Other learning techniques used included online self-assessment quizzes, practice problem sets, and weekly face-to-face problem-solving tutorials. Assessment. A precourse questionnaire to assess students' level of enthusiasm towards the blended-learning course and to solicit any concerns they had was administered at the beginning of the course. A postcourse questionnaire that included the same 4 Likert-scale items from the precourse questionnaire and follow-up open-ended questions was administered. Individual changes in level of enthusiasm were compared for individuals who completed both the precourse and postcourse questionnaire. Students' concerns about the blended method of learning had decreased postcourse while their enthusiasm for the benefits of blended learning had increased. Conclusion. Students' initial concerns about the blended learning experience were focused on their ability to communicate with the instructor about the online components, but shifted to their own time management skills at the end of the course. Face-to-face interactions with each other and with the instructor were more highly rated than online interactions in this course. PMID:20798797
Focusing on the Golden Ball Metaheuristic: An Extended Study on a Wider Set of Problems
Osaba, E.; Diaz, F.; Carballedo, R.; Onieva, E.; Perallos, A.
2014-01-01
Nowadays, the development of new metaheuristics for solving optimization problems is a topic of interest in the scientific community. A large number of techniques of this kind can be found in the literature, including many recently proposed ones such as the artificial bee colony and the imperialist competitive algorithm. This paper focuses on one recently published technique, called Golden Ball (GB). The GB is a multiple-population metaheuristic based on soccer concepts. Although it was designed to solve combinatorial optimization problems, until now it has only been tested with two simple routing problems: the traveling salesman problem and the capacitated vehicle routing problem. In this paper, the GB is applied to four different combinatorial optimization problems. Two of them are routing problems that are more complex than the previously used ones: the asymmetric traveling salesman problem and the vehicle routing problem with backhauls. Additionally, one constraint satisfaction problem (the n-queens problem) and one combinatorial design problem (the one-dimensional bin packing problem) have also been used. The outcomes obtained by the GB are compared with those obtained by two different genetic algorithms and two distributed genetic algorithms. Additionally, two statistical tests are conducted to compare these results. PMID:25165742
Enhanced algorithms for stochastic programming
DOE Office of Scientific and Technical Information (OSTI.GOV)
Krishna, Alamuru S.
1993-09-01
In this dissertation, we present some of the recent advances made in solving two-stage stochastic linear programming problems of large size and complexity. Decomposition and sampling are two fundamental components of techniques to solve stochastic optimization problems. We describe improvements to the current techniques in both these areas. We studied different ways of using importance sampling techniques in the context of stochastic programming, by varying the choice of approximation functions used in this method. We have concluded that approximating the recourse function by a computationally inexpensive piecewise-linear function is highly efficient. This reduces the problem from finding the mean of a computationally expensive function to finding that of a computationally inexpensive one. We then implemented various variance reduction techniques to estimate the mean of the piecewise-linear function. This method achieved similar variance reductions in orders of magnitude less time than applying variance-reduction techniques directly to the given problem. In solving a stochastic linear program, the expected value problem is usually solved before the stochastic problem, both to provide a starting point and to speed up the algorithm by making use of the information obtained from the solution of the expected value problem. We have devised a new decomposition scheme to improve the convergence of this algorithm.
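The idea of pairing an expensive function with a cheap piecewise-linear approximation whose mean is known exactly can be sketched as a control-variate estimator. The functions, knots, and sampling distribution below are invented for illustration; the dissertation's recourse functions are far more complex:

```python
import random, math

def expensive(x):
    # stand-in for a costly recourse function (assumed form)
    return math.exp(0.3 * x) + 0.1 * math.sin(5 * x)

def cheap_pwl(x):
    # piecewise-linear approximation of `expensive` (two segments, assumed knots)
    return 1.0 + 0.35 * x if x < 1.0 else 1.35 + 0.55 * (x - 1.0)

def estimate(n=20000, seed=7):
    """Return (naive Monte Carlo estimate, control-variate estimate)
    of E[expensive(X)] for X uniform on [0, 2]."""
    rng = random.Random(seed)
    xs = [rng.uniform(0.0, 2.0) for _ in range(n)]
    naive = sum(expensive(x) for x in xs) / n
    # The mean of the piecewise-linear function is exact and trivial:
    # average of the two segment means over [0, 1] and [1, 2].
    exact_cheap_mean = ((1.0 + 1.35) / 2 + (1.35 + 1.90) / 2) / 2  # = 1.4
    # Correct the naive estimate by the sampling error observed on the
    # cheap function, which is strongly correlated with the expensive one.
    cv = naive - (sum(cheap_pwl(x) for x in xs) / n - exact_cheap_mean)
    return naive, cv

naive, cv = estimate()
```

Both estimators are unbiased; the variance reduction comes from the correlation between the two functions, which is the mechanism the dissertation exploits at much larger scale.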
Tolchard, Barry
2017-06-01
There is evidence supporting the use of cognitive-behavioral therapy (CBT) in the treatment of problem gambling. Despite this, little is known about how CBT works and which particular approach is most effective. This paper aims to synthesize the evidence for current CBT and propose a more unified approach to treatment. A literature review and narrative synthesis of the current research evidence of CBT for the treatment of problem gambling was conducted, focusing on the underlying mechanisms within the treatment approach. Several CBT approaches were critiqued. These can be divided into forms of exposure therapy (including aversion techniques, systematic desensitization, and other behavioral experiments), approaches focusing on cognitive restructuring techniques (such as reinforcement of nongambling activity, use of diaries, motivational enhancement, and audio-playback techniques), and third-wave techniques, including mindfulness. Findings from this synthesis in relation to the treatment actions are reported. The debate surrounding the treatment of problem gambling has been conducted as an either/or rather than a both/and discourse. This paper proposes a new, unified approach to the treatment of problem gambling that incorporates the best elements of both exposure and cognitive restructuring techniques, alongside techniques borrowed from mindfulness and other CBT approaches.
Assessing the Rayleigh Intensity Remote Leak Detection Technique
NASA Technical Reports Server (NTRS)
Clements, Sandra
2001-01-01
Remote sensing technologies are being considered for efficient, low cost gas leak detection. An exploratory project to identify and evaluate remote sensing technologies for application to gas leak detection is underway. During Phase 1 of the project, completed last year, eleven specific techniques were identified for further study. One of these, the Rayleigh Intensity technique, would make use of changes in the light scattered off of gas molecules to detect and locate a leak. During the 10-week Summer Faculty Fellowship Program, the scatter of light off of gas molecules was investigated. The influence of light scattered off of aerosols suspended in the atmosphere was also examined to determine if this would adversely affect leak detection. Results of this study indicate that in unconditioned air, it will be difficult, though perhaps not impossible, to distinguish between a gas leak and natural variations in the aerosol content of the air. Because information about the particle size distribution in clean room environments is incomplete, the applicability in clean rooms is uncertain though more promising than in unconditioned environments. It is suggested that problems caused by aerosols may be overcome by using the Rayleigh Intensity technique in combination with another remote sensing technique, the Rayleigh Doppler technique.
Evolutionary optimization methods for accelerator design
NASA Astrophysics Data System (ADS)
Poklonskiy, Alexey A.
Many problems from the fields of accelerator physics and beam theory can be formulated as optimization problems and, as such, solved using optimization methods. Despite the growing efficiency of optimization methods, the adoption of modern optimization techniques in these fields is rather limited. Evolutionary Algorithms (EAs) form a relatively new and actively developed family of optimization methods. They possess many attractive features such as: ease of implementation, modest requirements on the objective function, a good tolerance to noise, robustness, and the ability to perform a global search efficiently. In this work we study the application of EAs to problems from accelerator physics and beam theory. We review the most commonly used methods of unconstrained optimization and describe GATool, the evolutionary algorithm and software package used in this work, in detail. Then we use a set of test problems to assess its performance in terms of computational resources, quality of the obtained result, and the tradeoff between them. We justify the choice of GATool as a heuristic method to generate cutoff values for the COSY-GO rigorous global optimization package for the COSY Infinity scientific computing package. We design the model of their mutual interaction and demonstrate that the quality of the result obtained by GATool increases as the information about the search domain is refined, which supports the usefulness of this model. We discuss GATool's performance on problems suffering from static and dynamic noise and study useful strategies of GATool parameter tuning for these and other difficult problems. We review the challenges of constrained optimization with EAs and methods commonly used to overcome them. We describe REPA, a new constrained optimization method based on repairing, in detail, including the properties of its two repairing techniques: REFIND and REPROPT.
We assess REPROPT's performance on the standard constrained optimization test problems for EAs with a variety of different configurations and suggest optimal default parameter values based on the results. Then we study the performance of the REPA method on the same set of test problems and compare the obtained results with those of several commonly used constrained optimization methods for EAs. Based on the obtained results, particularly on the outstanding performance of REPA on a test problem that presents significant difficulty for other reviewed EAs, we conclude that the proposed method is useful and competitive. We discuss REPA parameter tuning for difficult problems and critically review some of the problems from the de-facto standard test problem set for constrained optimization with EAs. In order to demonstrate the practical usefulness of the developed method, we study several problems of accelerator design and demonstrate how they can be solved with EAs. These problems include a simple accelerator design problem (design a quadrupole triplet to be stigmatically imaging, find all possible solutions), a complex real-life accelerator design problem (an optimization of the front end section for the future neutrino factory), and a problem of the normal form defect function optimization which is used to rigorously estimate the stability of the beam dynamics in circular accelerators. The positive results we obtained suggest that the application of EAs to problems from accelerator theory can be very beneficial and has large potential. The developed optimization scenarios and tools can be used to approach similar problems.
Plazzotta, Fernando; Otero, Carlos; Luna, Daniel; de Quiros, Fernan Gonzalez Bernaldo
2013-01-01
Physicians do not always keep the problem list accurate, complete, and updated. The objective was to analyze natural language processing (NLP) techniques and inference rules as strategies to maintain the completeness and accuracy of the problem list in EHRs. A non-systematic literature review was conducted in PubMed, covering the last 10 years. Strategies to maintain the EHR problem list were analyzed in two directions: inputting problems into, and removing problems from, the problem list. NLP and inference rules have acceptable performance for inputting problems into the problem list; no studies using these techniques for removing problems have been published. Conclusion: both tools, NLP and inference rules, have shown acceptable results for maintaining the completeness and accuracy of the problem list.
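A toy illustration of the inference-rule strategy for inputting problems: map mentions in free text to candidate problem-list entries. The rules, terms, and note below are invented for illustration and are far simpler than the NLP systems the review covers:

```python
import re

# Invented inference rules: a mention pattern implies a candidate
# problem-list entry (e.g. a medication implies its usual indication).
RULES = {
    r"\bmetformin\b": "diabetes mellitus (inferred from medication)",
    r"\blisinopril\b": "hypertension (inferred from medication)",
    r"\bshortness of breath\b": "dyspnea",
}

def candidate_problems(note):
    """Return candidate problem-list entries suggested by the note text."""
    found = []
    for pattern, problem in RULES.items():
        if re.search(pattern, note, flags=re.IGNORECASE):
            found.append(problem)
    return found

note = "Patient reports shortness of breath; continues Metformin daily."
candidates = candidate_problems(note)
```

Real systems add negation handling, terminology mapping (e.g. to SNOMED CT), and clinician confirmation before anything reaches the problem list; this sketch shows only the rule-matching core.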
Historical shoreline mapping (I): improving techniques and reducing positioning errors
Thieler, E. Robert; Danforth, William W.
1994-01-01
A critical need exists among coastal researchers and policy-makers for a precise method to obtain shoreline positions from historical maps and aerial photographs. A number of methods that vary widely in approach and accuracy have been developed to meet this need. None of the existing methods, however, address the entire range of cartographic and photogrammetric techniques required for accurate coastal mapping. Thus, their application to many typical shoreline mapping problems is limited. In addition, no shoreline mapping technique provides an adequate basis for quantifying the many errors inherent in shoreline mapping using maps and air photos. As a result, current assessments of errors in air photo mapping techniques generally (and falsely) assume that errors in shoreline positions are represented by the sum of a series of worst-case assumptions about digitizer operator resolution and ground control accuracy. These assessments also ignore altogether other errors that commonly approach ground distances of 10 m. This paper provides a conceptual and analytical framework for improved methods of extracting geographic data from maps and aerial photographs. We also present a new approach to shoreline mapping using air photos that revises and extends a number of photogrammetric techniques. These techniques include (1) developing spatially and temporally overlapping control networks for large groups of photos; (2) digitizing air photos for use in shoreline mapping; (3) preprocessing digitized photos to remove lens distortion and film deformation effects; (4) simultaneous aerotriangulation of large groups of spatially and temporally overlapping photos; and (5) using a single-ray intersection technique to determine geographic shoreline coordinates and express the horizontal and vertical error associated with a given digitized shoreline. 
As long as historical maps and air photos are used in studies of shoreline change, there will be a considerable amount of error (on the order of several meters) present in shoreline position and rate-of-change calculations. The techniques presented in this paper, however, provide a means to reduce and quantify these errors so that realistic assessments of the technological noise (as opposed to geological noise) in geographic shoreline positions can be made.
A numerical projection technique for large-scale eigenvalue problems
NASA Astrophysics Data System (ADS)
Gamillscheg, Ralf; Haase, Gundolf; von der Linden, Wolfgang
2011-10-01
We present a new numerical technique to solve large-scale eigenvalue problems. It is based on the projection technique used in strongly correlated quantum many-body systems, where first an effective approximate model of smaller complexity is constructed by projecting out high-energy degrees of freedom, and the resulting model is then solved by some standard eigenvalue solver. Here we introduce a generalization of this idea in which both steps are performed numerically and which, in contrast to the standard projection technique, converges in principle to the exact eigenvalues. This approach is applicable not just to eigenvalue problems encountered in many-body systems but also to those in other areas of research that result in large-scale eigenvalue problems for matrices which have, roughly speaking, a pronounced dominant diagonal part. We will present detailed studies of the approach guided by two many-body models.
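The basic one-shot projection step can be sketched for a matrix with a dominant diagonal; the model matrix below is invented, and this shows only the standard projection, not the paper's convergent generalization:

```python
import numpy as np

def projected_ground_state(H, k):
    """Approximate the lowest eigenvalue of symmetric H by projecting
    onto the k basis states with the smallest diagonal entries
    (the low-'energy' subspace), then solving the small problem."""
    keep = np.argsort(np.diag(H))[:k]
    H_eff = H[np.ix_(keep, keep)]          # effective model of smaller complexity
    return np.linalg.eigvalsh(H_eff)[0]

# Assumed model: dominant diagonal (levels 0..10) plus weak symmetric coupling
rng = np.random.default_rng(3)
n = 40
off = 0.1 * rng.standard_normal((n, n))
H = np.diag(np.linspace(0.0, 10.0, n)) + (off + off.T) / 2

exact = np.linalg.eigvalsh(H)[0]
approx = projected_ground_state(H, 10)
```

By Cauchy interlacing, the projected eigenvalue is always an upper bound on the exact one; the paper's numerical generalization is designed to close the remaining gap systematically.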
Innovation and behavioral flexibility in wild redfronted lemurs (Eulemur rufifrons).
Huebner, Franziska; Fichtel, Claudia
2015-05-01
Innovations and problem-solving abilities can provide animals with important ecological advantages as they allow individuals to deal with novel social and ecological challenges. Innovation is a solution to a novel problem or a novel solution to an old problem, with the latter being especially difficult. Finding a new solution to an old problem requires individuals to inhibit previously applied solutions, to invent new strategies, and to behave flexibly. We examined the role of experience in the cognitive flexibility needed to innovate and to find new problem-solving solutions, using an artificial feeding task in wild redfronted lemurs (Eulemur rufifrons). Four groups of lemurs were tested with feeding boxes, each offering three different techniques to extract food, with only one technique being available at a time. After the subjects learned a technique, this solution was no longer successful and subjects had to invent a new technique. For the first transition, between tasks 1 and 2, subjects could rely on their experience of the previous technique to solve task 2. For the second transition, subjects had to inhibit the previously learned technique to learn the new task 3. Tasks 1 and 2 were solved by most subjects, whereas task 3 was solved by only a few subjects. In this task, besides behavioral flexibility, persistence (constant trying) in particular was important for individual success during innovation. Thus, wild strepsirrhine primates are able to innovate flexibly, suggesting a general ecological relevance of behavioral flexibility and persistence during innovation and problem solving across all primates.
Mental Health Nursing in Greece: Nursing Diagnoses and Interventions in Major Depression.
Prokofieva, Margarita; Koukia, Evmorfia; Dikeos, Dimitris
2016-08-01
The aim of the study was to assess nursing diagnoses and nursing interventions that were accordingly implemented during the care of inpatients with major depression in Greece. Twelve nurses working in three major psychiatric hospitals were recruited. Semi-structured interviews were used and audio-recorded data indicated that risk for suicide, social isolation, low self-esteem, sleep problems, and imbalanced nutrition are the nursing diagnoses most commonly reported. Establishing trust and rapport is the primary intervention, followed by specific interventions according to each diagnosis and the individualized care plan. The findings of the study also highlight the need for nursing training in order to teach nurses initial assessment procedures and appropriate evidence-based intervention techniques.
NASA Technical Reports Server (NTRS)
Schramm, Harry F.; Sullivan, Kenneth W.
1991-01-01
An evaluation of NASA's Marshall Space Flight Center (MSFC) strategy to implement Total Quality Management (TQM) in the Advanced Solid Rocket Motor (ASRM) Project is presented. The evaluation of the implementation strategy reflected the Civil Service personnel perspective at the project level. The external and internal environments at MSFC were analyzed for their effects on the ASRM TQM strategy. Organizational forms, cultures, management systems, problem solving techniques, and training were assessed for their influence on the implementation strategy. The influence of ASRM's effort was assessed relative to its impact on mature projects as well as future projects at MSFC.
Van der Fels-Klerx, Ine H J; Goossens, Louis H J; Saatkamp, Helmut W; Horst, Suzan H S
2002-02-01
This paper presents a protocol for a formal expert judgment process using a heterogeneous expert panel aimed at the quantification of continuous variables. The emphasis is on the process's requirements related to the nature of expertise within the panel, in particular the heterogeneity of both substantive and normative expertise. The process provides the opportunity for interaction among the experts so that they fully understand and agree upon the problem at hand, including qualitative aspects relevant to the variables of interest, prior to the actual quantification task. Individual experts' assessments on the variables of interest, cast in the form of subjective probability density functions, are elicited with a minimal demand for normative expertise. The individual experts' assessments are aggregated into a single probability density function per variable, thereby weighting the experts according to their expertise. Elicitation techniques proposed include the Delphi technique for the qualitative assessment task and the ELI method for the actual quantitative assessment task. Appropriately, the Classical model was used to weight the experts' assessments in order to construct a single distribution per variable. Applying this model, the experts' quality typically was based on their performance on seed variables. An application of the proposed protocol in the broad and multidisciplinary field of animal health is presented. Results of this expert judgment process showed that the proposed protocol in combination with the proposed elicitation and analysis techniques resulted in valid data on the (continuous) variables of interest. In conclusion, the proposed protocol for a formal expert judgment process aimed at the elicitation of quantitative data from a heterogeneous expert panel provided satisfactory results. Hence, this protocol might be useful for expert judgment studies in other broad and/or multidisciplinary fields of interest.
Dynamic Programming and Graph Algorithms in Computer Vision*
Felzenszwalb, Pedro F.; Zabih, Ramin
2013-01-01
Optimization is a powerful paradigm for expressing and solving problems in a wide range of areas, and has been successfully applied to many vision problems. Discrete optimization techniques are especially interesting, since by carefully exploiting problem structure they often provide non-trivial guarantees concerning solution quality. In this paper we briefly review dynamic programming and graph algorithms, and discuss representative examples of how these discrete optimization techniques have been applied to some classical vision problems. We focus on the low-level vision problem of stereo; the mid-level problem of interactive object segmentation; and the high-level problem of model-based recognition. PMID:20660950
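For the low-level stereo problem mentioned above, the textbook scanline dynamic program can be sketched as follows; the intensities, disparity range, and smoothness penalty are invented for illustration, and this is a generic formulation rather than any specific paper's:

```python
def scanline_disparity(left, right, max_disp=3, smooth=2):
    """1-D dynamic-programming stereo on a single scanline:
    data cost = |I_L(x) - I_R(x - d)|, plus a smoothness penalty
    proportional to the disparity change between neighboring pixels."""
    n = len(left)
    INF = float("inf")
    def data(x, d):
        return abs(left[x] - right[x - d]) if x - d >= 0 else INF
    # cost[x][d]: best cost of labeling pixels 0..x with disparity d at x
    cost = [[INF] * (max_disp + 1) for _ in range(n)]
    back = [[0] * (max_disp + 1) for _ in range(n)]
    for d in range(max_disp + 1):
        cost[0][d] = data(0, d)
    for x in range(1, n):
        for d in range(max_disp + 1):
            best_prev, best_d = min(
                (cost[x - 1][dp] + smooth * abs(d - dp), dp)
                for dp in range(max_disp + 1))
            cost[x][d] = data(x, d) + best_prev
            back[x][d] = best_d
    # Backtrack the globally optimal disparity path for the scanline
    d = min(range(max_disp + 1), key=lambda d: cost[n - 1][d])
    disp = [0] * n
    for x in range(n - 1, -1, -1):
        disp[x] = d
        d = back[x][d]
    return disp

# Right scanline is the left one shifted by 2 pixels (true disparity = 2)
left = [0, 0, 5, 9, 5, 0, 0, 3, 7, 3]
right = left[2:] + [0, 0]
disp = scanline_disparity(left, right)
```

Because the DP considers every disparity path jointly, the result is the exact minimizer of the 1-D energy, which is the kind of non-trivial guarantee on solution quality the abstract highlights.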
A dependency-based modelling mechanism for problem solving
NASA Technical Reports Server (NTRS)
London, P.
1978-01-01
The paper develops a technique of dependency net modeling which relies on an explicit representation of justifications for beliefs held by the problem solver. Using these justifications, the modeling mechanism is able to determine the relevant lines of inference to pursue during problem solving. Three particular problem-solving difficulties which may be handled by the dependency-based technique are discussed: (1) subgoal violation detection, (2) description binding, and (3) maintaining a consistent world model.
Rahaman, Mijanur; Pang, Chin-Tzong; Ishtyak, Mohd; Ahmad, Rais
2017-01-01
In this article, we introduce a perturbed system of generalized mixed quasi-equilibrium-like problems involving multi-valued mappings in Hilbert spaces. To calculate the approximate solutions of the perturbed system of generalized multi-valued mixed quasi-equilibrium-like problems, firstly we develop a perturbed system of auxiliary generalized multi-valued mixed quasi-equilibrium-like problems, and then by using the celebrated Fan-KKM technique, we establish the existence and uniqueness of solutions of the perturbed system of auxiliary generalized multi-valued mixed quasi-equilibrium-like problems. By deploying an auxiliary principle technique and an existence result, we formulate an iterative algorithm for solving the perturbed system of generalized multi-valued mixed quasi-equilibrium-like problems. Lastly, we study the strong convergence analysis of the proposed iterative sequences under monotonicity and some mild conditions. These results are new and generalize some known results in this field.
NASA Astrophysics Data System (ADS)
Kasprzyk, J. R.; Smith, R.; Raseman, W. J.; DeRousseau, M. A.; Dilling, L.; Ozekin, K.; Summers, R. S.; Balaji, R.; Livneh, B.; Rosario-Ortiz, F.; Sprain, L.; Srubar, W. V., III
2017-12-01
This presentation will report on three projects that used interactive workshops with stakeholders to develop problem formulations for Multi-Objective Evolutionary Algorithm (MOEA)-based decision support in diverse fields - water resources planning, water quality engineering under climate extremes, and sustainable materials design. When combined with a simulation model of a system, MOEAs use intelligent search techniques to provide new plans or designs. This approach is gaining increasing prominence in design and planning for environmental sustainability. To use this technique, a problem formulation - objectives and constraints (quantitative measures of performance) and decision variables (actions that can be modified to improve the system) - must be identified. Although critically important for MOEA effectiveness, the problem formulations are not always developed with stakeholders' interests in mind. To ameliorate this issue, project workshops were organized to improve the tool's relevance as well as collaboratively build problem formulations that can be used in applications. There were interesting differences among the projects, which altered the findings of each workshop. Attendees ranged from a group of water managers on the Front Range of Colorado, to water utility representatives from across the country, to a set of designers, academics, and trade groups. The extent to which the workshop participants were already familiar with simulation tools contributed to their willingness to accept the solutions that were generated using the tool. Moreover, in some instances, brainstorming new objectives to include within the MOEA expanded the scope of the problem formulation, relative to the initial conception of the researchers. Through describing results across a diversity of projects, the goal of this presentation is to report on how our approach may inform future decision support collaboration with a variety of stakeholders and sectors.
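At the core of any MOEA-based decision support of the kind described above is the Pareto dominance relation over the stakeholder-defined objectives. The sketch below shows that relation and a brute-force non-dominated filter; the candidate plans and objective names are illustrative, not from the workshops.

```python
# Minimal Pareto dominance filter (minimization). MOEAs rank candidate
# plans by exactly this dominance relation; the data are illustrative.

def dominates(a, b):
    """True if objective vector a dominates b (all <=, at least one <)."""
    return all(x <= y for x, y in zip(a, b)) and any(x < y for x, y in zip(a, b))

def pareto_front(points):
    return [p for p in points if not any(dominates(q, p) for q in points)]

# Example: (cost, environmental impact) for four candidate plans
plans = [(1.0, 5.0), (2.0, 3.0), (4.0, 1.0), (3.0, 4.0)]
front = pareto_front(plans)   # (3.0, 4.0) is dominated by (2.0, 3.0)
```

Adding a new stakeholder objective simply lengthens the vectors, which is why brainstorming objectives in a workshop directly reshapes the trade-off set the MOEA explores.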
Security Threat Assessment of an Internet Security System Using Attack Tree and Vague Sets
2014-01-01
Security threat assessment of the Internet security system has become a greater concern in recent years because of the progress and diversification of information technology. Traditionally, the failure probabilities of bottom events of an Internet security system are treated as exact values when the failure probability of the entire system is estimated. However, when the malfunction data of the system's elementary events are incomplete, the traditional approach for calculating reliability is no longer applicable. Moreover, the traditional approach does not consider the failure probability of the bottom events under attack, which may bias conclusions. To solve these problems, this paper proposes a novel technique integrating attack trees and vague sets for security threat assessment. For verification of the proposed approach, a numerical example of an Internet security system security threat assessment is adopted. The result of the proposed method is compared with existing security threat assessment approaches. PMID:25405226
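The classical baseline that the paper generalizes can be sketched directly: an attack tree evaluated with exact probabilities and independence between bottom events. The tree below is a made-up example; the paper's contribution is replacing these crisp values with vague sets when data are incomplete.

```python
# Crisp attack-tree evaluation assuming independent bottom events.
# This is only the classical exact-value baseline the paper starts from,
# not its vague-set extension; the example tree is illustrative.

def attack_prob(node):
    """node is ('leaf', p), ('AND', children) or ('OR', children)."""
    kind, arg = node
    if kind == "leaf":
        return arg
    ps = [attack_prob(c) for c in arg]
    if kind == "AND":                 # all child attacks must succeed
        prob = 1.0
        for p in ps:
            prob *= p
        return prob
    # OR: at least one child attack succeeds
    prob = 1.0
    for p in ps:
        prob *= (1.0 - p)
    return 1.0 - prob

# Example: intrusion succeeds if (phishing AND weak password) OR exploit
tree = ("OR", [("AND", [("leaf", 0.3), ("leaf", 0.5)]), ("leaf", 0.1)])
p_top = attack_prob(tree)
```

Here the top event probability is 1 - (1 - 0.3 x 0.5)(1 - 0.1) = 0.235; the vague-set approach instead propagates membership intervals when such point values are unknown.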
Security threat assessment of an Internet security system using attack tree and vague sets.
Chang, Kuei-Hu
2014-01-01
Comparison of different uncertainty techniques in urban stormwater quantity and quality modelling.
Dotto, Cintia B S; Mannina, Giorgio; Kleidorfer, Manfred; Vezzaro, Luca; Henrichs, Malte; McCarthy, David T; Freni, Gabriele; Rauch, Wolfgang; Deletic, Ana
2012-05-15
Urban drainage models are important tools used by both practitioners and scientists in the field of stormwater management. These models are often conceptual and usually require calibration using local datasets. The quantification of the uncertainty associated with the models is a must, although it is rarely practiced. The International Working Group on Data and Models, which works under the IWA/IAHR Joint Committee on Urban Drainage, has been working on the development of a framework for defining and assessing uncertainties in the field of urban drainage modelling. A part of that work is the assessment and comparison of different techniques generally used in the uncertainty assessment of the parameters of water models. This paper compares a number of these techniques: the Generalized Likelihood Uncertainty Estimation (GLUE), the Shuffled Complex Evolution Metropolis algorithm (SCEM-UA), an approach based on a multi-objective auto-calibration (a multialgorithm, genetically adaptive multi-objective method, AMALGAM) and a Bayesian approach based on a simplified Markov Chain Monte Carlo method (implemented in the software MICA). To allow a meaningful comparison among the different uncertainty techniques, common criteria were set for the likelihood formulation, the number of simulations, and the measure of uncertainty bounds. Moreover, all the uncertainty techniques were applied to the same case study, in which the same stormwater quantity and quality model was used alongside the same dataset. The comparison results for a well-posed rainfall/runoff model showed that the four methods provide similar probability distributions of model parameters and similar model prediction intervals. For the ill-posed water quality model, the differences between the results were much wider, and the paper details the specific advantages and disadvantages of each method. In relation to computational efficiency (i.e. the number of iterations required to generate the probability distribution of parameters), it was found that SCEM-UA and AMALGAM produce results more quickly than GLUE, requiring fewer simulations. However, GLUE demands the least modelling skill and is easy to implement. The non-Bayesian methods (GLUE, SCEM-UA and AMALGAM) rely on subjective thresholds for accepting behavioural parameter sets, while MICA often has problems with its assumption of normally distributed residuals. It is concluded that modellers should select the method which is most suitable for the system they are modelling (e.g. the complexity of the model's structure, including the number of parameters), their skill/knowledge level, the available information, and the purpose of their study. Copyright © 2012 Elsevier Ltd. All rights reserved.
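Of the methods compared above, GLUE is the simplest to illustrate. The sketch below shows its essential loop on a toy one-parameter model: Monte Carlo sampling, an informal Nash-Sutcliffe likelihood, and a subjective behavioural threshold. The model, "observations", and threshold are all invented for the sketch.

```python
# Minimal GLUE-style sketch: sample parameters, score each run with a
# Nash-Sutcliffe informal likelihood, keep "behavioural" sets above a
# subjective threshold, and derive parameter bounds from the survivors.
import random

random.seed(0)
obs = [2.0 * x for x in range(10)]                  # "observed" runoff

def model(k):
    return [k * x for x in range(10)]               # toy linear model

def nash_sutcliffe(sim, obs):
    mean = sum(obs) / len(obs)
    num = sum((s - o) ** 2 for s, o in zip(sim, obs))
    den = sum((o - mean) ** 2 for o in obs)
    return 1.0 - num / den

samples = [random.uniform(0.0, 4.0) for _ in range(2000)]
scored = [(k, nash_sutcliffe(model(k), obs)) for k in samples]
behavioural = [(k, ns) for k, ns in scored if ns > 0.5]   # subjective threshold

# 5-95% parameter bounds from the behavioural sets
ks = sorted(k for k, _ in behavioural)
lower, upper = ks[int(0.05 * len(ks))], ks[int(0.95 * len(ks))]
```

The subjectivity criticized in the abstract is visible in the `ns > 0.5` line: moving that threshold directly widens or narrows the resulting uncertainty bounds.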
ERIC Educational Resources Information Center
Crabtree, John; Zhang, Xihui
2015-01-01
Teaching advanced programming can be a challenge, especially when the students are pursuing different majors with diverse analytical and problem-solving capabilities. The purpose of this paper is to explore the efficacy of using a particular problem as a vehicle for imparting a broad set of programming concepts and problem-solving techniques. We…
Syazwan, AI; Rafee, B Mohd; Juahir, Hafizan; Azman, AZF; Nizar, AM; Izwyn, Z; Syahidatussyakirah, K; Muhaimin, AA; Yunos, MA Syafiq; Anita, AR; Hanafiah, J Muhamad; Shaharuddin, MS; Ibthisham, A Mohd; Hasmadi, I Mohd; Azhar, MN Mohamad; Azizan, HS; Zulfadhli, I; Othman, J; Rozalini, M; Kamarul, FT
2012-01-01
Purpose To analyze and characterize a multidisciplinary, integrated indoor air quality checklist for evaluating the health risk of building occupants in a nonindustrial workplace setting. Design A cross-sectional study based on a participatory occupational health program conducted by the National Institute of Occupational Safety and Health (Malaysia) and Universiti Putra Malaysia. Method A modified version of the indoor environmental checklist published by the Department of Occupational Health and Safety, based on the literature and discussion with occupational health and safety professionals, was used in the evaluation process. Summated scores were given according to the cluster analysis and principal component analysis in the characterization of risk. Environmetric techniques were used to classify the risk of variables in the checklist. The possible sources of item pollutants were also evaluated using a semiquantitative approach. Result Hierarchical agglomerative cluster analysis resulted in the grouping of factorial components into three clusters (high complaint, moderate-high complaint, moderate complaint), which were further analyzed by discriminant analysis. From this, 15 major variables that influence indoor air quality were determined. Principal component analysis of each cluster revealed that the main factors influencing the high complaint group were fungal-related problems, chemical indoor dispersion, detergent, renovation, thermal comfort, and location of fresh air intake. The moderate-high complaint group showed significantly high loadings on ventilation, air filters, and smoking-related activities. The moderate complaint group showed high loadings on dampness, odor, and thermal comfort. Conclusion This semiquantitative assessment, which graded risk from low to high based on the intensity of the problem, shows promising and reliable results.
It should be used as an important tool in the preliminary assessment of indoor air quality and as a categorizing method for further IAQ investigations and complaints procedures. PMID:23055779
Assessment of Slat Noise Predictions for 30P30N High-Lift Configuration From BANC-III Workshop
NASA Technical Reports Server (NTRS)
Choudhari, Meelan; Lockard, David P.
2015-01-01
This paper presents a summary of the computational predictions and measurement data contributed to Category 7 of the 3rd AIAA Workshop on Benchmark Problems for Airframe Noise Computations (BANC-III), which was held in Atlanta, GA, on June 14-15, 2014. Category 7 represents the first slat-noise configuration to be investigated under the BANC series of workshops, namely, the 30P30N two-dimensional high-lift model (with a slat contour that was slightly modified to enable unsteady pressure measurements) at an angle of attack relevant to approach conditions. Originally developed for a CFD challenge workshop to assess computational fluid dynamics techniques for steady high-lift predictions, the 30P30N configuration has provided a valuable opportunity for the airframe noise community to collectively assess and advance the computational and experimental techniques for slat noise. The contributed solutions are compared with each other as well as with the initial measurements that became available just prior to the BANC-III Workshop. Specific features of a number of computational solutions on the finer grids compare reasonably well with the initial measurements from the FSU and JAXA facilities and/or with each other. However, no single solution (or subset of solutions) could be identified as clearly superior to the rest. Grid sensitivity studies presented by multiple BANC-III participants demonstrated a relatively consistent trend of reduced surface pressure fluctuations, higher levels of turbulent kinetic energy in the flow, and lower levels of both narrow-band peaks and the broadband component of unsteady pressure spectra in the nearfield and farfield. The lessons learned from the BANC-III contributions have been used to identify improvements to the problem statement for future Category-7 investigations.
Evaluation of work posture and quantification of fatigue by Rapid Entire Body Assessment (REBA)
NASA Astrophysics Data System (ADS)
Rizkya, I.; Syahputri, K.; Sari, R. M.; Anizar; Siregar, I.
2018-02-01
Work-related musculoskeletal disorders (MSDs), poor body postures, and low back injuries are among the most common problems in many industries, including small and medium industries. This study presents an assessment and evaluation of the ergonomic postures of a material handling worker, carried out using REBA (Rapid Entire Body Assessment). REBA is a technique for quantifying the fatigue experienced by a worker while manually lifting loads; fatigue caused by abnormal work postures leads to worker complaints of pain. REBA was applied to assess the working postures of the existing process through a procedural analysis of the body postures involved. The study shows that the body parts at high risk are the back, neck, and upper arms, with a REBA score of 9, so action should be taken as soon as possible. Control actions were implemented for the high-risk processes, and a substantial risk reduction was achieved.
NASA Astrophysics Data System (ADS)
Asfaroh, Jati Aurum; Rosana, Dadan; Supahar
2017-08-01
This research aims to develop a valid and reliable CIPP-model evaluation instrument and to determine its feasibility and practicality. The instrument evaluates the implementation of project assessment on the topic of optics, used to measure the problem-solving skills of junior high school class VIII students in the Yogyakarta region. The research follows the 4-D development model. The subjects of the product trials were students in class VIII at SMP N 1 Galur and SMP N 1 Sleman. Data were collected using non-test techniques, including interviews, questionnaires, and observations. Validity was analyzed using Aiken's V, and reliability was analyzed using the intraclass correlation coefficient (ICC). Seven raters took part: two expert lecturers (expert judgment), two practitioners (science teachers), and three colleagues. The resulting CIPP-model instrument is used to evaluate the implementation of the project assessment instruments. Its Aiken's V values range from 0.86 to 1, indicating validity, and its reliability value of 0.836 falls into the good category, so the instrument is fit for use as an evaluation instrument.
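The Aiken's V coefficient reported above has a simple closed form: V = S / (n(c-1)), where S sums each rater's score minus the lowest possible rating, n is the number of raters, and c the number of rating categories. A minimal sketch (the example ratings are illustrative, not the study's data):

```python
# Aiken's V content-validity coefficient, as used to validate the CIPP
# instrument. The ratings below are invented for illustration.

def aikens_v(ratings, lo, hi):
    """ratings: one item's scores from n raters on the scale [lo, hi]."""
    n = len(ratings)
    c = hi - lo + 1                      # number of rating categories
    s = sum(r - lo for r in ratings)     # S = sum of (rating - lowest rating)
    return s / (n * (c - 1))

# Seven raters (as in the study) scoring one item on a 1-5 scale
v_item = aikens_v([5, 4, 5, 5, 4, 5, 5], lo=1, hi=5)   # about 0.93
```

V ranges from 0 (all raters give the lowest rating) to 1 (all give the highest), which is why the study's values of 0.86 to 1 indicate strong content validity.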
Robust biological parametric mapping: an improved technique for multimodal brain image analysis
NASA Astrophysics Data System (ADS)
Yang, Xue; Beason-Held, Lori; Resnick, Susan M.; Landman, Bennett A.
2011-03-01
Mapping the quantitative relationship between structure and function in the human brain is an important and challenging problem. Numerous volumetric, surface, region of interest and voxelwise image processing techniques have been developed to statistically assess potential correlations between imaging and non-imaging metrics. Recently, biological parametric mapping has extended the widely popular statistical parametric approach to enable application of the general linear model to multiple image modalities (both for regressors and regressands) along with scalar valued observations. This approach offers great promise for direct, voxelwise assessment of structural and functional relationships with multiple imaging modalities. However, as presented, the biological parametric mapping approach is not robust to outliers and may lead to invalid inferences (e.g., artifactual low p-values) due to slight mis-registration or variation in anatomy between subjects. To enable widespread application of this approach, we introduce robust regression and robust inference in the neuroimaging context of application of the general linear model. Through simulation and empirical studies, we demonstrate that our robust approach reduces sensitivity to outliers without substantial degradation in power. The robust approach and associated software package provides a reliable way to quantitatively assess voxelwise correlations between structural and functional neuroimaging modalities.
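The robustness argument above can be made concrete with a small sketch of robust regression via iteratively reweighted least squares with Huber weights, one standard way to downweight outliers in a general linear model. This is an illustration of the general idea, not the paper's specific estimator; the data and tuning constant are invented.

```python
# Sketch of outlier-robust line fitting via iteratively reweighted least
# squares (IRLS) with Huber weights: residuals beyond `delta` are
# downweighted so a single gross outlier cannot dominate the fit.
import numpy as np

def huber_irls(X, y, delta=1.0, iters=50):
    beta = np.linalg.lstsq(X, y, rcond=None)[0]     # OLS starting point
    for _ in range(iters):
        r = y - X @ beta
        absr = np.maximum(np.abs(r), 1e-12)         # avoid division by zero
        w = np.where(absr <= delta, 1.0, delta / absr)
        W = np.diag(w)
        beta = np.linalg.solve(X.T @ W @ X, X.T @ W @ y)
    return beta

# Line y = 2x + 1 with one gross outlier, mimicking a mis-registered voxel
x = np.arange(10, dtype=float)
y = 2.0 * x + 1.0
y[9] += 50.0
X = np.column_stack([np.ones_like(x), x])
b_ols = np.linalg.lstsq(X, y, rcond=None)[0]        # badly biased slope
b_rob = huber_irls(X, y)                            # slope near the true 2
```

The ordinary least squares slope is dragged far from 2 by the single outlier, while the Huber-weighted fit stays close, which mirrors the paper's motivation for robust inference in voxelwise mapping.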
Stochastic modeling for time series InSAR: with emphasis on atmospheric effects
NASA Astrophysics Data System (ADS)
Cao, Yunmeng; Li, Zhiwei; Wei, Jianchao; Hu, Jun; Duan, Meng; Feng, Guangcai
2018-02-01
Despite the many applications of time series interferometric synthetic aperture radar (TS-InSAR) techniques in geophysical problems, error analysis and assessment have been largely overlooked. Tropospheric propagation error is still the dominant error source of InSAR observations. However, the spatiotemporal variation of atmospheric effects is seldom considered in the present standard TS-InSAR techniques, such as persistent scatterer interferometry and small baseline subset interferometry. The failure to consider the stochastic properties of atmospheric effects not only affects the accuracy of the estimators, but also makes it difficult to assess the uncertainty of the final geophysical results. To address this issue, this paper proposes a network-based variance-covariance estimation method to model the spatiotemporal variation of tropospheric signals and to estimate the temporal variance-covariance matrix of TS-InSAR observations. The constructed stochastic model is then incorporated into the TS-InSAR estimators, both for parameter estimation (e.g., deformation velocity, topographic residuals) and for uncertainty assessment. This is an incremental improvement over the traditional weighted least squares methods used to solve multitemporal InSAR time series. The performance of the proposed method is validated using both simulated and real datasets.
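The core estimation step described above is generalized least squares: fold an observation variance-covariance matrix C into both the estimate and its formal uncertainty. A minimal sketch (the design matrix, displacements, and C values are toy numbers, not InSAR data):

```python
# Generalized least squares sketch: x_hat = (A' C^-1 A)^-1 A' C^-1 y,
# with Cov(x_hat) = (A' C^-1 A)^-1. This shows the estimator form only;
# the paper's contribution is constructing C from a network analysis
# of tropospheric signals, which is not reproduced here.
import numpy as np

def gls(A, y, C):
    Ci = np.linalg.inv(C)
    N = A.T @ Ci @ A                     # normal matrix
    x_hat = np.linalg.solve(N, A.T @ Ci @ y)
    cov_x = np.linalg.inv(N)             # formal uncertainty of the estimate
    return x_hat, cov_x

# Toy example: estimate velocity and offset from noisy displacements
t = np.array([0.0, 1.0, 2.0, 3.0])
A = np.column_stack([t, np.ones_like(t)])
y = np.array([0.1, 1.9, 4.1, 5.9])       # roughly 2*t
C = np.diag([0.1, 0.1, 0.4, 0.4])        # later epochs more atmospheric noise
v, cov = gls(A, y, C)
```

With C equal to the identity this reduces to ordinary least squares; a realistic C downweights noisy epochs and, crucially, yields the covariance needed to state the uncertainty of the velocity estimate.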
Outline-Alphabet of 26 Marital Problems and Techniques for Dealing with Them.
ERIC Educational Resources Information Center
Edmunds, Evelyn P.
This guide is intended to familiarize therapists with the scope of marital difficulties and the range of techniques found useful in dealing with them. While it is tempting to think that proven techniques can be matched to specific marital problems, the literature cautions that predicting outcome in therapy research depends not only on the…
Dionne-Odom, J. Nicholas; Lyons, Kathleen D.; Akyar, Imatullah; Bakitas, Marie
2016-01-01
Family caregivers of persons with advanced cancer often take on responsibilities that present daunting and complex problems. Serious problems that go unresolved may be burdensome and result in negative outcomes for caregivers’ psychological and physical health and affect the quality of care delivered to the care recipients with cancer, especially at the end of life. Formal problem-solving training approaches have been developed over the past several decades to assist individuals with managing problems faced in daily life. Several of these problem-solving principles and techniques were incorporated into ENABLE (Educate, Nurture, Advise, Before Life End), an ‘early’ palliative care telehealth intervention for individuals diagnosed with advanced cancer and their family caregivers. A hypothetical case resembling the situations of actual caregiver participants in ENABLE that exemplifies the complex problems that caregivers face is presented followed by presentation of an overview of ENABLE’s problem-solving key principles, techniques and steps in problem-solving support. Though more research is needed to formally test the use of problem-solving support in social work practice, social workers can easily incorporate these techniques into everyday practice. PMID:27143574
Parametric robust control and system identification: Unified approach
NASA Technical Reports Server (NTRS)
Keel, Leehyun
1994-01-01
Despite significant advancement in the area of robust parametric control, the synthesis of such a controller remains a wide open problem. Thus, we attempt to give a solution to this important problem. Our approach captures the parametric uncertainty as an H(sub infinity) unstructured uncertainty so that H(sub infinity) synthesis techniques are applicable. Although these techniques cannot cope with the exact parametric uncertainty, they give a reasonable guideline for modeling the unstructured uncertainty that contains the parametric uncertainty. An additional loop-shaping technique is also introduced to relax the conservatism of this approach.
Techniques for recognizing identity of several response functions from the data of visual inspection
NASA Astrophysics Data System (ADS)
Nechval, Nicholas A.
1996-08-01
The purpose of this paper is to present some efficient techniques for recognizing from the observed data whether several response functions are identical to each other. For example, in an industrial setting the problem may be to determine whether the production coefficients established in a small-scale pilot study apply to each of several large- scale production facilities. The techniques proposed here combine sensor information from automated visual inspection of manufactured products which is carried out by means of pixel-by-pixel comparison of the sensed image of the product to be inspected with some reference pattern (or image). Let (a1, . . . , am) be p-dimensional parameters associated with m response models of the same type. This study is concerned with the simultaneous comparison of a1, . . . , am. A generalized maximum likelihood ratio (GMLR) test is derived for testing equality of these parameters, where each of the parameters represents a corresponding vector of regression coefficients. The GMLR test reduces to an equivalent test based on a statistic that has an F distribution. The main advantage of the test lies in its relative simplicity and the ease with which it can be applied. Another interesting test for the same problem is an application of Fisher's method of combining independent test statistics which can be considered as a parallel procedure to the GMLR test. The combination of independent test statistics does not appear to have been used very much in applied statistics. There does, however, seem to be potential data analytic value in techniques for combining distributional assessments in relation to statistically independent samples which are of joint experimental relevance. In addition, a new iterated test for the problem defined above is presented. A rejection of the null hypothesis by this test provides some reason why all the parameters are not equal. 
A numerical example is discussed in the context of the proposed procedures for hypothesis testing.
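The F statistic mentioned above has a classical analogue that is easy to sketch: a Chow-type test comparing a pooled regression against separate per-group regressions. This is the textbook construction, not the paper's exact GMLR derivation, and the data below are illustrative.

```python
# Chow-type F-test sketch for equality of regression coefficient vectors
# across m groups: compare the pooled residual sum of squares against the
# sum of per-group RSS values. Illustrative analogue of the F statistic
# the GMLR test reduces to, not the paper's derivation.
import numpy as np

def rss(X, y):
    beta = np.linalg.lstsq(X, y, rcond=None)[0]
    r = y - X @ beta
    return float(r @ r)

def equality_f(groups):
    """groups: list of (X, y) with X of shape (n_i, p). Returns F."""
    m = len(groups)
    p = groups[0][0].shape[1]
    n = sum(X.shape[0] for X, _ in groups)
    Xp = np.vstack([X for X, _ in groups])
    yp = np.concatenate([y for _, y in groups])
    rss_pooled = rss(Xp, yp)                 # one shared coefficient vector
    rss_sep = sum(rss(X, y) for X, y in groups)   # separate vectors
    num = (rss_pooled - rss_sep) / ((m - 1) * p)
    den = rss_sep / (n - m * p)
    return num / den

x = np.arange(8, dtype=float)
X = np.column_stack([np.ones(8), x])
noise = np.array([0.1, -0.1, 0.05, -0.05, 0.1, -0.1, 0.05, -0.05])
y1 = 1.0 + 2.0 * x + noise                   # same underlying line...
y2 = 1.0 + 2.0 * x - noise                   # ...as y1
y3 = 5.0 - 1.0 * x + noise                   # a different line
f_same = equality_f([(X, y1), (X, y2)])
f_diff = equality_f([(X, y1), (X, y3)])
```

Because pooling is a restriction of the separate fits, the pooled RSS can never fall below the summed per-group RSS, so F is non-negative and grows sharply when the groups' coefficients genuinely differ.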
Integer Linear Programming in Computational Biology
NASA Astrophysics Data System (ADS)
Althaus, Ernst; Klau, Gunnar W.; Kohlbacher, Oliver; Lenhof, Hans-Peter; Reinert, Knut
Computational molecular biology (bioinformatics) is a young research field that is rich in NP-hard optimization problems. The problem instances encountered are often huge and comprise thousands of variables. Since their introduction into the field of bioinformatics in 1997, integer linear programming (ILP) techniques have been successfully applied to many optimization problems. These approaches have added much momentum to development and progress in related areas. In particular, ILP-based approaches have become a standard optimization technique in bioinformatics. In this review, we present applications of ILP-based techniques developed by members and former members of Kurt Mehlhorn’s group. These techniques were introduced to bioinformatics in a series of papers and popularized by demonstration of their effectiveness and potential.
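A flavor of the ILP formulations discussed above can be given with a tiny 0-1 model. The sketch states a minimum set cover instance, an NP-hard problem of the kind that arises in bioinformatics, as an explicit 0-1 program and solves it by exhaustive enumeration so it stays self-contained; real applications hand such models to an ILP solver (branch-and-cut), which is the paper's point. The instance is invented.

```python
# A tiny 0-1 ILP solved by brute force: minimize sum(x_j) subject to
# every element of the universe being covered by some chosen set
# (minimum set cover). Enumeration replaces the ILP solver only to keep
# the sketch runnable; it is exponential in the number of sets.
from itertools import product

universe = {1, 2, 3, 4, 5}
sets = [{1, 2, 3}, {2, 4}, {3, 4}, {4, 5}]

best = None
for x in product([0, 1], repeat=len(sets)):          # all 0-1 assignments
    covered = set().union(*[s for s, xi in zip(sets, x) if xi])
    if covered >= universe:                          # covering constraints
        if best is None or sum(x) < sum(best):       # objective: fewest sets
            best = x
chosen = [s for s, xi in zip(sets, best) if xi]
```

The ILP view is what scales: the same objective and constraints, written as linear inequalities over binary variables, let a solver prune the exponential search space that this loop walks in full.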
Modeling and Hazard Analysis Using STPA
NASA Astrophysics Data System (ADS)
Ishimatsu, Takuto; Leveson, Nancy; Thomas, John; Katahira, Masa; Miyamoto, Yuko; Nakao, Haruka
2010-09-01
A joint research project between MIT and JAXA/JAMSS is investigating the application of a new hazard analysis to the system and software in the HTV. Traditional hazard analysis focuses on component failures, but software does not fail in this way. Software most often contributes to accidents by commanding the spacecraft into an unsafe state (e.g., turning off the descent engines prematurely) or by not issuing required commands. That makes the standard hazard analysis techniques of limited usefulness on software-intensive systems, which describe most spacecraft built today. STPA is a new hazard analysis technique based on systems theory rather than reliability theory. It treats safety as a control problem rather than a failure problem. The goal of STPA, which is to create a set of scenarios that can lead to a hazard, is the same as FTA, but STPA includes a broader set of potential scenarios, including those in which no failures occur but problems arise due to unsafe and unintended interactions among the system components. STPA also provides more guidance to the analysts than traditional fault tree analysis does. Functional control diagrams are used to guide the analysis. In addition, JAXA uses a model-based system engineering development environment (created originally by Leveson and called SpecTRM) which also assists in the hazard analysis. One of the advantages of STPA is that it can be applied early in the system engineering and development process in a safety-driven design process, where hazard analysis drives the design decisions rather than waiting until reviews identify problems that are then costly or difficult to fix. It can also be applied in an after-the-fact analysis and hazard assessment, which is what we did in this case study. This paper describes the experimental application of STPA to the JAXA HTV in order to determine the feasibility and usefulness of the new hazard analysis technique.
Because the HTV was originally developed using fault tree analysis and following the NASA standards for safety-critical systems, the results of our experimental application of STPA can be compared with these more traditional safety engineering approaches in terms of the problems identified and the resources required to use it.
NASA Technical Reports Server (NTRS)
Hague, D. S.; Merz, A. W.
1975-01-01
Multivariable search techniques are applied to a particular class of airfoil optimization problems: the maximization of lift and the minimization of disturbance pressure magnitude in an inviscid nonlinear flow field. A variety of multivariable search techniques contained in an existing nonlinear optimization code, AESOP, are applied to this design problem. These techniques include elementary single-parameter perturbation methods, organized searches such as steepest-descent, quadratic, and Davidon methods, randomized procedures, and a generalized search acceleration technique. The airfoil design variables are seven in number and define perturbations to the profile of an existing NACA airfoil. The relative efficiency of the techniques is compared. It is shown that elementary one-parameter-at-a-time and random techniques compare favorably with organized searches in the class of problems considered. It is also shown that significant reductions in disturbance pressure magnitude can be made while retaining reasonable lift coefficient values at low free-stream Mach numbers.
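The two "elementary" families the abstract highlights are easy to sketch side by side. The objective below is a toy quadratic standing in for the airfoil pressure functional, with seven design variables as in the study; the step sizes and iteration counts are illustrative.

```python
# Elementary one-parameter-at-a-time perturbation versus pure random
# search on a toy 7-variable objective (a stand-in for the airfoil
# disturbance-pressure functional, not an aerodynamic model).
import random

def objective(v):
    return sum((x - 1.0) ** 2 for x in v)           # minimum at (1, ..., 1)

def one_at_a_time(v, step=0.1, sweeps=200):
    v = list(v)
    for _ in range(sweeps):
        for i in range(len(v)):                     # perturb one variable
            for delta in (step, -step):
                trial = list(v)
                trial[i] += delta
                if objective(trial) < objective(v):
                    v = trial                       # keep any improvement
    return v

def random_search(v, scale=0.5, iters=2000, seed=1):
    rng = random.Random(seed)
    v = list(v)
    for _ in range(iters):
        trial = [x + rng.uniform(-scale, scale) for x in v]
        if objective(trial) < objective(v):
            v = trial
    return v

start = [0.0] * 7                                   # seven design variables
a = one_at_a_time(start)
b = random_search(start)
```

Both are derivative-free, which echoes the paper's finding: on smooth low-dimensional problems such simple perturbation schemes can hold their own against organized gradient-style searches.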
A spline-based parameter estimation technique for static models of elastic structures
NASA Technical Reports Server (NTRS)
Dutt, P.; Taasan, S.
1986-01-01
The problem of identifying the spatially varying coefficient of elasticity using an observed solution to the forward problem is considered. Under appropriate conditions this problem can be treated as a first order hyperbolic equation in the unknown coefficient. Some continuous dependence results are developed for this problem and a spline-based technique is proposed for approximating the unknown coefficient, based on these results. The convergence of the numerical scheme is established and error estimates obtained.
NASA Technical Reports Server (NTRS)
Ratnayake, Nalin A.; Waggoner, Erin R.; Taylor, Brian R.
2011-01-01
The problem of parameter estimation on hybrid-wing-body aircraft is complicated by the fact that many design candidates for such aircraft involve a large number of aerodynamic control effectors that act in coplanar motion. This adds to the complexity already present in the parameter estimation problem for any aircraft with a closed-loop control system. Decorrelation of flight and simulation data must be performed in order to ascertain individual surface derivatives with any sort of mathematical confidence. Non-standard control surface configurations, such as clamshell surfaces and drag-rudder modes, further complicate the modeling task. In this paper, time-decorrelation techniques are applied to a model structure selected through stepwise regression for simulated and flight-generated lateral-directional parameter estimation data. A virtual effector model that uses mathematical abstractions to describe the multi-axis effects of clamshell surfaces is developed and applied. Comparisons are made between time history reconstructions and observed data in order to assess the accuracy of the regression model. The Cramér-Rao lower bounds of the estimated parameters are used to assess the uncertainty of the regression model relative to alternative models. Stepwise regression was found to be a useful technique for lateral-directional model design for hybrid-wing-body aircraft, as suggested by available flight data. Based on the results of this study, linear regression parameter estimation methods using abstracted effectors are expected to perform well for hybrid-wing-body aircraft properly equipped for the task.
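The Cramér-Rao lower bound used above gives the smallest variance any unbiased estimator can achieve, which is why it serves as an uncertainty yardstick for estimated parameters. A minimal sketch for the textbook case of estimating a Gaussian mean; all numbers are illustrative and unrelated to the flight data:

```python
import random
import statistics

# For n i.i.d. Gaussian samples with known sigma, the Cramér-Rao lower bound
# on the variance of any unbiased estimator of the mean is sigma^2 / n.
# The sample mean attains it, so its empirical variance should sit near the bound.
rng = random.Random(42)
sigma, n, trials = 2.0, 50, 4000
crlb = sigma ** 2 / n  # theoretical floor: 0.08

means = [statistics.fmean(rng.gauss(0.0, sigma) for _ in range(n))
         for _ in range(trials)]
empirical_var = statistics.pvariance(means)

print(round(crlb, 3))
print(empirical_var >= 0.5 * crlb)  # empirical variance is near, not far below, the bound
```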
A critical assessment of mortality statistics in Thailand: potential for improvements.
Tangcharoensathien, Viroj; Faramnuayphol, Pinij; Teokul, Waranya; Bundhamcharoen, Kanitta; Wibulpholprasert, Suwit
2006-01-01
This study evaluates the collection and flow of mortality and cause-of-death (COD) data in Thailand, identifying areas of weakness and presenting potential approaches to improve these statistics. Methods include systems analysis, literature review, and the application of the Health Metrics Network (HMN) self-assessment tool by key stakeholders. We identified two weaknesses underlying incompleteness of death registration and inaccuracy of COD attribution: problems in recording events or certifying deaths, and problems in transferring information from death certificates to death registers. Deaths occurring outside health facilities, representing 65% of all deaths in Thailand, contribute to the inaccuracy of cause-of-death data because they must be certified by village heads with limited knowledge and expertise in cause-of-death attribution. However, problems also exist with in-hospital cause-of-death certification by physicians. Priority should be given to training medical personnel in death certification, review of medical records by health personnel in district hospitals, and use of verbal autopsy techniques for assessing internal consistency. This should be coupled with stronger collaboration with district registrars for the 65% of deaths that occur outside hospitals. Training of physicians and data coders and harmonization of death certificates and registries would improve COD data for the 35% of deaths that take place in hospital. Public awareness of the importance of registering all deaths and the application of registration requirements prior to funerals would also improve coverage, though enforcement would be difficult. PMID:16583083
Mühlbacher, Axel C; Kaczynski, Anika
2016-02-01
Healthcare decision making is usually characterized by a low degree of transparency. The demand for transparent decision processes can be fulfilled only when assessment, appraisal and decisions about health technologies are performed under a systematic construct of benefit assessment. The benefit of an intervention is often multidimensional and, thus, must be represented by several decision criteria. Complex decision problems require an assessment and appraisal of various criteria; therefore, a decision process that systematically identifies the best available alternative and enables an optimal and transparent decision is needed. For that reason, decision criteria must be weighted and goal achievement must be scored for all alternatives. Methods of multi-criteria decision analysis (MCDA) are available to analyse and appraise multiple clinical endpoints and structure complex decision problems in healthcare decision making. By means of MCDA, value judgments, priorities and preferences of patients, insurees and experts can be integrated systematically and transparently into the decision-making process. This article describes the MCDA framework and identifies potential areas where MCDA can be of use (e.g. approval, guidelines and reimbursement/pricing of health technologies). A literature search was performed to identify current research in healthcare. The results showed that healthcare decision making is addressing the problem of multiple decision criteria and is focusing on the future development and use of techniques to weight and score different decision criteria. This article emphasizes the use and future benefit of MCDA.
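The weighting-and-scoring step described above can be illustrated with the simplest MCDA aggregation, a weighted sum. The alternatives, criteria, weights, and scores below are all hypothetical:

```python
# Minimal weighted-sum MCDA sketch: hypothetical therapies scored on
# hypothetical criteria. Weights express stakeholder priorities and sum to 1;
# scores are on a common 0-10 benefit scale.
weights = {"efficacy": 0.5, "safety": 0.3, "convenience": 0.2}
scores = {
    "therapy_A": {"efficacy": 8, "safety": 5, "convenience": 6},
    "therapy_B": {"efficacy": 6, "safety": 9, "convenience": 7},
}

def aggregate(alternative):
    # Overall value = sum over criteria of weight * score.
    return sum(weights[c] * s for c, s in scores[alternative].items())

ranking = sorted(scores, key=aggregate, reverse=True)
print({a: round(aggregate(a), 2) for a in ranking})
```

Real MCDA frameworks differ mainly in how the weights and scores are elicited (e.g. from patients or experts); the transparency comes from making both explicit.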
Graph-cut based discrete-valued image reconstruction.
Tuysuzoglu, Ahmet; Karl, W Clem; Stojanovic, Ivana; Castañón, David; Ünlü, M Selim
2015-05-01
Efficient graph-cut methods have been used with great success for labeling and denoising problems occurring in computer vision. Unfortunately, the presence of linear image mappings has prevented the use of these techniques in most discrete-amplitude image reconstruction problems. In this paper, we develop a graph-cut based framework for the direct solution of discrete amplitude linear image reconstruction problems cast as regularized energy function minimizations. We first analyze the structure of discrete linear inverse problem cost functions to show that the obstacle to the application of graph-cut methods to their solution is the variable mixing caused by the presence of the linear sensing operator. We then propose to use a surrogate energy functional that overcomes the challenges imposed by the sensing operator yet can be utilized efficiently in existing graph-cut frameworks. We use this surrogate energy functional to devise a monotonic iterative algorithm for the solution of discrete valued inverse problems. We first provide experiments using local convolutional operators and show the robustness of the proposed technique to noise and stability to changes in regularization parameter. Then we focus on nonlocal, tomographic examples where we consider limited-angle data problems. We compare our technique with state-of-the-art discrete and continuous image reconstruction techniques. Experiments show that the proposed method outperforms state-of-the-art techniques in challenging scenarios involving discrete valued unknowns.
Kwon, Seung Yong; Pham, Tuyen Danh; Park, Kang Ryoung; Jeong, Dae Sik; Yoon, Sungsoo
2016-06-11
Fitness classification is a technique to assess the quality of banknotes in order to determine whether they are usable. Banknote classification techniques are useful in preventing problems that arise from the circulation of substandard banknotes (such as recognition failures, or bill jams in automated teller machines (ATMs) or bank counting machines). By and large, fitness classification continues to be carried out by humans, and this can cause the problem of varying fitness classifications for the same bill by different evaluators, and requires a lot of time. To address these problems, this study proposes a fuzzy system-based method that can reduce the processing time needed for fitness classification, and can determine the fitness of banknotes through an objective, systematic method rather than subjective judgment. Our algorithm was implemented in an actual banknote counting machine. Based on the results of tests on 3856 banknotes in United States currency (USD), 3956 in Korean currency (KRW), and 2300 banknotes in Indian currency (INR) using visible light reflection (VR) and near-infrared light transmission (NIRT) imaging, the proposed method was found to yield higher accuracy than prevalent banknote fitness classification methods. Moreover, it was confirmed that the proposed algorithm can operate in real time, not only in a normal PC environment, but also in an embedded system environment of a banknote counting machine. PMID:27294940
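The fuzzy-system details are not given in the abstract, but the general idea can be sketched with triangular membership functions over a single hypothetical soiling feature. The feature name, thresholds, and class labels below are invented for illustration:

```python
def triangular(x, a, b, c):
    # Triangular membership function: 0 outside [a, c], peaking at 1 when x = b.
    if x <= a or x >= c:
        return 0.0
    return (x - a) / (b - a) if x <= b else (c - x) / (c - b)

def classify_fitness(soiling):
    # Hypothetical feature: fraction of soiled/damaged pixels in a banknote image.
    # Each fuzzy set gives a degree of membership; we take the strongest one.
    memberships = {
        "fit":      triangular(soiling, -0.01, 0.0, 0.15),
        "marginal": triangular(soiling, 0.05, 0.2, 0.35),
        "unfit":    triangular(soiling, 0.25, 0.5, 1.01),
    }
    return max(memberships, key=memberships.get)

print(classify_fitness(0.02))  # a lightly soiled note
print(classify_fitness(0.6))   # a heavily soiled note
```

A full system would combine several such features (e.g. from VR and NIRT images) through fuzzy rules before defuzzifying to a single fitness decision.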
Altena, Ellemarije; Daviaux, Yannick; Sanz-Arigita, Ernesto; Bonhomme, Emilien; de Sevin, Étienne; Micoulaud-Franchi, Jean-Arthur; Bioulac, Stéphanie; Philip, Pierre
2018-04-17
Virtual reality and simulation tools enable us to assess daytime functioning in environments that simulate real life as closely as possible. Simulator sickness, however, poses a problem in the application of these tools, and has been related to pre-existing health problems. How sleep problems contribute to simulator sickness has not yet been investigated. In the current study, 20 female chronic insomnia patients and 32 female age-matched controls drove in a driving simulator covering realistic city, country and highway scenes. Fifty percent of the insomnia patients, as opposed to 12.5% of controls, reported excessive simulator sickness leading to experiment withdrawal. In the remaining participants, patients with insomnia showed overall increased levels of oculomotor symptoms even before driving, while nausea symptoms further increased after driving. These results, as well as the realistic simulation paradigm developed, give more insight into how vestibular, oculomotor and interoceptive functions are affected in insomnia. Importantly, our results have direct implications both for the actual driving experience and for the wider context of deploying simulation techniques to mimic real-life functioning, in particular in professions often exposed to sleep problems. © 2018 European Sleep Research Society.
Undergraduate Performance in Solving Ill-Defined Biochemistry Problems
Sensibaugh, Cheryl A.; Madrid, Nathaniel J.; Choi, Hye-Jeong; Anderson, William L.; Osgood, Marcy P.
2017-01-01
With growing interest in promoting skills related to the scientific process, we studied performance in solving ill-defined problems demonstrated by graduating biochemistry majors at a public, minority-serving university. As adoption of techniques for facilitating the attainment of higher-order learning objectives broadens, so too does the need to appropriately measure and understand student performance. We extended previous validation of the Individual Problem Solving Assessment (IPSA) and administered multiple versions of the IPSA across two semesters of biochemistry courses. A final version was taken by majors just before program exit, and student responses on that version were analyzed both quantitatively and qualitatively. This mixed-methods study quantifies student performance in scientific problem solving, while probing the qualitative nature of unsatisfactory solutions. Of the five domains measured by the IPSA, we found that average graduates were only successful in two areas: evaluating given experimental data to state results and reflecting on performance after the solution to the problem was provided. The primary difficulties in each domain were quite different. The most widespread challenge for students was to design an investigation that rationally aligned with a given hypothesis. We also extend the findings into pedagogical recommendations. PMID:29180350
Subiaul, Francys; Krajkowski, Edward; Price, Elizabeth E; Etz, Alexander
2015-01-01
Children are exceptional, even 'super,' imitators but comparatively poor independent problem-solvers or innovators. Yet, imitation and innovation are both necessary components of cumulative cultural evolution. Here, we explored the relationship between imitation and innovation by assessing children's ability to generate a solution to a novel problem by imitating two different action sequences demonstrated by two different models, an example of imitation by combination, which we refer to as "summative imitation." Children (N = 181) from 3 to 5 years of age and across three experiments were tested in a baseline condition or in one of six demonstration conditions, varying in the number of models and opening techniques demonstrated. Across experiments, more than 75% of children evidenced summative imitation, opening both compartments of the problem box and retrieving the reward hidden in each. Generally, learning different actions from two different models was as good as (and in some cases better than) learning from one model, but the underlying representations appear to be the same in both demonstration conditions. These results show that summative imitation not only facilitates imitation learning but can also result in new solutions to problems, an essential feature of innovation and cumulative culture.
Comparative Evaluation of Different Optimization Algorithms for Structural Design Applications
NASA Technical Reports Server (NTRS)
Patnaik, Surya N.; Coroneos, Rula M.; Guptill, James D.; Hopkins, Dale A.
1996-01-01
Non-linear programming algorithms play an important role in structural design optimization. Fortunately, several algorithms with computer codes are available. At NASA Lewis Research Center, a project was initiated to assess the performance of eight different optimizers through the development of a computer code CometBoards. This paper summarizes the conclusions of that research. CometBoards was employed to solve sets of small, medium and large structural problems, using the eight different optimizers on a Cray-YMP8E/8128 computer. The reliability and efficiency of the optimizers were determined from the performance of these problems. For small problems, the performance of most of the optimizers could be considered adequate. For large problems, however, three optimizers (two sequential quadratic programming routines, DNCONG of IMSL and SQP of IDESIGN, along with the sequential unconstrained minimization technique SUMT) outperformed others. At optimum, most optimizers captured an identical number of active displacement and frequency constraints but the number of active stress constraints differed among the optimizers. This discrepancy can be attributed to singularity conditions in the optimization and the alleviation of this discrepancy can improve the efficiency of optimizers.
Performance Trend of Different Algorithms for Structural Design Optimization
NASA Technical Reports Server (NTRS)
Patnaik, Surya N.; Coroneos, Rula M.; Guptill, James D.; Hopkins, Dale A.
1996-01-01
Nonlinear programming algorithms play an important role in structural design optimization. Fortunately, several algorithms with computer codes are available. At NASA Lewis Research Center, a project was initiated to assess performance of different optimizers through the development of a computer code CometBoards. This paper summarizes the conclusions of that research. CometBoards was employed to solve sets of small, medium and large structural problems, using different optimizers on a Cray-YMP8E/8128 computer. The reliability and efficiency of the optimizers were determined from the performance of these problems. For small problems, the performance of most of the optimizers could be considered adequate. For large problems, however, three optimizers (two sequential quadratic programming routines, DNCONG of IMSL and SQP of IDESIGN, along with the sequential unconstrained minimization technique SUMT) outperformed others. At optimum, most optimizers captured an identical number of active displacement and frequency constraints but the number of active stress constraints differed among the optimizers. This discrepancy can be attributed to singularity conditions in the optimization and the alleviation of this discrepancy can improve the efficiency of optimizers.
The role of artificial intelligence techniques in scheduling systems
NASA Technical Reports Server (NTRS)
Geoffroy, Amy L.; Britt, Daniel L.; Gohring, John R.
1990-01-01
Artificial Intelligence (AI) techniques provide good solutions for many of the problems which are characteristic of scheduling applications. However, scheduling is a large, complex heterogeneous problem. Different applications will require different solutions. Any individual application will require the use of a variety of techniques, including both AI and conventional software methods. The operational context of the scheduling system will also play a large role in design considerations. The key is to identify those places where a specific AI technique is in fact the preferable solution, and to integrate that technique into the overall architecture.
The effect of chronic orthopedic infection on quality of life.
Cheatle, M D
1991-07-01
The patient with chronic orthopedic infection presents a unique challenge to the orthopedic surgeon. The orthopedic surgeon must not only possess an expertise in constantly evolving diagnostic and treatment techniques but also be able to identify numerous related problems and direct the patient in receiving the most appropriate treatment. This demands a commitment of time by the treating surgeon to the individual patient to properly assess the need for support, the extent of psychologic distress, the intensity of pain, and the requirement for medication management. The effective utilization of a multidisciplinary team of health care providers (e.g., specialists in infectious disease, physical medicine and rehabilitation, psychiatry, nursing, pharmacology) can provide an optimal treatment program for this multifaceted problem and maximize the potential for a favorable outcome.
Past, present and future of spike sorting techniques.
Rey, Hernan Gonzalo; Pedreira, Carlos; Quian Quiroga, Rodrigo
2015-10-01
Spike sorting is a crucial step to extract information from extracellular recordings. With new recording opportunities provided by the development of new electrodes that allow monitoring hundreds of neurons simultaneously, the scenario for the new generation of algorithms is both exciting and challenging. However, this will require a new approach to the problem and the development of a common reference framework to quickly assess the performance of new algorithms. In this work, we review the basic concepts of spike sorting, including the requirements for different applications, together with the problems faced by presently available algorithms. We conclude by proposing a roadmap stressing the crucial points to be addressed to support the neuroscientific research of the near future. Copyright © 2015 The Authors. Published by Elsevier Inc. All rights reserved.
NASA Astrophysics Data System (ADS)
Schlam, E.
1983-01-01
Human factors in visible displays are discussed, taking into account an introduction to color vision, a laser optometric assessment of visual display viewability, the quantification of color contrast, human performance evaluations of digital image quality, visual problems of office video display terminals, and contemporary problems in airborne displays. Other topics considered are related to electroluminescent technology, liquid crystal and related technologies, plasma technology, and display terminals and systems. Attention is given to the application of electroluminescent technology to personal computers, electroluminescent driving techniques, thin film electroluminescent devices with memory, the fabrication of very large electroluminescent displays, the operating properties of thermally addressed dye switching liquid crystal displays, light field dichroic liquid crystal displays for very large area displays, and hardening military plasma displays for a nuclear environment.
Inertial Confinement fusion targets
NASA Technical Reports Server (NTRS)
Hendricks, C. D.
1982-01-01
Inertial confinement fusion (ICF) targets are made as simple flat discs, as hollow shells or as complicated multilayer structures. Many techniques were devised for producing the targets. Glass and metal shells are made by using drop and bubble techniques. Solid hydrogen shells are also produced by adapting old methods to the solution of modern problems. Some of these techniques, problems, and solutions are discussed. In addition, the application of many of these techniques to the fabrication of ICF targets is presented.
The effects of preference assessment type on problem behavior.
Tung, Sara Beth; Donaldson, Jeanne M; Kahng, SungWoo
2017-10-01
Individuals with intellectual and developmental disabilities who engage in problem behavior maintained by access to tangibles may exhibit more problem behavior during certain preference assessments. We compared three common preference assessments to determine which resulted in fewer problem behaviors. The paired stimulus and multiple-stimulus without replacement assessments produced higher rates of problem behavior than the free operant (FO) assessment, suggesting that the FO assessment may be the most appropriate assessment for individuals who engage in problem behavior maintained by access to tangibles. © 2017 Society for the Experimental Analysis of Behavior.
Bogie, Rob; Voss, Laura; Arts, Jacobus J; Lataster, Arno; Willems, Paul C; Brans, Boudewijn; van Rhijn, Lodewijk W; Welting, Tim J M
2016-12-01
An animal study. To explore ultra-high molecular weight polyethylene (UHMWPE) sublaminar wires in spinal surgery and to assess stability and biocompatibility of the UHMWPE instrumentation in an ovine model. Sublaminar wiring is a well-established technique in segmental scoliosis surgery. However, during introduction and/or removal of the metal sublaminar wires, neurological problems can occur. Abrasion after cutting metal wires for removal can lead to damage to the dural sac. Sublaminar wires have to withstand large forces and breakage of the wires can occur. Different types of sublaminar wires have been developed to address these problems. UHMWPE sublaminar wires can potentially substitute for currently used metal sublaminar wires. In vivo testing and biocompatibility analysis of UHMWPE wires are recommended before clinical use in spinal surgery. In 6 immature sheep, pedicle screws were instrumented at lumbar level L4 and attached with titanium rods to 4 thoracolumbar vertebrae using 3- and 5-mm-wide UHMWPE sublaminar wiring constructions in 5 animals. Titanium sublaminar wires were applied in 1 animal to function as a control subject. After a follow-up period of 16 weeks, the animals were sacrificed and the spines were isolated. Radiographs and computed tomography (CT) scans were made to assess stability of the instrumentation. The vertebrae were dissected for macroscopic and histologic evaluation. None of the wires had loosened and the instrumentation remained stable. CT scans and radiographs showed no signs of failure of the instrumentation and no neurological complications occurred. Although several bony bridges were seen on CT, growth was observed at the operated levels. Biocompatibility was assessed by macroscopical and histologic analysis, showing no signs of dural or epidural inflammation. This pilot animal study shows that UHMWPE sublaminar wiring is a safe technique.
The UHMWPE wires are biocompatible and provide sufficient stability in spinal instrumentation. Heterotopic ossification because of periosteal reactions in the ovine spine led to some restrictions in this study.
Techniques for shuttle trajectory optimization
NASA Technical Reports Server (NTRS)
Edge, E. R.; Shieh, C. J.; Powers, W. F.
1973-01-01
The application of recently developed function-space Davidon-type techniques to the shuttle ascent trajectory optimization problem is discussed along with an investigation of the recently developed PRAXIS algorithm for parameter optimization. At the outset of this analysis, the major deficiency of the function-space algorithms was their potential storage problems. Since most previous analyses of the methods were with relatively low-dimension problems, no storage problems were encountered. However, in shuttle trajectory optimization, storage is a problem, and this problem was handled efficiently. Topics discussed include: the shuttle ascent model and the development of the particular optimization equations; the function-space algorithms; the operation of the algorithm and typical simulations; variable final-time problem considerations; and a modification of Powell's algorithm.
Combining Techniques to Refine Item to Skills Q-Matrices with a Partition Tree
ERIC Educational Resources Information Center
Desmarais, Michel C.; Xu, Peng; Beheshti, Behzad
2015-01-01
The problem of mapping items to skills is gaining interest with the emergence of recent techniques that can use data for both defining this mapping, and for refining mappings given by experts. We investigate the problem of refining mapping from an expert by combining the output of different techniques. The combination is based on a partition tree…
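The partition-tree combination itself is not described in this excerpt, but the underlying idea of merging item-to-skills mappings produced by different techniques can be illustrated with a simpler cell-wise majority vote over hypothetical Q-matrices (rows are items, columns are skills, 1 means the item requires the skill):

```python
# Three hypothetical techniques each propose a Q-matrix for 3 items x 2 skills.
q_candidates = [
    [[1, 0], [0, 1], [1, 1]],   # e.g. an expert-provided mapping
    [[1, 0], [1, 1], [1, 1]],   # e.g. a matrix-factorization refinement
    [[1, 0], [0, 1], [0, 1]],   # e.g. a hill-climbing refinement
]

def majority_vote(matrices):
    # A cell is set to 1 when more than half of the candidate matrices agree.
    rows, cols = len(matrices[0]), len(matrices[0][0])
    return [[int(sum(m[r][c] for m in matrices) * 2 > len(matrices))
             for c in range(cols)] for r in range(rows)]

print(majority_vote(q_candidates))
```

A partition tree goes further than a flat vote by learning, from data, which technique to trust for which subsets of cells.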
NASA Astrophysics Data System (ADS)
Vasant, P.; Ganesan, T.; Elamvazuthi, I.
2012-11-01
Fairly reasonable results have been obtained in the past for non-linear engineering problems using optimization techniques such as neural networks, genetic algorithms, and fuzzy logic independently. Increasingly, hybrid techniques are being used to solve non-linear problems to obtain better output. This paper discusses the use of a neuro-genetic hybrid technique to optimize geological structure mapping, known as seismic surveying. It involves the minimization of an objective function subject to geophysical and operational constraints. In this work, the optimization was initially performed using genetic programming, followed by hybrid neuro-genetic programming approaches. Comparative studies and analysis were then carried out on the optimized results. The results indicate that the hybrid neuro-genetic technique produced better results compared to the stand-alone genetic programming method.
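The two-stage idea, a global evolutionary search followed by a second refinement stage, can be sketched on a toy objective. Everything below (the objective, the genetic operators, the parameters, and the greedy refinement standing in for the neural stage) is illustrative, not the paper's seismic model:

```python
import random

def cost(x):
    # Toy stand-in for a seismic-survey objective; minimum 0 at x = (3, -2).
    return (x[0] - 3) ** 2 + (x[1] + 2) ** 2

def genetic_search(pop_size=30, gens=60, seed=1):
    # Plain genetic algorithm: truncation selection, averaging crossover,
    # Gaussian mutation.
    rng = random.Random(seed)
    pop = [[rng.uniform(-10, 10), rng.uniform(-10, 10)] for _ in range(pop_size)]
    for _ in range(gens):
        pop.sort(key=cost)
        parents = pop[: pop_size // 2]
        children = []
        while len(children) < pop_size - len(parents):
            a, b = rng.sample(parents, 2)
            child = [(ai + bi) / 2 for ai, bi in zip(a, b)]   # crossover
            child = [c + rng.gauss(0, 0.3) for c in child]    # mutation
            children.append(child)
        pop = parents + children
    return min(pop, key=cost)

def local_refine(x, step=0.05, iters=400):
    # Cheap coordinate-wise refinement standing in for the hybrid's second stage.
    x = list(x)
    for _ in range(iters):
        for i in range(len(x)):
            for d in (step, -step):
                trial = list(x)
                trial[i] += d
                if cost(trial) < cost(x):
                    x = trial
    return x

best = local_refine(genetic_search())
print(cost(best) < 0.01)
```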
Wavelet-promoted sparsity for non-invasive reconstruction of electrical activity of the heart.
Cluitmans, Matthijs; Karel, Joël; Bonizzi, Pietro; Volders, Paul; Westra, Ronald; Peeters, Ralf
2018-05-12
We investigated a novel sparsity-based regularization method in the wavelet domain of the inverse problem of electrocardiography that aims at preserving the spatiotemporal characteristics of heart-surface potentials. In three normal, anesthetized dogs, electrodes were implanted around the epicardium and body-surface electrodes were attached to the torso. Potential recordings were obtained simultaneously on the body surface and on the epicardium. A CT scan was used to digitize a homogeneous geometry which consisted of the body-surface electrodes and the epicardial surface. A novel multitask elastic-net-based method was introduced to regularize the ill-posed inverse problem. The method simultaneously pursues a sparse wavelet representation in time-frequency and exploits correlations in space. Performance was assessed in terms of quality of reconstructed epicardial potentials, estimated activation and recovery time, and estimated locations of pacing, and compared with that of Tikhonov zeroth-order regularization. Results in the wavelet domain obtained higher sparsity than those in the time domain. Epicardial potentials were non-invasively reconstructed with higher accuracy than with Tikhonov zeroth-order regularization (p < 0.05), and recovery times were improved (p < 0.05). No significant improvement was found in terms of activation times and localization of origin of pacing. Next to improved estimation of recovery isochrones, which is important when assessing substrate for cardiac arrhythmias, this novel technique opens potentially powerful opportunities for clinical application, by allowing the choice of wavelet bases that are optimized for specific clinical questions. Graphical Abstract: The inverse problem of electrocardiography is to reconstruct heart-surface potentials from recorded body-surface electrocardiograms (ECGs) and a torso-heart geometry. However, it is ill-posed and solving it requires additional constraints for regularization.
We introduce a regularization method that simultaneously pursues a sparse wavelet representation in time-frequency and exploits correlations in space. Our approach reconstructs epicardial (heart-surface) potentials with higher accuracy than common methods. It also improves the reconstruction of recovery isochrones, which is important when assessing substrate for cardiac arrhythmias. This novel technique opens potentially powerful opportunities for clinical application, by allowing the choice of wavelet bases that are optimized for specific clinical questions.
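For contrast with the wavelet approach, the Tikhonov zeroth-order baseline mentioned above is easy to sketch: the ill-posed system b = Ax is stabilized by penalizing ||x||², which leads to the normal equations (AᵀA + λI)x = Aᵀb. The tiny nearly rank-deficient matrix below is illustrative, not the torso-heart geometry:

```python
def solve(M, v):
    # Gauss-Jordan elimination with partial pivoting for a small dense system.
    n = len(M)
    M = [row[:] + [v[i]] for i, row in enumerate(M)]
    for col in range(n):
        piv = max(range(col, n), key=lambda r: abs(M[r][col]))
        M[col], M[piv] = M[piv], M[col]
        for r in range(n):
            if r != col:
                f = M[r][col] / M[col][col]
                M[r] = [a - f * b for a, b in zip(M[r], M[col])]
    return [M[i][n] / M[i][i] for i in range(n)]

def tikhonov(A, b, lam):
    # Zeroth-order Tikhonov: solve (A^T A + lam * I) x = A^T b.
    n = len(A[0])
    AtA = [[sum(A[r][i] * A[r][j] for r in range(len(A))) + (lam if i == j else 0)
            for j in range(n)] for i in range(n)]
    Atb = [sum(A[r][i] * b[r] for r in range(len(A))) for i in range(n)]
    return solve(AtA, Atb)

A = [[1.0, 1.0], [1.0, 1.001], [0.0, 0.001]]   # nearly rank-deficient "sensing" matrix
b = [2.0, 2.001, 0.001]                         # consistent with x = (1, 1)
x = tikhonov(A, b, lam=1e-6)
print(all(abs(xi - 1.0) < 0.1 for xi in x))
```

The paper's method replaces the simple ||x||² penalty with a sparsity-promoting penalty on the wavelet coefficients of x, plus spatial coupling.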
NASA Astrophysics Data System (ADS)
Goeters, Klaus-Martin; Fassbender, Christoph
A unique composition of personality assessment methods was applied to a group of 97 ESA scientists and engineers. This group is highly comparable to real astronaut candidates with respect to age and education. The tests used include personality questionnaires, problem solving in groups, and a projective technique. The study goals were: 1. Verification of psychometric qualities and applicability of tests to the target group; 2. Search for culture-fair tests by which multi-national European groups can be examined; 3. Identification of test methods by which the adaptability of the candidates to the psycho-social stress of long-duration space flights can be assessed. Based on the empirical findings, a test battery was defined which can be used in the selection of ESA space personnel.
Quality assessment of malaria laboratory diagnosis in South Africa.
Dini, Leigh; Frean, John
2003-01-01
To assess the quality of malaria diagnosis in 115 South African laboratories participating in the National Health Laboratory Service Parasitology External Quality Assessment Programme, we reviewed the results from 7 surveys from January 2000 to August 2002. The mean percentage incorrect result rate was 13.8% (95% CI 11.3-16.9%), which is alarmingly high, with about 1 in 7 blood films being incorrectly interpreted. Most participants with incorrect blood film interpretations had acceptable Giemsa staining quality, indicating that there is less of a problem with staining technique than with blood film interpretation. Laboratories in provinces in which malaria is endemic did not necessarily perform better than those in non-endemic areas. The results clearly suggest that malaria laboratory diagnosis throughout South Africa needs strengthening by improving laboratory standardization and auditing, training, quality assurance and referral resources.
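The abstract's confidence interval is presumably computed across the surveys themselves. As a generic illustration of interval estimation for an error proportion, a Wilson score interval can be computed as follows; the film counts below are hypothetical and do not reproduce the paper's exact interval:

```python
import math

def wilson_ci(errors, total, z=1.96):
    # Wilson score interval for a proportion; better behaved than the plain
    # normal approximation when the rate is far from 50% or counts are small.
    p = errors / total
    denom = 1 + z * z / total
    centre = (p + z * z / (2 * total)) / denom
    half = (z / denom) * math.sqrt(p * (1 - p) / total + z * z / (4 * total * total))
    return centre - half, centre + half

# Hypothetical counts: 69 incorrect interpretations out of 500 films (13.8%).
lo, hi = wilson_ci(69, 500)
print(round(100 * lo, 1), round(100 * hi, 1))
```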
Gambling Risk Groups are Not All the Same: Risk Factors Amongst Sports Bettors.
Russell, Alex M T; Hing, Nerilee; Li, En; Vitartas, Peter
2018-03-20
Sports betting is increasing worldwide, with an associated increase in sports betting-related problems. Previous studies have examined risk factors for problem gambling amongst sports bettors and have identified demographic, behavioural, marketing, normative and impulsiveness factors. These studies have generally compared those in problem gambling, or a combination of moderate risk and problem gambling, groups to non-problem gamblers, often due to statistical power issues. However, recent evidence suggests that, at a population level, the bulk of gambling-related harm stems from low risk and moderate risk gamblers, rather than problem gamblers. Thus it is essential to understand the risk factors for each level of gambling-related problems (low risk, moderate risk, problem) separately. The present study used a large sample (N = 1813) to compare each gambling risk group to non-problem gamblers, first using bivariate and then multivariate statistical techniques. A range of demographic, behavioural, marketing, normative and impulsiveness variables were included as possible risk factors. The results indicated that some variables, such as gambling expenditure, number of accounts with different operators, number of different types of promotions used and impulsiveness were significantly higher for all risk groups, while others such as some normative factors, age, gender and particular sports betting variables only applied to those with the highest level of gambling-related problems. The results generally supported findings from previous literature for problem gamblers, and extended these findings to low risk and moderate risk groups. In the future, where statistical power allows, risk factors should be assessed separately for all levels of gambling problems.
Control of stochastic sensitivity in a stabilization problem for gas discharge system
DOE Office of Scientific and Technical Information (OSTI.GOV)
Bashkirtseva, Irina
2015-11-30
We consider a nonlinear dynamic stochastic system with control. A problem of stochastic sensitivity synthesis of the equilibrium is studied. A mathematical technique of the solution of this problem is discussed. This technique is applied to the problem of the stabilization of the operating mode for the stochastic gas discharge system. We construct a feedback regulator that reduces the stochastic sensitivity of the equilibrium, suppresses large-amplitude oscillations, and provides a proper operation of this engineering device.
Sousa, Marcelo R; Jones, Jon P; Frind, Emil O; Rudolph, David L
2013-01-01
In contaminant travel from ground surface to groundwater receptors, the time taken in travelling through the unsaturated zone is known as the unsaturated zone time lag. Depending on the situation, this time lag may or may not be significant within the context of the overall problem. A method is presented for assessing the importance of the unsaturated zone in the travel time from source to receptor in terms of estimates of both the absolute and the relative advective times. A choice of different techniques for both unsaturated and saturated travel time estimation is provided. This method may be useful for practitioners to decide whether to incorporate unsaturated processes in conceptual and numerical models and can also be used to roughly estimate the total travel time between points near ground surface and a groundwater receptor. This method was applied to a field site located in a glacial aquifer system in Ontario, Canada. Advective travel times were estimated using techniques with different levels of sophistication. The application of the proposed method indicates that the time lag in the unsaturated zone is significant at this field site and should be taken into account. For this case, sophisticated and simplified techniques lead to similar assessments when the same knowledge of the hydraulic conductivity field is assumed. When there is significant uncertainty regarding the hydraulic conductivity, simplified calculations did not lead to a conclusive decision. Copyright © 2012 Elsevier B.V. All rights reserved.
Degeling, Koen; Schivo, Stefano; Mehra, Niven; Koffijberg, Hendrik; Langerak, Rom; de Bono, Johann S; IJzerman, Maarten J
2017-12-01
With the advent of personalized medicine, the field of health economic modeling is being challenged and the use of patient-level dynamic modeling techniques might be required. To illustrate the usability of two such techniques, timed automata (TA) and discrete event simulation (DES), for modeling personalized treatment decisions. An early health technology assessment on the use of circulating tumor cells, compared with prostate-specific antigen and bone scintigraphy, to inform treatment decisions in metastatic castration-resistant prostate cancer was performed. Both modeling techniques were assessed quantitatively, in terms of intermediate outcomes (e.g., overtreatment) and health economic outcomes (e.g., net monetary benefit). Qualitatively, among others, model structure, agent interactions, data management (i.e., importing and exporting data), and model transparency were assessed. Both models yielded realistic and similar intermediate and health economic outcomes. Overtreatment was reduced by 6.99 and 7.02 weeks by applying circulating tumor cell as a response marker at a net monetary benefit of -€1033 and -€1104 for the TA model and the DES model, respectively. Software-specific differences were observed regarding data management features and the support for statistical distributions, which were considered better for the DES software. Regarding method-specific differences, interactions were modeled more straightforward using TA, benefiting from its compositional model structure. Both techniques prove suitable for modeling personalized treatment decisions, although DES would be preferred given the current software-specific limitations of TA. When these limitations are resolved, TA would be an interesting modeling alternative if interactions are key or its compositional structure is useful to manage multi-agent complex problems. Copyright © 2017 International Society for Pharmacoeconomics and Outcomes Research (ISPOR). Published by Elsevier Inc. All rights reserved.
Numerical simulation of liquid jet impact on a rigid wall
NASA Astrophysics Data System (ADS)
Aganin, A. A.; Guseva, T. S.
2016-11-01
Basic points of a numerical technique for computing high-speed liquid jet impact on a rigid wall are presented. In the technique the flows of the liquid and the surrounding gas are governed by the equations of gas dynamics in the density, velocity, and pressure, which are integrated by the CIP-CUP method on dynamically adaptive grids without explicitly tracking the gas-liquid interface. The efficiency of the technique is demonstrated by the results of computing the problems of impact of the liquid cone and the liquid wedge on a wall in the mode with the shockwave touching the wall by its edge. Numerical solutions of these problems are compared with the analytical solution of the problem of impact of the plane liquid flow on a wall. Applicability of the technique to the problems of the high-speed liquid jet impact on a wall is illustrated by the results of computing a problem of impact of a cylindrical liquid jet with the hemispherical end on a wall covered by a layer of the same liquid.
A preliminary look at techniques used to obtain airdata from flight at high angles of attack
NASA Technical Reports Server (NTRS)
Moes, Timothy R.; Whitmore, Stephen A.
1990-01-01
Flight research at high angles of attack has posed new problems for airdata measurements. New sensors and techniques for measuring the standard airdata quantities of static pressure, dynamic pressure, angle of attack, and angle of sideslip were subsequently developed. The ongoing airdata research supporting NASA's F-18 high alpha research program is updated. Included are the techniques used and the preliminary results. The F-18 aircraft was flown with three research airdata systems: a standard airdata probe on the right wingtip, a self-aligning airdata probe on the left wingtip, and a flush airdata system on the nose cone. The primary research goal was to obtain steady-state calibrations for each airdata system up to an angle of attack of 50 deg. This goal was accomplished and preliminary accuracies of the three airdata systems were assessed and are presented. An effort to improve the fidelity of the airdata measurements during dynamic maneuvering is also discussed. This involved enhancement of the aerodynamic data with data obtained from linear accelerometers, rate gyros, and attitude gyros. Preliminary results of this technique are presented.
Hsu, Guo-Liang; Tang, Jung-Chang; Hwang, Wu-Yuin
2014-08-01
The one-more-than technique is an effective strategy for individuals with intellectual disabilities (ID) to use when making purchases. However, the heavy cognitive demands of money counting skills potentially limit how individuals with ID shop. This study employed a multiple-probe design across participants and settings, via the assistance of a mobile purchasing assistance system (MPAS), to assess the effectiveness of the one-more-than technique on independent purchases for items with prices beyond the participants' money counting skills. Results indicated that the techniques with the MPAS could effectively convert participants' initial money counting problems into useful advantages for successfully promoting the independent purchasing skills of three secondary school students with ID. Also noteworthy is the fact that mobile technologies could be a permanent prompt for those with ID to make purchases in their daily lives. The treatment effects could be maintained for eight weeks and generalized across three community settings. Implications for practice and future studies are provided. Copyright © 2014 Elsevier Ltd. All rights reserved.
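For readers unfamiliar with it, the one-more-than technique amounts to handing over one dollar more than the whole-dollar part of the price, so the change returned always covers the cents. A minimal sketch of that arithmetic (the function name and the always-one-more convention are illustrative assumptions, not details from the study):

```python
import math

def one_more_than(price):
    """Number of whole dollars to hand over under the one-more-than
    strategy: count the dollars in the price and give one more, so the
    change returned always covers the cents."""
    return math.floor(price) + 1

# e.g. an item priced $3.29 calls for handing over 4 one-dollar bills
```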
NASA Astrophysics Data System (ADS)
Sasmita, E.; Edriati, S.; Yunita, A.
2018-04-01
This study was motivated by the low first-semester mathematics scores (below the KKM, the minimum mastery criterion) in class VII at MTsN Model Padang, attributed in part to students feeling uninvolved in the learning process because the teacher did not assess their discussions. The proposed solution is discussion assessment within the Cooperative Learning Model type Numbered Heads Together (NHT). This study aims to determine whether discussion assessment in NHT affects the learning outcomes of class VII students at MTsN Model Padang. The instruments used were a discussion assessment and final tests. The data were analysed with simple linear regression. The hypothesis test yielded an F-count greater than the F-table value, so the hypothesis was accepted. It is therefore concluded that discussion assessment in NHT affects the learning outcomes of class VII students at MTsN Model Padang.
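The hypothesis test described above, comparing the F-count from a simple linear regression against an F-table value, can be sketched as follows. The data and function name are hypothetical illustrations, not from the study:

```python
def simple_regression_f(x, y):
    """Least-squares fit of y = a + b*x, returning (a, b, F) where F tests
    H0: b = 0 with 1 and n-2 degrees of freedom (F = SSreg / (SSerr/(n-2)))."""
    n = len(x)
    mx, my = sum(x) / n, sum(y) / n
    sxx = sum((xi - mx) ** 2 for xi in x)
    sxy = sum((xi - mx) * (yi - my) for xi, yi in zip(x, y))
    b = sxy / sxx
    a = my - b * mx
    ss_reg = sxy * sxy / sxx                      # regression sum of squares
    ss_err = sum((yi - my) ** 2 for yi in y) - ss_reg
    return a, b, ss_reg / (ss_err / (n - 2))
```

The hypothesis is accepted when the returned F exceeds the tabulated critical value for (1, n-2) degrees of freedom at the chosen significance level.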
A new framework for interactive quality assessment with application to light field coding
NASA Astrophysics Data System (ADS)
Viola, Irene; Ebrahimi, Touradj
2017-09-01
In recent years, light field has experienced a surge of popularity, mainly due to the recent advances in acquisition and rendering technologies that have made it more accessible to the public. Thanks to image-based rendering techniques, light field contents can be rendered in real time on common 2D screens, allowing virtual navigation through the captured scenes in an interactive fashion. However, this richer representation of the scene poses the problem of reliably assessing the quality of light field contents. In particular, while subjective methodologies that enable interaction have already been proposed, no work has been done on assessing how users interact with light field contents. In this paper, we propose a new framework to subjectively assess the quality of light field contents in an interactive manner and simultaneously track users' behaviour. The framework is successfully used to perform subjective assessment of two coding solutions. Moreover, statistical analysis performed on the results shows an interesting correlation between subjective scores and average interaction time.
NASA Technical Reports Server (NTRS)
Berger, B. S.; Duangudom, S.
1973-01-01
A technique is introduced which extends the range of useful approximation of numerical inversion techniques to many cycles of an oscillatory function without requiring either the evaluation of the image function for many values of s or the computation of higher-order terms. The technique consists in reducing a given initial value problem defined over some interval into a sequence of initial value problems defined over a set of subintervals. Several numerical examples demonstrate the utility of the method.
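The restarting idea, reducing one initial value problem over a long interval to a sequence of problems over subintervals, can be illustrated generically. This toy uses a classical RK4 integrator rather than the paper's numerical Laplace-inversion scheme; all names are illustrative:

```python
def rk4_step(f, t, y, h):
    """One classical fourth-order Runge-Kutta step for y' = f(t, y)."""
    k1 = f(t, y)
    k2 = f(t + h / 2, y + h * k1 / 2)
    k3 = f(t + h / 2, y + h * k2 / 2)
    k4 = f(t + h, y + h * k3)
    return y + h * (k1 + 2 * k2 + 2 * k3 + k4) / 6

def solve_by_subintervals(f, y0, t0, t1, n_sub, steps_per_sub):
    """Split [t0, t1] into n_sub subintervals; the solution at the end of
    each subinterval becomes the initial value for the next one, mirroring
    the reduction of one initial value problem to a sequence of them."""
    y, t = y0, t0
    h = (t1 - t0) / (n_sub * steps_per_sub)
    for _ in range(n_sub):
        for _ in range(steps_per_sub):
            y = rk4_step(f, t, y, h)
            t += h
    return y
```

For y' = -y with y(0) = 1, integrating to t = 2 over four subintervals reproduces exp(-2) to high accuracy.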
Solving fractional optimal control problems within a Chebyshev-Legendre operational technique
NASA Astrophysics Data System (ADS)
Bhrawy, A. H.; Ezz-Eldien, S. S.; Doha, E. H.; Abdelkawy, M. A.; Baleanu, D.
2017-06-01
In this manuscript, we report a new operational technique for approximating the numerical solution of fractional optimal control (FOC) problems. The operational matrix of the Caputo fractional derivative of the orthonormal Chebyshev polynomial and the Legendre-Gauss quadrature formula are used, and then the Lagrange multiplier scheme is employed for reducing such problems into those consisting of systems of easily solvable algebraic equations. We compare the approximate solutions achieved using our approach with the exact solutions and with those presented in other techniques and we show the accuracy and applicability of the new numerical approach, through two numerical examples.
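For reference, the Caputo fractional derivative on which such operational matrices are built is given by the standard definition below (with integer n satisfying n - 1 < α ≤ n; this is textbook material, not a formula quoted from the paper):

```latex
{}^{C}\!D^{\alpha} f(t) \;=\; \frac{1}{\Gamma(n-\alpha)} \int_{0}^{t} \frac{f^{(n)}(\tau)}{(t-\tau)^{\alpha - n + 1}}\, d\tau,
\qquad n - 1 < \alpha \le n .
```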
The flying hot wire and related instrumentation
NASA Technical Reports Server (NTRS)
Coles, D.; Cantwell, B.; Wadcock, A.
1978-01-01
A flying hot-wire technique is proposed for studies of separated turbulent flow in wind tunnels. The technique avoids the problem of signal rectification in regions of high turbulence level by moving the probe rapidly through the flow on the end of a rotating arm. New problems which arise include control of effects of torque variation on rotor speed, avoidance of interference from the wake of the moving arms, and synchronization of data acquisition with rotation. Solutions for these problems are described. The self-calibrating feature of the technique is illustrated by a sample X-array calibration.
Taylor, George C.
1971-01-01
Hydrologic instrumentation and methodology for assessing water-resource potentials have originated largely in the developed countries of the temperate zone. The developing countries lie largely in the tropic zone, which contains the full gamut of the earth's climatic environments, including most of those of the temperate zone. For this reason, most hydrologic techniques have world-wide applicability. Techniques for assessing water-resource potentials for the high priority goals of economic growth are well established in the developing countries, though much more so in some than in others. Conventional techniques for measurement and evaluation of basic hydrologic parameters are now well-understood in the developing countries and are generally adequate for their current needs and those of the immediate future. Institutional and economic constraints, however, inhibit growth of sustained programs of hydrologic data collection and application of the data to problems in engineering technology. Computer-based technology, including processing of hydrologic data and mathematical modelling of hydrologic parameters, is also well begun in many developing countries and has much wider potential application. In some developing countries, however, there is a tendency to look on the computer as a panacea for deficiencies in basic hydrologic data collection programs. This fallacy must be discouraged, as the computer is a tool and not a "magic box." There is no real substitute for sound programs of basic data collection. Nuclear and isotopic techniques are being used increasingly in the developed countries in the measurement and evaluation of virtually all hydrologic parameters for which conventional techniques have been used traditionally. Even in the developed countries, however, many hydrologists are not using nuclear techniques, simply because they lack knowledge of the principles involved and of the potential benefits.
Nuclear methodology in hydrologic applications is generally more complex than the conventional and hence requires a high level of technical expertise for effective use. Application of nuclear techniques to hydrologic problems in the developing countries is likely to be marginal for some years to come, owing to the higher costs involved and expertise required. Nuclear techniques, however, would seem to have particular promise in studies of water movement in unsaturated soils and of erosion and sedimentation, where conventional techniques are inadequate, inefficient and in some cases costly. Remote sensing offers great promise for synoptic evaluations of water resources and hydrologic processes, including the transient phenomena of the hydrologic cycle. Remote sensing is not, however, a panacea for deficiencies in hydrologic data programs in the developing countries. Rather it is a means for extending and augmenting on-the-ground observations and surveys (ground truth) to evaluate water resources and hydrologic processes on a regional or even continental scale. With respect to economic growth goals in developing countries, there are few identifiable gaps in existing hydrologic instrumentation and methodology insofar as appraisal, development and management of available water resources are concerned. What is needed is acceleration of institutional development and professional motivation toward more effective use of existing and proven methodology. Moreover, much sophisticated methodology can be applied effectively in the developing countries only when adequate levels of indigenous scientific skills have been reached and supportive institutional frameworks have evolved to viability.
Application of artificial intelligence to impulsive orbital transfers
NASA Technical Reports Server (NTRS)
Burns, Rowland E.
1987-01-01
A generalized technique for the numerical solution of any given class of problems is presented. The technique requires the analytic (or numerical) solution of every applicable equation for all variables that appear in the problem. Conditional blocks are employed to rapidly expand the set of known variables from a minimum of input. The method is illustrated via the use of the Hohmann transfer problem from orbital mechanics.
Practical semen analysis: from A to Z
Brazil, Charlene
2010-01-01
Accurate semen analysis is critical for decisions about patient care, as well as for studies addressing overall changes in semen quality, contraceptive efficacy and effects of toxicant exposure. The standardization of semen analysis is very difficult for many reasons, including the use of subjective techniques with no standards for comparison, poor technician training, problems with proficiency testing and a reluctance to change techniques. The World Health Organization (WHO) Semen handbook (2010) offers a vastly improved set of standardized procedures, all at a level of detail that will preclude most misinterpretations. However, there is a limit to what can be learned from words and pictures alone. A WHO-produced DVD that offers complete demonstrations of each technique along with quality assurance standards for motility, morphology and concentration assessments would enhance the effectiveness of the manual. However, neither the manual nor a DVD will help unless there is general acknowledgement of the critical need to standardize techniques and rigorously pursue quality control to ensure that laboratories actually perform techniques 'according to WHO' instead of merely reporting that they have done so. Unless improvements are made, patient results will continue to be compromised and comparison between studies and laboratories will have limited merit. PMID:20111076
Outcomes in Patients Treated with a Novel, Simple Method for Hemostasis of Dermal Avulsion Injuries.
Dowling, Sean Taylor; Lin, Brian Wai
2017-10-01
A recently described technique proposes a simple method to achieve permanent hemostasis of distal fingertip dermal avulsion injuries. It is simple to learn and easy to perform with readily available materials found in most emergency departments. However, long-term outcomes for patients treated with this technique have not yet been evaluated. A primary objective of the current article is to provide safety data for the technique using an off-label product indication. Emergency department of Kaiser Permanente Medical Center, San Francisco, California. Six patients were treated in the emergency department for fingertip dermal avulsion injuries using a tourniquet and tissue adhesive glue (Dermabond by Ethicon, Somerville, New Jersey). Patients were subsequently contacted to assess healing and satisfaction with cosmetic outcome through interview and photographs of their wounds at 9 months following the date of injury. All 6 patients were satisfied with the cosmetic outcome of treatment, and none received a diagnosis of serious complications. This series demonstrates cosmetic outcomes for injuries treated with the technique, highlights potential problems that may be perceived by patients during their clinical course, and creates the groundwork for a larger clinical study examining the use of the technique.
Nondestructive evaluation technique guide
NASA Technical Reports Server (NTRS)
Vary, A.
1973-01-01
A total of 70 individual nondestructive evaluation (NDE) techniques are described. Information is presented that permits ease of comparison of the merits and limitations of each technique with respect to various NDE problems. An NDE technique classification system is presented. It is based on the system that was adopted by the National Materials Advisory Board (NMAB). The classification system presented follows the NMAB system closely with the exception of additional categories that have been added to cover more advanced techniques presently in use. The rationale of the technique is explained. The format provides for a concise description of each technique, the physical principles involved, objectives of interrogation, example applications, limitations of each technique, a schematic illustration, and key reference material. Cross-index tabulations are also provided so that particular NDE problems can be referred to appropriate techniques.
Kohen, D P; Olness, K N; Colwell, S O; Heimel, A
1984-02-01
This report assessed outcomes of hypnotherapeutic interventions for 505 children and adolescents seen by four pediatricians over a period of one year and followed from four months to two years. Presenting problems included enuresis, acute pain, chronic pain, asthma, habit disorders, obesity, encopresis, and anxiety. Using strict criteria for determination of problem resolution (e.g., all beds dry) and recognizing that some conditions were intrinsically chronic, the authors found that 51% of these children and adolescents achieved complete resolution of the presenting problem; an additional 32% achieved significant improvement, 9% showed initial or some improvement; and 7% demonstrated no apparent change or improvement. Children as young as three years of age effectively applied self-hypnosis techniques. In general, facility in self-hypnosis increased with age. There was an inverse correlation (p less than 0.001) between clinical success and number of visits, suggesting that prediction of responsivity is possible after four visits or less.
Certification trails and software design for testability
NASA Technical Reports Server (NTRS)
Sullivan, Gregory F.; Wilson, Dwight S.; Masson, Gerald M.
1993-01-01
Design techniques which may be applied to make program testing easier were investigated. Methods for modifying a program to generate additional data which we refer to as a certification trail are presented. This additional data is designed to allow the program output to be checked more quickly and effectively. Certification trails were described primarily from a theoretical perspective. A comprehensive attempt to assess experimentally the performance and overall value of the certification trail method is reported. The method was applied to nine fundamental, well-known algorithms for the following problems: convex hull, sorting, Huffman tree, shortest path, closest pair, line segment intersection, longest increasing subsequence, skyline, and Voronoi diagram. Run-time performance data for each of these problems is given, and selected problems are described in more detail. Our results indicate that there are many cases in which certification trails allow for significantly faster overall program execution time than a 2-version programming approach, and also give further evidence of the breadth of applicability of this method.
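The certification-trail idea can be illustrated with sorting: the sorter emits a permutation as its trail, and an independent checker verifies the output in linear time instead of re-sorting. A minimal sketch under those assumptions, not the authors' implementation:

```python
def sort_with_trail(xs):
    """Sort xs and also emit a certification trail: for each output
    position, the index of that element in the original input."""
    order = sorted(range(len(xs)), key=lambda i: xs[i])
    return [xs[i] for i in order], order

def check_trail(xs, ys, trail):
    """Verify in linear time that ys is a sorted permutation of xs,
    using the trail instead of re-sorting."""
    n = len(xs)
    if len(ys) != n or len(trail) != n:
        return False
    seen = [False] * n
    for k, i in enumerate(trail):
        # trail must be a permutation, and ys must be xs rearranged by it
        if not (0 <= i < n) or seen[i] or ys[k] != xs[i]:
            return False
        seen[i] = True
    return all(ys[k] <= ys[k + 1] for k in range(n - 1))
```

The checker does only O(n) comparisons, which is the source of the speedup over running a second, independent sort.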
Optimal structure and parameter learning of Ising models
Lokhov, Andrey; Vuffray, Marc Denis; Misra, Sidhant; ...
2018-03-16
Reconstruction of the structure and parameters of an Ising model from binary samples is a problem of practical importance in a variety of disciplines, ranging from statistical physics and computational biology to image processing and machine learning. The focus of the research community has shifted toward developing universal reconstruction algorithms that are both computationally efficient and require the minimal amount of expensive data. Here, we introduce a new method, interaction screening, which accurately estimates model parameters using local optimization problems. The algorithm provably achieves perfect graph structure recovery with an information-theoretically optimal number of samples, notably in the low-temperature regime, which is known to be the hardest for learning. The efficacy of interaction screening is assessed through extensive numerical tests on synthetic Ising models of various topologies with different types of interactions, as well as on real data produced by a D-Wave quantum computer. Finally, this study shows that the interaction screening method is an exact, tractable, and optimal technique that universally solves the inverse Ising problem.
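The interaction-screening objective for a single node u is the empirical average of exp(-s_u Σ_j J_j s_j), whose population minimizer is the true vector of couplings incident to u. A minimal, unregularized sketch on a toy three-spin model (the published estimator adds l1 regularization and scales to much larger graphs; all names here are illustrative):

```python
import itertools
import math
import random

def sample_ising(J, n, m, rng):
    """Draw m configurations from a tiny Ising model P(s) proportional to
    exp(sum_{i<j} J[i][j] * s_i * s_j), by exact enumeration of all 2^n states."""
    configs = list(itertools.product([-1, 1], repeat=n))
    weights = [math.exp(sum(J[i][j] * s[i] * s[j]
                            for i in range(n) for j in range(i + 1, n)))
               for s in configs]
    return rng.choices(configs, weights=weights, k=m)

def screen_node(samples, u, n, lr=0.15, iters=500):
    """Interaction screening for node u: minimize the empirical average of
    exp(-s_u * sum_{j != u} J_j * s_j) over the couplings J by gradient
    descent; the objective is convex in J."""
    J = [0.0] * n
    m = len(samples)
    for _ in range(iters):
        grad = [0.0] * n
        for s in samples:
            e = math.exp(-s[u] * sum(J[j] * s[j] for j in range(n) if j != u))
            for j in range(n):
                if j != u:
                    grad[j] -= e * s[u] * s[j]
        for j in range(n):
            J[j] -= lr * grad[j] / m
    return J
```

For couplings 0.8 between spins 0 and 1 and 0.5 between spins 1 and 2, screening node 0 from a moderate number of samples recovers a coupling near 0.8 to spin 1 and near zero to spin 2.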
NASA Technical Reports Server (NTRS)
1972-01-01
The systems approach was used in the seminar on the complex multidisciplinary problem of housing and related environment conditions. The main areas of study are the following; historical overview of housing; diagrammatic presentation of the problem; technological innovations and contributions; management, economic, legal, and political considerations; environment and natural resources; human needs and behavior; model of the housing industry; and potential for implementation. It is felt that a greater attempt should be made to transfer aerospace technology to the housing industry; however, the emphasis of the conference was directed to the modern management techniques developed by NASA. Among the conclusions are the following: The extent and character of the housing problem should be defined. Increased coordination of housing programs within and between Federal agencies is essential. Development of physically sophisticated building systems requires Federal support. New towns of differing life styles need to be created. Physiological and psychological reactions to environmental enclosure need to be defined.
Constraint-based integration of planning and scheduling for space-based observatory management
NASA Technical Reports Server (NTRS)
Muscettola, Nicola; Smith, Steven F.
1994-01-01
Progress toward the development of effective, practical solutions to space-based observatory scheduling problems within the HSTS scheduling framework is reported. HSTS was developed and originally applied in the context of the Hubble Space Telescope (HST) short-term observation scheduling problem. The work was motivated by the limitations of the current solution and, more generally, by the insufficiency of classical planning and scheduling approaches in this problem context. HSTS has subsequently been used to develop improved heuristic solution techniques in related scheduling domains and is currently being applied to develop a scheduling tool for the upcoming Submillimeter Wave Astronomy Satellite (SWAS) mission. The salient architectural characteristics of HSTS and their relationship to previous scheduling and AI planning research are summarized. Then, some key problem decomposition techniques underlying the integrated planning and scheduling approach to the HST problem are described; research results indicate that these techniques provide leverage in solving space-based observatory scheduling problems. Finally, more recently developed constraint-posting scheduling procedures and the current SWAS application focus are summarized.
ERIC Educational Resources Information Center
Newby, Michael; Nguyen, ThuyUyen H.
2010-01-01
This paper examines the effectiveness of a technique that first appeared as a Teaching Tip in the Journal of Information Systems Education. In this approach the same problem is used in every programming assignment within a course, but the students are required to use different programming techniques. This approach was used in an intermediate C++…
A new statistical framework to assess structural alignment quality using information compression
Collier, James H.; Allison, Lloyd; Lesk, Arthur M.; Garcia de la Banda, Maria; Konagurthu, Arun S.
2014-01-01
Motivation: Progress in protein biology depends on the reliability of results from a handful of computational techniques, structural alignments being one. Recent reviews have highlighted substantial inconsistencies and differences between alignment results generated by the ever-growing stock of structural alignment programs. The lack of consensus on how the quality of structural alignments must be assessed has been identified as the main cause for the observed differences. Current methods assess structural alignment quality by constructing a scoring function that attempts to balance conflicting criteria, mainly alignment coverage and fidelity of structures under superposition. This traditional approach to measuring alignment quality, the subject of considerable literature, has failed to solve the problem. Further development along the same lines is unlikely to rectify the current deficiencies in the field. Results: This paper proposes a new statistical framework to assess structural alignment quality and significance based on lossless information compression. This is a radical departure from the traditional approach of formulating scoring functions. It links the structural alignment problem to the general class of statistical inductive inference problems, solved using the information-theoretic criterion of minimum message length. Based on this, we developed an efficient and reliable measure of structural alignment quality, I-value. The performance of I-value is demonstrated in comparison with a number of popular scoring functions, on a large collection of competing alignments. Our analysis shows that I-value provides a rigorous and reliable quantification of structural alignment quality, addressing a major gap in the field. Availability: http://lcb.infotech.monash.edu.au/I-value Contact: arun.konagurthu@monash.edu Supplementary information: Online supplementary data are available at http://lcb.infotech.monash.edu.au/I-value/suppl.html PMID:25161241
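The minimum message length criterion that underpins I-value prefers the hypothesis H (here, a structural alignment) minimizing the total length of a two-part message encoding H and then the data D given H. In its standard formulation (textbook MML, not a formula quoted from the paper):

```latex
I(H \,\&\, D) \;=\; I(H) + I(D \mid H) \;=\; -\log_2 \Pr(H) \;-\; \log_2 \Pr(D \mid H).
```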
Honey bee-inspired algorithms for SNP haplotype reconstruction problem
NASA Astrophysics Data System (ADS)
PourkamaliAnaraki, Maryam; Sadeghi, Mehdi
2016-03-01
Reconstructing haplotypes from SNP fragments is an important problem in computational biology. There has been a lot of interest in this field because haplotypes have been shown to contain promising data for disease association research. Haplotype reconstruction in the Minimum Error Correction model has been proved to be an NP-hard problem. Therefore, several methods such as clustering techniques, evolutionary algorithms, neural networks and swarm intelligence approaches have been proposed in order to solve this problem in appropriate time. In this paper, we focus on various evolutionary clustering techniques and try to find an efficient technique for solving the haplotype reconstruction problem. Our experiments indicate that clustering methods relying on the behaviour of honey bee colonies in nature, specifically the bees algorithm and artificial bee colony methods, are expected to result in more efficient solutions. An application program of the methods is available at the following link. http://www.bioinf.cs.ipm.ir/software/haprs/
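The Minimum Error Correction (MEC) objective the abstract refers to can be illustrated with a small sketch. Assuming the standard formulation — SNP fragments over {'0', '1'} with '-' marking uncalled sites, partitioned into two clusters whose column-wise majority consensus defines the reconstructed haplotype pair — the MEC score counts the SNP calls that must be flipped for every fragment to agree with its cluster's consensus. The function name and data layout are illustrative, not taken from the paper:

```python
def mec_score(fragments, partition):
    """Minimum Error Correction score for a 2-cluster partition of SNP
    fragments. Fragments are equal-length strings over {'0', '1', '-'}
    ('-' = no call at that site); partition[i] in {0, 1} assigns fragment i
    to a cluster."""
    score = 0
    for cluster in (0, 1):
        members = [f for f, c in zip(fragments, partition) if c == cluster]
        if not members:
            continue
        n = len(members[0])
        # Column-wise majority vote gives the cluster's consensus haplotype.
        consensus = []
        for j in range(n):
            calls = [f[j] for f in members if f[j] != '-']
            consensus.append('1' if calls.count('1') > calls.count('0') else '0')
        # Each disagreement with the consensus is one required correction.
        for f in members:
            score += sum(1 for j in range(n)
                         if f[j] != '-' and f[j] != consensus[j])
    return score
```

A clustering heuristic (bee-colony-inspired or otherwise) then searches over partitions to minimize this score.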
Choosing Objectives in Over-Subscription Planning
NASA Technical Reports Server (NTRS)
Smith, David E.
2003-01-01
Many NASA planning problems are over-subscription problems - that is, there are a large number of possible goals of differing value, and the planning system must choose a subset that can be accomplished within the limited time and resources available. Examples include planning for telescopes like Hubble, SIRTF, and SOFIA; scheduling for the Deep Space Network; and planning science experiments for a Mars rover. Unfortunately, existing planning systems are not designed to deal with problems like this - they expect a well-defined conjunctive goal and terminate in failure unless the entire goal is achieved. In this paper we develop techniques for over-subscription problems that assist a classical planner in choosing which goals to achieve, and the order in which to achieve them. These techniques use plan graph cost-estimation techniques to construct an orienteering problem, which is then used to provide heuristic advice on the goals and goal order that should be considered by a planner.
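The goal-selection problem underneath over-subscription planning can be sketched as a knapsack-style choice. The snippet below is a hypothetical value-density greedy baseline, not the orienteering-problem heuristic the paper develops; the goal names, values, and costs are illustrative, and costs are assumed positive:

```python
def choose_goals(goals, budget):
    """Greedy selection for an over-subscription problem: pick goals by
    value-to-cost density until the resource budget is exhausted.
    `goals` is a list of (name, value, cost) tuples with positive costs."""
    chosen, remaining = [], budget
    # Highest value per unit cost first -- a standard knapsack heuristic.
    for name, value, cost in sorted(goals, key=lambda g: g[1] / g[2], reverse=True):
        if cost <= remaining:
            chosen.append(name)
            remaining -= cost
    return chosen
```

A planner would then attempt only the chosen subset, in an order suggested by a separate heuristic.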
NASA Technical Reports Server (NTRS)
Riedel, S. A.
1979-01-01
A method by which modern and classical control theory techniques may be integrated in a synergistic fashion and used in the design of practical flight control systems is presented. A general procedure is developed, and several illustrative examples are included. Emphasis is placed not only on the synthesis of the design, but on the assessment of the results as well. The first step is to establish the differences, distinguishing characteristics and connections between the modern and classical control theory approaches. Ultimately, this uncovers a relationship between bandwidth goals familiar in classical control and cost function weights in the equivalent optimal system. In order to obtain a practical optimal solution, it is also necessary to formulate the problem very carefully, and each choice of state, measurement and output variable must be judiciously considered. Once design goals are established and problem formulation completed, the control system is synthesized in a straightforward manner. Three steps are involved: filter-observer solution, regulator solution, and the combination of those two into the controller. Assessment of the controller permits an examination and expansion of the synthesis results.
Methods for detection of haemophilia carriers: a Memorandum*
1977-01-01
This Memorandum discusses the problems and techniques involved in the detection of carriers of haemophilia A (blood coagulation factor VIII deficiency) and haemophilia B (factor IX deficiency), particularly with a view to its application to genetic counselling. Apart from the personal suffering caused by haemophilia, the proper treatment of haemophiliacs places a great strain on the blood transfusion services, and it is therefore important that potential carriers should have precise information about the consequences of their having children. The Memorandum classifies the types of carrier and describes the laboratory methods used for the assessment of coagulant activity and antigen concentration in blood. Particular emphasis is laid on the establishment of international, national, and laboratory (working) standards for factors VIII and IX and their calibration in international units (IU). This is followed by a detailed account of the statistical analysis of pedigree and laboratory data, which leads to an assessment of the likelihood that a particular person will transmit the haemophilia gene to her children. Finally, the problems and responsibilities involved in genetic counselling are considered. PMID:304395
Benchmarking NNWSI flow and transport codes: COVE 1 results
DOE Office of Scientific and Technical Information (OSTI.GOV)
Hayden, N.K.
1985-06-01
The code verification (COVE) activity of the Nevada Nuclear Waste Storage Investigations (NNWSI) Project is the first step in certification of flow and transport codes used for NNWSI performance assessments of a geologic repository for disposing of high-level radioactive wastes. The goals of the COVE activity are (1) to demonstrate and compare the numerical accuracy and sensitivity of certain codes, (2) to identify and resolve problems in running typical NNWSI performance assessment calculations, and (3) to evaluate computer requirements for running the codes. This report describes the work done for COVE 1, the first step in benchmarking some of the codes. Isothermal calculations for the COVE 1 benchmarking have been completed using the hydrologic flow codes SAGUARO, TRUST, and GWVIP; the radionuclide transport codes FEMTRAN and TRUMP; and the coupled flow and transport code TRACR3D. This report presents the results of three cases of the benchmarking problem solved for COVE 1, a comparison of the results, questions raised regarding sensitivities to modeling techniques, and conclusions drawn regarding the status and numerical sensitivities of the codes. 30 refs.
Sociodemographic factors in Arab children with Autism Spectrum Disorders
Amr, Mostafa; Bu Ali, WaleedAl; Hablas, Hatem; Raddad, Dahoud; El-Mehesh, Fatma; El-Gilany, Abdel-Hady; Al-Shamy, Hemdan
2012-01-01
Introduction There is a critical gap in Autistic Spectrum Disorders (ASD) research with respect to manifestations of the condition in developing countries. This study examined the influence of sociodemographic variables on the severity of autistic symptoms and behavioral profile in Arab children. Methods The total study sample comprised 60 Arab children (38 boys and 22 girls) from three Arab countries (22 Jordanians, 19 Saudis and 19 Egyptians). The diagnosis of Autism Spectrum Disorders (ASD) was based on DSM-IV criteria supplemented by direct observation according to the Indian Scale for Assessment of Autism (ISAA) and assessment of Intelligence Quotient (IQ). Finally, parents rated their child on the Achenbach Child Behavior Checklist (CBCL). Results It was found that housewives and Saudi parents described more autistic symptoms and externalizing behavior problems. A significant negative correlation was found between IQ and each of the ISAA, CBCL Internalizing and Externalizing problems scores. Conclusion The study concluded that the clinical presentation of ASD may be shaped by cultural factors that are likely to help to formulate specific diagnosis and intervention techniques in Arab children with ASD. PMID:23346279
Assessing Students' Mathematical Problem Posing
ERIC Educational Resources Information Center
Silver, Edward A.; Cai, Jinfa
2005-01-01
Specific examples are used to discuss assessment, an integral part of mathematics instruction, with problem posing and assessment of problem posing. General assessment criteria are suggested to evaluate student-generated problems in terms of their quantity, originality, and complexity.
NASA Astrophysics Data System (ADS)
Fikri, P. M.; Sinaga, P.; Hasanah, L.; Solehat, D.
2018-05-01
This study aims to determine the profile of students’ generated representations and creative thinking skills in problem solving in a vocational school. It is a descriptive study characterizing the generated representations and creative thinking skills in problem solving of vocational school students in Bandung. Data were collected through tests, observation, and interviews. A representation is something that represents, describes or symbolizes an object or process. Multi-representation skill was evaluated with an essay test, using a scoring rubric to assess students’ multi-representation skills. Creative thinking skill in problem solving was evaluated with an essay test covering the component skills of fact finding, problem finding, idea finding and solution finding. The results showed that generated representation is still relatively low: on average, students’ answers were mathematically correct but lacked any verbal or graphical explanation. Creative thinking skill in problem solving is also relatively low: the average score for the problem finding indicator was 1.52 (not creative), the average score for the idea finding indicator was 1.23 (not creative), and the average score for solution finding was 0.72 (very uncreative).
Problems with numerical techniques: Application to mid-loop operation transients
DOE Office of Scientific and Technical Information (OSTI.GOV)
Bryce, W.M.; Lillington, J.N.
1997-07-01
There has been an increasing need to consider accidents at shutdown which have been shown in some PSAs to provide a significant contribution to overall risk. In the UK experience has been gained at three levels: (1) Assessment of codes against experiments; (2) Plant studies specifically for Sizewell B; and (3) Detailed review of modelling to support the plant studies for Sizewell B. The work has largely been carried out using various versions of RELAP5 and SCDAP/RELAP5. The paper details some of the problems that have needed to be addressed. It is believed by the authors that these kinds of problems are probably generic to most of the present generation system thermal-hydraulic codes for the conditions present in mid-loop transients. Thus as far as possible these problems and solutions are proposed in generic terms. The areas addressed include: condensables at low pressure, poor time step calculation detection, water packing, inadequate physical modelling, numerical heat transfer and mass errors. In general single code modifications have been proposed to solve the problems. These have been very much concerned with means of improving existing models rather than by formulating a completely new approach. They have been produced after a particular problem has arisen. Thus, and this has been borne out in practice, the danger is that when new transients are attempted, new problems arise which then also require patching.
Artificial intelligence technology assessment for the US Army Depot System Command
DOE Office of Scientific and Technical Information (OSTI.GOV)
Pennock, K A
1991-07-01
This assessment of artificial intelligence (AI) has been prepared for the US Army's Depot System Command (DESCOM) by Pacific Northwest Laboratory. The report describes several of the more promising AI technologies, focusing primarily on knowledge-based systems because they have been more successful in commercial applications than any other AI technique. The report also identifies potential Depot applications in the areas of procedural support, scheduling and planning, automated inspection, training, diagnostics, and robotic systems. One of the principal objectives of the report is to help decisionmakers within DESCOM to evaluate AI as a possible tool for solving individual depot problems. The report identifies a number of factors that should be considered in such evaluations. 22 refs.
The use of computed tomography for assessment of the swim bladder in koi carp (Cyprinus carpio).
Pees, Michael; Pees, Kathrin; Kiefer, Ingmar
2010-01-01
Seven normal koi (Cyprinus carpio) and seven koi with negative buoyancy were examined using computed tomography (CT) to assess the swim bladder. The volume of the swim bladder was calculated in all animals. In the healthy koi there was a statistical correlation (r = 0.996) between body mass and swim bladder volume with volume (ml) being related to body mass according to the formula 4.9 +/- 0.054 x BM (g). In all koi with buoyancy problems, the gas volume of the swim bladder was reduced. Additionally, fluid was found within the swim bladder in three of the abnormal koi. CT proved to be a quick noninvasive technique for the examination of the swim bladder in koi.
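Reading the reported relation as a linear regression, volume = 4.9 + 0.054 × body mass — an assumption, since the "+/-" in the abstract text is ambiguous — the predicted swim bladder volume can be sketched as:

```python
def predicted_swim_bladder_volume(body_mass_g):
    """Predicted swim bladder volume (ml) in healthy koi from body mass (g),
    using the linear relation reported in the study, read here as
    volume = 4.9 + 0.054 * BM."""
    return 4.9 + 0.054 * body_mass_g
```

For a 1 kg koi this predicts roughly 59 ml; an abnormally small CT-measured volume relative to this prediction would flag a buoyancy problem.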
Using Literacy Techniques to Teach Astronomy to Non-Science Majors
NASA Astrophysics Data System (ADS)
Garland, C. A.; Ratay, D. L.
We discuss an introductory-level college astronomy class that significantly relied on reading and writing assignments to deliver basic content knowledge and provide a basis for deeper analysis of the material. As opposed to the traditional problem-set method of homework, students were required to read popular articles from magazines and newspapers related to the content presented in class, and then prepare responses. These responses ranged from methodological analyses to using the readings to create original science journalism. Additional forms of assessment indicated that students benefited from this type of course design. We propose that given the background of students in this type of course, our course design is better suited to engage students in the material and provides a valid alternative method of assessment.
Optical Molecular Imaging for Diagnosing Intestinal Diseases
Kim, Sang-Yeob
2013-01-01
Real-time visualization of the molecular signature of cells can be achieved with advanced targeted imaging techniques using molecular probes and fluorescence endoscopy. This molecular optical imaging in gastrointestinal endoscopy is promising for improving the detection of neoplastic lesions, their characterization for patient stratification, and the assessment of their response to molecular targeted therapy and radiotherapy. In inflammatory bowel disease, this method can be used to detect dysplasia in the presence of background inflammation and to visualize inflammatory molecular targets for assessing disease severity and prognosis. Several preclinical and clinical trials have applied this method in endoscopy; however, this field has just started to evolve. Hence, many problems have yet to be solved to enable the clinical application of this novel method. PMID:24340254
NASA Technical Reports Server (NTRS)
1985-01-01
The service life of the Space Shuttle Main Engine (SSME) turbomachinery bearings was a predominant factor in engine durability and maintenance problems. Recent data have indicated that bearing life is about one order of magnitude lower than the goal of seven and one-half hours, particularly for the bearings in the High Pressure Oxidizer Turbopump (HPOTP). Bearing technology, primarily cryogenic turbomachinery bearing technology, is expanded by exploring the life and performance effects of design changes; design concept changes; materials changes; manufacturing technique changes; and lubrication system changes. Each variation is assessed against the current bearing design in full scale cryogenic tests.
Investigation of parabolic computational techniques for internal high-speed viscous flows
NASA Technical Reports Server (NTRS)
Anderson, O. L.; Power, G. D.
1985-01-01
A feasibility study was conducted to assess the applicability of an existing parabolic analysis (ADD-Axisymmetric Diffuser Duct), developed previously for subsonic viscous internal flows, to mixed supersonic/subsonic flows with heat addition simulating a SCRAMJET combustor. A study was conducted with the ADD code modified to include additional convection effects in the normal momentum equation when supersonic expansion and compression waves were present. It is concluded from the present study that for the class of problems where strong viscous/inviscid interactions are present a global iteration procedure is required.
Engineering solutions of environmental problems in organic waste handling
NASA Astrophysics Data System (ADS)
Briukhanov, A. Y.; Vasilev, E. V.; Shalavina, E. V.; Kucheruk, O. N.
2017-10-01
This study shows the urgent need to consider modernization of agricultural production in terms of sustainable development, which takes into account environmental implications of intensive technologies in livestock farming. Some science-based approaches are offered to address related environmental challenges. High-end technologies of organic livestock waste processing were substantiated by the feasibility study and nutrient balance calculation. The technologies were assessed on the basis of best available techniques criteria, including measures such as specific capital and operational costs associated with nutrient conservation and their delivery to the plants.
Validation of the Transient Structural Response of a Threaded Assembly: Phase I
DOE Office of Scientific and Technical Information (OSTI.GOV)
Doebling, Scott W.; Hemez, Francois M.; Robertson, Amy N.
2004-04-01
This report explores the application of model validation techniques in structural dynamics. The problem of interest is the propagation of an explosive-driven mechanical shock through a complex threaded joint. The study serves the purpose of assessing whether validating a large-size computational model is feasible, which unit experiments are required, and where the main sources of uncertainty reside. The results documented here are preliminary, and the analyses are exploratory in nature. The results obtained to date reveal several deficiencies of the analysis, to be rectified in future work.
Acute poisoning: understanding 90% of cases in a nutshell
Greene, S; Dargan, P; Jones, A
2005-01-01
The acutely poisoned patient remains a common problem facing doctors working in acute medicine in the United Kingdom and worldwide. This review examines the initial management of the acutely poisoned patient. Aspects of general management are reviewed including immediate interventions, investigations, gastrointestinal decontamination techniques, use of antidotes, methods to increase poison elimination, and psychological assessment. More common and serious poisonings caused by paracetamol, salicylates, opioids, tricyclic antidepressants, selective serotonin reuptake inhibitors, benzodiazepines, non-steroidal anti-inflammatory drugs, and cocaine are discussed in detail. Specific aspects of common paediatric poisonings are reviewed. PMID:15811881
NASA Technical Reports Server (NTRS)
Rodgers, E. B.; Seale, D. B.; Boraas, M. E.; Sommer, C. V.
1989-01-01
The probable sources and implications of microbial contamination on the proposed Space Station are discussed. Because of the limited availability of material, facilities and time on the Space Station, we are exploring the feasibility of replacing traditional incubation methods for assessing microbial contamination with rapid, automated methods. Some possibilities include: ATP measurement, microscopy and telecommunications, and molecular techniques such as DNA probes or monoclonal antibodies. Some of the important ecological factors that could alter microbes in space include microgravity, exposure to radiation, and antibiotic resistance.
Recent progress in Precambrian paleobiology
NASA Technical Reports Server (NTRS)
Schopf, J. W.
1986-01-01
Ongoing studies at UCLA include the following: (1) investigations in Archean and Proterozoic sequences of various locations; (2) laboratory and field studies of modern microbial biocoenoses (analogues of Precambrian microbial communities) especially those at Laguna Mormona, Baja California, Mexico; (3) development of new laboratory techniques for the separation and concentration of minute cellularly preserved fossils for isotopic and organic geochemical analyses; and (4) assembly of a computerized database for assessment of the timing and nature of major events occurring during Precambrian biotic evolution, and of the potential applicability of ancient microbiotas to problems of global biostratigraphy and biogeography.
Remote sensing in Michigan for land resource management
NASA Technical Reports Server (NTRS)
Sattinger, I. J.
1972-01-01
This project to demonstrate the application of earth resource survey technology to current problems in Michigan was undertaken jointly by the Environmental Research Institute of Michigan and Michigan State University. Remote sensing techniques were employed to advantage in providing management information for the Pointe Mouillee State Game Area and preparing an impact assessment in advance of the projected construction of the M-14 freeway from Ann Arbor to Plymouth, Michigan. The project also assisted the state government in its current effort to develop and implement a state-wide land management plan.
Status report: Data management program algorithm evaluation activity at Marshall Space Flight Center
NASA Technical Reports Server (NTRS)
Jayroe, R. R., Jr.
1977-01-01
An algorithm evaluation activity was initiated to study the problems associated with image processing by assessing the independent and interdependent effects of registration, compression, and classification techniques on LANDSAT data for several discipline applications. The objective of the activity was to make recommendations on selected applicable image processing algorithms in terms of accuracy, cost, and timeliness or to propose alternative ways of processing the data. As a means of accomplishing this objective, an Image Coding Panel was established. The conduct of the algorithm evaluation is described.
Digital Image Compression Using Artificial Neural Networks
NASA Technical Reports Server (NTRS)
Serra-Ricart, M.; Garrido, L.; Gaitan, V.; Aloy, A.
1993-01-01
The problem of storing, transmitting, and manipulating digital images is considered. Because of the file sizes involved, large amounts of digitized image information are becoming common in modern projects. Our goal is to describe an image compression transform coder based on artificial neural network techniques (NNCTC). A comparison of the compression results obtained from digital astronomical images by the NNCTC and the method used in the compression of the digitized sky survey from the Space Telescope Science Institute based on the H-transform is performed in order to assess the reliability of the NNCTC.
Lee, Celine; Reutter, Heiko M; Grässer, Melanie F; Fisch, Margit; Noeker, Meinolf
2006-02-01
To identify problems in the long-term psychosocial and developmental outcome specific to patients with the bladder exstrophy-epispadias complex (BEEC), using a self-developed semi-structured questionnaire. There are various techniques of reconstruction to repair BEEC, but to date neither patients nor surgeons have a clear answer about which type gives the most acceptable long-term results. Increasingly many patients with BEEC reach adulthood and wish to have sexual relationships and families. To date, no studies have used disease-specific psychological instruments to measure the psychosocial status of patients with BEEC. Thus we contacted 208 patients with BEEC, and 122 were enrolled, covering the complete spectrum of the BEEC. The data assessed included the surgical reconstruction, subjective assessment of continence, developmental milestones, school performance and career, overall satisfaction in life, disease-specific fears and partnership experiences in patients aged >18 years. We compared affected females and males to assess gender-associated differences in quality of life. Affected females had more close friendships, fewer disadvantages in relation to healthy female peers and more partnerships than the males. Family planning seemed to be less of a problem in affected females. There were no gender differences in the adjustments within school and professional career, which was very good in general. Future studies are needed to assess the disease-specific anxieties, considering gender-specific differences.
Practical Problems in the Cement Industry Solved by Modern Research Techniques
ERIC Educational Resources Information Center
Daugherty, Kenneth E.; Robertson, Les D.
1972-01-01
Practical chemical problems in the cement industry are being solved by such techniques as infrared spectroscopy, gas chromatography-mass spectrometry, X-ray diffraction, atomic absorption and arc spectroscopy, thermally evolved gas analysis, Mossbauer spectroscopy, transmission and scanning electron microscopy. (CP)
Management Techniques for Librarians.
ERIC Educational Resources Information Center
Evans, G. Edward
This textbook on library management techniques is concerned with basic management problems. Examples of problems in planning, organization, and coordination are drawn from situations in libraries or information centers. After an introduction to library management, the history of management is covered. Several styles of management and organization…
A variable-gain output feedback control design methodology
NASA Technical Reports Server (NTRS)
Halyo, Nesim; Moerder, Daniel D.; Broussard, John R.; Taylor, Deborah B.
1989-01-01
A digital control system design technique is developed in which the control system gain matrix varies with the plant operating point parameters. The design technique is obtained by formulating the problem as an optimal stochastic output feedback control law with variable gains. This approach provides a control theory framework within which the operating range of a control law can be significantly extended. Furthermore, the approach avoids the major shortcomings of the conventional gain-scheduling techniques. The optimal variable gain output feedback control problem is solved by embedding the Multi-Configuration Control (MCC) problem, previously solved at ICS. An algorithm to compute the optimal variable gain output feedback control gain matrices is developed. The algorithm is a modified version of the MCC algorithm improved so as to handle the large dimensionality which arises particularly in variable-gain control problems. The design methodology developed is applied to a reconfigurable aircraft control problem. A variable-gain output feedback control problem was formulated to design a flight control law for an AFTI F-16 aircraft which can automatically reconfigure its control strategy to accommodate failures in the horizontal tail control surface. Simulations of the closed-loop reconfigurable system show that the approach produces a control design which can accommodate such failures with relative ease. The technique can be applied to many other problems including sensor failure accommodation, mode switching control laws and super agility.
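The core idea — a feedback gain matrix that varies with the plant's operating-point parameters — can be illustrated with a minimal sketch. This is not the optimal stochastic output feedback or MCC algorithm described above; it only shows gains interpolated between two hypothetical design points, with all matrices and parameter values illustrative:

```python
def variable_gain(p, p_lo, K_lo, p_hi, K_hi):
    """Gain matrix K(p) for operating-point parameter p, linearly
    interpolated between design-point gains K_lo (at p_lo) and K_hi
    (at p_hi), each given as a nested list (matrix)."""
    t = min(max((p - p_lo) / (p_hi - p_lo), 0.0), 1.0)  # clamp to envelope
    return [[(1 - t) * a + t * b for a, b in zip(row_lo, row_hi)]
            for row_lo, row_hi in zip(K_lo, K_hi)]

def control(p, y, p_lo, K_lo, p_hi, K_hi):
    """Output feedback law u = -K(p) y for measurement vector y."""
    K = variable_gain(p, p_lo, K_lo, p_hi, K_hi)
    return [-sum(k * yj for k, yj in zip(row, y)) for row in K]
```

The approach in the paper instead computes the gain schedule as the solution of a single optimization over the whole operating range, avoiding the point-by-point designs of conventional gain scheduling.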
NOTES: a review of the technical problems encountered and their solutions.
Mintz, Yoav; Horgan, Santiago; Cullen, John; Stuart, David; Falor, Eric; Talamini, Mark A
2008-08-01
Natural orifice translumenal endoscopic surgery (NOTES) is currently being investigated and developed worldwide. In the past few years, multiple groups have confronted this challenge. Many technical problems are encountered in this technique due to the currently available tools for this approach. Some of the unique technical problems in NOTES include: blindly performed primary incisions; uncontrolled pneumoperitoneal pressure; no support for the endoscope in the abdominal cavity; inadequate vision; insufficient illumination; limited retraction and exposure; and the complexity of suturing and performing a safe anastomosis. In this paper, we review the problems encountered in NOTES and provide possible temporary solutions. Acute and survival studies were performed on 15 farm pigs. The hybrid technique approach (i.e., endoscopic surgery with the aid of laparoscopic vision) was performed in all cases. Procedures performed included liver biopsies, bilateral tubal ligation, oophorectomy, cholecystectomy, splenectomy and small bowel resection, and anastomosis. All attempted procedures were successfully performed. New methods and techniques were developed to overcome the technical problems. Closure of the gastrotomy was achieved by T-bar sutures and by stapler closure of the stomach incision. Small bowel anastomosis was achieved by the dual-lumen NOTES technique. The hybrid technique serves as a temporary approach to aid in developing the NOTES technique. A rectal or vaginal port of entry enables and facilitates gastrointestinal NOTES by using available laparoscopic instruments. The common operations performed today in the laparoscopic fashion could probably be performed with the NOTES approach. The safety of these procedures, however, is yet to be determined.
High frequency flow-structural interaction in dense subsonic fluids
NASA Technical Reports Server (NTRS)
Liu, Baw-Lin; Ofarrell, J. M.
1995-01-01
Prediction of the detailed dynamic behavior in rocket propellant feed systems and engines and other such high-energy fluid systems requires precise analysis to assure structural performance. Designs sometimes require placement of bluff bodies in a flow passage. Additionally, there are flexibilities in ducts, liners, and piping systems. A design handbook and interactive data base have been developed for assessing flow/structural interactions to be used as a tool in design and development, to evaluate applicable geometries before problems develop, or to eliminate or minimize problems with existing hardware. This is a compilation of analytical/empirical data and techniques to evaluate detailed dynamic characteristics of both the fluid and structures. These techniques have direct applicability to rocket engine internal flow passages, hot gas drive systems, and vehicle propellant feed systems. Organization of the handbook is by basic geometries for estimating Strouhal numbers, added mass effects, mode shapes for various end constraints, critical onset flow conditions, and possible structural response amplitudes. Emphasis is on dense fluids and high structural loading potential for fatigue at low subsonic flow speeds where high-frequency excitations are possible. Avoidance and corrective measure illustrations are presented together with analytical curve fits for predictions compiled from a comprehensive data base.
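One quantity the handbook's Strouhal-number estimates feed into is the vortex shedding frequency of a bluff body, given by the standard relation f = St · U / D. A minimal sketch (the St ≈ 0.2 used in the test is the textbook figure for a circular cylinder, not a value from the handbook):

```python
def shedding_frequency(strouhal, flow_speed, diameter):
    """Vortex shedding frequency f = St * U / D (Hz) for a bluff body of
    characteristic dimension D (m) in a flow of speed U (m/s)."""
    return strouhal * flow_speed / diameter
```

Comparing this frequency against the structural mode frequencies for the relevant end constraints is the basic check for flow-induced resonance.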
Learning-based computing techniques in geoid modeling for precise height transformation
NASA Astrophysics Data System (ADS)
Erol, B.; Erol, S.
2013-03-01
Precise determination of the local geoid is of particular importance for establishing height control in geodetic GNSS applications, since the classical leveling technique is too laborious. A geoid model can be accurately obtained employing properly distributed benchmarks having GNSS and leveling observations using an appropriate computing algorithm. Besides the classical multivariable polynomial regression equations (MPRE), this study attempts an evaluation of learning-based computing algorithms: artificial neural networks (ANNs), the adaptive network-based fuzzy inference system (ANFIS) and especially the wavelet neural networks (WNNs) approach in geoid surface approximation. These algorithms were developed in parallel with advances in computer technologies and have recently been used for solving complex nonlinear problems in many applications. However, they are rather new in dealing with the precise modeling problem of the Earth's gravity field. In the scope of the study, these methods were applied to Istanbul GPS Triangulation Network data. The performances of the methods were assessed considering the validation results of the geoid models at the observation points. In conclusion, the ANFIS and WNN methods revealed higher prediction accuracies compared to the ANN and MPRE methods. Besides their prediction capabilities, these methods were also compared and discussed from a practical point of view in the conclusions.
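The classical MPRE baseline fits a polynomial surface to benchmarks that have both GNSS and leveling heights. A minimal first-order sketch follows; the study itself uses higher-order multivariable polynomials, and the normal-equations formulation here is a generic least-squares illustration, not the paper's implementation:

```python
def fit_geoid_surface(points):
    """Fit a first-order polynomial geoid surface N = a + b*x + c*y to
    GNSS/leveling benchmarks by least squares (normal equations).
    `points` is a list of (x, y, N) tuples; returns (a, b, c)."""
    # Build the 3x3 normal equations  A^T A * coef = A^T N.
    S = [[0.0] * 3 for _ in range(3)]
    r = [0.0] * 3
    for x, y, N in points:
        row = (1.0, x, y)
        for i in range(3):
            r[i] += row[i] * N
            for j in range(3):
                S[i][j] += row[i] * row[j]
    # Solve by Gaussian elimination with partial pivoting.
    for k in range(3):
        piv = max(range(k, 3), key=lambda i: abs(S[i][k]))
        S[k], S[piv] = S[piv], S[k]
        r[k], r[piv] = r[piv], r[k]
        for i in range(k + 1, 3):
            f = S[i][k] / S[k][k]
            for j in range(k, 3):
                S[i][j] -= f * S[k][j]
            r[i] -= f * r[k]
    coef = [0.0] * 3
    for i in (2, 1, 0):  # back-substitution
        coef[i] = (r[i] - sum(S[i][j] * coef[j] for j in range(i + 1, 3))) / S[i][i]
    return tuple(coef)
```

The fitted surface converts a GNSS ellipsoidal height h to an orthometric height via H = h - N(x, y); validation at held-out benchmarks then measures prediction accuracy, as done for all methods in the study.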
Fuzzy Analysis in Creative Problem Solving.
ERIC Educational Resources Information Center
Carey, Russell L.
1984-01-01
"Diagraming Analysis of a Fuzzy Technique" (DAFT) is a model rectifying two problems associated with Future Problem Solving Bowl activities, namely problem definition by teams and evaluation of team responses. (MC)
On the Performance Evaluation of 3D Reconstruction Techniques from a Sequence of Images
NASA Astrophysics Data System (ADS)
Eid, Ahmed; Farag, Aly
2005-12-01
The performance evaluation of 3D reconstruction techniques is not a simple problem to solve. This is not only due to the increased dimensionality of the problem but also due to the lack of standardized and widely accepted testing methodologies. This paper presents a unified framework for the performance evaluation of different 3D reconstruction techniques. This framework includes a general problem formalization, different measuring criteria, and a classification method as a first step in standardizing the evaluation process. Performance characterization of two standard 3D reconstruction techniques, stereo and space carving, is also presented. The evaluation is performed on the same data set using an image reprojection testing methodology to reduce the dimensionality of the evaluation domain. Also, different measuring strategies are presented and applied to the stereo and space carving techniques. These measuring strategies have shown consistent results in quantifying the performance of these techniques. Additional experiments are performed on the space carving technique to study the effect of the number of input images and the camera pose on its performance.
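The image-reprojection testing methodology described above reduces a 3D comparison to a 2D one: the recovered model is reprojected into a camera view and scored against a reference image. A minimal sketch of one such measuring criterion, RMS intensity error, follows; the arrays stand in for real renderings, and the metric choice is an assumption rather than the paper's exact criterion.

```python
# Sketch of an image-reprojection score: RMS intensity error between a
# reference image and the image reprojected from the recovered 3D model.
import numpy as np

def reprojection_rmse(reference, reprojected):
    """Root-mean-square intensity error between two images of equal shape."""
    diff = reference.astype(float) - reprojected.astype(float)
    return float(np.sqrt(np.mean(diff ** 2)))

# toy stand-ins for a rendered reference view and a reprojection
ref = np.zeros((4, 4))
rep = np.full((4, 4), 2.0)
print(reprojection_rmse(ref, rep))  # 2.0
```

Lower scores indicate a reconstruction whose reprojection better matches the reference view, which is what makes the measure usable across otherwise incomparable techniques such as stereo and space carving.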
Gülşen, İsmail; Ak, Hakan; Evcılı, Gökhan; Balbaloglu, Özlem; Sösüncü, Enver
2013-01-01
Background. In this retrospective study, we aimed to compare the results of two surgical techniques: conventional and transverse mini-incision. Materials and Methods. 95 patients were operated on between 2011 and 2012 in Bitlis State Hospital: 50 with the conventional technique and 45 with the minimal transverse incision. Postoperative complications, incision-site problems, and the time to resuming use of the hands in daily activities were noted. Results. 95 patients were included in the study. The mean age was 48; 87 patients were female and 8 were male. There were no incision-site problems with either surgical technique. Anesthesia developed in only one patient, in the minimal incision group. The time to resuming use of the hands in daily activities was 22.2 days with the conventional technique and 17 days with the minimal incision technique. Conclusion. Apart from the time to resuming use of the hands in daily activities, neither surgical technique showed superiority over the other in terms of postoperative complications and incision-site problems. PMID:24396607
NACA Conference on Aerodynamic Problems of Transonic Airplane Design
NASA Technical Reports Server (NTRS)
1949-01-01
During the past several years, aeronautical research workers have had to devote much of their effort to developing the means for conducting research in the high-speed range. The transonic range in particular has presented a very acute problem because of choking phenomena in wind tunnels at speeds close to the speed of sound. At the same time, the multiplicity of aircraft design problems introduced by the peculiar flows of the transonic speed range has created an enormous demand for detailed design data. Substantial progress has nevertheless been made in developing the required research techniques and in supplying the aerodynamic data needed for design purposes. Meeting this demand has required new techniques with features so novel that their results have had to be viewed with caution. Furthermore, the kinds of measurements possible with these techniques vary so widely that correlating results obtained by different techniques is generally an indirect process: it can be accomplished only with estimates of the extent to which the results of any given technique are modified by differences inherent in the techniques. Thus, in establishing the validity and applicability of data obtained by any given technique, direct comparisons between data from different sources supplement, but do not substitute for, detailed knowledge of the characteristics of each technique and of the fundamental aerodynamic flow phenomena.
Ground Penetrating Radar technique for railway track characterization in Portugal
NASA Astrophysics Data System (ADS)
De Chiara, Francesca; Fontul, Simona; Fortunato, Eduardo; D'Andrea, Antonio
2013-04-01
Maintenance actions are essential for transport infrastructures, but today costs must necessarily be limited. Proper quality control from the construction phase onward is a key factor for a long life cycle and a sound economic policy. Suitable techniques therefore have to be chosen, and non-destructive tests are an efficient solution: they allow infrastructure characteristics to be evaluated in a continuous or quasi-continuous way, saving time and costs and enabling changes if test results do not comply with project requirements. Ground Penetrating Radar (GPR) is a quick and effective technique for evaluating infrastructure condition in a continuous manner, replacing or reducing the use of traditional drilling methods. GPR application to railway infrastructures, during the construction and monitoring phases, is relatively recent. It is based on measuring layer thicknesses and detecting structural changes. It also enables assessment of the material properties of the infrastructure and evaluation of different types of defects, such as ballast pockets, fouled ballast, poor drainage, subgrade settlement, and transition problems. These deteriorations are generally the causes of vertical deviations in track geometry, and they cannot be detected by the common monitoring procedures, namely the measurement of track geometry. Moreover, the development of new GPR systems with higher antenna frequencies, better data acquisition systems, more user-friendly software, and new algorithms for calculating material properties can lead to regular use of GPR. It therefore represents a reliable technique for assessing track geometry problems and consequently improving maintenance planning. In Portugal, rail inspection is performed with Plasser & Theurer EM120 equipment, on which 400 MHz IDS antennas were recently installed.
GPR tests were performed on the Portuguese rail network and, as the case study in this paper, a renewed track was considered. The aim was to detect changes along the track in both layer thicknesses and material characteristics using specific software, Railwaydoctor. Different test campaigns were studied in order to determine and compare the dielectric constants of the materials, which can be influenced by water content since the measurements were performed in different seasons.
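The layer-thickness measurement underlying this kind of survey rests on one standard relation: an electromagnetic pulse's two-way travel time t and the layer's relative dielectric constant eps_r give thickness d = c * t / (2 * sqrt(eps_r)). The sketch below uses this textbook formula with illustrative values; the ballast dielectric constant is an assumption, not a measured Portuguese value.

```python
# GPR layer-thickness sketch: d = c * t / (2 * sqrt(eps_r)), where t is the
# two-way travel time and eps_r the relative dielectric constant. Moisture
# raises eps_r, lowering wave velocity, which is why seasonal campaigns
# yield different apparent constants. Input values are illustrative.
C = 299_792_458.0  # speed of light in vacuum, m/s

def velocity(eps_r):
    """EM wave velocity in a material of relative dielectric constant eps_r."""
    return C / eps_r ** 0.5

def layer_thickness(two_way_time_s, eps_r):
    """Layer thickness from two-way travel time and dielectric constant."""
    return velocity(eps_r) * two_way_time_s / 2.0

# e.g. a 6 ns two-way reflection in ballast with an assumed eps_r of 4
print(round(layer_thickness(6e-9, 4.0), 3))  # ~0.45 m
```

Comparing thicknesses computed with the same eps_r across wet- and dry-season campaigns is one simple way such moisture sensitivity shows up in the data.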
EMG Processing Based Measures of Fatigue Assessment during Manual Lifting
Marhaban, M. H.; Abdullah, A. R.
2017-01-01
Manual lifting is one of the common practices used in industry to transport or move objects to a desired place. Even though mechanized equipment is now widely available, manual lifting is still considered an essential way to perform material handling tasks. Improper lifting strategies may contribute to musculoskeletal disorders (MSDs), of which overexertion is the largest contributing factor. To address this problem, the electromyography (EMG) signal is used to monitor workers' muscle condition and to find the maximum lifting load, lifting height, and number of repetitions that workers can handle before experiencing fatigue, so as to avoid overexertion. Researchers have introduced several EMG processing techniques and different EMG features that represent fatigue indices in the time, frequency, and time-frequency domains. This paper reviews the impact of EMG-processing-based measures in fatigue assessment during manual lifting. It is intended to benefit researchers who need a bird's-eye view of the biosignal processing techniques currently available, helping them determine the best possible techniques for lifting applications. PMID:28303251
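One of the most common frequency-domain fatigue indices the review covers is the median frequency (MDF) of the EMG power spectrum, which shifts downward as a muscle fatigues. A minimal sketch follows; a real pipeline would window, detrend, and band-pass filter the signal, and the pure sinusoids below only stand in for fresh versus fatigued EMG spectra.

```python
# Sketch of a median-frequency (MDF) fatigue index: the frequency that
# splits the EMG power spectrum into two equal-power halves. MDF drifts
# downward with fatigue. Signals below are synthetic stand-ins.
import numpy as np

def median_frequency(signal, fs):
    """Frequency splitting the power spectrum into equal halves."""
    freqs = np.fft.rfftfreq(len(signal), d=1.0 / fs)
    power = np.abs(np.fft.rfft(signal)) ** 2
    cum = np.cumsum(power)
    return freqs[np.searchsorted(cum, cum[-1] / 2.0)]

fs = 1000.0
t = np.arange(0, 1, 1 / fs)
fresh = np.sin(2 * np.pi * 120 * t)  # higher-frequency spectral content
tired = np.sin(2 * np.pi * 60 * t)   # spectrum shifted down, as with fatigue
print(median_frequency(fresh, fs) > median_frequency(tired, fs))  # True
```

Tracking MDF per lifting repetition, rather than on one long record, is what lets such an index flag the onset of fatigue before overexertion.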
Tsirlin, Inna; Dupierrix, Eve; Chokron, Sylvie; Coquillart, Sabine; Ohlmann, Theophile
2009-04-01
Unilateral spatial neglect is a disabling condition frequently occurring after stroke. People with neglect suffer from various spatial deficits in several modalities, which in many cases impair everyday functioning. A successful treatment is yet to be found. Several techniques have been proposed in the last decades, but only a few showed long-lasting effects and none could completely rehabilitate the condition. Diagnostic methods of neglect could be improved as well. The disorder is normally diagnosed with pen-and-paper methods, which generally do not assess patients in everyday tasks and do not address some forms of the disorder. Recently, promising new methods based on virtual reality have emerged. Virtual reality technologies hold great opportunities for the development of effective assessment and treatment techniques for neglect because they provide rich, multimodal, and highly controllable environments. In order to stimulate advancements in this domain, we present a review and an analysis of the current work. We describe past and ongoing research of virtual reality applications for unilateral neglect and discuss the existing problems and new directions for development.
Varmazyar, Mohsen; Dehghanbaghi, Maryam; Afkhami, Mehdi
2016-10-01
Balanced Scorecard (BSC) is a strategic evaluation tool that uses both financial and non-financial indicators to determine the business performance of organizations or companies. In this paper, a new integrated approach based on the BSC and multi-criteria decision making (MCDM) methods is proposed to evaluate the performance of the research centers of a research and technology organization (RTO) in Iran. The Decision-Making Trial and Evaluation Laboratory (DEMATEL) method is employed to reflect the interdependencies among BSC perspectives. The Analytic Network Process (ANP) is then utilized to weight the indices influencing the considered problem. In the next step, four MCDM methods are applied for ranking the alternatives: Additive Ratio Assessment (ARAS), Complex Proportional Assessment (COPRAS), Multi-Objective Optimization by Ratio Analysis (MOORA), and the Technique for Order Preference by Similarity to Ideal Solution (TOPSIS). Finally, the utility interval technique is applied to combine the ranking results of the MCDM methods, with weighted utility intervals computed by constructing a correlation matrix between the ranking methods. A real case is presented to show the efficacy of the proposed approach. Copyright © 2016 Elsevier Ltd. All rights reserved.
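Of the four rankers listed, TOPSIS is the most compact to sketch: alternatives are scored by closeness to an ideal solution and distance from an anti-ideal one. The decision matrix and weights below are made-up illustrations, not the paper's BSC indices or ANP weights.

```python
# Standard TOPSIS sketch: vector-normalize the decision matrix, weight it,
# then rank alternatives by relative closeness to the ideal solution.
import numpy as np

def topsis(matrix, weights, benefit):
    """matrix: alternatives x criteria; benefit[j] True if higher is better."""
    m = matrix / np.linalg.norm(matrix, axis=0)   # vector-normalize columns
    v = m * weights                               # apply criterion weights
    ideal = np.where(benefit, v.max(axis=0), v.min(axis=0))
    anti = np.where(benefit, v.min(axis=0), v.max(axis=0))
    d_pos = np.linalg.norm(v - ideal, axis=1)     # distance to ideal
    d_neg = np.linalg.norm(v - anti, axis=1)      # distance to anti-ideal
    return d_neg / (d_pos + d_neg)                # closeness in [0, 1]

# illustrative 3 alternatives x 3 benefit criteria
scores = topsis(np.array([[7.0, 9.0, 9.0],
                          [8.0, 7.0, 8.0],
                          [9.0, 6.0, 8.0]]),
                weights=np.array([0.4, 0.3, 0.3]),
                benefit=np.array([True, True, True]))
print(scores.argmax())  # index of the top-ranked alternative
```

Running ARAS, COPRAS, and MOORA on the same weighted matrix generally yields similar but not identical orderings, which is exactly the disagreement the paper's utility interval combination step is designed to reconcile.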
The impact of nonuniform sampling on stratospheric ozone trends derived from occultation instruments
NASA Astrophysics Data System (ADS)
Damadeo, Robert P.; Zawodny, Joseph M.; Remsberg, Ellis E.; Walker, Kaley A.
2018-01-01
This paper applies a recently developed technique for deriving long-term trends in ozone from sparsely sampled data sets to multiple occultation instruments simultaneously without the need for homogenization. The technique can compensate for the nonuniform temporal, spatial, and diurnal sampling of the different instruments and can also be used to account for biases and drifts between instruments. These problems have been noted in recent international assessments as being a primary source of uncertainty that clouds the significance of derived trends. Results show potential recovery trends of ~2-3% decade⁻¹ in the upper stratosphere at midlatitudes, which are similar to other studies, and also how sampling biases present in these data sets can create differences in derived recovery trends of up to ~1% decade⁻¹ if not properly accounted for. Limitations inherent to all techniques (e.g., relative instrument drifts) and their impacts (e.g., trend differences up to ~2% decade⁻¹) are also described, and a potential path forward towards resolution is presented.
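The basic idea behind such trend regressions can be illustrated with a toy model: fit a linear trend plus annual harmonics to irregularly sampled data by least squares, so that seasonal sampling gaps do not alias into the trend term. This is only an illustration of the regression principle, not the paper's multi-instrument technique, and the data are synthetic.

```python
# Toy trend regression on irregular sampling: y = a + b*t + annual cycle,
# fitted by least squares; the harmonic terms absorb the seasonal signal
# so the trend coefficient b is not biased by uneven seasonal coverage.
import numpy as np

def fit_trend(t_years, y):
    """Return the fitted linear trend in units per decade."""
    X = np.column_stack([
        np.ones_like(t_years),
        t_years,
        np.sin(2 * np.pi * t_years),  # annual cycle, sine component
        np.cos(2 * np.pi * t_years),  # annual cycle, cosine component
    ])
    coeffs, *_ = np.linalg.lstsq(X, y, rcond=None)
    return coeffs[1] * 10.0  # per year -> per decade

rng = np.random.default_rng(1)
t = np.sort(rng.uniform(0, 20, 300))  # irregular sampling over 20 years
y = 0.2 * t + 1.5 * np.sin(2 * np.pi * t) + rng.normal(0, 0.1, t.size)
print(round(fit_trend(t, y), 1))  # ~2.0 per decade (true trend 0.2/year)
```

The paper's method extends this principle with terms for instrument-to-instrument biases and drifts and for diurnal sampling differences, fitted jointly across the occultation records.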