ERIC Educational Resources Information Center
Klein, Hans E., Ed.
This book presents a selection of papers from the international, interdisciplinary conference of the World Association for Case Method Research & Application. Papers are categorized into seven areas: (1) "International Case Studies" (e.g., event-based entrepreneurship, case studies on consumer complaints, and strategic quality…
ERIC Educational Resources Information Center
Klein, Hans E., Ed.
This book presents a selection of papers from the annual, international, interdisciplinary conference of the World Association for Case Method Research & Application. Papers are categorized into six areas: (1) "Case Studies and Research" (e.g., subjectivity as a source of insight in case study research, evolution of a teaching case,…
Transportation planning, climate change, and decision making under uncertainty
DOT National Transportation Integrated Search
2008-01-01
Case studies are presented that illustrate the application of methods which incorporate decision making under uncertainty. The applications of these methods that are summarized in this paper deal with cases outside of transportation, including mil...
Analytical Tools for the Application of Operational Culture: A Case Study in the Trans-Sahel
2011-03-28
Study Team Working Paper 3: Research Methods Discussion for the Study Team. The excerpt discusses generating empirical materials in grounded theory research, drawing on research conducted using these methods, and cites an FFI (Kjeller, Norway) survey and case study report and Glaser, B. G., & Strauss, A. L. (1967), "The Discovery of Grounded Theory."
Trends and Lessons Learned in Interdisciplinary and Non-Business Case Method Application.
ERIC Educational Resources Information Center
Anyansi-Archibong, Chi; Czuchry, Andrew J.; House, Claudia S.; Cicirello, Tony
2000-01-01
Presents results of a survey designed to test the level of development and application of cases in non-business courses such as sciences, mathematics, engineering, health, and technology. Findings support the growing popularity of the case method of teaching and learning outside the business domain. Suggests a framework for establishing win-win…
ERIC Educational Resources Information Center
Koutsoukos, Marios; Fragoulis, Iosif; Valkanos, Euthimios
2015-01-01
The main objective of this case study is to examine secondary education teachers' opinions concerning the connection of environmental education with the use of experiential teaching methods. Exploring whether the application of experiential methods can upgrade the learning procedure, leading to a more holistic approach, the research focuses on…
Teaching and the Case Method. Text, Cases, and Readings. Third Edition.
ERIC Educational Resources Information Center
Barnes, Louis B.; And Others
This volume includes text, cases, and readings for a college faculty seminar to develop the knowledge, skills, and attitudes necessary for utilization of the case method approach to instruction. It builds on a long-term clinical research effort on the dynamics of the case method of teaching and application at Harvard Business School. In addition…
Experiences Using Formal Methods for Requirements Modeling
NASA Technical Reports Server (NTRS)
Easterbrook, Steve; Lutz, Robyn; Covington, Rick; Kelly, John; Ampo, Yoko; Hamilton, David
1996-01-01
This paper describes three case studies in the lightweight application of formal methods to requirements modeling for spacecraft fault protection systems. The case studies differ from previously reported applications of formal methods in that formal methods were applied very early in the requirements engineering process, to validate the evolving requirements. The results were fed back into the projects, to improve the informal specifications. For each case study, we describe what methods were applied, how they were applied, how much effort was involved, and what the findings were. In all three cases, the formal modeling provided a cost-effective enhancement of the existing verification and validation processes. We conclude that the benefits gained from early modeling of unstable requirements more than outweigh the effort needed to maintain multiple representations.
Experiences Using Lightweight Formal Methods for Requirements Modeling
NASA Technical Reports Server (NTRS)
Easterbrook, Steve; Lutz, Robyn; Covington, Rick; Kelly, John; Ampo, Yoko; Hamilton, David
1997-01-01
This paper describes three case studies in the lightweight application of formal methods to requirements modeling for spacecraft fault protection systems. The case studies differ from previously reported applications of formal methods in that formal methods were applied very early in the requirements engineering process, to validate the evolving requirements. The results were fed back into the projects, to improve the informal specifications. For each case study, we describe what methods were applied, how they were applied, how much effort was involved, and what the findings were. In all three cases, formal methods enhanced the existing verification and validation processes, by testing key properties of the evolving requirements, and helping to identify weaknesses. We conclude that the benefits gained from early modeling of unstable requirements more than outweigh the effort needed to maintain multiple representations.
Applications of AN OO Methodology and Case to a Daq System
NASA Astrophysics Data System (ADS)
Bee, C. P.; Eshghi, S.; Jones, R.; Kolos, S.; Magherini, C.; Maidantchik, C.; Mapelli, L.; Mornacchi, G.; Niculescu, M.; Patel, A.; Prigent, D.; Spiwoks, R.; Soloviev, I.; Caprini, M.; Duval, P. Y.; Etienne, F.; Ferrato, D.; Le van Suu, A.; Qian, Z.; Gaponenko, I.; Merzliakov, Y.; Ambrosini, G.; Ferrari, R.; Fumagalli, G.; Polesello, G.
The RD13 project has evaluated the use of the Object Oriented Information Engineering (OOIE) method during the development of several software components connected to the DAQ system. The method is supported by a sophisticated commercial CASE tool (Object Management Workbench) and programming environment (Kappa) which cover the full life cycle of the software, including model simulation, code generation and application deployment. This paper gives an overview of the method, the CASE tool, and the DAQ components which have been developed, and relates our experiences with the method and tool, their integration into our development environment, and the spiral life cycle they support.
NASA Technical Reports Server (NTRS)
Young, S. G.
1976-01-01
An ultrasonic cavitation method for restoring obliterated serial numbers has been further explored by application to articles involved in police cases. The method was applied successfully to gun parts. In one case portions of numbers were restored after prior failure by other laboratories using chemical etching techniques. The ultrasonic method was not successful on a heavily obliterated and restamped automobile engine block, but it was partially successful on a motorcycle gear-case housing. Additional studies were made on the effect of a larger diameter ultrasonic probe, and on the method's ability to restore numbers obliterated by peening.
Disease Risk Score (DRS) as a Confounder Summary Method: Systematic Review and Recommendations
Tadrous, Mina; Gagne, Joshua J.; Stürmer, Til; Cadarette, Suzanne M.
2013-01-01
Purpose To systematically examine trends and applications of the disease risk score (DRS) as a confounder summary method. Methods We completed a systematic search of MEDLINE and Web of Science® to identify all English language articles that applied DRS methods. We tabulated the number of publications by year and type (empirical application, methodological contribution, or review paper) and summarized methods used in empirical applications overall and by publication year (<2000, ≥2000). Results Of 714 unique articles identified, 97 examined DRS methods and 86 were empirical applications. We observed a bimodal distribution in the number of publications over time, with a peak in 1979-1980 and a resurgence since 2000. The majority of applications with methodological detail derived DRS using logistic regression (47%), used DRS as a categorical variable in regression (93%), and applied DRS in a non-experimental cohort (47%) or case-control (42%) study. Few studies examined effect modification by outcome risk (23%). Conclusion Use of DRS methods has increased yet remains low. Comparative effectiveness research may benefit from more DRS applications, particularly to examine effect modification by outcome risk. Standardized terminology may facilitate identification, application, and comprehension of DRS methods. More research is needed to support the application of DRS methods, particularly in case-control studies. PMID:23172692
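To make the confounder-summary idea described above concrete, here is a minimal sketch, with hypothetical data and variable names (not the authors' code), of one common way a DRS is derived by logistic regression among the unexposed and then used as an adjustment variable:

```python
# Illustrative sketch of a disease risk score (DRS) as a confounder summary.
# Hypothetical data and column names; one common variant fits the DRS among the
# unexposed, scores everyone, then adjusts for DRS quantiles.
import numpy as np
import pandas as pd
import statsmodels.api as sm

rng = np.random.default_rng(0)
n = 5000
df = pd.DataFrame({
    "age": rng.normal(60, 10, n),
    "sex": rng.integers(0, 2, n),
    "comorbidity": rng.poisson(1.5, n),
})
df["exposed"] = rng.integers(0, 2, n)
logit = -5 + 0.05 * df["age"] + 0.3 * df["comorbidity"] + 0.2 * df["exposed"]
df["outcome"] = rng.binomial(1, 1 / (1 + np.exp(-logit)))

# 1) Fit the outcome model among the unexposed only.
unexp = df[df["exposed"] == 0]
X0 = sm.add_constant(unexp[["age", "sex", "comorbidity"]])
drs_model = sm.Logit(unexp["outcome"], X0).fit(disp=0)

# 2) Score everyone: the predicted baseline outcome risk is the DRS.
X_all = sm.add_constant(df[["age", "sex", "comorbidity"]])
df["drs"] = drs_model.predict(X_all)

# 3) Use the DRS (here as quintiles) as the confounder summary in the
#    exposure-outcome model instead of the individual covariates.
df["drs_q"] = pd.qcut(df["drs"], 5, labels=False)
X = sm.add_constant(pd.get_dummies(df[["exposed", "drs_q"]], columns=["drs_q"],
                                   drop_first=True).astype(float))
adj_model = sm.Logit(df["outcome"], X).fit(disp=0)
print(np.exp(adj_model.params["exposed"]))  # DRS-adjusted odds ratio
```

In this sketch the score is categorized into quintiles, matching the categorical use reported in most of the reviewed applications.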
ERIC Educational Resources Information Center
Nielsen, Richard A.
2016-01-01
This article shows how statistical matching methods can be used to select "most similar" cases for qualitative analysis. I first offer a methodological justification for research designs based on selecting most similar cases. I then discuss the applicability of existing matching methods to the task of selecting most similar cases and…
Maggin, Daniel M; Swaminathan, Hariharan; Rogers, Helen J; O'Keeffe, Breda V; Sugai, George; Horner, Robert H
2011-06-01
A new method for deriving effect sizes from single-case designs is proposed. The strategy is applicable to small-sample time-series data with autoregressive errors. The method uses Generalized Least Squares (GLS) to model the autocorrelation of the data and estimate regression parameters to produce an effect size that represents the magnitude of treatment effect from baseline to treatment phases in standard deviation units. In this paper, the method is applied to two published examples using common single case designs (i.e., withdrawal and multiple-baseline). The results from these studies are described, and the method is compared to ten desirable criteria for single-case effect sizes. Based on the results of this application, we conclude with observations about the use of GLS as a support to visual analysis, provide recommendations for future research, and describe implications for practice. Copyright © 2011 Society for the Study of School Psychology. Published by Elsevier Ltd. All rights reserved.
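As a rough illustration of the approach summarized above, the sketch below fits an AB single-case series by GLS with AR(1) errors and standardizes the phase coefficient; it is a simplified stand-in for the published estimator, with made-up data:

```python
# Rough sketch of a GLS-based effect size for an AB single-case series,
# assuming AR(1) errors; simplified relative to the published method.
import numpy as np
import statsmodels.api as sm

# Hypothetical baseline (A) and treatment (B) observations.
y = np.array([3, 4, 3, 5, 4, 4,            # baseline phase
              7, 8, 9, 8, 10, 9, 11, 10],  # treatment phase
             dtype=float)
phase = np.array([0] * 6 + [1] * 8, dtype=float)
X = sm.add_constant(phase)

# GLSAR models the autocorrelation of the residuals (here AR(1)) and
# re-estimates the regression parameters iteratively.
model = sm.GLSAR(y, X, rho=1)
results = model.iterative_fit(maxiter=10)

phase_effect = results.params[1]           # mean shift from A to B
sd_baseline = y[phase == 0].std(ddof=1)    # one simple choice of standardizer
effect_size = phase_effect / sd_baseline
print(round(float(model.rho[0]), 2), round(effect_size, 2))
```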
Case Method Teaching as Science and Art: A Metaphoric Approach and Curricular Application
ERIC Educational Resources Information Center
Greenhalgh, Anne M.
2007-01-01
The following article takes a metaphoric approach to case method teaching to shed light on one of our most important practices. The article hinges on the dual comparison of case method as science and as art. The dominant, scientific view of cases is that they are neutral descriptions of real-life business problems, subject to rigorous analysis.…
Disease risk score as a confounder summary method: systematic review and recommendations.
Tadrous, Mina; Gagne, Joshua J; Stürmer, Til; Cadarette, Suzanne M
2013-02-01
To systematically examine trends and applications of the disease risk score (DRS) as a confounder summary method. We completed a systematic search of MEDLINE and Web of Science® to identify all English language articles that applied DRS methods. We tabulated the number of publications by year and type (empirical application, methodological contribution, or review paper) and summarized methods used in empirical applications overall and by publication year (<2000, ≥2000). Of 714 unique articles identified, 97 examined DRS methods and 86 were empirical applications. We observed a bimodal distribution in the number of publications over time, with a peak in 1979-1980 and a resurgence since 2000. The majority of applications with methodological detail derived DRS using logistic regression (47%), used DRS as a categorical variable in regression (93%), and applied DRS in a non-experimental cohort (47%) or case-control (42%) study. Few studies examined effect modification by outcome risk (23%). Use of DRS methods has increased yet remains low. Comparative effectiveness research may benefit from more DRS applications, particularly to examine effect modification by outcome risk. Standardized terminology may facilitate identification, application, and comprehension of DRS methods. More research is needed to support the application of DRS methods, particularly in case-control studies. Copyright © 2012 John Wiley & Sons, Ltd.
ERIC Educational Resources Information Center
Maggin, Daniel M.; Swaminathan, Hariharan; Rogers, Helen J.; O'Keeffe, Breda V.; Sugai, George; Horner, Robert H.
2011-01-01
A new method for deriving effect sizes from single-case designs is proposed. The strategy is applicable to small-sample time-series data with autoregressive errors. The method uses Generalized Least Squares (GLS) to model the autocorrelation of the data and estimate regression parameters to produce an effect size that represents the magnitude of…
ERIC Educational Resources Information Center
Tatzl, Dietmar
2015-01-01
This article presents an innovative adaptation of the case method to teaching English for specific academic purposes. Widespread in its traditional form in various content disciplines, the case method bears the potential for truly student-centred language instruction. The current application transforms learners from case analysts to case authors…
Generalizing DTW to the multi-dimensional case requires an adaptive approach
Hu, Bing; Jin, Hongxia; Wang, Jun; Keogh, Eamonn
2017-01-01
In recent years, Dynamic Time Warping (DTW) has emerged as the distance measure of choice for virtually all time series data mining applications. For example, virtually all applications that process data from wearable devices use DTW as a core sub-routine. This is the result of significant progress in improving DTW's efficiency, together with multiple empirical studies showing that DTW-based classifiers at least equal (and generally surpass) the accuracy of all their rivals across dozens of datasets. Thus far, most of the research has considered only the one-dimensional case, with practitioners generalizing to the multi-dimensional case in one of two ways, dependent or independent warping. In general, it appears the community believes either that the two ways are equivalent, or that the choice is irrelevant. In this work, we show that this is not the case. The two most commonly used multi-dimensional DTW methods can produce different classifications, and neither one dominates the other. This seems to suggest that one should learn the best method for a particular application. However, we will show that this is not necessary; a simple, principled rule can be used on a case-by-case basis to predict which of the two methods we should trust at the time of classification. Our method allows us to ensure that classification results are at least as accurate as the better of the two rival methods, and, in many cases, our method is significantly more accurate. We demonstrate our ideas with the most extensive set of multi-dimensional time series classification experiments ever attempted. PMID:29104448
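To make the dependent/independent distinction concrete, here is a small self-contained sketch (plain NumPy, not the authors' code): DTW_D uses one shared warping path over vector distances, while DTW_I sums a separate one-dimensional DTW per dimension:

```python
# Sketch of dependent (DTW_D) vs independent (DTW_I) multi-dimensional DTW.
import numpy as np

def dtw(dist_matrix):
    """Classic DTW cost computed from a pairwise distance matrix."""
    n, m = dist_matrix.shape
    D = np.full((n + 1, m + 1), np.inf)
    D[0, 0] = 0.0
    for i in range(1, n + 1):
        for j in range(1, m + 1):
            D[i, j] = dist_matrix[i - 1, j - 1] + min(D[i - 1, j],
                                                      D[i, j - 1],
                                                      D[i - 1, j - 1])
    return D[n, m]

def dtw_dependent(x, y):
    # One warping path shared by all dimensions (squared vector distances).
    d = np.linalg.norm(x[:, None, :] - y[None, :, :], axis=2) ** 2
    return dtw(d)

def dtw_independent(x, y):
    # Each dimension is warped on its own; per-dimension costs are summed.
    return sum(dtw((x[:, k, None] - y[None, :, k]) ** 2)
               for k in range(x.shape[1]))

# Two short 2-dimensional series (rows = time steps, columns = dimensions).
x = np.array([[0.0, 1.0], [1.0, 2.0], [2.0, 3.0], [3.0, 2.0]])
y = np.array([[0.0, 1.0], [0.5, 1.5], [2.5, 3.0], [3.0, 2.0]])
print(dtw_dependent(x, y), dtw_independent(x, y))
```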
Exploring Methodologies and Indicators for Cross-disciplinary Applications
NASA Astrophysics Data System (ADS)
Bernknopf, R.; Pearlman, J.
2015-12-01
Assessing the impact and benefit of geospatial information is a multidisciplinary task that draws on social, economic and environmental knowledge to formulate indicators and methods. There are use cases that couple the social sciences, including economics, psychology and sociology, with geospatial information. Benefit-cost analysis is an empirical approach that uses money as an indicator for decision making; it is a traditional basis for a use case and has been applied to geospatial information and other areas. A newer use case that applies indicators is meta-regression analysis, which is used to combine transfers of socioeconomic benefits from different geographic regions within a unifying statistical approach. In this technique, qualitative and quantitative variables are indicators, which provide a weighted average of value for the nonmarket good or resource over a large region. The expected willingness to pay for the nonmarket good can then be applied to a specific region. A third use case is the application of decision support systems and tools, which have been used for forecasting agricultural prices and for analysis of hazard policies. However, new methods for integrating these disciplines into use cases, an avenue to guide the development of operational applications of geospatial information, are needed. Experience in one case may not be broadly transferable to other uses and applications if multiple disciplines are involved. To move forward, more use cases are needed and, especially, applications in the private sector. Applications are being examined across a multidisciplinary community for good examples that would be instructive in meeting the challenge. This presentation will look at the results of an investigation into directions in the broader application of use cases to teach the methodologies and use of indicators that have applications across fields of interest.
Methods to control for unmeasured confounding in pharmacoepidemiology: an overview.
Uddin, Md Jamal; Groenwold, Rolf H H; Ali, Mohammed Sanni; de Boer, Anthonius; Roes, Kit C B; Chowdhury, Muhammad A B; Klungel, Olaf H
2016-06-01
Background Unmeasured confounding is one of the principal problems in pharmacoepidemiologic studies. Several methods have been proposed to detect or control for unmeasured confounding either at the study design phase or the data analysis phase. Aim of the Review To provide an overview of commonly used methods to detect or control for unmeasured confounding and to provide recommendations for proper application in pharmacoepidemiology. Methods/Results Methods to control for unmeasured confounding in the design phase of a study are case only designs (e.g., case-crossover, case-time control, self-controlled case series) and the prior event rate ratio adjustment method. Methods that can be applied in the data analysis phase include, negative control method, perturbation variable method, instrumental variable methods, sensitivity analysis, and ecological analysis. A separate group of methods are those in which additional information on confounders is collected from a substudy. The latter group includes external adjustment, propensity score calibration, two-stage sampling, and multiple imputation. Conclusion As the performance and application of the methods to handle unmeasured confounding may differ across studies and across databases, we stress the importance of using both statistical evidence and substantial clinical knowledge for interpretation of the study results.
ERIC Educational Resources Information Center
Akgün, Levent
2015-01-01
The aim of this study is to identify prospective secondary mathematics teachers' opinions about the mathematical modeling method and the applicability of this method in high schools. The case study design, which is among the qualitative research methods, was used in the study. The study was conducted with six prospective secondary mathematics…
Application of electrical geophysics to the release of water resources, case of Ain Leuh (Morocco)
NASA Astrophysics Data System (ADS)
Zitouni, A.; Boukdir, A.; El Fjiji, H.; Baite, W.; Ekouele Mbaki, V. R.; Ben Said, H.; Echakraoui, Z.; Elissami, A.; El Maslouhi, M. R.
2018-05-01
Given the growing demand for water in our country for domestic, industrial and agricultural uses, prospecting for groundwater by classical geological and hydrogeological methods remains inapplicable in regions where boreholes or reconnaissance soundings are not available in sufficient number. In such cases, geophysical prospecting methods such as nuclear magnetic resonance (NMR) and ground-penetrating radar are the most commonly used, because they have produced very decisive results worldwide in groundwater prospecting and resource-evaluation projects. The present work, which concerns only the electrical resistivity methodology, treats the adopted methodological approach and a case study of its application in the Ajdir Ain Leuh plateau.
ERIC Educational Resources Information Center
Coryn, Chris L. S.; Schroter, Daniela C.; Hanssen, Carl E.
2009-01-01
Brinkerhoff's Success Case Method (SCM) was developed with the specific purpose of assessing the impact of organizational interventions (e.g., training and coaching) on business goals by analyzing extreme groups using case study techniques and storytelling. As an efficient and cost-effective method of evaluative inquiry, SCM is attractive in other…
NASA Technical Reports Server (NTRS)
Hackett, J. E.; Sampath, S.; Phillips, C. G.
1981-01-01
A new, fast, non-iterative version of the "Wall Pressure Signature Method" is described and used to determine blockage and angle-of-attack wind tunnel corrections for highly-powered jet-flap models. The correction method is complemented by the application of tangential blowing at the tunnel floor to suppress flow breakdown there, using feedback from measured floor pressures. This tangential blowing technique was substantiated by subsequent flow investigations using an LV. The basic tests on an unswept, knee-blown, jet-flapped wing were supplemented to include the effects of slat removal, sweep, and the addition of unflapped tips. C sub mu values were varied from 0 to 10; free-air C sub l values in excess of 18 were measured in some cases. Application of the new methods yielded corrected data which agreed with corresponding large-tunnel "free air" results to within the limits of experimental accuracy in almost all cases. A program listing is provided, with sample cases.
Application of the Analog Method to Modelling Heat Waves: A Case Study with Power Transformers
2017-04-21
Massachusetts Institute of Technology, Lincoln Laboratory: "Application of the Analog Method to Modelling Heat Waves: A Case Study with Power Transformers." The excerpt covers calibration and validation statistics using five atmospheric variables to construct analogue diagnostics for JJA for transformer T2, and a representation of the electrical grid as a series of nodes (transformers) and edges (transmission lines) so that basic mathematical analysis can be performed.
Discussion on accuracy degree evaluation of accident velocity reconstruction model
NASA Astrophysics Data System (ADS)
Zou, Tiefang; Dai, Yingbiao; Cai, Ming; Liu, Jike
In order to investigate the applicability of accident velocity reconstruction models in different cases, a method for evaluating the accuracy degree of such models is given. Based on theoretical and calculated pre-crash velocities, an accuracy degree evaluation formula is obtained. Using a numerical simulation case, the accuracy degrees and applicability of two accident velocity reconstruction models are analyzed; the results show that this method is feasible in practice.
A Case Study in Persuasive Effect: Lyman Beecher on Duelling
ERIC Educational Resources Information Center
Minnick, Wayne C.
1971-01-01
The purpose of this paper is to describe and criticize methods critics commonly use to judge speech effects from historical records alone, and to provide a case study illustrating the application of those methods. (Author/JB)
[Application of case-based method in genetics and eugenics teaching].
Li, Ya-Xuan; Zhao, Xin; Zhang, Fei-Xiong; Hu, Ying-Kao; Yan, Yue-Ming; Cai, Min-Hua; Li, Xiao-Hui
2012-05-01
Genetics and Eugenics is a cross-discipline between genetics and eugenics. It is a common curriculum in many Chinese universities. In order to increase the learning interest, we introduced case teaching method and got a better teaching effect. Based on our teaching practices, we summarized some experiences about this subject. In this article, the main problem of case-based method applied in Genetics and Eugenics teaching was discussed.
Scandurra, Isabella; Hägglund, Maria; Koch, Sabine
2008-01-01
A significant problem with current health information technologies is that they poorly support the collaborative work of healthcare professionals, sometimes leading to fragmentation of workflow and disruption of healthcare processes. This paper presents two homecare cases, both applying multi-disciplinary thematic seminars (MdTS) as a collaborative method for user needs elicitation and requirements specification. The study describes the application of MdTS in the two cases to elicit user needs from different perspectives so that they coincide with the collaborative work practices of the professions involved. Despite different objectives, the two cases validated that MdTS emphasized the "points of intersection" in cooperative work. Different user groups with similar, yet distinct needs reached a common understanding of the entire work process, agreed upon requirements and participated in the design of prototypes supporting cooperative work. MdTS was applicable in both exploratory and normative studies aiming to elicit the specific requirements in a cooperative environment.
76 FR 37796 - Notice of Availability of Government-Owned Inventions; Available for Licensing
Federal Register 2010, 2011, 2012, 2013, 2014
2011-06-28
... Aircraft Division, Business and Partnership Office, Office of Research and Technology Applications... October 26, 2010 U.S. Patent Application No. 7,839,304 B2: Method and System for Alerting Aircrew to... Application No. 12/868,772: Colorimetric Method for Detection of Biodiesel in Fuel, Navy Case No. PAX37, filed...
Batch mode grid generation: An endangered species
NASA Technical Reports Server (NTRS)
Schuster, David M.
1992-01-01
Non-interactive grid generation schemes should thrive as emphasis shifts from development of numerical analysis and design methods to application of these tools to real engineering problems. A strong case is presented for the continued development and application of non-interactive geometry modeling methods. Guidelines, strategies, and techniques for developing and implementing these tools are presented using current non-interactive grid generation methods as examples. These schemes play an important role in the development of multidisciplinary analysis methods and some of these applications are also discussed.
NASA Astrophysics Data System (ADS)
Borodachev, S. M.
2016-06-01
A simple derivation of the recursive least squares (RLS) method equations is given as a special case of Kalman filter estimation of a constant system state under changing observation conditions. A numerical example illustrates the application of RLS to the multicollinearity problem.
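For reference, a minimal sketch of the standard RLS recursion with a forgetting factor (generic variable names; my own illustration rather than the paper's derivation):

```python
# Minimal recursive least squares (RLS) sketch: estimate theta in y = x' theta + noise.
import numpy as np

def rls(X, y, lam=0.99, delta=1000.0):
    """Return the RLS estimate after processing the rows of X sequentially."""
    n_features = X.shape[1]
    theta = np.zeros(n_features)
    P = delta * np.eye(n_features)       # large initial covariance
    for x_t, y_t in zip(X, y):
        Px = P @ x_t
        k = Px / (lam + x_t @ Px)        # gain vector
        theta = theta + k * (y_t - x_t @ theta)
        P = (P - np.outer(k, Px)) / lam  # covariance update
    return theta

rng = np.random.default_rng(1)
X = rng.normal(size=(200, 3))
true_theta = np.array([2.0, -1.0, 0.5])
y = X @ true_theta + 0.1 * rng.normal(size=200)
print(rls(X, y).round(2))
```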
An International Survey of Industrial Applications of Formal Methods. Volume 2. Case Studies
1993-09-30
Excerpts from the case studies discuss the impact of the product on IBM revenues; error rates claimed to be below the industrial average, with errors minimal to fix; the role of formal methods in critical applications, where software failures, particularly under first use, are a concern; and a follow-on project to add improved modelling capability.
Automated interferometric alignment system for paraboloidal mirrors
Maxey, L. Curtis
1993-01-01
A method is described for a systematic method of interpreting interference fringes obtained by using a corner cube retroreflector as an alignment aid when aligning a paraboloid to a spherical wavefront. This is applicable to any general case where such alignment is required, but is specifically applicable in the case of aligning an autocollimating test using a diverging beam wavefront. In addition, the method provides information which can be systematically interpreted such that independent information about pitch, yaw and focus errors can be obtained. Thus, the system lends itself readily to automation. Finally, although the method is developed specifically for paraboloids, it can be seen to be applicable to a variety of other aspheric optics when applied in combination with a wavefront corrector that produces a wavefront which, when reflected from the correctly aligned aspheric surface will produce a collimated wavefront like that obtained from the paraboloid when it is correctly aligned to a spherical wavefront.
Automated interferometric alignment system for paraboloidal mirrors
Maxey, L.C.
1993-09-28
A method is described for a systematic method of interpreting interference fringes obtained by using a corner cube retroreflector as an alignment aid when aligning a paraboloid to a spherical wavefront. This is applicable to any general case where such alignment is required, but is specifically applicable in the case of aligning an autocollimating test using a diverging beam wavefront. In addition, the method provides information which can be systematically interpreted such that independent information about pitch, yaw and focus errors can be obtained. Thus, the system lends itself readily to automation. Finally, although the method is developed specifically for paraboloids, it can be seen to be applicable to a variety of other aspheric optics when applied in combination with a wavefront corrector that produces a wavefront which, when reflected from the correctly aligned aspheric surface will produce a collimated wavefront like that obtained from the paraboloid when it is correctly aligned to a spherical wavefront. 14 figures.
Unicomb, Rachael; Colyvas, Kim; Harrison, Elisabeth; Hewat, Sally
2015-06-01
Case-study methodology is often used to study change in the field of speech-language pathology, but it can be criticized for not being statistically robust. Yet with the heterogeneous nature of many communication disorders, case studies allow clinicians and researchers to closely observe and report on change. Such information is valuable and can further inform large-scale experimental designs. In this research note, a statistical analysis for case-study data is outlined that employs a modification to the Reliable Change Index (Jacobson & Truax, 1991). The relationship between reliable change and clinical significance is discussed. Example data are used to guide the reader through the use and application of this analysis. A method of analysis is detailed that is suitable for assessing change in measures with binary categorical outcomes. The analysis is illustrated using data from one individual, measured before and after treatment for stuttering. The application of this approach to assess change in categorical, binary data has potential application in speech-language pathology. It enables clinicians and researchers to analyze results from case studies for their statistical and clinical significance. This new method addresses a gap in the research design literature, that is, the lack of analysis methods for noncontinuous data (such as counts, rates, and proportions of events) that may be used in case-study designs.
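For orientation, here is a small sketch of the classical Jacobson and Truax (1991) Reliable Change Index on which the note builds; the authors' modification for binary categorical data is not reproduced, and the reliability and standard deviation values are hypothetical:

```python
# Classical Reliable Change Index (Jacobson & Truax, 1991); illustrative values only.
import math

def reliable_change_index(pre, post, sd_pre, reliability):
    """RCI = observed change divided by the standard error of the difference."""
    se_measurement = sd_pre * math.sqrt(1.0 - reliability)
    se_difference = math.sqrt(2.0) * se_measurement
    return (post - pre) / se_difference

# Hypothetical example: percent syllables stuttered before and after treatment.
rci = reliable_change_index(pre=12.0, post=4.0, sd_pre=5.0, reliability=0.9)
print(round(rci, 2), "reliable change" if abs(rci) > 1.96 else "not reliable")
```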
Applications of a direct/iterative design method to complex transonic configurations
NASA Technical Reports Server (NTRS)
Smith, Leigh Ann; Campbell, Richard L.
1992-01-01
The current study explores the use of an automated direct/iterative design method for the reduction of drag in transport configurations, including configurations with engine nacelles. The method requires the user to choose a proper target-pressure distribution and then develops a corresponding airfoil section. The method can be applied to two-dimensional airfoil sections or to three-dimensional wings. The three cases that are presented show successful application of the method for reducing drag from various sources. The first two cases demonstrate the use of the method to reduce induced drag by designing to an elliptic span-load distribution and to reduce wave drag by decreasing the shock strength for a given lift. In the second case, a body-mounted nacelle is added, and the method is successfully used to eliminate the increase in wing drag associated with the nacelle addition by exploiting its ability to design to an arbitrary pressure distribution: the wing, in combination with the given underwing nacelle, is redesigned to the clean-wing target-pressure distributions. These cases illustrate several possible uses of the method for reducing different types of drag. The magnitude of the obtainable drag reduction varies with the constraints of the problem and the configuration to be modified.
ERIC Educational Resources Information Center
Frasier, Lori D.; Thraen, Ioana; Kaplan, Rich; Goede, Patricia
2012-01-01
Objectives: The training of physicians, nurse examiners, social workers and other health professional on the evidentiary findings of sexual abuse in children is challenging. Our objective was to develop peer reviewed training cases for medical examiners of child sexual abuse, using a secure web based telehealth application (TeleCAM). Methods:…
Matsumoto, Naoki; Takenaka, Toshifumi; Ikeda, Nobuyuki; Yazaki, Satoshi; Sato, Yuichi
2015-01-01
To present the method of Naegele forceps delivery clinically practiced by the lead author, its success rate, and morbidity and to evaluate the relationship between morbidity and the number of forceps traction applications. Naegele forceps delivery was performed when the fetal head reached station +2 cm, the forceps were applied in the maternal pelvic application, and traction was slowly and gently performed. In the past two years, Naegele forceps delivery was attempted by the lead author in 87 cases, which were retrospectively reviewed. The numbers of traction applications were one in 64.7% of cases, two in 24.7%, and three or more in 10.7%. The success rate was 100%. No severe morbidity was observed in mothers or neonates. Neonatal facial injury occurred most commonly in cases with fetal head malrotation, elevated numbers of traction applications, and maternal complications. Umbilical artery acidemia most commonly occurred in cases with nonreassuring fetal status. The significant crude odds ratio for three or more traction applications was 20 in cases with malrotation. Naegele forceps delivery has a high success rate, but multiple traction applications will sometimes be required, particularly in cases with malrotation. Malrotation and elevated numbers of traction applications may lead to neonatal head damage.
Modification of the flow pass method as applied to problems of chemistry of planet atmospheres
NASA Technical Reports Server (NTRS)
Parshev, V. A.
1980-01-01
It was shown that the modified flow pass method is highly effective, both in the case when the diffusion coefficient changes sharply in the examined region and in the case when diffusion is the dominant process compared with chemical reactions. The case in which a regular pass proves inapplicable, or applicable only in a limited interval of the governing parameters, was also examined.
Application of Canonical Effective Methods to Background-Independent Theories
NASA Astrophysics Data System (ADS)
Buyukcam, Umut
Effective formalisms play an important role in analyzing phenomena above some given length scale when complete theories are not accessible. In diverse exotic but physically important cases, the usual path-integral techniques used in a standard Quantum Field Theory approach seldom serve as adequate tools. This thesis exposes a new effective method for quantum systems, called the Canonical Effective Method, which has particularly wide applicability in background-independent theories, as in the case of gravitational phenomena. The central purpose of this work is to employ these techniques to obtain semi-classical dynamics from canonical quantum gravity theories. An application to non-associative quantum mechanics is developed and testable results are obtained. Types of non-associative algebras relevant for magnetic-monopole systems are discussed. Possible modifications of the hypersurface deformation algebra and the emergence of effective space-times are presented.
Formal functional test designs with a test representation language
NASA Technical Reports Server (NTRS)
Hops, J. M.
1993-01-01
The application of the category-partition method to the test design phase of hardware, software, or system test development is discussed. The method provides a formal framework for reducing the total number of possible test cases to a minimum logical subset for effective testing. An automatic tool and a formal language were developed to implement the method and produce the specification of test cases.
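A toy sketch of the category-partition idea (a generic example, not the tool or test representation language described in the paper): parameters are split into categories with discrete choices, and test frames are the constrained cross-product of those choices:

```python
# Toy category-partition test frame generator (illustrative only).
from itertools import product

# Categories and their partitioned choices for a hypothetical "copy file" command.
categories = {
    "source":      ["exists", "missing", "unreadable"],
    "destination": ["new", "already_exists"],
    "size":        ["empty", "small", "huge"],
}

def admissible(frame):
    # Constraint: a missing or unreadable source has no meaningful size choice.
    if frame["source"] != "exists" and frame["size"] != "empty":
        return False
    return True

names = list(categories)
frames = [dict(zip(names, combo)) for combo in product(*categories.values())
          if admissible(dict(zip(names, combo)))]

for i, frame in enumerate(frames, 1):
    print(f"test case {i}: {frame}")
print(f"{len(frames)} frames instead of "
      f"{len(list(product(*categories.values())))} unconstrained combinations")
```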
Application of a Novel Collaboration Engineering Method for Learning Design: A Case Study
ERIC Educational Resources Information Center
Cheng, Xusen; Li, Yuanyuan; Sun, Jianshan; Huang, Jianqing
2016-01-01
Collaborative case studies and computer-supported collaborative learning (CSCL) play an important role in the modern education environment. A number of researchers have given significant attention to learning design in order to improve the satisfaction of collaborative learning. Although collaboration engineering (CE) is a mature method widely…
Methods and Case Studies for Teaching and Learning about Failure and Safety.
ERIC Educational Resources Information Center
Bignell, Victor
1999-01-01
Discusses methods for analyzing case studies of failures of technological systems. Describes two distance learning courses that compare standard models of failure and success with the actuality of given scenarios. Provides teaching and learning materials and information sources for application to aspects of design, manufacture, inspection, use,…
NASA Astrophysics Data System (ADS)
Yuanyuan, Xu; Zhengmao, Zhang; Xiang, Fang; Yuanshuai, Xu; Xinxin, Song
2018-03-01
Combining theory and practice is a difficult problem in dispatcher training. Through a typical case example, this paper provides an effective case teaching method for dispatcher training that combines theoretical discussion of rules of experience with concrete cases and makes the material vivid. It helps students understand and grasp the key points of the theory and improve their practical skills.
ERIC Educational Resources Information Center
Ahrens, Fred; Mistry, Rajendra
2005-01-01
In product engineering there often arise design analysis problems for which a commercial software package is either unavailable or cost prohibitive. Further, these calculations often require successive iterations that can be time intensive when performed by hand, thus development of a software application is indicated. This case relates to the…
Illustrated structural application of universal first-order reliability method
NASA Technical Reports Server (NTRS)
Verderaime, V.
1994-01-01
The general application of the proposed first-order reliability method was achieved through the universal normalization of engineering probability distribution data. The method superimposes prevailing deterministic techniques and practices on the first-order reliability method to surmount deficiencies of the deterministic method and provide the benefits of reliability techniques and predictions. A reliability design factor is derived from the reliability criterion to satisfy a specified reliability and is analogous to the deterministic safety factor. Its application is numerically illustrated on several practical structural design and verification cases, with interesting results and insights. Two concepts of reliability selection criteria are suggested. Though the method was developed to support affordable structures for access to space, it should also be applicable to most high-performance air and surface transportation systems.
General image method in a plane-layered elastostatic medium
NASA Technical Reports Server (NTRS)
Fares, N.; Li, V. C.
1988-01-01
The general-image method presently used to obtain the elastostatic fields in plane-layered media relies on the use of potentials to represent elastic fields. For the case of a single interface, this method yields the displacement field in closed form and is applicable to antiplane, plane, and three-dimensional problems. In the case of multiple plane interfaces, the image method generates the displacement fields in terms of infinite series whose convergence can be accelerated to improve the method's efficiency.
Automated real-time software development
NASA Technical Reports Server (NTRS)
Jones, Denise R.; Walker, Carrie K.; Turkovich, John J.
1993-01-01
A Computer-Aided Software Engineering (CASE) system has been developed at the Charles Stark Draper Laboratory (CSDL) under the direction of the NASA Langley Research Center. The CSDL CASE tool provides an automated method of generating source code and hard copy documentation from functional application engineering specifications. The goal is to significantly reduce the cost of developing and maintaining real-time scientific and engineering software while increasing system reliability. This paper describes CSDL CASE and discusses demonstrations that used the tool to automatically generate real-time application code.
Waterson, Patrick; Robertson, Michelle M; Cooke, Nancy J; Militello, Laura; Roth, Emilie; Stanton, Neville A
2015-01-01
An important part of the application of sociotechnical systems theory (STS) is the development of methods, tools and techniques to assess human factors and ergonomics workplace requirements. We focus in this paper on describing and evaluating current STS methods for workplace safety, as well as outlining a set of six case studies covering the application of these methods to a range of safety contexts. We also describe an evaluation of the methods in terms of ratings of their ability to address a set of theoretical and practical questions (e.g. the degree to which methods capture static/dynamic aspects of tasks and interactions between system levels). The outcomes from the evaluation highlight a set of gaps relating to the coverage and applicability of current methods for STS and safety (e.g. coverage of external influences on system functioning; method usability). The final sections of the paper describe a set of future challenges, as well as some practical suggestions for tackling these. We provide an up-to-date review of STS methods, a set of case studies illustrating their use and an evaluation of their strengths and weaknesses. The paper concludes with a 'roadmap' for future work.
Direct application of Padé approximant for solving nonlinear differential equations.
Vazquez-Leal, Hector; Benhammouda, Brahim; Filobello-Nino, Uriel; Sarmiento-Reyes, Arturo; Jimenez-Fernandez, Victor Manuel; Garcia-Gervacio, Jose Luis; Huerta-Chua, Jesus; Morales-Mendoza, Luis Javier; Gonzalez-Lee, Mario
2014-01-01
This work presents a direct procedure for applying the Padé method to find approximate solutions of nonlinear differential equations. Moreover, we present some case studies showing the strength of the method in generating highly accurate rational approximate solutions compared to other semi-analytical methods. The types of nonlinear equations tested are: a highly nonlinear boundary value problem, a differential-algebraic oscillator problem, and an asymptotic problem. The highly accurate, handy approximations obtained by the direct application of the Padé method show the high potential of the proposed scheme to approximate a wide variety of problems. What is more, the direct application of the Padé approximant avoids the prior application of an approximative method such as the Taylor series method, homotopy perturbation method, Adomian decomposition method, homotopy analysis method, or variational iteration method as a tool to obtain a power series solution for post-treatment with the Padé approximant.
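For context, here is a small sketch of the conventional series-then-Padé post-treatment that the paper's direct procedure is designed to bypass; it uses a standard SciPy call on the Taylor coefficients of exp(x), which is not one of the paper's test problems:

```python
# Conventional route: truncated Taylor coefficients post-treated with a Pade approximant.
import numpy as np
from math import factorial
from scipy.interpolate import pade

# Taylor coefficients of exp(x) about 0, up to order 6.
coeffs = [1.0 / factorial(k) for k in range(7)]

# [3/3] Pade approximant: numerator p and denominator q as poly1d objects.
p, q = pade(coeffs, 3)

x = 2.0
print(p(x) / q(x), np.exp(x))  # rational approximation vs. the exact value
```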
Two Case Studies in the Scientific Method: Antisense Experiments and HIV Vaccination Studies.
ERIC Educational Resources Information Center
Guilfoile, Patrick
1999-01-01
Presents two recent cases that can be used in the classroom to illustrate the application of scientific methods in biological research: (1) the use of a complementary RNA or DNA molecule to block the production or translation of an mRNA molecule; and (2) the development of HIV trial vaccines. Contains 20 references. (WRM)
Applying Case-Based Method in Designing Self-Directed Online Instruction: A Formative Research Study
ERIC Educational Resources Information Center
Luo, Heng; Koszalka, Tiffany A.; Arnone, Marilyn P.; Choi, Ikseon
2018-01-01
This study investigated the case-based method (CBM) instructional-design theory and its application in designing self-directed online instruction. The purpose of this study was to validate and refine the theory for a self-directed online instruction context. Guided by formative research methodology, this study first developed an online tutorial…
NASA Technical Reports Server (NTRS)
Lahti, G. P.
1971-01-01
The method of steepest descent used in optimizing one-dimensional layered radiation shields is extended to multidimensional, multiconstraint situations. The multidimensional optimization algorithm and equations are developed for the case in which the dose constraint in any one direction depends only on the shield thicknesses in that direction and is independent of shield thicknesses in other directions. Expressions are derived for one-, two-, and three-dimensional cases (one, two, and three constraints). The procedure is applicable to the optimization of shields where there are different dose constraints and layering arrangements in the principal directions.
Máthé, Koppány; Buşoniu, Lucian
2015-01-01
Unmanned aerial vehicles (UAVs) have gained significant attention in recent years. Low-cost platforms using inexpensive sensor payloads have been shown to provide satisfactory flight and navigation capabilities. In this report, we survey vision and control methods that can be applied to low-cost UAVs, and we list some popular inexpensive platforms and application fields where they are useful. We also highlight the sensor suites used where this information is available. We overview, among others, feature detection and tracking, optical flow and visual servoing, low-level stabilization and high-level planning methods. We then list popular low-cost UAVs, selecting mainly quadrotors. We discuss applications, restricting our focus to the field of infrastructure inspection. Finally, as an example, we formulate two use-cases for railway inspection, a less explored application field, and illustrate the usage of the vision and control techniques reviewed by selecting appropriate ones to tackle these use-cases. To select vision methods, we run a thorough set of experimental evaluations. PMID:26121608
Considerations for the Systematic Analysis and Use of Single-Case Research
ERIC Educational Resources Information Center
Horner, Robert H.; Swaminathan, Hariharan; Sugai, George; Smolkowski, Keith
2012-01-01
Single-case research designs provide a rigorous research methodology for documenting experimental control. If single-case methods are to gain wider application, however, a need exists to define more clearly (a) the logic of single-case designs, (b) the process and decision rules for visual analysis, and (c) an accepted process for integrating…
Reasoning with case histories of process knowledge for efficient process development
NASA Technical Reports Server (NTRS)
Bharwani, Seraj S.; Walls, Joe T.; Jackson, Michael E.
1988-01-01
The significance of compiling case histories of empirical process knowledge and the role of such histories in improving the efficiency of manufacturing process development is discussed in this paper. Methods of representing important investigations as cases and using the information from such cases to eliminate redundancy of empirical investigations in analogous process development situations are also discussed. A system is proposed that uses such methods to capture the problem-solving framework of the application domain. A conceptual design of the system is presented and discussed.
Anisotropic adaptive mesh generation in two dimensions for CFD
DOE Office of Scientific and Technical Information (OSTI.GOV)
Borouchaki, H.; Castro-Diaz, M.J.; George, P.L.
This paper describes the extension of the classical Delaunay method to the case where anisotropic meshes are required, such as in CFD when the modeled physics is strongly directional. The way in which such a mesh generation method can be incorporated in an adaptive CFD loop, as well as the case of multicriteria adaptation, are discussed. Several concrete application examples are provided to illustrate the capabilities of the proposed method.
Shinu, Pottathil; Singh, Varsha; Nair, Anroop; Mehrishi, Priya; Mehta, Sonia; Joshi, Ekta
2013-10-01
The study was designed to compare the efficacy of the cetylpyridinium chloride (CPC) and sodium chloride (NaCl) decontamination method with the N-acetyl-L-cysteine (NALC) and sodium hydroxide (NaOH) decontamination method (the reference method) for the recovery of Mycobacterium tuberculosis (MTB) from clinically suspected cases of pulmonary tuberculosis. To evaluate the CPC-NaCl and NALC-NaOH decontamination methods, sputum specimens (n = 796) were studied (culturing on Löwenstein-Jensen medium), and the performances were compared. The CPC-NaCl decontamination method demonstrated a sensitivity, specificity, negative predictive value, and positive predictive value of 97.99%, 87.53%, 70.19%, and 99.32%, respectively, when compared to the NALC-NaOH decontamination method. In summary, the CPC-NaCl decontamination method detected a significantly higher number of MTB cases (n = 208) than the NALC-NaOH decontamination method (n = 149), particularly in sputum with scanty bacilli and in smear-negative cases, indicating the potential of the CPC-NaCl method to preserve paucibacillary cases more efficiently than the NALC-NaOH method. © 2013.
Measuring Workload Demand of Informatics Systems with the Clinical Case Demand Index
Iyengar, M. Sriram; Rogith, Deevakar; Florez-Arango, Jose F
2017-01-01
Introduction: The increasing use of Health Information Technology (HIT) can add substantially to workload on clinical providers. Current methods for assessing workload do not take into account the nature of clinical cases and the use of HIT tools while solving them. Methods: The Clinical Case Demand Index (CCDI), consisting of a summary score and visual representation, was developed to meet this need. Consistency with current perceived workload measures was evaluated in a Randomized Control Trial of a mobile health system. Results: CCDI is significantly correlated with existing workload measures and inversely related to provider performance. Discussion: CCDI combines subjective and objective characteristics of clinical cases along with cognitive and clinical dimensions. Applications include evaluation of HIT tools, clinician scheduling, medical education. Conclusion: CCDI supports comparative effectiveness research of HIT tools. In addition, CCDI could have numerous applications including training, clinical trials, design of clinical workflows, and others. PMID:29854166
The Application of Linear and Nonlinear Water Tanks Case Study in Teaching of Process Control
NASA Astrophysics Data System (ADS)
Li, Xiangshun; Li, Zhiang
2018-02-01
In traditional process control teaching, the importance of transmitting knowledge is emphasized while the development of students' creative and practical abilities is neglected, so traditional teaching methods are of limited help in training good engineers. Case teaching is a very useful way to improve students' innovative and practical abilities. In traditional case teaching, however, knowledge points are taught separately, based on different examples or on no examples at all, so it is hard for students to build up the whole knowledge structure; even when all the knowledge has been learned, using it to solve engineering problems remains challenging. In this paper, linear and nonlinear water tanks are taken as illustrative examples that involve several knowledge points of process control. The application of each knowledge point is discussed in detail and simulated. I believe this case-based study will be helpful for students.
ERIC Educational Resources Information Center
Koenig, Lane; Fields, Errol L.; Dall, Timothy M.; Ameen, Ansari Z.; Harwood, Henrick J.
This report demonstrates three applications of case-mix methods using regression analysis. The results are used to assess the relative effectiveness of substance abuse treatment providers. The report also examines the ability of providers to improve client employment outcomes, an outcome domain relatively unexamined in the assessment of provider…
Formalizing Space Shuttle Software Requirements
NASA Technical Reports Server (NTRS)
Crow, Judith; DiVito, Ben L.
1996-01-01
This paper describes two case studies in which requirements for new flight-software subsystems on NASA's Space Shuttle were analyzed, one using standard formal specification techniques, the other using state exploration. These applications serve to illustrate three main theses: (1) formal methods can complement conventional requirements analysis processes effectively, (2) formal methods confer benefits regardless of how extensively they are adopted and applied, and (3) formal methods are most effective when they are judiciously tailored to the application.
NASA Astrophysics Data System (ADS)
Çakır, Süleyman
2017-10-01
In this study, a two-phase methodology for resource allocation problems in a fuzzy environment is proposed. In the first phase, the imprecise Shannon's entropy method and the acceptability index are suggested, for the first time in the literature, to select the input and output variables to be used in the data envelopment analysis (DEA) application. In the second phase, an interval inverse DEA model is executed for resource allocation in the short run. In an effort to exemplify the practicality of the proposed fuzzy model, a real case application has been conducted involving 16 cement firms listed in Borsa Istanbul. The results of the case application indicated that the proposed hybrid model is a viable procedure for handling input-output selection and resource allocation problems under fuzzy conditions. The presented methodology can also lend itself to different applications such as multi-criteria decision-making problems.
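As background for the first phase, here is a sketch of the crisp Shannon entropy weighting commonly used to rank candidate criteria; the paper's imprecise/interval variant and acceptability index are not reproduced, and the decision matrix is hypothetical:

```python
# Crisp Shannon entropy weights for candidate criteria (illustrative data only).
import numpy as np

# Rows = firms, columns = candidate input/output variables (all positive values).
X = np.array([[120.0, 35.0, 4.1],
              [ 95.0, 40.0, 3.8],
              [150.0, 30.0, 4.5],
              [110.0, 38.0, 3.9]])

P = X / X.sum(axis=0)                       # column-wise proportions
k = 1.0 / np.log(X.shape[0])                # normalization constant
entropy = -k * (P * np.log(P)).sum(axis=0)  # entropy per criterion
diversity = 1.0 - entropy                   # degree of divergence
weights = diversity / diversity.sum()       # higher weight = more informative criterion
print(weights.round(3))
```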
Comparison of formant detection methods used in speech processing applications
NASA Astrophysics Data System (ADS)
Belean, Bogdan
2013-11-01
The paper describes time-frequency representations of the speech signal together with the significance of formants in speech processing applications. Speech formants can be used in emotion recognition, sex discrimination, or diagnosing different neurological diseases. Taking into account the various applications of formant detection in speech signals, two methods for detecting formants are presented. First, the poles resulting from a complex analysis of LPC coefficients are used for formant detection. The second approach uses the Kalman filter for formant prediction along the speech signal. Results are presented for both approaches on real-life speech spectrograms. A comparison of the features of the proposed methods is also performed, in order to establish which method is more suitable for different speech processing applications.
Case-Cohort Studies: Design and Applicability to Hand Surgery.
Vojvodic, Miliana; Shafarenko, Mark; McCabe, Steven J
2018-04-24
Observational studies are common research strategies in hand surgery. The case-cohort design offers an efficient and resource-friendly method for risk assessment and outcomes analysis. Case-cohorts remain underrepresented in upper extremity research despite several practical and economic advantages over case-control studies. This report outlines the purpose, utility, and structure of the case-cohort design and offers a sample research question to demonstrate its value to risk estimation for adverse surgical outcomes. The application of well-designed case-cohort studies is advocated in an effort to improve the quality and quantity of observational research evidence in hand and upper extremity surgery. Copyright © 2018 American Society for Surgery of the Hand. Published by Elsevier Inc. All rights reserved.
NASA Technical Reports Server (NTRS)
Kenny, Sean P.; Hou, Gene J. W.
1994-01-01
A method for eigenvalue and eigenvector approximate analysis for the case of repeated eigenvalues with distinct first derivatives is presented. The approximate analysis method developed involves a reparameterization of the multivariable structural eigenvalue problem in terms of a single positive-valued parameter. The resulting equations yield first-order approximations to changes in the eigenvalues and the eigenvectors associated with the repeated eigenvalue problem. This work also presents a numerical technique that facilitates the definition of an eigenvector derivative for the case of repeated eigenvalues with repeated eigenvalue derivatives (of all orders). Examples are given which demonstrate the application of such equations for sensitivity and approximate analysis. Emphasis is placed on the application of sensitivity analysis to large-scale structural and controls-structures optimization problems.
Combined use of videoendoscopy and X-ray imaging for improved monitoring of stenting application
NASA Astrophysics Data System (ADS)
Cysewska-Sobusiak, A. R.; Sowier, A.; Skrzywanek, P.
2005-09-01
This paper concerns advanced procedural and imaging techniques used in minimally invasive surgery and in non-operable cases of alimentary tract tumor therapy. Examples are described of videoendoscopy and X-ray imaging used for the application of stents (prostheses) and catheters that allow diagnostic and endo-therapeutic procedures to be performed. The possibility is indicated of developing a new method of proceeding in tumor therapy for patients in whom the methods used so far have been ineffective. The paper presents examples of combined imaging of the application of metallic stents and plastic catheters that allow diagnostic and therapeutic procedures to be performed. The cases shown refer to tumors located in the esophagus and in the bile and pancreatic ducts.
Sansom, P; Copley, V R; Naik, F C; Leach, S; Hall, I M
2013-01-01
Statistical methods used in spatio-temporal surveillance of disease are able to identify abnormal clusters of cases but typically do not provide a measure of the degree of association between one case and another. Such a measure would facilitate the assignment of cases to common groups and be useful in outbreak investigations of diseases that potentially share the same source. This paper presents a model-based approach, which on the basis of available location data, provides a measure of the strength of association between cases in space and time and which is used to designate and visualise the most likely groupings of cases. The method was developed as a prospective surveillance tool to signal potential outbreaks, but it may also be used to explore groupings of cases in outbreak investigations. We demonstrate the method by using a historical case series of Legionnaires’ disease amongst residents of England and Wales. PMID:23483594
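The paper's model-based association measure is not reproduced here; purely as an illustration of the idea of scoring pairwise space-time closeness of cases, the following sketch uses a simple separable kernel (Gaussian in space, exponential in time) with hypothetical bandwidths. Groupings could then be formed by thresholding the score matrix or by single-linkage clustering.

```python
import numpy as np

def spacetime_association(xy, t, sigma_km=2.0, tau_days=14.0):
    """Pairwise space-time association scores between cases, in [0, 1].

    xy: (n, 2) case locations in km; t: (n,) onset times in days.
    sigma_km and tau_days are illustrative kernel bandwidths."""
    d2 = ((xy[:, None, :] - xy[None, :, :]) ** 2).sum(-1)
    dt = np.abs(t[:, None] - t[None, :])
    A = np.exp(-d2 / (2.0 * sigma_km ** 2)) * np.exp(-dt / tau_days)
    np.fill_diagonal(A, 0.0)
    return A

# Hypothetical three-case example: the first two cases are close in space and time
xy = np.array([[0.0, 0.0], [0.5, 0.2], [10.0, 8.0]])
t = np.array([0.0, 3.0, 40.0])
print(spacetime_association(xy, t))
```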
The Role of CMR and Others in Project Implementation using the CM Method to Support the Government
NASA Astrophysics Data System (ADS)
Tada, Hiroshi; Miyatake, Ichiro; Mouri, Junji; Endo, Kenji; Fueta, Toshiharu
In Japan, the construction management (CM) method has been introduced as a measure to support the governmental agencies, in developing and maintaining local infrastructures, or in executing public works projects in an appropriate manner, etc. The scope of work of the Construction Manager (CMR) of the CM method is specified as work items, in the special specification document for CM services contained in the contract documents, as a reflection of the client's expectations towards the performance of CMR. However, the CM services has been conducted as required on a case-by-case basis, because it is not possible to anticipate the actual construction status in advance, and thus the special specification document does not provide full detail of the scope of work of CMR. In such case, there may be a difference in the way the scope of work in the special specification document is recognized between the client and the CMR, which could make the CM method less effective. Moreover, there is a case in which the role sharing between the client and the CMR is not clearly defined, and both parties may engage in the same task in such case, causing an obstacle for smooth project implementation. For this reason, it is required to prepare the special specification document which clearly defines the scope of work of CMR, by examining the status of application of the CM method in actual project cases, and to improve the practices of the CM method as necessary. In view of this background, this study looks in to the actual project cases using the CM method, for the purpose of clarifying the actual scope of work of CMR for each task item defined in the special specification document, and the role sharing between the client and CMR, in the aim of contributing the promotion of the use and the effective application of the CM method.
Ontology-Based Method for Fault Diagnosis of Loaders.
Xu, Feixiang; Liu, Xinhui; Chen, Wei; Zhou, Chen; Cao, Bingwei
2018-02-28
This paper proposes an ontology-based fault diagnosis method which overcomes the difficulty of understanding complex fault diagnosis knowledge of loaders and offers a universal approach for fault diagnosis of all loaders. The method contains the following components: (1) an ontology-based fault diagnosis model is proposed to achieve the integration, sharing, and reuse of fault diagnosis knowledge for loaders; (2) combined with the ontology, case-based reasoning (CBR) is introduced to realize effective and accurate fault diagnoses in four steps (feature selection, case retrieval, case matching, and case updating); and (3) to cover the shortcomings of the CBR method when relevant cases are lacking, ontology-based rule-based reasoning (RBR) is put forward by building SWRL (Semantic Web Rule Language) rules. An application program is also developed to implement the above methods and assist in finding the fault causes, fault locations, and maintenance measures of loaders. In addition, the program is validated through the analysis of a case study.
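The case-retrieval step of CBR can be illustrated with a minimal weighted nearest-neighbour match over symptom features; the feature names, weights, and loader cases below are hypothetical, and the paper's ontology and SWRL rule layers are not modelled.

```python
def retrieve_case(query, case_base, weights):
    """Weighted nearest-neighbour case retrieval for fault diagnosis.

    query: dict of symptom features; case_base: list of (features, diagnosis)
    pairs; weights: per-feature importance. Returns (similarity, diagnosis)."""
    def similarity(f1, f2):
        keys = set(f1) | set(f2)
        matched = sum(weights.get(k, 1.0) * (f1.get(k) == f2.get(k)) for k in keys)
        return matched / sum(weights.get(k, 1.0) for k in keys)
    return max((similarity(query, feats), diag) for feats, diag in case_base)

# Hypothetical loader case base
case_base = [({"oil_pressure": "low", "noise": "knock"}, "worn hydraulic pump"),
             ({"oil_pressure": "normal", "noise": "whine"}, "gearbox wear")]
weights = {"oil_pressure": 2.0, "noise": 1.0}
print(retrieve_case({"oil_pressure": "low", "noise": "whine"}, case_base, weights))
```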
Ontology-Based Method for Fault Diagnosis of Loaders
Liu, Xinhui; Chen, Wei; Zhou, Chen; Cao, Bingwei
2018-01-01
This paper proposes an ontology-based fault diagnosis method which overcomes the difficulty of understanding complex fault diagnosis knowledge of loaders and offers a universal approach for fault diagnosis of all loaders. The method contains the following components: (1) an ontology-based fault diagnosis model is proposed to achieve the integration, sharing, and reuse of fault diagnosis knowledge for loaders; (2) combined with the ontology, case-based reasoning (CBR) is introduced to realize effective and accurate fault diagnoses in four steps (feature selection, case retrieval, case matching, and case updating); and (3) to cover the shortcomings of the CBR method when relevant cases are lacking, ontology-based rule-based reasoning (RBR) is put forward by building SWRL (Semantic Web Rule Language) rules. An application program is also developed to implement the above methods and assist in finding the fault causes, fault locations, and maintenance measures of loaders. In addition, the program is validated through the analysis of a case study. PMID:29495646
Kilgore, Matthew D
The cardiology service line director at a health maintenance organization (HMO) in Washington State required a valid, reliable, and practical means for measuring workloads and other productivity factors for six heart failure (HF) registered nurse case managers located across three geographical regions. The Kilgore Heart Failure Case Management (KHFCM) Acuity Tool was systematically designed, developed, and validated to measure workload as a dependent function of the number of heart failure case management (HFCM) services rendered and the duration of times spent on various care duties. Research and development occurred at various HMO-affiliated internal medicine and cardiology offices throughout Western Washington. The concepts, methods, and principles used to develop the KHFCM Acuity Tool are applicable for any type of health care professional aiming to quantify workload using a high-quality objective tool. The content matter, scaling, and language on the KHFCM Acuity Tool are specific to HFCM settings. The content matter and numeric scales for the KHFCM Acuity Tool were developed and validated using a mixed-method participant action research method applied to a group of six outpatient HF case managers and their respective caseloads. The participant action research method was selected because the application of this method requires research participants to become directly involved in the diagnosis of research problems, the planning and execution of actions taken to address those problems, and the implementation of progressive strategies throughout the course of the study, as necessary, to produce the most credible and practical practice improvements. Heart failure case managers served clients with New York Heart Association Functional Class III-IV HF, and encounters were conducted primarily by telephone or in-office consultation. A mix of qualitative and quantitative results demonstrated a variety of quality improvement outcomes achieved by the design and practice application of the KHFCM Acuity Tool. Quality improvement outcomes included a more valid reflection of encounter times and demonstration of the KHFCM Acuity Tool as a reliable, practical, credible, and satisfying tool for reflecting HF case manager workloads and HF disease severity. The KHFCM Acuity Tool defines workload simply as a function of the number of HFCM services performed and the duration of time spent on a client encounter. The design of the tool facilitates the measure of workload, service utilization, and HF disease characteristics, independently from the overall measure of acuity, so that differences in individual case manager practice, as well as client characteristics within sites, across sites, and potentially throughout annual seasons, can be demonstrated. Data produced from long-term applications of the KHFCM Acuity Tool, across all regions, could serve as a driver for establishing systemwide HFCM productivity benchmarks or standards of practice for HF case managers. Data produced from localized applications could serve as a reference for coordinating staffing resources or developing HFCM productivity benchmarks within individual regions or sites.
Langholz, Bryan; Thomas, Duncan C.; Stovall, Marilyn; Smith, Susan A.; Boice, John D.; Shore, Roy E.; Bernstein, Leslie; Lynch, Charles F.; Zhang, Xinbo; Bernstein, Jonine L.
2009-01-01
Summary Methods for the analysis of individually matched case-control studies with location-specific radiation dose and tumor location information are described. These include likelihood methods for analyses that just use cases with precise location of tumor information and methods that also include cases with imprecise tumor location information. The theory establishes that each of these likelihood based methods estimates the same radiation rate ratio parameters, within the context of the appropriate model for location and subject level covariate effects. The underlying assumptions are characterized and the potential strengths and limitations of each method are described. The methods are illustrated and compared using the WECARE study of radiation and asynchronous contralateral breast cancer. PMID:18647297
Waterson, Patrick; Robertson, Michelle M.; Cooke, Nancy J.; Militello, Laura; Roth, Emilie; Stanton, Neville A.
2015-01-01
An important part of the application of sociotechnical systems theory (STS) is the development of methods, tools and techniques to assess human factors and ergonomics workplace requirements. We focus in this paper on describing and evaluating current STS methods for workplace safety, as well as outlining a set of six case studies covering the application of these methods to a range of safety contexts. We also describe an evaluation of the methods in terms of ratings of their ability to address a set of theoretical and practical questions (e.g. the degree to which methods capture static/dynamic aspects of tasks and interactions between system levels). The outcomes from the evaluation highlight a set of gaps relating to the coverage and applicability of current methods for STS and safety (e.g. coverage of external influences on system functioning; method usability). The final sections of the paper describe a set of future challenges, as well as some practical suggestions for tackling these. Practitioner Summary: We provide an up-to-date review of STS methods, a set of case studies illustrating their use and an evaluation of their strengths and weaknesses. The paper concludes with a ‘roadmap’ for future work. PMID:25832121
Analysis of experts' perception of the effectiveness of teaching methods
NASA Astrophysics Data System (ADS)
Kindra, Gurprit S.
1984-03-01
The present study attempts to shed light on the perceptions of business educators regarding the effectiveness of six methodologies in achieving Gagné's five learning outcomes. Results of this study empirically confirm the oft-stated contention that no one method is globally effective for the attainment of all objectives. Specifically, business games, traditional lecture, and case study methods are perceived to be most effective for the learning of application, knowledge acquisition, and analysis and application, respectively.
Current Practices in Constructing and Evaluating Assurance Cases With Applications to Aviation
NASA Technical Reports Server (NTRS)
Rinehart, David J.; Knight, John C.; Rowanhill, Jonathan
2015-01-01
This report introduces and provides an overview of assurance cases including theory, practice, and evaluation. This report includes a section that introduces the principles, terminology, and history of assurance cases. The core of the report presents twelve example uses of assurance cases from a range of domains, using a novel classification scheme. The report also reviews the state of the art in assurance case evaluation methods.
NASA Astrophysics Data System (ADS)
Kaiya, Haruhiko; Osada, Akira; Kaijiri, Kenji
We present a method to identify stakeholders and their preferences about non-functional requirements (NFR) by using use case diagrams of existing systems. We focus on changes to NFR because such changes help stakeholders identify their preferences. Comparing different use case diagrams of the same domain helps us find the changes likely to occur. We utilize the Goal-Question-Metric (GQM) method to identify variables that characterize NFR, so that changes to NFR can be represented systematically using those variables. Use cases that represent system interactions help bridge the gap between goals and metrics (variables), making it easy to construct measurable NFR. To validate and evaluate our method, we applied it to the application domain of Mail User Agent (MUA) systems.
Evaluating Health Information Systems Using Ontologies
Anderberg, Peter; Larsson, Tobias C; Fricker, Samuel A; Berglund, Johan
2016-01-01
Background There are several frameworks that attempt to address the challenges of evaluation of health information systems by offering models, methods, and guidelines about what to evaluate, how to evaluate, and how to report the evaluation results. Model-based evaluation frameworks usually suggest universally applicable evaluation aspects but do not consider case-specific aspects. On the other hand, evaluation frameworks that are case specific, by eliciting user requirements, limit their output to the evaluation aspects suggested by the users in the early phases of system development. In addition, these case-specific approaches extract different sets of evaluation aspects from each case, making it challenging to collectively compare, unify, or aggregate the evaluation of a set of heterogeneous health information systems. Objectives The aim of this paper is to find a method capable of suggesting evaluation aspects for a set of one or more health information systems—whether similar or heterogeneous—by organizing, unifying, and aggregating the quality attributes extracted from those systems and from an external evaluation framework. Methods On the basis of the available literature in semantic networks and ontologies, a method (called Unified eValuation using Ontology; UVON) was developed that can organize, unify, and aggregate the quality attributes of several health information systems into a tree-style ontology structure. The method was extended to integrate its generated ontology with the evaluation aspects suggested by model-based evaluation frameworks. An approach was developed to extract evaluation aspects from the ontology that also considers evaluation case practicalities such as the maximum number of evaluation aspects to be measured or their required degree of specificity. The method was applied and tested in Future Internet Social and Technological Alignment Research (FI-STAR), a project of 7 cloud-based eHealth applications that were developed and deployed across European Union countries. Results The relevance of the evaluation aspects created by the UVON method for the FI-STAR project was validated by the corresponding stakeholders of each case. These evaluation aspects were extracted from a UVON-generated ontology structure that reflects both the internally declared required quality attributes in the 7 eHealth applications of the FI-STAR project and the evaluation aspects recommended by the Model for ASsessment of Telemedicine applications (MAST) evaluation framework. The extracted evaluation aspects were used to create questionnaires (for the corresponding patients and health professionals) to evaluate each individual case and the whole of the FI-STAR project. Conclusions The UVON method can provide a relevant set of evaluation aspects for a heterogeneous set of health information systems by organizing, unifying, and aggregating the quality attributes through ontological structures. Those quality attributes can be either suggested by evaluation models or elicited from the stakeholders of those systems in the form of system requirements. The method continues to be systematic, context sensitive, and relevant across a heterogeneous set of health information systems. PMID:27311735
Understanding What It Means for Assurance Cases to "Work"
NASA Technical Reports Server (NTRS)
Rinehart, David J.; Knight, John C.; Rowanhill, Jonathan
2017-01-01
This report is the result of our year-long investigation into assurance case practices and effectiveness. Assurance cases are a method for working toward acceptable critical system performance. They represent a significant thread of applied assurance methods extending back many decades and being employed in a range of industries and applications. Our research presented in this report includes a literature survey of over 50 sources and interviews with nearly a dozen practitioners in the field. We have organized our results into seven major claimed assurance case benefits and their supporting mechanisms, evidence, counter-evidence, and caveats.
The "four quadrants" approach to clinical ethics case analysis; an application and review.
Sokol, D K
2008-07-01
In 1982, Jonsen, Siegler and Winslade published Clinical Ethics, in which they described the "four quadrants" approach, a new method of analysing clinical ethics cases. Although the book is now in its 6th edition, a literature search has revealed only one academic paper demonstrating the method at work. This paper is an attempt to start filling this gap. As a way of describing and testing the approach, I apply the four quadrants method to a detailed clinical ethics case. The analysis is interspersed with reflections on the method itself. It is hoped that this experiment will encourage ethicists and clinicians to devote more attention to this neglected approach.
Prediction Interval Development for Wind-Tunnel Balance Check-Loading
NASA Technical Reports Server (NTRS)
Landman, Drew; Toro, Kenneth G.; Commo, Sean A.; Lynn, Keith C.
2014-01-01
Results from the Facility Analysis Verification and Operational Reliability project revealed a critical gap in capability in ground-based aeronautics research applications. Without a standardized process for check-loading the wind-tunnel balance or the model system, the quality of the aerodynamic force data collected varied significantly between facilities. A prediction interval is required in order to confirm a check-loading. The prediction interval provides an expected upper and lower bound on the balance load prediction at a given confidence level. A method has been developed which accounts for sources of variability due to calibration and check-load application. The prediction interval calculation method and a case study demonstrating its use are provided. Validation of the methods is demonstrated for the case study based on the probability of capture of confirmation points.
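The standard regression prediction interval that underlies this kind of check-load confirmation can be sketched as follows; this is the generic ordinary-least-squares form, not the exact NASA procedure, and the calibration data in the example are hypothetical.

```python
import numpy as np
from scipy import stats

def prediction_interval(X, y, x0, conf=0.95):
    """Two-sided prediction interval for a new response at design point x0,
    from an ordinary least-squares calibration fit y ~ X beta."""
    n, p = X.shape
    beta, *_ = np.linalg.lstsq(X, y, rcond=None)
    resid = y - X @ beta
    s2 = resid @ resid / (n - p)                    # residual variance
    se = np.sqrt(s2 * (1.0 + x0 @ np.linalg.inv(X.T @ X) @ x0))
    t = stats.t.ppf(0.5 + conf / 2.0, n - p)
    yhat = x0 @ beta
    return yhat - t * se, yhat + t * se

# Hypothetical single-component calibration: reading ~ b0 + b1 * applied load
rng = np.random.default_rng(3)
load = rng.uniform(0, 100, 30)
X = np.column_stack([np.ones(30), load])
y = 0.5 + 0.02 * load + rng.normal(0, 0.01, 30)
print(prediction_interval(X, y, np.array([1.0, 50.0])))
```

A check-load confirmation point would then be accepted if the measured value falls inside the interval at the chosen confidence level.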
Efficient strategy for detecting gene × gene joint action and its application in schizophrenia.
Won, Sungho; Kwon, Min-Seok; Mattheisen, Manuel; Park, Suyeon; Park, Changsoon; Kihara, Daisuke; Cichon, Sven; Ophoff, Roel; Nöthen, Markus M; Rietschel, Marcella; Baur, Max; Uitterlinden, Andre G; Hofmann, A; Lange, Christoph
2014-01-01
We propose a new approach to detect gene × gene joint action in genome-wide association studies (GWASs) for case-control designs. This approach offers an exhaustive search for all two-way joint action (including, as a special case, single gene action) that is computationally feasible at the genome-wide level and has reasonable statistical power under most genetic models. We found that the presence of any gene × gene joint action may imply differences in three types of genetic components: the minor allele frequencies and the amounts of Hardy-Weinberg disequilibrium may differ between cases and controls, and between the two genetic loci the degree of linkage disequilibrium may differ between cases and controls. Using Fisher's method, it is possible to combine the different sources of genetic information in an overall test for detecting gene × gene joint action. The proposed statistical analysis is efficient and its simplicity makes it applicable to GWASs. In the current study, we applied the proposed approach to a GWAS on schizophrenia and found several potential gene × gene interactions. Our application illustrates the practical advantage of the proposed method. © 2013 WILEY PERIODICALS, INC.
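Fisher's method, named in the abstract as the device for combining the component tests, is a one-line statistic: X = -2 * sum(ln p_i), referred to a chi-square distribution with 2k degrees of freedom. The p-values in the example are hypothetical stand-ins for the allele-frequency, Hardy-Weinberg, and linkage-disequilibrium components.

```python
import numpy as np
from scipy import stats

def fishers_method(pvals):
    """Combine independent p-values with Fisher's method."""
    pvals = np.asarray(pvals, dtype=float)
    stat = -2.0 * np.log(pvals).sum()
    return stats.chi2.sf(stat, df=2 * len(pvals))

# e.g. combining three component tests for one SNP pair
print(fishers_method([0.03, 0.20, 0.07]))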
Applicability of transfer tensor method for open quantum system dynamics.
Gelzinis, Andrius; Rybakovas, Edvardas; Valkunas, Leonas
2017-12-21
Accurate simulation of open quantum system dynamics is a long-standing issue in the field of chemical physics. Exact methods exist, but are costly, while perturbative methods are limited in their applicability. Recently, a new black-box-type method, called the transfer tensor method (TTM), was proposed [J. Cerrillo and J. Cao, Phys. Rev. Lett. 112, 110401 (2014)]. It allows one to accurately simulate long-time dynamics with the numerical cost of solving a time-convolution master equation, provided many initial system evolution trajectories are obtained from some exact method beforehand. The possible time savings thus strongly depend on the ratio of the total versus the initial evolution length. In this work, we investigate the parameter regimes where an application of TTM would be most beneficial in terms of computational time. We identify several promising parameter regimes. Although some of them correspond to cases when perturbative theories could be expected to perform well, we find that the accuracy of such approaches depends on system parameters in a more complex way than is commonly thought. We propose that the TTM should be applied whenever system evolution is expected to be long and the accuracy of perturbative methods cannot be ensured, or in cases when the system under consideration does not correspond to any single perturbative regime.
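The TTM recursion of Cerrillo and Cao can be sketched compactly: transfer tensors T_n are extracted from exact short-time dynamical maps E_n, and the state is then propagated with the truncated memory kernel. The code below is an illustrative implementation on vectorised density matrices; the toy Markovian example is ours.

```python
import numpy as np

def transfer_tensors(E):
    """Transfer tensors T_1..T_K from exact dynamical maps E_1..E_K
    (each a d^2 x d^2 superoperator), via T_n = E_n - sum_m T_{n-m} E_m."""
    K = len(E)
    T = [None] * (K + 1)
    for n in range(1, K + 1):
        T[n] = E[n - 1].copy()
        for m in range(1, n):
            T[n] -= T[n - m] @ E[m - 1]
    return T[1:]

def propagate(T, rho0_vec, n_steps):
    """Propagate a vectorised state with the learned (truncated) memory kernel."""
    history = [rho0_vec]
    for n in range(1, n_steps + 1):
        rho = sum(T[m - 1] @ history[n - m] for m in range(1, min(n, len(T)) + 1))
        history.append(rho)
    return history

# Markovian toy example: exact maps are powers of one fixed map A, so T_1 = A
A = 0.9 * np.eye(4)
E = [np.linalg.matrix_power(A, k) for k in range(1, 6)]
T = transfer_tensors(E)
print(np.allclose(T[0], A), np.allclose(T[1], 0))
```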
Young, Robin L; Weinberg, Janice; Vieira, Verónica; Ozonoff, Al; Webster, Thomas F
2010-07-19
A common, important problem in spatial epidemiology is measuring and identifying variation in disease risk across a study region. In application of statistical methods, the problem has two parts. First, spatial variation in risk must be detected across the study region and, second, areas of increased or decreased risk must be correctly identified. The location of such areas may give clues to environmental sources of exposure and disease etiology. One statistical method applicable in spatial epidemiologic settings is a generalized additive model (GAM) which can be applied with a bivariate LOESS smoother to account for geographic location as a possible predictor of disease status. A natural hypothesis when applying this method is whether residential location of subjects is associated with the outcome, i.e. is the smoothing term necessary? Permutation tests are a reasonable hypothesis testing method and provide adequate power under a simple alternative hypothesis. These tests have yet to be compared to other spatial statistics. This research uses simulated point data generated under three alternative hypotheses to evaluate the properties of the permutation methods and compare them to the popular spatial scan statistic in a case-control setting. Case 1 was a single circular cluster centered in a circular study region. The spatial scan statistic had the highest power though the GAM method estimates did not fall far behind. Case 2 was a single point source located at the center of a circular cluster and Case 3 was a line source at the center of the horizontal axis of a square study region. Each had linearly decreasing log odds with distance from the point. The GAM methods outperformed the scan statistic in Cases 2 and 3. Comparing sensitivity, measured as the proportion of the exposure source correctly identified as high or low risk, the GAM methods outperformed the scan statistic in all three Cases. The GAM permutation testing methods provide a regression-based alternative to the spatial scan statistic. Across all hypotheses examined in this research, the GAM methods had competing or greater power estimates and sensitivities exceeding that of the spatial scan statistic.
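The permutation-testing logic can be illustrated with a deliberately simplified stand-in for the GAM: smooth case status over space with a k-nearest-neighbour average (instead of a bivariate LOESS within a GAM), use the range of the smoothed risk surface as the statistic, and permute case labels over locations to build the null distribution. Function names and the smoother choice are ours, not the authors'.

```python
import numpy as np

def knn_risk_surface(xy, status, k=50):
    """Smoothed local case proportion at each point (k-nearest neighbours)."""
    d2 = ((xy[:, None, :] - xy[None, :, :]) ** 2).sum(-1)
    idx = np.argsort(d2, axis=1)[:, :k]
    return status[idx].mean(axis=1)

def spatial_permutation_test(xy, status, k=50, n_perm=999, seed=None):
    """Permutation p-value for 'is residential location associated with case
    status?'. xy: (n, 2) locations; status: (n,) array of 0/1 indicators."""
    rng = np.random.default_rng(seed)
    obs = np.ptp(knn_risk_surface(xy, status, k))
    null = [np.ptp(knn_risk_surface(xy, rng.permutation(status), k))
            for _ in range(n_perm)]
    return (1 + sum(s >= obs for s in null)) / (n_perm + 1)

# Hypothetical usage:  p = spatial_permutation_test(xy, status)
```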
2010-01-01
Background A common, important problem in spatial epidemiology is measuring and identifying variation in disease risk across a study region. In application of statistical methods, the problem has two parts. First, spatial variation in risk must be detected across the study region and, second, areas of increased or decreased risk must be correctly identified. The location of such areas may give clues to environmental sources of exposure and disease etiology. One statistical method applicable in spatial epidemiologic settings is a generalized additive model (GAM) which can be applied with a bivariate LOESS smoother to account for geographic location as a possible predictor of disease status. A natural hypothesis when applying this method is whether residential location of subjects is associated with the outcome, i.e. is the smoothing term necessary? Permutation tests are a reasonable hypothesis testing method and provide adequate power under a simple alternative hypothesis. These tests have yet to be compared to other spatial statistics. Results This research uses simulated point data generated under three alternative hypotheses to evaluate the properties of the permutation methods and compare them to the popular spatial scan statistic in a case-control setting. Case 1 was a single circular cluster centered in a circular study region. The spatial scan statistic had the highest power though the GAM method estimates did not fall far behind. Case 2 was a single point source located at the center of a circular cluster and Case 3 was a line source at the center of the horizontal axis of a square study region. Each had linearly decreasing log odds with distance from the point. The GAM methods outperformed the scan statistic in Cases 2 and 3. Comparing sensitivity, measured as the proportion of the exposure source correctly identified as high or low risk, the GAM methods outperformed the scan statistic in all three Cases. Conclusions The GAM permutation testing methods provide a regression-based alternative to the spatial scan statistic. Across all hypotheses examined in this research, the GAM methods had competing or greater power estimates and sensitivities exceeding that of the spatial scan statistic. PMID:20642827
Applying TEAM in Regional Sketch Planning: Three Case Studies in Atlanta, Orlando, St. Louis
This EPA report documents 3 case studies of the application of TEAM (Travel Efficiency Assessment Method) to develop, assess and quantify regional greenhouse gas and criteria pollutant emission reductions from travel efficiency strategies in a cost effecti
The Application of Simulation Method in Isothermal Elastic Natural Gas Pipeline
NASA Astrophysics Data System (ADS)
Xing, Chunlei; Guan, Shiming; Zhao, Yue; Cao, Jinggang; Chu, Yanji
2018-02-01
The elastic pipeline mathematical model is of crucial importance in natural gas pipeline simulation because of its agreement with practical industrial cases. The numerical model of the elastic pipeline introduces nonlinear complexity into the discretized equations, so the Newton-Raphson method cannot achieve fast convergence on this kind of problem. Therefore, a new Newton-based method with the Powell-Wolfe condition is presented to simulate isothermal elastic pipeline flow. The results obtained by the new method are given for the defined boundary conditions. It is shown that the method converges in all cases and significantly reduces computational cost.
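The general idea of a Newton step safeguarded by a Wolfe-condition line search can be sketched as below; this is a generic illustration, not the authors' pipeline-flow solver, and the toy objective is hypothetical.

```python
import numpy as np
from scipy.optimize import line_search

def newton_wolfe(f, grad, hess, x0, tol=1e-8, max_iter=50):
    """Newton iteration with a line search that enforces the Wolfe conditions."""
    x = np.asarray(x0, dtype=float)
    for _ in range(max_iter):
        g = grad(x)
        if np.linalg.norm(g) < tol:
            break
        p = np.linalg.solve(hess(x), -g)        # Newton direction
        alpha = line_search(f, grad, x, p)[0]   # step length satisfying Wolfe conditions
        if alpha is None:                       # fall back to a small damped step
            alpha = 1e-3
        x = x + alpha * p
    return x

# Tiny smooth test problem (hypothetical)
f = lambda x: (x[0] - 2) ** 4 + (x[1] + 1) ** 2
grad = lambda x: np.array([4 * (x[0] - 2) ** 3, 2 * (x[1] + 1)])
hess = lambda x: np.array([[12 * (x[0] - 2) ** 2, 0.0], [0.0, 2.0]])
print(newton_wolfe(f, grad, hess, [0.0, 0.0]))
```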
An operational modal analysis method in frequency and spatial domain
NASA Astrophysics Data System (ADS)
Wang, Tong; Zhang, Lingmi; Tamura, Yukio
2005-12-01
A frequency and spatial domain decomposition method (FSDD) for operational modal analysis (OMA) is presented in this paper, which is an extension of the complex mode indicator function (CMIF) method for experimental modal analysis (EMA). The theoretical background of the FSDD method is clarified. Singular value decomposition is adopted to separate the signal space from the noise space. Finally, an enhanced power spectrum density (PSD) is proposed to obtain more accurate modal parameters by curve fitting in the frequency domain. Moreover, a simulation case and an application case are used to validate this method.
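The CMIF-style core of such frequency-and-spatial-domain methods is an SVD of the output cross-power-spectral-density matrix at every frequency line: peaks of the first singular value curve indicate candidate modes, and the corresponding singular vectors approximate the operational mode shapes. The sketch below shows that core step only (the enhanced-PSD curve fitting of the FSDD method is not reproduced), and the function name is ours.

```python
import numpy as np
from scipy.signal import csd

def cmif(responses, fs, nperseg=1024):
    """Complex Mode Indicator Function from output-only vibration data.

    responses: (n_channels, n_samples) array of measured responses.
    Returns the frequency vector and the singular values of the cross-PSD
    matrix at each frequency (descending order along the last axis)."""
    n_ch = responses.shape[0]
    f, _ = csd(responses[0], responses[0], fs=fs, nperseg=nperseg)
    G = np.empty((len(f), n_ch, n_ch), dtype=complex)
    for i in range(n_ch):
        for j in range(n_ch):
            _, G[:, i, j] = csd(responses[i], responses[j], fs=fs, nperseg=nperseg)
    svals = np.linalg.svd(G, compute_uv=False)
    return f, svals

# Hypothetical usage:  f, s = cmif(acceleration_signals, fs=256); inspect s[:, 0] vs f
```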
NASA Astrophysics Data System (ADS)
Miller, Urszula; Grzelka, Agnieszka; Romanik, Elżbieta; Kuriata, Magdalena
2018-01-01
The operation of municipal management facilities is inseparable from the problem of malodorous compound emissions to the atmosphere. Here the odor nuisance is related to the chemical composition of waste, sewage, and sludge, as well as to the activity of microorganisms whose metabolic products can themselves be odorous compounds. A significant reduction of odorant emission from many sources can be achieved by optimizing process parameters and conditions. However, it is not always possible to limit the formation of odorants, and in such cases it is best to use appropriate deodorizing methods. The choice of method is based on the physical parameters and emission intensity of the polluted gases and, where it can be determined, their composition. Among the solutions used in the municipal sector, physico-chemical methods such as sorption and oxidation can be distinguished. In cases where the emission source is not enclosed, odor masking techniques are used, which consist of spraying preparations that neutralize unpleasant odors. The paper presents the characteristics of selected methods for eliminating odor nuisance and evaluates their applicability in municipal management facilities.
NASA Technical Reports Server (NTRS)
Jones, Robert T
1937-01-01
A simplified treatment of the application of Heaviside's operational methods to problems of airplane dynamics is given. Certain graphical methods and logarithmic formulas that lessen the amount of computation involved are explained. The problem representing a gust disturbance or control manipulation is taken up and it is pointed out that in certain cases arbitrary control manipulations may be dealt with as though they imposed specific constraints on the airplane, thus avoiding the necessity of any integration. The application of the calculations described in the text is illustrated by several examples chosen to show the use of the methods and the practicability of the graphical and logarithmic computations described.
Therapeutic Uses of the WebCam in Child Psychiatry
ERIC Educational Resources Information Center
Chlebowski, Susan; Fremont, Wanda
2011-01-01
Objective: The authors provide examples for the use of the WebCam as a therapeutic tool in child psychiatry, discussing cases to demonstrate the application of the WebCam, which is most often used in psychiatry training programs during resident supervision and for case presentations. Method: Six cases illustrate the use of the WebCam in individual…
ERIC Educational Resources Information Center
Spirer, Janet E.
In comparison with traditional experimental design, which is concerned with what happened, a case study approach is more appropriate for answering the question of why or how something happened. As an alternative complementary-vocational-education-evaluation approach, the case study attempts to describe and analyze some program in comprehensive…
A Real-Life Case Study of Audit Interactions--Resolving Messy, Complex Problems
ERIC Educational Resources Information Center
Beattie, Vivien; Fearnley, Stella; Hines, Tony
2012-01-01
Real-life accounting and auditing problems are often complex and messy, requiring the synthesis of technical knowledge in addition to the application of generic skills. To help students acquire the necessary skills to deal with these problems effectively, educators have called for the use of case-based methods. Cases based on real situations (such…
Campos, Eneida Rached; Moreira-Filho, Djalma de Carvalho; Silva, Marcos Tadeu Nolasco da
2018-05-01
Scores to predict treatment outcomes have earned a well-deserved place in healthcare practice. However, when used to help achieve excellence in the care of a given disease, scores should also take into account organizational and social aspects. This article aims to create scores for obtaining key variables and to demonstrate their application in the management of care for a given disease. We present a method called Epidemiological Planning for Patient Care Trajectory (PELC) and its application in research on pediatric HIV patients. This case study is presented by means of two studies: the first deals with the development of the PELC method, and the second is a pediatric HIV case-control study based on the PELC method. The pediatric HIV research, the first practical PELC application, found four key variables for the individual quality level of care trajectories: adherence to ART, attending at least one appointment with the otolaryngologist, attending at least one appointment with social services, and having missed one or more routine appointments. We believe the PELC method can be used in research on any kind of care trajectory, contributing to quality improvements in health services, with emphasis on patient safety and equity in healthcare.
Harmony search method: theory and applications.
Gao, X Z; Govindasamy, V; Xu, H; Wang, X; Zenger, K
2015-01-01
The Harmony Search (HS) method is an emerging metaheuristic optimization algorithm, which has been employed to cope with numerous challenging tasks during the past decade. In this paper, the essential theory and applications of the HS algorithm are first described and reviewed. Several typical variants of the original HS are next briefly explained. As a case study example, a modified HS method inspired by the idea of Pareto-dominance-based ranking is also presented. It is further applied to handle a practical wind generator optimal design problem.
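For readers new to the algorithm, a minimal continuous-variable Harmony Search loop looks roughly like the sketch below (the standard memory-considering, pitch-adjustment, and random-consideration operators); parameter values and the sphere-function example are illustrative, and the Pareto-ranking modification of the paper is not included.

```python
import numpy as np

def harmony_search(obj, bounds, hms=20, hmcr=0.9, par=0.3, bw=0.05, iters=2000, seed=0):
    """Minimal Harmony Search for continuous minimisation.

    hms: harmony memory size, hmcr: harmony memory considering rate,
    par: pitch adjusting rate, bw: pitch-adjustment bandwidth."""
    rng = np.random.default_rng(seed)
    lo, hi = np.array(bounds, dtype=float).T
    dim = len(lo)
    hm = rng.uniform(lo, hi, size=(hms, dim))            # harmony memory
    fit = np.apply_along_axis(obj, 1, hm)
    for _ in range(iters):
        new = np.empty(dim)
        for d in range(dim):
            if rng.random() < hmcr:                      # pick from memory
                new[d] = hm[rng.integers(hms), d]
                if rng.random() < par:                   # pitch adjustment
                    new[d] += bw * (hi[d] - lo[d]) * rng.uniform(-1, 1)
            else:                                        # random consideration
                new[d] = rng.uniform(lo[d], hi[d])
        new = np.clip(new, lo, hi)
        f_new = obj(new)
        worst = np.argmax(fit)
        if f_new < fit[worst]:                           # replace the worst harmony
            hm[worst], fit[worst] = new, f_new
    best = np.argmin(fit)
    return hm[best], fit[best]

# e.g. minimise the sphere function in 2-D
print(harmony_search(lambda x: float(np.sum(x ** 2)), [(-5, 5), (-5, 5)]))
```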
Goal-Oriented Intelligence in Optimization of Distributed Parameter Systems
2004-08-01
Yarus and R.L. Chambers, editors, Stochastic Modeling and Geostatistics – Principles, Methods, and Case Studies, AAPG Computer Applications in Geology, No. 3, The American Association of Petroleum Geologists, Tulsa, OK, USA.
A general description of detachment for multidimensional modelling of biofilms.
Xavier, Joao de Bivar; Picioreanu, Cristian; van Loosdrecht, Mark C M
2005-09-20
A general method for describing biomass detachment in multidimensional biofilm modelling is introduced. Biomass losses from processes acting on the entire surface of the biofilm, such as erosion, are modelled using a continuous detachment speed function F(det). Discrete detachment events, i.e. sloughing, are implicitly derived from simulations. The method is flexible to allow F(det) to take several forms, including expressions dependent on any state variables such as the local biofilm density. This methodology for biomass detachment was integrated with multidimensional (2D and 3D) particle-based multispecies biofilm models by using a novel application of the level set method. Application of the method is illustrated by trends in the dynamics of biofilms structure and activity derived from simulations performed on a simple model considering uniform biomass (case study I) and a model discriminating biomass composition in heterotrophic active mass, extracellular polymeric substances (EPS) and inert mass (case study II). Results from case study I demonstrate the effect of applied detachment forces as a fundamental factor influencing steady-state biofilm activity and structure. Trends from experimental observations reported in literature were correctly described. For example, simulation results indicated that biomass sloughing is reduced when erosion forces are increased. Case study II illustrates the application of the detachment methodology to systems with non-uniform biomass composition. Simulations carried out at different bulk concentrations of substrate show changes in biofilm structure (in terms of shape, density and spatial distribution of biomass components) and activity (in terms of oxygen and substrate consumption) as a consequence of either oxygen-limited or substrate-limited growth. (c) 2005 Wiley Periodicals, Inc.
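A very reduced illustration of a continuous detachment speed function acting on a biofilm surface is given below: erosion speed growing with local height (F_det = k_det * h^2 is one common choice) removes biomass each time step. This 1-D height-map sketch is ours and deliberately omits the level-set machinery and the multispecies biomass model of the paper; the rate constant and profile values are hypothetical.

```python
import numpy as np

def erode(height, k_det=1e-3, dt=1.0, steps=100):
    """Apply an erosion-type detachment speed F_det = k_det * h**2 to a
    biofilm height profile; columns reaching zero represent complete
    local detachment."""
    h = height.astype(float).copy()
    for _ in range(steps):
        f_det = k_det * h ** 2            # detachment speed per unit time
        h = np.maximum(h - f_det * dt, 0.0)
    return h

# Hypothetical 1-D biofilm profile (heights in micrometres)
profile = np.array([120.0, 180.0, 90.0, 240.0, 60.0])
print(erode(profile))
```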
A Solution Adaptive Technique Using Tetrahedral Unstructured Grids
NASA Technical Reports Server (NTRS)
Pirzadeh, Shahyar Z.
2000-01-01
An adaptive unstructured grid refinement technique has been developed and successfully applied to several three-dimensional inviscid flow test cases. The method is based on a combination of surface mesh subdivision and local remeshing of the volume grid. Simple functions of flow quantities are employed to detect dominant features of the flowfield. The method is designed for modular coupling with various error/feature analyzers and flow solvers. Several steady-state, inviscid flow test cases are presented to demonstrate the applicability of the method for solving practical three-dimensional problems. In all cases, accurate solutions featuring complex, nonlinear flow phenomena such as shock waves and vortices have been generated automatically and efficiently.
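A "simple function of flow quantities" used as a refinement sensor can be as plain as the undivided difference of a flow variable between neighbouring cells, flagged against a threshold; the sketch below is a generic illustration of that idea on a toy connectivity list, not NASA's feature analyzer, and the data are hypothetical.

```python
import numpy as np

def refinement_flags(cell_values, neighbours, threshold=0.1):
    """Flag cells for refinement where the maximum undivided difference of a
    flow quantity (e.g. density or Mach number) against any neighbouring
    cell exceeds a threshold."""
    flags = np.zeros(len(cell_values), dtype=bool)
    for c, nbrs in enumerate(neighbours):
        if nbrs and max(abs(cell_values[c] - cell_values[n]) for n in nbrs) > threshold:
            flags[c] = True
    return flags

# Hypothetical 5-cell mesh with a shock between cells 2 and 3
values = np.array([1.00, 1.01, 1.02, 1.60, 1.61])
neighbours = [[1], [0, 2], [1, 3], [2, 4], [3]]
print(refinement_flags(values, neighbours))
```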
Achille, Cristiana; Adami, Andrea; Chiarini, Silvia; Cremonesi, Stefano; Fassi, Francesco; Fregonese, Luigi; Taffurelli, Laura
2015-06-30
This paper examines the survey of tall buildings in an emergency context, such as post-seismic events. The after-earthquake survey has to guarantee time savings, high precision, and safety during the operational stages. The main goal is to optimize the application of methodologies based on the acquisition and automatic processing of photogrammetric data, including the use of Unmanned Aerial Vehicle (UAV) systems, in order to provide fast and low-cost operations. The suggested methods integrate new technologies with commonly used ones such as TLS and topographic acquisition. The value of the photogrammetric application is demonstrated by a test case based on the comparison of acquisition, calibration, and 3D modeling results obtained with a laser scanner, a metric camera, and an amateur reflex camera. The test helps demonstrate the efficiency of image-based methods in the acquisition of complex architecture. The case study is the Santa Barbara bell tower in Mantua. The applied survey solution allows a complete 3D database of the complex architectural structure to be obtained for the extraction of all the information needed for significant intervention. This demonstrates the applicability of UAV photogrammetry for the survey of vertical structures, complex buildings, and architectural parts that are difficult to access, providing high-precision results.
Evaluating Health Information Systems Using Ontologies.
Eivazzadeh, Shahryar; Anderberg, Peter; Larsson, Tobias C; Fricker, Samuel A; Berglund, Johan
2016-06-16
There are several frameworks that attempt to address the challenges of evaluation of health information systems by offering models, methods, and guidelines about what to evaluate, how to evaluate, and how to report the evaluation results. Model-based evaluation frameworks usually suggest universally applicable evaluation aspects but do not consider case-specific aspects. On the other hand, evaluation frameworks that are case specific, by eliciting user requirements, limit their output to the evaluation aspects suggested by the users in the early phases of system development. In addition, these case-specific approaches extract different sets of evaluation aspects from each case, making it challenging to collectively compare, unify, or aggregate the evaluation of a set of heterogeneous health information systems. The aim of this paper is to find a method capable of suggesting evaluation aspects for a set of one or more health information systems-whether similar or heterogeneous-by organizing, unifying, and aggregating the quality attributes extracted from those systems and from an external evaluation framework. On the basis of the available literature in semantic networks and ontologies, a method (called Unified eValuation using Ontology; UVON) was developed that can organize, unify, and aggregate the quality attributes of several health information systems into a tree-style ontology structure. The method was extended to integrate its generated ontology with the evaluation aspects suggested by model-based evaluation frameworks. An approach was developed to extract evaluation aspects from the ontology that also considers evaluation case practicalities such as the maximum number of evaluation aspects to be measured or their required degree of specificity. The method was applied and tested in Future Internet Social and Technological Alignment Research (FI-STAR), a project of 7 cloud-based eHealth applications that were developed and deployed across European Union countries. The relevance of the evaluation aspects created by the UVON method for the FI-STAR project was validated by the corresponding stakeholders of each case. These evaluation aspects were extracted from a UVON-generated ontology structure that reflects both the internally declared required quality attributes in the 7 eHealth applications of the FI-STAR project and the evaluation aspects recommended by the Model for ASsessment of Telemedicine applications (MAST) evaluation framework. The extracted evaluation aspects were used to create questionnaires (for the corresponding patients and health professionals) to evaluate each individual case and the whole of the FI-STAR project. The UVON method can provide a relevant set of evaluation aspects for a heterogeneous set of health information systems by organizing, unifying, and aggregating the quality attributes through ontological structures. Those quality attributes can be either suggested by evaluation models or elicited from the stakeholders of those systems in the form of system requirements. The method continues to be systematic, context sensitive, and relevant across a heterogeneous set of health information systems.
ERIC Educational Resources Information Center
Hazzard, Eric L.; Moreno, Elizabeth; Beall, Deborah L.; Zidenberg-Cherr, Sheri
2012-01-01
Objective: To compare the applicant schools (AS) to non-applicant schools (NAS) residing in the same school districts for the California Instructional School Garden Program and identify barriers to the application process. Methods: A case-control, cross-sectional study design was used to compare resources and school environments. Pearson…
Aerospace Applications of Optimization under Uncertainty
NASA Technical Reports Server (NTRS)
Padula, Sharon; Gumbert, Clyde; Li, Wu
2003-01-01
The Multidisciplinary Optimization (MDO) Branch at NASA Langley Research Center develops new methods and investigates opportunities for applying optimization to aerospace vehicle design. This paper describes MDO Branch experiences with three applications of optimization under uncertainty: (1) improved impact dynamics for airframes, (2) transonic airfoil optimization for low drag, and (3) coupled aerodynamic/structures optimization of a 3-D wing. For each case, a brief overview of the problem and references to previous publications are provided. The three cases are aerospace examples of the challenges and opportunities presented by optimization under uncertainty. The present paper will illustrate a variety of needs for this technology, summarize promising methods, and uncover fruitful areas for new research.
Aerospace Applications of Optimization under Uncertainty
NASA Technical Reports Server (NTRS)
Padula, Sharon; Gumbert, Clyde; Li, Wu
2006-01-01
The Multidisciplinary Optimization (MDO) Branch at NASA Langley Research Center develops new methods and investigates opportunities for applying optimization to aerospace vehicle design. This paper describes MDO Branch experiences with three applications of optimization under uncertainty: (1) improved impact dynamics for airframes, (2) transonic airfoil optimization for low drag, and (3) coupled aerodynamic/structures optimization of a 3-D wing. For each case, a brief overview of the problem and references to previous publications are provided. The three cases are aerospace examples of the challenges and opportunities presented by optimization under uncertainty. The present paper will illustrate a variety of needs for this technology, summarize promising methods, and uncover fruitful areas for new research.
Cheng, S.; Tian, G.; Xia, J.; He, H.; Shi, Z.; ,
2004-01-01
The multichannel analysis of surface waves (MASW) is a newly developed method that has been employed in various environmental and engineering geophysics applications overseas; however, only a few case studies can be found in China. Most importantly, there has been no application of MASW in desert areas, in China or abroad. We present a case study investigating the low-depression velocity layer in the Temple of North Taba area of the Erdos Basin. The MASW method successfully defined the low-depression velocity layer in this desert area. Comparing results obtained by MASW with those from the seismic refraction method, we discuss the efficiency and simplicity of applying MASW in the desert area. The maximum investigation depth can reach 60 m in the study area when the acquisition and processing parameters are carefully chosen. The MASW method can compensate for the limitations of the refraction method and the micro-seismograph log method in investigating the low-depression velocity layer, and it is a powerful tool for investigating complicated near-surface materials, with many unique advantages.
Evidence Arguments for Using Formal Methods in Software Certification
NASA Technical Reports Server (NTRS)
Denney, Ewen W.; Pai, Ganesh
2013-01-01
We describe a generic approach for automatically integrating the output generated from a formal method/tool into a software safety assurance case, as an evidence argument, by (a) encoding the underlying reasoning as a safety case pattern, and (b) instantiating it using the data produced from the method/tool. We believe this approach not only improves the trustworthiness of the evidence generated from a formal method/tool, by explicitly presenting the reasoning and mechanisms underlying its genesis, but also provides a way to gauge the suitability of the evidence in the context of the wider assurance case. We illustrate our work by application to a real example, an unmanned aircraft system, where we invoke a formal code analysis tool from its autopilot software safety case, automatically transform the verification output into an evidence argument, and then integrate it into the former.
Application of geostatistics to coal-resource characterization and mine planning. Final report
DOE Office of Scientific and Technical Information (OSTI.GOV)
Kauffman, P.W.; Walton, D.R.; Martuneac, L.
1981-12-01
Geostatistics is a proven method of ore reserve estimation in many non-coal mining areas but little has been published concerning its application to coal resources. This report presents the case for using geostatistics for coal mining applications and describes how a coal mining concern can best utilize geostatistical techniques for coal resource characterization and mine planning. An overview of the theory of geostatistics is also presented. Many of the applications discussed are documented in case studies that are a part of the report. The results of an exhaustive literature search are presented and recommendations are made for needed future research and demonstration projects.
77 FR 28860 - Notice of Availability of Government-Owned Inventions; Available for Licensing
Federal Register 2010, 2011, 2012, 2013, 2014
2012-05-16
.../372,755: Foam Free Testing Systems and Methods, Navy Case No. 101448//U.S. Patent Application No. 7,372,712: Foam Free Testing Systems and Methods. ADDRESSES: Requests for copies of the inventions cited...
This presentation from the 2016 TRB Summer Conference on Transportation Planning and Air Quality summarizes the application of the Travel Efficiency Assessment Method (TEAM) which analyzed selected transportation emission reduction strategies in three case
Di Schiavi, Maria Teresa; Foti, Marina; Mosconi, Maria Cristina; Mattiolo, Giuseppina; Cavallina, Roberta
2014-01-01
Irradiation is a preservation technology used to improve the safety and hygienic quality of food. The aim of this study was to assess the applicability and validity of the microbiological screening method combining the direct epifluorescence filter technique (DEFT) with the aerobic plate count (APC) (EN 13783:2001) for the identification of irradiated herbs and spices. Tests on non-irradiated and irradiated samples of dried herbs and spices were performed. The method is based on the comparison of the APC with the count obtained using DEFT. In accordance with the standard reference, the method is not applicable to samples with APC < 10³ colony forming units (CFU)/g, and this is its main limitation. The results obtained in our laboratories showed that in 50% of the non-irradiated samples and in 96% of the samples treated with ionising radiation, the method was not applicable due to a count below 10³ CFU/g. PMID:27800348
NASA Astrophysics Data System (ADS)
Tarigan, A. P. M.; Rahmad, D.; Sembiring, R. A.; Iskandar, R.
2018-02-01
This paper illustrates an application of the Analytical Hierarchy Process (AHP) as a potential decision-making method in water resource management related to drainage rehabilitation. The prioritization of urban drainage rehabilitation in Medan City under a limited budget is used as a case study. A hierarchical structure is formed for the prioritization criteria and the alternative drainages to be rehabilitated. Based on the AHP, the prioritization criteria are ranked and a descending-order list of drainages is produced in order to select the most favorable drainages for rehabilitation. A sensitivity analysis is then conducted to check the consistency of the final decisions in case of minor changes in the judgements. The results of the AHP computed manually are compared with those obtained using the Expert Choice software. The top three ranked drainages are consistent, and the manually calculated and Expert Choice AHP results are in agreement. It is hoped that the application of the AHP will support the city government's decision-making in the problem of urban drainage rehabilitation.
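The standard AHP computation behind such a ranking is the principal eigenvector of the pairwise comparison matrix, checked with Saaty's consistency ratio. The sketch below is generic (the criteria names and comparison values are hypothetical, not the Medan City matrices).

```python
import numpy as np

def ahp_priorities(A):
    """Priority vector (principal eigenvector) and consistency ratio for a
    pairwise comparison matrix A, using Saaty's random index values."""
    n = A.shape[0]
    eigvals, eigvecs = np.linalg.eig(A)
    k = np.argmax(eigvals.real)
    w = np.abs(eigvecs[:, k].real)
    w /= w.sum()
    ci = (eigvals[k].real - n) / (n - 1)                 # consistency index
    ri = {1: 0.0, 2: 0.0, 3: 0.58, 4: 0.90, 5: 1.12, 6: 1.24, 7: 1.32}[n]
    return w, (ci / ri if ri else 0.0)

# Hypothetical 3-criterion comparison (e.g. flood risk, cost, social impact)
A = np.array([[1, 3, 5],
              [1 / 3, 1, 2],
              [1 / 5, 1 / 2, 1]], dtype=float)
w, cr = ahp_priorities(A)
print(w, cr)   # a consistency ratio below about 0.10 is usually considered acceptable
```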
Inoue, Yuji; Yoneyama, Masami; Nakamura, Masanobu; Ozaki, Satoshi; Ito, Kenjiro; Hiura, Mikio
2012-01-01
Vulnerable plaque can induce ischemic symptoms, and magnetic resonance imaging of the carotid artery is valuable for detecting such plaque. The magnetization-prepared rapid acquisition with gradient echo (MPRAGE) method can detect hemorrhagic vulnerable plaque as a high-intensity signal; however, blood flow is not sufficiently masked by this method. The contrast for plaque in T
Wilkinson, Jessica; Goff, Morgan; Rusoja, Evan; Hanson, Carl; Swanson, Robert Chad
2018-06-01
This review of systems thinking (ST) case studies seeks to compile and analyse cases from ST literature and provide practitioners with a reference for ST in health practice. Particular attention was given to (1) reviewing the frequency and use of key ST terms, methods, and tools in the context of health, and (2) extracting and analysing longitudinal themes across cases. A systematic search of databases was conducted, and a total of 36 case studies were identified. A combination of integrative and inductive qualitative approaches to analysis was used. Most cases identified took place in high-income countries and applied ST retrospectively. The most commonly used ST terms were agent/stakeholder/actor (n = 29), interdependent/interconnected (n = 28), emergence (n = 26), and adaptability/adaptation (n = 26). Common ST methods and tools were largely underutilized. Social network analysis was the most commonly used method (n = 4), and innovation or change management history was the most frequently used tool (n = 11). Four overarching themes were identified; the importance of the interdependent and interconnected nature of a health system, characteristics of leaders in a complex adaptive system, the benefits of using ST, and barriers to implementing ST. This review revealed that while much has been written about the potential benefits of applying ST to health, it has yet to completely transition from theory to practice. There is however evidence of the practical use of an ST lens as well as specific methods and tools. With clear examples of ST applications, the global health community will be better equipped to understand and address key health challenges. © 2017 John Wiley & Sons, Ltd.
NASA Astrophysics Data System (ADS)
Park, Jihoon; Yang, Guang; Satija, Addy; Scheidt, Céline; Caers, Jef
2016-12-01
Sensitivity analysis plays an important role in geoscientific computer experiments, whether for forecasting, data assimilation or model calibration. In this paper we focus on an extension of a method of regionalized sensitivity analysis (RSA) to applications typical in the Earth Sciences. Such applications involve the building of large complex spatial models, the application of computationally extensive forward modeling codes and the integration of heterogeneous sources of model uncertainty. The aim of this paper is to be practical: 1) provide Matlab code, 2) provide novel visualization methods to aid users in getting a better understanding of the sensitivity, 3) provide a method based on kernel principal component analysis (KPCA) and self-organizing maps (SOM) to account for the spatial uncertainty typical in Earth Science applications, and 4) provide an illustration on a real field case where the above-mentioned complexities present themselves. We present methods that extend the original RSA method in several ways. First, we present the calculation of conditional effects, defined as the sensitivity of a parameter given a level of another parameter. Second, we show how this conditional effect can be used to choose nominal values or ranges for fixing insensitive parameters while minimally affecting uncertainty in the response. Third, we develop a method based on KPCA and SOM to assign a rank to spatial models in order to calculate the sensitivity to spatial variability in the models. A large oil/gas reservoir case is used as an illustration of these ideas.
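The classic RSA step that the paper extends can be sketched simply: split Monte Carlo samples into behavioural and non-behavioural sets by thresholding the response, then score each parameter by the Kolmogorov-Smirnov distance between the two sets' marginal distributions. The sketch below shows only that baseline (the conditional effects and the KPCA/SOM ranking of spatial models are not reproduced), and the example data are hypothetical.

```python
import numpy as np
from scipy.stats import ks_2samp

def regionalized_sensitivity(params, response, threshold):
    """Classic RSA scores: larger KS distance = more sensitive parameter.

    params: (n_samples, n_params) Monte Carlo parameter samples;
    response: (n_samples,) model responses; samples with response <= threshold
    are labelled behavioural."""
    behav = response <= threshold
    scores = []
    for j in range(params.shape[1]):
        res = ks_2samp(params[behav, j], params[~behav, j])
        scores.append(res.statistic)
    return np.array(scores)

# Hypothetical example: 3 uncertain parameters, only the first one matters
rng = np.random.default_rng(1)
P = rng.uniform(0, 1, size=(2000, 3))
y = 10 * P[:, 0] + rng.normal(0, 0.5, 2000)
print(regionalized_sensitivity(P, y, threshold=np.median(y)))
```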
Determine the Compressive Strength of Calcium Silicate Bricks by Combined Nondestructive Method
2014-01-01
The paper deals with the application of a combined nondestructive method for assessing the compressive strength of calcium silicate bricks; in this case, a combination of the rebound hammer method and the ultrasonic pulse method. Calibration relationships for determining the compressive strength of calcium silicate bricks from nondestructive test parameters are quoted for the combined method as well as for the L-type Schmidt rebound hammer and the ultrasonic pulse method individually. The calibration relationships show close correlation and are applicable in practice. The highest correlation between the nondestructive measurement parameters and the predicted compressive strength is obtained using the combined SonReb nondestructive method. The combined SonReb method was proved applicable for determining the compressive strength of calcium silicate bricks in checking tests at a production plant and for evaluating bricks built into existing masonry structures. PMID:25276864
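One commonly used form of a SonReb calibration relationship is a power law, f_c = a * R^b * V^c, fitted by log-linear least squares; the sketch below uses that generic form with hypothetical data, and may differ from the specific relationships quoted in the paper.

```python
import numpy as np

def fit_sonreb(rebound, velocity, strength):
    """Fit f_c = a * R**b * V**c by log-linear least squares."""
    X = np.column_stack([np.ones_like(rebound), np.log(rebound), np.log(velocity)])
    coef, *_ = np.linalg.lstsq(X, np.log(strength), rcond=None)
    return np.exp(coef[0]), coef[1], coef[2]          # a, b, c

def predict_sonreb(a, b, c, rebound, velocity):
    """Predicted compressive strength from rebound number and pulse velocity."""
    return a * rebound ** b * velocity ** c

# Hypothetical calibration data for calcium silicate bricks
R = np.array([38.0, 42.0, 45.0, 50.0])      # rebound number
V = np.array([2.1, 2.3, 2.4, 2.6])          # ultrasonic pulse velocity, km/s
fc = np.array([14.0, 18.0, 21.0, 27.0])     # compressive strength, MPa
a, b, c = fit_sonreb(R, V, fc)
print(a, b, c, predict_sonreb(a, b, c, 44.0, 2.35))
```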
48 CFR 1852.227-70 - New technology.
Code of Federal Regulations, 2011 CFR
2011-10-01
... method; or to operate, in case of a machine or system; and, in each case, under such conditions as to... contract. Reportable items include, but are not limited to, new processes, machines, manufactures, and compositions of matter, and improvements to, or new applications of, existing processes, machines, manufactures...
Parental Cognitive Impairment and Child Maltreatment in Canada
ERIC Educational Resources Information Center
McConnell, David; Feldman, Maurice; Aunos, Marjorie; Prasad, Narasimha
2011-01-01
Objectives: The aim of this study was to determine the prevalence of parental cognitive impairment in cases opened for child maltreatment investigation in Canada, and to examine the relationship between parental cognitive impairment and maltreatment investigation outcomes including substantiation, case disposition and court application. Methods:…
Theory, implementation and applications of nonstationary Gabor frames
Balazs, P.; Dörfler, M.; Jaillet, F.; Holighaus, N.; Velasco, G.
2011-01-01
Signal analysis with classical Gabor frames leads to a fixed time–frequency resolution over the whole time–frequency plane. To overcome the limitations imposed by this rigidity, we propose an extension of Gabor theory that leads to the construction of frames with time–frequency resolution changing over time or frequency. We describe the construction of the resulting nonstationary Gabor frames and give the explicit formula for the canonical dual frame for a particular case, the painless case. We show that wavelet transforms, constant-Q transforms and more general filter banks may be modeled in the framework of nonstationary Gabor frames. Further, we present the results in the finite-dimensional case, which provides a method for implementing the above-mentioned transforms with perfect reconstruction. Finally, we elaborate on two applications of nonstationary Gabor frames in audio signal processing, namely a method for automatic adaptation to transients and an algorithm for an invertible constant-Q transform. PMID:22267893
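The central idea of the abstract, analysis windows whose length changes over time, can be illustrated with a minimal sketch. This is not the frame construction or the canonical dual of the paper; it simply computes windowed spectra with a short window near an assumed transient and a long window elsewhere, and the signal, window lengths and transient position are assumptions.

```python
import numpy as np

fs = 8000
t = np.arange(fs) / fs
signal = np.sin(2 * np.pi * 440 * t)
signal[4000:4040] += 2.0 * np.hanning(40)          # short transient (assumed position)

# Time-varying resolution: short analysis windows near the transient, long ones elsewhere.
centers = np.arange(512, len(signal) - 512, 256)
coeffs = []
for c in centers:
    near_transient = abs(c - 4020) < 512
    L = 128 if near_transient else 1024            # window length adapted over time
    seg = signal[c - L // 2: c + L // 2] * np.hanning(L)
    coeffs.append(np.fft.rfft(seg))                # frequency resolution follows window length

print([len(c) for c in coeffs][:8])                # coarse vs. fine frequency grids
```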
Dependability analysis of parallel systems using a simulation-based approach. M.S. Thesis
NASA Technical Reports Server (NTRS)
Sawyer, Darren Charles
1994-01-01
The analysis of dependability in large, complex, parallel systems executing real applications or workloads is examined in this thesis. To effectively demonstrate the wide range of dependability problems that can be analyzed through simulation, the analysis of three case studies is presented. For each case, the organization of the simulation model used is outlined, and the results from simulated fault injection experiments are explained, showing the usefulness of this method in dependability modeling of large parallel systems. The simulation models are constructed using DEPEND and C++. Where possible, methods to increase dependability are derived from the experimental results. Another interesting facet of all three cases is the presence of some kind of workload or application executing in the simulation while faults are injected. This provides a completely new dimension to this type of study, one not possible to model accurately with analytical approaches.
Application of the critical pathway and integrated case teaching method to nursing orientation.
Goodman, D
1997-01-01
Nursing staff development programs must be responsive to current changes in healthcare. New nursing staff must be prepared to manage continuous change and to function competently in clinical practice. The orientation pathway, based on a case management model, is used as a structure for the orientation phase of staff development, and the integrated case is incorporated as a teaching strategy within it. The integrated case method is based on discussion and analysis of patient situations, with emphasis on role modeling and the integration of theory and skill. Together, the orientation pathway and the integrated case teaching method provide a useful framework for orienting new staff, and educators, preceptors and orientees find the structure of the pathway valuable. Orientation that is developed, implemented and evaluated on this basis provides a standardized structure that is designed for the adult learner, promotes conceptual reasoning, and encourages the social and contextual basis for continued learning.
Nonlinear modeling of chaotic time series: Theory and applications
NASA Astrophysics Data System (ADS)
Casdagli, M.; Eubank, S.; Farmer, J. D.; Gibson, J.; Desjardins, D.; Hunter, N.; Theiler, J.
We review recent developments in the modeling and prediction of nonlinear time series. In some cases, apparent randomness in time series may be due to chaotic behavior of a nonlinear but deterministic system. In such cases, it is possible to exploit the determinism to make short term forecasts that are much more accurate than one could make from a linear stochastic model. This is done by first reconstructing a state space, and then using nonlinear function approximation methods to create a dynamical model. Nonlinear models are valuable not only as short term forecasters, but also as diagnostic tools for identifying and quantifying low-dimensional chaotic behavior. During the past few years, methods for nonlinear modeling have developed rapidly, and have already led to several applications where nonlinear models motivated by chaotic dynamics provide superior predictions to linear models. These applications include prediction of fluid flows, sunspots, mechanical vibrations, ice ages, measles epidemics, and human speech.
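The forecasting procedure described above (state-space reconstruction followed by nonlinear function approximation) can be sketched in a few lines of Python. A logistic-map series stands in for measured data, and the embedding dimension, delay and a nearest-neighbour local predictor are illustrative assumptions rather than the specific models of the review.

```python
import numpy as np

# Chaotic logistic-map series stands in for a measured time series (assumption for illustration).
x = np.empty(2000)
x[0] = 0.4
for i in range(len(x) - 1):
    x[i + 1] = 3.9 * x[i] * (1.0 - x[i])

m, tau = 3, 1                                            # embedding dimension and delay (assumed)
last = np.arange((m - 1) * tau, len(x) - 1)              # index of the newest sample in each state
states = np.column_stack([x[last - k * tau] for k in range(m - 1, -1, -1)])
targets = x[last + 1]

train, test = slice(0, 1500), slice(1500, None)

# Nearest-neighbour (local) predictor: forecast = mean successor of the k closest past states.
def predict(state, k=5):
    d = np.linalg.norm(states[train] - state, axis=1)
    return targets[train][np.argsort(d)[:k]].mean()

pred = np.array([predict(s) for s in states[test]])
print("RMS one-step error:", np.sqrt(np.mean((pred - targets[test]) ** 2)))
```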
The art and science of flow control - case studies using flow visualization methods
NASA Astrophysics Data System (ADS)
Alvi, F. S.; Cattafesta, L. N., III
2010-04-01
Active flow control (AFC) has been the focus of significant research in the last decade. This is mainly due to the potentially substantial benefits it affords. AFC applications range from the subsonic to the supersonic (and beyond) regime for both internal and external flows. These applications are wide and varied, such as controlling flow transition and separation over various external components of the aircraft to active management of separation and flow distortion in engine components and over turbine and compressor blades. High-speed AFC applications include control of flow oscillations in cavity flows, supersonic jet screech, impinging jets, and jet-noise control. In this paper we review some of our recent applications of AFC through a number of case studies that illustrate the typical benefits as well as limitations of present AFC methods. The case studies include subsonic and supersonic canonical flowfields such as separation control over airfoils, control of supersonic cavity flows and impinging jets. In addition, properties of zero-net mass-flux (ZNMF) actuators are also discussed as they represent one of the most widely studied actuators used for AFC. In keeping with the theme of this special issue, the flowfield properties and their response to actuation are examined through the use of various qualitative and quantitative flow visualization methods, such as smoke, shadowgraph, schlieren, planar-laser scattering, and Particle image velocimetry (PIV). The results presented here clearly illustrate the merits of using flow visualization to gain significant insight into the flow and its response to AFC.
NASA Astrophysics Data System (ADS)
Torres Cedillo, Sergio G.; Bonello, Philip
2016-01-01
The high pressure (HP) rotor in an aero-engine assembly cannot be accessed under operational conditions because of the restricted space for instrumentation and high temperatures. This motivates the development of a non-invasive inverse problem approach for unbalance identification and balancing, requiring prior knowledge of the structure. Most such methods in the literature necessitate linear bearing models, making them unsuitable for aero-engine applications which use nonlinear squeeze-film damper (SFD) bearings. A previously proposed inverse method for nonlinear rotating systems was highly limited in its application (e.g. assumed circular centered SFD orbits). The methodology proposed in this paper overcomes such limitations. It uses the Receptance Harmonic Balance Method (RHBM) to generate the backward operator using measurements of the vibration at the engine casing, provided there is at least one linear connection between rotor and casing, apart from the nonlinear connections. A least-squares solution yields the equivalent unbalance distribution in prescribed planes of the rotor, which is consequently used to balance it. The method is validated on distinct rotordynamic systems using simulated casing vibration readings. The method is shown to provide effective balancing under hitherto unconsidered practical conditions. The repeatability of the method, as well as its robustness to noise, model uncertainty and balancing errors, are satisfactorily demonstrated and the limitations of the process discussed.
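The final identification step of the abstract, a least-squares solution for the equivalent unbalance in prescribed planes, can be illustrated with a minimal sketch. In the paper the backward operator comes from the RHBM model of the nonlinear rotor-casing system; here it is replaced by a random complex matrix, and the unbalance values and noise level are assumptions.

```python
import numpy as np

rng = np.random.default_rng(2)

# Hypothetical complex receptance-like operator R (casing response per unit unbalance in each
# plane) at the running speed; in the paper this operator is generated by the RHBM.
n_sensors, n_planes = 6, 2
R = rng.normal(size=(n_sensors, n_planes)) + 1j * rng.normal(size=(n_sensors, n_planes))

u_true = np.array([0.8 * np.exp(1j * 0.3), 0.5 * np.exp(-1j * 1.1)])   # unknown unbalance (assumed)
noise = 0.02 * (rng.normal(size=n_sensors) + 1j * rng.normal(size=n_sensors))
b = R @ u_true + noise                                                  # simulated casing vibration

# Least-squares estimate of the equivalent unbalance distribution in the prescribed planes.
u_hat, *_ = np.linalg.lstsq(R, b, rcond=None)
print("estimated unbalance:", np.round(u_hat, 3))
print("correction (apply opposite unbalance):", np.round(-u_hat, 3))
```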
The Vroom and Yetton Normative Leadership Model Applied to Public School Case Examples.
ERIC Educational Resources Information Center
Sample, John
This paper seeks to familiarize school administrators with the Vroom and Yetton Normative Leadership model by presenting its essential components and providing original case studies for its application to school settings. The five decision-making methods of the Vroom and Yetton model, including two "autocratic," two…
Application of the differential decay-curve method to γ-γ fast-timing lifetime measurements
NASA Astrophysics Data System (ADS)
Petkov, P.; Régis, J.-M.; Dewald, A.; Kisyov, S.
2016-10-01
A new procedure for the analysis of delayed-coincidence lifetime experiments focused on the Fast-timing case is proposed following the approach of the Differential decay-curve method. Examples of application of the procedure on experimental data reveal its reliability for lifetimes even in the sub-nanosecond range. The procedure is expected to improve both precision/reliability and treatment of systematic errors and scarce data as well as to provide an option for cross-check with the results obtained by means of other analyzing methods.
Harmony Search Method: Theory and Applications
Gao, X. Z.; Govindasamy, V.; Xu, H.; Wang, X.; Zenger, K.
2015-01-01
The Harmony Search (HS) method is an emerging metaheuristic optimization algorithm, which has been employed to cope with numerous challenging tasks during the past decade. In this paper, the essential theory and applications of the HS algorithm are first described and reviewed. Several typical variants of the original HS are next briefly explained. As an example of case study, a modified HS method inspired by the idea of Pareto-dominance-based ranking is also presented. It is further applied to handle a practical wind generator optimal design problem. PMID:25945083
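A minimal sketch of the basic HS loop (harmony memory, memory consideration, pitch adjustment, random selection) is shown below. The objective function and all parameter values are assumptions chosen for illustration; the paper's case study instead optimizes a wind generator design with a Pareto-ranking variant.

```python
import numpy as np

rng = np.random.default_rng(3)

def sphere(x):                        # toy objective (assumed)
    return float(np.sum(x ** 2))

dim, lo, hi = 5, -5.0, 5.0
HMS, HMCR, PAR, bw, iters = 20, 0.9, 0.3, 0.1, 2000

memory = rng.uniform(lo, hi, size=(HMS, dim))          # harmony memory
fitness = np.array([sphere(h) for h in memory])

for _ in range(iters):
    new = np.empty(dim)
    for j in range(dim):
        if rng.random() < HMCR:                         # memory consideration
            new[j] = memory[rng.integers(HMS), j]
            if rng.random() < PAR:                      # pitch adjustment
                new[j] += bw * rng.uniform(-1.0, 1.0)
        else:                                           # random selection
            new[j] = rng.uniform(lo, hi)
    new = np.clip(new, lo, hi)
    f_new = sphere(new)
    worst = int(np.argmax(fitness))
    if f_new < fitness[worst]:                          # replace the worst harmony
        memory[worst], fitness[worst] = new, f_new

print("best value found:", fitness.min())
```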
Vaginal Candidiasis Infection Treated Using Apple Cider Vinegar: A Case Report.
Ozen, Betul; Baser, Muruvvet
2017-11-07
A 32-y-old married woman was admitted with intense vaginal discharge with foul odor, itching, groin pain, and infertility for the past 5 y. Candida albicans was isolated from the culture of vaginal swab. The patient was diagnosed with chronic vaginal candida infection. She failed to respond to integrative medicine methods prescribed. Recovery was achieved with the application of apple cider vinegar. Alternative treatment methods can be employed in patients unresponsive to medical therapies. As being one of these methods, application of apple cider vinegar can cure vaginal candida infection.
Jones, Cheryl Bland
2005-01-01
This is the second article in a 2-part series focusing on nurse turnover and its costs. Part 1 (December 2004) described nurse turnover costs within the context of human capital theory, and using human resource accounting methods, presented the updated Nursing Turnover Cost Calculation Methodology. Part 2 presents an application of this method in an acute care setting and the estimated costs of nurse turnover that were derived. Administrators and researchers can use these methods and cost information to build a business case for nurse retention.
Grouting applications in civil engineering. Volume I and II. [800 references
DOE Office of Scientific and Technical Information (OSTI.GOV)
Einstein, H.H.; Barvenik, M.J.
1975-01-01
A comprehensive description of grouting applications in civil engineering is presented that can serve as a basis for the selection of grouting methods in the borehole sealing problem. The breadth and depth of the study was assured by conducting the main part of the review, the collection and evaluation of information, without specifically considering the borehole sealing problem (but naturally incorporating any aspect of civil engineering applications that could be of potential use). Grouting is very much an art and not a science. In most cases, it is a trial and error procedure where an inexpensive method is initially tried and then a more expensive one is used until the desired results are obtained. Once a desired effect is obtained, it is difficult to credit any one procedure with the success because the results are due to the summation of all the methods used. In many cases, the method that proves successful reflects a small abnormality in the ground or structure rather than its overall characteristics. Hence, successful grouting relies heavily on good engineering judgement and experience, and not on a basic set of standard correlations or equations. 800 references. (JRD)
Review and future prospects for DNA barcoding methods in forensic palynology.
Bell, Karen L; Burgess, Kevin S; Okamoto, Kazufusa C; Aranda, Roman; Brosi, Berry J
2016-03-01
Pollen can be a critical forensic marker in cases where determining geographic origin is important, including investigative leads, missing persons cases, and intelligence applications. However, its use has previously been limited by the need for a high level of specialization by expert palynologists, slow speeds of identification, and relatively poor taxonomic resolution (typically to the plant family or genus level). By contrast, identification of pollen through DNA barcoding has the potential to overcome all three of these limitations, and it may seem surprising that the method has not been widely implemented. Despite what might seem a straightforward application of DNA barcoding to pollen, there are technical issues that have delayed progress. However, recent developments of standard methods for DNA barcoding of pollen, along with improvements in high-throughput sequencing technology, have overcome most of these technical issues. Based on these recent methodological developments in pollen DNA barcoding, we believe that now is the time to start applying these techniques in forensic palynology. In this article, we discuss the potential for these methods, and outline directions for future research to further improve on the technology and increase its applicability to a broader range of situations. Copyright © 2015 Elsevier Ireland Ltd. All rights reserved.
Asselineau, Charles-Alexis; Zapata, Jose; Pye, John
2015-06-01
A stochastic optimisation method adapted to illumination and radiative heat transfer problems involving Monte-Carlo ray-tracing is presented. A solar receiver shape optimisation case study illustrates the advantages of the method and its potential: efficient receivers are identified using a moderate computational cost.
NASA Technical Reports Server (NTRS)
Maskew, Brian
1987-01-01
The VSAERO low order panel method formulation is described for the calculation of subsonic aerodynamic characteristics of general configurations. The method is based on piecewise constant doublet and source singularities. Two forms of the internal Dirichlet boundary condition are discussed and the source distribution is determined by the external Neumann boundary condition. A number of basic test cases are examined. Calculations are compared with higher order solutions for a number of cases. It is demonstrated that for comparable density of control points where the boundary conditions are satisfied, the low order method gives comparable accuracy to the higher order solutions. It is also shown that problems associated with some earlier low order panel methods, e.g., leakage in internal flows and junctions and also poor trailing edge solutions, do not appear for the present method. Further, the application of the Kutta conditions is extremely simple; no extra equation or trailing edge velocity point is required. The method has very low computing costs and this has made it practical for application to nonlinear problems requiring iterative solutions for wake shape and surface boundary layer effects.
NASA Technical Reports Server (NTRS)
Perkins, S. C., Jr.; Mendenhall, M. R.
1980-01-01
A correlation method to predict pressures induced on an infinite plate by a jet exhausting normal to the plate into a subsonic free stream was extended to jets exhausting at angles to the plate and to jets exhausting normal to the surface of a body of revolution. The complete method consisted of an analytical method which modeled the blockage and entrainment properties of the jet and an empirical correlation which accounted for viscous effects. For the flat plate case, the method was applicable to jet velocity ratios up to ten, jet inclination angles up to 45 deg from the normal, and radial distances up to five diameters from the jet. For the body of revolution case, the method was applicable to a body at zero degrees angle of attack, jet velocity ratios of 1.96 and 3.43, circumferential angles around the body up to 25 deg from the jet, axial distances up to seven diameters from the jet, and jet-to-body diameter ratios less than 0.1.
Modeling Multibody Stage Separation Dynamics Using Constraint Force Equation Methodology
NASA Technical Reports Server (NTRS)
Tartabini, Paul V.; Roithmayr, Carlos M.; Toniolo, Matthew D.; Karlgaard, Christopher D.; Pamadi, Bandu N.
2011-01-01
This paper discusses the application of the constraint force equation methodology and its implementation for multibody separation problems using three specially designed test cases. The first test case involves two rigid bodies connected by a fixed joint, the second case involves two rigid bodies connected with a universal joint, and the third test case is that of Mach 7 separation of the X-43A vehicle. For the first two cases, the solutions obtained using the constraint force equation method compare well with those obtained using industry-standard benchmark codes. For the X-43A case, the constraint force equation solutions show reasonable agreement with the flight-test data. Use of the constraint force equation method facilitates the analysis of stage separation in end-to-end simulations of launch vehicle trajectories.
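The underlying idea of the constraint force approach, augmenting the equations of motion with constraint forces J^T λ and solving a coupled linear system for the accelerations and multipliers at each step, can be sketched on a simple planar example. This is not the paper's implementation or its test cases; the pendulum geometry, masses and the explicit Euler integrator are assumptions for illustration.

```python
import numpy as np

m, L, g, dt = 1.0, 1.0, 9.81, 1e-3
r = np.array([L, 0.0])                     # position satisfying the constraint |r| = L
v = np.zeros(2)

M = m * np.eye(2)
F_ext = np.array([0.0, -m * g])

for _ in range(5000):
    J = r.reshape(1, 2)                    # Jacobian of c(r) = (r.r - L^2)/2
    # Block system:  M a - J^T lam = F_ext,   J a = -|v|^2  (so the constraint acceleration is zero)
    A = np.block([[M, -J.T], [J, np.zeros((1, 1))]])
    rhs = np.concatenate([F_ext, [-v @ v]])
    sol = np.linalg.solve(A, rhs)
    a, lam = sol[:2], sol[2]
    v += a * dt                            # simple explicit Euler step (assumed integrator)
    r += v * dt

print("constraint drift |r| - L:", abs(np.linalg.norm(r) - L))
print("last constraint force J^T * lambda:", lam * r)
```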
Inferring imagined speech using EEG signals: a new approach using Riemannian manifold features
NASA Astrophysics Data System (ADS)
Nguyen, Chuong H.; Karavas, George K.; Artemiadis, Panagiotis
2018-02-01
Objective. In this paper, we investigate the suitability of imagined speech for brain-computer interface (BCI) applications. Approach. A novel method based on covariance matrix descriptors, which lie in Riemannian manifold, and the relevance vector machines classifier is proposed. The method is applied on electroencephalographic (EEG) signals and tested in multiple subjects. Main results. The method is shown to outperform other approaches in the field with respect to accuracy and robustness. The algorithm is validated on various categories of speech, such as imagined pronunciation of vowels, short words and long words. The classification accuracy of our methodology is in all cases significantly above chance level, reaching a maximum of 70% for cases where we classify three words and 95% for cases of two words. Significance. The results reveal certain aspects that may affect the success of speech imagery classification from EEG signals, such as sound, meaning and word complexity. This can potentially extend the capability of utilizing speech imagery in future BCI applications. The dataset of speech imagery collected from total 15 subjects is also published.
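A minimal sketch of the feature pipeline described above (covariance descriptors mapped to a tangent space, then classified) is given below. For simplicity it uses a log-Euclidean tangent-space map and a logistic-regression classifier in place of the Riemannian mean and relevance vector machine of the paper, and the EEG-like trials are synthetic.

```python
import numpy as np
from scipy.linalg import logm
from sklearn.linear_model import LogisticRegression
from sklearn.model_selection import cross_val_score

rng = np.random.default_rng(4)
n_trials, n_channels, n_samples = 80, 8, 256

# Synthetic two-class EEG-like trials (assumption): class 1 has extra power on channel 0.
X = rng.normal(size=(n_trials, n_channels, n_samples))
y = np.repeat([0, 1], n_trials // 2)
X[y == 1, 0, :] *= 1.5

def tangent_features(trial):
    cov = trial @ trial.T / trial.shape[1]            # covariance matrix descriptor
    cov += 1e-6 * np.eye(n_channels)                  # regularization
    log_cov = np.real(logm(cov))                      # log-Euclidean tangent-space map
    iu = np.triu_indices(n_channels)
    return log_cov[iu]                                # vectorized upper triangle

feats = np.array([tangent_features(t) for t in X])
clf = LogisticRegression(max_iter=1000)
print("cross-validated accuracy:", cross_val_score(clf, feats, y, cv=5).mean())
```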
Sadleir, R J; Zhang, S U; Tucker, A S; Oh, Sungho
2008-08-01
Electrical impedance tomography (EIT) is particularly well-suited to applications where its portability, rapid acquisition speed and sensitivity give it a practical advantage over other monitoring or imaging systems. An EIT system's patient interface can potentially be adapted to match the target environment, and thereby increase its utility. It may thus be appropriate to use different electrode positions from those conventionally used in EIT in these cases. One application that may require this is the use of EIT on emergency medicine patients; in particular those who have suffered blunt abdominal trauma. In patients who have suffered major trauma, it is desirable to minimize the risk of spinal cord injury by avoiding lifting them. To adapt EIT to this requirement, we devised and evaluated a new electrode topology (the 'hemiarray') which comprises a set of eight electrodes placed only on the subject's anterior surface. Images were obtained using a two-dimensional sensitivity matrix and weighted singular value decomposition reconstruction. The hemiarray method's ability to quantify bleeding was evaluated by comparing its performance with conventional 2D reconstruction methods using data gathered from a saline phantom. We found that without applying corrections to reconstructed images it was possible to estimate blood volume in a two-dimensional hemiarray case with an uncertainty of around 27 ml. In an approximately 3D hemiarray case, volume prediction was possible with a maximum uncertainty of around 38 ml in the centre of the electrode plane. After application of a QI normalizing filter, average uncertainties in a two-dimensional hemiarray case were reduced to about 15 ml. Uncertainties in the approximate 3D case were reduced to about 30 ml.
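The reconstruction step mentioned in the abstract (a sensitivity matrix inverted with a weighted/truncated singular value decomposition) can be illustrated with a toy sketch. The matrix sizes, noise level and truncation rank below are assumptions, and a random matrix stands in for the actual hemiarray sensitivity matrix.

```python
import numpy as np

rng = np.random.default_rng(5)

n_meas, n_pixels = 40, 256                 # few measurements from an 8-electrode array (assumed sizes)
J = rng.normal(size=(n_meas, n_pixels))    # stand-in for the 2D sensitivity (Jacobian) matrix

x_true = np.zeros(n_pixels)
x_true[100:110] = 1.0                      # conductivity change mimicking a localized bleed
dv = J @ x_true + 0.01 * rng.normal(size=n_meas)   # simulated boundary-voltage changes

# Truncated SVD reconstruction: keep only the k best-determined singular components.
U, s, Vt = np.linalg.svd(J, full_matrices=False)
k = 20
x_rec = Vt[:k].T @ ((U[:, :k].T @ dv) / s[:k])

print("correlation with true image:", np.corrcoef(x_rec, x_true)[0, 1])
```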
ERIC Educational Resources Information Center
Pustejovsky, James E.
2013-01-01
Single-case designs (SCDs) are a class of research methods for evaluating intervention effects by taking repeated measurements of an outcome over time on a single case, both before and after the deliberate introduction of a treatment. SCDs are used heavily in fields such as special education, school psychology, social work, and applied behavior…
The clinical application of teaching people about pain.
Louw, Adriaan; Zimney, Kory; O'Hotto, Christine; Hilton, Sandra
2016-07-01
Teaching people about the neurobiology and neurophysiology of their pain experience has a therapeutic effect and has been referred to as pain neuroscience education (PNE). Various high-quality randomized controlled trials and systematic reviews have shown increasing efficacy of PNE decreasing pain, disability, pain catastrophization, movement restrictions, and healthcare utilization. Research studies, however, by virtue of their design, are very controlled environments and, therefore, in contrast to the ever-increasing evidence for PNE, little is known about the clinical application of this emerging therapy. In contrast, case studies, case series, and expert opinion and perspectives by authorities in the world of pain science provide clinicians with a glimpse into potential "real" clinical application of PNE in the face of the ever-increasing chronic pain epidemic. By taking the material from the randomized controlled trials, systematic reviews, case series, case studies, and expert opinion, this article aims to provide a proposed layout of the clinical application of PNE. The article systematically discusses key elements of PNE including examination, educational content, and delivery methods, merging of PNE with movement, goal setting, and progression. This perspectives article concludes with a call for research into the clinical application of PNE.
Application of theoretical methods to increase succinate production in engineered strains.
Valderrama-Gomez, M A; Kreitmayer, D; Wolf, S; Marin-Sanguino, A; Kremling, A
2017-04-01
Computational methods have enabled the discovery of non-intuitive strategies to enhance the production of a variety of target molecules. In the case of succinate production, reviews covering the topic have not yet analyzed the impact and future potential that such methods may have. In this work, we review the application of computational methods to the production of succinic acid. We found that while a total of 26 theoretical studies were published between 2002 and 2016, only 10 studies reported the successful experimental implementation of any kind of theoretical knowledge. None of the experimental studies reported an exact application of the computational predictions. However, the combination of computational analysis with complementary strategies, such as directed evolution and comparative genome analysis, serves as a proof of concept and demonstrates that successful metabolic engineering can be guided by rational computational methods.
fMRI for mapping language networks in neurosurgical cases
Gupta, Santosh S
2014-01-01
Evaluating language has been a long-standing application of functional magnetic resonance imaging (fMRI), in both research and clinical settings, and it still presents challenges. Localization of eloquent areas is important in neurosurgical cases, so that damage to these areas during surgery is minimized, their function is maintained postoperatively, and the patient retains a good quality of life. Preoperative fMRI is a non-invasive tool for localizing eloquent areas, including language areas, whereas the traditional methods generally used are invasive and at times perilous. In this article, we describe methods and various paradigms to study the language areas in clinical neurosurgical cases, along with illustrations of cases from our institute. PMID:24851003
Methods commonly used to delineate protection zones for water-supply wells are often not directly applicable for springs. This investigation focuses on characterization of the hydrogeologic setting using hydrogeologic mapping methods to identify geologic and hydrologic features ...
An External Wire Frame Fixation Method of Skin Grafting for Burn Reconstruction.
Yoshino, Yukiko; Ueda, Hyakuzoh; Ono, Simpei; Ogawa, Rei
2017-06-28
The skin graft is a prevalent reconstructive method for burn injuries. We have been applying external wire frame fixation methods in combination with skin grafts since 1986 and have observed higher rates of successful graft take. The overall purpose of this method is to further secure skin graft adherence to wound beds in hard-to-stabilize areas. There are also location-specific benefits to this technique, such as eliminating the need for tarsorrhaphy in the periorbital area, allowing immediate food intake after surgery in the perioral area, and permitting less invasive fixation in the digits. The purpose of this study was to clarify its benefits and applicable locations. We reviewed 22 postburn patients with skin graft reconstructions using the external wire frame method at our institution from December 2012 through September 2016. Details of the surgical technique and individual reports are also discussed. Of the 22 cases, 15 (68%) were split-thickness skin grafts and 7 (32%) were full-thickness skin grafts. Five cases (23%) involved periorbital reconstruction, 5 (23%) involved perioral reconstruction, 2 (9%) involved lower limb reconstruction, and 10 (45%) involved digital reconstruction. Complete (100%) survival of the skin graft was attained in all cases. No signs of complication were observed. Drawing on 30 years of combined experience, we summarize recommendations for successful graft survival, with an emphasis on the locations where the method is applicable.
NASA Astrophysics Data System (ADS)
Lopes de Oliveira, Paulo Sérgio; Garratt, Richard Charles
1998-11-01
We describe the application of a method for the reconstruction of three-dimensional atomic co-ordinates from a stereo ribbon diagram of a protein when additional information for some of the sidechain positions is available. The method has applications in cases where the 3D co-ordinates have not been made available by any means other than the original publication and are of interest as models for molecular replacement, homology modelling etc. The approach is, on the one hand, more general than other methods which are based on stereo figures which present specific atomic positions, but on the other hand relies on input from a specialist. Its exact implementation will depend on the figure of interest. We have applied the method to the case of the α-d-galactose-binding lectin jacalin with a resultant RMS deviation, compared to the crystal structure, of 1.5 Å for the 133 Cα positions of the α-chain and 2.6 Å for the less regular β-chain. The success of the method depends on the secondary structure of the protein under consideration and the orientation of the stereo diagram itself but can be expected to reproduce the mainchain co-ordinates more accurately than the sidechains. Some ways in which the method may be generalised to other cases are discussed.
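The geometric core of such a reconstruction, recovering depth from the horizontal disparity between the two halves of a stereo pair, can be sketched as follows. The ±3° rotation about the vertical axis and the sign convention are assumptions typical of published stereo diagrams, not necessarily the conventions of the paper, and the coordinates are synthetic.

```python
import numpy as np

# Assumed convention: the stereo pair is generated by rotating the molecule by +/- 3 degrees
# about the vertical (y) axis, so horizontal disparity encodes depth.
half_angle = np.deg2rad(3.0)

def reconstruct(x_left, x_right, y):
    """Recover approximate 3D coordinates from digitized stereo-pair positions (same units)."""
    z = (x_right - x_left) / (2.0 * np.sin(half_angle))
    x = (x_left + x_right) / (2.0 * np.cos(half_angle))
    return np.column_stack([x, y, z])

# Round-trip check on synthetic C-alpha positions.
rng = np.random.default_rng(6)
xyz = rng.normal(scale=10.0, size=(50, 3))
xl = xyz[:, 0] * np.cos(half_angle) - xyz[:, 2] * np.sin(half_angle)
xr = xyz[:, 0] * np.cos(half_angle) + xyz[:, 2] * np.sin(half_angle)
print("max round-trip error:", np.abs(reconstruct(xl, xr, xyz[:, 1]) - xyz).max())
```

In practice the digitized sidechain positions mentioned in the abstract would supply additional restraints beyond this purely geometric step.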
Neelon, Brian; O'Malley, A James; Smith, Valerie A
2016-11-30
This article is the second installment of a two-part tutorial on the analysis of zero-modified count and semicontinuous data. Part 1, which appears as a companion piece in this issue of Statistics in Medicine, provides a general background and overview of the topic, with particular emphasis on applications to health services research. Here, we present three case studies highlighting various approaches for the analysis of zero-modified data. The first case study describes methods for analyzing zero-inflated longitudinal count data. Case study 2 considers the use of hurdle models for the analysis of spatiotemporal count data. The third case study discusses an application of marginalized two-part models to the analysis of semicontinuous health expenditure data. Copyright © 2016 John Wiley & Sons, Ltd.
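As a minimal illustration of the model class used in the first case study, the sketch below fits a zero-inflated Poisson model to synthetic cross-sectional data (the case study itself is longitudinal and uses more elaborate methods). The coefficients, sample size, and the use of statsmodels' ZeroInflatedPoisson are assumptions for illustration.

```python
import numpy as np
import statsmodels.api as sm
from statsmodels.discrete.count_model import ZeroInflatedPoisson

rng = np.random.default_rng(7)
n = 2000
x = rng.normal(size=n)

# Zero-inflated Poisson data: structural zeros with probability p, otherwise Poisson counts.
p_zero = 1.0 / (1.0 + np.exp(-(-1.0 + 0.5 * x)))          # inflation model (assumed coefficients)
mu = np.exp(0.2 + 0.7 * x)                                 # count model (assumed coefficients)
y = np.where(rng.random(n) < p_zero, 0, rng.poisson(mu))

X = sm.add_constant(x)
fit = ZeroInflatedPoisson(y, X, exog_infl=X, inflation='logit').fit(disp=False)
print(fit.params)    # estimated inflation and count coefficients
```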
Efficacy of Carotid Artery Stenting by the Universal Protection Method.
Goto, Shunsaku; Ohshima, Tomotaka; Kato, Kyozo; Izumi, Takashi; Wakabayashi, Toshihiko
2018-04-18
To avoid distal plaques embolization during carotid artery stenting, we developed Universal Protection Method that combined the use of a proximal common carotid artery balloon, an external carotid artery balloon, and a distal internal carotid artery filter, with continuous flow reversal to the femoral vein. Herein, we assessed the efficacy of the Universal Protection Method by comparing stenting outcomes before and after its introduction. We assessed outcomes for 115 cases before and 41 cases after the Universal Protection Method was adopted (non-Universal Protection Method and Universal Protection Method groups, respectively). We then compared procedure details, magnetic resonance imaging (within 48 hours after the procedure), intraprocedural complications, and postoperative stroke rates. Ischemic stroke was not observed in the Universal Protection Method group, but 1 major stroke and 2 minor strokes were observed in the non-Universal Protection Method group. High-intensity areas were seen in 6 (15.0%) and 49 (42.6%) cases in the Universal Protection Method and non-Universal Protection Method groups, respectively (P = .001). Contrastingly, intraprocedural complications were observed in 9 (22.5%) and 21 (18.3%) cases in the Universal Protection Method and non-Universal Protection Method groups, respectively. Among these intraprocedural complication cases, high-intensity areas were observed in 1 case (11.1%) in the Universal Protection Method group and in 15 cases (71.4%) in the non-Universal Protection Method group. Universal Protection Method is a safe technique that is applicable to all patients undergoing carotid artery stenting, irrespective of individual risk factors. Notably, the incidence rates of both distal embolization and unexpected intraprocedural complications are low. Copyright © 2018 National Stroke Association. Published by Elsevier Inc. All rights reserved.
Environmental waters are important reservoirs of pathogenic microorganisms, many of which are of fecal origin. In most cases, the presence of pathogens is determined using surrogate bacterial indicators. In other cases, direct detection of the pathogen in question is required. M...
Consumer Learning for University Students: A Case for a Curriculum
ERIC Educational Resources Information Center
Crafford, Sharon; Bitzer, Eli
2009-01-01
This article indicates how the application of a simplified version of the analytical abstraction method (AAM) was used in curriculum development for consumer learning at one higher education institution in South Africa. We used a case study design and qualitative research methodology to generate data through semi-structured interviews with eight…
Knowledge Management Model: Practical Application for Competency Development
ERIC Educational Resources Information Center
Lustri, Denise; Miura, Irene; Takahashi, Sergio
2007-01-01
Purpose: This paper seeks to present a knowledge management (KM) conceptual model for competency development and a case study in a law service firm, which implemented the KM model in a competencies development program. Design/methodology/approach: The case study method was applied according to Yin (2003) concepts, focusing a six-professional group…
ERIC Educational Resources Information Center
Tingerthal, John Steven
2013-01-01
Using case study methodology and autoethnographic methods, this study examines a process of curricular development known as "Decoding the Disciplines" (Decoding) by documenting the experience of its application in a construction engineering mechanics course. Motivated by the call to integrate what is known about teaching and learning…
Feasibility study on the least square method for fitting non-Gaussian noise data
NASA Astrophysics Data System (ADS)
Xu, Wei; Chen, Wen; Liang, Yingjie
2018-02-01
This study investigates the feasibility of the least squares method for fitting non-Gaussian noise data. We add different levels of two typical non-Gaussian noises, Lévy and stretched Gaussian noise, to the exact values of selected functions, including linear, polynomial and exponential equations, and the maximum absolute and mean square errors are calculated for the different cases. Lévy and stretched Gaussian distributions have many applications in fractional and fractal calculus. It is observed that the non-Gaussian noises are less accurately fitted than the Gaussian noise, but the stretched Gaussian cases appear to perform better than the Lévy noise cases. It is stressed that the least-squares method is inapplicable to the non-Gaussian noise cases when the noise level is larger than 5%.
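A numerical experiment in the spirit of the abstract can be sketched as below: contaminate an exact linear function with Gaussian, Lévy-stable and heavy-tailed noise, fit by ordinary least squares, and report the two error measures. The specific distribution parameters are assumptions, and scipy's generalized normal (gennorm) is used here as a stand-in for the stretched Gaussian noise of the paper.

```python
import numpy as np
from scipy.stats import levy_stable, gennorm

rng = np.random.default_rng(8)
x = np.linspace(0.0, 1.0, 200)
y_exact = 2.0 * x + 1.0                                   # linear test function (assumed)

def fit_errors(noise):
    y = y_exact + noise
    a, b = np.polyfit(x, y, 1)                            # ordinary least squares fit
    y_fit = a * x + b
    return np.max(np.abs(y_fit - y_exact)), np.mean((y_fit - y_exact) ** 2)

level = 0.05                                              # 5% noise level
cases = {
    "Gaussian": level * rng.normal(size=x.size),
    # Levy-stable noise; alpha=1.5, beta=0 chosen for illustration.
    "Levy": level * levy_stable.rvs(1.5, 0.0, size=x.size, random_state=rng),
    # Generalized normal used as a stand-in for stretched-Gaussian noise.
    "stretched Gaussian": level * gennorm.rvs(1.2, size=x.size, random_state=rng),
}
for name, noise in cases.items():
    mx, mse = fit_errors(noise)
    print(f"{name:>18}: max abs error = {mx:.4f}, MSE = {mse:.6f}")
```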
2011-01-01
Background In view of the long-term discussion on the appropriateness of the dengue classification into dengue fever (DF), dengue haemorrhagic fever (DHF) and dengue shock syndrome (DSS), the World Health Organization (WHO) has outlined in its new global dengue guidelines a revised classification into levels of severity: dengue fever, with an intermediary group of "dengue fever with warning signs", and severe dengue. The objective of this paper was to compare the two classification systems regarding applicability in clinical practice and surveillance, as well as user-friendliness and acceptance by health staff. Methods A mix of quantitative (prospective and retrospective review of medical charts by expert reviewers, formal staff interviews), semi-quantitative (open questions in staff interviews) and qualitative methods (focus group discussions) was used in 18 countries. Quality control of data collected was undertaken by external monitors. Results The applicability of the DF/DHF/DSS classification was limited, even when strict DHF criteria were not applied (13.7% of dengue cases could not be classified using the DF/DHF/DSS classification by experienced reviewers, compared to only 1.6% with the revised classification). The fact that some severe dengue cases could not be classified in the DF/DHF/DSS system was of particular concern. Both acceptance and perceived user-friendliness of the revised system were high, particularly in relation to triage and case management. The applicability of the revised classification to retrospective data sets (of importance for dengue surveillance) was also favourable. However, the need for training, dissemination and further research on the warning signs was highlighted. Conclusions The revised dengue classification has a high potential for facilitating dengue case management and surveillance. PMID:21510901
Decision support systems and the healthcare strategic planning process: a case study.
Lundquist, D L; Norris, R M
1991-01-01
The repertoire of applications that comprises health-care decision support systems (DSS) includes analyses of clinical, financial, and operational activities. As a whole, these applications facilitate developing comprehensive and interrelated business and medical models that support the complex decisions required to successfully manage today's health-care organizations. Kennestone Regional Health Care System's use of DSS to facilitate strategic planning has precipitated marked changes in the organization's method of determining capital allocations. This case study discusses Kennestone's use of DSS in the strategic planning process, including profiles of key DSS modeling components.
Topography measurements and applications in ballistics and tool mark identifications*
Vorburger, T V; Song, J; Petraco, N
2016-01-01
The application of surface topography measurement methods to the field of firearm and toolmark analysis is fairly new. The field has been boosted by the development of a number of competing optical methods, which has improved the speed and accuracy of surface topography acquisitions. We describe here some of these measurement methods as well as several analytical methods for assessing similarities and differences among pairs of surfaces. We also provide a few examples of research results to identify cartridge cases originating from the same firearm or tool marks produced by the same tool. Physical standards and issues of traceability are also discussed. PMID:27182440
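One of the similarity measures commonly used for comparing measured topographies, the maximum of the normalized cross-correlation function, can be sketched as follows. The profiles below are synthetic random walks standing in for real surface data, and the shift and noise levels are assumptions; the paper's analyses operate on full areal topographies rather than single profiles.

```python
import numpy as np

rng = np.random.default_rng(9)

# Synthetic surface-topography profiles (assumption): same tool mark, shifted and noisy.
base = np.cumsum(rng.normal(size=500))
profile_a = base + 0.2 * rng.normal(size=500)
profile_b = np.roll(base, 7) + 0.2 * rng.normal(size=500)

def ccf_max(a, b):
    """Maximum normalized cross-correlation over all lags."""
    a = (a - a.mean()) / a.std()
    b = (b - b.mean()) / b.std()
    cc = np.correlate(a, b, mode='full') / len(a)
    return cc.max()

print("CCF_max (same source):     ", round(ccf_max(profile_a, profile_b), 3))
print("CCF_max (different source):", round(ccf_max(profile_a, np.cumsum(rng.normal(size=500))), 3))
```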
The Baldwin-Lomax model for separated and wake flows using the entropy envelope concept
NASA Technical Reports Server (NTRS)
Brock, J. S.; Ng, W. F.
1992-01-01
Implementation of the Baldwin-Lomax algebraic turbulence model is difficult and ambiguous within flows characterized by strong viscous-inviscid interactions and flow separations. A new method of implementation is proposed which uses an entropy envelope concept and is demonstrated to ensure the proper evaluation of modeling parameters. The method is simple, computationally fast, and applicable to both wake and boundary layer flows. The method is general, making it applicable to any turbulence model which requires the automated determination of the proper maxima of a vorticity-based function. The new method is evaluated within two test cases involving strong viscous-inviscid interaction.
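The vorticity-based function whose maximum must be located is the standard Baldwin-Lomax F(y) = y·|ω|·[1 − exp(−y⁺/A⁺)]. The sketch below simply searches for that maximum on an assumed analytic boundary-layer-like vorticity profile; it does not implement the entropy-envelope restriction of the search domain that the paper proposes.

```python
import numpy as np

# Assumed flow parameters and an analytic stand-in for a computed vorticity profile.
nu, u_tau, a_plus = 1.5e-5, 0.5, 26.0
y = np.linspace(1e-6, 0.05, 400)                    # wall distance [m]
omega = (u_tau / (0.41 * y)) * np.exp(-y / 0.01)    # |vorticity|, decaying toward the edge (assumed)

y_plus = y * u_tau / nu
F = y * np.abs(omega) * (1.0 - np.exp(-y_plus / a_plus))   # Baldwin-Lomax F(y)

i = int(np.argmax(F))                               # the maximum whose proper selection the paper addresses
y_max, F_max = y[i], F[i]
print(f"y_max = {y_max:.4f} m, F_max = {F_max:.3f}")
```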
Efficient Analysis of Complex Structures
NASA Technical Reports Server (NTRS)
Kapania, Rakesh K.
2000-01-01
The various accomplishments achieved during this project are: (1) a survey of Neural Network (NN) applications in structural engineering, especially equivalent continuum models, using the MATLAB NN Toolbox (Appendix A); (2) application of NN and GAs to simulate and synthesize substructures: 1-D and 2-D beam problems (Appendix B); (3) development of an equivalent plate-model analysis method (EPA) for static and vibration analysis of general trapezoidal built-up wing structures composed of skins, spars and ribs, with calculation of a wide range of test cases and comparison with measurements or FEA results (Appendix C); (4) basic work on using second-order sensitivities to simulate wing modal response, discussion of sensitivity evaluation approaches, and some results (Appendix D); (5) establishment of a general methodology for simulating the modal responses by direct application of NN and by sensitivity techniques in a design space composed of a number of design points, with comparison made through examples using these two methods (Appendix E); (6) establishment of a general methodology for efficient analysis of complex wing structures by indirect application of NN, the NN-aided Equivalent Plate Analysis, with training of the Neural Networks for this purpose in several design spaces, applicable to the actual design of complex wings (Appendix F).
Application of the coherent anomaly method to percolation
NASA Astrophysics Data System (ADS)
Takayasu, Misako; Takayasu, Hideki
1988-03-01
Applying the coherent anomaly method (CAM) to site percolation problems, we estimate the percolation threshold pc and critical exponents. We obtain pc=0.589, β=0.140, γ=2.426 on the two-dimensional square lattice. These values are in good agreement with the values already known. We also investigate higher-dimensional cases by this method.
Application of the Coherent Anomaly Method to Percolation
NASA Astrophysics Data System (ADS)
Takayasu, Misako; Takayasu, Hideki
Applying the coherent anomaly method (CAM) to site percolation problems, we estimate the percolation threshold pc and critical exponents. We obtain pc = 0.589, β = 0.140, γ = 2.426 on the two-dimensional square lattice. These values are in good agreement with the values already known. We also investigate higher-dimensional cases by this method.
ERIC Educational Resources Information Center
Daghan, Gökhan; Akkoyunlu, Buket
2014-01-01
In this study, Information Technologies teachers' views on and use of performance-based assessment methods (PBAMs) are examined. The aim is to find out which of the PBAMs are used frequently or not used at all, the reasons for preferring these methods, and opinions about their applicability. The study is designed with the phenomenological design, which…
Reflections from the Application of Different Type of Activities: Special Training Methods Course
ERIC Educational Resources Information Center
Karadeniz, Mihriban Hacisalihoglu
2017-01-01
The aim of this study is to reveal the benefits gained from "Special Training Methods II" course and the problems prospective mathematics teachers encountered with it. The case study method was used in the study. The participants in the study were 34 prospective mathematics teachers studying at a Primary School Mathematics Education…
Applications of Taylor-Galerkin finite element method to compressible internal flow problems
NASA Technical Reports Server (NTRS)
Sohn, Jeong L.; Kim, Yongmo; Chung, T. J.
1989-01-01
A two-step Taylor-Galerkin finite element method with Lapidus' artificial viscosity scheme is applied to several test cases for internal compressible inviscid flow problems. Investigations for the effect of supersonic/subsonic inlet and outlet boundary conditions on computational results are particularly emphasized.
Average Likelihood Methods of Classification of Code Division Multiple Access (CDMA)
2016-05-01
case of cognitive radio applications. Modulation classification is part of a broader problem known as blind or uncooperative demodulation the goal of...Introduction 2 2.1 Modulation Classification . . . . . . . . . . . . . . . . . . . . . . . . . . . . 2 2.2 Research Objectives...6 3 Modulation Classification Methods 7 3.0.1 Ad Hoc
A methodological review of qualitative case study methodology in midwifery research.
Atchan, Marjorie; Davis, Deborah; Foureur, Maralyn
2016-10-01
To explore the use and application of case study research in midwifery. Case study research provides rich data for the analysis of complex issues and interventions in the healthcare disciplines; however, a gap in the midwifery research literature was identified. A methodological review of midwifery case study research using recognized templates, frameworks and reporting guidelines facilitated comprehensive analysis. An electronic database search using the date range January 2005-December 2014: Maternal and Infant Care, CINAHL Plus, Academic Search Complete, Web of Knowledge, SCOPUS, Medline, Health Collection (Informit), Cochrane Library Health Source: Nursing/Academic Edition, Wiley online and ProQuest Central. Narrative evaluation was undertaken. Clearly worded questions reflected the problem and purpose. The application, strengths and limitations of case study methods were identified through a quality appraisal process. The review identified both case study research's applicability to midwifery and its low uptake, especially in clinical studies. Many papers included the necessary criteria to achieve rigour. The included measures of authenticity and methodology were varied. A high standard of authenticity was observed, suggesting authors considered these elements to be routine inclusions. Technical aspects were lacking in many papers, namely a lack of reflexivity and incomplete transparency of processes. This review raises the profile of case study research in midwifery. Midwives will be encouraged to explore if case study research is suitable for their investigation. The raised profile will demonstrate further applicability; encourage support and wider adoption in the midwifery setting. © 2016 John Wiley & Sons Ltd.
Guan, Yong-ping; Zhao, Wen; Li, Shen-gang; Zhang, Guo-bin
2014-01-01
The design and construction of shallow-buried tunnels in densely populated urban areas involve many challenges. The ground movements induced by tunneling effects pose potential risks to infrastructure such as surface buildings, pipelines, and roads. In this paper, a case study of the Zhongjie subway station located in Shenyang, China, is examined to investigate the key construction techniques and the influence of the Pile-Beam-Arch (PBA) excavation method on the surrounding environment. This case study discusses the primary risk factors affecting the environmental safety and summarizes the corresponding risk mitigation measures and key techniques for subway station construction using the PBA excavation method in a densely populated urban area.
NASA Technical Reports Server (NTRS)
Chiu, Y. T.; Hilton, H. H.
1977-01-01
Exact closed-form solutions to the solar force-free magnetic-field boundary-value problem are obtained for constant alpha in Cartesian geometry by a Green's function approach. The uniqueness of the physical problem is discussed. Application of the exact results to practical solar magnetic-field calculations is free of series truncation errors and is at least as economical as the approximate methods currently in use. Results of some test cases are presented.
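For orientation, the constant-alpha (linear) force-free extrapolation can also be written in Fourier space, where each horizontal mode of Bz decays upward as exp(−√(k² − α²) z) for k² > α². The sketch below illustrates that idea on a toy "magnetogram"; it is not the paper's Green's-function solution, only the vertical component is extrapolated, the non-decaying modes with k² ≤ α² are simply zeroed, and the box size and alpha value are assumptions.

```python
import numpy as np

nx = 64
L = 1.0                                     # box size (arbitrary units, assumed)
alpha = 2.0 * np.pi / L * 0.5               # constant force-free parameter (assumed)

x = np.linspace(0, L, nx, endpoint=False)
X, Y = np.meshgrid(x, x, indexing='ij')
bz0 = np.exp(-((X - 0.5) ** 2 + (Y - 0.5) ** 2) / 0.01) - \
      np.exp(-((X - 0.3) ** 2 + (Y - 0.7) ** 2) / 0.01)     # toy bipolar boundary field

kx = 2.0 * np.pi * np.fft.fftfreq(nx, d=L / nx)
KX, KY = np.meshgrid(kx, kx, indexing='ij')
k2 = KX ** 2 + KY ** 2

def bz_at_height(z):
    # Decaying Fourier modes of the Helmholtz equation satisfied by Bz for constant alpha.
    decay = np.where(k2 > alpha ** 2,
                     np.exp(-np.sqrt(np.maximum(k2 - alpha ** 2, 0.0)) * z), 0.0)
    return np.real(np.fft.ifft2(np.fft.fft2(bz0) * decay))

print("peak |Bz| at z = 0.1:", np.abs(bz_at_height(0.1)).max())
```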
UXDs-Driven Transferring Method from TRIZ Solution to Domain Solution
NASA Astrophysics Data System (ADS)
Ma, Lihui; Cao, Guozhong; Chang, Yunxia; Wei, Zihui; Ma, Kai
The translation process from TRIZ solutions to domain solutions is an analogy-based process. TRIZ solutions, such as 40 inventive principles and the related cases, are medium-solutions for domain problems. Unexpected discoveries (UXDs) are the key factors to trigger designers to generate new ideas for domain solutions. The Algorithm of UXD resolving based on Means-Ends Analysis(MEA) is studied and an UXDs-driven transferring method from TRIZ solution to domain solution is formed. A case study shows the application of the process.
NASA Astrophysics Data System (ADS)
Potham, Sathya Prasad
Droplet collision and impingement on a substrate are widely observed phenomena in many applications like spray injection in internal combustion engines, spray cooling, spray painting and atomizers used in propulsion applications. Existing Lagrangian models do not provide a comprehensive picture of the outcome of these events and may involve model constants requiring experimental data for validation. Physics-based models like the Volume of Fluid (VOF) method involve no parametric tuning and are more accurate. The aim of this thesis is to extend the basic VOF method with an evaporation sub-model and implement it in the open-source Computational Fluid Dynamics (CFD) software OpenFOAM. The new model is applied to numerically study the evaporation of spherical n-heptane droplets impinging on a hot wall at atmospheric pressure and a temperature above the Leidenfrost temperature. An additional vapor phase is introduced apart from the liquid and gas phases to understand the mixing and diffusion of the vapor and gas phases. The evaporation model is validated quantitatively and qualitatively against fundamental problems having analytical solutions and against published results. The effect of droplet number and arrangement on evaporation is studied in three cases with one (Case 1), two (Case 2) and four (Case 3) droplets impinging on a hot wall in the film boiling regime at a fixed wall temperature and a constant non-dimensional distance between droplets. Droplet lift and spread, surface temperature, heat transfer, and evaporation rate are examined. It was observed that more liquid mass evaporated in Case 1 compared to the other cases. Droplet levitation begins early in Case 1, and the very high levitation observed was partially due to contraction of the droplet's shape from an elongated to a more circular form. Average surface temperature was also considerably reduced in Case 1 due to high droplet levitation.
NASA Astrophysics Data System (ADS)
Kurzweil, Yair; Head-Gordon, Martin
2009-07-01
We develop a method that can constrain any local exchange-correlation potential to preserve basic exact conditions. Using the method of Lagrange multipliers, we calculate for each set of given Kohn-Sham orbitals a constraint-preserving potential which is closest to the given exchange-correlation potential. The method is applicable to both the time-dependent (TD) and independent cases. The exact conditions that are enforced for the time-independent case are Galilean covariance, zero net force and torque, and Levy-Perdew virial theorem. For the time-dependent case we enforce translational covariance, zero net force, Levy-Perdew virial theorem, and energy balance. We test our method on the exchange (only) Krieger-Li-Iafrate (xKLI) approximate-optimized effective potential for both cases. For the time-independent case, we calculated the ground state properties of some hydrogen chains and small sodium clusters for some constrained xKLI potentials and Hartree-Fock (HF) exchange. The results (total energy, Kohn-Sham eigenvalues, polarizability, and hyperpolarizability) indicate that enforcing the exact conditions is not important for these cases. On the other hand, in the time-dependent case, constraining both energy balance and zero net force yields improved results relative to TDHF calculations. We explored the electron dynamics in small sodium clusters driven by cw laser pulses. For each laser pulse we compared calculations from TD constrained xKLI, TD partially constrained xKLI, and TDHF. We found that electron dynamics such as electron ionization and moment of inertia dynamics for the constrained xKLI are most similar to the TDHF results. Also, energy conservation is better by at least one order of magnitude with respect to the unconstrained xKLI. We also discuss the problems that arise in satisfying constraints in the TD case with a non-cw driving force.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Kurzweil, Yair; Head-Gordon, Martin
2009-07-15
We develop a method that can constrain any local exchange-correlation potential to preserve basic exact conditions. Using the method of Lagrange multipliers, we calculate for each set of given Kohn-Sham orbitals a constraint-preserving potential which is closest to the given exchange-correlation potential. The method is applicable to both the time-dependent (TD) and independent cases. The exact conditions that are enforced for the time-independent case are Galilean covariance, zero net force and torque, and Levy-Perdew virial theorem. For the time-dependent case we enforce translational covariance, zero net force, Levy-Perdew virial theorem, and energy balance. We test our method on the exchange (only) Krieger-Li-Iafrate (xKLI) approximate-optimized effective potential for both cases. For the time-independent case, we calculated the ground state properties of some hydrogen chains and small sodium clusters for some constrained xKLI potentials and Hartree-Fock (HF) exchange. The results (total energy, Kohn-Sham eigenvalues, polarizability, and hyperpolarizability) indicate that enforcing the exact conditions is not important for these cases. On the other hand, in the time-dependent case, constraining both energy balance and zero net force yields improved results relative to TDHF calculations. We explored the electron dynamics in small sodium clusters driven by cw laser pulses. For each laser pulse we compared calculations from TD constrained xKLI, TD partially constrained xKLI, and TDHF. We found that electron dynamics such as electron ionization and moment of inertia dynamics for the constrained xKLI are most similar to the TDHF results. Also, energy conservation is better by at least one order of magnitude with respect to the unconstrained xKLI. We also discuss the problems that arise in satisfying constraints in the TD case with a non-cw driving force.
NASA Astrophysics Data System (ADS)
Brinkkemper, S.; Rossi, M.
1994-12-01
As customizable computer aided software engineering (CASE) tools, or CASE shells, have been introduced in academia and industry, there has been a growing interest into the systematic construction of methods and their support environments, i.e. method engineering. To aid the method developers and method selectors in their tasks, we propose two sets of metrics, which measure the complexity of diagrammatic specification techniques on the one hand, and of complete systems development methods on the other hand. Proposed metrics provide a relatively fast and simple way to analyze the technique (or method) properties, and when accompanied with other selection criteria, can be used for estimating the cost of learning the technique and the relative complexity of a technique compared to others. To demonstrate the applicability of the proposed metrics, we have applied them to 34 techniques and 15 methods.
1985-07-08
comparison to a library of known spectra. A preliminary study (Warner et al., 1984) examined the application of this method to pattern recognition. In the case shown, spectra from two blue-green algae are presented: Figure 3A indicates phycocyanin as the major fluorophore and Figure 3B indicates phycoerythrin. [Extraction fragment; the record also cites Ho, C.H., G.D. Christian, and E.R. Davidson (1978), on the application of the method of rank annihilation to quantitative analyses of multicomponent…]
NASA Astrophysics Data System (ADS)
Rasmussen, Karsten B.; Juhl, Peter
2004-05-01
Boundary element method (BEM) calculations are used for the purpose of predicting the acoustic influence of the human head in two cases. In the first case the sound source is the mouth and in the second case the sound is plane waves arriving from different directions in the horizontal plane. In both cases the sound field is studied in relation to two positions above the right ear being representative of hearing aid microphone positions. Both cases are relevant for hearing aid development. The calculations are based upon a direct BEM implementation in Matlab. The meshing is based on the original geometrical data files describing the B&K Head and Torso Simulator 4128 combined with a 3D scan of the pinna.
Secure E-Business applications based on the European Citizen Card
NASA Astrophysics Data System (ADS)
Zipfel, Christian; Daum, Henning; Meister, Gisela
The introduction of ID cards enhanced with electronic authentication services opens up the possibility to use these for identification and authentication in e-business applications. To avoid incompatible national solutions, the specification of the European Citizen Card aims at defining interoperable services for such use cases. Especially the given device authentication methods can help to eliminate security problems with current e-business and online banking applications.
In Vitro Electrochemistry of Biological Systems
Adams, Kelly L.; Puchades, Maja; Ewing, Andrew G.
2009-01-01
This article reviews recent work involving electrochemical methods for in vitro analysis of biomolecules, with an emphasis on detection and manipulation at and of single cells and cultures of cells. The techniques discussed include constant potential amperometry, chronoamperometry, cellular electroporation, scanning electrochemical microscopy, and microfluidic platforms integrated with electrochemical detection. The principles of these methods are briefly described, followed in most cases with a short description of an analytical or biological application and its significance. The use of electrochemical methods to examine specific mechanistic issues in exocytosis is highlighted, as a great deal of recent work has been devoted to this application. PMID:20151038
An exchange format for use-cases of hospital information systems.
Masuda, G; Sakamoto, N; Sakai, R; Yamamoto, R
2001-01-01
Object-oriented software development is a powerful methodology for the development of large hospital information systems. We consider the use-case driven approach particularly useful for such development. In the use-case driven approach, use-cases are documented at the first stage of the software development process and are then used throughout its steps in a variety of ways. It is therefore important to exchange and share use-cases and make effective use of them over the whole lifecycle of a development process. In this paper, we propose a method for sharing and exchanging use-case models between applications, developers, and projects. We design an XML-based exchange format for use-cases. We then discuss an application of the exchange format to support several software development activities. We implemented a preliminary support system for object-oriented analysis based on the exchange format. The results show that the structural and semantic information in the exchange format enables the support system to assist object-oriented analysis successfully.
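The exchange format itself is not reproduced in this abstract, so the following is only a hypothetical sketch, built with Python's standard xml.etree.ElementTree, of what an XML-encoded use-case record might look like when passed between tools or projects; every element and attribute name here is invented for illustration and is not the authors' actual schema.

```python
import xml.etree.ElementTree as ET

# Hypothetical use-case record; element and attribute names are illustrative only.
use_case = ET.Element("useCase", id="UC-001", system="HIS")
ET.SubElement(use_case, "name").text = "Register inpatient admission"
ET.SubElement(use_case, "actor").text = "Ward clerk"

flow = ET.SubElement(use_case, "mainFlow")
for i, step in enumerate([
        "Clerk enters patient demographics",
        "System checks for an existing patient record",
        "System creates the admission entry"], start=1):
    ET.SubElement(flow, "step", order=str(i)).text = step

# Serialize for exchange between applications, developers, and projects.
print(ET.tostring(use_case, encoding="unicode"))
```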
2012-01-01
Background There are numerous applications for Health Information Systems (HIS) that support specific tasks in the clinical workflow. The Lean method has been used increasingly to optimize clinical workflows by removing waste and shortening the delivery cycle time. There are a limited number of studies on Lean applications related to HIS. Therefore, we applied the Lean method to evaluate the clinical processes related to HIS, in order to evaluate its efficiency in removing waste and optimizing the process flow. This paper presents the evaluation findings of these clinical processes with regard to a critical care information system (CCIS), known as IntelliVue Clinical Information Portfolio (ICIP), and recommends solutions to the problems that were identified during the study. Methods We conducted a case study under actual clinical settings, to investigate how the Lean method can be used to improve the clinical process. We used observations, interviews, and document analysis to achieve our stated goal. We also applied two tools from the Lean methodology, namely the Value Stream Mapping and the A3 problem-solving tools. We used eVSM software to plot the Value Stream Map and A3 reports. Results We identified a number of problems related to inefficiency and waste in the clinical process, and proposed an improved process model. Conclusions The case study findings show that the Value Stream Mapping and the A3 reports can be used as tools to identify waste and integrate the process steps more efficiently. We also proposed a standardized and improved clinical process model and suggested an integrated information system that combines database and software applications to reduce waste and data redundancy. PMID:23259846
NASA Astrophysics Data System (ADS)
Bonacker, Esther; Gibali, Aviv; Küfer, Karl-Heinz; Süss, Philipp
2017-04-01
Multicriteria optimization problems occur in many real-life applications, for example in cancer radiotherapy treatment and in particular in intensity modulated radiation therapy (IMRT). In this work we focus on optimization problems with multiple objectives that are ranked according to their importance. We solve these problems numerically by combining lexicographic optimization with our recently proposed level set scheme, which yields a sequence of auxiliary convex feasibility problems, solved here via projection methods. The projection enables us to combine the newly introduced superiorization methodology with multicriteria optimization methods to speed up computation while guaranteeing convergence of the optimization. We demonstrate our scheme with a simple 2D academic example (used in the literature) and also present results from calculations on four real head-and-neck cases in IMRT (Radiation Oncology of the Ludwig-Maximilians University, Munich, Germany) for two different choices of superiorization parameter sets, suited to yield fast convergence for each case individually or robust behavior across all four cases.
Methods commonly used to delineate protection zones for water-supply wells are often not directly applicable for springs. This investigation focuses on the use of hydrogeologic mapping methods to identify physical and hydrologic features that control ground-water flow to springs...
ERIC Educational Resources Information Center
Shaw, Robert E.; And Others
1997-01-01
Proposes a theoretical framework for designing online-situated assessment tools for multimedia instructional systems. Uses a graphic method based on ecological psychology to monitor student performance through a learning activity. Explores the method's feasibility in case studies describing instructional systems teaching critical-thinking and…
Research in Distance Education: A System Modeling Approach.
ERIC Educational Resources Information Center
Saba, Farhad; Twitchell, David
This demonstration of the use of a computer simulation research method based on the System Dynamics modeling technique for studying distance education reviews research methods in distance education, including the broad categories of conceptual and case studies, and presents a rationale for the application of systems research in this area. The…
Achille, Cristiana; Adami, Andrea; Chiarini, Silvia; Cremonesi, Stefano; Fassi, Francesco; Fregonese, Luigi; Taffurelli, Laura
2015-01-01
This paper examines the survey of tall buildings in an emergency context, such as after post-seismic events. The after-earthquake survey has to guarantee time savings, high precision and security during the operational stages. The main goal is to optimize the application of methodologies based on the acquisition and automatic elaboration of photogrammetric data, including the use of Unmanned Aerial Vehicle (UAV) systems, in order to provide fast and low-cost operations. The suggested methods integrate new technologies with commonly used technologies such as TLS and topographic acquisition. The value of the photogrammetric application is demonstrated by a test case, based on a comparison of acquisition, calibration and 3D modeling results obtained with a laser scanner, a metric camera and an amateur reflex camera. The test demonstrates the efficiency of image-based methods in the acquisition of complex architecture. The case study is the Santa Barbara Bell tower in Mantua. The applied survey solution allows a complete 3D database of the complex architectural structure to be obtained for the extraction of all the information needed for significant intervention. This demonstrates the applicability of photogrammetry using UAVs for the survey of vertical structures, complex buildings and difficult-to-access architectural parts, providing high-precision results. PMID:26134108
NASA Astrophysics Data System (ADS)
Proskurov, S.; Darbyshire, O. R.; Karabasov, S. A.
2017-12-01
The present work discusses modifications to the stochastic Fast Random Particle Mesh (FRPM) method featuring both tonal and broadband noise sources. The technique relies on combining the vortex-shedding-resolved flow available from an Unsteady Reynolds-Averaged Navier-Stokes (URANS) simulation with the fine-scale turbulence FRPM solution generated via stochastic velocity fluctuations in the context of vortex sound theory. In contrast to the existing literature, our method encompasses a unified treatment of broadband and tonal acoustic noise sources at the source level, thus accounting for linear source interference as well as possible non-linear source interaction effects. Once the sound sources are determined, the Acoustic Perturbation Equations (APE-4) are solved in the time domain for the sound propagation. Results of the method's application to two aerofoil benchmark cases, with both sharp and blunt trailing edges, are presented. In each case, the importance of individual linear and non-linear noise sources was investigated. Several new key features related to the unsteady implementation of the method were tested and incorporated. Encouraging results have been obtained for the benchmark test cases using the new technique, which is believed to be potentially applicable to other airframe noise problems where both tonal and broadband parts are important.
NASA Technical Reports Server (NTRS)
Walton, William C., Jr.
1960-01-01
This paper reports the findings of an investigation of a finite-difference method directly applicable to calculating static or simple harmonic flexures of solid plates and potentially useful in other problems of structural analysis. The method, which was proposed in a doctoral thesis by John C. Houbolt, is based on linear theory and incorporates the principle of minimum potential energy. Full realization of its advantages requires the use of high-speed computing equipment. After a review of Houbolt's method, results of some applications are presented and discussed. The applications consisted of calculations of the natural modes and frequencies of several uniform-thickness cantilever plates and, as a special case of interest, calculations of the modes and frequencies of the uniform free-free beam. Computed frequencies and nodal patterns for the first five or six modes of each plate are compared with existing experiments, and those for one plate are compared with another approximate theory. Beam computations are compared with exact theory. On the basis of the comparisons it is concluded that the method is accurate and general in predicting plate flexures, and additional applications are suggested. An appendix is devoted to computing procedures which evolved in the progress of the applications and which facilitate use of the method in conjunction with high-speed computing equipment.
A prognostic model for temporal courses that combines temporal abstraction and case-based reasoning.
Schmidt, Rainer; Gierl, Lothar
2005-03-01
Since clinical management of patients and clinical research are essentially time-oriented endeavours, reasoning about time has become a hot topic in medical informatics. Here we present a method for the prognosis of temporal courses, which combines temporal abstractions with case-based reasoning. It is useful for application domains where neither well-known standards, nor known periodicity, nor a complete domain theory exist. We have used our method in two prognostic applications. The first deals with prognosis of the kidney function for intensive care patients. The idea is to detect impairments in time, especially to warn against threatening kidney failures. Our second application deals with a completely different domain, namely geographical medicine. Its intention is to compute early warnings against approaching infectious diseases, which are characterised by irregular cyclic occurrences. So far, we have applied our program to influenza and bronchitis. In this paper, we focus on influenza forecasting and show first experimental results.
Nonlinear modeling of chaotic time series: Theory and applications
DOE Office of Scientific and Technical Information (OSTI.GOV)
Casdagli, M.; Eubank, S.; Farmer, J.D.
1990-01-01
We review recent developments in the modeling and prediction of nonlinear time series. In some cases apparent randomness in time series may be due to chaotic behavior of a nonlinear but deterministic system. In such cases it is possible to exploit the determinism to make short term forecasts that are much more accurate than one could make from a linear stochastic model. This is done by first reconstructing a state space, and then using nonlinear function approximation methods to create a dynamical model. Nonlinear models are valuable not only as short term forecasters, but also as diagnostic tools for identifying and quantifying low-dimensional chaotic behavior. During the past few years methods for nonlinear modeling have developed rapidly, and have already led to several applications where nonlinear models motivated by chaotic dynamics provide superior predictions to linear models. These applications include prediction of fluid flows, sunspots, mechanical vibrations, ice ages, measles epidemics and human speech. 162 refs., 13 figs.
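As a rough illustration of the approach summarized above (reconstruct a state space, then fit a nonlinear predictor), here is a minimal sketch using delay embedding and a nearest-neighbour local average; it is a generic textbook construction, not the authors' specific approximation method, and the chaotic logistic-map series stands in for real data.

```python
import numpy as np

def delay_embed(x, dim, tau):
    """Reconstruct a state space from a scalar series by delay embedding."""
    n = len(x) - (dim - 1) * tau
    return np.column_stack([x[i * tau : i * tau + n] for i in range(dim)])

def local_forecast(x, dim=3, tau=1, k=5):
    """One-step forecast: average the successors of the k nearest past states."""
    emb = delay_embed(x, dim, tau)
    current, history = emb[-1], emb[:-1]
    targets = x[(dim - 1) * tau + 1 :]            # value following each history state
    dists = np.linalg.norm(history - current, axis=1)
    nearest = np.argsort(dists)[:k]
    return targets[nearest].mean()

# Example: short-term forecast of the chaotic logistic map x -> 4x(1-x).
x = np.empty(2000); x[0] = 0.3
for t in range(1999):
    x[t + 1] = 4.0 * x[t] * (1.0 - x[t])
print("forecast:", local_forecast(x[:-1]), "actual:", x[-1])
```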
Thompson, Trevor DB
2004-01-01
Background Two main pathways exist for the development of knowledge in clinical homeopathy. These comprise clinical trials conducted primarily by university-based researchers and case reports and homeopathic "provings" compiled by engaged homeopathic practitioners. In this paper the relative merits of these methods are examined and a middle way is proposed. This consists of the "Formal Case Study" (FCS), in which qualitative methods are used to increase the rigour and sophistication with which homeopathic cases are studied. Before going into design issues this paper places the FCS in an historical and academic context and describes the relative merits of the method. Discussion Like any research, the FCS should have a clear focus. This focus can be "internal", grounded in the discourse of homeopathy, and can also encompass issues of wider appeal. A selection of possible "internal" and "external" research questions is introduced. Data generation should be from multiple sources to ensure adequate triangulation. This could include the recording and transcription of actual consultations. Analysis is built around existing theory, involves cross-case comparison and the search for deviant cases. The trustworthiness of conclusions is ensured by the application of concepts from qualitative research including triangulation, groundedness, respondent validation and reflexivity. Though homeopathic case studies have been reported in mainstream literature, none has used formal qualitative methods – though some such studies are in progress. Summary This paper introduces the reader to a new strategy for homeopathic research. This strategy, termed the "formal case study", allows for a naturalistic enquiry into the players, processes and outcomes of homeopathic practice. Using ideas from qualitative research, it allows a rigorous approach to types of research question that cannot typically be addressed through clinical trials and numeric outcome studies. The FCS provides an opportunity for the practitioner-researcher to contribute to the evidence-base in homeopathy in a systematic fashion. The FCS can also be used to inform the design of clinical trials through holistic study of the "active ingredients" of the therapeutic process and its clinical outcomes. PMID:15018637
Reconfigurable Flight Control Designs With Application to the X-33 Vehicle
NASA Technical Reports Server (NTRS)
Burken, John J.; Lu, Ping; Wu, Zhenglu
1999-01-01
Two methods for control system reconfiguration have been investigated. The first method is a robust servomechanism control approach (optimal tracking problem) that is a generalization of the classical proportional-plus-integral control to multiple input-multiple output systems. The second method is a control-allocation approach based on a quadratic programming formulation. A globally convergent fixed-point iteration algorithm has been developed to make onboard implementation of this method feasible. These methods have been applied to reconfigurable entry flight control design for the X-33 vehicle. Examples presented demonstrate simultaneous tracking of angle-of-attack and roll angle commands during failures of the right body flap actuator. Although simulations demonstrate success of the first method in most cases, the control-allocation method appears to provide uniformly better performance in all cases.
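The abstract does not give the fixed-point algorithm itself, so the sketch below only illustrates the underlying control-allocation idea with a generic projected fixed-point (projected-gradient) iteration on a box-constrained least-squares problem; the effector matrix, commanded moments, and deflection limits are invented, and this is not the X-33 algorithm.

```python
import numpy as np

def allocate(B, d, lo, hi, iters=200):
    """Distribute commanded moments d over redundant actuators u:
    minimize ||B u - d||^2 subject to lo <= u <= hi,
    via a projected fixed-point (projected-gradient) iteration."""
    u = np.zeros(B.shape[1])
    alpha = 1.0 / np.linalg.norm(B.T @ B, 2)   # step size small enough for convergence
    for _ in range(iters):
        u = np.clip(u - alpha * B.T @ (B @ u - d), lo, hi)
    return u

# Toy example: 2 commanded moments, 3 redundant effectors with deflection limits.
B = np.array([[1.0, 0.5, 0.0],
              [0.0, 1.0, 1.0]])
d = np.array([0.4, -0.2])
u = allocate(B, d, lo=-0.5, hi=0.5)
print("allocation:", u, "achieved moments:", B @ u)
```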
A STRICTLY CONTRACTIVE PEACEMAN-RACHFORD SPLITTING METHOD FOR CONVEX PROGRAMMING.
Bingsheng, He; Liu, Han; Wang, Zhaoran; Yuan, Xiaoming
2014-07-01
In this paper, we focus on the application of the Peaceman-Rachford splitting method (PRSM) to a convex minimization model with linear constraints and a separable objective function. Compared to the Douglas-Rachford splitting method (DRSM), another splitting method from which the alternating direction method of multipliers originates, PRSM requires more restrictive assumptions to ensure its convergence, while it is always faster whenever it is convergent. We first illustrate that the reason for this difference is that the iterative sequence generated by DRSM is strictly contractive, while that generated by PRSM is only contractive with respect to the solution set of the model. With only the convexity assumption on the objective function of the model under consideration, the convergence of PRSM is not guaranteed. But for this case, we show that the first t iterations of PRSM still enable us to find an approximate solution with an accuracy of O(1/t). A worst-case O(1/t) convergence rate of PRSM in the ergodic sense is thus established under mild assumptions. After that, we suggest attaching an underdetermined relaxation factor to PRSM to guarantee the strict contraction of its iterative sequence and thus propose a strictly contractive PRSM. A worst-case O(1/t) convergence rate of this strictly contractive PRSM in a nonergodic sense is established. We show the numerical efficiency of the strictly contractive PRSM through applications in statistical learning and image processing.
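A minimal numerical illustration of the strictly contractive PRSM iteration on a toy separable problem, min 0.5||x-a||^2 + 0.5||z-b||^2 subject to x = z, where both subproblems have closed-form solutions. The factor alpha below plays the role of the relaxation factor mentioned in the abstract (alpha = 1 would recover plain PRSM); the code is a sketch of the iteration structure, not the paper's general implementation.

```python
import numpy as np

def strictly_contractive_prsm(a, b, rho=1.0, alpha=0.8, iters=100):
    """Toy PRSM for min 0.5||x-a||^2 + 0.5||z-b||^2  s.t.  x - z = 0.
    alpha in (0, 1) is the relaxation factor making the iteration strictly contractive."""
    x = np.zeros_like(a); z = np.zeros_like(a); lam = np.zeros_like(a)
    for _ in range(iters):
        x = (a + lam + rho * z) / (1.0 + rho)      # x-subproblem (closed form)
        lam = lam - alpha * rho * (x - z)          # first multiplier update
        z = (b - lam + rho * x) / (1.0 + rho)      # z-subproblem (closed form)
        lam = lam - alpha * rho * (x - z)          # second multiplier update
    return x, z

a, b = np.array([1.0, 3.0]), np.array([5.0, -1.0])
x, z = strictly_contractive_prsm(a, b)
print(x, z, "expected consensus:", (a + b) / 2)    # both approach (a+b)/2
```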
You Can Have Your Cake and Eat It Too: A Successful Case of Theory Applied to the Real World.
ERIC Educational Resources Information Center
Rojas, Alicia M.; Mulkey, Jamie
1990-01-01
Describes methods used by instructional designers to help subject matter experts (SMEs) create effective courseware, balanced between theory and practical application, that meets organizational objectives. A case study is presented that explains how to develop student performance objectives (SPOs) through needs assessment, the design of job aids,…
Regional air quality models are being used in a policy-setting to estimate the response of air pollutant concentrations to changes in emissions and meteorology. Dynamic evaluation entails examination of a retrospective case(s) to assess whether an air quality model has properly p...
Data Mining in Course Management Systems: Moodle Case Study and Tutorial
ERIC Educational Resources Information Center
Romero, Cristobal; Ventura, Sebastian; Garcia, Enrique
2008-01-01
Educational data mining is an emerging discipline, concerned with developing methods for exploring the unique types of data that come from the educational context. This work is a survey of the specific application of data mining in learning management systems and a case study tutorial with the Moodle system. Our objective is to introduce it both…
DOE Office of Scientific and Technical Information (OSTI.GOV)
Hollinger, Greg L.
Background: The current rules in the nuclear section of the ASME Boiler and Pressure Vessel (B&PV) Code, Section III, Subsection NH for the evaluation of strain limits and creep-fatigue damage using simplified methods based on elastic analysis have been deemed inappropriate for Alloy 617 at temperatures above 1200°F (650°C) [1]. To address this issue, proposed code rules have been developed which are based on the use of elastic-perfectly plastic (E-PP) analysis methods and which are expected to be applicable to very high temperatures. The proposed rules for strain limits and creep-fatigue evaluation were initially documented in the technical literature [2, 3], and have been recently revised to incorporate comments and simplify their application. The revised code cases have been developed. Task Objectives: The goal of the Sample Problem task is to exercise these code cases through example problems to demonstrate their feasibility and, also, to identify potential corrections and improvements should problems be encountered. This will provide input to the development of technical background documents for consideration by the applicable B&PV committees considering these code cases for approval. This task has been performed by Hollinger and Pease of Becht Engineering Co., Inc., Nuclear Services Division, and a report detailing the results of the E-PP analyses conducted on example problems per the procedures of the E-PP strain limits and creep-fatigue draft code cases is enclosed as Enclosure 1. Conclusions: The feasibility of the application of the E-PP code cases has been demonstrated through example problems that consist of realistic geometry (a nozzle attached to a semi-hemispheric shell with a circumferential weld) and load (pressure; pipe reaction load applied at the end of the nozzle, including axial and shear forces, bending and torsional moments; through-wall transient temperature gradient) and design and operating conditions (Levels A, B and C).
Volterra's Solution of the Wave Equation as Applied to Three-Dimensional Supersonic Airfoil Problems
NASA Technical Reports Server (NTRS)
Heaslet, Max A; Lomax, Harvard; Jones, Arthur L
1947-01-01
A surface integral is developed which yields solutions of the linearized partial differential equation for supersonic flow. These solutions satisfy boundary conditions arising in wing theory. Particular applications of this general method are made, using acceleration potentials, to flat surfaces and to uniformly loaded lifting surfaces. Rectangular and trapezoidal plan forms are considered along with triangular forms adaptable to swept-forward and swept-back wings. The case of the triangular plan form in sideslip is also included. Emphasis is placed on the systematic application of the method to the lifting surfaces considered and on the possibility of further application.
Recent statistical methods for orientation data
NASA Technical Reports Server (NTRS)
Batschelet, E.
1972-01-01
The application of statistical methods to the study of animal orientation and navigation is discussed. The methods employed are limited to the two-dimensional case. Various tests for determining the validity of the statistical analysis are presented. Mathematical models are included to support the theoretical considerations, and tables of data are developed to show the value of the information obtained by statistical analysis.
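A brief sketch of the kind of two-dimensional (circular) statistics surveyed here: the mean direction, the mean resultant length, and a Rayleigh test of uniformity. The formulas are the standard textbook ones rather than anything specific to this report, and the bearing data are invented for illustration.

```python
import numpy as np

def circular_summary(angles_deg):
    """Mean direction, mean resultant length, and an approximate Rayleigh test
    p-value for a sample of 2D orientation data (angles in degrees)."""
    theta = np.radians(np.asarray(angles_deg, float))
    C, S = np.cos(theta).sum(), np.sin(theta).sum()
    n = len(theta)
    R_bar = np.hypot(C, S) / n                    # mean resultant length in [0, 1]
    mean_dir = np.degrees(np.arctan2(S, C)) % 360
    Z = n * R_bar**2                              # Rayleigh statistic
    p = np.exp(-Z)                                # first-order p-value approximation
    return mean_dir, R_bar, p

# Example: vanishing bearings clustered around 40 degrees (hypothetical data).
bearings = [35, 45, 50, 38, 42, 30, 55, 40, 47, 33]
print(circular_summary(bearings))
```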
Jahanfar, Ali; Amirmojahedi, Mohsen; Gharabaghi, Bahram; Dubey, Brajesh; McBean, Edward; Kumar, Dinesh
2017-03-01
Rapid population growth in major urban centres of many developing countries has created massive landfills of extraordinary height and steep side-slopes, frequently surrounded by illegal low-income residential settlements built too close to the waste slopes. These extraordinary landfills face a high risk of catastrophic failure with potentially large numbers of fatalities. This study presents a novel method for risk assessment of landfill slope failure, using probabilistic analysis of potential failure scenarios and associated fatalities. The conceptual framework of the method includes selecting appropriate statistical distributions for the municipal solid waste (MSW) material shear strength and rheological properties for potential failure scenario analysis. The MSW material properties for a given scenario are then used to analyse the probability of slope failure and the resulting run-out length, to calculate the potential risk of fatalities. In comparison with existing methods, which are based solely on the probability of slope failure, this method provides a more accurate estimate of the risk of fatalities associated with a given landfill slope failure. The application of the new risk assessment method is demonstrated with a case study for a landfill located within a heavily populated area of New Delhi, India.
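The abstract only outlines the probabilistic framework, so the following is a schematic Monte Carlo sketch of that style of scenario analysis. The distributions, the factor-of-safety proxy, and the run-out/settlement rule are all made up for illustration; they are not the authors' calibrated model and should not be read as physically meaningful values.

```python
import numpy as np

rng = np.random.default_rng(0)
n = 100_000                                     # number of sampled failure scenarios

# Illustrative (made-up) distributions for MSW shear-strength parameters.
cohesion = rng.normal(15.0, 4.0, n)             # kPa
friction_deg = rng.normal(25.0, 5.0, n)         # degrees

# Very rough factor-of-safety proxy for a steep waste slope (illustrative only).
slope_deg = 40.0
fs = cohesion / 30.0 + np.tan(np.radians(friction_deg)) / np.tan(np.radians(slope_deg))
failed = fs < 1.0

# Notional run-out length for each scenario, compared with distance to settlements.
runout = rng.normal(120.0, 30.0, n)             # metres, stand-in for a rheology model
reaches_homes = failed & (runout > 100.0)       # settlement assumed 100 m from the toe

print("P(slope failure)         =", failed.mean())
print("P(failure reaches homes) =", reaches_homes.mean())
```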
Simões, Carla L; Xará, Susana M; Bernardo, C A
2011-10-01
Recent legislation has stressed the need to decide the best end-of-life (EoL) option for post-consumer products considering their full life-cycle and the corresponding overall environmental impacts. The life cycle assessment (LCA) technique has become a common tool to evaluate those impacts. The present study aimed to contribute to the better understanding of the application of this technique, by evaluating the influence of the selection of the life cycle impact assessment (LCIA) method in its results and conclusions. A specific case study was chosen, using previous information related to an anti-glare lamellae (AGL) for highway use, made with virgin and recycled high-density polyethylene (HDPE). Five distinct LCIA methods were used: Eco-indicator 99, CML 2 (2000), EPS 2000, Eco-indicator 95 and EDIP 97. Consistent results between these methods were obtained for the Climate change, Ozone layer depletion, Acidification and Eutrophication environmental indicators. Conversely, the Summer smog indicator showed large discrepancies between impact assessment methods. The work sheds light on the advantages inherent in using various LCIA methods when doing the LCA study of a specific product, thus evidencing complementary analysis perspectives.
Innovation in prediction planning for anterior open bite correction.
Almuzian, Mohammed; Almukhtar, Anas; O'Neil, Michael; Benington, Philip; Al Anezi, Thamer; Ayoub, Ashraf
2015-05-01
This study applies recent advances in 3D virtual imaging to prediction planning for dentofacial deformities. Stereo-photogrammetry has been used to create virtual and physical models, which are creatively combined in planning the surgical correction of anterior open bite. The application of these novel methods is demonstrated through the surgical correction of a case.
Brain Volume Estimation Enhancement by Morphological Image Processing Tools.
Zeinali, R; Keshtkar, A; Zamani, A; Gharehaghaji, N
2017-12-01
Volume estimation of the brain is important for many neurological applications; it is necessary for measuring brain growth and changes in the brains of normal and abnormal patients. Thus, accurate brain volume measurement is very important. Magnetic resonance imaging (MRI) is the method of choice for volume quantification due to excellent levels of image resolution and between-tissue contrast. The stereology method is a good method for estimating volume, but it requires segmenting enough MRI slices and having good image resolution. In this study, the aim is to enhance the stereology method for volume estimation of the brain using fewer MRI slices with lower resolution. A program for calculating volume using the stereology method is introduced. A morphological operation, dilation, was applied and the stereology method thereby enhanced. For the evaluation of this method, we used T1-weighted MR images from the digital phantom in BrainWeb, which provides a ground truth. The volumes of 20 normal brains extracted from BrainWeb were calculated. The volumes of white matter, gray matter and cerebrospinal fluid of given dimensions were estimated correctly. Volume calculations with the stereology method were made in three cases, and the Root Mean Square Error (RMSE) was measured for each: Case I with T=5, d=5; Case II with T=10, d=10; and Case III with T=20, d=20 (T = slice thickness, d = resolution, as stereology parameters). Comparing the results of the two methods shows that the RMSE values for the proposed method are smaller than those of the standard stereology method. Using the morphological dilation operation allows the stereology volume estimation method to be enhanced; with fewer MRI slices and fewer test points, the proposed method works much better than the standard stereology method.
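The stereology estimator underlying this work is essentially Cavalieri point counting: volume ≈ slice thickness × area per grid point × number of test points hitting the structure. A minimal sketch follows, with a synthetic binary mask standing in for a segmented MRI slice and an optional dilation step indicating where a morphological enhancement of the kind described could act; the grid spacing and thickness are placeholders, not the study's parameters.

```python
import numpy as np
from scipy.ndimage import binary_dilation

def stereology_volume(masks, slice_thickness_mm, grid_mm, pixel_mm, dilate=False):
    """Cavalieri point-counting volume estimate from binary segmentation masks:
    V ~ T * a_p * (number of test points inside), with a_p = grid_mm**2."""
    step = max(int(round(grid_mm / pixel_mm)), 1)      # grid spacing in pixels
    hits = 0
    for mask in masks:                                  # one mask per MRI slice
        if dilate:
            mask = binary_dilation(mask)                # optional morphological step
        hits += mask[::step, ::step].sum()              # test points falling inside
    return slice_thickness_mm * grid_mm**2 * hits       # volume in mm^3

# Synthetic example: 10 slices, each containing a 40 mm x 40 mm square region.
mask = np.zeros((100, 100), bool)
mask[30:70, 30:70] = True
vol = stereology_volume([mask] * 10, slice_thickness_mm=5.0, grid_mm=5.0, pixel_mm=1.0)
print(vol, "mm^3 (true volume is", 40 * 40 * 50, "mm^3)")
```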
Crespo, Alejandro; Rodriguez-Granillo, Agustina; Lim, Victoria T
2017-01-01
The development and application of quantum mechanics (QM) methodologies in computer-aided drug design have flourished in the last 10 years. Despite the natural advantage of QM methods in predicting binding affinities with a higher level of theory than methods based on molecular mechanics (MM), there are only a few examples where diverse sets of protein-ligand targets have been evaluated simultaneously. In this work, we review recent advances in QM docking and scoring for those cases in which a systematic analysis has been performed. In addition, we introduce and validate a simplified QM/MM expression to compute protein-ligand binding energies. Overall, QM-based scoring functions are generally better at predicting ligand affinities than those based on classical mechanics. However, the agreement between experimental activities and calculated binding energies is highly dependent on the specific chemical series considered. The advantage of more accurate QM methods is evident in cases where charge transfer and polarization effects are important, for example when metals are involved in the binding process or when dispersion forces play a significant role, as in the case of hydrophobic or stacking interactions.
New chemical-DSMC method in numerical simulation of axisymmetric rarefied reactive flow
NASA Astrophysics Data System (ADS)
Zakeri, Ramin; Kamali Moghadam, Ramin; Mani, Mahmoud
2017-04-01
The modified quantum kinetic (MQK) chemical reaction model introduced by Zakeri et al. is developed for applicable cases in axisymmetric reactive rarefied gas flows using the direct simulation Monte Carlo (DSMC) method. Although the MQK chemical model introduces some modifications to the quantum kinetic (QK) method, it also employs the general soft sphere collision model and the Stockmayer potential function to properly select collision pairs in the DSMC algorithm and to capture both the attractive and repulsive intermolecular forces in rarefied gas flows. To assess the presented model in the simulation of more complex and applicable reacting flows, air dissociation is first studied in a single cell under equilibrium and non-equilibrium conditions. The MQK results agree well with the analytical and experimental data and accurately predict the characteristics of the rarefied flowfield with chemical reactions. To investigate the accuracy of the MQK chemical model in the simulation of axisymmetric flow, air dissociation is also assessed in an axial hypersonic flow around two geometries: a sphere as a benchmark case and a blunt body (STS-2) as an applicable test case. The computed results, including the translational, rotational and vibrational temperatures, the species concentrations along the stagnation line, and the heat flux and pressure coefficient on the surface, are compared with those of other chemical methods such as the QK and total collision energy (TCE) models and with available analytical and experimental data. Generally, the MQK chemical model properly simulates the chemical reactions and predicts flowfield characteristics more accurately than the typical QK model. Although in some cases the results of the MQK approach match those of the TCE method, the main point is that the MQK does not need any experimental data or the unrealistic assumption of a specular boundary condition as used in the TCE method. Another advantage of the MQK model is a significant reduction of computational cost relative to the QK chemical model for the same accuracy, because it applies a more appropriate collision model and consequently reduces the number of particle collisions.
Dexter, Franklin; Ledolter, Johannes; Hindman, Bradley J
2016-01-01
In this Statistical Grand Rounds, we review methods for the analysis of the diversity of procedures among hospitals, the activities among anesthesia providers, etc. We apply multiple methods and consider their relative reliability and usefulness for perioperative applications, including calculations of SEs. We also review methods for comparing the similarity of procedures among hospitals, activities among anesthesia providers, etc. We again apply multiple methods and consider their relative reliability and usefulness for perioperative applications. The applications include strategic analyses (e.g., hospital marketing) and human resource analytics (e.g., comparisons among providers). Measures of diversity of procedures and activities (e.g., Herfindahl and Gini-Simpson index) are used for quantification of each facility (hospital) or anesthesia provider, one at a time. Diversity can be thought of as a summary measure. Thus, if the diversity of procedures for 48 hospitals is studied, the diversity (and its SE) is being calculated for each hospital. Likewise, the effective numbers of common procedures at each hospital can be calculated (e.g., by using the exponential of the Shannon index). Measures of similarity are pairwise assessments. Thus, if quantifying the similarity of procedures among cases with a break or handoff versus cases without a break or handoff, a similarity index represents a correlation coefficient. There are several different measures of similarity, and we compare their features and applicability for perioperative data. We rely extensively on sensitivity analyses to interpret observed values of the similarity index.
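The diversity measures named above have simple closed forms. The sketch below computes the Herfindahl index, the Gini-Simpson index, and the effective number of common procedures (the exponential of the Shannon index) from one hospital's procedure counts; the counts are hypothetical, and this is only an illustration of the formulas, not the article's full analysis (which also includes SEs and similarity indices).

```python
import numpy as np

def diversity_indices(case_counts):
    """Herfindahl, Gini-Simpson, and effective number of procedures
    (exp of the Shannon index) from a hospital's counts per procedure type."""
    p = np.asarray(case_counts, float)
    p = p / p.sum()                      # proportions of cases by procedure
    p = p[p > 0]                         # drop empty categories
    herfindahl = np.sum(p**2)            # probability two random cases match
    gini_simpson = 1.0 - herfindahl      # probability two random cases differ
    shannon = -np.sum(p * np.log(p))
    return herfindahl, gini_simpson, np.exp(shannon)

# Hypothetical hospital performing 5 procedure types with unequal volumes.
print(diversity_indices([500, 300, 120, 60, 20]))
```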
Composite Socio-Technical Systems: A Method for Social Energy Systems
DOE Office of Scientific and Technical Information (OSTI.GOV)
Zhang, Yingchen; He, Fulin; Hao, Jun
In order to model and study the interactions between social and technical systems, a systemic method, namely composite socio-technical systems (CSTS), is proposed to incorporate social systems, technical systems and the interaction mechanism between them. A case study on the University of Denver (DU) campus grid is presented in this paper to demonstrate the application of the proposed method. In the case study, the social system, technical system, and the interaction mechanism are defined and modelled within the framework of CSTS. Distributed and centralized control and management schemes are investigated, respectively, and numerical results verify the feasibility and performance of the proposed composite system method.
NASA Technical Reports Server (NTRS)
Lefebvre, D. R.; Sanderson, A. C.
1994-01-01
Robot coordination and control systems for remote teleoperation applications are by necessity implemented on distributed computers. Modeling and performance analysis of these distributed robotic systems is difficult, but important for economic system design. Performance analysis methods originally developed for conventional distributed computer systems are often unsatisfactory for evaluating real-time systems. The paper introduces a formal model of distributed robotic control systems and a performance analysis method, based on scheduling theory, which can handle concurrent hard-real-time response specifications. Use of the method is illustrated by a case of remote teleoperation which assesses the effect of communication delays and the allocation of robot control functions on control system hardware requirements.
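The abstract credits scheduling theory without reproducing the analysis, so the snippet below shows a classic fixed-priority response-time test as a stand-in for the kind of hard-real-time check described; the task set and timings are invented, and this is not the paper's specific formulation.

```python
import math

def worst_case_response(tasks):
    """Classic response-time analysis for fixed-priority periodic tasks.
    tasks: list of (C, T) = (worst-case execution time, period = deadline),
    ordered highest priority first. Returns each task's worst-case response time."""
    responses = []
    for i, (C, T) in enumerate(tasks):
        R = C
        while True:
            interference = sum(math.ceil(R / Tj) * Cj for Cj, Tj in tasks[:i])
            R_next = C + interference
            if R_next == R or R_next > T:     # converged, or deadline already missed
                break
            R = R_next
        responses.append(R_next)
    return responses

# Invented control loops: (execution ms, period ms), highest priority first.
tasks = [(2, 10), (4, 20), (10, 50)]
for (C, T), R in zip(tasks, worst_case_response(tasks)):
    print(f"C={C} T={T} -> worst-case response {R} ms {'OK' if R <= T else 'MISS'}")
```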
NASA Astrophysics Data System (ADS)
Domec, Brennan S.
In today's industry, engineering materials are continuously pushed to the limits. Often, the application only demands high-specification properties in a narrowly-defined region of the material, such as the outermost surface. This, in combination with the economic benefits, makes case hardening an attractive solution to meet industry demands. While case hardening has been in use for decades, applications demanding high hardness, deep case depth, and high corrosion resistance are often under-served by this process. Instead, new solutions are required. The goal of this study is to develop and characterize a new borochromizing process applied to a pre-carburized AISI 8620 alloy steel. The process was successfully developed using a combination of computational simulations, calculations, and experimental testing. Process kinetics were studied by fitting case depth measurement data to Fick's Second Law of Diffusion and an Arrhenius equation. Results indicate that the kinetics of the co-diffusion method are unaffected by the addition of chromium to the powder pack. The results also show that significant structural degradation of the case occurs when chromizing is applied sequentially to an existing boronized case. The amount of degradation is proportional to the chromizing parameters. Microstructural evolution was studied using metallographic methods, simulation and computational calculations, and analytical techniques. While the co-diffusion process failed to enrich the substrate with chromium, significant enrichment is obtained with the sequential diffusion process. The amount of enrichment is directly proportional to the chromizing parameters with higher parameters resulting in more enrichment. The case consists of M7C3 and M23C6 carbides nearest the surface, minor amounts of CrB, and a balance of M2B. Corrosion resistance was measured with salt spray and electrochemical methods. These methods confirm the benefit of surface enrichment by chromium in the sequential diffusion method with corrosion resistance increasing directly with chromium concentration. The results also confirm the deleterious effect of surface-breaking case defects and the need to reduce or eliminate them. The best combination of microstructural integrity, mean surface hardness, effective case depth, and corrosion resistance is obtained in samples sequentially boronized and chromized at 870°C for 6hrs. Additional work is required to further optimize process parameters and case properties.
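A minimal sketch of the kinetics treatment named above: fit case-depth data to a parabolic (Fick-type) growth law d = K·sqrt(t) and extract an Arrhenius activation energy from K(T). The temperatures and depth values below are fabricated placeholders, not the dissertation's measurements, and the fit is a schematic illustration of the procedure only.

```python
import numpy as np

R = 8.314  # gas constant, J/(mol K)

def fit_growth_constant(times_h, depths_um):
    """Least-squares fit (through the origin) of parabolic growth d = K * sqrt(t)."""
    s = np.sqrt(np.asarray(times_h, float))
    d = np.asarray(depths_um, float)
    return np.sum(s * d) / np.sum(s * s)           # K in um / sqrt(h)

# Fabricated case-depth measurements at three treatment temperatures.
data = {1123: ([2, 4, 6], [40, 57, 70]),           # temperature K: (hours, depths um)
        1173: ([2, 4, 6], [62, 88, 108]),
        1223: ([2, 4, 6], [95, 134, 164])}

T = np.array(sorted(data), float)
K = np.array([fit_growth_constant(*data[int(t)]) for t in T])

# Arrhenius fit: ln K = ln K0 - Q / (R T)  ->  slope of ln K vs 1/T is -Q/R.
slope, intercept = np.polyfit(1.0 / T, np.log(K), 1)
print("Q =", -slope * R / 1000, "kJ/mol,  K0 =", np.exp(intercept), "um/sqrt(h)")
```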
Selection of Thermal Worst-Case Orbits via Modified Efficient Global Optimization
NASA Technical Reports Server (NTRS)
Moeller, Timothy M.; Wilhite, Alan W.; Liles, Kaitlin A.
2014-01-01
Efficient Global Optimization (EGO) was used to select orbits with worst-case hot and cold thermal environments for the Stratospheric Aerosol and Gas Experiment (SAGE) III. The SAGE III system thermal model changed substantially since the previous selection of worst-case orbits (which did not use the EGO method), so the selections were revised to ensure the worst cases are being captured. The EGO method consists of first conducting an initial set of parametric runs, generated with a space-filling Design of Experiments (DoE) method, then fitting a surrogate model to the data and searching for points of maximum Expected Improvement (EI) to conduct additional runs. The general EGO method was modified by using a multi-start optimizer to identify multiple new test points at each iteration. This modification facilitates parallel computing and decreases the burden of user interaction when the optimizer code is not integrated with the model. Thermal worst-case orbits for SAGE III were successfully identified and shown by direct comparison to be more severe than those identified in the previous selection. The EGO method is a useful tool for this application and can result in computational savings if the initial Design of Experiments (DoE) is selected appropriately.
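The Expected Improvement criterion at the core of EGO has a standard closed form when the surrogate returns a Gaussian prediction. A short sketch follows, assuming a minimization convention and placeholder surrogate means and standard deviations; the candidate points and values are invented and do not come from the SAGE III thermal model.

```python
import numpy as np
from scipy.stats import norm

def expected_improvement(mu, sigma, f_best):
    """Closed-form EI for minimization, given the surrogate's predicted mean mu
    and standard deviation sigma at candidate points, and the best value so far."""
    mu, sigma = np.asarray(mu, float), np.asarray(sigma, float)
    z = (f_best - mu) / np.where(sigma > 0, sigma, 1.0)
    ei = (f_best - mu) * norm.cdf(z) + sigma * norm.pdf(z)
    return np.where(sigma > 0, ei, 0.0)            # no improvement expected if sigma = 0

# Placeholder surrogate predictions at three candidate orbit parameter sets.
mu = np.array([102.0, 98.5, 100.0])                # predicted objective (arbitrary units)
sigma = np.array([1.0, 3.0, 0.0])
print(expected_improvement(mu, sigma, f_best=99.0))  # pick the candidate with largest EI
```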
Lee, Soo-Jeong; Mehler, Louise; Beckman, John; Diebolt-Brown, Brienne; Prado, Joanne; Lackovic, Michelle; Waltz, Justin; Mulay, Prakash; Schwartz, Abby; Mitchell, Yvette; Moraga-McHaley, Stephanie; Gergely, Rita
2011-01-01
Background: Pesticides are widely used in agriculture, and off-target pesticide drift exposes workers and the public to harmful chemicals. Objective: We estimated the incidence of acute illnesses from pesticide drift from outdoor agricultural applications and characterized drift exposure and illnesses. Methods: Data were obtained from the National Institute for Occupational Safety and Health's Sentinel Event Notification System for Occupational Risks–Pesticides program and the California Department of Pesticide Regulation. Drift included off-target movement of pesticide spray, volatiles, and contaminated dust. Acute illness cases were characterized by demographics, pesticide and application variables, health effects, and contributing factors. Results: From 1998 through 2006, we identified 2,945 cases associated with agricultural pesticide drift from 11 states. Our findings indicate that 47% were exposed at work, 92% experienced low-severity illness, and 14% were children (< 15 years). The annual incidence ranged from 1.39 to 5.32 per million persons over the 9-year period. The overall incidence (per million person-years) was 114.3 for agricultural workers, 0.79 for other workers, 1.56 for nonoccupational cases, and 42.2 for residents in five agriculture-intensive counties in California. Soil applications with fumigants were responsible for the largest percentage (45%) of cases. Aerial applications accounted for 24% of cases. Common factors contributing to drift cases included weather conditions, improper seal of the fumigation site, and applicator carelessness near nontarget areas. Conclusions: Agricultural workers and residents in agricultural regions had the highest rate of pesticide poisoning from drift exposure, and soil fumigations were a major hazard, causing large drift incidents. Our findings highlight areas where interventions to reduce off-target drift could be focused. PMID:21642048
Guan, Yong-ping; Zhao, Wen; Li, Shen-gang; Zhang, Guo-bin
2014-01-01
The design and construction of shallow-buried tunnels in densely populated urban areas involve many challenges. The ground movements induced by tunneling effects pose potential risks to infrastructure such as surface buildings, pipelines, and roads. In this paper, a case study of the Zhongjie subway station located in Shenyang, China, is examined to investigate the key construction techniques and the influence of the Pile-Beam-Arch (PBA) excavation method on the surrounding environment. This case study discusses the primary risk factors affecting the environmental safety and summarizes the corresponding risk mitigation measures and key techniques for subway station construction using the PBA excavation method in a densely populated urban area. PMID:25221783
CARVALHO, Suzana Papile Maciel; BRITO, Liz Magalhães; de PAIVA, Luiz Airton Saavedra; BICUDO, Lucilene Arilho Ribeiro; CROSATO, Edgard Michel; de OLIVEIRA, Rogério Nogueira
2013-01-01
Validation studies of physical anthropology methods in the different population groups are extremely important, especially in cases in which the population variations may cause problems in the identification of a native individual by the application of norms developed for different communities. Objective This study aimed to estimate the gender of skeletons by application of the method of Oliveira, et al. (1995), previously used in a population sample from Northeast Brazil. Material and Methods The accuracy of this method was assessed for a population from Southeast Brazil and validated by statistical tests. The method used two mandibular measurements, namely the bigonial distance and the mandibular ramus height. The sample was composed of 66 skulls and the method was applied by two examiners. The results were statistically analyzed by the paired t test, logistic discriminant analysis and logistic regression. Results The results demonstrated that the application of the method of Oliveira, et al. (1995) in this population achieved very different outcomes between genders, with 100% for females and only 11% for males, which may be explained by ethnic differences. However, statistical adjustment of measurement data for the population analyzed allowed accuracy of 76.47% for males and 78.13% for females, with the creation of a new discriminant formula. Conclusion It was concluded that methods involving physical anthropology present high rate of accuracy for human identification, easy application, low cost and simplicity; however, the methodologies must be validated for the different populations due to differences in ethnic patterns, which are directly related to the phenotypic aspects. In this specific case, the method of Oliveira, et al. (1995) presented good accuracy and may be used for gender estimation in Brazil in two geographic regions, namely Northeast and Southeast; however, for other regions of the country (North, Central West and South), previous methodological adjustment is recommended as demonstrated in this study. PMID:24037076
Neuromyelitis optica: Application of computer diagnostics to historical case reports.
Garcia Reitboeck, Pablo; Garrard, Peter; Peters, Timothy
2017-01-01
The retrospective diagnosis of illnesses by medical historians can often be difficult and prone to bias, although knowledge of the medical disorders of historical figures is key to the understanding of their behavior and reactions. The recent application of computer diagnostics to historical figures allows an objective differential diagnosis to be accomplished. Taking an example from clinical neurology, we analyzed the earliest reported cases of Devic's disease (neuromyelitis optica) that commonly affects the optic nerve and spinal cord and was previously often confused with multiple sclerosis. We conclude that in most identified cases the software concurred with the contemporary physicians' interpretation, but some claimed cases either had insufficient data to provide a diagnosis or other possible diagnoses were suggested that had not been considered. Computational methods may, therefore, help historians to diagnose the ailments of historical figures with greater objectivity.
NASA Astrophysics Data System (ADS)
Małoszewski, P.; Zuber, A.
1982-06-01
Three new lumped-parameter models have been developed for the interpretation of environmental radioisotope data in groundwater systems. Two of these models combine other simpler models: the piston flow model is combined either with the exponential model (exponential distribution of transit times) or with the linear model (linear distribution of transit times). The third model is based on a new solution to the dispersion equation which represents real systems more adequately than the conventional solution generally applied so far. The applicability of the models was tested by the reinterpretation of several known case studies (Modry Dul, Cheju Island, Rasche Spring and Grafendorf). It has been shown that two of these models, i.e. the exponential-piston flow model and the dispersive model, give better fits than other simpler models. Thus, the obtained values of turnover times are more reliable, whereas the additional fitting parameter gives some information about the structure of the system. In the examples considered, in spite of a lower number of fitting parameters, the new models gave practically the same fit as the multiparameter finite-state mixing-cell models. It has been shown that in the case of a constant tracer input, prior physical knowledge of the groundwater system is indispensable for determining the turnover time. The piston flow model commonly used for age determinations by the 14C method is an approximation applicable only in cases of low dispersion. In some cases the stable-isotope method aids in the interpretation of systems containing mixed waters of different ages. However, when the 14C method is used for mixed-water systems, a serious mistake may arise from neglecting the different bicarbonate contents of the particular water components.
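A compact sketch of how such a lumped-parameter model turns a tracer input record into a predicted output: the output is the convolution of the input with the model's transit-time distribution, weighted by radioactive decay. The exponential-piston flow distribution below follows the usual textbook form, and the input series and parameter values are synthetic placeholders rather than data from the cited case studies.

```python
import numpy as np

def epm_transit_time_pdf(tau, T, eta=1.2):
    """Exponential-piston flow transit-time distribution with mean transit time T;
    eta = total volume / exponential volume (eta = 1 gives the pure exponential model)."""
    g = (eta / T) * np.exp(-eta * tau / T + eta - 1.0)
    return np.where(tau >= T * (1.0 - 1.0 / eta), g, 0.0)

def predicted_output(c_in, T, half_life_yr, dt=1.0, eta=1.2):
    """Convolve a tracer input record with the transit-time distribution and decay:
    c_out(t) = sum_tau c_in(t - tau) * g(tau) * exp(-lambda * tau) * dt."""
    lam = np.log(2.0) / half_life_yr
    tau = np.arange(len(c_in)) * dt
    kernel = epm_transit_time_pdf(tau, T, eta) * np.exp(-lam * tau) * dt
    return np.convolve(c_in, kernel)[: len(c_in)]

# Synthetic tritium-like input pulse (arbitrary units), yearly steps.
c_in = np.concatenate([np.zeros(5), [10, 60, 100, 70, 40, 25],
                       15 * np.exp(-0.1 * np.arange(30))])
print(predicted_output(c_in, T=12.0, half_life_yr=12.32)[-5:])
```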
[Otomycosis and topical application of thimerosal: study of 152 cases].
Tisner, J; Millán, J; Rivas, P; Adiego, I; Castellote, A; Valles, H
1995-01-01
To evaluate the effectiveness of the topical application of thimerosal (merthiolate tincture) in mycosis involving the external auditory canal. The study included 152 patients with the clinical, otoscopic and microscopic diagnosis of otomycosis. Results were assessed 72 hours and 10 days after the application. Mycological study was performed in 83 patients, finding Aspergillus niger in 54.0% of the cases, Candida albicans in 25.4%, Aspergillus fumigatus in 15.8% and Penicillium in 4.8%. Improvement at 72 hours was found in 66.4% and at 10 days in 93.4% of the patients. Bacterial contamination was found in 6.6% of the total. In most of the patients, the otomycosis healed after cleaning of the external auditory canal and topical application of thimerosal. This method is easy to apply, fast, effective, of low cost and with few side effects.
Selker, Harry P.; Leslie, Laurel K.
2015-01-01
There is growing appreciation that process improvement holds promise for improving quality and efficiency across the translational research continuum but frameworks for such programs are not often described. The purpose of this paper is to present a framework and case examples of a Research Process Improvement Program implemented at Tufts CTSI. To promote research process improvement, we developed online training seminars, workshops, and in-person consultation models to describe core process improvement principles and methods, demonstrate the use of improvement tools, and illustrate the application of these methods in case examples. We implemented these methods, as well as relational coordination theory, with junior researchers, pilot funding awardees, our CTRC, and CTSI resource and service providers. The program focuses on capacity building to address common process problems and quality gaps that threaten the efficient, timely and successful completion of clinical and translational studies. PMID:26332869
A System to Create Training Systems
ERIC Educational Resources Information Center
Training in Business and Industry, 1971
1971-01-01
This article describes Kodak's procedure for analyzing, developing and pretesting training programs through its Marketing Education Center. Included is a case history of the application of the method which greatly affects cost effectiveness. (RR)
Calculation of two dimensional vortex/surface interference using panel methods
NASA Technical Reports Server (NTRS)
Maskew, B.
1980-01-01
The application of panel methods to the calculation of vortex/surface interference characteristics in two dimensional flow was studied over a range of situations starting with the simple case of a vortex above a plane and proceeding to the case of vortex separation from a prescribed point on a thick section. Low order and high order panel methods were examined, but the main factor influencing the accuracy of the solution was the distance between control stations in relation to the height of the vortex above the surface. Improvements over the basic solutions were demonstrated using a technique based on subpanels and an applied doublet distribution.
Arabic Supervised Learning Method Using N-Gram
ERIC Educational Resources Information Center
Sanan, Majed; Rammal, Mahmoud; Zreik, Khaldoun
2008-01-01
Purpose: Classification of Arabic documents has recently become a real problem for juridical centers. In this case, some of the Lebanese official journal documents are classified, and the center has to classify new documents based on these existing documents. This paper aims to study and explain the useful application of a supervised learning method to Arabic texts…
ERIC Educational Resources Information Center
Busse, R. T.; Elliott, Stephen N.; Kratochwill, Thomas R.
2010-01-01
The purpose of this article is to present Convergent Evidence Scaling (CES) as an emergent method for combining data from multiple assessment indicators. The CES method combines single-case assessment data by converging data gathered across multiple persons, settings, or measures, thereby providing an overall criterion-referenced outcome on which…
Characterization of Developer Application Methods Used in Fluorescent Penetrant Inspection
NASA Astrophysics Data System (ADS)
Brasche, L. J. H.; Lopez, R.; Eisenmann, D.
2006-03-01
Fluorescent penetrant inspection (FPI) is the most widely used inspection method for aviation components, seeing use in production as well as in-service inspection applications. FPI is a multiple-step process requiring attention to the process parameters of each step in order to enable a successful inspection. A multiyear program is underway to evaluate the most important factors affecting the performance of FPI, to determine whether existing industry specifications adequately address control of the process parameters, and to provide the needed engineering data to the public domain. The final step prior to the inspection is the application of developer; typical aviation inspections involve the use of dry powder (form d), usually applied with either a pressure wand or a dust storm chamber. Results from several typical dust storm chambers and wand applications have shown less than optimal performance. Measurements of indication brightness and recording of the UVA image, and in some cases formal probability of detection (POD) studies, were used to assess the developer application methods. Key conclusions and initial recommendations are provided.
Current distribution on a cylindrical antenna with parallel orientation in a lossy magnetoplasma
NASA Technical Reports Server (NTRS)
Klein, C. A.; Klock, P. W.; Deschamps, G. A.
1972-01-01
The current distribution and impedance of a thin cylindrical antenna oriented parallel to the static magnetic field of a lossy magnetoplasma are calculated with the method of moments. The electric field produced by an infinitesimal current source is first derived. Results are presented for a wide range of plasma parameters. Reasonable answers are obtained for all cases except the overdense hyperbolic case. A discussion of the numerical stability is included which applies not only to this problem but also to other applications of the method of moments.
NASA Astrophysics Data System (ADS)
Lutz, Thomas; Veissier, Lucile; Thiel, Charles W.; Woodburn, Philip J. T.; Cone, Rufus L.; Barclay, Paul E.; Tittel, Wolfgang
2016-01-01
High-quality rare-earth-ion (REI) doped materials are a prerequisite for many applications such as quantum memories, ultra-high-resolution optical spectrum analyzers and information processing. Compared to bulk materials, REI doped powders offer low-cost fabrication and a greater range of accessible material systems. Here we show that crystal properties, such as nuclear spin lifetime, are strongly affected by mechanical treatment, and that spectral hole burning can serve as a sensitive method to characterize the quality of REI doped powders. We focus on the specific case of thulium-doped yttrium aluminium garnet (Tm:YAG). Different methods for obtaining the powders are compared and the influence of annealing on the spectroscopic quality of the powders is investigated in a few examples. We conclude that annealing can reverse some detrimental effects of powder fabrication and, in certain cases, the properties of the bulk material can be reached. Our results may be applicable to other impurities and other crystals, including color centers in nano-structured diamond.
Zhang, Wei; Chen, Chuanhui; Cui, Jian; Bai, Wei; Zhou, Jing
2015-01-01
The present study explores the application of LAMP for rapid diagnosis of pathogenic bacteria in clinical sputum specimens from AECOPD patients, compared with the conventional sputum culturing method. 120 sputum specimens from AECOPD patients and 46 sputum specimens from healthy controls, as well as 166 serum specimens as negative controls, were evaluated by LAMP assay using primers for eight typical respiratory pathogens. No cross-reactivity was observed in these negative control species using the LAMP assay. The lower detection limit of the LAMP assay was approximately 10^3 copies. At least one positive bacterial species was detected in 25 cases (20.8%) by the conventional sputum culturing method, while 73 cases (60.8%) tested positive in the LAMP assay. Moreover, compared with sputum culture, the bacterial titer results of the LAMP assay were more consistent with the FEV1/FVC values of the AECOPD patients. These results indicate that the sensitivity of the LAMP assay is significantly higher than that of the sputum culturing method.
Sclerotherapy of cervical cysts with Picibanil (OK-432).
Knipping, Stephan; Goetze, Gerrit; Neumann, Kerstin; Bloching, Marc
2007-04-01
The effectiveness of intralesional sclerotherapy of lymphangiomas and ranulas with OK-432 (Picibanil) has been demonstrated in several clinical studies. The aim of our study was to review the effectiveness of sclerotherapy of benign cervical cysts with Picibanil as an alternative to surgical excision. Between March 2002 and March 2006, a prospective observational study was carried out to assess the effects of Picibanil on cervical cysts. During this period we treated 14 patients with cervical cysts through intralesional application of Picibanil at a dose of 0.01 mg/ml. So far we have used Picibanil in 13 patients, achieving a high success rate. In eight cases we observed, both clinically and ultrasonographically, a nearly complete regression of the cysts, and in three cases a complete regression. In two cases the cysts atrophied, and only residual findings could be observed. In one case we extirpated the remaining cyst. If there is no clear reaction of the cyst to the treatment, an excision is indicated 6 weeks after the injections to obtain a meaningful histological examination. No significant complication after sclerotherapy with Picibanil was observed. According to our results, the application of OK-432 (Picibanil) is a safe and effective primary method for sclerotherapy of benign cervical cysts which can replace surgical extirpation in selected cases. However, the risk of malignant disease has to be excluded before commencement of the Picibanil treatment.
ERIC Educational Resources Information Center
Paley, Blair; O'Connor, Mary J.; Baillie, Susan J.; Guiton, Gretchen; Stuber, Margaret L.
2009-01-01
Objectives: This article describes the use of fetal alcohol spectrum disorders (FASDs) as a theme to connect the learning of basic neurosciences with clinical applications across the age span within a systems-based, integrated curricular structure that emphasizes problem-based learning. Methods: In collaboration with the Centers for Disease…
Gupte, M D; Murthy, B N; Mahmood, K; Meeralakshmi, S; Nagaraju, B; Prabhakaran, R
2004-04-01
The concept of elimination of an infectious disease is different from eradication and, in a way, from control as well. In disease elimination programmes the desired reduced level of prevalence is set as the target to be achieved in a practical time frame. Elimination can be considered in the context of national or regional levels. Prevalence levels depend on the occurrence of new cases and thus could remain fluctuating. There are no ready pragmatic methods to monitor the progress of leprosy elimination programmes. We therefore tried to explore newer methods to answer these demands. With the lowering of the prevalence of leprosy to the desired level of 1 case per 10000 population at the global level, the programme administrators' concern will shift to smaller areas, e.g. national and sub-national levels. For monitoring this situation, we earlier observed that lot quality assurance sampling (LQAS), a quality control tool in industry, was useful in the initially high endemic areas. However, critical factors such as the geographical distribution of cases and the adoption of a cluster sampling design instead of a simple random sampling design deserve attention before LQAS can generally be recommended. The present exercise was aimed at validating the applicability of LQAS, with these modifications, for monitoring leprosy elimination in Tamil Nadu state, which was highly endemic for leprosy. A representative sample of 64000 people drawn from eight districts of Tamil Nadu state, India, with a maximum allowable number of 25 cases, was considered, using the LQAS methodology to test whether leprosy prevalence was at or below 7 per 10000 population. The expected number of cases for each district was obtained assuming a Poisson distribution. Goodness of fit between the observed and expected cases (closeness of the expected number of cases to those observed) was tested with the chi-square test. The enhancing factor (design effect) for the sample size was obtained by computing the intraclass correlation. The survey actually covered a population of 62157 individuals, of whom 56469 (90.8%) were examined. Ninety-six cases were detected, and this number far exceeded the critical value of 25. The number of cases for each district and the number of cases in the entire surveyed area both followed a Poisson distribution. The intraclass correlation coefficients were close to zero and the design effect was observed to be close to one. Based on the LQAS exercise, leprosy prevalence in the state of Tamil Nadu in India was above 7 per 10000. The LQAS method using clusters was validated for monitoring leprosy elimination in high endemic areas. The use of cluster sampling makes this method further useful as a rapid assessment procedure. This method needs to be tested for its applicability in moderate and low endemic areas, where the sample size may need to be increased. It is further possible to consider LQAS as a monitoring tool for elimination programmes with respect to other disease conditions.
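A minimal sketch, assuming a Poisson model as described in the abstract, of how an LQAS-style decision rule can be computed: the expected case count at the threshold prevalence gives a critical value at or below which the observed count would support the elimination claim. The error level alpha below is illustrative, not the one used in the study; the sample size, threshold, and observed count are taken from the abstract.

```python
from scipy.stats import poisson

def lqas_critical_value(n_examined, threshold_per_10000, alpha=0.002):
    """Approximate critical case count d: seeing <= d cases would be unlikely
    (probability about alpha) if prevalence were still at the threshold."""
    expected = n_examined * threshold_per_10000 / 10000.0
    d = int(poisson.ppf(alpha, expected))
    return expected, d

expected, d = lqas_critical_value(64000, 7)     # planned sample and threshold from the abstract
print(expected, d)                               # ~44.8 expected cases, d near 25
observed = 96                                    # cases actually found in the survey
print("conclude prevalence at/below 7 per 10,000:", observed <= d)
```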
NASA Astrophysics Data System (ADS)
Yuen, Kevin Kam Fung
2009-10-01
The most appropriate prioritization method is still one of the unsettled issues of the Analytic Hierarchy Process, although many studies have been conducted and applied. Interestingly, many AHP applications use only Saaty's Eigenvector method, even though many studies have found that this method may produce rank reversals and have proposed various prioritization methods as alternatives. Some of these methods have been proved to be better than the Eigenvector method; however, they seem not to attract the attention of researchers. In this paper, eight important prioritization methods are reviewed. A Mixed Prioritization Operators Strategy (MPOS) is developed to select a vector that is prioritized by the most appropriate prioritization operator. To verify this new method, a case study of high school selection is revised using the proposed method. The contribution is that MPOS is useful for solving prioritization problems in the AHP.
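For orientation, a small sketch of Saaty's eigenvector prioritization (the operator the abstract says most applications rely on), not of MPOS itself; the pairwise comparison matrix is invented.

```python
import numpy as np

def eigenvector_priorities(A):
    """Saaty's eigenvector method: the priority vector is the principal right
    eigenvector of the pairwise comparison matrix, normalized to sum to 1."""
    w, v = np.linalg.eig(A)
    k = np.argmax(w.real)                  # principal eigenvalue
    p = np.abs(v[:, k].real)
    return p / p.sum(), w[k].real

# Hypothetical 3-criterion reciprocal comparison matrix on the 1-9 scale
A = np.array([[1.0, 3.0, 5.0],
              [1/3, 1.0, 2.0],
              [1/5, 1/2, 1.0]])
priorities, lam_max = eigenvector_priorities(A)
CI = (lam_max - len(A)) / (len(A) - 1)     # consistency index
print(priorities, lam_max, CI)
```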
Qualitative case study methodology in nursing research: an integrative review.
Anthony, Susan; Jack, Susan
2009-06-01
This paper is a report of an integrative review conducted to critically analyse the contemporary use of qualitative case study methodology in nursing research. Increasing complexity in health care and increasing use of case study in nursing research support the need for current examination of this methodology. In 2007, a search for case study research (published 2005-2007) indexed in the CINAHL, MEDLINE, EMBASE, PsychINFO, Sociological Abstracts and SCOPUS databases was conducted. A sample of 42 case study research papers met the inclusion criteria. Whittemore and Knafl's integrative review method guided the analysis. Confusion exists about the name, nature and use of case study. This methodology, including terminology and concepts, is often invisible in qualitative study titles and abstracts. Case study is an exclusive methodology and an adjunct to exploring particular aspects of phenomena under investigation in larger or mixed-methods studies. A high quality of case study exists in nursing research. Judicious selection and diligent application of literature review methods promote the development of nursing science. Case study is becoming entrenched in the nursing research lexicon as a well-accepted methodology for studying phenomena in health and social care, and its growing use warrants continued appraisal to promote nursing knowledge development. Attention to all case study elements, process and publication is important in promoting authenticity, methodological quality and visibility.
Laser Doppler velocimetry primer
NASA Technical Reports Server (NTRS)
Bachalo, William D.
1985-01-01
Advanced research in experimental fluid dynamics requires familiarity with sophisticated measurement techniques. In some cases, the development and application of new techniques is required for difficult measurements. Optical methods, and in particular the laser Doppler velocimeter (LDV), are now recognized as the most reliable means for performing measurements in complex turbulent flows. As such, the experimental fluid dynamicist should be familiar with the principles of operation of the method and the details associated with its application. Thus, the goals of this primer are to efficiently transmit the basic concepts of the LDV method to potential users and to provide references that describe the specific areas in greater detail.
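For reference, a hedged sketch of the standard dual-beam (fringe-model) LDV relation, u = f_D * lambda / (2*sin(theta/2)), where theta is the full beam intersection angle; the wavelength, beam angle, and Doppler frequency below are illustrative and are not taken from the primer.

```python
import math

def ldv_velocity(doppler_freq_hz, wavelength_m, full_beam_angle_rad):
    """Dual-beam fringe model: fringe spacing d_f = lambda / (2*sin(theta/2)),
    velocity u = f_D * d_f."""
    fringe_spacing = wavelength_m / (2.0 * math.sin(full_beam_angle_rad / 2.0))
    return doppler_freq_hz * fringe_spacing

# Illustrative numbers: 514.5 nm argon-ion line, 5.6 deg beam crossing angle, 1 MHz Doppler burst
u = ldv_velocity(1.0e6, 514.5e-9, math.radians(5.6))
print(f"velocity ~ {u:.2f} m/s")
```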
Martinez, Ramon; Ordunez, Pedro; Soliz, Patricia N; Ballesteros, Michael F
2016-01-01
Background The complexity of current injury-related health issues demands the usage of diverse and massive data sets for comprehensive analyses, and the application of novel methods to communicate data effectively to the public health community, decision-makers and the public. Recent advances in information visualisation, the availability of new visual analytic methods and tools, and progress in information technology provide an opportunity for shaping the next generation of injury surveillance. Objective To introduce data visualisation conceptual bases, and propose a visual analytic and visualisation platform in public health surveillance for injury prevention and control. Methods The paper introduces data visualisation conceptual bases, describes a visual analytic and visualisation platform, and presents two real-world case studies illustrating their application in public health surveillance for injury prevention and control. Results Application of the visual analytic and visualisation platform is presented as a solution to improve access to heterogeneous data sources, enhance data exploration and analysis, communicate data effectively, and support decision-making. Conclusions Applications of data visualisation concepts and a visual analytic platform could play a key role in shaping the next generation of injury surveillance. A visual analytic and visualisation platform could improve data use, analytic capacity, and the ability to effectively communicate findings and key messages. The public health surveillance community is encouraged to identify opportunities to develop and expand its use in injury prevention and control. PMID:26728006
NASA Astrophysics Data System (ADS)
Kasim, Maznah Mat; Abdullah, Siti Rohana Goh
2014-07-01
Many averaging methods are available to aggregate a set of numbers into a single number. However, these methods do not consider the interdependencies between the criteria underlying the numbers. This paper highlights the Choquet Integral method as an alternative aggregation method in which estimates of the interdependencies between the criteria are incorporated into the aggregation process. The interdependency values can be estimated by using the lambda fuzzy measure method. By considering the interdependencies or interactions between the criteria, the resulting aggregated values are more meaningful compared to the ones obtained by normal averaging methods. The application of the Choquet Integral is illustrated in a case study of finding the overall academic achievement of year six pupils in a selected primary school in a northern state of Malaysia.
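A minimal sketch of the discrete Choquet integral with a lambda-fuzzy (Sugeno) measure built from per-criterion densities; the pupil scores and densities are hypothetical, and the implementation is illustrative rather than the paper's.

```python
import numpy as np
from scipy.optimize import brentq

def sugeno_lambda(densities):
    """Solve prod(1 + lam*g_i) = 1 + lam for the lambda-fuzzy measure parameter."""
    f = lambda lam: np.prod(1.0 + lam * np.asarray(densities)) - (1.0 + lam)
    if abs(sum(densities) - 1.0) < 1e-12:
        return 0.0                                   # additive case
    # root lies in (-1, 0) if densities sum > 1, in (0, inf) if they sum < 1
    return brentq(f, -1 + 1e-9, -1e-9) if sum(densities) > 1 else brentq(f, 1e-9, 1e6)

def fuzzy_measure(subset, densities, lam):
    g = np.asarray([densities[i] for i in subset])
    if len(g) == 0:
        return 0.0
    if abs(lam) < 1e-12:
        return float(g.sum())
    return (np.prod(1.0 + lam * g) - 1.0) / lam

def choquet(scores, densities):
    """Discrete Choquet integral of criterion scores w.r.t. the lambda-fuzzy measure."""
    lam = sugeno_lambda(densities)
    order = np.argsort(scores)                       # ascending scores
    total, prev = 0.0, 0.0
    for k, i in enumerate(order):
        coalition = list(order[k:])                  # criteria with score >= current one
        total += (scores[i] - prev) * fuzzy_measure(coalition, densities, lam)
        prev = scores[i]
    return total

# Hypothetical pupil scores on three subjects and criterion importances (densities)
print(choquet([0.7, 0.9, 0.6], [0.4, 0.35, 0.45]))
```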
Application of abstract harmonic analysis to the high-speed recognition of images
NASA Technical Reports Server (NTRS)
Usikov, D. A.
1979-01-01
Methods are constructed for rapidly computing correlation functions using the theory of abstract harmonic analysis. The theory developed includes as a particular case the familiar Fourier transform method for a correlation function which makes it possible to find images which are independent of their translation in the plane. Two examples of the application of the general theory described are the search for images, independent of their rotation and scale, and the search for images which are independent of their translations and rotations in the plane.
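As a concrete instance of the translation-invariant particular case mentioned above, a short sketch of correlation via the FFT: the peak of the circular cross-correlation locates a template regardless of its translation in the plane. The image and template are synthetic.

```python
import numpy as np

def cross_correlate_fft(image, template):
    """Circular cross-correlation via the FFT; the peak location gives the
    translation at which the template best matches the image."""
    F_img = np.fft.fft2(image)
    F_tpl = np.fft.fft2(template, s=image.shape)
    return np.fft.ifft2(F_img * np.conj(F_tpl)).real

rng = np.random.default_rng(0)
image = rng.random((64, 64))
template = image[20:30, 12:22].copy()           # patch taken from a known position
corr = cross_correlate_fft(image, template)
peak = np.unravel_index(np.argmax(corr), corr.shape)
print(peak)                                      # expected near (20, 12)
```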
[Methods of artificial intelligence: a new trend in pharmacy].
Dohnal, V; Kuca, K; Jun, D
2005-07-01
Artificial neural networks (ANN) and genetic algorithms are one group of methods called artificial intelligence. The application of ANN on pharmaceutical data can lead to an understanding of the inner structure of data and a possibility to build a model (adaptation). In addition, for certain cases it is possible to extract rules from data. The adapted ANN is prepared for the prediction of properties of compounds which were not used in the adaptation phase. The applications of ANN have great potential in pharmaceutical industry and in the interpretation of analytical, pharmacokinetic or toxicological data.
NASA Astrophysics Data System (ADS)
Jiang, Ching-Fen; Wang, Chih-Yu; Chiang, Chun-Ping
2011-07-01
Optoelectronics techniques to induce protoporphyrin IX fluorescence with topically applied 5-aminolevulinic acid on the oral mucosa have been developed to noninvasively detect oral cancer. Fluorescence imaging enables wide-area screening for oral premalignancy, but the lack of an adequate fluorescence enhancement method restricts the clinical imaging application of these techniques. This study aimed to develop a reliable fluorescence enhancement method to improve PpIX fluorescence imaging systems for oral cancer detection. Three contrast features, red-green-blue reflectance difference, R/B ratio, and R/G ratio, were developed first based on the optical properties of the fluorescence images. A comparative study was then carried out with one negative control and four biopsy confirmed clinical cases to validate the optimal image processing method for the detection of the distribution of malignancy. The results showed the superiority of the R/G ratio in terms of yielding a better contrast between normal and neoplastic tissue, and this method was less prone to errors in detection. Quantitative comparison with the clinical diagnoses in the four neoplastic cases showed that the regions of premalignancy obtained using the proposed method accorded with the expert's determination, suggesting the potential clinical application of this method for the detection of oral cancer.
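A hedged sketch of the R/G ratio feature described above, computed pixel-wise from an RGB image; the threshold and the synthetic image are illustrative and not taken from the study.

```python
import numpy as np

def rg_ratio_map(rgb, eps=1e-6):
    """Pixel-wise red/green ratio; higher values flag PpIX-like red fluorescence."""
    rgb = rgb.astype(float)
    return rgb[..., 0] / (rgb[..., 1] + eps)

def suspicious_mask(rgb, threshold=1.5):
    """Very rough segmentation: mark pixels whose R/G ratio exceeds a threshold.
    The threshold is purely illustrative, not derived from the study."""
    return rg_ratio_map(rgb) > threshold

# Synthetic 4x4 "image": one quadrant with an elevated red channel
img = np.full((4, 4, 3), 80, dtype=np.uint8)
img[:2, :2, 0] = 200
print(suspicious_mask(img))
```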
APPLICATION OF ISOTOPE ENCEPHALOGRAPHY AND ELECTROENCEPHALOSCOPY FOR LOCALIZATION OF BRAIN TUMOURS
DOE Office of Scientific and Technical Information (OSTI.GOV)
Shamov, V.N.; Badmayev, C.N.; Bekhtereva, N.P.
1959-10-31
The problems of diagnosis and localization of brain tumors in some cases present many difficulties and make the neurosurgeon seek additional methods of investigation. In such circumstances the use of the tracer technique in diagnostics is of considerable help, as it has obvious advantages over other methods of investigation, such as safety, painlessness, absence of trauma, absence of undesirable after-effects, accuracy, and relative simplicity. The present communication is based on the results of clinical observations on 150 patients with verified brain tumors. Analyses of the data show that the accuracy of brain tumor localization varies, depending upon the depth of the tumor site and the concentration of labelled material in the area of tumor growth. The diagnostic value of the method is doubtful in cases of tumors of the posterior fossa, the base of the brain, or midline lesions. The application of isotope encephalography is successfully supplemented by a new method of investigation, electroencephaloscopy, which allows the localization of deeply set tumors. Possibilities and limitations of the method are discussed. It is concluded that isotope encephalography and electroencephaloscopy represent very valuable diagnostic methods which, alongside other auxiliary methods, are widely used in the diagnosis of brain tumors. (C.H.)
Determination of wall shear stress from mean velocity and Reynolds shear stress profiles
NASA Astrophysics Data System (ADS)
Volino, Ralph J.; Schultz, Michael P.
2018-03-01
An analytical method is presented for determining the Reynolds shear stress profile in steady, two-dimensional wall-bounded flows using the mean streamwise velocity. The method is then utilized with experimental data to determine the local wall shear stress. The procedure is applicable to flows on smooth and rough surfaces with arbitrary pressure gradients. It is based on the streamwise component of the boundary layer momentum equation, which is transformed into inner coordinates. The method requires velocity profiles from at least two streamwise locations, but the formulation of the momentum equation reduces the dependence on streamwise gradients. The method is verified through application to laminar flow solutions and turbulent DNS results from both zero and nonzero pressure gradient boundary layers. With strong favorable pressure gradients, the method is shown to be accurate for finding the wall shear stress in cases where the Clauser fit technique loses accuracy. The method is then applied to experimental data from the literature from zero pressure gradient studies on smooth and rough walls, and favorable and adverse pressure gradient cases on smooth walls. Data from very near the wall are not required for determination of the wall shear stress. Wall friction velocities obtained using the present method agree with those determined in the original studies, typically to within 2%.
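Not the authors' profile-based formulation, but a related illustration: the classical zero-pressure-gradient momentum-integral estimate tau_w ~ rho * U_inf^2 * d(theta)/dx, which likewise needs velocity profiles at two streamwise stations. The profile shapes and numbers below are invented.

```python
import numpy as np

def momentum_thickness(y, u, U_inf):
    """theta = integral of (u/U)(1 - u/U) dy."""
    r = u / U_inf
    return np.trapz(r * (1.0 - r), y)

def wall_shear_momentum_integral(y1, u1, y2, u2, U_inf, dx, rho=1.2):
    """Zero-pressure-gradient momentum integral: tau_w ~ rho * U_inf^2 * d(theta)/dx."""
    th1 = momentum_thickness(y1, u1, U_inf)
    th2 = momentum_thickness(y2, u2, U_inf)
    return rho * U_inf**2 * (th2 - th1) / dx

# Illustrative 1/7th-power-law profiles at two stations with slightly different thickness
U, d1, d2 = 10.0, 0.010, 0.011
y1 = np.linspace(1e-5, d1, 200); y2 = np.linspace(1e-5, d2, 200)
u1 = U * (y1 / d1)**(1/7);       u2 = U * (y2 / d2)**(1/7)
print(wall_shear_momentum_integral(y1, u1, y2, u2, U, dx=0.05))
```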
Dung, Van Than; Tjahjowidodo, Tegoeh
2017-01-01
B-spline functions are widely used in many industrial applications such as computer graphic representations, computer aided design, computer aided manufacturing, computer numerical control, etc. Recently, demands have arisen, e.g. in the reverse engineering (RE) area, to employ B-spline curves for non-trivial cases that include curves with discontinuous points, cusps or turning points in the sampled data. The most challenging task in these cases is the identification of the number of knots and their respective locations in non-uniform space at the lowest computational cost. This paper presents a new strategy for fitting any form of curve by B-spline functions via a local algorithm. A new two-step method for fast knot calculation is proposed. In the first step, the data are split using a bisecting method with a predetermined allowable error to obtain coarse knots. Secondly, the knots are optimized, for both locations and continuity levels, by employing a non-linear least squares technique. The B-spline function is, therefore, obtained by solving an ordinary least squares problem. The performance of the proposed method is validated using various numerical experimental data, with and without simulated noise, generated by a B-spline function and by deterministic parametric functions. This paper also discusses benchmarking of the proposed method against existing methods in the literature. The proposed method is shown to be able to reconstruct B-spline functions from sampled data within acceptable tolerance. It is also shown that the proposed method can be applied to fit any type of curve, ranging from smooth to discontinuous ones. In addition, the method does not require excessive computational cost, which allows it to be used in automatic reverse engineering applications.
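A simplified stand-in for the two-step idea, assuming SciPy's least-squares spline: interior knots are added by bisecting the span containing the worst-fit point until a tolerance is met (the paper additionally optimizes knot locations and continuity levels, which is not reproduced here). The test curve is synthetic.

```python
import numpy as np
from scipy.interpolate import LSQUnivariateSpline

def fit_with_bisection_knots(x, y, tol=1e-2, max_knots=30, degree=3):
    """Greedy knot insertion: repeatedly fit a least-squares cubic B-spline and
    bisect the knot span that contains the point with the largest residual."""
    knots = []                                         # interior knots
    while True:
        spl = LSQUnivariateSpline(x, y, sorted(knots), k=degree)
        err = np.abs(spl(x) - y)
        if err.max() < tol or len(knots) >= max_knots:
            return spl, sorted(knots)
        i = int(np.argmax(err))                        # worst-fit data point
        lower = [t for t in knots if t < x[i]] or [x[0]]
        upper = [t for t in knots if t > x[i]] or [x[-1]]
        knots.append(0.5 * (max(lower) + min(upper)))  # bisect the enclosing span

x = np.linspace(0, 1, 400)
y = np.sin(2 * np.pi * x) + 0.3 * np.abs(x - 0.5)      # smooth curve with a kink
spl, knots = fit_with_bisection_knots(x, y)
print(len(knots), float(np.abs(spl(x) - y).max()))
```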
CLustre: semi-automated lineament clustering for palaeo-glacial reconstruction
NASA Astrophysics Data System (ADS)
Smith, Mike; Anders, Niels; Keesstra, Saskia
2016-04-01
Palaeo-glacial reconstructions, or "inversions", using evidence from the palimpsest landscape are increasingly being undertaken with larger and larger databases. Predominant in landform evidence is the lineament (or drumlin), where the biggest datasets number in excess of 50,000 individual forms. One stage in the inversion process requires the identification of lineaments that are generically similar and their subsequent interpretation into a coherent chronology of events. Here we present CLustre, a semi-automated algorithm that clusters lineaments using a locally adaptive, region-growing method. This is initially tested using 1,500 model runs on a synthetic dataset, before application to two case studies (where manual clustering has been undertaken by independent researchers): (1) Dubawnt Lake, Canada and (2) Victoria Island, Canada. Results using the synthetic data show that classifications are robust in most scenarios, although specific cases of cross-cutting lineaments may lead to incorrect clusters. Application to the case studies showed a very good match to existing published work, with differences related to limited numbers of unclassified lineaments and parallel cross-cutting lineaments. The value in CLustre comes from the semi-automated, objective application of a classification method that is repeatable. Once classified, summary statistics of lineament groups can be calculated and then used in the inversion.
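A hedged sketch of a locally adaptive, region-growing clustering of lineaments by spatial proximity and azimuth similarity; this is not the CLustre algorithm, and the radius and angle thresholds are invented.

```python
import numpy as np
from scipy.spatial import cKDTree

def grow_clusters(points, azimuths_deg, radius=2000.0, max_angle=15.0):
    """Greedy region growing: a lineament joins a cluster if it lies within `radius`
    of a cluster member and its azimuth differs by less than `max_angle`
    (angles treated modulo 180 degrees). Parameters are illustrative."""
    tree = cKDTree(points)
    labels = -np.ones(len(points), dtype=int)
    current = 0
    for seed in range(len(points)):
        if labels[seed] != -1:
            continue
        labels[seed] = current
        frontier = [seed]
        while frontier:
            i = frontier.pop()
            for j in tree.query_ball_point(points[i], radius):
                if labels[j] != -1:
                    continue
                diff = abs(azimuths_deg[i] - azimuths_deg[j]) % 180.0
                if min(diff, 180.0 - diff) < max_angle:
                    labels[j] = current
                    frontier.append(j)
        current += 1
    return labels

# Two synthetic lineament swarms with different locations and orientations
rng = np.random.default_rng(1)
pts = np.vstack([rng.normal([0, 0], 500, (50, 2)), rng.normal([10000, 0], 500, (50, 2))])
azi = np.concatenate([rng.normal(30, 3, 50), rng.normal(120, 3, 50)])
print(np.unique(grow_clusters(pts, azi)))
```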
ERIC Educational Resources Information Center
Gist, Peter; Langley, David
2007-01-01
PRINCE2, which stands for Projects in Controlled Environments, is a project management method covering the organisation, management, and control of projects and is widely used in both government and commercial IT and building projects in the UK. This paper describes the application of PRINCE2 to the management of large clinical trials…
ERIC Educational Resources Information Center
Chen, Lih-Shyang; Cheng, Yuh-Ming; Weng, Sheng-Feng; Chen, Yong-Guo; Lin, Chyi-Her
2009-01-01
The prevalence of Internet applications nowadays has led many medical schools and centers to incorporate computerized Problem-Based Learning (PBL) methods into their training curricula. However, many of these PBL systems do not truly reflect the situations which practitioners may actually encounter in a real medical environment, and hence their…
Carvalho, Suzana Papile Maciel; Brito, Liz Magalhães; Paiva, Luiz Airton Saavedra de; Bicudo, Lucilene Arilho Ribeiro; Crosato, Edgard Michel; Oliveira, Rogério Nogueira de
2013-01-01
Validation studies of physical anthropology methods in the different population groups are extremely important, especially in cases in which the population variations may cause problems in the identification of a native individual by the application of norms developed for different communities. This study aimed to estimate the gender of skeletons by application of the method of Oliveira, et al. (1995), previously used in a population sample from Northeast Brazil. The accuracy of this method was assessed for a population from Southeast Brazil and validated by statistical tests. The method used two mandibular measurements, namely the bigonial distance and the mandibular ramus height. The sample was composed of 66 skulls and the method was applied by two examiners. The results were statistically analyzed by the paired t test, logistic discriminant analysis and logistic regression. The results demonstrated that the application of the method of Oliveira, et al. (1995) in this population achieved very different outcomes between genders, with 100% for females and only 11% for males, which may be explained by ethnic differences. However, statistical adjustment of measurement data for the population analyzed allowed accuracy of 76.47% for males and 78.13% for females, with the creation of a new discriminant formula. It was concluded that methods involving physical anthropology present high rate of accuracy for human identification, easy application, low cost and simplicity; however, the methodologies must be validated for the different populations due to differences in ethnic patterns, which are directly related to the phenotypic aspects. In this specific case, the method of Oliveira, et al. (1995) presented good accuracy and may be used for gender estimation in Brazil in two geographic regions, namely Northeast and Southeast; however, for other regions of the country (North, Central West and South), previous methodological adjustment is recommended as demonstrated in this study.
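As an illustration of deriving a logistic discriminant from the two mandibular measurements, a hedged sketch on synthetic data; the group means, spreads, and resulting coefficients are invented and do not reproduce the study's adjusted formula.

```python
import numpy as np
from sklearn.linear_model import LogisticRegression

# Synthetic bigonial distance (mm) and mandibular ramus height (mm) for two groups;
# the means and spreads are invented for illustration, not taken from the study.
rng = np.random.default_rng(0)
males   = np.column_stack([rng.normal(100, 5, 60), rng.normal(65, 4, 60)])
females = np.column_stack([rng.normal( 92, 5, 60), rng.normal(58, 4, 60)])
X = np.vstack([males, females])
y = np.array([1] * 60 + [0] * 60)        # 1 = male, 0 = female

clf = LogisticRegression().fit(X, y)
print("coefficients:", clf.coef_, "intercept:", clf.intercept_)
print("resubstitution accuracy:", clf.score(X, y))
print("predicted sex for (96 mm, 61 mm):", clf.predict([[96.0, 61.0]]))
```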
Comparative study: TQ and Lean Production ownership models in health services
Eiro, Natalia Yuri; Torres-Junior, Alvair Silveira
2015-01-01
Objective: to compare the application of Total Quality (TQ) models used in the processes of a health service with cases of lean healthcare and with literature from another institution that has also applied this model. Method: this is qualitative research conducted through a descriptive case study. Results: through critical analysis of the institutions studied it was possible to compare the traditional quality approach observed in one case with the theoretical and practical lean production approach used in another case, as described below. Conclusion: the research identified that the lean model was better suited to people who work systemically and generate flow. It also pointed towards some potential challenges in the introduction and implementation of lean methods in health. PMID:26487134
The Worst-Case Weighted Multi-Objective Game with an Application to Supply Chain Competitions.
Qu, Shaojian; Ji, Ying
2016-01-01
In this paper, we propose a worst-case weighted approach to the multi-objective n-person non-zero sum game model where each player has more than one competing objective. Our "worst-case weighted multi-objective game" model supposes that each player has a set of weights on its objectives and wishes to minimize its maximum weighted-sum objective, where the maximization is with respect to the set of weights. This new model gives rise to a new Pareto Nash equilibrium concept, which we call "robust-weighted Nash equilibrium". We prove that the robust-weighted Nash equilibria are guaranteed to exist even when the weight sets are unbounded. For the worst-case weighted multi-objective game with the weight sets of the players all given as polytopes, we show that a robust-weighted Nash equilibrium can be obtained by solving a mathematical program with equilibrium constraints (MPEC). For an application, we illustrate the usefulness of the worst-case weighted multi-objective game on a supply chain risk management problem under demand uncertainty. Compared with the existing weighted approach, we show that our method is more robust and can be used more efficiently for real-world applications.
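A toy illustration, for a single decision maker, of the worst-case weighted objective described above: minimize, over the decision, the maximum weighted sum of objectives over a finite weight set. The objectives and weight vertices are invented; the full equilibrium computation via an MPEC is not attempted here.

```python
import numpy as np
from scipy.optimize import minimize

# Two competing objectives for a single decision variable x (illustrative only)
objectives = [lambda x: (x - 1.0) ** 2,        # e.g. cost
              lambda x: (x + 2.0) ** 2]        # e.g. risk

# Finite (polytope-vertex) weight set; the player guards against the worst weighting
weight_set = [np.array([0.8, 0.2]), np.array([0.3, 0.7])]

def worst_case_weighted(x):
    vals = np.array([f(x[0]) for f in objectives])
    return max(w @ vals for w in weight_set)

res = minimize(worst_case_weighted, x0=[0.0], method="Nelder-Mead")
print(res.x, res.fun)
```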
Case analysis online: a strategic management case model for the health industry.
Walsh, Anne; Bearden, Eithne
2004-01-01
Despite the plethora of methods and tools available to support strategic management, the challenge for health executives in the next century will relate to their ability to access and interpret data from multiple and intricate communication networks. Integrated digital networks and satellite systems will expand the scope and ease of sharing information between business divisions, and networked systems will facilitate the use of virtual case discussions across universities. While the internet is frequently used to support clinical decisions in the healthcare industry, few executives rely upon the internet for strategic analysis. Although electronic technologies can easily synthesize data from multiple information channels, research as well as technical issues may deter their application in strategic analysis. As digital models transform access to information, online models may become increasingly relevant in designing strategic solutions. While there are various pedagogical models available to support the strategic management process, this framework was designed to enhance strategic analysis through the application of technology and electronic research. A strategic analysis framework, which incorporated internet research and case analysis in a strategic management course, is described along with design and application issues that emerged during the case analysis process.
Earl, David J; Deem, Michael W
2005-04-14
Adaptive Monte Carlo methods can be viewed as implementations of Markov chains with infinite memory. We derive a general condition for the convergence of a Monte Carlo method whose history dependence is contained within the simulated density distribution. In convergent cases, our result implies that the balance condition need only be satisfied asymptotically. As an example, we show that the adaptive integration method converges.
Surface Coating of Oxide Powders: A New Synthesis Method to Process Biomedical Grade Nano-Composites
Palmero, Paola; Montanaro, Laura; Reveron, Helen; Chevalier, Jérôme
2014-01-01
Composite and nanocomposite ceramics have achieved special interest in recent years when used for biomedical applications. They have demonstrated, in some cases, increased performance, reliability, and stability in vivo, with respect to pure monolithic ceramics. Current research aims at developing new compositions and architectures to further increase their properties. However, the ability to tailor the microstructure requires the careful control of all steps of manufacturing, from the synthesis of composite nanopowders, to their processing and sintering. This review aims at deepening understanding of the critical issues associated with the manufacturing of nanocomposite ceramics, focusing on the key role of the synthesis methods to develop homogeneous and tailored microstructures. In this frame, the authors have developed an innovative method, named “surface-coating process”, in which matrix oxide powders are coated with inorganic precursors of the second phase. The method is illustrated into two case studies; the former, on Zirconia Toughened Alumina (ZTA) materials for orthopedic applications, and the latter, on Zirconia-based composites for dental implants, discussing the advances and the potential of the method, which can become a valuable alternative to the current synthesis process already used at a clinical and industrial scale. PMID:28788117
45 CFR 302.56 - Guidelines for setting child support awards.
Code of Federal Regulations, 2010 CFR
2010-10-01
... earnings and income of the noncustodial parent; (2) Be based on specific descriptive and numeric criteria... children and analyze case data, gathered through sampling or other methods, on the application of, and...
Karataş, Abdullah
2017-01-01
Objective Intranasal steroid sprays (INSS) are frequently prescribed for treating inferior turbinate hypertrophy (ITH). Complications due to the long-term application of INSS such as crusting, epistaxis, nasal mucosa dryness, and septal perforation may occur. Early prediction of which patients would benefit from INSS might lower treatment costs and complication rates. We examined the predictive value of nasal decongestant response rates for the outcomes of INSS in ITH. Methods Fifty patients with bilateral ITH were divided into two groups: patients benefiting from INSS and those not benefiting. Nasal airflow was assessed by peak nasal inspiratory flow (PNIF) measurement in all cases. Measurements were taken three times: before and after the application of nasal decongestant sprays and after the application of INSS. Results In both groups, the nasal air flow rates significantly increased after the application of nasal decongestant sprays; however, the nasal decongestant response rates were higher in the group of patients benefiting from INSS. There was a strong correlation between the nasal air flow rates measured after the application of nasal decongestant sprays and after the application of INSS. The cut-off value for the relationship between increased nasal air flow rates after the application of nasal decongestant sprays and outcomes of INSS was 23%. Conclusion Measurement of the nasal airflow increase rate after the application of nasal decongestant sprays is a simple and easy method for the early prediction of the outcomes of INSS in ITH. A higher than 23% increase in nasal air flow rates after the application of nasal decongestant sprays indicates much better outcomes of INSS for patients. PMID:29392066
NASA Astrophysics Data System (ADS)
Miralles-Wilhelm, F.; Serrat-Capdevila, A.; Rodriguez, D.
2017-12-01
This research is focused on the development of remote sensing methods to assess surface water pollution issues, particularly in multipurpose reservoirs. Three case study applications are presented to comparatively analyze remote sensing techniques for the detection of nutrient-related pollution, i.e., Nitrogen, Phosphorus, Chlorophyll, as this is a major water quality issue that has been identified in terms of pollution of major water sources around the country. This assessment will contribute to a better understanding of options for nutrient remote sensing capabilities and needs, and will assist water agencies in identifying the appropriate remote sensing tools and in devising an application strategy to provide the information needed to support decision-making regarding the targeting and monitoring of nutrient pollution prevention and mitigation measures. A detailed review of the water quality data available from ground-based measurements was conducted in order to determine their suitability for a case study application of remote sensing. In the first case study, the Valle de Bravo reservoir, which supplies Mexico City, offers a larger database of water quality which may be used to better calibrate and validate the algorithms required to obtain water quality data from remote sensing raw data. In the second case study application, the relatively data-scarce Lake Toba in Indonesia can be useful to illustrate the value added of remote sensing data in locations where water quality data are deficient or nonexistent. The third case study, in the Paso Severino reservoir in Uruguay, offers a combination of data scarcity and persistent development of harmful algal blooms. Landsat-TM data were obtained for the three study sites, and algorithms for three key water quality parameters related to nutrient pollution (Chlorophyll-a, Total Nitrogen, and Total Phosphorus) were calibrated and validated at the study sites. The three case study applications were developed into capacity building/training workshops for water resources students, applied scientists, practitioners, reservoir and water quality managers, and other interested stakeholders.
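A hedged sketch of the kind of empirical band-ratio regression often used to relate Landsat TM reflectances to chlorophyll-a; the calibration points and functional form below are illustrative and are not the study's calibrated algorithms.

```python
import numpy as np

# Invented in-situ calibration points: blue/green TM band ratio vs chlorophyll-a (ug/L)
band_ratio = np.array([0.55, 0.70, 0.85, 1.00, 1.20, 1.40])
chl_a      = np.array([38.0, 22.0, 14.0,  9.0,  5.5,  3.2])

# Fit log(chl) = a + b * ratio, a common empirical form for band-ratio algorithms
b, a = np.polyfit(band_ratio, np.log(chl_a), 1)

def predict_chl(ratio):
    """Apply the fitted empirical relation to a new pixel's band ratio."""
    return np.exp(a + b * ratio)

print(predict_chl(0.9))
```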
On the precision of quasi steady state assumptions in stochastic dynamics
NASA Astrophysics Data System (ADS)
Agarwal, Animesh; Adams, Rhys; Castellani, Gastone C.; Shouval, Harel Z.
2012-07-01
Many biochemical networks have complex multidimensional dynamics and there is a long history of methods that have been used for dimensionality reduction for such reaction networks. Usually a deterministic mass action approach is used; however, in small volumes, there are significant fluctuations from the mean which the mass action approach cannot capture. In such cases stochastic simulation methods should be used. In this paper, we evaluate the applicability of one such dimensionality reduction method, the quasi-steady state approximation (QSSA) [L. Michaelis and M. L. Menten, "Die Kinetik der Invertinwirkung," Biochem. Z. 49, 333-369 (1913)], for dimensionality reduction in the case of stochastic dynamics. First, the applicability of the QSSA approach is evaluated for a canonical system of enzyme reactions. Application of the QSSA to such a reaction system in a deterministic setting leads to Michaelis-Menten reduced kinetics which can be used to derive the equilibrium concentrations of the reaction species. In the case of stochastic simulations, however, the steady state is characterized by fluctuations around the mean equilibrium concentration. Our analysis shows that a QSSA-based approach for dimensionality reduction captures well the mean of the distribution as obtained from a full dimensional simulation but fails to accurately capture the distribution around that mean. Moreover, the QSSA approximation is not unique. We have then extended the analysis to a simple bistable biochemical network model proposed to account for the stability of synaptic efficacies, the substrate of learning and memory [J. E. Lisman, "A mechanism of memory storage insensitive to molecular turnover: A bistable autophosphorylating kinase," Proc. Natl. Acad. Sci. U.S.A. 82, 3055-3057 (1985)], 10.1073/pnas.82.9.3055. Our analysis shows that a QSSA-based dimensionality reduction method results in errors as big as two orders of magnitude in predicting the residence times in the two stable states.
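A deterministic illustration of what the QSSA does to the canonical enzyme system: the full mass-action ODEs are compared with the Michaelis-Menten reduction. The rate constants below are invented, and the stochastic (Gillespie) side of the analysis is not reproduced.

```python
import numpy as np
from scipy.integrate import solve_ivp

k1, km1, k2 = 1.0, 1.0, 0.1        # illustrative rate constants
E0, S0 = 1.0, 10.0                 # total enzyme and initial substrate

def full(t, y):
    S, C = y                        # substrate, enzyme-substrate complex
    E = E0 - C
    return [-k1 * E * S + km1 * C, k1 * E * S - (km1 + k2) * C]

def qssa(t, y):
    S = y[0]
    Km = (km1 + k2) / k1
    return [-k2 * E0 * S / (Km + S)]   # Michaelis-Menten reduced rate

t_eval = np.linspace(0, 200, 400)
sol_full = solve_ivp(full, (0, 200), [S0, 0.0], t_eval=t_eval)
sol_qssa = solve_ivp(qssa, (0, 200), [S0], t_eval=t_eval)
print(abs(sol_full.y[0] - sol_qssa.y[0]).max())   # discrepancy introduced by the reduction
```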
Yusof, Maryati Mohd; Khodambashi, Soudabeh; Mokhtar, Ariffin Marzuki
2012-12-21
There are numerous applications for Health Information Systems (HIS) that support specific tasks in the clinical workflow. The Lean method has been used increasingly to optimize clinical workflows, by removing waste and shortening the delivery cycle time. There are a limited number of studies on Lean applications related to HIS. Therefore, we applied the Lean method to evaluate the clinical processes related to HIS, in order to evaluate its efficiency in removing waste and optimizing the process flow. This paper presents the evaluation findings of these clinical processes, with regards to a critical care information system (CCIS), known as IntelliVue Clinical Information Portfolio (ICIP), and recommends solutions to the problems that were identified during the study. We conducted a case study under actual clinical settings, to investigate how the Lean method can be used to improve the clinical process. We used observations, interviews, and document analysis, to achieve our stated goal. We also applied two tools from the Lean methodology, namely the Value Stream Mapping and the A3 problem-solving tools. We used eVSM software to plot the Value Stream Map and A3 reports. We identified a number of problems related to inefficiency and waste in the clinical process, and proposed an improved process model. The case study findings show that the Value Stream Mapping and the A3 reports can be used as tools to identify waste and integrate the process steps more efficiently. We also proposed a standardized and improved clinical process model and suggested an integrated information system that combines database and software applications to reduce waste and data redundancy.
3DHZETRN: Inhomogeneous Geometry Issues
NASA Technical Reports Server (NTRS)
Wilson, John W.; Slaba, Tony C.; Badavi, Francis F.
2017-01-01
Historical methods for assessing radiation exposure inside complicated geometries for space applications were limited by computational constraints and lack of knowledge associated with nuclear processes occurring over a broad range of particles and energies. Various methods were developed and utilized to simplify geometric representations and enable coupling with simplified but efficient particle transport codes. Recent transport code development efforts, leading to 3DHZETRN, now enable such approximate methods to be carefully assessed to determine if past exposure analyses and validation efforts based on those approximate methods need to be revisited. In this work, historical methods of representing inhomogeneous spacecraft geometry for radiation protection analysis are first reviewed. Two inhomogeneous geometry cases, previously studied with 3DHZETRN and Monte Carlo codes, are considered with various levels of geometric approximation. Fluence, dose, and dose equivalent values are computed in all cases and compared. It is found that although these historical geometry approximations can induce large errors in neutron fluences up to 100 MeV, errors on dose and dose equivalent are modest (<10%) for the cases studied here.
Design consideration of resonance inverters with electro-technological application
NASA Astrophysics Data System (ADS)
Hinov, Nikolay
2017-12-01
This study presents design considerations for resonance inverters with electro-technological applications. The presented methodology results from the author's investigations and analyses of different types and operating regimes of resonance inverters. Schemes of resonant inverters without inverse diodes are considered. The first harmonic method is used in the analysis and design; for inverters with electro-technological applications this method gives very good accuracy and does not require a complex and heavy mathematical apparatus. The proposed methodology is easy to use and is suitable for training students in power electronics. The validity of the results is confirmed by simulation and by work on physical prototypes.
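A hedged sketch of the first harmonic method for a series resonant tank driven by a square-wave inverter voltage: the drive is replaced by its fundamental component (4*Vdc/pi) and the tank is analysed as a linear AC circuit at the switching frequency. Component values are illustrative.

```python
import numpy as np

def first_harmonic_current(Vdc, f_sw, R, L, C):
    """Fundamental-harmonic approximation of a series RLC tank driven by a
    square wave of amplitude Vdc at the switching frequency f_sw."""
    V1 = 4.0 * Vdc / np.pi                       # peak of the fundamental component
    w = 2.0 * np.pi * f_sw
    Z = complex(R, w * L - 1.0 / (w * C))        # series tank impedance
    return V1 / abs(Z), np.angle(Z)              # peak fundamental current, phase

# Illustrative values near resonance (f0 is about 50 kHz for L=100 uH, C=100 nF)
print(first_harmonic_current(Vdc=300.0, f_sw=52e3, R=2.0, L=100e-6, C=100e-9))
```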
Ma, Ruijie; Lin, Xianming
2015-12-01
Problem-based teaching (PBT) has become a main approach to training in universities around the world. Combined with the team-oriented learning method, PBT can become a suitable method for education in medical universities. In this paper, based on common questions in teaching Jingluo Shuxue Xue (Science of Meridian and Acupoint), the concepts and characteristics of PBT and the team-oriented learning method are analyzed. The implementation steps of PBT were set up with reference to the team-oriented learning method. By quoting the original text of Beiji Qianjin Yaofang (Essential recipes for emergent use worth a thousand gold), a case analysis of "the thirteen devil points" was established with PBT.
Pan, Xiaoyu; Zhang, Chunlei; Li, Xuchao; Chen, Shengpei; Ge, Huijuan; Zhang, Yanyan; Chen, Fang; Jiang, Hui; Jiang, Fuman; Zhang, Hongyun; Wang, Wei; Zhang, Xiuqing
2014-12-01
To develop a fetal sex determination method based on maternal plasma sequencing (MPS), and assess its performance and potential use in X-linked disorder counseling. 900 cases of MPS data from a previous study were reviewed, in which 100 and 800 cases were used as the training and validation sets, respectively. The percentage of uniquely mapped sequencing reads on the Y chromosome was calculated and used to classify male and female cases. Eight pregnant women who are carriers of Duchenne muscular dystrophy (DMD) mutations were recruited, whose plasma was subjected to multiplex sequencing and fetal sex determination analysis. In the training set, a sensitivity of 96% and a false positive rate of 0% for male case detection were reached with our method. The blinded validation results showed that 421 of 423 male cases and 374 of 377 female cases were successfully identified, revealing a sensitivity and specificity of 99.53% and 99.20% for fetal sex determination, at as early as 12 gestational weeks. Fetal sex for all eight DMD genetic counseling cases was correctly identified and confirmed by amniocentesis. Based on MPS, high accuracy of non-invasive fetal sex determination can be achieved. This method can potentially be used for prenatal genetic counseling.
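A hedged sketch of the classification step described above: compute the fraction of uniquely mapped reads on chromosome Y and compare it with a cutoff learned from training cases. The read counts, training fractions, and cutoff rule below are all invented for illustration.

```python
import numpy as np

def chr_y_fraction(read_counts_by_chrom):
    """Fraction of uniquely mapped reads assigned to chromosome Y."""
    total = sum(read_counts_by_chrom.values())
    return read_counts_by_chrom.get("chrY", 0) / total

def fit_cutoff(fractions_male, fractions_female):
    """Midpoint between the two training distributions; purely illustrative,
    not the cutoff used in the study."""
    return 0.5 * (np.min(fractions_male) + np.max(fractions_female))

# Invented training fractions (per-case chrY read fraction) and one new case
train_male = np.array([4.0e-4, 3.6e-4, 4.2e-4])
train_female = np.array([0.4e-4, 0.5e-4, 0.6e-4])
cutoff = fit_cutoff(train_male, train_female)

new_case = {"chr1": 900_000, "chr2": 850_000, "chrY": 700}   # read counts, invented
print("male" if chr_y_fraction(new_case) > cutoff else "female")
```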
NASA Technical Reports Server (NTRS)
Strganac, T. W.; Mook, D. T.
1986-01-01
A means of numerically simulating flutter is established by implementing a predictor-corrector algorithm to solve the equations of motion. Aerodynamic loads are provided by the unsteady vortex lattice method (UVLM). The method is illustrated by obtaining stable and unstable responses to initial disturbances in the case of two-degree-of-freedom motion. It was found that for some angles of attack and dynamic pressures the initial disturbance decays; for others it grows (flutter). When flutter occurs, the solution yields the amplitude and period of the resulting limit cycle. The preliminary results attest to the feasibility of this method for studying flutter in cases that would be difficult to treat using a classical approach.
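A hedged sketch of a predictor-corrector time step (Adams-Bashforth predictor, trapezoidal Adams-Moulton corrector) applied to a generic two-degree-of-freedom linear system; the UVLM aerodynamic loads are replaced by invented stiffness and damping matrices, so this is not the paper's aeroelastic model.

```python
import numpy as np

# Generic 2-DOF system: M q'' + C q' + K q = 0, written as y' = f(y); matrices invented
M = np.diag([1.0, 0.5])
C = np.array([[0.02, 0.0], [0.0, 0.01]])
K = np.array([[4.0, -0.5], [-0.5, 9.0]])
Minv = np.linalg.inv(M)

def f(y):
    q, qd = y[:2], y[2:]
    qdd = Minv @ (-C @ qd - K @ q)
    return np.concatenate([qd, qdd])

def ab2_am2_step(y_prev, y_curr, h):
    """Adams-Bashforth 2 predictor followed by a trapezoidal Adams-Moulton corrector."""
    f_prev, f_curr = f(y_prev), f(y_curr)
    y_pred = y_curr + h * (1.5 * f_curr - 0.5 * f_prev)      # predictor
    return y_curr + 0.5 * h * (f_curr + f(y_pred))           # corrector

h, n = 0.01, 2000
y = [np.array([0.01, 0.0, 0.0, 0.0])]      # small initial disturbance
y.append(y[0] + h * f(y[0]))                # Euler start-up step
for _ in range(n):
    y.append(ab2_am2_step(y[-2], y[-1], h))
print(np.max(np.abs(np.array(y)[:, 0])))    # peak amplitude of the first coordinate
```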
Marginalized zero-inflated Poisson models with missing covariates.
Benecha, Habtamu K; Preisser, John S; Divaris, Kimon; Herring, Amy H; Das, Kalyan
2018-05-11
Unlike zero-inflated Poisson regression, marginalized zero-inflated Poisson (MZIP) models for counts with excess zeros provide estimates with direct interpretations for the overall effects of covariates on the marginal mean. In the presence of missing covariates, MZIP and many other count data models are ordinarily fitted using complete case analysis methods due to lack of appropriate statistical methods and software. This article presents an estimation method for MZIP models with missing covariates. The method, which is applicable to other missing data problems, is illustrated and compared with complete case analysis by using simulations and dental data on the caries preventive effects of a school-based fluoride mouthrinse program. © 2018 WILEY-VCH Verlag GmbH & Co. KGaA, Weinheim.
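For context, a hedged sketch of fitting an ordinary zero-inflated Poisson by maximum likelihood under complete-case analysis; this is the plain ZIP with a constant inflation probability, not the marginalized (MZIP) model or the missing-covariate estimator of the paper. The data are simulated.

```python
import numpy as np
from scipy.optimize import minimize
from scipy.special import gammaln, expit

def zip_negloglik(params, X, y):
    """Zero-inflated Poisson with a log link for the Poisson mean and a
    constant zero-inflation probability (logit-scale parameter)."""
    beta, gamma0 = params[:-1], params[-1]
    mu = np.exp(X @ beta)
    pi = expit(gamma0)
    log_p0 = np.log(pi + (1 - pi) * np.exp(-mu))
    log_ppos = np.log(1 - pi) - mu + y * np.log(mu) - gammaln(y + 1)
    return -np.sum(np.where(y == 0, log_p0, log_ppos))

rng = np.random.default_rng(0)
n = 500
X = np.column_stack([np.ones(n), rng.normal(size=n)])
true_mu = np.exp(X @ np.array([0.5, 0.8]))
y = np.where(rng.random(n) < 0.3, 0, rng.poisson(true_mu))   # 30% structural zeros

res = minimize(zip_negloglik, x0=np.zeros(3), args=(X, y), method="BFGS")
print(res.x)    # roughly [0.5, 0.8, logit(0.3)]
```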
A STRICTLY CONTRACTIVE PEACEMAN–RACHFORD SPLITTING METHOD FOR CONVEX PROGRAMMING
BINGSHENG, HE; LIU, HAN; WANG, ZHAORAN; YUAN, XIAOMING
2014-01-01
In this paper, we focus on the application of the Peaceman–Rachford splitting method (PRSM) to a convex minimization model with linear constraints and a separable objective function. Compared to the Douglas–Rachford splitting method (DRSM), another splitting method from which the alternating direction method of multipliers originates, PRSM requires more restrictive assumptions to ensure its convergence, while it is always faster whenever it is convergent. We first illustrate that the reason for this difference is that the iterative sequence generated by DRSM is strictly contractive, while that generated by PRSM is only contractive with respect to the solution set of the model. With only the convexity assumption on the objective function of the model under consideration, the convergence of PRSM is not guaranteed. But for this case, we show that the first t iterations of PRSM still enable us to find an approximate solution with an accuracy of O(1/t). A worst-case O(1/t) convergence rate of PRSM in the ergodic sense is thus established under mild assumptions. After that, we suggest attaching an underdetermined relaxation factor with PRSM to guarantee the strict contraction of its iterative sequence and thus propose a strictly contractive PRSM. A worst-case O(1/t) convergence rate of this strictly contractive PRSM in a nonergodic sense is established. We show the numerical efficiency of the strictly contractive PRSM by some applications in statistical learning and image processing. PMID:25620862
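A toy implementation of the strictly contractive PRSM iteration, with both dual updates damped by a relaxation factor alpha in (0,1), applied to a small l1-regularized denoising model; the problem and parameter values are illustrative, not the paper's experiments.

```python
import numpy as np

def soft_threshold(v, t):
    return np.sign(v) * np.maximum(np.abs(v) - t, 0.0)

def strictly_contractive_prsm(a, gamma=0.5, beta=1.0, alpha=0.8, iters=200):
    """PRSM with two dual updates per iteration, each damped by alpha in (0,1),
    applied to: min 0.5*||x - a||^2 + gamma*||z||_1  subject to  x - z = 0."""
    x = z = lam = np.zeros_like(a)
    for _ in range(iters):
        # x-step: argmin 0.5||x - a||^2 + (beta/2)||x - z + lam/beta||^2
        x = (a + beta * z - lam) / (1.0 + beta)
        lam = lam + alpha * beta * (x - z)            # first (damped) dual update
        # z-step: argmin gamma||z||_1 + (beta/2)||x - z + lam/beta||^2
        z = soft_threshold(x + lam / beta, gamma / beta)
        lam = lam + alpha * beta * (x - z)            # second (damped) dual update
    return x, z

a = np.array([2.0, 0.3, -1.5, 0.1])
x, z = strictly_contractive_prsm(a)
print(z)     # approaches the soft-thresholded solution [1.5, 0, -1.0, 0]
```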
Li, Chunguang; Chen, Luonan; Aihara, Kazuyuki
2008-06-01
Real systems are often subject to both noise perturbations and impulsive effects. In this paper, we study the stability and stabilization of systems with both noise perturbations and impulsive effects. In other words, we generalize the impulsive control theory from the deterministic case to the stochastic case. The method is based on extending the comparison method to the stochastic case. The method presented in this paper is general and easy to apply. Theoretical results on both stability in the pth mean and stability with disturbance attenuation are derived. To show the effectiveness of the basic theory, we apply it to the impulsive control and synchronization of chaotic systems with noise perturbations, and to the stability of impulsive stochastic neural networks. Several numerical examples are also presented to verify the theoretical results.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Strack, K.M.; Vozoff, K.
The applications of electromagnetics have increased in the past two decades because of an improved understanding of the methods, improved service availability, and the increased focus of exploration on more complex reservoir characterization issues. For electromagnetic methods, surface applications for hydrocarbon exploration and production are still a special case, while applications in borehole and airborne research and for engineering and environmental objectives are routine. In the past, electromagnetic techniques, in particular deep transient electromagnetics, made up a completely different discipline in geophysics, although many of the principles are similar to seismic ones. With an understanding of the specific problems related first to data processing and then to acquisition, the inclusion of principles learned from seismics happened almost naturally. Initially, the data processing was very similar to seismic full-waveform processing. The hardware was also changed to include multichannel acquisition systems, and the field procedures became very similar to seismic surveying. As a consequence, the integration and synergism of the interpretation process is becoming almost automatic. The long-offset transient electromagnetic (LOTEM) technique is summarized from the viewpoint of its similarity to seismics, and the complete concept of the method is reviewed. An interpretation case history that integrates seismic and LOTEM data from a hydrocarbon area in China clearly demonstrates the limitations and benefits of the method.
Treatment of severe burn with DermACELL(®), an acellular dermal matrix.
Chen, Shyi-Gen; Tzeng, Yuan-Sheng; Wang, Chih-Hsin
2012-01-01
For treatment of skin burn injuries, there exist several methods of treatment related to tissue regeneration, including the use of autograft skin and cryopreserved skin. However, each method has drawbacks. An alternative method for tissue regeneration is allograft acellular dermal matrix, with potential as a biocompatible scaffold for new tissue growth. One recently produced material of this type is DermACELL(®), which was used in this case presentation for treating a scar resulting from second- and third-degree burns in a 33-year-old female patient. The patient presented with significant hypertrophic scarring from the elbow to the hand and with limited wrist and elbow motion. The scarring was removed, and the patient was treated with a 1:3 mesh of DermACELL. The wound was resurfaced with a split thickness skin graft, and postoperative care included application of pressure garment and silicone sheet, as well as range of motion exercise and massage. At 30 days after DermACELL application, the wound appeared well-healed with little scar formation. At 180 days post-application, the wound continued to appear healed well without significant scar formation. Additionally, the wound was supple, and the patient experienced significant improvement in range of motion. In the case presented, DermACELL appears to have been a successful method of treatment for scarring due to severe burns by preventing further scar formation and improving range of motion.
NASA Astrophysics Data System (ADS)
Pairan, M. Rasidi; Asmuin, Norzelawati; Isa, Nurasikin Mat; Sies, Farid
2017-04-01
Water mist sprays are used in a wide range of applications; however, the spray characteristics must suit the particular application. This project studies the water droplet velocity and penetration angle generated by a newly developed mist spray with a flat spray pattern. The research was conducted in two parts, experimental and simulation. The experimental part was conducted using the particle image velocimetry (PIV) method, ANSYS software was used for the simulation, and ImageJ software was used to measure the penetration angle. Three different combinations of air and water pressure were tested: 1 bar (case A), 2 bar (case B) and 3 bar (case C). The flat spray generated by the newly developed nozzle was examined along a 9 cm vertical line located 8 cm from the nozzle orifice. The detailed analysis shows that the trend of the velocity-versus-distance graph gives good agreement between simulation and experiment for all pressure combinations. As the water and air pressure increased from 1 bar to 2 bar, the velocity and penetration angle also increased; however, for case C, run at 3 bar, the water droplet velocity increased but the penetration angle decreased. All the data were then validated by calculating the error between experiment and simulation. Comparing the simulation data to the experimental data for all cases, the standard deviations for case A, case B and case C are relatively small: 5.444, 0.8242 and 6.4023, respectively.
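A hedged sketch of the basic PIV step: the displacement within one interrogation window is estimated from the peak of the FFT cross-correlation of two frames. The frames are synthetic, and this is not the commercial PIV processing used in the study.

```python
import numpy as np

def piv_displacement(win_a, win_b):
    """FFT cross-correlation of two interrogation windows; the correlation-peak
    offset relative to the window centre is the mean particle displacement in pixels."""
    a = win_a - win_a.mean()
    b = win_b - win_b.mean()
    corr = np.fft.fftshift(np.fft.ifft2(np.conj(np.fft.fft2(a)) * np.fft.fft2(b)).real)
    peak = np.unravel_index(np.argmax(corr), corr.shape)
    centre = np.array(corr.shape) // 2
    return np.array(peak) - centre

# Synthetic particle image shifted by (3, 5) pixels between frames
rng = np.random.default_rng(2)
frame_a = rng.random((64, 64))
frame_b = np.roll(frame_a, shift=(3, 5), axis=(0, 1))
print(piv_displacement(frame_a, frame_b))        # expected [3, 5]
```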
Implementing a Flipped Classroom: A Case Study of Biology Teaching in a Greek High School
ERIC Educational Resources Information Center
Gariou-Papalexiou, Angeliki; Papadakis, Spyros; Manousou, Evangelia; Georgiadu, Irene
2017-01-01
The purpose of this study was to investigate the application of the model of the "flipped classroom" as a complementary method to school distance education in junior high school Biology. The "flipped classroom" model attempts a different way of organizing the educational process according to which the traditional methods of…
ERIC Educational Resources Information Center
Robinson, JoAnn; Herot, Christine; Mantz-Simmons, Linda; Haynes, Phillip
2000-01-01
This article explores using the MacArthur Story Stem Battery to investigate the interior life of children, its potential usefulness in evaluating interventions geared to prevent dysfunctional parenting, and how the method has been adapted for use with low-income African American children. Case examples support the method's application. (Author/CR)
The Use of Service Blueprinting as a Method of Improving Non-Academic College Student Experiences
ERIC Educational Resources Information Center
Roberts, J. Will
2017-01-01
Service blueprinting offers a method for analyzing and improving service experiences. This service design technique is commonly used in many service industries, but research exploring its application within a higher education context is very limited. The purpose of this case study was to describe the use and effects of extensive service…
Historical note: Drumine--a new Australian local anaesthetic.
Bailey, R J
1977-02-01
An article in the Australasian Medical Gazette of October 1886 indicates the method of extraction, experimentation and therapeutic application of an active principle prepared from Euphorbia Drummondii. Further correspondence is noted, refining the method of extraction, reporting cases, answering criticisms, and eventually announcing drumine's commercial preparation. Despite enthusiastic support, the drug soon disappears from the therapeutic scene.
ERIC Educational Resources Information Center
Raikou, Natassa
2016-01-01
This article addresses an application performed in tertiary education--a department of pedagogical and educational sciences--of a contemporary method, Transformative Learning through Aesthetic Experience. The method is based on the use of art and aims to reinforce and promote the development of critical thinking within educational settings.…
A fourth-order box method for solving the boundary layer equations
NASA Technical Reports Server (NTRS)
Wornom, S. F.
1977-01-01
A fourth-order box method for calculating high-accuracy numerical solutions to parabolic partial differential equations in two variables or to ordinary differential equations is presented. The method is the natural extension of the second-order Keller box scheme to fourth order and is demonstrated with application to the incompressible, laminar and turbulent boundary layer equations. Numerical results for high-accuracy test cases show the method to be significantly faster than other higher-order and second-order methods.
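A hedged sketch of the underlying second-order Keller box idea (the paper's fourth-order extension is not reproduced here): the problem is written as a first-order system and all differences are centred on box midpoints, shown for a simple two-point boundary value problem with a known solution.

```python
import numpy as np

def keller_box_poisson(N=40):
    """Second-order Keller box scheme for u'' = g(x), u(0) = u(1) = 0, written as
    the first-order system u' = v, v' = g and centred on box midpoints."""
    x = np.linspace(0.0, 1.0, N + 1)
    h = x[1] - x[0]
    g = lambda s: -np.pi**2 * np.sin(np.pi * s)      # exact solution is sin(pi*x)

    # Unknown vector: [u_0..u_N, v_0..v_N]
    A = np.zeros((2 * (N + 1), 2 * (N + 1)))
    b = np.zeros(2 * (N + 1))
    for j in range(1, N + 1):
        r = 2 * (j - 1)
        # (u_j - u_{j-1})/h = (v_j + v_{j-1})/2
        A[r, j] = 1.0 / h; A[r, j - 1] = -1.0 / h
        A[r, N + 1 + j] = -0.5; A[r, N + 1 + j - 1] = -0.5
        # (v_j - v_{j-1})/h = g(x_{j-1/2})
        A[r + 1, N + 1 + j] = 1.0 / h; A[r + 1, N + 1 + j - 1] = -1.0 / h
        b[r + 1] = g(0.5 * (x[j] + x[j - 1]))
    A[-2, 0] = 1.0              # boundary condition u(0) = 0
    A[-1, N] = 1.0              # boundary condition u(1) = 0
    sol = np.linalg.solve(A, b)
    return x, sol[:N + 1]

x, u = keller_box_poisson()
print(np.max(np.abs(u - np.sin(np.pi * x))))   # O(h^2) error
```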
NASA Astrophysics Data System (ADS)
Zolfaghari, M.; Ghaderi, R.; Sheikhol Eslami, A.; Ranjbar, A.; Hosseinnia, S. H.; Momani, S.; Sadati, J.
2009-10-01
The enhanced homotopy perturbation method (EHPM) is applied for finding improved approximate solutions of the well-known Bagley-Torvik equation for three different cases. The main characteristic of the EHPM is using a stabilized linear part, which guarantees the stability and convergence of the overall solution. The results are finally compared with the Adams-Bashforth-Moulton numerical method, the Adomian decomposition method (ADM) and the fractional differential transform method (FDTM) to verify the performance of the EHPM.
ERIC Educational Resources Information Center
Blackbourn, J. M.; Fillingim, Jennifer G.; McCelland, Susan; Elrod, G. Franklin; Medley, Meagan B.; Kritsonis, Mary Alice; Ray, Jan
2008-01-01
This study examines the use of wireless laptop technology to support the application of problem-based learning (PBL) in a special education methods course. This field based course used a progressive disclosure process in weekly seminars to address issues posed in a case study. Eight scenarios, all related to the case, were presented to upper level…
ERIC Educational Resources Information Center
Loeb, Katharine L.; Hirsch, Alicia M.; Greif, Rebecca; Hildebrandt, Thomas B.
2009-01-01
This article describes the successful application of family-based treatment (FBT) for a 17-year-old identical twin presenting with a 4-month history of clinically significant symptoms of anorexia nervosa (AN). FBT is a manualized treatment that has been studied in randomized controlled trials for adolescents with AN. This case study illustrates…
A case study of exploiting enterprise resource planning requirements
NASA Astrophysics Data System (ADS)
Niu, Nan; Jin, Mingzhou; Cheng, Jing-Ru C.
2011-05-01
The requirements engineering (RE) processes have become a key to conceptualising corporate-wide integrated solutions based on packaged enterprise resource planning (ERP) software. The RE literature has mainly focused on procuring the most suitable ERP package. Little is known about how an organisation exploits the chosen ERP RE model to frame the business application development. This article reports an exploratory case study of a key tenet of ERP RE adoption, namely that aligning business applications to the packaged RE model leads to integral practices and economic development. The case study analysed a series of interrelated pilot projects developed for a business division of a large IT manufacturing and service company, using Oracle's application implementation method (AIM). The study indicated that AIM RE improved team collaboration and project management experience, but needed to make hidden assumptions explicit to support data visibility and integrity. Our study can direct researchers towards rigorous empirical evaluations of ERP RE adoption, collect experiences and lessons learned for practitioners, and help generate more effective and mature processes when exploiting ERP RE methods.
Novel application of three-dimensional technologies in a case of dismemberment.
Baier, Waltraud; Norman, Danielle G; Warnett, Jason M; Payne, Mark; Harrison, Nigel P; Hunt, Nicholas C A; Burnett, Brian A; Williams, Mark A
2017-01-01
This case study reports the novel application of three-dimensional technologies such as micro-CT and 3D printing to the forensic investigation of a complex case of dismemberment. Micro-CT was successfully employed to virtually align severed skeletal elements found in different locations, analyse tool marks created during the dismemberment process, and virtually dissect a charred piece of evidence. High resolution 3D prints of the burnt human bone contained within were created for physical visualisation to assist the investigation team. Micro-CT as a forensic radiological method provided vital information and the basis for visualisation both during the investigation and in the subsequent trial making it one of the first examples of such technology in a UK court. Copyright © 2016. Published by Elsevier B.V.
NASA Astrophysics Data System (ADS)
Cheng, T.; Zhou, X.; Jia, Y.; Yang, G.; Bai, J.
2018-04-01
In the project of China's First National Geographic Conditions Census, millions of sample data have been collected all over the country for interpreting land cover from remote sensing images; the number of data files exceeds 12,000,000 and has continued to grow in the follow-on project of National Geographic Conditions Monitoring. Storing such big data in a database such as Oracle is currently the most effective approach, but a suitable method for managing and applying the sample data is even more important. This paper studies a database construction method based on a relational database combined with a distributed file system, in which the vector data and the file data are stored in different physical locations; the key issues and their solutions are discussed. On this basis, the application of the sample data is studied and several kinds of use cases are analysed, laying the foundation for the data's application. In particular, sample data located in Shaanxi province are selected to verify the method. Taking the 10 first-level classes defined in the land cover classification system as an example, the spatial distribution and density characteristics of all kinds of sample data are analysed. The results verify that the database construction method based on a relational database with a distributed file system is useful and applicable for searching, analysing and further applying the sample data. Furthermore, sample data collected in the project of China's First National Geographic Conditions Census could be useful for earth observation and land cover quality assessment.
Comparison of performance of inclinometer casing and TDR technique
NASA Astrophysics Data System (ADS)
Aghda, S. M. Fatemi; Ganjalipour, K.; Nabiollahi, K.
2018-03-01
TDR (Time Domain Reflectometry) and GPR (Ground Penetrating Radar) are two electromagnetic methods in applied geophysics whose range of applications continues to grow. Time Domain Reflectometry is a remote sensing method that has been used for years to determine the nature and spatial location of materials. Since its development, the TDR system has led to innovative applications and invites comparison with earlier measuring techniques. In this study, a summary of the basics of TDR application for monitoring ground deformation is offered, together with a comparison of this technology with another measurement technique (inclinometer casing). The paper presents a case study in which the opportunity arose to compare these two technologies in detecting subsurface deformation in slopes. A TDR system includes a radar wave generator and receiver, a transmission line and a waveguide. The generated electromagnetic pulse travels along the conductor cable toward the waveguide and enters the test environment. For this study, slopes overlooking the Darian dam bottom outlet, power house and spillway were instrumented with RG59/U coaxial cables for TDR monitoring and with slope inclinometers. The coaxial cables, serving as TDR sensors, and the inclinometer casings were installed in the same boreholes, with the coaxial cable attached to the inclinometer casing. Shear and tensile deformation of the cable caused by ground movement significantly changes the cable's reflection coefficient. In the Darian dam boreholes, the cable points subjected to shear and stretching correlated with the deformation points of the inclinometer casings in the incremental displacement graphs. This study shows that the TDR technique is more sensitive than inclinometer casing to small movements on the slide planes. Because manual processing of TDR data is laborious and requires experienced personnel, the authors designed an algorithm that compares the shape of each new TDR waveform with the baseline waveform in order to monitor subsurface deformation.
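The waveform-comparison algorithm itself is not given in the abstract; as a rough, hypothetical illustration of the general idea of baseline comparison (not the authors' implementation), the Python sketch below flags positions along the cable where a new TDR reflection trace departs from the baseline trace by more than a threshold.

    import numpy as np

    def flag_deformation(baseline, new_trace, threshold=0.02):
        """Return sample indices where the new TDR trace departs from the
        baseline reflection-coefficient trace by more than `threshold`.
        Both inputs are 1-D arrays sampled at the same cable positions."""
        diff = np.asarray(new_trace, float) - np.asarray(baseline, float)
        return np.where(np.abs(diff) > threshold)[0]

    # Indices returned correspond to cable positions (hence depths) where
    # shear or tensile deformation of the cable may have occurred.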
NASA Technical Reports Server (NTRS)
Oswald, J. E.; Siegel, P. H.
1994-01-01
The finite difference time domain (FDTD) method is applied to the analysis of microwave, millimeter-wave and submillimeter-wave filter circuits. In each case, the validity of this method is confirmed by comparison with measured data. In addition, the FDTD calculations are used to design a new ultra-thin coplanar-strip filter for feeding a THz planar-antenna mixer.
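The paper's FDTD formulation is not reproduced in the abstract; purely as a reminder of the kind of update an FDTD code performs, here is a minimal one-dimensional Yee-scheme sketch in Python (free space, normalized units, hypothetical grid size). Actual filter simulations are three-dimensional and include material, port and source models.

    import numpy as np

    nz, nsteps = 200, 500
    ez = np.zeros(nz)          # electric field samples
    hy = np.zeros(nz - 1)      # magnetic field, staggered half a cell
    c = 0.5                    # Courant number (c0*dt/dz), <= 1 for stability

    for n in range(nsteps):
        hy += c * (ez[1:] - ez[:-1])            # update H from the curl of E
        ez[1:-1] += c * (hy[1:] - hy[:-1])      # update E from the curl of H
        ez[nz // 2] += np.exp(-((n - 30) / 10.0) ** 2)   # soft Gaussian source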
High-quality slab-based intermixing method for fusion rendering of multiple medical objects.
Kim, Dong-Joon; Kim, Bohyoung; Lee, Jeongjin; Shin, Juneseuk; Kim, Kyoung Won; Shin, Yeong-Gil
2016-01-01
The visualization of multiple 3D objects has been increasingly required for recent applications in medical fields. Due to the heterogeneity in data representation or data configuration, it is difficult to efficiently render multiple medical objects in high quality. In this paper, we present a novel intermixing scheme for fusion rendering of multiple medical objects while preserving real-time performance. First, we present an in-slab visibility interpolation method for the representation of subdivided slabs. Second, we introduce virtual zSlab, which extends an infinitely thin boundary (such as polygonal objects) into a slab with a finite thickness. Finally, based on virtual zSlab and in-slab visibility interpolation, we propose a slab-based visibility intermixing method with a newly proposed rendering pipeline. Experimental results demonstrate that the proposed method delivers more effective multiple-object renderings, in terms of rendering quality, than conventional approaches. The proposed intermixing scheme also provides high-quality intermixing results for the visualization of intersecting and overlapping surfaces by resolving aliasing and z-fighting problems. Moreover, two case studies are presented that apply the proposed method to real clinical applications. These case studies demonstrate that the proposed method has the outstanding advantages of rendering independency and reusability. Copyright © 2015 Elsevier Ireland Ltd. All rights reserved.
Science in the Service of Religion: The Case of Islam.
ERIC Educational Resources Information Center
King, David A.
1990-01-01
Described is how scholars of medieval Islam used simple and adequate methods for regulating the calendar and prayer-times and for finding the sacred direction. Other applications of science to daily life are provided. (KR)
Geometrically derived difference formulae for the numerical integration of trajectory problems
NASA Technical Reports Server (NTRS)
Mcleod, R. J. Y.; Sanz-Serna, J. M.
1982-01-01
An initial value problem for the autonomous system of ordinary differential equations dy/dt = f(y), where y is a vector, is considered. In a number of practical applications the interest lies in obtaining the curve traced by the solution y. These applications include the computation of trajectories in mechanical problems. The term 'trajectory problem' is employed to refer to these cases. Lambert and McLeod (1979) have introduced a method involving local rotation of the axes in the y-plane for the two-dimensional case. The present investigation continues the study of difference schemes specifically derived for trajectory problems. A simple geometrical way of constructing such methods is presented, and the local accuracy of the schemes is investigated. A circularly exact, fixed-step predictor-corrector algorithm is defined, and a variable-step version of a circularly exact algorithm is presented.
Rapid, Contactless and Non-Destructive Testing of Chemical Composition of Samples
NASA Astrophysics Data System (ADS)
Ivanov, O.; Vaseashta, A.; Stoychev, L.
Our results demonstrate that a new effect can be induced in each solid over a wide spectral range of electromagnetic irradiation. In the present manuscript we prove experimentally that one of the possible applications of this effect is rapid, contactless control of the chemical composition of a series of samples, in this case coins. The method has wide applicability ranging from defense and homeland security to several applications requiring rapid and nondestructive identification of chemical composition.
A generalized sound extrapolation method for turbulent flows
NASA Astrophysics Data System (ADS)
Zhong, Siyang; Zhang, Xin
2018-02-01
Sound extrapolation methods are often used to compute acoustic far-field directivities using near-field flow data in aeroacoustics applications. The results may be erroneous if the volume integrals are neglected (to save computational cost) while non-acoustic fluctuations are collected on the integration surfaces. In this work, we develop a new sound extrapolation method based on an acoustic analogy using Taylor's hypothesis (Taylor 1938 Proc. R. Soc. Lon. A 164, 476-490. (doi:10.1098/rspa.1938.0032)). Typically, a convection operator is used to filter out the acoustically inefficient components in the turbulent flows, and an acoustics-dominant indirect variable D_c p' is solved. The sound pressure p' at the far field is computed from D_c p' based on the asymptotic properties of the Green's function. Validation results for benchmark problems with well-defined sources match well with the exact solutions. For aeroacoustics applications: the sound predictions for aerofoil-gust interaction are close to those of an earlier method specially developed to remove the effect of vortical fluctuations (Zhong & Zhang 2017 J. Fluid Mech. 820, 424-450. (doi:10.1017/jfm.2017.219)); for the case of vortex shedding noise from a cylinder, the off-body predictions by the proposed method match well with the on-body Ffowcs-Williams and Hawkings result; different integration surfaces yield close predictions (of both spectra and far-field directivities) for a co-flowing jet case using an established direct numerical simulation database. The results suggest that the method may be a potential candidate for sound projection in aeroacoustics applications.
A fast, parallel algorithm for distant-dependent calculation of crystal properties
NASA Astrophysics Data System (ADS)
Stein, Matthew
2017-12-01
A fast, parallel algorithm for distant-dependent calculation and simulation of crystal properties is presented along with speedup results and methods of application. An illustrative example is used to compute the Lennard-Jones lattice constants up to 32 significant figures for 4 ≤ p ≤ 30 in the simple cubic, face-centered cubic, body-centered cubic, hexagonal-close-pack, and diamond lattices. In most cases, the known precision of these constants is more than doubled, and in some cases, corrected from previously published figures. The tools and strategies to make this computation possible are detailed along with application to other potentials, including those that model defects.
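The parallel algorithm itself is not described in the abstract; as a naive, low-precision illustration of the quantity involved (not the authors' method), the Python sketch below evaluates a truncated simple-cubic lattice sum of r^(-p) over lattice points within a cutoff, which is the kind of distance-dependent sum from which Lennard-Jones lattice constants are built.

    import itertools

    def sc_lattice_sum(p, rmax=20):
        """Truncated simple-cubic lattice sum of (i^2+j^2+k^2)^(-p/2) over all
        nonzero lattice points with |r| <= rmax; converges only for p > 3."""
        total = 0.0
        for i, j, k in itertools.product(range(-rmax, rmax + 1), repeat=3):
            r2 = i * i + j * j + k * k
            if r2 == 0 or r2 > rmax * rmax:
                continue
            total += r2 ** (-p / 2.0)
        return total

    # e.g. sc_lattice_sum(6) roughly approximates the p = 6 simple-cubic constant;
    # high-precision values require the acceleration techniques of the paper.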
NASA Technical Reports Server (NTRS)
Torres-Pomales, Wilfredo
2015-01-01
This report documents a case study on the application of Reliability Engineering techniques to achieve an optimal balance between performance and robustness by tuning the functional parameters of a complex non-linear control system. For complex systems with intricate and non-linear patterns of interaction between system components, analytical derivation of a mathematical model of system performance and robustness in terms of functional parameters may not be feasible or cost-effective. The demonstrated approach is simple, structured, effective, repeatable, and cost and time efficient. This general approach is suitable for a wide range of systems.
[Application of DNA labeling technology in forensic botany].
Znang, Xian; Li, Jing-Lin; Zhang, Xiang-Yu
2008-12-01
Forensic botany is a study of judicial plant evidence. Recently, researches on DNA labeling technology have been a mainstream of forensic botany. The article systematically reviews various types of DNA labeling techniques in forensic botany with enumerated practical cases, as well as the potential forensic application of each individual technique. The advantages of the DNA labeling technology over traditional morphological taxonomic methods are also summarized.
Robust parallel iterative solvers for linear and least-squares problems, Final Technical Report
DOE Office of Scientific and Technical Information (OSTI.GOV)
Saad, Yousef
2014-01-16
The primary goal of this project is to study and develop robust iterative methods for solving linear systems of equations and least squares systems. The focus of the Minnesota team is on algorithms development, robustness issues, and on tests and validation of the methods on realistic problems. 1. The project began with an investigation of how to practically update a preconditioner obtained from an ILU-type factorization when the coefficient matrix changes. 2. We investigated strategies to improve robustness in parallel preconditioners in a specific case of a PDE with discontinuous coefficients. 3. We explored ways to adapt standard preconditioners for solving linear systems arising from the Helmholtz equation. These are often difficult linear systems to solve by iterative methods. 4. We have also worked on purely theoretical issues related to the analysis of Krylov subspace methods for linear systems. 5. We developed an effective strategy for performing ILU factorizations for the case when the matrix is highly indefinite. The strategy uses shifting in some optimal way. The method was extended to the solution of Helmholtz equations by using complex shifts, yielding very good results in many cases. 6. We addressed the difficult problem of preconditioning sparse systems of equations on GPUs. 7. A by-product of the above work is a software package consisting of an iterative solver library for GPUs based on CUDA. This was made publicly available. It was the first such library that offers complete iterative solvers for GPUs. 8. We considered another form of ILU which blends coarsening techniques from multigrid with algebraic multilevel methods. 9. We have released a new version of our parallel solver, called pARMS [new version is version 3]. As part of this we have tested the code in complex settings, including the solution of Maxwell and Helmholtz equations and a problem of crystal growth. 10. As an application of polynomial preconditioning we considered the problem of evaluating f(A)v which arises in statistical sampling. 11. As an application of the methods we developed, we tackled the problem of computing the diagonal of the inverse of a matrix. This arises in statistical applications as well as in many applications in physics. We explored probing methods as well as domain-decomposition type methods. 12. A collaboration with researchers from Toulouse, France, considered the important problem of computing the Schur complement in a domain-decomposition approach. 13. We explored new ways of preconditioning linear systems, based on low-rank approximations.
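As a generic illustration of ILU-preconditioned Krylov iteration (standard SciPy calls, not the pARMS library or the GPU solvers described above), the Python sketch below preconditions GMRES with an incomplete LU factorization on a small stand-in sparse system.

    import numpy as np
    import scipy.sparse as sp
    import scipy.sparse.linalg as spla

    # Small 1-D Poisson-like sparse system used purely as a stand-in.
    n = 100
    A = sp.diags([-1, 2, -1], [-1, 0, 1], shape=(n, n), format="csc")
    b = np.ones(n)

    ilu = spla.spilu(A, drop_tol=1e-4)                 # incomplete LU factorization
    M = spla.LinearOperator((n, n), matvec=ilu.solve)  # apply the ILU solve as a preconditioner

    x, info = spla.gmres(A, b, M=M)                    # info == 0 signals convergence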
Evaluating business value of IT towards optimisation of the application portfolio
NASA Astrophysics Data System (ADS)
Sun, Lily; Liu, Kecheng; Indrayani Jambari, Dian; Michell, Vaughan
2016-05-01
Information technology has become heavily embedded in business operations. As business needs change over time, IT applications are expected to continue providing required support. Whether the existing IT applications are still fit for the business purpose they were intended or new IT applications should be introduced is a strategic decision for business, IT and business-aligned IT. In this article, we present a method that aims to analyse business functions and IT roles and to evaluate business-aligned IT from both social and technical perspectives. The method introduces a set of techniques that systematically supports the evaluation of the existing IT applications in relation to their technical capabilities for maximising business value. Furthermore, we discuss the evaluation process and results that are illustrated and validated through a real-life case study of a UK borough council and followed by discussion on implications for researchers and practitioners.
Digital enhancement of sub-quality bitemark photographs.
Karazalus, C P; Palmbach, T T; Lee, H C
2001-07-01
Digital enhancement software was used to enhance bitemark photographs. This enhancement technique improved the resolution of the bitemark images. Lucis was the software program utilized in this study and case applications. First, this technique was applied to known bitemark images to evaluate the potential effectiveness of this digital enhancement method. Subsequently, Lucis was utilized on two separate unsolved cases involving enhancement of bitemark evidence. One case involved a severely beaten infant with a bitemark on the upper thigh. The second case involved a bitemark observed on the breast of a female sexual assault strangulation victim. In both cases, bitemark images were significantly improved after digital enhancement.
NASA Astrophysics Data System (ADS)
Choi, Chu Hwan
2002-09-01
Ab initio chemistry has shown great promise in reproducing experimental results and in its predictive power. The many complicated computational models and methods seem impenetrable to an inexperienced scientist, and the reliability of the results is not easily interpreted. The application of midbond orbitals is used to determine a general method for use in calculating weak intermolecular interactions, especially those involving electron-deficient systems. Using the criteria of consistency, flexibility, accuracy and efficiency we propose a supermolecular method of calculation using the full counterpoise (CP) method of Boys and Bernardi, coupled with Moller-Plesset (MP) perturbation theory as an efficient electron-correlative method. We also advocate the use of the highly efficient and reliable correlation-consistent polarized valence basis sets of Dunning. To these basis sets, we add a general set of midbond orbitals and demonstrate greatly enhanced efficiency in the calculation. The H2-H2 dimer is taken as a benchmark test case for our method, and details of the computation are elaborated. Our method reproduces with great accuracy the dissociation energies of other previous theoretical studies. The added efficiency of extending the basis sets with conventional means is compared with the performance of our midbond-extended basis sets. The improvement found with midbond functions is notably superior in every case tested. Finally, a novel application of midbond functions to the BH5 complex is presented. The system is an unusual van der Waals complex. The interaction potential curves are presented for several standard basis sets and midbond-enhanced basis sets, as well as for two popular, alternative correlation methods. We report that MP theory appears to be superior to coupled-cluster (CC) in speed, while it is more stable than B3LYP, a widely-used density functional theory (DFT). Application of our general method yields excellent results for the midbond basis sets. Again they prove superior to conventional extended basis sets. Based on these results, we recommend our general approach as a highly efficient, accurate method for calculating weakly interacting systems.
NASA Astrophysics Data System (ADS)
Mercier, Sylvain; Gratton, Serge; Tardieu, Nicolas; Vasseur, Xavier
2017-12-01
Many applications in structural mechanics require the numerical solution of sequences of linear systems typically issued from a finite element discretization of the governing equations on fine meshes. The method of Lagrange multipliers is often used to take into account mechanical constraints. The resulting matrices then exhibit a saddle point structure and the iterative solution of such preconditioned linear systems is considered as challenging. A popular strategy is then to combine preconditioning and deflation to yield an efficient method. We propose an alternative that is applicable to the general case and not only to matrices with a saddle point structure. In this approach, we consider to update an existing algebraic or application-based preconditioner, using specific available information exploiting the knowledge of an approximate invariant subspace or of matrix-vector products. The resulting preconditioner has the form of a limited memory quasi-Newton matrix and requires a small number of linearly independent vectors. Numerical experiments performed on three large-scale applications in elasticity highlight the relevance of the new approach. We show that the proposed method outperforms the deflation method when considering sequences of linear systems with varying matrices.
Forensic applications of ambient ionization mass spectrometry.
Ifa, Demian R; Jackson, Ayanna U; Paglia, Giuseppe; Cooks, R Graham
2009-08-01
This review highlights and critically assesses forensic applications in the developing field of ambient ionization mass spectrometry. Ambient ionization methods permit the ionization of samples outside the mass spectrometer in the ordinary atmosphere, with minimal sample preparation. Several ambient ionization methods have been created since 2004 and they utilize different mechanisms to create ions for mass-spectrometric analysis. Forensic applications of these techniques--to the analysis of toxic industrial compounds, chemical warfare agents, illicit drugs and formulations, explosives, foodstuff, inks, fingerprints, and skin--are reviewed. The minimal sample pretreatment needed is illustrated with examples of analysis from complex matrices (e.g., food) on various substrates (e.g., paper). The low limits of detection achieved by most of the ambient ionization methods for compounds of forensic interest readily offer qualitative confirmation of chemical identity; in some cases quantitative data are also available. The forensic applications of ambient ionization methods are a growing research field and there are still many types of applications which remain to be explored, particularly those involving on-site analysis. Aspects of ambient ionization currently undergoing rapid development include molecular imaging and increased detection specificity through simultaneous chemical reaction and ionization by addition of appropriate chemical reagents.
Merritt, Maria W.; Tediosi, Fabrizio
2015-01-01
It has been suggested that initiatives to eradicate specific communicable diseases need to be informed by eradication investment cases to assess the feasibility, costs, and consequences of eradication compared with elimination or control. A methodological challenge of eradication investment cases is how to account for the ethical importance of the benefits, burdens, and distributions thereof that are salient in people’s experiences of the diseases and related interventions but are not assessed in traditional approaches to health and economic evaluation. We have offered a method of ethical analysis grounded in theories of social justice. We have described the method and its philosophical rationale and illustrated its use in application to eradication investment cases for lymphatic filariasis and onchocerciasis, 2 neglected tropical diseases that are candidates for eradication. PMID:25713967
Rabani, Eran; Reichman, David R.; Krilov, Goran; Berne, Bruce J.
2002-01-01
We present a method based on augmenting an exact relation between a frequency-dependent diffusion constant and the imaginary time velocity autocorrelation function, combined with the maximum entropy numerical analytic continuation approach to study transport properties in quantum liquids. The method is applied to the case of liquid para-hydrogen at two thermodynamic state points: a liquid near the triple point and a high-temperature liquid. Good agreement for the self-diffusion constant and for the real-time velocity autocorrelation function is obtained in comparison to experimental measurements and other theoretical predictions. Improvement of the methodology and future applications are discussed. PMID:11830656
A Review of Spectral Methods for Variable Amplitude Fatigue Prediction and New Results
NASA Technical Reports Server (NTRS)
Larsen, Curtis E.; Irvine, Tom
2013-01-01
A comprehensive review of the available methods for estimating fatigue damage from variable amplitude loading is presented. The dependence of fatigue damage accumulation on power spectral density (psd) is investigated for random processes relevant to real structures such as in offshore or aerospace applications. Beginning with the Rayleigh (or narrow band) approximation, attempts at improved approximations or corrections to the Rayleigh approximation are examined by comparison to rainflow analysis of time histories simulated from psd functions representative of simple theoretical and real world applications. Spectral methods investigated include corrections by Wirsching and Light, Ortiz and Chen, the Dirlik formula, and the Single-Moment method, among other more recent proposed methods. Good agreement is obtained between the spectral methods and the time-domain rainflow identification for most cases, with some limitations. Guidelines are given for using the several spectral methods to increase confidence in the damage estimate.
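The individual spectral formulas are not reproduced in the abstract; as a hedged illustration of the simplest of them, the Python sketch below computes the Rayleigh (narrow-band) estimate of the fatigue damage rate from the spectral moments of a one-sided stress PSD, assuming an S-N curve of the form N*S^b = C (the variable names are illustrative, not the paper's).

    import numpy as np
    from math import gamma

    def narrow_band_damage_rate(f, psd, b, C):
        """Rayleigh (narrow-band) estimate of fatigue damage per unit time.
        f, psd : one-sided stress PSD (frequency in Hz, stress^2/Hz)
        b, C   : S-N curve parameters in N * S**b = C
        """
        m0 = np.trapz(psd, f)              # zeroth spectral moment (variance)
        m2 = np.trapz(psd * f**2, f)       # second spectral moment
        nu0 = np.sqrt(m2 / m0)             # expected rate of zero up-crossings
        return (nu0 / C) * (np.sqrt(2.0 * m0)) ** b * gamma(1.0 + b / 2.0)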
A class of hybrid finite element methods for electromagnetics: A review
NASA Technical Reports Server (NTRS)
Volakis, J. L.; Chatterjee, A.; Gong, J.
1993-01-01
Integral equation methods have generally been the workhorse for antenna and scattering computations. In the case of antennas, they continue to be the prominent computational approach, but for scattering applications the requirement for large-scale computations has turned researchers' attention to near neighbor methods such as the finite element method, which has low O(N) storage requirements and is readily adaptable in modeling complex geometrical features and material inhomogeneities. In this paper, we review three hybrid finite element methods for simulating composite scatterers, conformal microstrip antennas, and finite periodic arrays. Specifically, we discuss the finite element method and its application to electromagnetic problems when combined with the boundary integral, absorbing boundary conditions, and artificial absorbers for terminating the mesh. Particular attention is given to large-scale simulations, methods, and solvers for achieving low memory requirements and code performance on parallel computing architectures.
[A new methodological approach for leptospira persistence studies in case of mixed leptospirosis].
Samsonova, A P; Petrov, E M; Vyshivkina, N V; Anan'ina, Iu V
2003-01-01
A new methodical approach for Leptospira persistence studies in cases of mixed leptospirosis, based on the use of PCR test systems with different taxonomic specificity for the indication and identification of leptospires, was developed. Two PCR test systems (G and B) were used in experiments on BALB/c white mice to study patterns of the development of mixed infection caused by leptospires of serovars poi (genomospecies L. borgpeterseni) and grippotyphosa (genomospecies L. kirschneri). It was concluded that this method has good prospects for application in studies of the symbiotic relationships of leptospires both in vivo and in vitro.
Torrens, George Edward
2018-01-01
Summative content analysis was used to define methods and heuristics from each case study. The review process was in two parts: (1) a literature review to identify conventional research methods and (2) a summative content analysis of published case studies, based on the identified methods and heuristics, to suggest an order and priority of where and when they were used. Over 200 research and design methods and design heuristics were identified. From the review of the 20 case studies, 42 of these were identified as having been applied. The majority of methods and heuristics were applied in phase two, market choice. There appeared to be a disparity between the limited number of methods frequently used (under 10 within the 20 case studies) and the hundreds available. Implications for Rehabilitation The communication highlights a number of issues that have implications for those involved in assistive technology new product development: •The study defined over 200 well-established research and design methods and design heuristics that are available for use by those who specify and design assistive technology products, which provide a comprehensive reference list for practitioners in the field; •The review within the study suggests only a limited number of research and design methods are regularly used by industrial design focused assistive technology new product developers; and, •Debate is required among practitioners working in this field to reflect on how a wider range of potentially more effective methods and heuristics may be incorporated into daily working practice.
Yu, Yun; Degnan, James H.; Nakhleh, Luay
2012-01-01
Gene tree topologies have proven a powerful data source for various tasks, including species tree inference and species delimitation. Consequently, methods for computing probabilities of gene trees within species trees have been developed and widely used in probabilistic inference frameworks. All these methods assume an underlying multispecies coalescent model. However, when reticulate evolutionary events such as hybridization occur, these methods are inadequate, as they do not account for such events. Methods that account for both hybridization and deep coalescence in computing the probability of a gene tree topology currently exist for very limited cases. However, no such methods exist for general cases, owing primarily to the fact that it is currently unknown how to compute the probability of a gene tree topology within the branches of a phylogenetic network. Here we present a novel method for computing the probability of gene tree topologies on phylogenetic networks and demonstrate its application to the inference of hybridization in the presence of incomplete lineage sorting. We reanalyze a Saccharomyces species data set for which multiple analyses had converged on a species tree candidate. Using our method, though, we show that an evolutionary hypothesis involving hybridization in this group has better support than one of strict divergence. A similar reanalysis on a group of three Drosophila species shows that the data is consistent with hybridization. Further, using extensive simulation studies, we demonstrate the power of gene tree topologies at obtaining accurate estimates of branch lengths and hybridization probabilities of a given phylogenetic network. Finally, we discuss identifiability issues with detecting hybridization, particularly in cases that involve extinction or incomplete sampling of taxa. PMID:22536161
Schematic representation of case study research designs.
Rosenberg, John P; Yates, Patsy M
2007-11-01
The paper is a report of a study to demonstrate how the use of schematics can provide procedural clarity and promote rigour in the conduct of case study research. Case study research is a methodologically flexible approach to research design that focuses on a particular case - whether an individual, a collective or a phenomenon of interest. It is known as the 'study of the particular' for its thorough investigation of particular, real-life situations and is gaining increased attention in nursing and social research. However, the methodological flexibility it offers can leave the novice researcher uncertain of suitable procedural steps required to ensure methodological rigour. This article provides a real example of a case study research design that utilizes schematic representation drawn from a doctoral study of the integration of health promotion principles and practices into a palliative care organization. The issues discussed are: (1) the definition and application of case study research design; (2) the application of schematics in research; (3) the procedural steps and their contribution to the maintenance of rigour; and (4) the benefits and risks of schematics in case study research. The inclusion of visual representations of design with accompanying explanatory text is recommended in reporting case study research methods.
Pang, Shaoning; Ban, Tao; Kadobayashi, Youki; Kasabov, Nikola K
2012-04-01
To adapt linear discriminant analysis (LDA) to real-world applications, there is a pressing need to equip it with an incremental learning ability to integrate knowledge presented by one-pass data streams, a functionality to join multiple LDA models to make the knowledge sharing between independent learning agents more efficient, and a forgetting functionality to avoid reconstruction of the overall discriminant eigenspace caused by some irregular changes. To this end, we introduce two adaptive LDA learning methods: LDA merging and LDA splitting. These provide the benefits of ability of online learning with one-pass data streams, retained class separability identical to the batch learning method, high efficiency for knowledge sharing due to condensed knowledge representation by the eigenspace model, and more preferable time and storage costs than traditional approaches under common application conditions. These properties are validated by experiments on a benchmark face image data set. By a case study on the application of the proposed method to multiagent cooperative learning and system alternation of a face recognition system, we further clarified the adaptability of the proposed methods to complex dynamic learning tasks.
Niang, Oumar; Thioune, Abdoulaye; El Gueirea, Mouhamed Cheikh; Deléchelle, Eric; Lemoine, Jacques
2012-09-01
The major problem with the empirical mode decomposition (EMD) algorithm is its lack of a theoretical framework, which makes the approach difficult to characterize and evaluate. In this paper, we propose, in the 2-D case, the use of an alternative implementation to the algorithmic definition of the so-called "sifting process" used in the original Huang's EMD method. This approach, based on partial differential equations (PDEs), was presented by Niang in previous works, in 2005 and 2007, and relies on a nonlinear diffusion-based filtering process to solve the mean envelope estimation problem. In the 1-D case, the efficiency of the PDE-based method, compared to the original EMD algorithmic version, was also illustrated in a recent paper. Recently, several 2-D extensions of the EMD method have been proposed. Despite some effort, existing 2-D versions of EMD perform poorly and are very time consuming. In this paper, an extension of the PDE-based approach to the 2-D space is therefore described in detail. This approach has been applied to both signal and image decomposition. The obtained results confirm the usefulness of the new PDE-based sifting process for the decomposition of various kinds of data. Some results are provided for the case of image decomposition. The effectiveness of the approach encourages its use in a number of signal and image applications such as denoising, detrending, or texture analysis.
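The PDE-based formulation itself is not given in the abstract; for orientation only, the Python sketch below shows one step of the classical algorithmic sifting process that the PDE approach replaces, in 1-D (endpoint handling and stopping criteria omitted, and a signal with a reasonable number of extrema assumed).

    import numpy as np
    from scipy.signal import argrelextrema
    from scipy.interpolate import CubicSpline

    def sift_once(t, x):
        """One sifting step of classical 1-D EMD: subtract the mean of the
        upper and lower cubic-spline envelopes from the signal."""
        imax = argrelextrema(x, np.greater)[0]
        imin = argrelextrema(x, np.less)[0]
        upper = CubicSpline(t[imax], x[imax])(t)   # envelope through the maxima
        lower = CubicSpline(t[imin], x[imin])(t)   # envelope through the minima
        return x - 0.5 * (upper + lower)           # candidate intrinsic mode function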
Improved method of laser photovaporization for endometrium cases
NASA Astrophysics Data System (ADS)
Xia, En-ju; Lu, Hua; Chen, En-ling; Wan, Hong-yue
1995-03-01
Endometrium was destroyed by Nd-YAG laser under B-ultrasonic monitoring through direct application of a fibrous catheter specially designed by Dong-lin Lee (an optical engineer). Of the 12 patients, 11 had dysfunctional uterine bleeding and one had postmenopausal uterine bleeding. The mean laser treatment time was 5 min 45 s, the mean volume of irrigating solution was 850 ml, and the mean estimated blood loss was less than 9 ml. No patient showed evidence of infection or uterine perforation. Follow-up examination for 3-16 months showed amenorrhea in 4 cases, spotting in 4 cases, and obviously reduced menstrual flow in 3 cases. One patient, partly treated, had normal menstruation.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Kerner, Ryan; Mann, R.B.
We investigate quantum tunnelling methods for calculating black hole temperature, specifically the null-geodesic method of Parikh and Wilczek and the Hamilton-Jacobi Ansatz method of Angheben et al. We consider application of these methods to a broad class of spacetimes with event horizons, including Rindler and nonstatic spacetimes such as Kerr-Newman and Taub-NUT. We obtain a general form for the temperature of Taub-NUT-AdS black holes that is commensurate with other methods. We examine the limitations of these methods for extremal black holes, taking the extremal Reissner-Nordstrom spacetime as a case in point.
Scope and applications of translation invariant wavelets to image registration
NASA Technical Reports Server (NTRS)
Chettri, Samir; LeMoigne, Jacqueline; Campbell, William
1997-01-01
The first part of this article introduces the notion of translation invariance in wavelets and discusses several wavelets that have this property. The second part discusses the possible applications of such wavelets to image registration. In the case of registration of affinely transformed images, we would conclude that the notion of translation invariance is not really necessary. What is needed is affine invariance and one way to do this is via the method of moment invariants. Wavelets or, in general, pyramid processing can then be combined with the method of moment invariants to reduce the computational load.
Prediction and characterization of application power use in a high-performance computing environment
Bugbee, Bruce; Phillips, Caleb; Egan, Hilary; ...
2017-02-27
Power use in data centers and high-performance computing (HPC) facilities has grown in tandem with increases in the size and number of these facilities. Substantial innovation is needed to enable meaningful reduction in energy footprints in leadership-class HPC systems. In this paper, we focus on characterizing and investigating application-level power usage. We demonstrate potential methods for predicting power usage based on a priori and in situ characteristics. Lastly, we highlight a potential use case of this method through a simulated power-aware scheduler using historical jobs from a real scientific HPC system.
Craniofacial Reconstruction Using Rational Cubic Ball Curves
Majeed, Abdul; Mt Piah, Abd Rahni; Gobithaasan, R. U.; Yahya, Zainor Ridzuan
2015-01-01
This paper proposes the reconstruction of craniofacial fractures using rational cubic Ball curves. The Ball curve is chosen for its computational efficiency compared with the Bezier curve. The main steps are conversion of Digital Imaging and Communications in Medicine (Dicom) images to binary images, boundary extraction and corner point detection, Ball curve fitting with a genetic algorithm, and conversion of the final solution back to Dicom format. The last section illustrates a real case of craniofacial reconstruction using the proposed method, which clearly indicates its applicability. A Graphical User Interface (GUI) has also been developed for practical application. PMID:25880632
Blind Source Separation of Seismic Events with Independent Component Analysis: CTBT related exercise
NASA Astrophysics Data System (ADS)
Rozhkov, Mikhail; Kitov, Ivan
2015-04-01
Blind Source Separation (BSS) methods used in signal recovery applications are attractive because they use minimal a priori information about the signals they are dealing with. Homomorphic deconvolution and cepstrum estimation are probably the only methods used to any extent in CTBT applications that can be attributed to this branch of technology. However, Expert Technical Analysis (ETA) conducted at the CTBTO to improve the estimated values of the standard signal and event parameters according to the Protocol to the CTBT may face problems which cannot be resolved with certified CTBTO applications and may demand specific techniques not presently used. The problem considered within the ETA framework is the unambiguous separation of signals with close arrival times. Here, we examine two scenarios of interest: (1) separation of two almost co-located explosions conducted within fractions of a second, and (2) extraction of explosion signals merged with wavetrains from a strong earthquake. Resolving case 1 is important for correct estimation of explosion yield. Case 2 is a well-known scenario for conducting clandestine nuclear tests. While the first case can to some degree be approached with cepstral methods, the second case can hardly be resolved with the conventional methods implemented at the International Data Centre, especially if the signals have close slowness and azimuth. Independent Component Analysis (in its FastICA implementation), which assumes non-Gaussianity of the underlying processes in the signal mixture, is the blind source separation method that we apply to resolve the problems mentioned above. We have tested this technique with synthetic waveforms, seismic data from DPRK explosions and mining blasts conducted within the East European platform, as well as with signals from strong teleseismic events (Sumatra, April 2012, Mw=8.6, and Tohoku, March 2011, Mw=9.0 earthquakes). The data were recorded by seismic arrays of the International Monitoring System of the CTBTO and by the small-aperture seismic array Mikhnevo (MHVAR) operated by the Institute of Geosphere Dynamics, Russian Academy of Sciences. Our approach demonstrated a good ability to separate seismic sources with very close origin times and locations (hundreds of meters) and/or close arrival times (fractions of a second), and to recover their waveforms from the mixture. Perspectives and limitations of the method are discussed.
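No code-level details are given in the abstract; as a generic, self-contained illustration of FastICA-based blind source separation on two synthetic mixed traces (not the CTBTO processing chain), the Python sketch below uses scikit-learn.

    import numpy as np
    from sklearn.decomposition import FastICA

    # Two synthetic non-Gaussian "sources" standing in for overlapping signals.
    t = np.linspace(0.0, 10.0, 4000)
    s1 = np.sin(7.0 * t) * np.exp(-0.2 * t)
    s2 = np.sign(np.sin(3.0 * t))
    S = np.c_[s1, s2]

    A = np.array([[1.0, 0.6], [0.4, 1.0]])    # unknown mixing matrix
    X = S @ A.T                               # observed two-channel mixture

    ica = FastICA(n_components=2, random_state=0)
    S_est = ica.fit_transform(X)              # recovered sources, up to scale and order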
Infrared Cephalic-Vein to Assist Blood Extraction Tasks: Automatic Projection and Recognition
NASA Astrophysics Data System (ADS)
Lagüela, S.; Gesto, M.; Riveiro, B.; González-Aguilera, D.
2017-05-01
The thermal infrared band is not commonly used in photogrammetric and computer vision algorithms, mainly due to the low spatial resolution of this type of imagery. However, this band captures sub-superficial information, extending the capabilities of the visible bands with regard to applications. This fact is especially important in biomedicine and biometrics, allowing the geometric characterization of interior organs and pathologies with photogrammetric principles, as well as automatic identification and labelling using computer vision algorithms. This paper presents advances of close-range photogrammetry and computer vision applied to thermal infrared imagery, with the final application of Augmented Reality in order to widen its use in the biomedical field. In this case, the thermal infrared image of the arm is acquired and simultaneously projected on the arm, together with the identification label of the cephalic vein. In this way, blood analysts are assisted in finding the vein for blood extraction, especially in those cases where identification by the human eye is a complex task. Vein recognition is performed based on the Gaussian temperature distribution in the area of the vein, while the calibration between projector and thermographic camera is achieved through feature extraction and pattern recognition. The method is validated through its application to a set of volunteers of different ages and genders, in such a way that different conditions of body temperature and vein depth are covered for the applicability and reproducibility of the method.
A Study on the Evaluation of the Applicability of an Environmental Education Modular Curriculum
ERIC Educational Resources Information Center
Artun, Hüseyin; Özsevgeç, Tuncay
2016-01-01
The purpose of this study was, in line with the views of the students & teacher, to examine Environmental Education Modular Curriculum (EEMC) developed to give environmental education with a specific content. In the study, the case study method was used. The research sample was determined with the purposeful sampling method & made up of 23…
Accommodating Uncertainty in Prior Distributions
DOE Office of Scientific and Technical Information (OSTI.GOV)
Picard, Richard Roy; Vander Wiel, Scott Alan
2017-01-19
A fundamental premise of Bayesian methodology is that a priori information is accurately summarized by a single, precisely defined prior distribution. In many cases, especially involving informative priors, this premise is false, and the (mis)application of Bayes methods produces posterior quantities whose apparent precisions are highly misleading. We examine the implications of uncertainty in prior distributions, and present graphical methods for dealing with them.
Code of Federal Regulations, 2014 CFR
2014-04-01
... in funding method, and experience gains and losses of previous years. (3) Limit adjustment. The term... full funding limitation described in paragraph (k) of the section, where applicable) with respect to a... account under section 412(b)(3). In the case of a plan using a spread gain funding method which maintains...
Code of Federal Regulations, 2012 CFR
2012-04-01
... in funding method, and experience gains and losses of previous years. (3) Limit adjustment. The term... full funding limitation described in paragraph (k) of the section, where applicable) with respect to a... account under section 412(b)(3). In the case of a plan using a spread gain funding method which maintains...
Code of Federal Regulations, 2010 CFR
2010-04-01
... method, and experience gains and losses of previous years. (3) Limit adjustment. The term “limit... (k) of the section, where applicable) with respect to a given plan year in computing deductible... case of a plan using a spread gain funding method which maintains an unfunded liability (e.g., the...
Code of Federal Regulations, 2011 CFR
2011-04-01
... in funding method, and experience gains and losses of previous years. (3) Limit adjustment. The term... full funding limitation described in paragraph (k) of the section, where applicable) with respect to a... account under section 412(b)(3). In the case of a plan using a spread gain funding method which maintains...
Pustejovsky, James E; Swan, Daniel M
2015-01-01
Partial interval recording (PIR) is a procedure for collecting measurements during direct observation of behavior. It is used in several areas of educational and psychological research, particularly in connection with single-case research. Measurements collected using partial interval recording suffer from construct invalidity because they are not readily interpretable in terms of the underlying characteristics of the behavior. Using an alternating renewal process model for the behavior under observation, we demonstrate that ignoring the construct invalidity of PIR data can produce misleading inferences, such as inferring that an intervention reduces the prevalence of an undesirable behavior when in fact it has the opposite effect. We then propose four different methods for analyzing PIR summary measurements, each of which can be used to draw inferences about interpretable behavioral parameters. We demonstrate the methods by applying them to data from two single-case studies of problem behavior.
Color separation of signature and stamp inks to facilitate handwriting examination.
Chaikovsky, Alan; Brown, Sharon; David, Laser Sin; Balman, Alex; Barzovski, Avner
2003-11-01
The questioned documents laboratory often encounters cases where handwriting that is to be examined intersects with some interfering factor such as a rubber stamp, typewriting or background printing. In these cases, line direction, beginning and ending features of letters and other fine details of the handwriting may be lost in the "noise" of the intersecting ink. The purpose of this paper is to show several new digital photography methods that may be used to "subtract" the effect of the intersecting ink, thereby enhancing that of the handwriting ink in order to enable the document examiner to conduct a complete examination. These methods have the advantage of being fast and do not involve the use of expensive material or equipment. Several new methods are described that may be used to separate the colors of the handwriting ink from those of the intersecting ink: the analog method and several digital methods such as RGB-HSB-CMYK, L*a*b color, and color separation using the Channel Mixer function of Adobe Photoshop. Successful application of these color separation methods to specific handwriting ink/rubber stamp ink color combinations shows that the effect of the intersecting ink may indeed be minimized if not cancelled altogether. Application of the suggested methods may well make the difference between an inconclusive handwriting examination and a full analysis of the questioned handwriting.
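Adobe Photoshop's Channel Mixer is a commercial tool; as a rough, hypothetical code equivalent (not the validated procedure of the paper), the Python sketch below builds a grayscale image from a weighted mix of the RGB channels so that, for example, a red stamp ink almost disappears while a blue or black handwriting ink stays dark. The file name and channel weights are illustrative assumptions.

    import numpy as np
    from PIL import Image

    img = np.asarray(Image.open("questioned_document.png").convert("RGB"), dtype=float)
    r, g, b = img[..., 0], img[..., 1], img[..., 2]

    # In the red channel a red stamp ink reflects strongly and nearly vanishes,
    # while blue/black handwriting ink remains dark; weighting the channels lets
    # the examiner tune this trade-off, analogous to the Channel Mixer.
    mixed = np.clip(1.0 * r + 0.2 * g - 0.2 * b, 0, 255).astype(np.uint8)
    Image.fromarray(mixed).save("handwriting_enhanced.png")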
ERIC Educational Resources Information Center
Bohanon, Hank; Fenning, Pamela; Hicks, Kira; Weber, Stacey; Thier, Kimberly; Aikins, Brigit; Morrissey, Kelly; Briggs, Alissa; Bartucci, Gina; McArdle, Lauren; Hoeper, Lisa; Irvin, Larry
2012-01-01
The purpose of this case study was to expand the literature base regarding the application of high school schoolwide positive behavior support in an urban setting for practitioners and policymakers to address behavior issues. In addition, the study describes the use of the Change Point Test as a method for analyzing time series data that are…
Improvement of Liquefiable Foundation Conditions Beneath Existing Structures.
1985-08-01
filter zones, and drains. Drilling fluids can cause hydraulic fracturing. These hazards can lead to piping and hydraulic fracturing. ... Site conditions have been classified into three cases; Case 1 is for beneath ... which could lead to piping and hydraulic fracturing. Soil Reinforcement: 16. Vibro-replacement (stone and sand columns) - see methods 2 and 3; applicable to ...
1991-01-01
Experience in developing integrated optical devices, nonlinear magnetic-optic materials, high frequency modulators, computer-aided modeling and sophisticated ... high-level presentation and distributed control models for integrating heterogeneous mechanical engineering applications and tools. The design is focused ... statistically accurate worst-case device models for circuit simulation. Present methods of worst-case device design are ad hoc and do not allow the ...
Protein-ligand docking using FFT based sampling: D3R case study.
Padhorny, Dzmitry; Hall, David R; Mirzaei, Hanieh; Mamonov, Artem B; Moghadasi, Mohammad; Alekseenko, Andrey; Beglov, Dmitri; Kozakov, Dima
2018-01-01
Fast Fourier transform (FFT) based approaches have been successful in application to modeling of relatively rigid protein-protein complexes. Recently, we have been able to adapt the FFT methodology to treatment of flexible protein-peptide interactions. Here, we report our latest attempt to expand the capabilities of the FFT approach to treatment of flexible protein-ligand interactions in application to the D3R PL-2016-1 challenge. Based on the D3R assessment, our FFT approach in conjunction with Monte Carlo minimization off-grid refinement was among the top performing methods in the challenge. The potential advantage of our method is its ability to globally sample the protein-ligand interaction landscape, which will be explored in further applications.
Keresztury, László; Rajczy, Katalin; Lászik, András; Gyódi, Eva; Pénzes, Mária; Falus, András; Petrányi, Győző G
2002-03-01
In cases of disputed paternity, the scientific goal is to promote either the exclusion of a falsely accused man or the affiliation of the alleged father. Until now, in addition to anthropologic characteristics, genetic markers including human leukocyte antigen gene variants, erythrocyte antigens and serum proteins were determined for that purpose. Recombinant DNA techniques provided a new set of highly variable genetic markers based on DNA nucleotide sequence polymorphism. From the practical standpoint, the application of these techniques to paternity testing provides greater versatility than do conventional genetic marker systems. The use of methods to detect the polymorphism of human leukocyte antigen loci significantly increases the chance of validation of ambiguous results in paternity testing. The outcome of 2384 paternity cases investigated by serologic and/or DNA-based human leukocyte antigen typing was statistically analyzed. Different cases solved by DNA typing are presented, involving one or two accused men, exclusions and nonexclusions, and tests of the paternity of a deceased man. The results provide evidence for the advantage of the combined application of various techniques in forensic diagnostics and emphasize the outstanding possibilities of DNA-based assays. Representative examples demonstrate the strength of combined techniques in paternity testing.
Cauliflower ear - a minimally invasive treatment method in a wrestling athlete: a case report.
Haik, Josef; Givol, Or; Kornhaber, Rachel; Cleary, Michelle; Ofir, Hagit; Harats, Moti
2018-01-01
Acute auricular hematoma can be caused by direct blunt trauma or other injury to the external ear. It is typically seen in those who practice full contact sports such as boxing, wrestling, and rugby. "Cauliflower ear" deformity, fibrocartilage formation during scarring, is a common complication of auricular hematomas. Therefore, acute drainage of the hematoma and postprocedural techniques for preventing recurrence are necessary for preventing the deformity. There are many techniques although no superior method of treatment has been found. In this case report, we describe a novel method using needle aspiration followed by the application of a magnet and an adapted disc to the affected area of the auricular. This minimally invasive, simple, and accessible method could potentially facilitate the treatment of cauliflower ear among full contact sports athletes.
The Trojan Horse Method in Nuclear Astrophysics
DOE Office of Scientific and Technical Information (OSTI.GOV)
Spitaleri, C.
2010-11-24
The Trojan Horse Method allows for the measurement of cross sections in nuclear reactions between charged particles at astrophysical energies. The basic features of the method are discussed for the case of non-resonant reactions. A review of applications aimed at extracting the bare-nucleus astrophysical S_b(E) factor for two-body processes is presented. Information on the electron screening potential U_e was obtained from comparison with direct experiments of fusion reactions.
Daudelin, Denise H; Selker, Harry P; Leslie, Laurel K
2015-12-01
There is growing appreciation that process improvement holds promise for improving quality and efficiency across the translational research continuum but frameworks for such programs are not often described. The purpose of this paper is to present a framework and case examples of a Research Process Improvement Program implemented at Tufts CTSI. To promote research process improvement, we developed online training seminars, workshops, and in-person consultation models to describe core process improvement principles and methods, demonstrate the use of improvement tools, and illustrate the application of these methods in case examples. We implemented these methods, as well as relational coordination theory, with junior researchers, pilot funding awardees, our CTRC, and CTSI resource and service providers. The program focuses on capacity building to address common process problems and quality gaps that threaten the efficient, timely and successful completion of clinical and translational studies. © 2015 The Authors. Clinical and Translational Science published by Wiley Periodicals, Inc.
Dynamic Monte Carlo description of thermal desorption processes
NASA Astrophysics Data System (ADS)
Weinketz, Sieghard
1994-07-01
The applicability of the dynamic Monte Carlo method of Fichthorn and Weinberg, in which the time evolution of a system is described in terms of the absolute number of different possible microscopic events and their associated transition rates, is discussed for the case of thermal desorption simulations. It is shown that the definition of the time increment at each successful event leads naturally to the macroscopic differential equation of desorption in the case of simple first- and second-order processes, in which the only possible events are desorption and diffusion. This equivalence is numerically demonstrated for a second-order case. Subsequently, the equivalence of this method with the Monte Carlo method of Sales and Zgrablich for more complex desorption processes, allowing for lateral interactions between adsorbates, is shown, even though the dynamic Monte Carlo method does not carry their limitation of a rapid surface diffusion condition; it is thus able to describe a more complex "kinetics" of surface reactive processes, and can therefore be applied to a wider class of phenomena, such as surface catalysis.
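The abstract states the key idea, that the time increment assigned to each successful event reproduces the macroscopic desorption equation, without giving the bookkeeping; a minimal Python sketch of that bookkeeping for simple first-order desorption (hypothetical rate constant, no diffusion or lateral interactions) is given below.

    import numpy as np

    rng = np.random.default_rng(0)
    k_des = 1.0            # first-order desorption rate constant (arbitrary units)
    n_ads = 10_000         # adsorbed particles initially on the surface
    t = 0.0
    trace = []

    while n_ads > 0:
        r_total = k_des * n_ads                      # total rate of all possible events
        t += -np.log(1.0 - rng.random()) / r_total   # stochastic time increment
        n_ads -= 1                                   # the selected event is a desorption
        trace.append((t, n_ads))

    # The recorded coverage decays as exp(-k_des * t), i.e. it follows the
    # macroscopic first-order rate equation d(theta)/dt = -k_des * theta.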
The application of hybrid artificial intelligence systems for forecasting
NASA Astrophysics Data System (ADS)
Lees, Brian; Corchado, Juan
1999-03-01
The results to date are presented from an ongoing investigation, in which the aim is to combine the strengths of different artificial intelligence methods into a single problem solving system. The premise underlying this research is that a system which embodies several cooperating problem solving methods will be capable of achieving better performance than if only a single method were employed. The work has so far concentrated on the combination of case-based reasoning and artificial neural networks. The relative merits of artificial neural networks and case-based reasoning problem solving paradigms, and their combination are discussed. The integration of these two AI problem solving methods in a hybrid systems architecture, such that the neural network provides support for learning from past experience in the case-based reasoning cycle, is then presented. The approach has been applied to the task of forecasting the variation of physical parameters of the ocean. Results obtained so far from tests carried out in the dynamic oceanic environment are presented.
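As a rough illustration of how a neural network can support the reuse step of a case-based reasoning cycle, the sketch below retrieves the most similar past cases and trains a small network on them to produce the forecast; the feature layout, sample sizes, and scikit-learn components are assumptions for illustration and do not reproduce the authors' system:

```python
import numpy as np
from sklearn.neighbors import NearestNeighbors
from sklearn.neural_network import MLPRegressor

# Synthetic case base: past situations (feature vectors) and the value of an
# oceanic parameter observed some time later (the quantity to forecast).
rng = np.random.default_rng(1)
X_cases = rng.normal(size=(500, 8))
y_cases = 0.7 * X_cases[:, 0] + np.sin(X_cases[:, 1]) + rng.normal(0.0, 0.05, 500)

# Retrieve: index the case base so the closest past cases can be found quickly.
retriever = NearestNeighbors(n_neighbors=25).fit(X_cases)

def forecast(x_new):
    """Reuse/adapt: fit a small neural network to the retrieved neighbourhood,
    so the prediction generalises from past experience instead of simply
    copying the single nearest case."""
    _, idx = retriever.kneighbors(x_new.reshape(1, -1))
    net = MLPRegressor(hidden_layer_sizes=(16,), max_iter=2000, random_state=0)
    net.fit(X_cases[idx[0]], y_cases[idx[0]])
    return float(net.predict(x_new.reshape(1, -1))[0])

print(forecast(rng.normal(size=8)))
```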
NASA Astrophysics Data System (ADS)
Barreto, Patricia R. P.; Cruz, Ana Claudia P. S.; Barreto, Rodrigo L. P.; Palazzetti, Federico; Albernaz, Alessandra F.; Lombardi, Andrea; Maciel, Glauciete S.; Aquilanti, Vincenzo
2017-07-01
The spherical-harmonics expansion is a mathematically rigorous procedure and a powerful tool for the representation of potential energy surfaces of interacting molecular systems, determining their spectroscopic and dynamical properties, specifically in van der Waals clusters, with applications also to classical and quantum molecular dynamics simulations. The technique consists in the construction (by ab initio or semiempirical methods) of the expanded potential interaction up to terms that provide the generation of a number of leading configurations sufficient to account for faithful geometrical representations. This paper reports the full general description of the method of the spherical-harmonics expansion as applied to diatomic-molecule–diatomic-molecule systems of increasing complexity: the presentation of the mathematical background is given for providing both the application to the prototypical cases considered previously (O2–O2, N2–N2, and N2–O2 systems) and the generalization to: (i) the CO–CO system, where a characteristic feature is the lower symmetry order with respect to the cases studied before, requiring a larger number of expansion terms necessary to adequately represent the potential energy surface; and (ii) the CO–HF system, which exhibits the lowest order of symmetry among this class of aggregates and therefore the highest number of leading configurations.
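In generic textbook notation (which may differ from the symbols used in the paper), the expansion for a diatom-diatom pair can be sketched as

$$
V(R,\theta_1,\theta_2,\phi)=\sum_{L_1,L_2,L} v_{L_1 L_2 L}(R)\,A_{L_1 L_2 L}(\theta_1,\theta_2,\phi),
$$

where \(R\) is the distance between the molecular centres of mass, \((\theta_1,\theta_2,\phi)\) describe the mutual orientation, \(A_{L_1 L_2 L}\) are coupled (bipolar) spherical-harmonic angular functions, and the radial moments \(v_{L_1 L_2 L}(R)\) are fitted to ab initio or semiempirical energies of the leading configurations. Lower molecular symmetry leaves more \((L_1,L_2,L)\) terms nonvanishing, which is why the CO–CO and especially the CO–HF systems require more leading configurations than the O2–O2 family.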
The Worst-Case Weighted Multi-Objective Game with an Application to Supply Chain Competitions
Qu, Shaojian; Ji, Ying
2016-01-01
In this paper, we propose a worst-case weighted approach to the multi-objective n-person non-zero-sum game model in which each player has more than one competing objective. Our "worst-case weighted multi-objective game" model supposes that each player has a set of weights for its objectives and wishes to minimize its maximum weighted sum of objectives, where the maximization is taken over the set of weights. This new model gives rise to a new Pareto Nash equilibrium concept, which we call the "robust-weighted Nash equilibrium". We prove that robust-weighted Nash equilibria are guaranteed to exist even when the weight sets are unbounded. For the worst-case weighted multi-objective game with the weight sets of all players given as polytopes, we show that a robust-weighted Nash equilibrium can be obtained by solving a mathematical program with equilibrium constraints (MPEC). As an application, we illustrate the usefulness of the worst-case weighted multi-objective game on a supply chain risk management problem under demand uncertainty. By comparison with the existing weighted approach, we show that our method is more robust and can be used more efficiently in real-world applications. PMID:26820512
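In generic notation (an illustrative restatement, not the paper's exact formulation), player \(i\) with objective vector \(f_i=(f_{i1},\dots,f_{im_i})\), strategy set \(X_i\), and weight set \(W_i\) solves, given the other players' strategies \(x_{-i}\),

$$
\min_{x_i\in X_i}\;\max_{w_i\in W_i}\;\sum_{j=1}^{m_i} w_{ij}\,f_{ij}(x_i,x_{-i}),
$$

and a robust-weighted Nash equilibrium is a profile \(x^{*}=(x_1^{*},\dots,x_n^{*})\) such that each \(x_i^{*}\) solves this worst-case weighted problem with \(x_{-i}=x_{-i}^{*}\).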
Light Weight MP3 Watermarking Method for Mobile Terminals
NASA Astrophysics Data System (ADS)
Takagi, Koichi; Sakazawa, Shigeyuki; Takishima, Yasuhiro
This paper proposes a novel MP3 watermarking method which is applicable to a mobile terminal with limited computational resources. Considering that in most cases the embedded information is copyright information or metadata, which should be extracted before playing back audio contents, the watermark detection process should be executed at high speed. However, when conventional methods are used with a mobile terminal, it takes a considerable amount of time to detect a digital watermark. This paper focuses on scalefactor manipulation to enable high speed watermark embedding/detection for MP3 audio and also proposes the manipulation method which minimizes audio quality degradation adaptively. Evaluation tests showed that the proposed method is capable of embedding 3 bits/frame information without degrading audio quality and detecting it at very high speed. Finally, this paper describes application examples for authentication with a digital signature.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Smith, K.L.; Robinson, C.A.; Ikonen, A.T.K.
2007-07-01
The protection of the environment from the effects of ionising radiation has become increasingly topical over the last few years as the intentions enshrined in international principles and agreements have become more binding through national and international law. For example, the Directive on the impact of certain projects on the environment (EIA Directive 85/337/EEC) [CEC, 1985], amended in 1997 [CEC, 1997], places a mandatory requirement on all EU Member States to conduct environmental impact assessments for a range of projects having potential impact on the environment, including radioactive waste disposal. Such assessments must consider humans, fauna and flora, the abiotic environment (soil, water, air), material assets and cultural heritage as well as the interactions between these factors. In Finland, Posiva Oy is responsible for the overall repository programme for spent nuclear fuel and, as such, is conducting the Safety Case assessment for a proposed geological repository for nuclear waste. Within the European legislative framework, the Finnish regulatory body requires that the repository safety case assessment should include not only human radiological safety, but also an assessment of the potential impact upon populations of non-human biota. Specifically, the Safety Case should demonstrate that there will be: - no decline in the biodiversity of currently living populations; - no significant detriment to populations of fauna and flora; and - no detrimental effects on individuals of domestic animals and rare plants and animals. At present, there are no internationally agreed criteria that explicitly address protection of the environment from ionising radiation. However, over recent years a number of assessment methodologies have been developed including, at a European level, the Framework for the Assessment of Environmental Impact (FASSET) and Environmental Risks from Ionising Contaminants (ERICA). The International Commission on Radiological Protection (ICRP) has also proposed an approach to allow for assessments of potential impacts on non-human species, in its 2003 report. This approach is based on the development and use of a small set of reference animals and plants, with their associated dose models and data sets. Such approaches are broadly applicable to the Posiva Safety Case. However, the specific biota of concern and the current climatic conditions within Finland present an additional challenge to the assessment. The assessment methods most applicable to the Posiva Safety Case have therefore been reviewed in consideration of the regulatory requirements for the assessment, and recommendations made on a suitable assessment approach. This has been applied within a test case, and adaptations to the overall assessment method have been made to enable both population and individual impacts to be assessed where necessary. The test case has been undertaken to demonstrate the application of the recommended methodology, but also to identify data gaps, uncertainties and other specific issues associated with the application of an assessment method within the regulatory context. (authors)
van Kasteren, Yasmin; Musiat, Peter; Kidd, Michael
2018-01-01
Background My Health Record (MyHR) is Australia’s national electronic health record (EHR) system. Poor usability and functionality have resulted in low utility, affecting enrollment and participation rates by both patients and clinicians alike. Similar to apps on mobile phone app stores, innovative third-party applications of MyHR platform data can enhance the usefulness of the platform, but there is a paucity of research into the processes involved in developing third-party applications that integrate and use data from EHR systems. Objective The research describes the challenges involved in pioneering the development of a patient and clinician Web-based software application for MyHR and insights resulting from this experience. Methods This research uses a case study approach, investigating the development and implementation of Actionable Intime Insights (AI2), a third-party application for MyHR, which translates Medicare claims records stored in MyHR into a clinically meaningful timeline visualization of health data for both patients and clinicians. This case study identifies the challenges encountered by the Personal Health Informatics team from Flinders University in the MyHR third-party application development environment. Results The study presents a nuanced understanding of different data types and quality of data in MyHR and the complexities associated with developing secondary-use applications. Regulatory requirements associated with utilization of MyHR data, restrictions on visualizations of data, and processes of testing third-party applications were encountered during the development of the application. Conclusions This study identified several processes, technical and regulatory barriers which, if addressed, can make MyHR a thriving ecosystem of health applications. It clearly identifies opportunities and considerations for the Australian Digital Health Agency and other national bodies wishing to encourage the development of new and innovative use cases for national EHRs. PMID:29691211
NASA Technical Reports Server (NTRS)
Singhal, Surendra N.
2003-01-01
The SAE G-11 RMSL (Reliability, Maintainability, Supportability, and Logistics) Division activities include identification and fulfillment of joint industry, government, and academia needs for development and implementation of RMSL technologies. Four projects in the Probabilistic Methods area and two in the area of RMSL have been identified. These are: (1) Evaluation of Probabilistic Technology - progress has been made toward the selection of probabilistic application cases. Future effort will focus on assessment of multiple probabilistic software packages in solving selected engineering problems using probabilistic methods. Relevance to Industry & Government - Case studies of typical problems encountering uncertainties, results of solutions to these problems run by different codes, and recommendations on which code is applicable for what problems; (2) Probabilistic Input Preparation - progress has been made in identifying problem cases such as those with no data, little data and sufficient data. Future effort will focus on developing guidelines for preparing input for probabilistic analysis, especially with no or little data. Relevance to Industry & Government - Too often, we get bogged down thinking we need a lot of data before we can quantify uncertainties. Not true. There are ways to do credible probabilistic analysis with little data; (3) Probabilistic Reliability - a probabilistic reliability literature search has been completed along with what differentiates it from statistical reliability. Work on computation of reliability based on quantification of uncertainties in primitive variables is in progress. Relevance to Industry & Government - Correct reliability computations at both the component and system level are needed so one can design an item based on its expected usage and life span; (4) Real World Applications of Probabilistic Methods (PM) - A draft of volume 1, comprising aerospace applications, has been released. Volume 2, a compilation of real world applications of probabilistic methods with essential information demonstrating application type and time/cost savings by the use of probabilistic methods for generic applications, is in progress. Relevance to Industry & Government - Too often, we say, 'The proof is in the pudding'. With help from many contributors, we hope to produce such a document. The problem is that not many people are coming forward due to the proprietary nature of the data. So, we are asking to document only minimum information, including problem description, what method was used, whether it resulted in any savings, and how much; (5) Software Reliability - software reliability concept, program, implementation, guidelines, and standards are being documented. Relevance to Industry & Government - software reliability is a complex issue that must be understood and addressed in all facets of business in industry, government, and other institutions. We address issues, concepts, ways to implement solutions, and guidelines for maximizing software reliability; (6) Maintainability Standards - maintainability/serviceability industry standards/guidelines and industry best practices and methodologies used in performing maintainability/serviceability tasks are being documented. Relevance to Industry & Government - Any industry or government process, project, and/or tool must be maintained and serviced to realize the life and performance it was designed for. We address issues and develop guidelines for optimum performance and life.
Scaling images using their background ratio. An application in statistical comparisons of images.
Kalemis, A; Binnie, D; Bailey, D L; Flower, M A; Ott, R J
2003-06-07
Comparison of two medical images often requires image scaling as a pre-processing step. This is usually done with the scaling-to-the-mean or scaling-to-the-maximum techniques which, under certain circumstances, in quantitative applications may contribute a significant amount of bias. In this paper, we present a simple scaling method which assumes only that the most predominant values in the corresponding images belong to their background structure. The ratio of the two images to be compared is calculated and its frequency histogram is plotted. The scaling factor is given by the position of the peak in this histogram which belongs to the background structure. The method was tested against the traditional scaling-to-the-mean technique on simulated planar gamma-camera images which were compared using pixelwise statistical parametric tests. Both sensitivity and specificity for each condition were measured over a range of different contrasts and sizes of inhomogeneity for the two scaling techniques. The new method was found to preserve sensitivity in all cases while the traditional technique resulted in significant degradation of sensitivity in certain cases.
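A minimal numpy sketch of the scaling idea described above, assuming both images are positive arrays of the same shape; the bin count, masking, and histogram smoothing would need tuning for real gamma-camera data:

```python
import numpy as np

def background_ratio_scale(img_a, img_b, n_bins=256):
    """Estimate the scaling factor between two images from the peak of the
    histogram of their pixel-wise ratio, assuming the most frequent ratio
    values come from the (dominant) background structure.
    """
    ratio = img_a / np.maximum(img_b, 1e-12)        # avoid division by zero
    counts, edges = np.histogram(ratio, bins=n_bins)
    peak = np.argmax(counts)                        # mode of the ratio histogram
    return 0.5 * (edges[peak] + edges[peak + 1])    # bin centre at the peak

# img_b * background_ratio_scale(img_a, img_b) is then directly comparable to
# img_a before pixelwise statistical testing.
```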
NASA Astrophysics Data System (ADS)
Bortolozo, Cassiano Antonio; Bokhonok, Oleg; Porsani, Jorge Luís; Monteiro dos Santos, Fernando Acácio; Diogo, Liliana Alcazar; Slob, Evert
2017-11-01
Ambiguities in geophysical inversion results are always present, and how they appear is in most cases open to interpretation. It is therefore interesting to investigate ambiguities with regard to the parameters of the models under study. The Residual Function Dispersion Map (RFDM) can be used to differentiate between global ambiguities and local minima in the objective function. We apply the RFDM to Vertical Electrical Sounding (VES) and TEM sounding inversion results. Through topographic analysis of the objective function we evaluate the advantages and limitations of electrical sounding data compared with TEM sounding data, and the benefits of joint inversion in comparison with the individual methods. The RFDM analysis proved to be a very useful tool for understanding the joint VES/TEM inversion method. The applicability of the RFDM analysis to real data is also explored, to demonstrate both how the objective function of real data behaves and how the approach performs in real cases. From the analysis of the results, it is possible to understand how joint inversion can reduce the ambiguity of the individual methods.
Myers, Lori K
2016-04-01
Neuroplasticity theory has gained considerable attention in recent years in the professions of medicine, psychology and neuroscience. Most research on neuroplasticity has been in neurology focusing on stroke and other central nervous system disease and injury. Further research is necessary to advance the connection of neuroplasticity theory to musculoskeletal conditions and rehabilitation. The theory of neuroplasticity as it applies to the acquisition of new skills and modification of maladaptive, pain-perpetuating and inefficient movement patterns is fundamental to the Feldenkrais Method. This case report demonstrates the application of neuroplasticity theory with the Feldenkrais Method as the primary intervention for a 42-year-old female runner with a history of adolescent idiopathic scoliosis who presented with hip and lumbar pain. The client had clinically meaningful improvements in pain intensity and the Global Rating of Change scale while meeting her goals to resume pain free running, repetitive stair climbing at work, and other leisure activities. Copyright © 2015 Elsevier Ltd. All rights reserved.
NASA Astrophysics Data System (ADS)
Liu, Shibing; Yang, Bingen
2017-10-01
Flexible multistage rotor systems with water-lubricated rubber bearings (WLRBs) have a variety of engineering applications. Filling a technical gap in the literature, this effort proposes a method of optimal bearing placement that minimizes the vibration amplitude of a WLRB-supported flexible rotor system with a minimum number of bearings. In the development, a new model of WLRBs and a distributed transfer function formulation are used to define a mixed continuous-and-discrete optimization problem. To deal with the case of an uncertain number of WLRBs in rotor design, a virtual bearing method is devised. Solution of the optimization problem by a real-coded genetic algorithm yields the locations and lengths of the water-lubricated rubber bearings by which the prescribed operational requirements for the rotor system are satisfied. The proposed method is applicable either to preliminary design of a new rotor system with the number of bearings not known in advance or to redesign of an existing rotor system with a given number of bearings. Numerical examples show that the proposed optimal bearing placement is efficient, accurate and versatile in different design cases.
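The optimization step can be sketched as a bare-bones real-coded genetic algorithm applied to a black-box objective standing in for the rotor model's peak vibration amplitude; the operators, parameter values, and toy objective below are assumptions for illustration and omit the paper's transfer-function model and virtual-bearing handling:

```python
import numpy as np

def real_coded_ga(objective, bounds, pop_size=40, generations=200, seed=0):
    """Bare-bones real-coded genetic algorithm (blend crossover plus Gaussian
    mutation) minimizing a black-box objective, e.g. the peak vibration
    amplitude returned by a rotor-dynamics model for a candidate vector of
    bearing locations and lengths."""
    rng = np.random.default_rng(seed)
    lo, hi = np.array(bounds, dtype=float).T
    pop = rng.uniform(lo, hi, size=(pop_size, lo.size))
    for _ in range(generations):
        fitness = np.array([objective(x) for x in pop])
        parents = pop[np.argsort(fitness)][: pop_size // 2]    # lower is better
        children = []
        while len(children) < pop_size - len(parents):
            a, b = parents[rng.integers(len(parents), size=2)]
            w = rng.uniform(-0.25, 1.25, size=lo.size)          # BLX-style blend crossover
            child = w * a + (1.0 - w) * b
            child += rng.normal(0.0, 0.01 * (hi - lo))          # Gaussian mutation
            children.append(np.clip(child, lo, hi))
        pop = np.vstack([parents, np.array(children)])
    fitness = np.array([objective(x) for x in pop])
    return pop[np.argmin(fitness)]

# Stand-in objective: two normalized bearing positions with a known optimum.
best = real_coded_ga(lambda x: (x[0] - 0.3) ** 2 + (x[1] - 0.7) ** 2,
                     bounds=[(0.0, 1.0), (0.0, 1.0)])
```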
Biomedical coatings on magnesium alloys - a review.
Hornberger, H; Virtanen, S; Boccaccini, A R
2012-07-01
This review comprehensively covers research carried out in the field of degradable coatings on Mg and Mg alloys for biomedical applications. Several coating methods are discussed, which can be divided, based on the specific processing techniques used, into conversion and deposition coatings. The literature review revealed that in most cases coatings increase the corrosion resistance of Mg and Mg alloys. The critical factors determining coating performance, such as corrosion rate, surface chemistry, adhesion and coating morphology, are identified and discussed. The analysis of the literature showed that many studies have focused on calcium phosphate coatings produced either using conversion or deposition methods which were developed for orthopaedic applications. However, the control of phases and the formation of cracks still appear unsatisfactory. More research and development is needed in the case of biodegradable organic based coatings to generate reproducible and relevant data. In addition to biocompatibility, the mechanical properties of the coatings are also relevant, and the development of appropriate methods to study the corrosion process in detail and in the long term remains an important area of research. Copyright © 2012 Acta Materialia Inc. Published by Elsevier Ltd. All rights reserved.
Investigations into distribution of lidocaine in human autopsy material.
Oertel, Reinhard; Arenz, Norman; Zeitz, Sten Gunnar; Pietsch, Jörg
2015-08-01
Drugs are often detected in autopsy material by screening methods in legal medicine. In this study, lidocaine, used as an antiarrhythmic and local anesthetic drug, was detected in fifty-one cases and quantified in different autopsy materials. For the first time, the comparison of so many distribution patterns of lidocaine across human compartments was possible. A liquid-liquid extraction procedure, a standard addition method, and LC/MS/MS were used for the analysis. The measured blood concentrations were in the therapeutic range or lower. The time between lidocaine application and death was known in twenty-nine cases; these data were very helpful for estimating and interpreting the distribution of lidocaine between application and death, since this interval exerted a crucial influence on the distribution of lidocaine among the compartments. Most of the intravenously administered lidocaine was found in heart blood after a very short distribution time. Thereafter, the highest concentrations were measured in the brain, and later the highest concentration was found in the kidney samples or in urine. If the time between lidocaine application and death is known, the results of this study can be used to deepen the knowledge of its pharmacokinetics. If this time is unknown, the circumstances and causes of death can be better explained. Copyright © 2015 John Wiley & Sons, Ltd.
Optimal control penalty finite elements - Applications to integrodifferential equations
NASA Astrophysics Data System (ADS)
Chung, T. J.
The application of the optimal-control/penalty finite-element method to the solution of integrodifferential equations in radiative-heat-transfer problems (Chung et al.; Chung and Kim, 1982) is discussed and illustrated. The nonself-adjointness of the convective terms in the governing equations is treated by utilizing optimal-control cost functions and employing penalty functions to constrain auxiliary equations which permit the reduction of second-order derivatives to first order. The OCPFE method is applied to combined-mode heat transfer by conduction, convection, and radiation, both without and with scattering and viscous dissipation; the results are presented graphically and compared to those obtained by other methods. The OCPFE method is shown to give good results in cases where standard Galerkin FE fail, and to facilitate the investigation of scattering and dissipation effects.
Web-conference supervision for advanced psychotherapy training: a practical guide.
Abbass, Allan; Arthey, Stephen; Elliott, Jason; Fedak, Tim; Nowoweiski, Dion; Markovski, Jasmina; Nowoweiski, Sarah
2011-06-01
The advent of readily accessible, inexpensive Web-conferencing applications has opened the door for distance psychotherapy supervision, using video recordings of treated clients. Although relatively new, this method of supervision is advantageous given the ease of use and low cost of various Internet applications. This method allows periodic supervision from point to point around the world, with no travel costs and no long gaps between direct training contacts. Web-conferencing permits face-to-face training so that the learner and supervisor can read each other's emotional responses while reviewing case material. It allows group learning from direct supervision to complement local peer-to-peer learning methods. In this article, we describe the relevant literature on this type of learning method, the practical points in its utilization, its limitations, and its benefits.
Detecting submerged objects: the application of side scan sonar to forensic contexts.
Schultz, John J; Healy, Carrie A; Parker, Kenneth; Lowers, Bim
2013-09-10
Forensic personnel must deal with numerous challenges when searching for submerged objects. While traditional water search methods have generally involved using dive teams, remotely operated vehicles (ROVs), and water scent dogs for cases involving submerged objects and bodies, law enforcement is increasingly integrating multiple methods that include geophysical technologies. There are numerous advantages for integrating geophysical technologies, such as side scan sonar and ground penetrating radar (GPR), with more traditional search methods. Overall, these methods decrease the time involved searching, in addition to increasing area searched. However, as with other search methods, there are advantages and disadvantages when using each method. For example, in instances with excessive aquatic vegetation or irregular bottom terrain, it may not be possible to discern a submersed body with side scan sonar. As a result, forensic personnel will have the highest rate of success during searches for submerged objects when integrating multiple search methods, including deploying multiple geophysical technologies. The goal of this paper is to discuss the methodology of various search methods that are employed for submerged objects and how these various methods can be integrated as part of a comprehensive protocol for water searches depending upon the type of underwater terrain. In addition, two successful case studies involving the search and recovery of a submerged human body using side scan sonar are presented to illustrate the successful application of integrating a geophysical technology with divers when searching for a submerged object. Copyright © 2013 Elsevier Ireland Ltd. All rights reserved.
NASA Astrophysics Data System (ADS)
de Laborderie, J.; Duchaine, F.; Gicquel, L.; Vermorel, O.; Wang, G.; Moreau, S.
2018-06-01
Large-Eddy Simulation (LES) is recognized as a promising method for high-fidelity flow predictions in turbomachinery applications. The presented approach consists of the coupling of several instances of the same unstructured LES solver through an overset grid method. A high-order interpolation, implemented within this coupling method, is introduced and evaluated on several test cases. It is shown to be third-order accurate, to preserve the accuracy of various second- and third-order convective schemes, and to ensure the continuity of diffusive fluxes and subgrid-scale tensors even in detrimental interface configurations. In this analysis, three types of spurious waves generated at the interface are identified. They are significantly reduced by the high-order interpolation at the interface. Since the high-order interpolation has the same cost as the original lower-order method, the high-order overset grid method appears to be a promising alternative for use in all these applications.
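As a one-dimensional illustration of why a three-point (quadratic) Lagrange stencil yields third-order accuracy, the sketch below interpolates donor values to a receiver point and checks that the error falls roughly by a factor of eight when the donor spacing is halved; it is only a stand-in for the multidimensional, unstructured donor-to-receiver interpolation used at the overset interfaces:

```python
import numpy as np

def lagrange3_interp(x_donor, f_donor, x_target):
    """Quadratic (3-point) Lagrange interpolation; the truncation error is
    O(h^3), i.e. the scheme is third-order accurate."""
    x0, x1, x2 = x_donor
    f0, f1, f2 = f_donor
    l0 = (x_target - x1) * (x_target - x2) / ((x0 - x1) * (x0 - x2))
    l1 = (x_target - x0) * (x_target - x2) / ((x1 - x0) * (x1 - x2))
    l2 = (x_target - x0) * (x_target - x1) / ((x2 - x0) * (x2 - x1))
    return l0 * f0 + l1 * f1 + l2 * f2

# Convergence check on a smooth function: the error should drop by ~8x
# (third order) when the donor spacing is halved.
for h in (0.1, 0.05):
    xd = np.array([0.0, h, 2 * h])
    err = abs(lagrange3_interp(xd, np.sin(xd), 0.5 * h) - np.sin(0.5 * h))
    print(h, err)
```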
NASA Astrophysics Data System (ADS)
Åström, Anders; Forchheimer, Robert
2012-03-01
Based on the Near-Sensor Image Processing (NSIP) concept and recent results concerning optical flow and Time-to- Impact (TTI) computation with this architecture, we show how these results can be used and extended for robot vision applications. The first case involves estimation of the tilt of an approaching planar surface. The second case concerns the use of two NSIP cameras to estimate absolute distance and speed similar to a stereo-matching system but without the need to do image correlations. Going back to a one-camera system, the third case deals with the problem to estimate the shape of the approaching surface. It is shown that the previously developed TTI method not only gives a very compact solution with respect to hardware complexity, but also surprisingly high performance.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Guo, Y.; Keppens, R.; Xia, C.
2016-09-10
We report our implementation of the magneto-frictional method in the Message Passing Interface Adaptive Mesh Refinement Versatile Advection Code (MPI-AMRVAC). The method aims at applications where local adaptive mesh refinement (AMR) is essential to make follow-up dynamical modeling affordable. We quantify its performance in both domain-decomposed uniform grids and block-adaptive AMR computations, using all frequently employed force-free, divergence-free, and other vector comparison metrics. As test cases, we revisit the semi-analytic solution of Low and Lou in both Cartesian and spherical geometries, along with the topologically challenging Titov–Démoulin model. We compare different combinations of spatial and temporal discretizations, and find that the fourth-order central difference with a local Lax–Friedrichs dissipation term in a single-step marching scheme is an optimal combination. The initial condition is provided by the potential field, which is the potential field source surface model in spherical geometry. Various boundary conditions are adopted, ranging from fully prescribed cases where all boundaries are assigned with the semi-analytic models, to solar-like cases where only the magnetic field at the bottom is known. Our results demonstrate that all the metrics compare favorably to previous works in both Cartesian and spherical coordinates. Cases with several AMR levels perform in accordance with their effective resolutions. The magneto-frictional method in MPI-AMRVAC allows us to model a region of interest with high spatial resolution and large field of view simultaneously, as required by observation-constrained extrapolations using vector data provided with modern instruments. The applications of the magneto-frictional method to observations are shown in an accompanying paper.
Oster, Natalia V; Carney, Patricia A; Allison, Kimberly H; Weaver, Donald L; Reisch, Lisa M; Longton, Gary; Onega, Tracy; Pepe, Margaret; Geller, Berta M; Nelson, Heidi D; Ross, Tyler R; Tosteson, Aanna N A; Elmore, Joann G
2013-02-05
Diagnostic test sets are a valuable research tool that contributes importantly to the validity and reliability of studies that assess agreement in breast pathology. In order to fully understand the strengths and weaknesses of any agreement and reliability study, however, the methods should be fully reported. In this paper we provide a step-by-step description of the methods used to create four complex test sets for a study of diagnostic agreement among pathologists interpreting breast biopsy specimens. We use the newly developed Guidelines for Reporting Reliability and Agreement Studies (GRRAS) as a basis to report these methods. Breast tissue biopsies were selected from the National Cancer Institute-funded Breast Cancer Surveillance Consortium sites. We used a random sampling stratified according to woman's age (40-49 vs. ≥50), parenchymal breast density (low vs. high) and interpretation of the original pathologist. A 3-member panel of expert breast pathologists first independently interpreted each case using five primary diagnostic categories (non-proliferative changes, proliferative changes without atypia, atypical ductal hyperplasia, ductal carcinoma in situ, and invasive carcinoma). When the experts did not unanimously agree on a case diagnosis a modified Delphi method was used to determine the reference standard consensus diagnosis. The final test cases were stratified and randomly assigned into one of four unique test sets. We found GRRAS recommendations to be very useful in reporting diagnostic test set development and recommend inclusion of two additional criteria: 1) characterizing the study population and 2) describing the methods for reference diagnosis, when applicable.
The application of use case modeling in designing medical imaging information systems.
Safdari, Reza; Farzi, Jebraeil; Ghazisaeidi, Marjan; Mirzaee, Mahboobeh; Goodini, Azadeh
2013-01-01
Introduction. This paper examines the application of use case modeling in analyzing and designing information systems to support medical imaging services. Methods. The application of use case modeling in analyzing and designing health information systems was examined using electronic database resources (PubMed, Google Scholar), and the characteristics of the modeling approach and its effect on the development and design of health information systems were analyzed. Results. The analysis indicated that provident (forward-looking) modeling of health information systems should provide quick access to many health data resources, so that patient data can be used to expand remote services and comprehensive medical imaging advice. These experiences also show that progressing through the infrastructure development stages via a gradual, iterative evolution of user requirements is more robust and can shorten the requirements engineering cycle in the design of medical imaging information systems. Conclusion. The use case modeling approach can be effective in steering the problems of health and medical imaging information systems towards better understanding, focused inception and analysis, better planning, iteration, and control.
Usability evaluation techniques in mobile commerce applications: A systematic review
NASA Astrophysics Data System (ADS)
Hussain, Azham; Mkpojiogu, Emmanuel O. C.
2016-08-01
There is a considerable body of literature concerning the usability of mobile commerce (m-commerce) applications and related areas, but it does not adequately describe the usability techniques used in most empirical usability evaluations of m-commerce applications. Therefore, this paper aims to identify the usability techniques most frequently used in usability evaluation of m-commerce applications. To achieve this objective, a systematic literature review was employed. Sixty-seven papers on usability evaluation for m-commerce and related areas were retrieved; the twenty-one most relevant studies were selected for review in order to extract the appropriate information. The results of the review show that heuristic evaluation, formal tests, and think-aloud methods are the most commonly used methods for m-commerce applications, compared with cognitive walkthrough and informal test methods. Moreover, most of the studies applied controlled experiments (33.3% of the total), while 14.28% applied case studies for usability evaluation. The results of this paper provide additional knowledge to usability practitioners and the research community on the current state and use of usability techniques in m-commerce applications.
RDBMS Applications as Online Based Data Archive: A Case of Harbour Medical Center in Pekanbaru
NASA Astrophysics Data System (ADS)
Febriadi, Bayu; Zamsuri, Ahmad
2017-12-01
Kantor Kesehatan Pelabuhan Kelas II Pekanbaru is a government office concerned with health, especially environmental health. The office has problems with storing electronic data and with analyzing daily data, both internal and external. It has computers and other tools that would be useful for storing electronic data; in practice, however, the data are still kept in cupboards, which is inefficient for important data that are analyzed more than once and unsuitable for data that need to be analyzed continuously. The Relational Database Management System (RDBMS) application is an online data archive, developed using the System Development Life Cycle (SDLC) method. It is expected that the application will be very useful for the employees of Kantor Kesehatan Pelabuhan Pekanbaru in managing their work.
The time-frequency method of signal analysis in internal combustion engine diagnostics
NASA Astrophysics Data System (ADS)
Avramchuk, V. S.; Kazmin, V. P.; Faerman, V. A.; Le, V. T.
2017-01-01
The paper presents the results of the study of applicability of time-frequency correlation functions to solving the problems of internal combustion engine fault diagnostics. The proposed methods are theoretically justified and experimentally tested. In particular, the method’s applicability is illustrated by the example of specially generated signals that simulate the vibration of an engine both during the normal operation and in the case of a malfunction in the system supplying fuel to the cylinders. This method was confirmed during an experiment with an automobile internal combustion engine. The study offers the main findings of the simulation and the experiment and highlights certain characteristic features of time-frequency autocorrelation functions that allow one to identify malfunctions in an engine’s cylinder. The possibility in principle of using time-frequency correlation functions in function testing of the internal combustion engine is demonstrated. The paper’s conclusion proposes further research directions including the application of the method to diagnosing automobile gearboxes.
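One way to realize a time-frequency autocorrelation (a generic illustration, not necessarily the estimator used in the study) is to take an STFT and autocorrelate the envelope of each frequency bin along time, so that cylinder-periodic bursts show up as peaks at the corresponding lag in the affected frequency bands:

```python
import numpy as np
from scipy.signal import stft

def time_frequency_autocorrelation(x, fs, nperseg=256):
    """Map the signal to the time-frequency plane with an STFT and
    autocorrelate the magnitude envelope of every frequency bin along the
    time axis.  Returns the frequency axis, lag axis, and per-bin ACFs."""
    f, t, Z = stft(x, fs=fs, nperseg=nperseg)
    env = np.abs(Z)
    env = env - env.mean(axis=1, keepdims=True)      # remove per-bin mean
    acf = np.array([np.correlate(e, e, mode="full") for e in env])
    lags = (np.arange(acf.shape[1]) - (env.shape[1] - 1)) * (t[1] - t[0])
    return f, lags, acf

# Synthetic "vibration": a 120 Hz component whose bursts recur with the
# working cycle of one cylinder, plus broadband noise.
fs = 8000
rng = np.random.default_rng(0)
tt = np.arange(0, 4.0, 1.0 / fs)
x = (np.sin(2 * np.pi * 120 * tt) * (np.sin(2 * np.pi * 5 * tt) > 0.9)
     + 0.1 * rng.standard_normal(tt.size))
freqs, lags, acf = time_frequency_autocorrelation(x, fs)
```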
Prospects for the application of radiometric methods in the measurement of two-phase flows
NASA Astrophysics Data System (ADS)
Zych, Marcin
2018-06-01
The article constitutes an overview of the application of radiometric methods in the research of two-phase flows: liquid-solid particles and liquid-gas flows. The methods which were used were described on the basis of the experiments which were conducted in the Water Laboratory of the Wrocław University of Environmental and Life Sciences and in the Sedimentological Laboratory of the Faculty of Geology, Geophysics and Environmental Protection, AGH-UST in Kraków. The advanced mathematical methods for the analysis of signals from scintillation probes that were applied enable the acquisition of a number of parameters associated with the flowing two-phase mixture, such as: average velocities of the particular phases, concentration of the solid phase, and void fraction for a liquid-gas mixture. Despite the fact that the application of radioactive sources requires considerable carefulness and a number of state permits, in many cases these sources become useful in the experiments which are presented.
Interactive surface correction for 3D shape based segmentation
NASA Astrophysics Data System (ADS)
Schwarz, Tobias; Heimann, Tobias; Tetzlaff, Ralf; Rau, Anne-Mareike; Wolf, Ivo; Meinzer, Hans-Peter
2008-03-01
Statistical shape models have become a fast and robust method for segmentation of anatomical structures in medical image volumes. In clinical practice, however, pathological cases and image artifacts can lead to local deviations of the detected contour from the true object boundary. These deviations have to be corrected manually. We present an intuitively applicable solution for surface interaction based on Gaussian deformation kernels. The method is evaluated by two radiological experts on segmentations of the liver in contrast-enhanced CT images and of the left heart ventricle (LV) in MRI data. For both applications, five datasets are segmented automatically using deformable shape models, and the resulting surfaces are corrected manually. The interactive correction step improves the average surface distance against ground truth from 2.43mm to 2.17mm for the liver, and from 2.71mm to 1.34mm for the LV. We expect this method to raise the acceptance of automatic segmentation methods in clinical application.
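A minimal sketch of a Gaussian deformation kernel acting on a surface given as a vertex array; the kernel width, units, and the coupling back to the shape-model segmentation are assumptions for illustration:

```python
import numpy as np

def gaussian_surface_correction(vertices, pick_point, displacement, sigma=10.0):
    """Move surface vertices towards a user-specified correction.

    Each vertex is displaced by the user-dragged vector, weighted by a
    Gaussian of its distance to the picked point, so the correction falls
    off smoothly instead of producing a sharp dent.
    """
    d = np.linalg.norm(vertices - pick_point, axis=1)   # distance to the pick
    w = np.exp(-0.5 * (d / sigma) ** 2)                 # Gaussian falloff weight
    return vertices + w[:, None] * np.asarray(displacement, dtype=float)

# Example: pull the surface 5 mm outwards along z around a picked vertex.
verts = np.random.rand(1000, 3) * 100.0
corrected = gaussian_surface_correction(verts, pick_point=verts[0],
                                        displacement=[0.0, 0.0, 5.0], sigma=8.0)
```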
Archer, Stuart K; Shirokikh, Nikolay E; Preiss, Thomas
2015-04-01
Most applications for RNA-seq require the depletion of abundant transcripts to gain greater coverage of the underlying transcriptome. The sequences to be targeted for depletion depend on application and species and in many cases may not be supported by commercial depletion kits. This unit describes a method for generating RNA-seq libraries that incorporates probe-directed degradation (PDD), which can deplete any unwanted sequence set, with the low-bias split-adapter method of library generation (although many other library generation methods are in principle compatible). The overall strategy is suitable for applications requiring customized sequence depletion or where faithful representation of fragment ends and lack of sequence bias is paramount. We provide guidelines to rapidly design specific probes against the target sequence, and a detailed protocol for library generation using the split-adapter method including several strategies for streamlining the technique and reducing adapter dimer content. Copyright © 2015 John Wiley & Sons, Inc.
STAKEHOLDER INVOLVEMENT THROUGHOUT HEALTH TECHNOLOGY ASSESSMENT: AN EXAMPLE FROM PALLIATIVE CARE.
Brereton, Louise; Wahlster, Philip; Mozygemba, Kati; Lysdahl, Kristin Bakke; Burns, Jake; Polus, Stephanie; Tummers, Marcia; Refolo, Pietro; Sacchini, Dario; Leppert, Wojciech; Chilcott, James; Ingleton, Christine; Gardiner, Clare; Goyder, Elizabeth
2017-01-01
Internationally, funders require stakeholder involvement throughout health technology assessment (HTA). We report successes, challenges, and lessons learned from extensive stakeholder involvement throughout a palliative care case study that demonstrates new concepts and methods for HTA. A 5-step "INTEGRATE-HTA Model" developed within the INTEGRATE-HTA project guided the case study. Using convenience or purposive sampling, or directly or indirectly identifying and approaching individuals and groups, stakeholders participated in qualitative research or consultation meetings. During scoping, 132 stakeholders, aged ≥ 18 years, in seven countries (England, Italy, Germany, The Netherlands, Norway, Lithuania, and Poland) highlighted key issues in palliative care that assisted identification of the intervention and comparator. Subsequently, stakeholders in four countries participated in face-to-face, telephone and/or video Skype meetings to inform evidence collection and/or review assessment results. An applicability assessment to identify contextual and implementation barriers and enablers for the case study findings involved twelve professionals in three countries. Finally, thirteen stakeholders participated in a mock decision-making meeting in England. Views about the best methods of stakeholder involvement vary internationally. Stakeholders make valuable contributions in all stages of HTA: assisting decision making about interventions, comparators, and research questions, and providing evidence and insights into findings, gap analyses, and applicability assessments. Key challenges exist regarding inclusivity, time, and resource use. Stakeholder involvement is feasible and worthwhile throughout HTA, sometimes providing unique insights. Various methods can be used to include stakeholders, although challenges exist. Recognition of stakeholder expertise and further guidance about stakeholder consultation methods is needed.
Degrell, I
1979-08-02
The case of a 32-year-old female patient with multiple malformations (hare-lip, polythelia, fibroadenoma in an accessory mammary gland) and, independent of these, another fibroadenoma in the breast is reported. The fibroadenoma developing in the accessory breast around the vulva, diagnosed by means of aspiration biopsy cytology, deserves special attention. This case also confirms the applicability of aspiration biopsy cytology in preoperative diagnostics, a method which has proved effective for years.
Use of miniplates as a method for orthodontic anchorage: a case report.
Peres, Fernando Gianzanti; Padovan, Luis Eduardo Marques; Kluppel, Leandro Eduardo; Albuquerque, Gustavo Calvalcanti; Souza, Paulo Cesar Ulson de; Claudino, Marcela
2016-01-01
Temporary anchorage devices (TADs) have been developed to be used as direct adjuncts in orthodontic treatment and have facilitated treatment of more complex orthodontic cases, including patients with dental impaction. This clinical case reports the applicability of TADs in the orthodontic treatment of a patient with impacted mandibular second molars. Surgical and orthodontic procedures related to the use of miniplates were also discussed in this study. The use of temporary anchorage devices, such as miniplates, can be suggested as an alternative to treat patients with impacted mandibular second molars.
Aleksic-Shihabi, Anka; Jadrijevic, Eni; Milekic, Nina; Bulicic, Ana Repic; Titlic, Marina; Suljic, Enra
2016-01-01
Introduction: Stroke is a medical emergency in neurology and one of the leading causes of death today. A therapeutic method currently used under appropriate conditions is thrombolysis, the treatment of an acute clot in the cerebral vasculature with alteplase. The application of alteplase also carries a high risk of life-threatening complications. Case report: This is a brief report of a case of a thrombolysis complication that manifested as splenic rupture. PMID:26980937
Robust large-scale parallel nonlinear solvers for simulations.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Bader, Brett William; Pawlowski, Roger Patrick; Kolda, Tamara Gibson
2005-11-01
This report documents research to develop robust and efficient solution techniques for solving large-scale systems of nonlinear equations. The most widely used method for solving systems of nonlinear equations is Newton's method. While much research has been devoted to augmenting Newton-based solvers (usually with globalization techniques), little has been devoted to exploring the application of different models. Our research has been directed at evaluating techniques using different models than Newton's method: a lower order model, Broyden's method, and a higher order model, the tensor method. We have developed large-scale versions of each of these models and have demonstrated their use in important applications at Sandia. Broyden's method replaces the Jacobian with an approximation, allowing codes that cannot evaluate a Jacobian or have an inaccurate Jacobian to converge to a solution. Limited-memory methods, which have been successful in optimization, allow us to extend this approach to large-scale problems. We compare the robustness and efficiency of Newton's method, modified Newton's method, Jacobian-free Newton-Krylov method, and our limited-memory Broyden method. Comparisons are carried out for large-scale applications of fluid flow simulations and electronic circuit simulations. Results show that, in cases where the Jacobian was inaccurate or could not be computed, Broyden's method converged in some cases where Newton's method failed to converge. We identify conditions where Broyden's method can be more efficient than Newton's method. We also present modifications to a large-scale tensor method, originally proposed by Bouaricha, for greater efficiency, better robustness, and wider applicability. Tensor methods are an alternative to Newton-based methods and are based on computing a step based on a local quadratic model rather than a linear model. The advantage of Bouaricha's method is that it can use any existing linear solver, which makes it simple to write and easily portable. However, the method usually takes twice as long to solve as Newton-GMRES on general problems because it solves two linear systems at each iteration. In this paper, we discuss modifications to Bouaricha's method for a practical implementation, including a special globalization technique and other modifications for greater efficiency. We present numerical results showing computational advantages over Newton-GMRES on some realistic problems. We further discuss a new approach for dealing with singular (or ill-conditioned) matrices. In particular, we modify an algorithm for identifying a turning point so that an increasingly ill-conditioned Jacobian does not prevent convergence.
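A minimal dense sketch of Broyden's ("good") rank-one update, which is the idea behind replacing the Jacobian with an approximation; the limited-memory variant discussed in the report stores the updates instead of the full matrix, and no globalization is included here:

```python
import numpy as np

def broyden_solve(F, x0, tol=1e-10, max_iter=100):
    """Broyden's method: solve F(x) = 0 without evaluating a Jacobian.

    A secant approximation B of the Jacobian is updated from successive
    residuals via the rank-one formula
        B_{k+1} = B_k + (y_k - B_k s_k) s_k^T / (s_k^T s_k),
    where s_k is the step and y_k the change in residual.
    """
    x = np.asarray(x0, dtype=float)
    B = np.eye(x.size)                      # initial Jacobian approximation
    Fx = F(x)
    for _ in range(max_iter):
        s = np.linalg.solve(B, -Fx)         # quasi-Newton step
        x_new = x + s
        F_new = F(x_new)
        y = F_new - Fx
        B += np.outer(y - B @ s, s) / (s @ s)   # Broyden rank-one update
        x, Fx = x_new, F_new
        if np.linalg.norm(Fx) < tol:
            break
    return x

# Example: a mildly nonlinear system with no analytic Jacobian supplied.
F = lambda v: np.array([v[0] + 0.5 * np.sin(v[1]) - 1.0,
                        v[1] + 0.5 * np.cos(v[0]) - 1.0])
print(broyden_solve(F, x0=[0.0, 0.0]))
```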
Fang, Jiang B; Robertson, Vivian K; Rawat, Archana; Flick, Tawnya; Tang, Zhe J; Cauchon, Nina S; McElvain, James S
2010-10-04
Dissolution testing is frequently used to determine the rate and extent at which a drug is released from a dosage form, and it plays many important roles throughout drug product development. However, the traditional dissolution approach often emphasizes its application in quality control testing and usually strives to obtain 100% drug release. As a result, dissolution methods are not necessarily biorelevant and meaningful application of traditional dissolution methods in the early phases of drug product development can be very limited. This article will describe the development of a biorelevant in vitro dissolution method using USP apparatus 4, biorelevant media, and real-time online UV analysis. Several case studies in the areas of formulation selection, lot-to-lot variability, and food effect will be presented to demonstrate the application of this method in early phase formulation development. This biorelevant dissolution method using USP apparatus 4 provides a valuable tool to predict certain aspects of the in vivo drug release. It can be used to facilitate the formulation development/selection for pharmacokinetic (PK) and clinical studies. It may also potentially be used to minimize the number of PK studies, and to aid in the design of more efficient PK and clinical studies.
[Client centered psychotherapy].
Werthmann, H V
1979-01-01
In the discussion concerning which psychotherapeutic methods should come under the auspices of the medical health system in West Germany, the question is raised regarding the client-centered therapy of Carl Rogers: can it be considered a distinct psychotherapeutic method? A review of the scientific literature dealing with this method shows that it provides neither a theory of mental illness nor a theory of clinical application based on individual cases or specific neurotic disturbances. Therefore, it should be categorized as a useful method of communication in the field of psychology and not as a therapeutic method for treating mental illness.
Stahl, Ido; Katsman, Alexander; Zaidman, Michael; Keshet, Doron; Sigal, Amit; Eidelman, Mark
2017-07-11
Smartphones have the ability to capture and send images, and their use has become common in the emergency setting for transmitting radiographic images with the intent to consult an off-site specialist. Our objective was to evaluate the reliability of smartphone-based instant messaging applications for the evaluation of various pediatric limb traumas, as compared with the standard method of viewing images on a workstation-based picture archiving and communication system (PACS). X-ray images of 73 representative cases of pediatric limb trauma were captured and transmitted to 5 pediatric orthopedic surgeons by the Whatsapp instant messaging application on an iPhone 6 smartphone. Evaluators were asked to diagnose, classify, and determine the course of treatment for each case on their personal smartphones. Following a 4-week interval, re-evaluation was conducted using the PACS. Intraobserver agreement was calculated for overall agreement and per fracture site. The overall results indicate "near perfect agreement" between interpretations of the radiographs on smartphones compared with computer-based PACS, with κ of 0.84, 0.82, and 0.89 for diagnosis, classification, and treatment planning, respectively. Looking at the results per fracture site, we also found substantial to near perfect agreement. Smartphone-based instant messaging applications are reliable for evaluation of a wide range of pediatric limb fractures. This method of obtaining an expert opinion from the off-site specialist is immediately accessible and inexpensive, making smartphones a powerful tool for doctors in the emergency department, primary care clinics, or remote medical centers, enabling timely and appropriate treatment for the injured child. This method is not a substitute for evaluation of the images in the standard manner on a computer-based PACS, which should be performed before final decision-making.
Groves, Ethan; Palenik, Skip; Palenik, Christopher S
2018-04-18
While color is arguably the most important optical property of evidential fibers, the actual dyestuffs responsible for its expression in them are, in forensic trace evidence examinations, rarely analyzed and still less often identified. This is due, primarily, to the exceedingly small quantities of dye present in a single fiber as well as to the fact that dye identification is a challenging analytical problem, even when large quantities are available for analysis. Among the practical reasons for this are the wide range of dyestuffs available (and the even larger number of trade names), the low total concentration of dyes in the finished product, the limited amount of sample typically available for analysis in forensic cases, and the complexity of the dye mixtures that may exist within a single fiber. Literature on the topic of dye analysis is often limited to a specific method, subset of dyestuffs, or an approach that is not applicable given the constraints of a forensic analysis. Here, we present a generalized approach to dye identification that (1) combines several robust analytical methods, (2) is broadly applicable to a wide range of dye chemistries, application classes, and fiber types, and (3) can be scaled down to forensic casework-sized samples. The approach is based on the development of a reference collection of 300 commercially relevant textile dyes that have been characterized by a variety of microanalytical methods (HPTLC, Raman microspectroscopy, infrared microspectroscopy, UV-Vis spectroscopy, and visible microspectrophotometry). Although there is no single approach that is applicable to all dyes on every type of fiber, a combination of these analytical methods has been applied using a reproducible approach that permits the use of reference libraries to constrain the identity of and, in many cases, identify the dye (or dyes) present in a textile fiber sample.
NASA Astrophysics Data System (ADS)
Kruglova, T. V.
2004-01-01
Detailed spectroscopic information about highly excited molecules and radicals such as H3+, H2, HI, H2O, and CH2 is needed for a number of applications in laser physics, astrophysics, and chemistry. Studies of highly excited molecular vibration-rotation states face several problems connected with slow convergence or even divergence of perturbation expansions. The physical reason for the divergence of a perturbation expansion is large-amplitude motion and strong vibration-rotation coupling. In this case one needs to use a special method of series summation. A number of papers have been devoted to this problem; papers 1-10 in the reference list are only examples of studies on this topic. The present report is aimed at the application of the GET method (Generalized Euler Transformation) to the diatomic molecule. The energy levels of a diatomic molecule are usually represented as a Dunham series in the rotational J(J+1) and vibrational (v+1/2) quantum numbers (within the perturbation approach). However, perturbation theory is not applicable for highly excited vibration-rotation states because the perturbation expansion in this case becomes divergent. As a consequence, one needs to use a special method for the series summation. The Generalized Euler Transformation (GET) is known to be an efficient method for summing slowly convergent series; it has already been used for solving several quantum problems (Refs. 13 and 14). In this report the results of the Euler transformation of the diatomic-molecule Dunham series are presented. It is shown that the Dunham power series can be represented as a functional series, which is equivalent to its partial summation. It is also shown that the transformed series has better convergence properties than the initial series.
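The Dunham series referred to here has the standard form

$$
E(v,J)=\sum_{k,l} Y_{kl}\,\left(v+\tfrac{1}{2}\right)^{k}\,\bigl[J(J+1)\bigr]^{l},
$$

with Dunham coefficients \(Y_{kl}\); the Generalized Euler Transformation resums this power series into a functional form with improved convergence at high \(v\) and \(J\) (the explicit transformed expressions of the report are not reproduced here).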
Schwenke, Michael; Georgii, Joachim; Preusser, Tobias
2017-07-01
Focused ultrasound (FUS) is rapidly gaining clinical acceptance for several target tissues in the human body. Yet, treating liver targets is not clinically applied due to a high complexity of the procedure (noninvasiveness, target motion, complex anatomy, blood cooling effects, shielding by ribs, and limited image-based monitoring). To reduce the complexity, numerical FUS simulations can be utilized for both treatment planning and execution. These use-cases demand highly accurate and computationally efficient simulations. We propose a numerical method for the simulation of abdominal FUS treatments during respiratory motion of the organs and target. Especially, a novel approach is proposed to simulate the heating during motion by solving Pennes' bioheat equation in a computational reference space, i.e., the equation is mathematically transformed to the reference. The approach allows for motion discontinuities, e.g., the sliding of the liver along the abdominal wall. Implementing the solver completely on the graphics processing unit and combining it with an atlas-based ultrasound simulation approach yields a simulation performance faster than real time (less than 50-s computing time for 100 s of treatment time) on a modern off-the-shelf laptop. The simulation method is incorporated into a treatment planning demonstration application that allows to simulate real patient cases including respiratory motion. The high performance of the presented simulation method opens the door to clinical applications. The methods bear the potential to enable the application of FUS for moving organs.
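For reference, Pennes' bioheat equation in its standard fixed-frame form (the paper solves a version of it transformed to a computational reference space to account for organ motion; that transformation is not reproduced here) reads

$$
\rho c\,\frac{\partial T}{\partial t}=\nabla\cdot\bigl(k\,\nabla T\bigr)+w_b c_b\,(T_a-T)+Q,
$$

where \(\rho\), \(c\), and \(k\) are the tissue density, specific heat, and thermal conductivity, \(w_b\) and \(c_b\) the blood perfusion rate and specific heat, \(T_a\) the arterial temperature, and \(Q\) the heat source delivered by the focused ultrasound.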
NASA Astrophysics Data System (ADS)
Allard, Alexandre; Fischer, Nicolas
2018-06-01
Sensitivity analysis associated with the evaluation of measurement uncertainty is a very important tool for the metrologist, enabling them to provide an uncertainty budget and to gain a better understanding of the measurand and the underlying measurement process. Using the GUM uncertainty framework, the contribution of an input quantity to the variance of the output quantity is obtained through so-called ‘sensitivity coefficients’. In contrast, such coefficients are no longer computed in cases where a Monte-Carlo method is used. In such a case, supplement 1 to the GUM suggests varying the input quantities one at a time, which is not an efficient method and may provide incorrect contributions to the variance in cases where significant interactions arise. This paper proposes different methods for the elaboration of the uncertainty budget associated with a Monte Carlo method. An application to the mass calibration example described in supplement 1 to the GUM is performed with the corresponding R code for implementation. Finally, guidance is given for choosing a method, including suggestions for a future revision of supplement 1 to the GUM.
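To make the idea of a Monte Carlo uncertainty budget concrete, the toy Python sketch below propagates two inputs through a deliberately interacting model Y = X1*X2 (not the GUM-S1 mass calibration model) and approximates each input's first-order share of the output variance by its squared correlation with the output; the distributions and sample size are arbitrary choices.

    import numpy as np

    rng = np.random.default_rng(1)
    N = 200_000

    # Toy measurement model with a noticeable interaction, Y = X1 * X2
    x1 = rng.normal(1.0, 0.5, N)
    x2 = rng.normal(1.0, 0.5, N)
    y = x1 * x2

    print("standard uncertainty u(y) =", round(y.std(ddof=1), 4))

    # Crude Monte Carlo uncertainty budget: the squared correlation of each input
    # with the output approximates its first-order share of Var(y).  The shares do
    # not sum to one here because part of the variance comes from the X1*X2
    # interaction, which is exactly what one-at-a-time variation cannot attribute.
    for name, x in [("x1", x1), ("x2", x2)]:
        print(name, "first-order share of variance ~", round(np.corrcoef(x, y)[0, 1] ** 2, 3))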
Harris, Chad T; Haw, Dustin W; Handler, William B; Chronik, Blaine A
2013-09-01
Eddy currents are generated in MR by the use of rapidly switched electromagnets, resulting in time varying and spatially varying magnetic fields that must be either minimized or corrected. This problem is further complicated when non-cylindrical insert magnets are used for specialized applications. Interruption of the coupling between an insert coil and the MR system is typically accomplished using active magnetic shielding. A new method of actively shielding insert gradient and shim coils of any surface geometry by use of the boundary element method for coil design with a minimum energy constraint is presented. This method was applied to shield x- and z-gradient coils for two separate cases: a traditional cylindrical primary gradient with cylindrical shield and, to demonstrate its versatility in surface geometry, the same cylindrical primary gradients with a rectangular box-shaped shield. For the cylindrical case this method produced shields that agreed with analytic solutions. For the second case, the rectangular box-shaped shields demonstrated very good shielding characteristics despite having a different geometry than the primary coils. Copyright © 2013 Elsevier Inc. All rights reserved.
Vortical Flow Prediction Using an Adaptive Unstructured Grid Method
NASA Technical Reports Server (NTRS)
Pirzadeh, Shahyar Z.
2001-01-01
A computational fluid dynamics (CFD) method has been employed to compute vortical flows around slender wing/body configurations. The emphasis of the paper is on the effectiveness of an adaptive grid procedure in "capturing" concentrated vortices generated at sharp edges or flow separation lines of lifting surfaces flying at high angles of attack. The method is based on a tetrahedral unstructured grid technology developed at the NASA Langley Research Center. Two steady-state, subsonic, inviscid and Navier-Stokes flow test cases are presented to demonstrate the applicability of the method for solving practical vortical flow problems. The first test case concerns vortex flow over a simple 65° delta wing with different values of leading-edge bluntness, and the second case is that of a more complex fighter configuration. The superiority of the adapted solutions in capturing the vortex flow structure over the conventional unadapted results is demonstrated by comparisons with the wind-tunnel experimental data. The study shows that numerical prediction of vortical flows is highly sensitive to the local grid resolution and that the implementation of grid adaptation is essential when applying CFD methods to such complicated flow problems.
Seo, Bommie F; Kang, In Sook; Jeong, Yeon Jin; Moon, Suk Ho
2014-06-01
The Morel-Lavallée lesion is a collection of serous fluid that develops after closed degloving injuries and after surgical procedures particularly in the pelvis and abdomen. It is a persistent seroma and is usually resistant to conservative methods of treatment such as percutaneous drainage and compression. Various methods of curative treatment have been reported in the literature, such as application of fibrin sealant, doxycycline, or alcohol sclerodhesis. We present a case of a huge recurrent Morel-Lavallée lesion in the lower back and buttock region that was treated with quilting sutures, fibrin sealant, and compression, with a review of the literature. © The Author(s) 2014.
Analysis of methods of processing of expert information by optimization of administrative decisions
NASA Astrophysics Data System (ADS)
Churakov, D. Y.; Tsarkova, E. G.; Marchenko, N. D.; Grechishnikov, E. V.
2018-03-01
In this work, a methodology for defining measures for the expert estimation of the quality and reliability of application-oriented software products is offered. Methods for aggregating expert estimates are described using the example of a collective choice among tool projects in the development of special-purpose software for the needs of institutions. Results from the operation of an interactive decision-making support system are given, together with an algorithm for solving the selection task on the basis of the analytic hierarchy process. The developed algorithm can be applied in the development of expert systems for solving a wide class of tasks connected in one way or another with a multicriteria choice.
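As a minimal sketch of the analytic-hierarchy step mentioned above (the matrix entries are invented and this is not the authors' implementation), the following Python snippet derives criterion weights from a pairwise comparison matrix via its principal eigenvector and checks Saaty's consistency ratio.

    import numpy as np

    # Illustrative 3-criterion pairwise comparison matrix on Saaty's 1-9 scale;
    # a_ij expresses how strongly criterion i is preferred over criterion j.
    A = np.array([[1.0,  3.0, 5.0],
                  [1/3., 1.0, 2.0],
                  [1/5., 1/2., 1.0]])

    eigvals, eigvecs = np.linalg.eig(A)
    k = np.argmax(eigvals.real)
    w = np.abs(eigvecs[:, k].real)
    w = w / w.sum()                        # priority vector (criterion weights)

    n = A.shape[0]
    CI = (eigvals.real[k] - n) / (n - 1)   # consistency index
    RI = {3: 0.58, 4: 0.90, 5: 1.12}[n]    # Saaty's random index for n criteria
    print("weights:", w.round(3), " consistency ratio:", round(CI / RI, 3))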
NASA Astrophysics Data System (ADS)
Wilson, F.; Neukirch, T.
2018-01-01
We present new analytical three-dimensional solutions of the magnetohydrostatic equations, which are applicable to the co-rotating frame of reference outside a rigidly rotating cylindrical body, and have potential applications to planetary magnetospheres and stellar coronae. We consider the case with centrifugal force only, and use a transformation method in which the governing equation for the "pseudo-potential" (from which the magnetic field can be calculated) becomes the Laplace partial differential equation. The new solutions extend the set of previously found solutions to those of a "fractional multipole" nature, and offer wider possibilities for modelling than before. We consider some special cases, and present example solutions.
Luoma, Pekka; Natschläger, Thomas; Malli, Birgit; Pawliczek, Marcin; Brandstetter, Markus
2018-05-12
A model recalibration method based on additive Partial Least Squares (PLS) regression is generalized for multi-adjustment scenarios of independent variance sources (referred to as additive PLS - aPLS). aPLS allows for effortless model readjustment under changing measurement conditions and the combination of independent variance sources with the initial model by means of additive modelling. We demonstrate these distinguishing features on two NIR spectroscopic case studies. In case study 1, aPLS was used as a readjustment method for an emerging offset. The achieved RMS error of prediction (1.91 a.u.) was of a similar level as before the offset occurred (2.11 a.u.). In case study 2, a calibration combining different variance sources was conducted. The achieved performance was of sufficient level, with an absolute error better than 0.8% of the mean concentration, thereby compensating the negative effects of two independent variance sources. The presented results show the applicability of the aPLS approach. The main advantages of the method are that the original model stays unadjusted and that the modelling is conducted on concrete changes in the spectra, thus supporting efficient (in most cases straightforward) modelling. Additionally, the method is put into context of existing machine learning algorithms. Copyright © 2018 Elsevier B.V. All rights reserved.
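In the spirit of the additive readjustment described above (but not the published aPLS algorithm), the Python sketch below keeps an initial PLS calibration untouched and models its residual under a new measurement condition with a small additional PLS model; all data are synthetic and the component counts are arbitrary.

    import numpy as np
    from sklearn.cross_decomposition import PLSRegression

    rng = np.random.default_rng(0)

    # Synthetic NIR-like data: 120 calibration spectra over 50 wavelengths (all values invented)
    X = rng.normal(size=(120, 50))
    beta = rng.normal(size=50)
    y = X @ beta + rng.normal(scale=0.1, size=120)
    base = PLSRegression(n_components=8).fit(X, y)

    # A new measurement condition adds a smooth spectral offset; the base model degrades.
    offset = 0.5 * np.sin(np.linspace(0, np.pi, 50))
    X_new = rng.normal(size=(40, 50)) + offset
    y_new = (X_new - offset) @ beta + rng.normal(scale=0.1, size=40)

    # Additive readjustment: fit a small PLS model to the base model's residuals on a
    # few readjustment samples and add its prediction; the original model is untouched.
    adj = PLSRegression(n_components=2).fit(X_new[:25], y_new[:25] - base.predict(X_new[:25]).ravel())
    pred = base.predict(X_new[25:]).ravel() + adj.predict(X_new[25:]).ravel()

    rmsep_base = float(np.sqrt(np.mean((base.predict(X_new[25:]).ravel() - y_new[25:]) ** 2)))
    rmsep_adj = float(np.sqrt(np.mean((pred - y_new[25:]) ** 2)))
    print("RMSEP without readjustment:", round(rmsep_base, 3))
    print("RMSEP with additive readjustment:", round(rmsep_adj, 3))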
A monitoring/auditing mechanism for SSL/TLS secured service sessions in Health Care Applications.
Kavadias, C D; Koutsopoulos, K A; Vlachos, M P; Bourka, A; Kollias, V; Stassinopoulos, G
2003-01-01
This paper analyzes the SSL/TLS procedures and defines the functionality of a monitoring/auditing entity running in parallel with the protocol, which is decoding, checking the certificate and permitting session establishment based on the decoded certificate information, the network addresses of the endpoints and a predefined access list. Finally, this paper discusses how such a facility can be used for detecting impersonation attempts in Health Care applications and provides case studies to show the effectiveness and applicability of the proposed method.
[Moral case deliberation: time for ethical reflection in the daily practice of mental health care].
Vellinga, A; van Melle-Baaijens, E A H
2016-01-01
Nowadays, reflecting on ethics, which we choose to call moral case deliberation, is occurring more and more frequently in psychiatric institutions. We have personal experience of organising and supervising moral case deliberation in a large psychiatric institute and we can confirm the positive effects of moral case deliberation which have been reported in the literature. To describe a structured method for moral case deliberation which enables care-givers in health care and/or addiction care to reflect on moral dilemmas. We refer to the main findings in relevant literature and describe how we developed a structured method for implementing moral case deliberation. Our studies of the literature indicate that systematic reflection about ethical dilemmas can improve the quality of care and make care-givers more satisfied with their work. This is why we have developed our own method which is applicable particularly to psychiatric and/or addiction care and which can be used systematically in discussions of moral dilemmas. Our method for discussing ethical issues works well in clinical practice, particularly when it is embedded in a multidisciplinary context. Of course, to ensure the continuity of the system, deliberation about moral and ethical issues needs to be financially safeguarded and embedded in the organisation. Discussion of moral issues improves the quality of care and increases care-givers' satisfaction with their work.
Frambach, Janneke M; Driessen, Erik W; Chan, Li-Chong; van der Vleuten, Cees P M
2012-08-01
Medical schools worldwide are increasingly switching to student-centred methods such as problem-based learning (PBL) to foster lifelong self-directed learning (SDL). The cross-cultural applicability of these methods has been questioned because of their Western origins and because education contexts and learning approaches differ across cultures. This study evaluated PBL's cross-cultural applicability by investigating how it is applied in three medical schools in regions with different cultures in, respectively, East Asia, the Middle East and Western Europe. Specifically, it investigated how students' cultural backgrounds impact on SDL in PBL and how this impact affects students. A qualitative, cross-cultural, comparative case study was conducted in three medical schools. Data were collected through 88 semi-structured, in-depth interviews with Year 1 and 3 students, tutors and key persons involved in PBL, 32 observations of Year 1 and 3 PBL tutorials, document analysis, and contextual information. The data were thematically analysed using the template analysis method. Comparisons were made among the three medical schools and between Year 1 and 3 students across and within the schools. The cultural factors of uncertainty and tradition posed a challenge to Middle Eastern students' SDL. Hierarchy posed a challenge to Asian students and achievement impacted on both sets of non-Western students. These factors were less applicable to European students, although the latter did experience some challenges. Several contextual factors inhibited or enhanced SDL across the cases. As students grew used to PBL, SDL skills increased across the cases, albeit to different degrees. Although cultural factors can pose a challenge to the application of PBL in non-Western settings, it appears that PBL can be applied in different cultural contexts. However, its globalisation does not postulate uniform processes and outcomes, and culturally sensitive alternatives might be developed. © Blackwell Publishing Ltd 2012.
Radar studies of the atmosphere using spatial and frequency diversity
NASA Astrophysics Data System (ADS)
Yu, Tian-You
This work provides results from a thorough investigation of atmospheric radar imaging including theory, numerical simulations, observational verification, and applications. The theory is generalized to include the existing imaging techniques of coherent radar imaging (CRI) and range imaging (RIM), which are shown to be special cases of three-dimensional imaging (3D Imaging). Mathematically, the problem of atmospheric radar imaging is posed as an inverse problem. In this study, the Fourier, Capon, and maximum entropy (MaxEnt) methods are proposed to solve the inverse problem. After the introduction of the theory, numerical simulations are used to test, validate, and exercise these techniques. Statistical comparisons of the three methods of atmospheric radar imaging are presented for various signal-to-noise ratio (SNR), receiver configuration, and frequency sampling. The MaxEnt method is shown to generally possess the best performance for low SNR. The performance of the Capon method approaches the performance of the MaxEnt method for high SNR. In limited cases, the Capon method actually outperforms the MaxEnt method. The Fourier method generally tends to distort the model structure due to its limited resolution. Experimental justification of CRI and RIM is accomplished using the Middle and Upper (MU) Atmosphere Radar in Japan and the SOUnding SYstem (SOUSY) in Germany, respectively. A special application of CRI to the observation of polar mesosphere summer echoes (PMSE) is used to show direct evidence of wave steepening and possibly explain gravity wave variations associated with PMSE.
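To illustrate one of the estimators named above, the following Python sketch computes a Capon (minimum-variance) angular brightness spectrum from simulated receiver snapshots; the array geometry, signal-to-noise ratio and diagonal loading are arbitrary assumptions, and the Fourier and MaxEnt estimators are not shown.

    import numpy as np

    def capon_brightness(R, steering):
        # Capon (minimum-variance) brightness for one steering direction:
        #   B = 1 / (a^H R^-1 a), with light diagonal loading for numerical stability.
        M = R.shape[0]
        Rinv = np.linalg.inv(R + 1e-6 * np.real(np.trace(R)) / M * np.eye(M))
        return 1.0 / np.real(steering.conj() @ Rinv @ steering)

    rng = np.random.default_rng(3)
    M, d, lam = 5, 0.5, 1.0                       # receivers, spacing and wavelength (arbitrary units)
    pos = np.arange(M) * d
    theta0 = np.deg2rad(10.0)                     # a single scattering layer at 10 degrees
    a0 = np.exp(2j * np.pi * pos * np.sin(theta0) / lam)

    snapshots = (a0[:, None] * (rng.normal(size=500) + 1j * rng.normal(size=500))
                 + 0.3 * (rng.normal(size=(M, 500)) + 1j * rng.normal(size=(M, 500))))
    R = snapshots @ snapshots.conj().T / 500

    angles = np.deg2rad(np.linspace(-30, 30, 121))
    spec = [capon_brightness(R, np.exp(2j * np.pi * pos * np.sin(t) / lam)) for t in angles]
    print("peak found at %.1f deg" % np.degrees(angles[int(np.argmax(spec))]))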
DOE Office of Scientific and Technical Information (OSTI.GOV)
Miller, P E
Tips and case histories on computer use for idea and outline processing: Productivity software to solve problems of idea hierarchy, transitions, and developments is matched to solutions for communicators. One case is text that ranges from methods and procedures to histories and legal definitions of classification for the US Department of Energy. Applications of value to writers, editors, and managers are for research; calendars; creativity; prioritization; idea discovery and manipulation; file and time management; and contents, indexes, and glossaries. 6 refs., 7 figs.
Mendoza, G A; Prabhu, R
2000-12-01
This paper describes an application of multiple criteria analysis (MCA) in assessing criteria and indicators adapted for a particular forest management unit. The methods include: ranking, rating, and pairwise comparisons. These methods were used in a participatory decision-making environment where a team representing various stakeholders and professionals used their expert opinions and judgements in assessing different criteria and indicators (C&I) on the one hand, and how suitable and applicable they are to a forest management unit on the other. A forest concession located in Kalimantan, Indonesia, was used as the site for the case study. Results from the study show that the multicriteria methods are effective tools that can be used as structured decision aids to evaluate, prioritize, and select sets of C&I for a particular forest management unit. Ranking and rating approaches can be used as a screening tool to develop an initial list of C&I. Pairwise comparison, on the other hand, can be used as a finer filter to further reduce the list. In addition to using these three MCA methods, the study also examines two commonly used group decision-making techniques, the Delphi method and the nominal group technique. Feedback received from the participants indicates that the methods are transparent, easy to implement, and provide a convenient environment for participatory decision-making.
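A minimal sketch of the screening step (ranking and rating aggregated across experts) might look like the following Python snippet; the expert scores are invented and the Borda-style aggregation is only one reasonable convention, not necessarily the one used in the study.

    import numpy as np

    # Hypothetical scores: 4 experts rate 5 candidate indicators on a 1-9 scale
    ratings = np.array([[7, 4, 8, 3, 5],
                        [6, 5, 9, 2, 6],
                        [8, 3, 7, 4, 5],
                        [7, 4, 8, 3, 4]], dtype=float)

    mean_rating = ratings.mean(axis=0)

    # Borda-style rank aggregation: each expert's ratings are converted to ranks
    # (0 = worst ... 4 = best) and the ranks are summed across experts.
    ranks = ratings.argsort(axis=1).argsort(axis=1)
    borda = ranks.sum(axis=0)

    for i in np.argsort(-mean_rating):
        print("indicator %d: mean rating %.2f, Borda score %d" % (i, mean_rating[i], borda[i]))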
Integrated pest management and allocation of control efforts for vector-borne diseases
Ginsberg, H.S.
2001-01-01
Applications of various control methods were evaluated to determine how to integrate methods so as to minimize the number of human cases of vector-borne diseases. These diseases can be controlled by lowering the number of vector-human contacts (e.g., by pesticide applications or use of repellents), or by lowering the proportion of vectors infected with pathogens (e.g., by lowering or vaccinating reservoir host populations). Control methods should be combined in such a way as to most efficiently lower the probability of human encounter with an infected vector. Simulations using a simple probabilistic model of pathogen transmission suggest that the most efficient way to integrate different control methods is to combine methods that have the same effect (e.g., combine treatments that lower the vector population; or combine treatments that lower pathogen prevalence in vectors). Combining techniques that have different effects (e.g., a technique that lowers vector populations with a technique that lowers pathogen prevalence in vectors) will be less efficient than combining two techniques that both lower vector populations or combining two techniques that both lower pathogen prevalence, costs being the same. Costs of alternative control methods generally differ, so the efficiency of various combinations at lowering human contact with infected vectors should be estimated at available funding levels. Data should be collected from initial trials to improve the effects of subsequent interventions on the number of human cases.
Bacon therapy and furuncular myiasis.
Brewer, T F; Wilson, M E; Gonzalez, E; Felsenstein, D
1993-11-03
To evaluate a simple, noninvasive method for removing fly larvae from patients with furuncular myiasis. Case series. Ambulatory office of a tertiary care center. Three patients who presented with Dermatobia hominis infestation. The patients with D hominis infestation were treated with the application of bacon fat over the larval apertures. Removal of intact larvae. Within 3 hours of the application of bacon, the larvae had migrated sufficiently far out of the skin to be removed with tweezers. Ten larvae were removed with this method. There were no treatment failures or complications. Furuncular myiasis will be seen more frequently in temperate areas as individuals travel to endemic areas. We describe the clinical characteristics of myiasis and a simple method of treatment that permits rapid diagnosis and cure.
Iterative integral parameter identification of a respiratory mechanics model.
Schranz, Christoph; Docherty, Paul D; Chiew, Yeong Shiong; Möller, Knut; Chase, J Geoffrey
2012-07-18
Patient-specific respiratory mechanics models can support the evaluation of optimal lung protective ventilator settings during ventilation therapy. Clinical application requires that the individual's model parameter values must be identified with information available at the bedside. Multiple linear regression or gradient-based parameter identification methods are highly sensitive to noise and initial parameter estimates. Thus, they are difficult to apply at the bedside to support therapeutic decisions. An iterative integral parameter identification method is applied to a second order respiratory mechanics model. The method is compared to the commonly used regression methods and error-mapping approaches using simulated and clinical data. The clinical potential of the method was evaluated on data from 13 Acute Respiratory Distress Syndrome (ARDS) patients. The iterative integral method converged to error minima 350 times faster than the Simplex Search Method using simulation data sets and 50 times faster using clinical data sets. Established regression methods reported erroneous results due to sensitivity to noise. In contrast, the iterative integral method was effective independent of initial parameter estimations, and converged successfully in each case tested. These investigations reveal that the iterative integral method is beneficial with respect to computing time, operator independence and robustness, and thus applicable at the bedside for this clinical application.
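The integral reformulation can be illustrated on a first-order single-compartment model P = E*V + R*Q + P0, a simpler stand-in for the paper's second-order model, with invented parameter values and without the iterative refinement: integrating the equation once avoids differentiating noisy signals and turns identification into linear least squares.

    import numpy as np

    def cumtrapz(y, t):
        # Cumulative trapezoidal integral of y(t), starting at 0.
        return np.concatenate(([0.0], np.cumsum(0.5 * (y[1:] + y[:-1]) * np.diff(t))))

    E_true, R_true, P0_true = 25.0, 10.0, 5.0    # cmH2O/L, cmH2O*s/L, cmH2O (assumed values)
    t = np.linspace(0.0, 2.0, 401)
    Q = 0.5 * np.sin(np.pi * t / 2.0)            # airway flow [L/s] during one inspiration
    V = cumtrapz(Q, t)                           # inspired volume [L]
    P = E_true * V + R_true * Q + P0_true + np.random.default_rng(0).normal(0.0, 0.3, t.size)

    # Integral formulation: integral(P) = E*integral(V) + R*(V - V(0)) + P0*t,
    # solved as an ordinary linear least-squares problem.
    A = np.column_stack([cumtrapz(V, t), V - V[0], t])
    b = cumtrapz(P, t)
    E_hat, R_hat, P0_hat = np.linalg.lstsq(A, b, rcond=None)[0]
    print("identified E = %.1f, R = %.1f, P0 = %.1f" % (E_hat, R_hat, P0_hat))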
Lean Manufacturing Principles Improving the Targeting Process
2012-06-08
author has familiarity with Lean manufacturing principles. Third, Lean methods have been used in different industries and have proven adaptable to the... The case study also demonstrates the multi-organizational application of VSM, JIT and the 5S method... new members not knowing the process, this will serve as a starting point for developing understanding. Within the Food industry we observed "the...
Extended time-interval analysis
NASA Astrophysics Data System (ADS)
Fynbo, H. O. U.; Riisager, K.
2014-01-01
Several extensions of the half-life analysis method recently suggested by Horvat and Hardy are put forward. Goodness-of-fit testing is included, and the method is extended to cases where more information is available for each decay event, which allows applications also to, e.g., γ-decay data. The results are tested with Monte Carlo simulations and are applied to the decays of 64Cu and 56Mn.
ERIC Educational Resources Information Center
Varguez, Ricardo
2012-01-01
The constant expansion of Web 2.0 applications available on the World Wide Web and expansion of technology resources has prompted the need to better prepare current and future educators to make more effective use of such resources in their classrooms. The purpose of this embedded mixed methods case study was to describe the experiences and changes…
ERIC Educational Resources Information Center
Harding, Lora Mitchell
2018-01-01
Application-based assignments are often forgone in large marketing classes because of the daunting implementation and assessment challenges they present. One solution is to divide large classes into groups, but groups present their own challenges--of note, the potential for students to free-ride on the efforts of others. The 4Ps method of case…
NASA Technical Reports Server (NTRS)
Baird, J.
1967-01-01
This supplement to Task 1B, Large Solid Rocket Motor Case Fabrication Methods, supplies additional supporting cost data and discusses in detail the methodology that was applied to the task. For the case elements studied, the cost was found to be directly proportional to the Process Complexity Factor (PCF). The PCF was obtained for each element by identifying unit processes that are common to the elements and their alternative manufacturing routes, by assigning a weight to each unit process, and by summing the weighted counts. In three instances of actual manufacture, the actual cost per pound equaled the cost estimate based on PCF per pound, but this supplement recognizes that the methodology is of limited, rather than general, application.
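The PCF calculation described above reduces to a weighted count of unit processes; the tiny Python sketch below shows the arithmetic with made-up process names, weights, routes and a hypothetical cost-per-PCF calibration constant.

    # Illustrative Process Complexity Factor: a weighted count of unit processes.
    unit_process_weights = {"forming": 3, "welding": 5, "heat_treat": 4,
                            "machining": 2, "inspection": 1}

    def pcf(process_counts):
        return sum(unit_process_weights[p] * n for p, n in process_counts.items())

    route_a = {"forming": 2, "welding": 4, "machining": 6, "inspection": 3}
    route_b = {"forming": 1, "welding": 8, "heat_treat": 2, "inspection": 5}
    cost_per_pcf = 120.0   # made-up calibration constant (cost units per PCF point)
    for name, route in [("route A", route_a), ("route B", route_b)]:
        print(name, "PCF =", pcf(route), " estimated cost =", pcf(route) * cost_per_pcf)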
78 FR 19743 - Government-Owned Inventions, Available for Licensing
Federal Register 2010, 2011, 2012, 2013, 2014
2013-04-02
..., Available for Licensing AGENCY: National Aeronautics and Space Administration. ACTION: Notice of Availability of Inventions for Licensing. SUMMARY: Patent applications on the inventions listed below assigned... Continuous Aerodynamic Control Surfaces and Methods for Active Wing Shaping Control; NASA Case No.: ARC-16846...
EVOKED POTENTIALS, PHYSIOLOGICAL METHODS WITH HUMAN APPLICATIONS
A number of tests and test batteries have been developed and implemented for detecting potential neurotoxicity in humans. In some cases test results may suggest specific dysfunction. While tests in laboratory animals are often used to project the potential for adverse health effect...
Earthworm Biomass Measurement: A Science Activity for Middle School.
ERIC Educational Resources Information Center
Haskett, Jonathan; Levine, Elissa; Carey, Pauline B.; Niepold III, Frank
2000-01-01
Describes an activity on biomass measurement which, in this case, is the weight of a group of living things in a given area. The earthworm activity gives students a greater understanding of ecology, practical math applications, and the scientific method. (ASK)
Hair and bare skin discrimination for laser-assisted hair removal systems.
Cayir, Sercan; Yetik, Imam Samil
2017-07-01
Laser-assisted hair removal devices aim to remove body hair permanently. In most cases, these devices irradiate the whole area of the skin with a homogeneous power density. Thus, a significant portion of the skin, where hair is not present, is burnt unnecessarily, causing health risks. Therefore, methods that can distinguish hair regions automatically would be very helpful in avoiding these unnecessary applications of laser. This study proposes a new system of algorithms to detect hair regions with the help of a digital camera. Unlike the limited number of previous studies, our methods are very fast, allowing for real-time application. The proposed methods are based on certain features derived from histograms of hair and skin regions. We compare our algorithm with competing methods in terms of localization performance and computation time and show that a much faster real-time accurate localization of hair regions is possible with the proposed method. Our results show that the algorithm we have developed is extremely fast (around 45 milliseconds), allowing for real-time application with high-accuracy hair localization (96.48%).
[Clinical Advanced in Early-stage ALK-positive Non-small Cell Lung Cancer Patients].
Gao, Qiongqiong; Jiang, Xiangli; Huang, Chun
2017-02-20
Lung cancer is the leading cause of cancer death in China. Non-small cell lung cancer (NSCLC) accounts for 85% of lung cancer cases, with the majority of the cases diagnosed at the advanced stage. Molecular targeted therapy is becoming the focus attention for advanced NSCLC. Echinoderm microtubule-associated protein-like 4 gene and the anaplastic lymphoma kinase gene (EML4-ALK) is among the most common molecular targets of NSCLC; its specific small-molecule tyrosine kinase inhibitors (TKIs) are approved for use in advanced NSCLC cases of ALK-positive. However, the influence of EML4-ALK fusion gene on the outcome of early-stage NSCLC cases and the necessity of application of TKIs for early-stage ALK-positive NSCLC patients are still uncertain. In this paper, we summarized the progression of testing methods for ALK-positive NSCLC patients as well as clinicopathological implication, outcome, and necessity of application of TKIs for early-stage ALK-positive NSCLC patients.
NASA Astrophysics Data System (ADS)
Raju, C. S. K.; Ibrahim, S. M.; Anuradha, S.; Priyadharshini, P.
2016-11-01
In modern days, the mass transfer rate is a challenge to scientists due to its noticeable significance for industrial as well as engineering applications; owing to this, we attempt to study the cross-diffusion effects on the magnetohydrodynamic nonlinear radiative Carreau fluid over a wedge filled with gyrotactic microorganisms. Numerical results are presented graphically as well as in tabular form with the aid of the Runge-Kutta and Newton methods. The effects of pertinent parameters on velocity, temperature, concentration and density of motile organism distributions are presented and discussed for two cases (suction and injection flows). For real-life applications we also calculated the local Nusselt and Sherwood numbers. It is observed that the thermal and concentration profiles are not uniform in the suction and injection flow cases. It is found that the heat and mass transport phenomenon is high in the injection case, while heat and mass transfer rates are high in the suction flow case.
NASA Technical Reports Server (NTRS)
Rzasnicki, W.
1973-01-01
A method of solution is presented, which, when applied to the elasto-plastic analysis of plates having a v-notch on one edge and subjected to pure bending, will produce stress and strain fields in much greater detail than presently available. Application of the boundary integral equation method results in two coupled Fredholm-type integral equations, subject to prescribed boundary conditions. These equations are replaced by a system of simultaneous algebraic equations and solved by a successive approximation method employing Prandtl-Reuss incremental plasticity relations. The method is first applied to a number of elasto-static problems and the results compared with available solutions. Good agreement is obtained in all cases. The elasto-plastic analysis provides detailed stress and strain distributions for several cases of plates with various notch angles and notch depths. A strain hardening material is assumed and both plane strain and plane stress conditions are considered.
An interior-point method-based solver for simulation of aircraft parts riveting
NASA Astrophysics Data System (ADS)
Stefanova, Maria; Yakunin, Sergey; Petukhova, Margarita; Lupuleac, Sergey; Kokkolaras, Michael
2018-05-01
The particularities of the aircraft parts riveting process simulation necessitate the solution of a large number of contact problems. A primal-dual interior-point method-based solver is proposed for solving such problems efficiently. The proposed method features a worst-case polynomial complexity bound on the number of iterations in terms of the problem dimension n and a threshold ε related to the desired accuracy. In practice, the convergence is often faster than this worst-case bound, which makes the method applicable to large-scale problems. The computational challenge is solving the system of linear equations, because the associated matrix is ill conditioned. To that end, the authors introduce a preconditioner and a strategy for determining effective initial guesses based on the physics of the problem. Numerical results are compared with ones obtained using the Goldfarb-Idnani algorithm. The results demonstrate the efficiency of the proposed method.
Understanding and Evaluating Assurance Cases
NASA Technical Reports Server (NTRS)
Rushby, John; Xu, Xidong; Rangarajan, Murali; Weaver, Thomas L.
2015-01-01
Assurance cases are a method for providing assurance for a system by giving an argument to justify a claim about the system, based on evidence about its design, development, and tested behavior. In comparison with assurance based on guidelines or standards (which essentially specify only the evidence to be produced), the chief novelty in assurance cases is provision of an explicit argument. In principle, this can allow assurance cases to be more finely tuned to the specific circumstances of the system, and more agile than guidelines in adapting to new techniques and applications. The first part of this report (Sections 1-4) provides an introduction to assurance cases. Although this material should be accessible to all those with an interest in these topics, the examples focus on software for airborne systems, traditionally assured using the DO-178C guidelines and its predecessors. A brief survey of some existing assurance cases is provided in Section 5. The second part (Section 6) considers the criteria, methods, and tools that may be used to evaluate whether an assurance case provides sufficient confidence that a particular system or service is fit for its intended use. An assurance case cannot provide unequivocal "proof" for its claim, so much of the discussion focuses on the interpretation of such less-than-definitive arguments, and on methods to counteract confirmation bias and other fallibilities in human reasoning.
Modelling migration in multilayer systems by a finite difference method: the spherical symmetry case
NASA Astrophysics Data System (ADS)
Hojbotǎ, C. I.; Toşa, V.; Mercea, P. V.
2013-08-01
We present a numerical model based on finite differences to solve the problem of chemical impurity migration within a multilayer spherical system. Migration here means diffusion of chemical species in conditions of concentration partitioning at layer interfaces due to different solubilities of the migrant in different layers. We detail here the numerical model and discuss the results of its implementation. To validate the method we compare it with cases where an analytic solution exists. We also present an application of our model to a practical problem in which we compute the migration of caprolactam from the packaging multilayer foil into the food.
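As a reduced illustration of the finite-difference approach (a single homogeneous sphere rather than the paper's multilayer system, with an invented diffusivity and a perfect-sink surface), the Python sketch below uses the substitution u = r*C, which turns the spherically symmetric diffusion equation into a plain 1-D heat equation; the multilayer case would additionally enforce concentration partitioning C_i/C_j = K_ij and flux continuity at each interface.

    import numpy as np

    D = 1e-10                     # diffusion coefficient [m^2/s] (assumed)
    R = 1e-3                      # sphere radius [m]
    nr = 101
    r = np.linspace(0.0, R, nr)
    dr = r[1] - r[0]
    dt = 0.4 * dr**2 / D          # below the explicit stability limit dr^2 / (2 D)

    C = np.ones(nr)               # normalized initial migrant concentration
    u = r * C                     # u = r*C reduces dC/dt = D (C'' + 2 C'/r) to du/dt = D u''

    for _ in range(int(3600.0 / dt)):                 # one hour of migration
        u_new = u.copy()
        u_new[1:-1] = u[1:-1] + D * dt / dr**2 * (u[2:] - 2.0 * u[1:-1] + u[:-2])
        u_new[0] = 0.0            # symmetry at the centre (u = r*C vanishes at r = 0)
        u_new[-1] = 0.0           # perfect-sink surface: C = 0 at r = R
        u = u_new

    C[1:] = u[1:] / r[1:]
    C[0] = C[1]                   # centre value by nearest-neighbour extrapolation
    fraction_left = np.sum(C * r**2) / np.sum(r**2)   # volume-weighted average concentration
    print("fraction of migrant remaining after 1 h:", round(float(fraction_left), 3))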
Theoretical study of air forces on an oscillating or steady thin wing in a supersonic main stream
NASA Technical Reports Server (NTRS)
Garrick, I E; Rubinow, S I
1947-01-01
A theoretical study, based on the linearized equations of motion for small disturbances, is made of the air forces on wings of general plan form moving forward at a constant supersonic speed. The boundary problem is set up for both the harmonically oscillating and the steady conditions. Two types of boundary conditions are distinguished, which are designated "purely supersonic" and "mixed supersonic." The purely supersonic case involves independence of action of the upper and lower surfaces of the airfoil, and the present analysis is mainly concerned with this case. A discussion is first given of the fundamental or elementary solution corresponding to a moving source. The solutions for the velocity potential are then synthesized by means of integration of the fundamental solution for the moving source. The method is illustrated by applications to a number of examples for both the steady and the oscillating cases and for various plan forms, including swept wings and rectangular and triangular plan forms. The special results of a number of authors are shown to be included in the analysis.
A Proposed Resident's Operative Case Tracking and Evaluation System.
Sehli, Deema N; Esene, Ignatius N; Baeesa, Saleh S
2016-03-01
Neurosurgery program trainers are continuously searching for new methods to evaluate trainees' competency besides the number of cases and training duration. Recently, efforts have been made on the development of reliable methods to teach competency and valid methods to measure teaching efficacy. Herein, we propose the "Resident's Operative Case Tracking and Evaluation System" (ROCTES) for the assessment and monitoring of the resident's performance quality during each procedure. We developed a database-driven website and smartphone application for neurosurgical attending physicians, residents, and resident review committees in our accredited neurosurgical institutions. ROCTES runs through five steps: Login (Resident), Case Entry, Login (Attending Physician), Case Approval and Evaluation, and Report. The Resident enters each case record under the "Case Entry" field and can "save," "edit," or "submit" the case data to the Attending Physician. The latter, from the attending physician login profile, will be able to "approve and evaluate" the resident's "knowledge," "skills," and "attitude," ranking from 1 to 15 for that particular case; add comments; and then "save," "edit," or "submit" the data, which can be viewed by users as a "report." Program Directors can also "login" to monitor the resident's progress. The implementation of this communication tool should enable the filtering and retrieval of information needed for the better assessment and monitoring of residents' exposure to a variety of cases in each training center. This proposed evaluation system will provide a transparent assessment for residency training programs and should help convert trainees into competent neurosurgeons. Copyright © 2016 Elsevier Inc. All rights reserved.
NASA Technical Reports Server (NTRS)
Lin, P.; Pratt, D. T.
1987-01-01
A hybrid method has been developed for the numerical prediction of turbulent mixing in a spatially-developing, free shear layer. Most significantly, the computation incorporates the effects of large-scale structures, Schmidt number and Reynolds number on mixing, which have been overlooked in the past. In flow field prediction, large-eddy simulation was conducted by a modified 2-D vortex method with subgrid-scale modeling. The predicted mean velocities, shear layer growth rates, Reynolds stresses, and the RMS of longitudinal velocity fluctuations were found to be in good agreement with experiments, although the lateral velocity fluctuations were overpredicted. In scalar transport, the Monte Carlo method was extended to the simulation of the time-dependent pdf transport equation. For the first time, the mixing frequency in Curl's coalescence/dispersion model was estimated by using Broadwell and Breidenthal's theory of micromixing, which involves Schmidt number, Reynolds number and the local vorticity. Numerical tests were performed for a gaseous case and an aqueous case. Evidence that pure freestream fluids are entrained into the layer by large-scale motions was found in the predicted pdf. Mean concentration profiles were found to be insensitive to Schmidt number, while the unmixedness was higher for higher Schmidt number. Applications were made to mixing layers with isothermal, fast reactions. The predicted difference in product thickness of the two cases was in reasonable quantitative agreement with experimental measurements.
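To make the scalar-mixing step concrete, the Python sketch below runs a particle Monte Carlo version of Curl's coalescence/dispersion model on an initially unmixed scalar; the mixing frequency is treated as a free constant here, whereas the paper estimates it from the Schmidt number, Reynolds number and local vorticity, and that closure is not reproduced.

    import numpy as np

    rng = np.random.default_rng(42)
    N = 10_000
    phi = np.concatenate([np.zeros(N // 2), np.ones(N // 2)])   # unmixed scalar, two feed streams

    beta, dt, t_end = 2.0, 0.01, 2.0        # mixing frequency [1/s] (assumed), time step, duration
    for _ in range(int(t_end / dt)):
        n_pairs = int(beta * dt * N / 2)    # number of particle pairs mixed this step
        idx = rng.choice(N, size=2 * n_pairs, replace=False)
        a, b = idx[:n_pairs], idx[n_pairs:]
        mean = 0.5 * (phi[a] + phi[b])
        phi[a] = phi[b] = mean              # full-coalescence variant of Curl's model

    unmixedness = phi.var() / 0.25          # scalar variance normalized by its initial value
    print("normalized scalar variance after mixing:", round(float(unmixedness), 3))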
Nagy, Anna; Nagy, Orsolya; Tarcsai, Katalin; Farkas, Ágnes; Takács, Mária
2018-03-01
Tick-borne encephalitis virus (TBEV) is one of the endemic flaviviruses in Hungary, which is responsible for human infections every year. Neurological involvement in the disease is characterized by meningitis, encephalitis or meningoencephalitis which can result in long-term neurological and neuropsychiatric sequelae. Microbiological diagnosis of acute cases is predominantly based on serological tests due to the limited duration of viremia and long incubation period, however, the application of molecular methods can also supplement the serological diagnosis and provides epidemiological data. The aim of this study was to determine how viral RNA could successfully be detected from different body fluids of serologically confirmed acute cases. Serum, whole blood, cerebrospinal fluid and urine samples of 18 patients from the total of the 19 serologically diagnosed cases were investigated by using the RT-PCR method. Two sera and one urine sample of three patients tested positive and the European subtype of TBEV could be identified. As far as we know this was the first time that TBEV RNA could be detected from human clinical samples in Hungary. Our finding highlights that the application of molecular methods besides serological tests can be a valuable tool in differential diagnosis especially in areas like Hungary, where two or more flaviviruses are co-circulating. Copyright © 2018 Elsevier GmbH. All rights reserved.
NASA Astrophysics Data System (ADS)
Soummer, Rémi; Pueyo, Laurent; Ferrari, André; Aime, Claude; Sivaramakrishnan, Anand; Yaitskova, Natalia
2009-04-01
We study the application of Lyot coronagraphy to future Extremely Large Telescopes (ELTs), showing that Apodized Pupil Lyot Coronagraphs enable high-contrast imaging for exoplanet detection and characterization with ELTs. We discuss the properties of the optimal pupil apodizers for this application (generalized prolate spheroidal functions). The case of a circular aperture telescope with a central obstruction is considered in detail, and we discuss the effects of primary mirror segmentation and secondary mirror support structures as a function of the occulting mask size. In most cases where inner working distance is critical, e.g., for exoplanet detection, these additional features do not alter the solutions derived with just the central obstruction, although certain applications such as quasar-host galaxy coronagraphic observations could benefit from designs that explicitly accommodate ELT spider geometries. We illustrate coronagraphic designs for several ELT geometries including ESO/OWL, the Thirty Meter Telescope, the Giant Magellan Telescope, and describe numerical methods for generating these designs.
Recommendations for numerical solution of reinforced-panel and fuselage-ring problems
NASA Technical Reports Server (NTRS)
Hoff, N J; Libby, Paul A
1949-01-01
Procedures are recommended for solving the equations of equilibrium of reinforced panels and isolated fuselage rings as represented by the external loads and the operations table established according to Southwell's method. From the solution of these equations the stress distribution can be easily determined. The method of systematic relaxations, the matrix-calculus method, and several other methods applicable in special cases are discussed. Definite recommendations are made for obtaining the solution of reinforced-panel problems which are generally designated as shear lag problems. The procedures recommended are demonstrated in the analysis of a number of panels. In the case of fuselage rings it is not possible to make definite recommendations for the solution of the equilibrium equations for all rings and loadings. However, suggestions based on the latest experience are made and demonstrated on several rings.
NASA Astrophysics Data System (ADS)
Kim, Duk-hyun; Lee, Hyoung-Jin
2018-04-01
A study of an efficient aerodynamic database modeling method was conducted. The creation of a database using the periodicity and symmetry characteristics of missile aerodynamic coefficients was investigated to minimize the number of wind tunnel test cases. In addition, studies of how to generate the aerodynamic database when the periodicity changes due to the installation of a protuberance, and of how to conduct a zero calibration, were carried out. Depending on the missile configuration, the required number of test cases changes and there exist tests that can be omitted. A database of aerodynamic coefficients for control-surface deflection angles can be constructed using a phase shift. The validity of the modeling method was demonstrated by confirming that the aerodynamic coefficients calculated using the modeling method were in agreement with the wind tunnel test results.
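A toy illustration of the phase-shift idea (with made-up coefficient values and an assumed 90° roll periodicity for a cruciform layout; a real database would come from the wind tunnel tests) is shown in the Python snippet below.

    import numpy as np

    phi_meas = np.arange(0.0, 90.0, 15.0)                      # measured roll angles [deg]
    cn_meas = np.array([0.80, 0.83, 0.88, 0.91, 0.88, 0.83])   # normal-force coefficient (invented)

    def cn(phi_deg):
        # 90-degree periodicity lets the measured quarter-sector cover all roll angles
        return float(np.interp(phi_deg, phi_meas, cn_meas, period=90.0))

    for phi in (10.0, 100.0, 190.0, 280.0):                    # all map onto the same folded angle
        print("phi = %5.1f deg  ->  Cn = %.3f" % (phi, cn(phi)))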
Cauliflower ear – a minimally invasive treatment method in a wrestling athlete: a case report
Haik, Josef; Givol, Or; Kornhaber, Rachel; Cleary, Michelle; Ofir, Hagit; Harats, Moti
2018-01-01
Acute auricular hematoma can be caused by direct blunt trauma or other injury to the external ear. It is typically seen in those who practice full contact sports such as boxing, wrestling, and rugby. "Cauliflower ear" deformity, fibrocartilage formation during scarring, is a common complication of auricular hematomas. Therefore, acute drainage of the hematoma and postprocedural techniques for preventing recurrence are necessary for preventing the deformity. There are many techniques, although no superior method of treatment has been found. In this case report, we describe a novel method using needle aspiration followed by the application of a magnet and an adapted disc to the affected area of the auricle. This minimally invasive, simple, and accessible method could potentially facilitate the treatment of cauliflower ear among full contact sports athletes. PMID:29403318
Tian, Hua; Wang, Xueying; Shu, Gequn; Wu, Mingqiang; Yan, Nanhua; Ma, Xiaonan
2017-09-15
Mixtures of hydrocarbon and carbon dioxide show excellent cycle performance in the Organic Rankine Cycle (ORC) used for engine waste heat recovery, but unavoidable leakage in practical application is a threat to safety due to their flammability. In this work, a quantitative risk assessment system (QR-AS) is established, aiming to provide a general method of risk assessment for flammable working fluid leakage. The QR-AS covers three main aspects: analysis of concentration distribution based on CFD simulations, explosion risk assessment based on the TNT equivalent method, and risk mitigation based on the evaluation results. A typical case of a propane/carbon dioxide mixture leaking from an ORC is investigated to illustrate the application of the QR-AS. According to the assessment results, a proper ventilation speed, safe mixture ratio and location of gas-detecting devices have been proposed to guarantee security in case of leakage. The results revealed that the presented QR-AS was reliable for practical application and that the evaluation results could provide valuable guidance for the design of mitigation measures to improve the safety performance of the ORC system. Copyright © 2017 Elsevier B.V. All rights reserved.
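The TNT equivalent step reduces to a short calculation, sketched below in Python with illustrative numbers (the cloud mass, yield factor and stand-off distance are assumptions, and the CFD-derived concentration field is not reproduced); peak overpressure would then be read from an empirical blast correlation as a function of the scaled distance.

    # TNT-equivalent estimate for a propane release (illustrative values only)
    m_fuel = 2.0          # mass of propane in the flammable cloud [kg] (assumed)
    H_c = 46.3e6          # lower heating value of propane [J/kg]
    E_TNT = 4.68e6        # blast energy of TNT [J/kg]
    eta = 0.05            # explosion yield factor (commonly a few percent; assumed)

    W_tnt = eta * m_fuel * H_c / E_TNT          # equivalent TNT mass [kg]
    R = 10.0                                    # stand-off distance [m] (assumed)
    Z = R / W_tnt ** (1.0 / 3.0)                # Hopkinson-Cranz scaled distance [m/kg^(1/3)]
    print("TNT equivalent: %.2f kg, scaled distance Z = %.1f m/kg^(1/3)" % (W_tnt, Z))
    # Peak overpressure would be obtained from an empirical chart or correlation
    # (e.g., Kingery-Bulmash) as a function of Z.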
Exploring Situational Awareness in Diagnostic Errors in Primary Care
Singh, Hardeep; Giardina, Traber Davis; Petersen, Laura A.; Smith, Michael; Wilson, Lindsey; Dismukes, Key; Bhagwath, Gayathri; Thomas, Eric J.
2013-01-01
Objective Diagnostic errors in primary care are harmful but poorly studied. To facilitate understanding of diagnostic errors in real-world primary care settings using electronic health records (EHRs), this study explored the use of the Situational Awareness (SA) framework from aviation human factors research. Methods A mixed-methods study was conducted involving reviews of EHR data followed by semi-structured interviews of selected providers from two institutions in the US. The study population included 380 consecutive patients with colorectal and lung cancers diagnosed between February 2008 and January 2009. Using a pre-tested data collection instrument, trained physicians identified diagnostic errors, defined as lack of timely action on one or more established indications for diagnostic work-up for lung and colorectal cancers. Twenty-six providers involved in cases with and without errors were interviewed. Interviews probed for providers' lack of SA and how this may have influenced the diagnostic process. Results Of 254 cases meeting inclusion criteria, errors were found in 30 (32.6%) of 92 lung cancer cases and 56 (33.5%) of 167 colorectal cancer cases. Analysis of interviews related to error cases revealed evidence of lack of one of four levels of SA applicable to primary care practice: information perception, information comprehension, forecasting future events, and choosing appropriate action based on the first three levels. In cases without error, the application of the SA framework provided insight into processes involved in attention management. Conclusions A framework of SA can help analyze and understand diagnostic errors in primary care settings that use EHRs. PMID:21890757
Applications of surface metrology in firearm identification
NASA Astrophysics Data System (ADS)
Zheng, X.; Soons, J.; Vorburger, T. V.; Song, J.; Renegar, T.; Thompson, R.
2014-01-01
Surface metrology is commonly used to characterize functional engineering surfaces. The technologies developed offer opportunities to improve forensic toolmark identification. Toolmarks are created when a hard surface, the tool, comes into contact with a softer surface and causes plastic deformation. Toolmarks are commonly found on fired bullets and cartridge cases. Trained firearms examiners use these toolmarks to link an evidence bullet or cartridge case to a specific firearm, which can lead to a criminal conviction. Currently, identification is typically based on qualitative visual comparison by a trained examiner using a comparison microscope. In 2009, a report by the National Academies called this method into question. Amongst other issues, they questioned the objectivity of visual toolmark identification by firearms examiners. The National Academies recommended the development of objective toolmark identification criteria and confidence limits. The National Institute of Standards and Technology (NIST) has applied its experience in surface metrology to develop objective identification criteria, measurement methods, and reference artefacts for toolmark identification. NIST developed the Standard Reference Material SRM 2460 standard bullet and SRM 2461 standard cartridge case to facilitate quality control and traceability of identifications performed in crime laboratories. Objectivity is improved through measurement of surface topography and application of unambiguous surface similarity metrics, such as the maximum value (ACCFMAX) of the areal cross correlation function. Case studies were performed on consecutively manufactured tools, such as gun barrels and breech faces, to demonstrate that, even in this worst-case scenario, all the tested tools imparted unique surface topographies that were identifiable. These studies provide scientific support for toolmark evidence admissibility in criminal court cases.
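The similarity metric itself is easy to sketch: the Python snippet below computes the maximum of a normalized areal cross-correlation between two synthetic topography maps by brute-force search over integer lags (a toy stand-in for ACCFMAX, not NIST's implementation; the array sizes and noise level are arbitrary).

    import numpy as np

    def accf_max(a, b, max_shift=20):
        # Maximum normalized areal cross-correlation between two 2-D topography maps,
        # searched over integer lags up to max_shift (circular shifts via np.roll).
        a = a - a.mean()
        best = -1.0
        for dy in range(-max_shift, max_shift + 1):
            for dx in range(-max_shift, max_shift + 1):
                bs = np.roll(np.roll(b, dy, axis=0), dx, axis=1)
                bs = bs - bs.mean()
                c = (a * bs).sum() / np.sqrt((a * a).sum() * (bs * bs).sum())
                best = max(best, c)
        return best

    rng = np.random.default_rng(7)
    mark = rng.normal(size=(64, 64))                              # reference toolmark topography
    evidence = np.roll(mark, (3, -5), axis=(0, 1)) + 0.2 * rng.normal(size=(64, 64))
    unrelated = rng.normal(size=(64, 64))
    print("matching pair  ACCF_MAX =", round(accf_max(mark, evidence), 3))
    print("non-match pair ACCF_MAX =", round(accf_max(mark, unrelated), 3))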
Burr, Tom; Croft, Stephen; Jarman, Kenneth D.
2015-09-05
The various methods of nondestructive assay (NDA) of special nuclear material (SNM) have applications in nuclear nonproliferation, including detection and identification of illicit SNM at border crossings, and quantifying SNM at nuclear facilities for safeguards. No assay method is complete without “error bars,” which provide one way of expressing confidence in the assay result. Consequently, NDA specialists typically quantify total uncertainty in terms of “random” and “systematic” components, and then specify error bars for the total mass estimate in multiple items. Uncertainty quantification (UQ) for NDA has always been important, but it is recognized that greater rigor is needed and achievable using modern statistical methods. To this end, we describe the extent to which the guideline for expressing uncertainty in measurements (GUM) can be used for NDA. Also, we propose improvements over GUM for NDA by illustrating UQ challenges that it does not address, including calibration with errors in predictors, model error, and item-specific biases. A case study is presented using low-resolution NaI spectra and applying the enrichment meter principle to estimate the U-235 mass in an item. The case study illustrates how to update the current American Society for Testing and Materials guide for application of the enrichment meter principle using gamma spectra from a NaI detector.
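For readers unfamiliar with the enrichment meter principle, the sketch below shows the basic arithmetic: for sufficiently thick items the net 185.7 keV count rate is proportional to the U-235 enrichment, so a single calibration constant and first-order (GUM-style) propagation give the enrichment and its standard uncertainty. All count rates, the certified enrichment and the uncertainties are invented numbers.

    import numpy as np

    C_cal, u_Ccal = 1200.0, 15.0      # net 185.7 keV count rate of a reference item [1/s] (assumed)
    E_cal, u_Ecal = 4.95, 0.02        # its certified enrichment [% U-235] (assumed)

    k = E_cal / C_cal                 # calibration constant of the enrichment meter
    C_item, u_Citem = 730.0, 12.0     # measured net count rate of the unknown item (assumed)
    E_item = k * C_item

    # First-order propagation for the product/quotient E = E_cal * C_item / C_cal
    rel_u = np.sqrt((u_Ecal / E_cal) ** 2 + (u_Citem / C_item) ** 2 + (u_Ccal / C_cal) ** 2)
    print("enrichment = %.2f %% with u = %.2f %% (k=1)" % (E_item, E_item * rel_u))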
Stolper, Margreet; Molewijk, Bert; Widdershoven, Guy
2016-07-22
Moral Case Deliberation is a specific form of bioethics education fostering professionals' moral competence in order to deal with their moral questions. So far, few studies focus in detail on Moral Case Deliberation methodologies and their didactic principles. The dilemma method is a structured and frequently used method in Moral Case Deliberation that stimulates methodological reflection and reasoning through a systematic dialogue on an ethical issue experienced in practice. In this paper we present a case-study of a Moral Case Deliberation with the dilemma method in a health care institution for people with an intellectual disability, describing the theoretical background and the practical application of the dilemma method. The dilemma method focuses on moral experiences of participants concerning a concrete dilemma in practice. By an in-depth description of each of the steps of the deliberation process, we elucidate the educational value and didactics of this specific method. The didactics and methodical steps of the dilemma method both supported and structured the dialogical reflection process of the participants. The process shows that the participants learned to recognize the moral dimension of the issue at stake and were able to distinguish various perspectives and reasons in a systematic manner. The facilitator played an important role in the learning process of the participants, by assisting them in focusing on and exploring moral aspects of the case. The reflection and learning process, experienced by the participants, shows competency-based characteristics. The role of the facilitator is that of a Socratic teacher with specific knowledge and skills, fostering reflection, inquiry and dialogue. The specific didactics of the dilemma method is well suited for teaching bioethics in clinical settings. The dilemma method follows an inductive learning approach through a dialogical moral inquiry in which participants develop not only knowledge but also skills, attitude and character. The role of a trained facilitator and a specific view on teaching and practicing ethics are essential when using the dilemma method in teaching health care professionals how to reflect on their own moral issues in practice.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Silberman, E.; Morgan, H.W.
1977-01-01
Application of the mathematical theory of groups to the symmetry of molecules is a powerful method which permits the prediction, classification, and qualitative description of many molecular properties. In the particular case of vibrational molecular spectroscopy, applications of group theory lead to simple methods for the prediction of the number of bands to be found in the infrared and Raman spectra, their shape and polarization, and the qualitative description of the normal modes with which they are associated. The tables necessary for the application of group theory to vibrational spectroscopy and instructions on how to use them for molecular gases, liquids, and solutions are presented. A brief introduction to the concepts, definitions, nomenclature, and formulae is also included.
A risk evaluation model and its application in online retailing trustfulness
NASA Astrophysics Data System (ADS)
Ye, Ruyi; Xu, Yingcheng
2017-08-01
Building a general model for risk evaluation in advance can improve the convenience, consistency and comparability of the results of repeated risk evaluations, in the case that the repeated evaluations are in the same area and for a similar purpose. One of the most convenient and common risk evaluation models is an index system consisting of several indices, corresponding weights and a crediting method. A method for building a risk evaluation index system that guarantees a proportional relationship between the resulting credit and the expected risk loss is proposed, and an application example in online retailing is provided in this article.
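A minimal sketch of such an index system, with invented indicators, weights and loss scale (not those of the article), keeps the credit proportional to the expected risk loss:

    # Minimal weighted-index credit model (all names and numbers are illustrative)
    weights = {"delivery_delay": 0.4, "complaint_rate": 0.35, "refund_failure": 0.25}
    max_loss = 10_000.0    # expected loss (currency units) when every indicator is at its worst

    def risk_credit(scores):
        # scores: 0 (best) .. 1 (worst) per indicator
        index = sum(weights[k] * scores[k] for k in weights)
        return index, index * max_loss   # credit kept proportional to the expected loss

    print(risk_credit({"delivery_delay": 0.2, "complaint_rate": 0.5, "refund_failure": 0.1}))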
Applications of the hybrid coordinate method to the TOPS autopilot
NASA Technical Reports Server (NTRS)
Fleischer, G. E.
1978-01-01
Preliminary results are presented from the application of the hybrid coordinate method to modeling TOPS (thermoelectric outer planet spacecraft) structural dynamics. Computer simulated responses of the vehicle are included which illustrate the interaction of relatively flexible appendages with an autopilot control system. Comparisons were made between simplified single-axis models of the control loop, with spacecraft flexibility represented by hinged rigid bodies, and a very detailed three-axis spacecraft model whose flexible portions are described by modal coordinates. While single-axis system root loci provided reasonable qualitative indications of stability margins in this case, they were quantitatively optimistic when matched against responses of the detailed model.
Frequencies of gravity-capillary waves on highly curved interfaces with edge constraints
NASA Astrophysics Data System (ADS)
Shankar, P. N.
2007-06-01
A recently developed technique to calculate the natural frequencies of gravity-capillary waves in a confined liquid mass with a possibly highly curved free surface is extended to the case where the contact line is pinned. The general technique is worked out in detail for the cases of rectangular and cylindrical containers of circular section, the cases for which experimental data are available. The results of the present method are in excellent agreement with all earlier experimental and theoretical data for the flat static interface case [Benjamin and Scott, 1979. Gravity-capillary waves with edge constraints. J. Fluid Mech. 92, 241-267; Graham-Eagle, 1983. A new method for calculating eigenvalues with applications to gravity-capillary waves with edge constraints. Math. Proc. Camb. Phil. Soc. 94, 553-564; Henderson and Miles, 1994. Surface-wave damping in a circular cylinder with a fixed contact line. J. Fluid Mech. 275, 285-299]. However, the present method is applicable even when the contact angle is not π/2 and the static interface is curved. As a consequence we are able to work out the effects of a curved meniscus on the results of Cocciaro et al. [1993. Experimental investigation of capillary effects on surface gravity waves: non-wetting boundary conditions. J. Fluid Mech. 246, 43-66] where the measured contact angle was 62°. We find that the meniscus does indeed account, as suggested by Cocciaro et al., for the earlier discrepancy between theory and experiment of about 20 mHz and there is now excellent agreement between the two.
A standardized mean difference effect size for multiple baseline designs across individuals.
Hedges, Larry V; Pustejovsky, James E; Shadish, William R
2013-12-01
Single-case designs are a class of research methods for evaluating treatment effects by measuring outcomes repeatedly over time while systematically introducing different conditions (e.g., treatment and control) to the same individual. The designs are used across fields such as behavior analysis, clinical psychology, special education, and medicine. Emerging standards for single-case designs have focused attention on methods for summarizing and meta-analyzing findings and on the need for effect size indices that are comparable to those used in between-subjects designs. In previous work, we discussed how to define and estimate an effect size that is directly comparable to the standardized mean difference often used in between-subjects research, based on data from a particular type of single-case design, the treatment reversal or (AB)^k design. This paper extends the effect size measure to another type of single-case study, the multiple baseline design. We propose estimation methods for the effect size and its variance, study the estimators using simulation, and demonstrate the approach in two applications. Copyright © 2013 John Wiley & Sons, Ltd.
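A simplified sketch of the idea of a design-comparable effect size for multiple-baseline data, using assumed example data and ignoring the autocorrelation, trend, and small-sample corrections that the paper's estimators actually handle:

```python
import numpy as np

# Multiple-baseline data: each case has a baseline (A) and a treatment (B) phase.
# A design-comparable standardized mean difference divides the average phase
# contrast by an estimate of the total (between-case + within-case) SD, so that
# it is on the same scale as a between-subjects Cohen's d. This is a crude
# sketch; it omits the corrections developed in the paper.
cases = {  # assumed example data (outcome measured repeatedly over sessions)
    "case1": {"A": [3, 4, 3, 5], "B": [7, 8, 9, 8, 9]},
    "case2": {"A": [2, 3, 2, 2], "B": [6, 5, 7, 6]},
    "case3": {"A": [4, 4, 5, 3], "B": [8, 9, 8, 10]},
}

contrasts = [np.mean(c["B"]) - np.mean(c["A"]) for c in cases.values()]
baseline_means = [np.mean(c["A"]) for c in cases.values()]
within_var = np.mean([np.var(c["A"], ddof=1) for c in cases.values()])  # pooled within-case
between_var = np.var(baseline_means, ddof=1)                            # between-case

d = np.mean(contrasts) / np.sqrt(within_var + between_var)
print(f"simplified design-comparable SMD = {d:.2f}")
```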
Hybridizable discontinuous Galerkin method for the 2-D frequency-domain elastic wave equations
NASA Astrophysics Data System (ADS)
Bonnasse-Gahot, Marie; Calandra, Henri; Diaz, Julien; Lanteri, Stéphane
2018-04-01
Discontinuous Galerkin (DG) methods are nowadays actively studied and increasingly exploited for the simulation of large-scale time-domain (i.e. unsteady) seismic wave propagation problems. Although theoretically applicable to frequency-domain problems as well, their use in this context has been hampered by the potentially large number of coupled unknowns they incur, especially in the 3-D case, as compared to classical continuous finite element methods. In this paper, we address this issue in the framework of the so-called hybridizable discontinuous Galerkin (HDG) formulations. As a first step, we study an HDG method for the resolution of the frequency-domain elastic wave equations in the 2-D case. We describe the weak formulation of the method and provide some implementation details. The proposed HDG method is assessed numerically including a comparison with a classical upwind flux-based DG method, showing better overall computational efficiency as a result of the drastic reduction of the number of globally coupled unknowns in the resulting discrete HDG system.
Nkosi, Zethu; Pillay, Padmini; Nokes, Kathleen M
2013-01-01
Case-based education has a long history in the disciplines of education, business, law and the health professions. Research suggests that students who learn via a case-based method have advanced critical thinking skills and a greater ability to apply knowledge in practice. In medical education, case-based methodology is widely used to facilitate knowledge transfer from theory to application in patient care. Nursing education has also adopted case-based methodology to enhance learner outcomes and critical thinking. The objectives of the study were to describe a decentralised nursing management education programme located in Durban, South Africa, and to describe the perceptions of nursing faculty facilitators regarding implementation of this teaching method. Data were collected through one-on-one interviews and focus groups with the fifteen facilitators who were using a case-based curriculum to teach the programme content. The average facilitator was female, between 41 and 50 years of age, working part-time, educated with a baccalaureate degree, and had worked as a professional nurse for between 11 and 20 years; slightly more than half had worked as a facilitator for three or more years. The facilitators identified themes related to the student learners, the learning environment, and the strengths and challenges of using facilitation to teach the content through cases. Decentralised nursing management education programmes can meet the needs of nurses who are located in remote areas characterised by poor transportation and limited resources, and who have great need for quality healthcare services. Nursing faculty facilitators need knowledgeable and accessible contact with centrally based full-time nursing faculty in order to promote high-quality educational programmes.
The biospeckle method for the investigation of agricultural crops: A review
NASA Astrophysics Data System (ADS)
Zdunek, Artur; Adamiak, Anna; Pieczywek, Piotr M.; Kurenda, Andrzej
2014-01-01
Biospeckle is a nondestructive method for the evaluation of living objects. It has been applied in medicine, agriculture and microbiology for monitoring processes related to the movement of material particles. Recently, the method has been used extensively for evaluating the quality of agricultural crops. In the case of botanical materials, the sources of apparent biospeckle activity are Brownian motion and biological processes such as cyclosis, growth, transport, etc. Applications have been demonstrated for monitoring aging and maturation of samples, organ development, and the detection and development of defects and diseases. This review focuses on three aspects: the image analysis and mathematical methods for evaluating biospeckle activity, published applications to botanical samples, with special attention to agricultural crops, and interpretation of the phenomena from a biological point of view.
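One widely used activity descriptor in this literature is the inertia moment of the co-occurrence matrix built from a time history of speckle patterns (THSP); the sketch below assumes a THSP array is already available and uses synthetic data.

```python
import numpy as np

def inertia_moment(thsp, levels=256):
    """Inertia moment of the co-occurrence matrix of a time history of speckle
    patterns (THSP): rows are pixels, columns are successive frames. Higher
    values indicate higher biospeckle activity."""
    thsp = np.asarray(thsp, dtype=int)
    com = np.zeros((levels, levels))
    # count intensity transitions between consecutive time steps, over all pixels
    for t in range(thsp.shape[1] - 1):
        np.add.at(com, (thsp[:, t], thsp[:, t + 1]), 1)
    row_sums = com.sum(axis=1, keepdims=True)
    com_norm = np.divide(com, row_sums, out=np.zeros_like(com), where=row_sums > 0)
    i, j = np.meshgrid(np.arange(levels), np.arange(levels), indexing="ij")
    return float(np.sum(com_norm * (i - j) ** 2))

# assumed example: synthetic THSP of 64 pixels observed over 200 frames
rng = np.random.default_rng(1)
thsp = np.clip(128 + np.cumsum(rng.integers(-3, 4, size=(64, 200)), axis=1), 0, 255)
print(f"inertia moment = {inertia_moment(thsp):.1f}")
```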
Protection of cooled blades of complex internal structure
NASA Technical Reports Server (NTRS)
Glamiche, P.
1977-01-01
The problem of general protection of cooled blades of complex internal structure was solved by a method called the SF technique, which makes possible the protection of both external and internal surfaces, as well as those of the cooling-air orifices, whatever their diameter. The SF method is most often applied as a pack process, at controlled or high activity; it can be used on previously uncoated parts, but also on pieces already coated by a thermochemical, chemical or PVD method. The respective thicknesses of the external and internal coatings may be precisely predetermined, with no parasitic particles liable to remain inside the parts after application of the protective treatment. Results obtained to date by application of this method are illustrated by the presentation and examination of a varied selection of advanced turbo engines.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Sen, Ramazan Sonat; Hummel, Andrew John; Hiruta, Hikaru
The deterministic full-core simulators require homogenized group constants covering the operating and transient conditions over the entire lifetime. Traditionally, the homogenized group constants are generated using a lattice physics code over an assembly or, in the case of prismatic high temperature reactors (HTRs), over a block. Strong absorbers that cause strong local depressions in the flux profile require special techniques during homogenization over a large volume; fuel blocks with burnable poisons and control rod blocks are examples of such cases. Over the past several decades, a tremendous number of studies have been performed to improve the accuracy of full-core calculations through the homogenization procedure. However, those studies were mostly performed for light water reactor (LWR) analyses and thus may not be directly applicable to advanced thermal reactors such as HTRs. This report presents the application of the SuPer-Homogenization correction method to a hypothetical HTR core.
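For orientation, standard flux-volume weighting of region-wise cross sections into a block constant can be sketched as follows (with assumed numbers); the SuPer-Homogenization method adds correction factors on top of this step, which are not shown here.

```python
import numpy as np

# Standard flux-volume weighting of region-wise cross sections into one
# homogenized block constant (per energy group):
#   Sigma_hom,g = sum_r(V_r * phi_{r,g} * Sigma_{r,g}) / sum_r(V_r * phi_{r,g})
# SuPer-Homogenization then applies correction factors so that the homogenized
# block reproduces the heterogeneous reaction rates; those factors are omitted.
volumes = np.array([1.0, 1.0, 0.2])              # assumed region volumes (fuel, fuel, poison)
flux = np.array([[1.00, 0.95, 0.30],             # assumed group-1 fluxes per region
                 [1.10, 1.05, 0.55]])            # assumed group-2 fluxes per region
sigma_a = np.array([[0.010, 0.011, 0.450],       # group-1 absorption cross sections
                    [0.080, 0.085, 0.900]])      # group-2 absorption cross sections

weights = volumes * flux                          # V_r * phi_{r,g}, shape (groups, regions)
sigma_hom = (weights * sigma_a).sum(axis=1) / weights.sum(axis=1)
print("homogenized absorption cross sections per group:", sigma_hom)
```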
Toglia, Joan; Goverover, Yael; Johnston, Mark V; Dain, Barry
2011-01-01
The multicontext approach addresses strategy use and self-monitoring skills within activities and contexts that are systematically varied to facilitate transfer of learning. This article illustrates the application of the multicontext approach by presenting a case study of an adult who is 5 years post-traumatic brain injury with executive dysfunction and limited awareness. A single case study design with repeated pre-post measures was used. Methods to monitor strategy generation and specific awareness within intervention are described. Findings suggest improved functional performance and generalization of use of an external strategy despite absence of changes in general self-awareness of deficits. This case describes the multicontext intervention process and provides clinical suggestions for working with individuals with serious deficits in awareness and executive dysfunction following traumatic brain injury. Copyright 2011, SLACK Incorporated.
Cai, Xi; Han, Guang; Song, Xin; Wang, Jinkuan
2017-11-01
Single-camera-based gait monitoring is unobtrusive, inexpensive, and easy to use for monitoring the daily gait of seniors in their homes. However, most studies require subjects to walk perpendicularly to the camera's optical axis or along specified routes, which limits application in elderly home monitoring. To build unconstrained monitoring environments, we propose a method to measure the step length symmetry ratio (a useful gait parameter representing gait symmetry without a significant relationship with age) from unconstrained straight walking using a single camera, without strict restrictions on walking directions or routes. According to projective geometry theory, we first develop a calculation formula of the step length ratio for the case of unconstrained straight-line walking. Then, to adapt to general cases, we propose to modify noncollinear footprints, and accordingly provide a general procedure for step length ratio extraction from unconstrained straight walking. Our method achieves a mean absolute percentage error (MAPE) of 1.9547% for 15 subjects' normal and abnormal side-view gaits, and also obtains satisfactory MAPEs for non-side-view gaits (2.4026% for 45°-view gaits and 3.9721% for 30°-view gaits). The performance is much better than a well-established monocular gait measurement system suitable only for side-view gaits, which has a MAPE of 3.5538%. Independently of walking directions, our method can accurately estimate step length ratios from unconstrained straight walking. This demonstrates that our method is applicable for elders' daily gait monitoring, providing valuable information for elderly health care such as abnormal gait recognition and fall risk assessment.
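Once footprints have been mapped to ground-plane coordinates (here assumed to be available, e.g., from a calibrated homography), the symmetry ratio itself is straightforward; this sketch does not reproduce the paper's handling of arbitrary camera views or noncollinear footprints.

```python
import numpy as np

# Assumed: footprint positions already expressed in ground-plane (metric)
# coordinates, ordered in time and alternating left/right foot. The step length
# symmetry ratio compares the mean length of alternate steps; a value near 1
# indicates symmetric gait.
footprints = np.array([  # (x, y) in meters, assumed example sequence
    [0.00, 0.00], [0.62, 0.08], [1.25, 0.00], [1.86, 0.08],
    [2.50, 0.00], [3.10, 0.08], [3.74, 0.00],
])

step_lengths = np.linalg.norm(np.diff(footprints, axis=0), axis=1)
left_steps, right_steps = step_lengths[0::2], step_lengths[1::2]
ratio = left_steps.mean() / right_steps.mean()
print(f"step length symmetry ratio = {ratio:.3f}")
```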
Data Applicability of Heritage and New Hardware for Launch Vehicle System Reliability Models
NASA Technical Reports Server (NTRS)
Al Hassan Mohammad; Novack, Steven
2015-01-01
Many launch vehicle systems are designed and developed using heritage and new hardware. In most cases, the heritage hardware undergoes modifications to fit new functional system requirements, impacting the failure rates and, ultimately, the reliability data. New hardware, which lacks historical data, is often compared to like systems when estimating failure rates. Some qualification of applicability for the data source to the current system should be made. Accurately characterizing the reliability data applicability and quality under these circumstances is crucial to developing model estimations that support confident decisions on design changes and trade studies. This presentation will demonstrate a data-source classification method that ranks reliability data according to applicability and quality criteria to a new launch vehicle. This method accounts for similarities/dissimilarities in source and applicability, as well as operating environments like vibrations, acoustic regime, and shock. This classification approach will be followed by uncertainty-importance routines to assess the need for additional data to reduce uncertainty.
NASA Astrophysics Data System (ADS)
Hannachi, Ammar; Kohler, Sophie; Lallement, Alex; Hirsch, Ernest
2015-04-01
3D modeling of scene contents is of increasing importance for many computer vision based applications. In particular, industrial applications of computer vision require efficient tools for the computation of this 3D information. Stereo vision is routinely used as a powerful technique to obtain the 3D outlines of imaged objects from the corresponding 2D images; however, this approach alone provides only a sparse and partial description of the scene contents. On the other hand, for structured light based reconstruction techniques, the 3D surfaces of imaged objects can often be computed with high accuracy, but the resulting active range data do not adequately characterize the object edges. Thus, in order to benefit from the strengths of the various acquisition techniques, we introduce in this paper promising approaches enabling complete 3D reconstruction based on the cooperation of two complementary acquisition and processing techniques, in our case stereoscopic and structured light based methods, providing two 3D data sets that describe respectively the outlines and surfaces of the imaged objects. We present, accordingly, the principles of three fusion techniques and their comparison based on evaluation criteria related to the nature of the workpiece and the type of application. The proposed fusion methods rely on geometric characteristics of the workpiece, which favour the quality of the registration. Further, the results obtained demonstrate that the developed approaches are well adapted for 3D modeling of manufactured parts including free-form surfaces and, consequently, for quality control applications using these 3D reconstructions.
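One generic building block of such fusion, least-squares rigid alignment of corresponding 3D points from the two sensors, can be sketched with the Kabsch/SVD method; this is an assumed illustration, not the authors' specific fusion or registration procedure.

```python
import numpy as np

def rigid_align(P, Q):
    """Least-squares rigid transform (R, t) with Q ~ R @ P + t for corresponding
    3D point sets P, Q of shape (N, 3), via the Kabsch/SVD method."""
    p_bar, q_bar = P.mean(axis=0), Q.mean(axis=0)
    H = (P - p_bar).T @ (Q - q_bar)
    U, _, Vt = np.linalg.svd(H)
    d = np.sign(np.linalg.det(Vt.T @ U.T))           # guard against reflections
    R = Vt.T @ np.diag([1.0, 1.0, d]) @ U.T
    t = q_bar - R @ p_bar
    return R, t

# assumed example: stereo-derived edge points P and the same points expressed in
# the structured-light sensor frame Q (generated synthetically here)
rng = np.random.default_rng(0)
P = rng.normal(size=(50, 3))
angle = np.deg2rad(20.0)
R_true = np.array([[np.cos(angle), -np.sin(angle), 0],
                   [np.sin(angle),  np.cos(angle), 0],
                   [0, 0, 1]])
Q = P @ R_true.T + np.array([0.5, -0.2, 1.0])
R, t = rigid_align(P, Q)
print("rotation recovered:", np.allclose(R, R_true), "translation:", t)
```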
Benefit Indicators for Flood Regulation Services of Wetlands: A Modeling Approach
This report describes a method for developing indicators of the benefits of flood regulation services of wetlands and presents a companion case study. We demonstrate our approach through an application to the Woonasquatucket River watershed in northern Rhode Island. This work is ...
18 CFR 157.17 - Applications for temporary certificates in cases of emergency.
Code of Federal Regulations, 2011 CFR
2011-04-01
... Amended, Concerning Any Operation, Sales, Service, Construction, Extension, Acquisition or Abandonment... temporary certificate authorizing the construction and operation of extensions of existing facilities... exact character of the emergency, the proposed method of meeting it, and the facts claimed to warrant...
18 CFR 157.17 - Applications for temporary certificates in cases of emergency.
Code of Federal Regulations, 2013 CFR
2013-04-01
... Amended, Concerning Any Operation, Sales, Service, Construction, Extension, Acquisition or Abandonment... temporary certificate authorizing the construction and operation of extensions of existing facilities... exact character of the emergency, the proposed method of meeting it, and the facts claimed to warrant...
18 CFR 157.17 - Applications for temporary certificates in cases of emergency.
Code of Federal Regulations, 2012 CFR
2012-04-01
... Amended, Concerning Any Operation, Sales, Service, Construction, Extension, Acquisition or Abandonment... temporary certificate authorizing the construction and operation of extensions of existing facilities... exact character of the emergency, the proposed method of meeting it, and the facts claimed to warrant...
18 CFR 157.17 - Applications for temporary certificates in cases of emergency.
Code of Federal Regulations, 2014 CFR
2014-04-01
... Amended, Concerning Any Operation, Sales, Service, Construction, Extension, Acquisition or Abandonment... temporary certificate authorizing the construction and operation of extensions of existing facilities... exact character of the emergency, the proposed method of meeting it, and the facts claimed to warrant...
Practical Stereology Applications for the Pathologist.
Brown, Danielle L
2017-05-01
Qualitative histopathology is the gold standard for routine examination of morphological tissue changes in the regulatory or academic environment. The human eye is exceptional at pattern recognition but often cannot detect small changes in quantity. In cases where detection of subtle quantitative changes is critical, more sensitive methods are required. Two-dimensional histomorphometry can provide additional quantitative information and is quite useful in many cases. However, the data provided may not be representative of the entire tissue and, as such, the approach makes several assumptions that are sources of bias. In contrast, stereology is design based rather than assumption based and uses stringent sampling methods to obtain accurate and precise 3-dimensional information using geometrical and statistical principles. Recent advances in technology have made stereology more approachable and practical for the pathologist in both regulatory and academic environments. This review introduces pathologists to the basic principles of stereology and walks the reader through some real-world examples of the application of these principles in the workplace.
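As a concrete example of a design-based estimator, Cavalieri's principle gives a volume estimate from systematic sections; all numbers below are assumed for illustration.

```python
# Cavalieri volume estimate from systematic uniform random sections:
#   V = t * a(p) * sum(P)
# where t is the section spacing, a(p) the area associated with one grid point,
# and P the point counts landing on the structure in each section.
section_spacing_mm = 0.5          # t: distance between analyzed sections (assumed)
area_per_point_mm2 = 0.04         # a(p): grid area represented by one test point (assumed)
point_counts = [12, 18, 25, 30, 28, 21, 14, 6]   # points hitting the structure per section

volume_mm3 = section_spacing_mm * area_per_point_mm2 * sum(point_counts)
print(f"estimated volume = {volume_mm3:.2f} mm^3")
```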
The supersymmetric method in random matrix theory and applications to QCD
NASA Astrophysics Data System (ADS)
Verbaarschot, Jacobus
2004-12-01
The supersymmetric method is a powerful method for the nonperturbative evaluation of quenched averages in disordered systems. Among others, this method has been applied to the statistical theory of S-matrix fluctuations, the theory of universal conductance fluctuations and the microscopic spectral density of the QCD Dirac operator. We start this series of lectures with a general review of Random Matrix Theory and the statistical theory of spectra. An elementary introduction to the supersymmetric method in Random Matrix Theory is given in the second and third lectures. We will show that a Random Matrix Theory can be rewritten as an integral over a supermanifold. This integral will be worked out in detail for the Gaussian Unitary Ensemble that describes level correlations in systems with broken time-reversal invariance. We especially emphasize the role of symmetries. As a second example of the application of the supersymmetric method we discuss the calculation of the microscopic spectral density of the QCD Dirac operator. This is the eigenvalue density near zero on the scale of the average level spacing, which is known to be given by chiral Random Matrix Theory. Also in this case we use symmetry considerations to rewrite the generating function for the resolvent as an integral over a supermanifold. The main topic of the second-to-last lecture is the recent developments on the relation between the supersymmetric partition function and integrable hierarchies (in our case the Toda lattice hierarchy). We will show that this relation provides an efficient way to calculate superintegrals. Several examples given in previous lectures will be worked out by means of this new method. Finally, we will discuss the quenched QCD Dirac spectrum at nonzero chemical potential. Because of the nonhermiticity of the Dirac operator the usual supersymmetric method has not been successful in this case. However, we will show that the supersymmetric partition function can be evaluated by means of the replica limit of the Toda lattice equation.
[Application of ARIMA model to predict number of malaria cases in China].
Hui-Yu, H; Hua-Qin, S; Shun-Xian, Z; Lin, A I; Yan, L U; Yu-Chun, C; Shi-Zhu, L I; Xue-Jiao, T; Chun-Li, Y; Wei, H U; Jia-Xu, C
2017-08-15
Objective To study the application of the autoregressive integrated moving average (ARIMA) model to predict the monthly reported malaria cases in China, so as to provide a reference for prevention and control of malaria. Methods SPSS 24.0 software was used to construct ARIMA models based on the monthly reported malaria cases for the time series 2006-2015 and 2011-2015, respectively. The data on malaria cases from January to December 2016 were used as validation data to compare the accuracy of the two ARIMA models. Results The models of the monthly reported cases of malaria in China were ARIMA (2, 1, 1) (1, 1, 0) 12 and ARIMA (1, 0, 0) (1, 1, 0) 12, respectively. The comparison between the predictions of the two models and the actual numbers of malaria cases showed that the ARIMA model based on the 2011-2015 data had higher forecasting accuracy than the model based on the 2006-2015 data. Conclusion The establishment and prediction of an ARIMA model is a dynamic process that needs to be adjusted continually according to the accumulated data; in addition, major changes in the epidemic characteristics of infectious diseases must be considered.
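A sketch of fitting the selected seasonal model with statsmodels' SARIMAX (one common implementation); the monthly series below is synthetic because the surveillance counts are not reproduced here.

```python
import numpy as np
import pandas as pd
from statsmodels.tsa.statespace.sarimax import SARIMAX

# ARIMA(1,0,0)(1,1,0)12 fitted to monthly case counts, mirroring the model the
# authors selected for the 2011-2015 series. The data below are synthetic.
rng = np.random.default_rng(0)
months = pd.date_range("2011-01", periods=60, freq="MS")
seasonal = 50 + 30 * np.sin(2 * np.pi * (np.arange(60) % 12) / 12)
cases = pd.Series(np.round(seasonal + rng.normal(0, 5, 60)), index=months)

model = SARIMAX(cases, order=(1, 0, 0), seasonal_order=(1, 1, 0, 12))
result = model.fit(disp=False)
forecast = result.forecast(steps=12)      # predicted monthly cases for the next year
print(forecast.round(1))
```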
RESEARCH: Theory in Practice: Applying Participatory Democracy Theory to Public Land Planning
Moote; Mcclaran; Chickering
1997-11-01
Application of participatory democracy theory to public participation in public land planning, while widely advocated, has not been closely examined. A case study is used here to explicate the application of participatory democracy concepts to public participation in public land planning and decision making. In this case, a Bureau of Land Management resource area manager decided to make a significant shift from the traditional public involvement process to a more participatory method, coordinated resource management (CRM). This case was assessed using document analysis, direct observation of CRM meetings, questionnaires, and interviews of key participants. These sources were used to examine the CRM case using participatory democracy concepts of efficacy, access and representation, continuous participation throughout planning, information exchange and learning, and decision-making authority. The case study suggests that social deliberation in itself does not ensure successful collaboration and that establishing rules of operation and decision making within the group is critical. Furthermore, conflicts between the concept of shared decision-making authority and the public land management agencies' accountability to Congress, the President, and the courts need further consideration. KEY WORDS: Case study; Coordinated resource management; Public participation; Administrative discretion; Representation; Consensus; Collaboration
Magnetotelluric Studies for Hydrocarbon and Geothermal Resources: Examples from the Asian Region
NASA Astrophysics Data System (ADS)
Patro, Prasanta K.
2017-09-01
Magnetotellurics (MT) and related electrical and electromagnetic methods play a very useful role in resource exploration. This review paper presents the current scenario of the application of MT in the exploration for hydrocarbons and geothermal resources in Asia. While seismics is the most preferred method in oil exploration, it is beset with several limitations in the case of sedimentary targets overlain by basalts or evaporite/carbonate rocks, where high-velocity layers overlying lower velocity layers pose a problem. In such cases, MT plays an important and, in some cases, crucial role in mapping these potential reservoirs because of the significant resistivity contrast generally observed between the basalts and the underlying sedimentary layers. A few case histories are presented that typically illustrate the role of MT in this context. In the case of geothermal exploration, MT is known to be highly effective in delineating target areas because of the conductivity structures arising from the presence and circulation of highly conductive fluids. A few examples of MT studies carried out in some of the potential areas of geothermal significance in the Asian region are also discussed. While the situation is relatively favorable for the application of EM and MT methods in the exploration of high-enthalpy regions, owing to the development of well-defined conceptual models, the low-enthalpy regions still need to be understood well, particularly because of more complex structural patterns and fluid circulation under relatively low-temperature conditions. Currently, much of the modeling in both geothermal and hydrocarbon exploration is being done using three-dimensional techniques, and it is the right time to move toward integration and three-dimensional joint inversion of geophysical parameters such as resistivity, velocity and density from MT, electromagnetics (EM), seismics and gravity.
Measurement of the surface wavelength distribution of narrow-band radiation by a colorimetric method
DOE Office of Scientific and Technical Information (OSTI.GOV)
Kraiskii, A V; Mironova, T V; Sultanov, T T
2010-09-10
A method is suggested for determining the wavelength of narrow-band light from a digital photograph of a radiating surface. The digital camera used should be appropriately calibrated. The accuracy of the wavelength measurement is better than 1 nm. The method was tested on the yellow doublet of mercury spectrum and on the adjacent continuum of the incandescent lamp radiation spectrum. By means of the method suggested the homogeneity of holographic sensor swelling was studied in stationary and transient cases. (laser applications and other topics in quantum electronics)
Targeted methods for quantitative analysis of protein glycosylation
Goldman, Radoslav; Sanda, Miloslav
2018-01-01
Quantification of proteins by LC-MS/MS-MRM has become a standard method with broad projected clinical applicability. MRM quantification of protein modifications is, however, far less utilized, especially in the case of glycoproteins. This review summarizes current methods for quantitative analysis of protein glycosylation with a focus on MRM methods. We describe advantages of this quantitative approach, analytical parameters that need to be optimized to achieve reliable measurements, and point out the limitations. Differences between major classes of N- and O-glycopeptides are described and class-specific glycopeptide assays are demonstrated. PMID:25522218
Free-energy landscapes from adaptively biased methods: Application to quantum systems
NASA Astrophysics Data System (ADS)
Calvo, F.
2010-10-01
Several parallel adaptive biasing methods are applied to the calculation of free-energy pathways along reaction coordinates, choosing as a difficult example the double-funnel landscape of the 38-atom Lennard-Jones cluster. In the case of classical statistics, the Wang-Landau and adaptively biased molecular-dynamics (ABMD) methods are both found efficient if multiple walkers and replication and deletion schemes are used. An extension of the ABMD technique to quantum systems, implemented through the path-integral MD framework, is presented and tested on Ne38 against the quantum superposition method.
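A schematic 1D adaptive-bias run on a double-well potential conveys the flavor of such methods; this metadynamics-style sketch is a stand-in, not the paper's ABMD or Wang-Landau implementation, and uses classical statistics only.

```python
import numpy as np

# Overdamped Langevin dynamics in a 1D double well V(x) = (x^2 - 1)^2, with an
# adaptive bias built by periodically depositing Gaussian hills on the reaction
# coordinate. As the bias fills the wells, -bias(x) approaches the free-energy
# profile up to an additive constant.
rng = np.random.default_rng(0)
kT, dt, n_steps = 0.2, 1e-3, 200_000
hill_height, hill_width, deposit_every = 0.02, 0.1, 500
centers = []

def grad_V(x):
    return 4.0 * x * (x**2 - 1.0)

def grad_bias(x):
    if not centers:
        return 0.0
    c = np.asarray(centers)
    g = hill_height * np.exp(-((x - c) ** 2) / (2 * hill_width**2))
    return float(np.sum(g * -(x - c) / hill_width**2))

x = -1.0
for step in range(n_steps):
    force = -(grad_V(x) + grad_bias(x))
    x += force * dt + np.sqrt(2 * kT * dt) * rng.normal()
    if step % deposit_every == 0:
        centers.append(x)

grid = np.linspace(-1.6, 1.6, 9)
c = np.asarray(centers)
bias = hill_height * np.exp(-((grid[:, None] - c) ** 2) / (2 * hill_width**2)).sum(axis=1)
free_energy = bias.max() - bias          # -bias(x), shifted so the minimum is zero
print(np.round(np.c_[grid, free_energy], 2))
```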
Potential and viscous flow in VTOL, STOL or CTOL propulsion system inlets
NASA Technical Reports Server (NTRS)
Stockman, N. O.
1975-01-01
A method was developed for analyzing the flow in subsonic axisymmetric inlets at arbitrary conditions of freestream velocity, incidence angle, and inlet mass flow. An improved version of the method is discussed and comparisons of results obtained with the original and improved methods are given. Comparisons with experiments are also presented for several inlet configurations and for various conditions of the boundary layer from insignificant to separated. Applications of the method are discussed, with several examples given for specific cases involving inlets for VTOL lift fans and for STOL engine nacelles.
NASA Astrophysics Data System (ADS)
Kraiskii, A. V.; Mironova, T. V.; Sultanov, T. T.
2010-09-01
A method is suggested for determining the wavelength of narrow-band light from a digital photograph of a radiating surface. The digital camera used should be appropriately calibrated. The accuracy of the wavelength measurement is better than 1 nm. The method was tested on the yellow doublet of mercury spectrum and on the adjacent continuum of the incandescent lamp radiation spectrum. By means of the method suggested the homogeneity of holographic sensor swelling was studied in stationary and transient cases.
A controlled experiment in ground water flow model calibration
Hill, M.C.; Cooley, R.L.; Pollock, D.W.
1998-01-01
Nonlinear regression was introduced to ground water modeling in the 1970s, but has been used very little to calibrate numerical models of complicated ground water systems. Apparently, nonlinear regression is thought by many to be incapable of addressing such complex problems. With what we believe to be the most complicated synthetic test case used for such a study, this work investigates using nonlinear regression in ground water model calibration. Results of the study fall into two categories. First, the study demonstrates how systematic use of a well designed nonlinear regression method can indicate the importance of different types of data and can lead to successive improvement of models and their parameterizations. Our method differs from previous methods presented in the ground water literature in that (1) weighting is more closely related to expected data errors than is usually the case; (2) defined diagnostic statistics allow for more effective evaluation of the available data, the model, and their interaction; and (3) prior information is used more cautiously. Second, our results challenge some commonly held beliefs about model calibration. For the test case considered, we show that (1) field measured values of hydraulic conductivity are not as directly applicable to models as their use in some geostatistical methods imply; (2) a unique model does not necessarily need to be identified to obtain accurate predictions; and (3) in the absence of obvious model bias, model error was normally distributed. The complexity of the test case involved implies that the methods used and conclusions drawn are likely to be powerful in practice.
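The flavor of regression-based calibration with error-based weighting can be illustrated with a generic weighted nonlinear least-squares fit; the toy model and data below are assumptions and do not represent the paper's synthetic ground water test case.

```python
import numpy as np
from scipy.optimize import least_squares

# Generic weighted nonlinear regression: residuals are scaled by the expected
# standard deviation of each observation, so well-known measurements count more
# than noisy ones. The "model" here is a toy saturating curve standing in for a
# numerical ground water model.
def model(params, t):
    a, b = params
    return a * (1.0 - np.exp(-b * t))

t_obs = np.array([1.0, 2.0, 5.0, 10.0, 20.0, 50.0])
h_obs = np.array([0.9, 1.7, 3.1, 4.0, 4.6, 4.9])
sigma = np.array([0.05, 0.05, 0.10, 0.10, 0.20, 0.20])   # expected data errors (assumed)

def weighted_residuals(params):
    return (model(params, t_obs) - h_obs) / sigma

fit = least_squares(weighted_residuals, x0=[1.0, 0.1])
print("estimated parameters:", fit.x)
```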
Astronomical Methods in Aerial Navigation
NASA Technical Reports Server (NTRS)
Beij, K Hilding
1925-01-01
The astronomical method of determining position is universally used in marine navigation and may also be of service in aerial navigation. The practical application of the method, however, must be modified and adapted to conform to the requirements of aviation. Much of this work of adaptation has already been accomplished, but being scattered through various technical journals in a number of languages, is not readily available. This report is for the purpose of collecting under one cover such previous work as appears to be of value to the aerial navigator, comparing instruments and methods, indicating the best practice, and suggesting future developments. The various methods of determining position and their application and value are outlined, and a brief resume of the theory of the astronomical method is given. Observation instruments are described in detail. A complete discussion of the reduction of observations follows, including a rapid method of finding position from the altitudes of two stars. Maps and map cases are briefly considered. A bibliography of the subject is appended.
Hypospadias and Residential Proximity to Pesticide Applications
Yang, Wei; Roberts, Eric M.; Kegley, Susan E.; Wolff, Craig; Guo, Liang; Lammer, Edward J.; English, Paul; Shaw, Gary M.
2013-01-01
BACKGROUND: Experimental evidence suggests pesticides may be associated with hypospadias. OBJECTIVE: Examine the association of hypospadias with residential proximity to commercial agricultural pesticide applications. METHODS: The study population included male infants born from 1991 to 2004 to mothers residing in 8 California counties. Cases (n = 690) were ascertained by the California Birth Defects Monitoring Program; controls were selected randomly from the birth population (n = 2195). We determined early pregnancy exposure to pesticide applications within a 500-m radius of mother’s residential address, using detailed data on applications and land use. Associations with exposures to physicochemical groups of pesticides and specific chemicals were assessed using logistic regression adjusted for maternal race or ethnicity and age and infant birth year. RESULTS: Forty-one percent of cases and controls were classified as exposed to 57 chemical groups and 292 chemicals. Despite >500 statistical comparisons, there were few elevated odds ratios with confidence intervals that excluded 1 for chemical groups or specific chemicals. Those that did were for monochlorophenoxy acid or ester herbicides; the insecticides aldicarb, dimethoate, phorate, and petroleum oils; and adjuvant polyoxyethylene sorbitol among all cases; 2,6-dinitroaniline herbicides, the herbicide oxyfluorfen, and the fungicide copper sulfate among mild cases; and chloroacetanilide herbicides, polyalkyloxy compounds used as adjuvants, the insecticides aldicarb and acephate, and the adjuvant nonyl-phenoxy-poly(ethylene oxy)ethanol among moderate and severe cases. Odds ratios ranged from 1.9 to 2.9. CONCLUSIONS: Most pesticides were not associated with elevated hypospadias risk. For the few that were associated, results should be interpreted with caution until replicated in other study populations. PMID:24167181
Size-extensive QCISDT — implementation and application
NASA Astrophysics Data System (ADS)
Cremer, Dieter; He, Zhi
1994-05-01
A size-extensive quadratic CI method with single (S), double (D), and triple (T) excitations, QCISDT, has been derived by appropriate cancellation of disconnected terms in the CISDT projection equations. Matrix elements of the new QCI method have been evaluated in terms of two-electron integrals and applied to a number of atoms and small molecules. While QCISDT results are of similar accuracy to CCSDT results, the new method is easier to implement, converges in many cases faster and, thereby, leads to advantages compared to CCSDT.
NASA Technical Reports Server (NTRS)
Rosenfeld, Moshe
1990-01-01
The main goals are the development, validation, and application of a fractional step solution method of the time-dependent incompressible Navier-Stokes equations in generalized coordinate systems. A solution method that combines a finite volume discretization with a novel choice of the dependent variables and a fractional step splitting to obtain accurate solutions in arbitrary geometries is extended to include more general situations, including cases with moving grids. The numerical techniques are enhanced to gain efficiency and generality.
Electrochemical Disposal of Hydrazines in Water
NASA Technical Reports Server (NTRS)
Kim, Jinseong; Gonzalez-Mar, Anuncia; Salinas, Carlos; Rutherford, Larris; Jeng, King-Tsai; Andrews, Craig; Yalamanchili, Ratlaya
2007-01-01
An electrochemical method for the disposal of hydrazines dissolved in water has been devised. The method is applicable to hydrazine (N2H4), to monomethyl hydrazine [also denoted by MMH or by its chemical formula, (CH3)HNNH2], and to unsymmetrical dimethyl hydrazine [also denoted UDMH or by its chemical formula, (CH3)2NNH2]. The method involves a room-temperature process that converts the hydrazine to the harmless products N2, H2O, and, in some cases, CO2.
[Analysis of EML4-ALK gene fusion mutation in patients with non-small cell lung cancer].
Wang, Xuzhou; Chen, Weisheng; Yu, Yinghao
2015-02-01
Non-small cell lung cancer (NSCLC) is the main type of lung cancer, and detection of mutations at related loci has become an important direction in molecular targeted therapy. This study examined the gene fusion status of echinoderm microtubule-associated protein-like 4-anaplastic lymphoma kinase (EML4-ALK) and the mutation status of the epidermal growth factor receptor (EGFR) gene, and assessed the sensitivity of detecting EML4-ALK gene fusion and EGFR gene mutation. EML4-ALK gene fusion was detected in 85 cases of paraffin-embedded tumor tissue and adjacent lung tissue using immunohistochemistry (IHC), Scorpions amplification refractory mutation system (Scorpions ARMS) fluorescence quantitative PCR, and fluorescence in situ hybridization (FISH); EGFR mutation status in exons 18, 19, 20 and 21 was detected using the ARMS method. In 115 cases of NSCLC, IHC showed 32 cases with ALK (D5F3) expression, an expression rate of 27.8%; ARMS showed 27 cases with EML4-ALK fusion gene mutation, a detection rate of 23.5%; and 53 cases were detected with EGFR mutation, a mutation rate of 46%. FISH showed 23 cases with EML4-ALK fusion gene mutation, a detection rate of 20%, slightly lower than the ARMS result, suggesting that ARMS is more sensitive. The combined application of IHC, ARMS fluorescence quantitative PCR and FISH allows rapid and accurate evaluation of EML4-ALK gene fusion.
Application of the Hilbert-Huang Transform to Financial Data
NASA Technical Reports Server (NTRS)
Huang, Norden
2005-01-01
A paper discusses the application of the Hilbert-Huang transform (HHT) method to time-series financial-market data. The method was described, variously without and with the HHT name, in several prior NASA Tech Briefs articles and supporting documents. To recapitulate: The method is especially suitable for analyzing time-series data that represent nonstationary and nonlinear phenomena including physical phenomena and, in the present case, financial-market processes. The method involves the empirical mode decomposition (EMD), in which a complicated signal is decomposed into a finite number of functions, called "intrinsic mode functions" (IMFs), that admit well-behaved Hilbert transforms. The HHT consists of the combination of EMD and Hilbert spectral analysis. The local energies and the instantaneous frequencies derived from the IMFs through Hilbert transforms can be used to construct an energy-frequency-time distribution, denoted a Hilbert spectrum. The instant paper begins with a discussion of prior approaches to quantification of market volatility, summarizes the HHT method, then describes the application of the method in performing time-frequency analysis of mortgage-market data from the years 1972 through 2000. Filtering by use of the EMD is shown to be useful for quantifying market volatility.
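Given IMFs from an EMD implementation, the Hilbert step reduces to computing instantaneous amplitude and frequency from the analytic signal; the sketch below uses a synthetic chirp standing in for one IMF of a financial series, and the sampling rate is an assumption.

```python
import numpy as np
from scipy.signal import hilbert

# Hilbert step of the HHT: given one intrinsic mode function (IMF), form the
# analytic signal and read off instantaneous amplitude and frequency. The IMF
# below is a synthetic chirp; the EMD step that would extract IMFs from a
# mortgage-market series is not reproduced here.
fs = 12.0                              # samples per year (monthly data assumed)
t = np.arange(0, 20, 1 / fs)           # 20 "years" of monthly observations
imf = np.sin(2 * np.pi * (0.5 + 0.05 * t) * t)

analytic = hilbert(imf)
amplitude = np.abs(analytic)                               # instantaneous energy envelope
phase = np.unwrap(np.angle(analytic))
inst_freq = np.diff(phase) / (2 * np.pi) * fs              # cycles per year

print("mean instantaneous frequency (cycles/yr):", round(float(inst_freq.mean()), 2))
```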
DOE Office of Scientific and Technical Information (OSTI.GOV)
Li, Y; Department of Engineering Physics, Tsinghua University, Beijing; Tian, Z
Purpose: Acuros BV has become available to perform accurate dose calculations in high-dose-rate (HDR) brachytherapy with phantom heterogeneity considered by solving the Boltzmann transport equation. In this work, we performed validation studies regarding the dose calculation accuracy of Acuros BV in cases with a shielded cylinder applicator using Monte Carlo (MC) simulations. Methods: Fifteen cases were considered in our studies, covering five different diameters of the applicator and three different shielding degrees. For each case, a digital phantom was created in Varian BrachyVision with the cylinder applicator inserted in the middle of a large water phantom. A treatment plan with eight dwell positions was generated for these fifteen cases. Dose calculations were performed with Acuros BV. We then generated a voxelized phantom of the same geometry, and the materials were modeled according to the vendor's specifications. MC dose calculations were then performed using our in-house developed fast MC dose engine for HDR brachytherapy (gBMC) on a GPU platform, which is able to simulate both photon transport and electron transport in a voxelized geometry. A phase-space file for the Ir-192 HDR source was used as a source model for MC simulations. Results: Satisfactory agreement between the dose distributions calculated by Acuros BV and those calculated by gBMC was observed in all cases. Quantitatively, we computed the point-wise dose difference within the region that receives a dose higher than 10% of the reference dose, defined as the dose at 5 mm outward from the applicator surface. The mean dose difference was ∼0.45%–0.51% and the 95th-percentile maximum difference was ∼1.24%–1.47%. Conclusion: Acuros BV is able to accurately perform dose calculations in HDR brachytherapy with a shielded cylinder applicator.
Mincewicz, Grzegorz; Rumiński, Jacek; Krzykowski, Grzegorz
2012-02-01
Recently, we described a model system that included corrections of high-resolution computed tomography (HRCT) bronchial measurements based on the adjusted subpixel method (ASM). The aim of this study was to verify the clinical application of ASM by comparing bronchial measurements obtained by means of the traditional eye-driven method, the subpixel method alone, and ASM in a group comprising bronchial asthma patients and healthy individuals. The study included 30 bronchial asthma patients and a control group of 20 volunteers with no symptoms of asthma. The lowest internal and external diameters of the bronchial cross-sections (ID and ED) and their derivative parameters were determined in HRCT scans using (1) the traditional eye-driven method, (2) the subpixel technique, and (3) ASM. In the case of the eye-driven method, lower ID values along with lower bronchial lumen area and its percentage ratio to total bronchial area were the basic parameters that differed between asthma patients and healthy controls. In the case of the subpixel method and ASM, the two groups did not differ significantly in terms of ID. Significant differences were observed in the values of ED and total bronchial area, with both parameters being significantly higher in asthma patients. Compared with ASM, the eye-driven method overstated the values of ID and ED by about 30% and 10%, respectively, while understating bronchial wall thickness by about 18%. The results obtained in this study suggest that the traditional eye-driven method of HRCT-based measurement of bronchial tree components probably overstates the degree of bronchial patency in asthma patients. Copyright © 2011 Elsevier Ireland Ltd. All rights reserved.
Yates, S R
2009-01-01
An analytical solution describing the fate and transport of pesticides applied to soils has been developed. Two pesticide application methods can be simulated: point-source applications, such as an idealized shank or a hot-gas injection method, and a more realistic shank-source application method that includes a vertical pesticide distribution in the soil domain due to a soil fracture caused by a shank. The solutions allow determination of the volatilization rate and other information that could be important for understanding fumigant movement and in the development of regulatory permitting conditions. The solutions can be used to characterize differences in emissions relative to changes in the soil degradation rate, surface barrier conditions, application depth, and soil packing. In some cases, simple algebraic expressions are provided that can be used to obtain the total emissions and total soil degradation. The solutions provide a consistent methodology for determining the total emissions and can be used with other information, such as field and laboratory experimental data, to support the development of fumigant regulations. The use of the models is illustrated by several examples.
Barigye, Stephen J; Freitas, Matheus P; Ausina, Priscila; Zancan, Patricia; Sola-Penna, Mauro; Castillo-Garit, Juan A
2018-02-12
We recently generalized the formerly alignment-dependent multivariate image analysis applied to quantitative structure-activity relationships (MIA-QSAR) method through the application of the discrete Fourier transform (DFT), allowing for its application to noncongruent and structurally diverse chemical compound data sets. Here we report the first practical application of this method in the screening of molecular entities of therapeutic interest, with human aromatase inhibitory activity as the case study. We developed an ensemble classification model based on the two-dimensional (2D) DFT MIA-QSAR descriptors, with which we screened the NCI Diversity Set V (1593 compounds) and obtained 34 chemical compounds with possible aromatase inhibitory activity. These compounds were docked into the aromatase active site, and the 10 most promising compounds were selected for in vitro experimental validation. Of these compounds, 7419 (nonsteroidal) and 89 201 (steroidal) demonstrated satisfactory antiproliferative and aromatase inhibitory activities. The obtained results suggest that the 2D-DFT MIA-QSAR method may be useful in ligand-based virtual screening of new molecular entities of therapeutic utility.
Application of phyto-indication and radiocesium indicative methods for microrelief mapping
NASA Astrophysics Data System (ADS)
Panidi, E.; Trofimetz, L.; Sokolova, J.
2016-04-01
Remote sensing technologies are widely used for production of Digital Elevation Models (DEMs), and geomorphometry techniques are valuable tools for DEM analysis. One of the broadly used applications of these technologies and techniques is relief mapping. In the simplest case, we can identify relief structures using DEM analysis, and produce a map or map series to show the relief condition. However, traditional techniques might fail when used for mapping microrelief structures (structures below ten meters in size). In this case high microrelief dynamics lead to technological and conceptual difficulties. Moreover, erosion of microrelief structures cannot be detected at the initial evolution stage using DEM modelling and analysis only. In our study, we investigate the possibilities and specific techniques for allocation of erosion microrelief structures, and mapping techniques for the microrelief derivatives (e.g. quantitative parameters of microrelief). Our toolset includes the analysis of spatial redistribution of the soil pollutants and phyto-indication analysis, which complement the common DEM modelling and geomorphometric analysis. We use field surveys produced at the test area, which is arable territory with high erosion risks. Our main conclusion at the current stage is that the indicative methods (i.e. radiocesium and phyto-indication methods) are effective for allocation of the erosion microrelief structures. Also, these methods need to be formalized for convenient use.
Chan, Siew Foong; Deeks, Jonathan J; Macaskill, Petra; Irwig, Les
2008-01-01
The aim was to compare three predictive models based on logistic regression for estimating adjusted likelihood ratios that allow for interdependency between diagnostic variables (tests). This study was a review of the theoretical basis, assumptions, and limitations of published models, a statistical extension of the methods, and an application to a case study of the diagnosis of obstructive airways disease based on history and clinical examination. Albert's method includes an offset term to estimate an adjusted likelihood ratio for combinations of tests. The Spiegelhalter and Knill-Jones method uses the unadjusted likelihood ratio for each test as a predictor and computes shrinkage factors to allow for interdependence. Knottnerus' method differs from the other methods because it requires sequencing of tests, which limits its application to situations where there are few tests and substantial data. Although parameter estimates differed between the models, predicted "posttest" probabilities were generally similar. Construction of predictive models using logistic regression is preferred to the independence Bayes' approach when it is important to adjust for dependency of test errors. Methods to estimate adjusted likelihood ratios from predictive models should be considered in preference to a standard logistic regression model to facilitate ease of interpretation and application. Albert's method provides the most straightforward approach.
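The shared idea of converting a fitted logistic model into an adjusted likelihood ratio for a combination of correlated tests can be sketched generically; this is not a reproduction of Albert's offset model or the Spiegelhalter-Knill-Jones shrinkage estimator, and the data are synthetic.

```python
import numpy as np
from sklearn.linear_model import LogisticRegression

# Generic illustration: a logistic model fitted to correlated test results gives
# a posttest probability for a given test-result profile; dividing posttest odds
# by pretest odds yields a likelihood ratio that already accounts for the
# dependence between tests. Variable names and prevalences are assumptions.
rng = np.random.default_rng(0)
n = 2000
disease = rng.binomial(1, 0.3, n)
wheeze  = rng.binomial(1, np.where(disease == 1, 0.7, 0.2))
smoker  = rng.binomial(1, np.where(disease == 1, 0.6, 0.3))
X = np.column_stack([wheeze, smoker])

fit = LogisticRegression().fit(X, disease)
pretest = disease.mean()
posttest = fit.predict_proba([[1, 1]])[0, 1]          # both findings present
adjusted_lr = (posttest / (1 - posttest)) / (pretest / (1 - pretest))
print(f"adjusted LR for wheeze+ and smoker+ = {adjusted_lr:.2f}")
```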
Swartman, B; Frere, D; Wei, W; Schnetzke, M; Beisemann, N; Keil, H; Franke, J; Grützner, P A; Vetter, S Y
2017-10-01
A new software application can be used without fixed reference markers or a registration process in wire placement. The aim was to compare placement of Kirschner wires (K-wires) into the proximal femur with the software application versus the conventional method without guiding. As the study hypothesis, we assumed fewer placement attempts, shorter procedure time and shorter fluoroscopy time using the software, with the same precision inside a proximal femur bone model. The software detects a K-wire within the 2D fluoroscopic image. By evaluating its direction and tip location, it superimposes a trajectory on the image, visualizing the intended direction of the K-wire. The K-wire was positioned in 20 artificial bones with the use of the software by one surgeon; 20 bones served as conventional controls. A brass thumb tack was placed into the femoral head and its tip targeted with the wire. The number of placement attempts, duration of the procedure, duration of fluoroscopy and distance to the target in a postoperative 3D scan were recorded. Compared with the conventional method, use of the application showed fewer attempts for optimal wire placement (p=0.026), shorter duration of surgery (p=0.004), shorter fluoroscopy time (p=0.024) and higher precision (p=0.018). Final wire position was achieved on the first attempt in 17 out of 20 cases with the software and in 9 out of 20 cases with the conventional method. The study hypothesis was confirmed. The new application optimised the process of K-wire placement in the proximal femur in an artificial bone model while also improving precision. Benefits lie especially in the reduction of placement attempts and the reduction of fluoroscopy time, under the aspect of radiation protection. The software runs on a conventional image intensifier and can therefore be easily integrated into the daily surgical routine. Copyright © 2017 Elsevier Ltd. All rights reserved.
A Maximum Entropy Method for Particle Filtering
NASA Astrophysics Data System (ADS)
Eyink, Gregory L.; Kim, Sangil
2006-06-01
Standard ensemble or particle filtering schemes do not properly represent states of low prior probability when the number of available samples is too small, as is often the case in practical applications. We introduce here a set of parametric resampling methods to solve this problem. Motivated by a general H-theorem for relative entropy, we construct parametric models for the filter distributions as maximum-entropy/minimum-information models consistent with moments of the particle ensemble. When the prior distributions are modeled as mixtures of Gaussians, our method naturally generalizes the ensemble Kalman filter to systems with highly non-Gaussian statistics. We apply the new particle filters presented here to two simple test cases: a one-dimensional diffusion process in a double-well potential and the three-dimensional chaotic dynamical system of Lorenz.
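A minimal version of the parametric-resampling idea for a scalar double-well system: after weighting, the ensemble is replaced by samples from the maximum-entropy density consistent with its first two moments (a Gaussian); the paper's mixture-of-Gaussians generalization is not shown, and all parameters are assumptions.

```python
import numpy as np

# Toy particle filter with maximum-entropy (moment-matched) resampling for the
# scalar double-well system dx = -V'(x) dt + noise, observed with Gaussian error.
# After weighting, the ensemble is replaced by draws from the maximum-entropy
# density consistent with its weighted mean and variance, i.e. a Gaussian.
rng = np.random.default_rng(0)
dt, obs_sigma, n_particles, n_steps = 0.1, 0.5, 200, 50

def drift(x):                      # -V'(x) for V(x) = (x^2 - 1)^2
    return -4.0 * x * (x**2 - 1.0)

x_true = 1.0
particles = rng.normal(0.0, 1.0, n_particles)

for _ in range(n_steps):
    # propagate truth and particles with the same stochastic dynamics
    x_true += drift(x_true) * dt + np.sqrt(dt) * 0.5 * rng.normal()
    particles += drift(particles) * dt + np.sqrt(dt) * 0.5 * rng.normal(size=n_particles)
    y = x_true + obs_sigma * rng.normal()                     # noisy observation
    w = np.exp(-0.5 * ((y - particles) / obs_sigma) ** 2)     # likelihood weights
    w /= w.sum()
    mean = np.sum(w * particles)                              # weighted first moment
    var = np.sum(w * (particles - mean) ** 2)                 # weighted second moment
    particles = rng.normal(mean, np.sqrt(var), n_particles)   # max-entropy resampling

print(f"true state = {x_true:.2f}, filter mean = {particles.mean():.2f}")
```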
Experiences with leak rate calculations methods for LBB application
DOE Office of Scientific and Technical Information (OSTI.GOV)
Grebner, H.; Kastner, W.; Hoefler, A.
1997-04-01
In this paper, three leak rate computer programs for the application of leak before break analysis are described and compared. The programs are compared to each other and to results of an HDR Reactor experiment and two real crack cases. The programs analyzed are PIPELEAK, FLORA, and PICEP. Generally, the different leak rate models are in agreement. To obtain reasonable agreement between measured and calculated leak rates, it was necessary to also use data from detailed crack investigations.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Melaina, Marc; Saur, Genevieve; Ramsden, Todd
2015-05-28
This presentation summarizes NREL's hydrogen and fuel cell analysis work in three areas: resource potential, greenhouse gas emissions and cost of delivered energy, and influence of auxiliary revenue streams. NREL's hydrogen and fuel cell analysis projects focus on low-carbon and economic transportation and stationary fuel cell applications. Analysis tools developed by the lab provide insight into the degree to which bridging markets can strengthen the business case for fuel cell applications.
Transfer Hydro-dehalogenation of Organic Halides Catalyzed by Ruthenium(II) Complex.
You, Tingjie; Wang, Zhenrong; Chen, Jiajia; Xia, Yuanzhi
2017-02-03
A simple and efficient Ru(II)-catalyzed transfer hydro-dehalogenation of organic halides using 2-propanol solvent as the hydride source was reported. This methodology is applicable for hydro-dehalogenation of a variety of aromatic halides and α-haloesters and amides without additional ligand, and quantitative yields were achieved in many cases. The potential synthetic application of this method was demonstrated by efficient gram-scale transformation with catalyst loading as low as 0.5 mol %.
The Science Manager's Guide to Case Studies
DOE Office of Scientific and Technical Information (OSTI.GOV)
Branch, Kristi M.; Peffers, Melissa S.; Ruegg, Rosalie T.
2001-09-24
This guide takes the science manager through the steps of planning, implementing, validating, communicating, and using case studies. It outlines the major methods of analysis, describing their relative merits and applicability while providing relevant examples and sources of additional information. Well-designed case studies can provide a combination of rich qualitative and quantitative information, offering valuable insights into the nature, outputs, and longer-term impacts of the research. An objective, systematic, and credible approach to the evaluation of U.S. Department of Energy Office of Science programs adds value to the research process and is the subject of this guide.
Use of miniplates as a method for orthodontic anchorage: a case report
Peres, Fernando Gianzanti; Padovan, Luis Eduardo Marques; Kluppel, Leandro Eduardo; Albuquerque, Gustavo Calvalcanti; de Souza, Paulo Cesar Ulson; Claudino, Marcela
2016-01-01
Introduction: Temporary anchorage devices (TADs) have been developed to be used as direct adjuncts in orthodontic treatment and have facilitated treatment of more complex orthodontic cases, including patients with dental impaction. Objectives: This clinical case reports the applicability of TADs in the orthodontic treatment of a patient with impacted mandibular second molars. Surgical and orthodontic procedures related to the use of miniplates were also discussed in this study. Conclusions: The use of temporary anchorage devices, such as miniplates, can be suggested as an alternative to treat patients with impacted mandibular second molars. PMID:27901235
A method for determining spiral-bevel gear tooth geometry for finite element analysis
NASA Technical Reports Server (NTRS)
Handschuh, Robert F.; Litvin, Faydor L.
1991-01-01
An analytical method was developed to determine gear tooth surface coordinates of face-milled spiral bevel gears. The method uses the basic gear design parameters in conjunction with the kinematical aspects of spiral bevel gear manufacturing machinery. A computer program, SURFACE, was developed. The computer program calculates the surface coordinates and outputs 3-D model data that can be used for finite element analysis. Development of the modeling method and an example case are presented. This analysis method could also find application for gear inspection and near-net-shape gear forging die design.
Kaltenbacher, Barbara; Kaltenbacher, Manfred; Sim, Imbo
2013-01-01
We consider the second order wave equation in an unbounded domain and propose an advanced perfectly matched layer (PML) technique for its efficient and reliable simulation. In doing so, we concentrate on the time domain case and use the finite-element (FE) method for the space discretization. Our un-split-PML formulation requires four auxiliary variables within the PML region in three space dimensions. For a reduced version (rPML), we present a long time stability proof based on an energy analysis. The numerical case studies and an application example demonstrate the good performance and long time stability of our formulation for treating open domain problems. PMID:23888085
Ewald method for polytropic potentials in arbitrary dimensionality
NASA Astrophysics Data System (ADS)
Osychenko, O. N.; Astrakharchik, G. E.; Boronat, J.
2012-02-01
The Ewald summation technique is generalized to power-law 1/|r|^k potentials in three-, two- and one-dimensional geometries with explicit formulae for all the components of the sums. The cases of short-range, long-range and 'marginal' interactions are treated separately. The jellium model, as a particular case of a charge-neutral system, is discussed and the explicit forms of the Ewald sums for such a system are presented. A generalized form of the Ewald sums for a non-cubic (non-square) simulation cell for three- (two-) dimensional geometry is obtained and its possible field of application is discussed. A procedure for the optimization of the involved parameters in actual simulations is developed and an example of its application is presented.
The wavefield of acoustic logging in a cased hole with a single casing—Part II: a dipole tool
NASA Astrophysics Data System (ADS)
Wang, Hua; Fehler, Michael
2018-02-01
The acoustic method, being the most effective method for cement bond evaluation, has been used by industry for more than half a century. However, the methods currently used are almost always focused on the first arrival (especially for sonic logging), which has limitations. We use a 3-D finite-difference method to numerically simulate the wavefields from a dipole source in a single-cased hole with different cement conditions. By using wavefield snapshots and dispersion curves, we interpret the characteristics of the modes in the models. We investigate the effects of source frequency and of the thickness and location of fluid columns on the different modes. The dipole wavefield in a single-cased hole consists of a leaky P (for frequency >10 kHz) from the formation, the formation flexural mode, and also some casing modes. Depending on the mode, their behaviour is sometimes sensitive to the existence of fluid between the cement and formation and sometimes sensitive to the existence of fluid between the casing and cement. The formation S velocity can be obtained from the formation flexural mode at low frequency. However, interference from high-order casing modes makes the leaky P invisible and P velocity determination difficult when the casing is not well cemented. The dispersion curve of the formation flexural mode is sensitive to the fluid thickness when fluid exists only at the interface between casing and cement. The fundamental casing dipole mode is only sensitive to the total fluid thickness in the annulus between casing and formation. Either the arrival time or amplitude of the high-order casing dipole mode is sensitive to the fluid column when the fluid column is next to the casing. We provide a table that summarizes the ability of different modes to detect fluid columns between various layers of casing, cement and formation. Based on the results, we suggest a data processing flow for field application, which will greatly improve cement evaluation.
Nuclear quantum effects and kinetic isotope effects in enzyme reactions.
Vardi-Kilshtain, Alexandra; Nitoker, Neta; Major, Dan Thomas
2015-09-15
Enzymes are extraordinarily effective catalysts evolved to perform well-defined and highly specific chemical transformations. Studying the nature of rate enhancements and the mechanistic strategies in enzymes is very important, both from a basic scientific point of view and in order to improve the rational design of biomimetics. The kinetic isotope effect (KIE) is a very important tool in the study of chemical reactions and has been used extensively in the field of enzymology. Theoretically, the prediction of KIEs in condensed phase environments such as enzymes is challenging due to the need to include nuclear quantum effects (NQEs). Herein we describe recent progress in our group in the development of multi-scale simulation methods for the calculation of NQEs and accurate computation of KIEs. We also describe their application to several enzyme systems. In particular we describe the use of combined quantum mechanics/molecular mechanics (QM/MM) methods in classical and quantum simulations. The development of various novel path-integral methods is reviewed. These methods are tailored to enzyme systems, where only a few degrees of freedom involved in the chemistry need to be quantized. The application of the hybrid QM/MM quantum-classical simulation approach to three case studies is presented. The first case involves the proton transfer in alanine racemase. The second case involves orotidine 5'-monophosphate decarboxylase, where multidimensional free energy simulations are combined with kinetic isotope effects to study the reaction mechanism. Finally, we discuss the proton transfer in nitroalkane oxidase, where the enzyme employs tunneling as a catalytic fine-tuning tool. Copyright © 2015 Elsevier Inc. All rights reserved.
THE APPLICATION OF BIOMONITORING DATA IN RISK ASSESSMENT: AN EXPANDED CASE-STUDY WITH BENZENE
Improved analytical methods permit the measurement of low levels of chemicals in human tissues. Despite evidence that chemicals are absorbed, it is unclear whether the relatively low levels detected in human tissue represent a potential adverse health risk. Furthermore, without...
Analyzing Faculty Salaries When Statistics Fail.
ERIC Educational Resources Information Center
Simpson, William A.
The role played by nonstatistical procedures, in contrast to multivariate statistical approaches, in analyzing faculty salaries is discussed. Multivariate statistical methods are usually used to establish or defend against prima facie cases of gender and ethnic discrimination with respect to faculty salaries. These techniques are not applicable,…
Tile-based parallel coordinates and its application in financial visualization
NASA Astrophysics Data System (ADS)
Alsakran, Jamal; Zhao, Ye; Zhao, Xinlei
2010-01-01
The parallel coordinates technique has been widely used in information visualization applications and has achieved great success in visualizing multivariate data and perceiving their trends. Nevertheless, visual clutter usually weakens or even diminishes its ability when the data size increases. In this paper, we first propose tile-based parallel coordinates, where the plotting area is divided into rectangular tiles. Each tile stores an intersection density that counts the total number of polylines intersecting with that tile. Consequently, the intersection density is mapped to optical attributes, such as color and opacity, by interactive transfer functions. The method visualizes the polylines efficiently and informatively in accordance with the density distribution, and thus reduces visual clutter and promotes knowledge discovery. The interactivity of our method allows the user to instantaneously manipulate the tile distribution and the transfer functions. Notably, classic parallel coordinates rendering is a special case of our method in which each tile represents only one pixel. A case study on a real-world data set, U.S. stock mutual fund data of year 2006, is presented to show the capability of our method in visually analyzing financial data. The presented visual analysis is conducted by an expert in the domain of finance. Our method has gained support from professionals in the finance field, who embrace it as a potential investment analysis tool for mutual fund managers, financial planners, and investors.
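A minimal sketch of the tile-density computation described above, assuming NumPy: each pair of adjacent axes gets its own tile grid, every polyline increments each tile it crosses at most once, and a simple power-law transfer function maps density to opacity. The tile counts, sampling resolution, and gamma value are illustrative choices, not the authors' implementation.

    import numpy as np

    def tile_densities(data, n_tiles_x=60, n_tiles_y=40, samples_per_segment=64):
        """Count, for each pair of adjacent axes, how many polylines cross each
        tile of the plotting area.  Simplified sketch: segments are sampled at
        discrete points instead of being rasterized exactly."""
        n, d = data.shape
        lo, hi = data.min(axis=0), data.max(axis=0)
        norm = (data - lo) / np.where(hi > lo, hi - lo, 1.0)   # each axis to [0, 1]
        t = np.linspace(0.0, 1.0, samples_per_segment)
        densities = []
        for a in range(d - 1):                      # one tile grid per axis pair
            grid = np.zeros((n_tiles_y, n_tiles_x), dtype=int)
            ix = np.minimum((t * n_tiles_x).astype(int), n_tiles_x - 1)
            for i in range(n):
                ys = norm[i, a] * (1 - t) + norm[i, a + 1] * t  # straight segment
                iy = np.minimum((ys * n_tiles_y).astype(int), n_tiles_y - 1)
                uniq = np.unique(np.stack([iy, ix], axis=1), axis=0)
                grid[uniq[:, 0], uniq[:, 1]] += 1   # each polyline counted once per tile
            densities.append(grid)
        return densities

    def opacity(grid, gamma=0.5):
        """A simple transfer function mapping tile density to [0, 1] opacity."""
        return (grid / max(grid.max(), 1)) ** gamma

    # toy usage on random multivariate data
    data = np.random.default_rng(1).normal(size=(500, 4))
    print([g.max() for g in tile_densities(data)])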
Time reversal imaging and cross-correlations techniques by normal mode theory
NASA Astrophysics Data System (ADS)
Montagner, J.; Fink, M.; Capdeville, Y.; Phung, H.; Larmat, C.
2007-12-01
Time-reversal methods were successfully applied in the past to acoustic waves in many fields such as medical imaging, underwater acoustics, nondestructive testing and, recently, to seismic waves in seismology for earthquake imaging. The increasing power of computers and numerical methods (such as spectral element methods) enables one to simulate more and more accurately the propagation of seismic waves in heterogeneous media and to develop new applications, in particular time reversal in the three-dimensional Earth. Generalizing the scalar approach of Draeger and Fink (1999), a theoretical understanding of the time-reversal method can be developed for the 3-D elastic Earth by using normal mode theory. It is shown how to relate time-reversal methods, on the one hand, to the auto-correlation of seismograms for source imaging and, on the other hand, to cross-correlation between receivers for structural imaging and retrieval of the Green function. The loss of information will be discussed. In the case of source imaging, automatic location in time and space of earthquakes and unknown sources is obtained by the time-reversal technique. In the case of big earthquakes such as the Sumatra-Andaman earthquake of December 2004, we were able to reconstruct the spatio-temporal history of the rupture. We present here some new global-scale applications of these techniques to synthetic tests and to real data.
Takamura, Ayari; Watanabe, Ken; Akutsu, Tomoko
2017-07-01
Identification of human semen is indispensable for the investigation of sexual assaults. Fluorescence staining methods using commercial kits, such as the series of SPERM HY-LITER™ kits, have been useful to detect human sperm via strong fluorescence. These kits have been examined from various forensic aspects. However, because of a lack of evaluation methods, these studies provided neither objective, quantitative descriptions of the results nor clear criteria for the decisions reached. In addition, the variety of validations was considerably limited. In this study, we conducted more advanced validations of SPERM HY-LITER™ Express using our established image analysis method. Use of this method enabled objective and specific identification of fluorescent sperm spots and quantitative comparisons of sperm detection performance under complex experimental conditions. For body fluid mixtures, we examined interference with the fluorescence staining from other body fluid components. Effects of sample decomposition were simulated under high humidity and high temperature conditions. Semen with quite low sperm concentrations, such as azoospermia and oligospermia samples, represented the most challenging cases in application of the kit. Finally, the tolerance of the kit against various acidic and basic environments was analyzed. The validations herein provide useful information for the practical applications of the SPERM HY-LITER™ Express kit, which was previously unobtainable. Moreover, the versatility of our image analysis method toward various complex cases was demonstrated.
Teaching Case: MiHotel--Applicant Processing System Design Case
ERIC Educational Resources Information Center
Miller, Robert E.; Dunn, Paul
2018-01-01
This teaching case describes the functionality of an applicant processing system designed for a fictitious hotel chain. The system detailed in the case includes a webform where applicants complete and submit job applications. The system also includes a desktop application used by hotel managers and Human Resources to track applications and process…
NASA Astrophysics Data System (ADS)
Mahardika, Harry
Hydromechanical energy can be partially converted into electromagnetic energy through the electrokinetic effect, whereby mechanical energy causes relative displacement of the charged pore water with respect to the solid skeleton of the porous material and generates an electrical current density. One application of this phenomenon is the seismoelectric method, a geophysical method in which electromagnetic signals associated with the propagation of seismic waves are recorded. Owing to its coupled nature, the seismoelectric method promises advantages in characterizing subsurface properties and geometry compared with independent seismic or electromagnetic acquisition alone. Because the recorded seismoelectric signals are sensitive to changes in water content, the method has been applied to groundwater studies over the last twenty years to delineate the vadose zone-aquifer boundary. The problem, however, is that the existing governing equations of coupled seismic and electromagnetic wave propagation do not account for unsaturated conditions or their petrophysical sensitivity to water content. In this thesis we extend the applications of the seismoelectric method to unsaturated porous media for several geophysical problems. (1) We begin with a numerical study to localize and characterize a seismic event induced by a hydraulic fracturing operation in sedimentary rocks. In this problem, we use the fully saturated form of the seismoelectric method and propose a new joint inversion scheme (seismic and seismoelectric) to determine the position and moment tensor of that event. (2) We extend the seismoelectric theory to unsaturated conditions and show that the generation of the electrical current density depends on several important petrophysical properties that are sensitive to water content. This extension of the governing equations provides the theoretical basis for a new seismoelectric approach to image the oil-water encroachment front during water flooding of an oil reservoir or of an aquifer contaminated with DNAPL. (3) Next, we present a test case, the first-attempt analysis of seismoelectric sounding measurements made in the glacial environment of Glacier de Tsanfleuron, through numerical forward modeling. Here we treat the snow-glacier environment as analogous to the vadose zone-aquifer system in an unsaturated porous medium. (4) The modified governing equations also provide the foundation for another case study, the characterization of seismoelectric events generated by water-content changes in the vadose zone, measured using seismoelectric sounding in NE England. (5) We conclude the thesis with an interpretation of the electrical signals generated by a water injection experiment carried out in the top two meters of the soil (vadose zone), using the inverse calculation presented in the first topic of the thesis. The fundamental research presented in this thesis hopefully provides a basis for further advancement of seismoelectric or joint seismic-electrical methods for applications ranging from hydrogeology to volcanology, geothermal energy, and oil and gas.
Kriston, Levente; Meister, Ramona
2014-03-01
Judging applicability (relevance) of meta-analytical findings to particular clinical decision-making situations remains challenging. We aimed to describe an evidence synthesis method that accounts for possible uncertainty regarding applicability of the evidence. We conceptualized uncertainty regarding applicability of the meta-analytical estimates to a decision-making situation as the result of uncertainty regarding applicability of the findings of the trials that were included in the meta-analysis. This trial-level applicability uncertainty can be directly assessed by the decision maker and allows for the definition of trial inclusion probabilities, which can be used to perform a probabilistic meta-analysis with unequal probability resampling of trials (adaptive meta-analysis). A case study with several fictitious decision-making scenarios was performed to demonstrate the method in practice. We present options to elicit trial inclusion probabilities and perform the calculations. The result of an adaptive meta-analysis is a frequency distribution of the estimated parameters from traditional meta-analysis that provides individually tailored information according to the specific needs and uncertainty of the decision maker. The proposed method offers a direct and formalized combination of research evidence with individual clinical expertise and may aid clinicians in specific decision-making situations. Copyright © 2014 Elsevier Inc. All rights reserved.
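The following sketch (Python/NumPy) illustrates the resampling idea under stated assumptions: each trial is retained with its user-assigned inclusion probability, retained trials are pooled with a fixed-effect inverse-variance estimator, and the resamples yield a frequency distribution of the pooled estimate. The effect sizes, variances, and inclusion probabilities in the example are hypothetical, and the fixed-effect pooling is a simplification of whatever meta-analytic model a user would actually choose.

    import numpy as np

    def adaptive_meta_analysis(effects, variances, incl_prob, n_resamples=5000, rng=None):
        """Probabilistic meta-analysis with unequal-probability resampling of
        trials.  In each resample a trial is kept with its applicability
        (inclusion) probability; kept trials are pooled with a fixed-effect
        inverse-variance estimator.  A minimal sketch of the idea, not the
        authors' implementation."""
        rng = np.random.default_rng() if rng is None else rng
        effects = np.asarray(effects, float)
        variances = np.asarray(variances, float)
        incl_prob = np.asarray(incl_prob, float)
        pooled = []
        for _ in range(n_resamples):
            keep = rng.random(len(effects)) < incl_prob
            if not keep.any():
                continue                      # no applicable trial in this draw
            w = 1.0 / variances[keep]
            pooled.append(np.sum(w * effects[keep]) / np.sum(w))
        return np.asarray(pooled)             # frequency distribution of the estimate

    # hypothetical example: three trials, the last judged only 30% applicable
    dist = adaptive_meta_analysis(effects=[-0.30, -0.10, 0.05],
                                  variances=[0.02, 0.03, 0.01],
                                  incl_prob=[1.0, 0.8, 0.3])
    print(dist.mean(), np.percentile(dist, [2.5, 97.5]))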
[The application of full-thickness skin graft in partial laryngectomy for glottic carcinoma].
Fu, Y G; Sun, D Z; Yang, P Z; Chen, Y L; Chen, Z P; Yang, Z K
2016-08-05
Objective: The aim of this study is to explore the experience and advantages of the application of full-thickness skin graft in partial laryngectomy for glottic carcinoma. Method: One hundred and forty-three patients with glottic cancer were treated with partial laryngectomy. Among them, 78 cases were repaired with full-thickness skin graft and 65 cases were repaired with sternohyoid muscular fasciae. The time to extubation and the formation of granulation tissue in the laryngeal cavity after operation were compared between the two groups. Result: In the full-thickness skin graft group, the mean time to decannulation was 6.8 days, with 5 cases showing growth of granulation tissue after operation. In the other group, the mean time to decannulation was 10.7 days, with 16 cases showing growth of granulation tissue after operation. The mean time to decannulation (t = -4.739, P < 0.01) and the growth of granulation tissue (χ² = 9.379, P < 0.01) were significantly different between the two groups. No laryngostenosis was found in any patient. Conclusion: The application of full-thickness skin graft in partial laryngectomy for glottic carcinoma can shorten the time to extubation and reduce the formation of granulation tissue. Copyright© by the Editorial Department of Journal of Clinical Otorhinolaryngology Head and Neck Surgery.
Computer-aided drug discovery.
Bajorath, Jürgen
2015-01-01
Computational approaches are an integral part of interdisciplinary drug discovery research. Understanding the science behind computational tools, their opportunities, and limitations is essential to make a true impact on drug discovery at different levels. If applied in a scientifically meaningful way, computational methods improve the ability to identify and evaluate potential drug molecules, but there remain weaknesses in the methods that preclude naïve applications. Herein, current trends in computer-aided drug discovery are reviewed, and selected computational areas are discussed. Approaches are highlighted that aid in the identification and optimization of new drug candidates. Emphasis is put on the presentation and discussion of computational concepts and methods, rather than case studies or application examples. As such, this contribution aims to provide an overview of the current methodological spectrum of computational drug discovery for a broad audience.
Progress in Application of Generalized Wigner Distribution to Growth and Other Problems
NASA Astrophysics Data System (ADS)
Einstein, T. L.; Morales-Cifuentes, Josue; Pimpinelli, Alberto; Gonzalez, Diego Luis
We recap the use of the (single-parameter) Generalized Wigner Distribution (GWD) to analyze capture-zone distributions associated with submonolayer epitaxial growth. We discuss recent applications to physical systems, as well as key simulations. We pay particular attention to how this method compares with other methods to assess the critical nucleus size characterizing growth. The following talk discusses a particular case when special insight is needed to reconcile the various methods. We discuss improvements that can be achieved by going to a 2-parameter fragmentation approach. At a much larger scale we have applied this approach to various distributions in socio-political phenomena (areas of secondary administrative units [e.g., counties] and distributions of subway stations). Work at UMD supported by NSF CHE 13-05892.
Empirical evaluation of the market price of risk using the CIR model
NASA Astrophysics Data System (ADS)
Bernaschi, M.; Torosantucci, L.; Uboldi, A.
2007-03-01
We describe a simple but effective method for the estimation of the market price of risk. The basic idea is to compare the results obtained by following two different approaches in the application of the Cox-Ingersoll-Ross (CIR) model. In the first case, we apply the non-linear least squares method to cross-sectional data (i.e., all rates of a single day). In the second case, we consider the short rate obtained by means of the first procedure as a proxy of the real market short rate. Starting from this new proxy, we evaluate the parameters of the CIR model by means of martingale estimation techniques. The estimate of the market price of risk is provided by comparing results obtained with these two techniques, since this approach makes it possible to isolate the market price of risk and to evaluate, under the Local Expectations Hypothesis, the risk premium given by the market for different maturities. As a test case, we apply the method to data of the European Fixed Income Market.
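A hedged sketch of the first step of the procedure, the cross-sectional fit: the closed-form CIR zero-coupon yield is fitted to one day's term structure by non-linear least squares (here using SciPy's least_squares). The maturities, observed yields, starting values, and bounds are illustrative assumptions, and the second, martingale-estimation step is not shown.

    import numpy as np
    from scipy.optimize import least_squares

    def cir_yield(tau, kappa, theta, sigma, r0):
        """Zero-coupon yield implied by the CIR model (standard closed form)."""
        g = np.sqrt(kappa ** 2 + 2.0 * sigma ** 2)
        den = (g + kappa) * (np.exp(g * tau) - 1.0) + 2.0 * g
        B = 2.0 * (np.exp(g * tau) - 1.0) / den
        A = (2.0 * g * np.exp((g + kappa) * tau / 2.0) / den) ** (2.0 * kappa * theta / sigma ** 2)
        return (B * r0 - np.log(A)) / tau

    def fit_cross_section(taus, yields, x0=(0.5, 0.04, 0.1, 0.03)):
        """Non-linear least squares on a single day's term structure; returns
        kappa, theta, sigma and the implied short rate r0."""
        resid = lambda p: cir_yield(taus, *p) - yields
        return least_squares(resid, x0, bounds=([1e-4] * 4, [5.0, 1.0, 1.0, 1.0])).x

    # hypothetical single-day term structure (maturities in years, yields as decimals)
    taus = np.array([0.25, 0.5, 1.0, 2.0, 5.0, 10.0])
    obs = np.array([0.031, 0.032, 0.034, 0.037, 0.041, 0.043])
    print(fit_cross_section(taus, obs))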
The initial rise method extended to multiple trapping levels in thermoluminescent materials.
Furetta, C; Guzmán, S; Ruiz, B; Cruz-Zaragoza, E
2011-02-01
The well-known Initial Rise Method (IR) is commonly used to determine the activation energy when only one glow peak is present and analysed in the phosphor material. However, when the glow peak is more complex, a wide peak with some shoulders appears in the structure. The straightforward application of the Initial Rise Method is then not valid because multiple trapping levels have to be considered, and the thermoluminescent analysis becomes difficult to perform. This paper takes a complex glow curve structure as an example and shows that the calculation is still possible using the IR method. The aim of the paper is to extend the well-known Initial Rise Method (IR) to the case of multiple trapping levels. The IR method is applied to minerals extracted from Nopal cactus and Oregano spices because the shape of the thermoluminescent glow curve suggests a trap distribution instead of a single trapping level. Copyright © 2010 Elsevier Ltd. All rights reserved.
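For a single trapping level the IR calculation reduces to a straight-line fit: on the rising edge I(T) ≈ const × exp(-E/(k_B T)), so the slope of ln(I) against 1/T gives -E/k_B. The sketch below (Python/NumPy) implements this single-peak case only; the 15% intensity cutoff and the synthetic glow curve are illustrative assumptions, and the multi-level extension proposed in the paper is not reproduced.

    import numpy as np

    K_B = 8.617e-5  # Boltzmann constant in eV/K

    def initial_rise_energy(temperature, intensity, fraction=0.15):
        """Estimate the activation energy E from the initial-rise region of a
        single glow peak: on the rising edge I(T) ~ exp(-E / (k_B * T)), so the
        slope of ln(I) versus 1/T equals -E/k_B.  The 'fraction' cutoff selects
        the low-intensity part of the rise (an assumed rule of thumb)."""
        temperature = np.asarray(temperature, float)
        intensity = np.asarray(intensity, float)
        i_peak = int(np.argmax(intensity))
        rise = slice(0, i_peak + 1)
        mask = intensity[rise] <= fraction * intensity[i_peak]
        x = 1.0 / temperature[rise][mask]
        y = np.log(intensity[rise][mask])
        slope, _ = np.polyfit(x, y, 1)
        return -K_B * slope            # activation energy in eV

    # toy rising edge generated with E = 1 eV (the peak itself is cut off)
    T = np.linspace(300.0, 450.0, 300)
    I = 1e12 * np.exp(-1.0 / (K_B * T))
    print(initial_rise_energy(T, I))   # approximately 1.0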
Application of State Quantization-Based Methods in HEP Particle Transport Simulation
NASA Astrophysics Data System (ADS)
Santi, Lucio; Ponieman, Nicolás; Jun, Soon Yung; Genser, Krzysztof; Elvira, Daniel; Castro, Rodrigo
2017-10-01
Simulation of particle-matter interactions in complex geometries is one of the main tasks in high energy physics (HEP) research. An essential aspect of it is accurate and efficient particle transport in a non-uniform magnetic field, which includes the handling of volume crossings within a predefined 3D geometry. Quantized State Systems (QSS) is a family of numerical methods that provides attractive features for particle transportation processes, such as dense output (sequences of polynomial segments changing only according to accuracy-driven discrete events) and lightweight detection and handling of volume crossings (based on simple root-finding of polynomial functions). In this work we present a proof-of-concept performance comparison between a QSS-based standalone numerical solver and an application based on the Geant4 simulation toolkit, with its default Runge-Kutta based adaptive step method. In a case study with a charged particle circulating in a vacuum (with interactions with matter turned off), in a uniform magnetic field, and crossing up to 200 volume boundaries twice per turn, simulation results showed speedups of up to 6 times in favor of QSS, while being 10 times slower in the case with zero volume boundaries.
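To illustrate the dense-output, event-driven character of quantized-state methods (not the HEP transport solver itself), the sketch below implements textbook QSS1 for a scalar autonomous ODE: the trajectory is a chain of linear segments, and integration advances only when the state drifts one quantum away from its quantized value.

    import math

    def qss1(f, x0, t_end, quantum):
        """First-order Quantized State System (QSS1) integrator for a scalar
        ODE dx/dt = f(x).  The trajectory is a sequence of linear segments
        (dense output); integration advances by discrete events whenever the
        state drifts one quantum away from its quantized value.  A textbook
        sketch, not the transport code discussed above."""
        t, x, q = 0.0, x0, x0          # q is the quantized state
        times, states = [t], [x]
        while t < t_end:
            slope = f(q)
            if slope == 0.0:
                break                  # equilibrium reached: no further events
            dt = quantum / abs(slope)  # time until |x - q| equals one quantum
            t, x = t + dt, x + slope * dt
            q = x                      # re-quantize and emit an event
            times.append(t)
            states.append(x)
        return times, states

    # toy example: exponential decay dx/dt = -x; compare with the exact solution
    ts, xs = qss1(lambda x: -x, x0=1.0, t_end=3.0, quantum=0.01)
    print(xs[-1], math.exp(-ts[-1]))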
Convergence of methods for coupling of microscopic and mesoscopic reaction-diffusion simulations
NASA Astrophysics Data System (ADS)
Flegg, Mark B.; Hellander, Stefan; Erban, Radek
2015-05-01
In this paper, three multiscale methods for coupling of mesoscopic (compartment-based) and microscopic (molecular-based) stochastic reaction-diffusion simulations are investigated. Two of the three methods that will be discussed in detail have been previously reported in the literature: the two-regime method (TRM) and the compartment-placement method (CPM). The third method that is introduced and analysed in this paper is called the ghost cell method (GCM), since it works by constructing a "ghost cell" in which molecules can disappear and jump into the compartment-based simulation. A comparison of sources of error is presented. The convergence properties of this error are studied as the time step Δt (for updating the molecular-based part of the model) approaches zero. It is found that the error behaviour depends on another fundamental computational parameter h, the compartment size in the mesoscopic part of the model. Two important limiting cases, which appear in applications, are considered: (i) Δt → 0 with h fixed; and (ii) Δt → 0 and h → 0 such that √(Δt)/h is fixed. The error for previously developed approaches (the TRM and CPM) converges to zero only in the limiting case (ii), but not in case (i). It is shown that the error of the GCM converges in the limiting case (i). Thus the GCM is superior to previous coupling techniques if the mesoscopic description is much coarser than the microscopic part of the model.
Kuniya, Toshikazu; Sano, Hideki
2016-05-10
In mathematical epidemiology, age-structured epidemic models have usually been formulated as boundary-value problems for partial differential equations. On the other hand, in engineering, the backstepping method has recently been developed and widely studied by many authors. Using the backstepping method, we obtained a boundary feedback control which plays the role of a threshold criterion for predicting the increase or decrease of the newly infected population. Under the assumption that the period of infectiousness is the same for all infected individuals (that is, the recovery rate is given by the Dirac delta function multiplied by a sufficiently large positive constant), the prediction method simplifies to a comparison of the numbers of reported cases at the current and previous time steps. Our prediction method was applied to the reported cases per sentinel of influenza in Japan from 2006 to 2015 and its accuracy was 0.81 (404 correct predictions out of 500). This was higher than that of ARIMA models with different orders of the autoregressive part, differencing and moving-average process. In addition, a proposed method for estimating the number of reported cases, which is consistent with our prediction method, performed better than the best-fitted ARIMA model ARIMA(1,1,0) in the sense of mean square error. Our prediction method based on the backstepping method can thus be simplified to a comparison of the numbers of reported cases at the current and previous time steps. In spite of its simplicity, it can provide a good prediction for the spread of influenza in Japan.
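The simplified prediction rule described in the abstract, comparing the current and previous numbers of reported cases, can be scored in a few lines; the sketch below (Python/NumPy) does so on a hypothetical weekly series, not on the Japanese sentinel data.

    import numpy as np

    def predict_and_score(cases):
        """Simplified rule from the abstract: newly infected numbers are
        predicted to increase next week if this week's reported cases exceed
        last week's, and to decrease otherwise.  Returns the fraction of
        weeks for which the prediction was correct."""
        cases = np.asarray(cases, float)
        pred_up = cases[1:-1] > cases[:-2]      # prediction made at week t
        actual_up = cases[2:] > cases[1:-1]     # outcome observed at week t+1
        return np.mean(pred_up == actual_up)

    # hypothetical weekly reported cases per sentinel
    weekly = [0.3, 0.5, 1.2, 2.8, 5.1, 4.2, 3.0, 1.5, 0.9, 0.6]
    print(predict_and_score(weekly))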
Managing for resilience: an information theory-based ...
Ecosystems are complex and multivariate; hence, methods to assess the dynamics of ecosystems should have the capacity to evaluate multiple indicators simultaneously. Most research on identifying leading indicators of regime shifts has focused on univariate methods and simple models, which have limited utility when evaluating real ecosystems, particularly because drivers are often unknown. We discuss some common univariate and multivariate approaches for detecting critical transitions in ecosystems and demonstrate their capabilities via case studies. Synthesis and applications: we illustrate the utility of an information theory-based index for assessing ecosystem dynamics; trends in this index also provide a sentinel of both abrupt and gradual transitions in ecosystems. In response to the need to identify leading indicators of regime shifts in ecosystems, our research compares traditional indicators and Fisher information, an information theory-based method, by examining four case study systems. The results demonstrate the utility of these methods and offer great promise for quantifying and managing for resilience.
Feedback control for unsteady flow and its application to the stochastic Burgers equation
NASA Technical Reports Server (NTRS)
Choi, Haecheon; Temam, Roger; Moin, Parviz; Kim, John
1993-01-01
The study applies mathematical methods of control theory to the problem of control of fluid flow with the long-range objective of developing effective methods for the control of turbulent flows. Model problems are employed through the formalism and language of control theory to present the procedure of how to cast the problem of controlling turbulence into a problem in optimal control theory. Methods of calculus of variations through the adjoint state and gradient algorithms are used to present a suboptimal control and feedback procedure for stationary and time-dependent problems. Two types of controls are investigated: distributed and boundary controls. Several cases of both controls are numerically simulated to investigate the performance of the control algorithm. Most cases considered show significant reductions of the costs to be minimized. The dependence of the control algorithm on the time-discretization method is discussed.
Identifiability and identification of trace continuous pollutant source.
Qu, Hongquan; Liu, Shouwen; Pang, Liping; Hu, Tao
2014-01-01
Accidental pollution events often threaten people's health and lives, and identifying the pollutant source is necessary so that prompt remedial actions can be taken. In this paper, a trace continuous pollutant source identification method is developed to identify a sudden continuous emission pollutant source in an enclosed space. The location probability model is set up first, and the identification is then realized by searching for a global optimum of the location probability. In order to discuss the identifiability performance of the presented method, the concept of a synergy degree of velocity fields is introduced to quantitatively analyze the impact of the velocity field on identification performance. Based on this concept, several simulation cases were conducted. The application conditions of this method are obtained from the simulation studies. In order to verify the presented method, we designed an experiment and identified an unknown source appearing in the experimental space. The result showed that the method can identify a sudden trace continuous source when the studied situation satisfies the application conditions.
Ahn, Eunjong; Kim, Hyunjun; Sim, Sung-Han; Shin, Sung Woo; Shin, Myoungsu
2017-01-01
Recently, self-healing technologies have emerged as a promising approach to extend the service life of social infrastructure in the field of concrete construction. However, current evaluations of the self-healing technologies developed for cementitious materials are mostly limited to lab-scale experiments to inspect changes in surface crack width (by optical microscopy) and permeability. Furthermore, there is a universal lack of unified test methods to assess the effectiveness of self-healing technologies. Particularly, with respect to the self-healing of concrete applied in actual construction, nondestructive test methods are required to avoid interrupting the use of the structures under evaluation. This paper presents a review of all existing research on the principles of ultrasonic test methods and case studies pertaining to self-healing concrete. The main objective of the study is to examine the applicability and limitation of various ultrasonic test methods in assessing the self-healing performance. Finally, future directions on the development of reliable assessment methods for self-healing cementitious materials are suggested. PMID:28772640
Multimodal inspection in power engineering and building industries: new challenges and solutions
NASA Astrophysics Data System (ADS)
Kujawińska, Małgorzata; Malesa, Marcin; Malowany, Krzysztof
2013-09-01
Recently the demand for and number of applications of full-field, optical measurement methods based on noncoherent light sources have increased significantly. They include traditional image processing, thermovision, digital image correlation (DIC) and structured light methods. However, there are still numerous challenges connected with implementing these methods for in-situ, long-term monitoring in industrial, civil engineering and cultural heritage applications, with multimodal measurements of a variety of object features, or simply with adapting instruments to work in harsh environmental conditions. In this paper we focus on the 3D DIC method and present its enhancements concerning software modifications (new visualization methods and a method for automatic merging of data distributed in time) and hardware improvements. The modified 3D DIC system combined with an infrared camera system is applied in many interesting cases: measurements of a boiler drum during annealing and of pipelines in heat power stations, monitoring of different building steel struts at a construction site, and validation of numerical models of large building structures constructed of graded metal plate arches.
Automatic Black-Box Model Order Reduction using Radial Basis Functions
DOE Office of Scientific and Technical Information (OSTI.GOV)
Stephanson, M B; Lee, J F; White, D A
Finite element methods have long made use of model order reduction (MOR), particularly in the context of fast frequency sweeps. In this paper, we discuss a black-box MOR technique, applicable to many solution methods and not restricted to spectral responses. We also discuss automated methods for generating a reduced-order model that meets a given error tolerance. Numerical examples demonstrate the effectiveness and wide applicability of the method. With the advent of improved computing hardware and numerous fast solution techniques, the field of computational electromagnetics has progressed rapidly in terms of the size and complexity of problems that can be solved. Numerous applications, however, require the solution of a problem for many different configurations, including optimization, parameter exploration, and uncertainty quantification, where the parameters that may be changed include frequency, material properties, geometric dimensions, etc. In such cases, thousands of solutions may be needed, so solve times of even a few minutes can be burdensome. Model order reduction (MOR) may alleviate this difficulty by creating a small model that can be evaluated quickly. Many MOR techniques have been applied to electromagnetic problems over the past few decades, particularly in the context of fast frequency sweeps. Recent works have extended these methods to allow more than one parameter and to allow the parameters to represent material and geometric properties. There are still limitations with these methods, however. First, they almost always assume that the finite element method is used to solve the problem, so that the system matrix is a known function of the parameters. Second, although some authors have presented adaptive methods (e.g., [2]), the order of the model is often determined before the MOR process begins, with little insight into what order is actually needed to reach the desired accuracy. Finally, it is not clear how to efficiently extend most methods to the multiparameter case. This paper addresses the above shortcomings by developing a method that uses a black-box approach to the solution method, is adaptive, and is easily extensible to many parameters.
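As a loose illustration of the black-box idea (not the paper's algorithm), the sketch below treats the full solver as an opaque callable, fits a radial-basis-function surrogate over the parameter space with SciPy's RBFInterpolator, and adds sample points greedily until a tolerance is met. The toy "solver", seed points, candidate set, and tolerance are all hypothetical.

    import numpy as np
    from scipy.interpolate import RBFInterpolator

    def build_surrogate(solver, param_samples):
        """Black-box reduced model: evaluate the (expensive) solver at a few
        parameter points and fit a radial-basis-function interpolant to the
        scalar response.  'solver' is any callable params -> response, so no
        access to the underlying system matrix is required."""
        samples = np.atleast_2d(param_samples)
        responses = np.array([solver(p) for p in samples])
        return RBFInterpolator(samples, responses, kernel="thin_plate_spline")

    def greedy_refine(solver, seed_pts, candidates, tol=1e-3):
        """Simple adaptive loop: add the candidate point where the surrogate
        disagrees most with the true solver until the error tolerance is met
        (the candidate check calls the full solver; fine for a sketch)."""
        pts = list(seed_pts)
        while True:
            model = build_surrogate(solver, np.array(pts))
            errs = [abs(model([c])[0] - solver(c)) for c in candidates]
            worst = int(np.argmax(errs))
            if errs[worst] < tol:
                return model
            pts.append(candidates[worst])

    # toy "full solver": a resonant response over (frequency, permittivity)
    solver = lambda p: 1.0 / (1.0 + 50.0 * (p[0] - 0.5 * p[1]) ** 2)
    seeds = [(f, e) for f in (0.2, 0.5, 0.8) for e in (1.0, 1.5, 2.0)]
    cands = [(f, e) for f in np.linspace(0.1, 0.9, 9) for e in np.linspace(1.0, 2.0, 5)]
    model = greedy_refine(solver, seeds, cands, tol=1e-2)
    print(model([(0.45, 1.2)]))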
Wang, Henry; Xing, Lei
2016-11-08
An autopilot scheme of volumetric-modulated arc therapy (VMAT)/intensity-modulated radiation therapy (IMRT) planning with the guidance of prior knowledge is established with recorded interactions between a planner and a commercial treatment planning system (TPS). Microsoft (MS) Visual Studio Coded UI is applied to record some common planner-TPS interactions as subroutines. The TPS used in this study is a Windows-based Eclipse system. The interactions of our application program with Eclipse TPS are realized through a series of subroutines obtained by prerecording the mouse clicks or keyboard strokes of a planner in operating the TPS. A strategy to autopilot Eclipse VMAT/IMRT plan selection process is developed as a specific example of the proposed "scripting" method. The autopiloted planning is navigated by a decision function constructed with a reference plan that has the same prescription and similar anatomy with the case at hand. The calculation proceeds by alternating between the Eclipse optimization and the outer-loop optimization independent of the Eclipse. In the C# program, the dosimetric characteristics of a reference treatment plan are used to assess and modify the Eclipse planning parameters and to guide the search for a clinically sensible treatment plan. The approach is applied to plan a head and neck (HN) VMAT case and a prostate IMRT case. Our study demonstrated the feasibility of application programming method in C# environment with recorded interactions of planner-TPS. The process mimics a planner's planning process and automatically provides clinically sensible treatment plans that would otherwise require a large amount of manual trial and error of a planner. The proposed technique enables us to harness a commercial TPS by application programming via the use of recorded human computer interactions and provides an effective tool to greatly facilitate the treatment planning process. © 2016 The Authors.
Deriving Safety Cases from Machine-Generated Proofs
NASA Technical Reports Server (NTRS)
Basir, Nurlida; Fischer, Bernd; Denney, Ewen
2009-01-01
Proofs provide detailed justification for the validity of claims and are widely used in formal software development methods. However, they are often complex and difficult to understand, because they use machine-oriented formalisms; they may also be based on assumptions that are not justified. This causes concerns about the trustworthiness of using formal proofs as arguments in safety-critical applications. Here, we present an approach to develop safety cases that correspond to formal proofs found by automated theorem provers and reveal the underlying argumentation structure and top-level assumptions. We concentrate on natural deduction proofs and show how to construct the safety cases by covering the proof tree with corresponding safety case fragments.
Hosny, Gamal Ahmed; Ahmed, Abdel-Salam Abdel-Aleem; Hussein, Mohamed Abd-Elaal
2018-04-07
Corticotomy is an integral part of the Ilizarov method in the management of infected nonunited fractures, which challenge orthopaedic surgeons. However, the presence of active draining sinuses may contaminate the operative field, with the potential of developing corticotomy site infection. The authors present a surgical technique aiming at minimizing or avoiding the risk of surgical site infection (SSI) in the corticotomy zone. A total of 144 cases of draining infected nonunions were treated by Ilizarov fixator using the corticotomy-first technique. The study included humeral (18 cases), femoral (52 cases), and tibial (74 cases) nonunions. The mean age was 44.48 years, with 87 males and 57 females. The mean duration of nonunion was 28.69 months. After debridement, the combined shortening and nonunion gap averaged 5.98 (range 3-10) cm. Evaluation of bone and functional results was done according to the Association for the Study and Application of the Method of Ilizarov (ASAMI) criteria. The follow-up period averaged 51.05 (range 36-72) months. None of the cases developed corticotomy site or distraction gap infection. Union was successfully achieved in 141 cases (97.92%). Nonunion persisted in three cases (2.08%), all in the distal tibia. Infection was eventually controlled in 138 cases (95.83%). Bone grafting was not needed in any case. The Ilizarov fixator with the corticotomy-first technique was effective in the management of draining infected non-united fractures of long bones while avoiding SSI at the corticotomy site in all cases.
Case study: Mapping tsunami hazards associated with debris flow into a reservoir
Walder, J.S.; Watts, P.; Waythomas, C.F.
2006-01-01
Debris-flow generated impulse waves (tsunamis) pose hazards in lakes, especially those used for hydropower or recreation. We describe a method for assessing tsunami-related hazards for the case in which inundation by coherent water waves, rather than chaotic splashing, is of primary concern. The method involves an experimentally based initial condition (tsunami source) and a Boussinesq model for tsunami propagation and inundation. Model results are used to create hazard maps that offer guidance for emergency planners and responders. An example application explores tsunami hazards associated with potential debris flows entering Baker Lake, a reservoir on the flanks of the Mount Baker volcano in the northwestern United States. © 2006 ASCE.
Application of multi response optimization with grey relational analysis and fuzzy logic method
NASA Astrophysics Data System (ADS)
Winarni, Sri; Wahyu Indratno, Sapto
2018-01-01
Multi-response optimization is an optimization process that considers multiple responses simultaneously. The purpose of this research is to find the optimum point in a multi-response optimization process using grey relational analysis and the fuzzy logic method. The optimum point is determined from the Fuzzy-GRG (Grey Relational Grade) variable, which is obtained by converting the Signal to Noise Ratios of the responses involved. The case study used in this research is the optimization of electrical process parameters in electrical discharge machining. It was found that the combination of treatments resulting in optimum MRR and SR was a gap voltage of 70 V, a peak current of 9 A and a duty factor of 0.8.
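A generic grey relational analysis step, sketched in Python/NumPy under stated assumptions: responses are normalized (larger-the-better or smaller-the-better), grey relational coefficients are computed against the ideal sequence with the usual distinguishing coefficient ζ = 0.5, and their average gives a grade per run. The EDM response values in the example are hypothetical, and the fuzzy-logic conversion of S/N ratios into the Fuzzy-GRG is not reproduced.

    import numpy as np

    def grey_relational_grade(responses, larger_is_better, zeta=0.5):
        """Grey relational analysis for multi-response optimization: normalize
        each response, compute grey relational coefficients against the ideal
        sequence, and average them into a single grade per experimental run."""
        r = np.asarray(responses, float)            # shape (runs, responses)
        norm = np.empty_like(r)
        for j, lb in enumerate(larger_is_better):
            lo, hi = r[:, j].min(), r[:, j].max()
            span = (hi - lo) if hi > lo else 1.0
            norm[:, j] = (r[:, j] - lo) / span if lb else (hi - r[:, j]) / span
        delta = 1.0 - norm                          # deviation from the ideal (= 1)
        coeff = (delta.min() + zeta * delta.max()) / (delta + zeta * delta.max())
        return coeff.mean(axis=1)

    # hypothetical EDM runs: material removal rate (maximize), surface roughness (minimize)
    runs = [[12.1, 3.2], [15.4, 4.1], [18.2, 5.0], [16.7, 3.6]]
    grades = grey_relational_grade(runs, larger_is_better=[True, False])
    print(grades, "best run:", int(np.argmax(grades)))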
[Application of blocking vessels in operative therapy of non-limb hemangioma].
Zheng, Fanwei; Cen, Ying; Cui, Zhengjun
2005-04-01
To study the surgical method to reduce bleeding in treating hemangioma at non-limb sites. From November 1998 to November 2003, 49 cases of non-limb hemangioma were treated, aged 3 months to 63 years, including 21 males and 28 females. There were 14 cases of capillary hemangioma, 25 cases of cavernous hemangioma, 7 cases of arterial racemose angioma and 3 cases of mixed hemangioma. According to the position and type of hemangioma, various methods of blocking blood vessels were adopted to assist tumor resection. After the pulsatile artery of an arterial racemose angioma of the neck and face was identified by palpation, it was sutured and ligated with 7-0 silk suture to block the bleeding. The common iliac artery, external iliac artery or femoral artery was exposed and temporarily occluded to resect arterial racemose angiomas of the groin and thigh. Vessels were sutured and ligated with 7-0 silk suture to block the bleeding in capillary and cavernous hemangiomas of the neck, face and trunk. Intraoperative bleeding obviously decreased and the tumor size was reduced to various extents. Of the 49 cases, 47 achieved complete success; 2 cases bled within two days after operation. A postoperative follow-up of 6 months to 4 years showed that the appearance and function were satisfactory. The preoperative method of blocking blood vessels can obviously reduce intraoperative bleeding and decrease operative difficulty, making it possible to eradicate the hemangioma and lower the recurrence rate.
Methods in the study of discrete upper hybrid waves
NASA Astrophysics Data System (ADS)
Yoon, P. H.; Ye, S.; Labelle, J.; Weatherwax, A. T.; Menietti, J. D.
2007-11-01
Naturally occurring plasma waves characterized by fine frequency structure or discrete spectrum, detected by satellite, rocket-borne instruments, or ground-based receivers, can be interpreted as eigenmodes excited and trapped in field-aligned density structures. This paper overviews various theoretical methods to study such phenomena for a one-dimensional (1-D) density structure. Among the various methods are parabolic approximation, eikonal matching, eigenfunction matching, and full numerical solution based upon shooting method. Various approaches are compared against the full numerical solution. Among the analytic methods it is found that the eigenfunction matching technique best approximates the actual numerical solution. The analysis is further extended to 2-D geometry. A detailed comparative analysis between the eigenfunction matching and fully numerical methods is carried out for the 2-D case. Although in general the two methods compare favorably, significant differences are also found such that for application to actual observations it is prudent to employ the fully numerical method. Application of the methods developed in the present paper to actual geophysical problems will be given in a companion paper.
Correction of Dual-PRF Doppler Velocity Outliers in the Presence of Aliasing
DOE Office of Scientific and Technical Information (OSTI.GOV)
Altube, Patricia; Bech, Joan; Argemí, Oriol
In Doppler weather radars, the presence of unfolding errors or outliers is a well-known quality issue for radial velocity fields estimated using the dual-pulse repetition frequency (PRF) technique. Postprocessing methods have been developed to correct dual-PRF outliers, but these need prior application of a dealiasing algorithm for an adequate correction. Our paper presents an alternative procedure based on circular statistics that corrects dual-PRF errors in the presence of extended Nyquist aliasing. The correction potential of the proposed method is quantitatively tested by means of velocity field simulations and is exemplified in the application to real cases, including severe storm events. The comparison with two other existing correction methods indicates an improved performance in the correction of clustered outliers. The technique we propose is well suited for real-time applications requiring high-quality Doppler radar velocity fields, such as wind shear and mesocyclone detection algorithms, or assimilation in numerical weather prediction models.
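A rough, hypothetical sketch of how circular statistics can flag dual-PRF outliers without prior dealiasing: velocities are mapped onto phases on the unit circle, each gate is compared with the circular mean of its neighbours, and gates that deviate too far are replaced. The window size, threshold, and toy ray are assumptions, and the published algorithm operates on 2-D polar fields with additional logic.

    import numpy as np

    def correct_dual_prf_outliers(v, v_nyquist, window=3, thresh_deg=40.0):
        """Circular-statistics correction of dual-PRF velocity outliers along a
        single 1-D ray (a simplified sketch, not the published algorithm).
        Velocities are mapped onto phases so that aliasing does not bias the
        local reference; gates far from the local circular mean are replaced."""
        v = np.asarray(v, float)
        phase = np.pi * v / v_nyquist                       # map onto the unit circle
        out = v.copy()
        half = window // 2
        for i in range(len(v)):
            sl = slice(max(0, i - half), min(len(v), i + half + 1))
            neigh = np.delete(phase[sl], i - max(0, i - half))   # drop the centre gate
            ref = np.arctan2(np.sin(neigh).mean(), np.cos(neigh).mean())
            diff = np.angle(np.exp(1j * (phase[i] - ref)))       # wrapped difference
            if np.degrees(abs(diff)) > thresh_deg:
                out[i] = v_nyquist * ref / np.pi                 # replace by local estimate
        return out

    # toy ray: smooth field with two dual-PRF outliers of roughly one Nyquist jump
    ray = np.array([5.0, 5.5, 6.0, 18.0, 6.5, 7.0, -17.5, 7.5, 8.0])
    print(correct_dual_prf_outliers(ray, v_nyquist=16.0))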
Increasing low frequency sound attenuation using compounded single layer of sonic crystal
NASA Astrophysics Data System (ADS)
Gulia, Preeti; Gupta, Arpan
2018-05-01
Sonic crystals (SC) are man-made periodic structures in which sound-hard scatterers are arranged in a crystalline manner. An SC reduces noise in a particular range of frequencies called the band gap. Sonic crystals have a promising application in noise shielding; however, the application is limited by the size of the structure. Particularly for low frequencies, the structure becomes quite bulky, restricting its practical application. This paper presents a compounded model of SC, which has the same overall area and filling fraction but increased low frequency sound attenuation. Two cases have been considered, a three-layer SC and a compounded single-layer SC. Both models have been analyzed using finite element simulation and the plane wave expansion method. Band gaps for the periodic structures have been obtained using both methods and are in good agreement. Further, the sound transmission loss has been evaluated using the finite element method. The results demonstrate the use of the compounded model of sonic crystal for low frequency sound attenuation.
Wavelet neural networks: a practical guide.
Alexandridis, Antonios K; Zapranis, Achilleas D
2013-06-01
Wavelet networks (WNs) are a new class of networks which have been used with great success in a wide range of applications. However, a generally accepted framework for applying WNs is missing from the literature. In this study, we present a complete statistical model identification framework in order to apply WNs in various applications. The following subjects were thoroughly examined: the structure of a WN, training methods, initialization algorithms, variable significance and variable selection algorithms, model selection methods and, finally, methods to construct confidence and prediction intervals. In addition, the complexity of each algorithm is discussed. Our proposed framework was tested in two simulated cases, in one chaotic time series described by the Mackey-Glass equation and in three real datasets described by daily temperatures in Berlin, daily wind speeds in New York and breast cancer classification. Our results have shown that the proposed algorithms produce stable and robust results, indicating that our proposed framework can be applied in various applications. Copyright © 2013 Elsevier Ltd. All rights reserved.
Polakovič, Milan; Švitel, Juraj; Bučko, Marek; Filip, Jaroslav; Neděla, Vilém; Ansorge-Schumacher, Marion B; Gemeiner, Peter
2017-05-01
Viable microbial cells are important biocatalysts in the production of fine chemicals and biofuels, in environmental applications and also in emerging applications such as biosensors or medicine. Their increasing significance is driven mainly by the intensive development of high performance recombinant strains supplying multienzyme cascade reaction pathways, and by advances in preservation of the native state and stability of whole-cell biocatalysts throughout their application. In many cases, the stability and performance of whole-cell biocatalysts can be highly improved by controlled immobilization techniques. This review summarizes the current progress in the development of immobilized whole-cell biocatalysts, the immobilization methods as well as in the bioreaction engineering aspects and economical aspects of their biocatalytic applications.
Uchikoga, Nobuyuki; Hirokawa, Takatsugu
2010-05-11
Protein-protein docking for proteins with large conformational changes was analyzed by using interaction fingerprints, one of the measures of similarity among complex structures, used especially for searching near-native protein-ligand or protein-protein complex structures. Here, we have proposed a combined method for analyzing protein-protein docking by taking large conformational changes into consideration. This combined method consists of ensemble soft docking with multiple protein structures, refinement of complexes, and cluster analysis using interaction fingerprints and energy profiles. To test the applicability of this combined method, various CaM-ligand complexes were reconstructed from the NMR structures of unbound CaM. For the purpose of reconstruction, we used three known CaM ligands, namely, the CaM-binding peptides of cyclic nucleotide gateway (CNG), CaM kinase kinase (CaMKK) and the plasma membrane Ca2+ ATPase pump (PMCA), and thirty-one structurally diverse CaM conformations. For each ligand, 62000 CaM-ligand complexes were generated in the docking step and the relationship between their energy profiles and structural similarities to the native complex was analyzed using interaction fingerprints and RMSD. Near-native clusters were obtained in the cases of CNG and CaMKK. The interaction fingerprint method discriminated near-native structures better than the RMSD method in cluster analysis. We showed that a combined method that includes the interaction fingerprint is very useful for protein-protein docking analysis in certain cases.
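Interaction fingerprints are typically compared as binary contact vectors; the sketch below (Python/NumPy) computes Tanimoto similarity between such vectors and ranks docking poses against a reference, which is one simple way the fingerprints can feed a clustering or discrimination step. The eight-bit fingerprints are hypothetical and far shorter than real residue-contact fingerprints.

    import numpy as np

    def tanimoto(fp_a, fp_b):
        """Tanimoto similarity between two binary interaction fingerprints
        (bit i is set when contact i is formed in the complex)."""
        a, b = np.asarray(fp_a, bool), np.asarray(fp_b, bool)
        inter = np.logical_and(a, b).sum()
        union = np.logical_or(a, b).sum()
        return inter / union if union else 1.0

    def rank_by_fingerprint(decoy_fps, native_fp):
        """Order docking poses by fingerprint similarity to a reference complex;
        the same similarity values could also feed a clustering step."""
        scores = np.array([tanimoto(fp, native_fp) for fp in decoy_fps])
        order = np.argsort(scores)[::-1]
        return order, scores[order]

    # toy fingerprints over 8 hypothetical contacts
    native = [1, 1, 0, 1, 0, 0, 1, 0]
    decoys = [[1, 1, 0, 1, 0, 0, 0, 0],
              [0, 0, 1, 0, 1, 1, 0, 1],
              [1, 1, 0, 1, 0, 0, 1, 1]]
    print(rank_by_fingerprint(decoys, native))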
Classical and all-floating FETI methods for the simulation of arterial tissues
Augustin, Christoph M.; Holzapfel, Gerhard A.; Steinbach, Olaf
2015-01-01
High-resolution and anatomically realistic computer models of biological soft tissues play a significant role in the understanding of the function of cardiovascular components in health and disease. However, the computational effort to handle fine grids to resolve the geometries as well as sophisticated tissue models is very challenging. One possibility to derive a strongly scalable parallel solution algorithm is to consider finite element tearing and interconnecting (FETI) methods. In this study we propose and investigate the application of FETI methods to simulate the elastic behavior of biological soft tissues. As one particular example we choose the artery which is – as most other biological tissues – characterized by anisotropic and nonlinear material properties. We compare two specific approaches of FETI methods, classical and all-floating, and investigate the numerical behavior of different preconditioning techniques. In comparison to classical FETI, the all-floating approach has not only advantages concerning the implementation but in many cases also concerning the convergence of the global iterative solution method. This behavior is illustrated with numerical examples. We present results of linear elastic simulations to show convergence rates, as expected from the theory, and results from the more sophisticated nonlinear case where we apply a well-known anisotropic model to the realistic geometry of an artery. Although the FETI methods have a great applicability on artery simulations we will also discuss some limitations concerning the dependence on material parameters. PMID:26751957
DOE Office of Scientific and Technical Information (OSTI.GOV)
Zhou, Y; Tan, J; Jiang, S
Purpose: High dose rate (HDR) brachytherapy treatment planning is conventionally performed manually, yet it is highly desirable to perform computerized automated planning to improve treatment planning efficiency, eliminate human errors, and reduce plan quality variation. The goal of this research is to develop an automatic treatment planning tool for HDR brachytherapy with a cylinder applicator for vaginal cancer. Methods: After the cylinder applicator was inserted into the patient, a CT scan was acquired and loaded into in-house developed treatment planning software. The cylinder applicator was automatically segmented using image-processing techniques. The CTV was generated based on the user-specified treatment depth and length. Locations of relevant points (apex point, prescription point, and vaginal surface point), central applicator channel coordinates, and dwell positions were determined according to their geometric relations with the applicator. Dwell times were computed through an inverse optimization process. The planning information was written into DICOM-RT plan and structure files to transfer the automatically generated plan to a commercial treatment planning system for plan verification and delivery. Results: We tested the system retrospectively on nine patients treated with a vaginal cylinder applicator. These cases were selected with different treatment prescriptions, lengths, depths, and cylinder diameters to represent a large patient population. Our system was able to generate treatment plans of clinically acceptable quality for these cases. Computation time varied from 3 to 6 min. Conclusion: We have developed a system that performs automated treatment planning for HDR brachytherapy with a cylinder applicator. This novel system has greatly improved treatment planning efficiency and reduced plan quality variation. It also serves as a testbed to demonstrate the feasibility of automatic HDR treatment planning for more complicated cases.
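The dwell-time inverse optimization step can be pictured as a small nonnegative least-squares problem; the sketch below uses a simple inverse-square dose kernel and hypothetical dwell-position and prescription-point geometry rather than the clinical TG-43 dose formalism an actual planning system would rely on.

    # Hedged sketch: dwell-time optimization as nonnegative least squares.
    # The inverse-square kernel and the geometry are illustrative assumptions.
    import numpy as np
    from scipy.optimize import nnls

    def dose_matrix(dwell_xyz, calc_xyz, strength=1.0):
        """D[i, j] = dose at calculation point i per unit dwell time at position j."""
        diff = calc_xyz[:, None, :] - dwell_xyz[None, :, :]
        r2 = np.sum(diff**2, axis=-1)
        return strength / np.maximum(r2, 1e-6)

    # Hypothetical geometry: 10 dwell positions along the applicator axis and
    # prescription points 0.5 cm beyond a 1.5 cm cylinder surface.
    z = np.arange(10) * 0.5
    dwell_xyz = np.column_stack([np.zeros(10), np.zeros(10), z])
    calc_xyz = np.column_stack([np.full(10, 2.0), np.zeros(10), z])

    D = dose_matrix(dwell_xyz, calc_xyz)
    prescription = np.full(len(calc_xyz), 5.0)    # desired dose at each point (Gy)

    # Solve min ||D t - prescription||^2 subject to t >= 0.
    t, residual = nnls(D, prescription)
    print("dwell times:", np.round(t, 3), " residual:", round(residual, 4))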
Tugnoli, Alessandro; Khan, Faisal; Amyotte, Paul; Cozzani, Valerio
2008-12-15
Layout planning plays a key role in the inherent safety performance of process plants, since this design feature controls the possibility of accidental chain events and the magnitude of possible consequences. The lack of suitable methods to promote the effective implementation of inherent safety in layout design calls for the development of new techniques and methods. In the present paper, a safety assessment approach suitable for layout design in the critical early phase is proposed. The concept of inherent safety is implemented within this safety assessment; the approach is based on an integrated assessment of inherent safety guideword applicability within the constraints typically present in layout design. Application of these guidewords is evaluated along with unit hazards and control devices to quantitatively map the safety performance of different layout options. Moreover, the method evaluates the economic aspects related to safety and inherent safety. Specific sub-indices are developed within the integrated safety assessment system to analyze and quantify the hazard related to domino effects. The proposed approach is quick to apply, auditable, and shares a common framework applicable to other phases of the design lifecycle (e.g., process design). The work is divided into two parts: Part 1 (the current paper) presents the application of inherent safety guidelines in layout design and the index method for safety assessment; Part 2 (the accompanying paper) describes the domino hazard sub-index and demonstrates the proposed approach with a case study, evidencing the introduction of inherent safety features in layout design.
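The guideword-based mapping of layout options can be imagined as a weighted aggregation of applicability scores; the guidewords, weights, and 0-1 scoring scale in the sketch below are hypothetical placeholders rather than the sub-indices defined in the paper.

    # Hedged sketch: aggregating guideword applicability into a layout safety index.
    # Guidewords, weights, and the 0-1 scale are hypothetical placeholders.
    from dataclasses import dataclass

    GUIDEWORD_WEIGHTS = {              # hypothetical relative weights
        "minimize_inventory": 0.30,
        "increase_separation": 0.25,
        "limit_domino_targets": 0.25,
        "simplify_connections": 0.20,
    }

    @dataclass
    class LayoutOption:
        name: str
        applicability: dict            # guideword -> score in [0, 1], 1 = fully applied

    def safety_index(option: LayoutOption) -> float:
        """Weighted sum of guideword applicability; higher means inherently safer."""
        return sum(w * option.applicability.get(g, 0.0)
                   for g, w in GUIDEWORD_WEIGHTS.items())

    options = [
        LayoutOption("compact layout", {"minimize_inventory": 0.8, "increase_separation": 0.3,
                                        "limit_domino_targets": 0.4, "simplify_connections": 0.9}),
        LayoutOption("spread layout",  {"minimize_inventory": 0.6, "increase_separation": 0.9,
                                        "limit_domino_targets": 0.8, "simplify_connections": 0.5}),
    ]
    for opt in sorted(options, key=safety_index, reverse=True):
        print(f"{opt.name}: index = {safety_index(opt):.2f}")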
Applications of the JARS method to study levee sites in southern Texas and southern New Mexico
Ivanov, J.; Miller, R.D.; Xia, J.; Dunbar, J.B.
2007-01-01
We apply the joint analysis of refractions with surface waves (JARS) method to several sites and compare its results to those of traditional refraction-tomography methods in an effort to find a more realistic solution to the inverse refraction-traveltime problem. The JARS method uses a reference model, derived from surface-wave shear-wave velocity estimates, as a constraint. In all of the cases the JARS estimates appear more realistic than those from conventional refraction-tomography methods. As a result, we consider the JARS algorithm the preferred method for solving inverse refraction-tomography problems. © 2007 Society of Exploration Geophysicists.
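A highly simplified way to picture the reference-model constraint is a damped least-squares traveltime inversion regularized toward a model derived from the surface-wave shear-wave velocities; the linear ray operator, the damping weight, and the Vs-to-Vp scaling below are illustrative assumptions, not the published JARS algorithm.

    # Hedged sketch: slowness inversion regularized toward a reference model
    # built from surface-wave Vs.  G, lam, and the Vs-to-Vp scaling are assumptions.
    import numpy as np

    rng = np.random.default_rng(0)
    n_cells, n_rays = 20, 60
    G = rng.uniform(0.0, 50.0, size=(n_rays, n_cells))         # ray length per cell (m)

    true_vp = np.linspace(400.0, 1800.0, n_cells)               # "unknown" P velocities (m/s)
    t_obs = G @ (1.0 / true_vp) + rng.normal(0.0, 1e-3, n_rays)

    vs_ref = np.linspace(220.0, 1000.0, n_cells)                # surface-wave Vs estimates
    m_ref = 1.0 / (1.8 * vs_ref)                                # reference slowness, assuming Vp ~ 1.8 Vs

    # Solve min ||G m - t_obs||^2 + lam ||m - m_ref||^2 in closed form.
    lam = 1e3
    A = G.T @ G + lam * np.eye(n_cells)
    b = G.T @ t_obs + lam * m_ref
    m_est = np.linalg.solve(A, b)
    print("estimated Vp (m/s):", np.round(1.0 / m_est, 0))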
Model Uncertainty Quantification Methods In Data Assimilation
NASA Astrophysics Data System (ADS)
Pathiraja, S. D.; Marshall, L. A.; Sharma, A.; Moradkhani, H.
2017-12-01
Data assimilation involves utilising observations to improve model predictions in a seamless and statistically optimal fashion. Its applications are wide-ranging, from improving weather forecasts to tracking targets such as in the Apollo 11 mission. The use of data assimilation methods in high-dimensional, complex geophysical systems is an active area of research, where there exist many opportunities to enhance existing methodologies. One of the central challenges is model uncertainty quantification: the outcome of any data assimilation study depends strongly on the uncertainties assigned to both the observations and the model. I focus on developing improved model uncertainty quantification methods that are applicable to challenging real-world scenarios. These include methods for cases where the system states are only partially observed, where there is little prior knowledge of the model errors, and where the model error statistics are likely to be highly non-Gaussian.
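To make the role of model error concrete, below is a minimal stochastic ensemble Kalman filter update with an additive Gaussian model-error term; the toy linear dynamics, observation operator, and covariance values are illustrative assumptions, not the methods developed in this work.

    # Hedged sketch: perturbed-observation EnKF with additive model error.
    # The linear toy model and the covariances are illustrative assumptions.
    import numpy as np

    rng = np.random.default_rng(1)
    n_x, n_obs, n_ens = 3, 1, 50

    M = np.array([[0.9, 0.1, 0.0],      # toy linear forecast model
                  [0.0, 0.8, 0.2],
                  [0.1, 0.0, 0.7]])
    H = np.array([[1.0, 0.0, 0.0]])     # only the first state is observed
    Q = 0.05 * np.eye(n_x)              # assumed model-error covariance
    R = 0.10 * np.eye(n_obs)            # observation-error covariance

    ens = rng.normal(0.0, 1.0, size=(n_x, n_ens))    # initial ensemble
    y = np.array([0.8])                               # a single observation

    # Forecast: propagate each member and add a model-error draw.
    ens = M @ ens + rng.multivariate_normal(np.zeros(n_x), Q, n_ens).T

    # Analysis: perturbed-observation EnKF update.
    X = ens - ens.mean(axis=1, keepdims=True)
    P = X @ X.T / (n_ens - 1)
    K = P @ H.T @ np.linalg.inv(H @ P @ H.T + R)
    y_pert = y[:, None] + rng.multivariate_normal(np.zeros(n_obs), R, n_ens).T
    ens = ens + K @ (y_pert - H @ ens)
    print("analysis mean:", np.round(ens.mean(axis=1), 3))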
NASA Astrophysics Data System (ADS)
Kuznetsov, M. V.; Ogorodnikov, I. I.; Vorokh, A. S.
2014-01-01
The state-of-the-art theory and experimental applications of X-ray photoelectron diffraction (XPD) and photoelectron holography (PH) are discussed. These rapidly progressing methods serve to examine the surface atomic structure of solids, including nanostructures formed on surfaces during gas adsorption, epitaxial film growth, etc. The depth of analysis of these methods is several nanometres, which makes it possible to characterize the positions of atoms localized both on and beneath the surface. A remarkable feature of the XPD and PH methods is their sensitivity to the type of atoms examined and, when high energy resolution is available, to the particular chemical form of the element under study. Data on experimental applications of XPD and PH to studies of various surface structures are analyzed and generalized. The bibliography includes 121 references.