IDR: A Participatory Methodology for Interdisciplinary Design in Technology Enhanced Learning
ERIC Educational Resources Information Center
Winters, Niall; Mor, Yishay
2008-01-01
One of the important themes that emerged from the CAL'07 conference was the failure of technology to bring about the expected disruptive effect to learning and teaching. We identify one of the causes as an inherent weakness in prevalent development methodologies. While the problem of designing technology for learning is irreducibly…
Do We Really Want A Fearless Society? Technical Paper No. 40
ERIC Educational Resources Information Center
Fisher, R. Michael
2012-01-01
This paper summarizes the literature across disciplines and cultures that examines the possibility of a "fearless society." The author presents various theories and critical methodologies that critique this literature and yet support its inherent impulse of the Fearlessness Principle. The author suggests, despite the problems of interpretation of…
Preparing Emerging Doctoral Scholars for Transdisciplinary Research: A Developmental Approach
ERIC Educational Resources Information Center
Kemp, Susan Patricia; Nurius, Paula S.
2015-01-01
Research models that bridge disciplinary, theoretical, and methodological boundaries are increasingly common as funders and the public push for effective responses to pressing social problems. Although social work is inherently an integrative discipline, there is growing recognition of the need to better prepare emerging scholars for sophisticated…
The Fundamental Flaws of Immunoassays and Potential Solutions Using Tandem Mass Spectrometry
Hoofnagle, Andrew N.; Wener, Mark H.
2009-01-01
Immunoassays have made it possible to measure dozens of individual proteins and other analytes in human samples for help in establishing the diagnosis and prognosis of disease. In too many cases the results of those measurements are misleading and can lead to unnecessary treatment or missed opportunities for therapeutic interventions. These cases stem from problems inherent to immunoassays performed with human samples, which include a lack of concordance across platforms, autoantibodies, anti-reagent antibodies, and the high-dose hook effect. Tandem mass spectrometry may represent a detection method capable of alleviating many of the flaws inherent to immunoassays. We review our understanding of the problems associated with immunoassays on human specimens and describe methodologies using tandem mass spectrometry that could solve some of those problems. We also provide a critical discussion of the potential pitfalls of novel mass spectrometric approaches in the clinical laboratory. PMID:19538965
Working through the Problems of Study Abroad Using the Methodologies of Religious Studies
ERIC Educational Resources Information Center
Siegler, Elijah
2015-01-01
After illustrating the joys of teaching religious studies abroad with an anecdote from my trip to China, I warn of some of its inherent pedagogical and ethical challenges. I argue that teaching some of the "new directions" in religious studies scholarship might address these challenges. These include a turning away from the abstract…
NASA Technical Reports Server (NTRS)
Howard, R. A.; North, D. W.; Pezier, J. P.
1975-01-01
A new methodology is proposed for integrating planetary quarantine objectives into space exploration planning. This methodology is designed to remedy the major weaknesses inherent in the current formulation of planetary quarantine requirements. Application of the methodology is illustrated by a tutorial analysis of a proposed Jupiter Orbiter mission. The proposed methodology reformulates planetary quarantine planning as a sequential decision problem. Rather than concentrating on a nominal plan, all decision alternatives and possible consequences are laid out in a decision tree. Probabilities and values are associated with the outcomes, including the outcome of contamination. The process of allocating probabilities, which could not be made perfectly unambiguous and systematic, is replaced by decomposition and optimization techniques based on principles of dynamic programming. Thus, the new methodology provides logical integration of all available information and allows selection of the best strategy consistent with quarantine and other space exploration goals.
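The sequential-decision formulation described above can be made concrete with a toy rollback computation: take expectations at chance nodes and maxima at decision nodes, working backward from the outcomes. The tree structure, probabilities, and values below are invented for illustration and are not drawn from the study.

```python
# Toy backward-induction rollback of a mission-planning decision tree.
# All node structure and numbers are hypothetical.

def rollback(node):
    """Return the best expected value of a node by backward induction."""
    kind = node["kind"]
    if kind == "outcome":
        return node["value"]
    if kind == "chance":
        # Expectation over probabilistic outcomes (including contamination).
        return sum(p * rollback(child) for p, child in node["branches"])
    if kind == "decision":
        # Choose the alternative with the highest expected value.
        return max(rollback(child) for child in node["choices"])
    raise ValueError(f"unknown node kind: {kind}")

mission = {
    "kind": "decision",
    "choices": [
        {   # Alternative 1: full sterilization (costly, very low risk)
            "kind": "chance",
            "branches": [
                (0.999, {"kind": "outcome", "value": 80.0}),
                (0.001, {"kind": "outcome", "value": -1000.0}),  # contamination
            ],
        },
        {   # Alternative 2: partial sterilization (cheaper, higher risk)
            "kind": "chance",
            "branches": [
                (0.99, {"kind": "outcome", "value": 95.0}),
                (0.01, {"kind": "outcome", "value": -1000.0}),
            ],
        },
    ],
}

print(rollback(mission))  # expected value of the best strategy: 84.05
```

Laying out all alternatives and consequences this way is what lets quarantine constraints be traded off against other mission values inside one optimization rather than imposed as a separate probability allocation.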
Probabilistic structural mechanics research for parallel processing computers
NASA Technical Reports Server (NTRS)
Sues, Robert H.; Chen, Heh-Chyun; Twisdale, Lawrence A.; Martin, William R.
1991-01-01
Aerospace structures and spacecraft are a complex assemblage of structural components that are subjected to a variety of complex, cyclic, and transient loading conditions. Significant modeling uncertainties are present in these structures, in addition to the inherent randomness of material properties and loads. To properly account for these uncertainties in evaluating and assessing the reliability of these components and structures, probabilistic structural mechanics (PSM) procedures must be used. Much research has focused on basic theory development and the development of approximate analytic solution methods in random vibrations and structural reliability. Practical application of PSM methods has been hampered by their computationally intensive nature. Solution of PSM problems requires repeated analyses of structures that are often large and exhibit nonlinear and/or dynamic response behavior. These methods are all inherently parallel and ideally suited to implementation on parallel processing computers. New hardware architectures and innovative control software and solution methodologies are needed to make solution of large-scale PSM problems practical.
An applicational process for dynamic balancing of turbomachinery shafting
NASA Technical Reports Server (NTRS)
Verhoff, Vincent G.
1990-01-01
The NASA Lewis Research Center has developed and implemented a time-efficient methodology for dynamically balancing turbomachinery shafting. This methodology minimizes costly facility downtime by using a balancing arbor (mandrel) that simulates the turbomachinery (rig) shafting. The need for precision dynamic balancing of turbomachinery shafting and for a dynamic balancing methodology is discussed in detail. Additionally, the inherent problems (and their causes and effects) associated with unbalanced turbomachinery shafting as a function of increasing shaft rotational speeds are discussed. Included are the design criteria concerning rotor weight differentials for rotors made of different materials that have similar parameters and shafting. The balancing methodology for applications where rotor replaceability is a requirement is also covered. This report is intended for use as a reference when designing, fabricating, and troubleshooting turbomachinery shafting.
Teacher stress research: problems and progress.
Pithers, R T
1995-12-01
There is a reasonably large body of published research evidence indicating that teaching is a 'highly' or 'extremely highly' stressful occupation for up to one-third of its professionals. Generalisations such as this one, however, are fraught with problems, ranging from confusion about the definition of stress through to how it should be measured. They include methodological problems inherent in some of the research used to examine teacher stress, as well as confusion about the effect of mediating variables in the production of stress and strain. This paper examines some of the more important and pervasive problems in current research on teacher stress and makes some suggestions for research progress.
An evolving systems-based methodology for healthcare planning.
Warwick, Jon; Bell, Gary
2007-01-01
Healthcare planning seems beset with problems at all hierarchical levels. These are caused by the 'soft' nature of many of the issues present in healthcare planning and the high levels of complexity inherent in healthcare services. There has, in recent years, been a move to utilize systems thinking ideas in an effort to gain a better understanding of the forces at work within the healthcare environment and these have had some success. This paper argues that systems-based methodologies can be further enhanced by metrication and modeling which assist in exploring the changed emergent behavior of a system resulting from management intervention. The paper describes the Holon Framework as an evolving systems-based approach that has been used to help clients understand complex systems (in the education domain) that would have application in the analysis of healthcare problems.
Peirlinck, Mathias; De Beule, Matthieu; Segers, Patrick; Rebelo, Nuno
2018-05-28
Patient-specific biomechanical modeling of the cardiovascular system is complicated by the presence of a physiological pressure load, given that the imaged tissue is in a pre-stressed and pre-strained state. Neglecting this prestressed state in solid tissue mechanics models leads to erroneous metrics (e.g. wall deformation, peak stress, wall shear stress), which are in turn used for device design choices, risk assessment (e.g. procedure, rupture) and surgery planning. It is thus of utmost importance to incorporate this deformed and loaded tissue state into the computational models, which implies solving an inverse problem (calculating an undeformed geometry given the load and the deformed geometry). Methodologies to solve this inverse problem can be categorized into iterative and direct methodologies, both having their inherent advantages and disadvantages. Direct methodologies are typically based on the inverse elastostatics (IE) approach and offer a computationally efficient single-shot methodology to compute the in vivo stress state. However, cumbersome, problem-specific derivations of the formulations and non-trivial access to the finite element analysis (FEA) code, especially for commercial products, have prevented broad implementation of these methodologies. For that reason, we developed a novel, modular IE approach and implemented this methodology in a commercial FEA solver with minor user subroutine interventions. The accuracy of this methodology was demonstrated on an arterial tube and a porcine biventricular myocardium model. The computational power and efficiency of the methodology were shown by computing the in vivo stress and strain state, and the corresponding unloaded geometry, for two models containing multiple interacting incompressible, anisotropic (fiber-embedded) and hyperelastic material behaviors: a patient-specific abdominal aortic aneurysm and a full 4-chamber heart model. Copyright © 2018 Elsevier Ltd. All rights reserved.
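The iterative family of methodologies mentioned above can be sketched as a simple fixed-point update: guess an unloaded geometry, run the forward load, and correct the guess by the mismatch with the imaged geometry. Everything below is an illustrative assumption; `solve_forward` is a toy stand-in for a nonlinear FEA solve, not the authors' inverse elastostatics implementation.

```python
# Fixed-point sketch of the inverse problem: find reference coordinates X
# whose forward (pressurized) deformation reproduces the imaged coordinates.
import numpy as np

def solve_forward(X, pressure):
    # Toy forward model standing in for a nonlinear finite element solve.
    return X * (1.0 + 0.05 * pressure)   # uniform "inflation", illustration only

def find_unloaded_geometry(x_imaged, pressure, tol=1e-10, max_iter=50):
    X = x_imaged.copy()                   # initial guess: the imaged geometry
    for _ in range(max_iter):
        x = solve_forward(X, pressure)    # load the current reference guess
        residual = x_imaged - x           # mismatch against the imaged state
        X = X + residual                  # fixed-point update of the reference
        if np.linalg.norm(residual) < tol:
            break
    return X

x_imaged = np.array([10.0, 12.0, 9.5])    # toy "imaged" nodal coordinates
X0 = find_unloaded_geometry(x_imaged, pressure=1.0)
print(X0, solve_forward(X0, 1.0))          # X0 reloads onto the imaged state
```

Each pass requires a full forward solve, which is exactly the computational cost that motivates the single-shot direct (IE) alternative pursued in the paper.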
Suzuki, Masahiko; Mitoma, Hiroshi; Yoneyama, Mitsuru
2017-01-01
Long-term and objective monitoring is necessary for full assessment of the condition of patients with Parkinson's disease (PD). Recent advances in biotechnology have seen the development of various types of wearable (body-worn) sensor systems. By using accelerometers and gyroscopes, these devices can quantify motor abnormalities, including decreased activity and gait disturbances, as well as nonmotor signs, such as sleep disturbances and autonomic dysfunctions in PD. This review discusses methodological problems inherent in wearable devices. Until now, analysis of the mean values of motion-induced signals on a particular day has been widely applied in the clinical management of PD patients. On the other hand, the reliability of these devices to detect various events, such as freezing of gait and dyskinesia, has been less than satisfactory. Quantification of disease-specific changes rather than nonspecific changes is necessary. PMID:28607801
Bialas, Andrzej
2011-01-01
Intelligent sensors experience security problems very similar to those inherent to other kinds of IT products or systems. Assurance methodologies for the creation of these products or systems, such as Common Criteria (ISO/IEC 15408), can be used to improve the robustness of sensor systems in high-risk environments. The paper presents the background and results of previous research on patterns-based security specifications and introduces a new ontological approach. The elaborated ontology and knowledge base were validated on the IT security development process using the sensor example. The contribution of the paper concerns the application of knowledge engineering methodology to the previously developed Common Criteria-compliant and pattern-based method for intelligent sensor security development. The issue presented in the paper has broader significance in that it can solve information security problems in many application domains. PMID:22164064
LES, DNS, and RANS for the Analysis of High-Speed Turbulent Reacting Flows
NASA Technical Reports Server (NTRS)
Colucci, P. J.; Jaberi, F. A.; Givi, P.
1996-01-01
A filtered density function (FDF) method suitable for chemically reactive flows is developed in the context of large eddy simulation. The advantage of the FDF methodology is its inherent ability to resolve subgrid-scale (SGS) scalar correlations that otherwise have to be modeled. Because of the lack of robust models to accurately predict these correlations in turbulent reactive flows, simulations involving turbulent combustion are often met with a degree of skepticism. The FDF methodology avoids the closure problem associated with these terms and treats the reaction in an exact manner. The scalar FDF approach is particularly attractive since it can be coupled with existing hydrodynamic computational fluid dynamics (CFD) codes.
Applications of Support Vector Machines In Chemo And Bioinformatics
NASA Astrophysics Data System (ADS)
Jayaraman, V. K.; Sundararajan, V.
2010-10-01
Conventional linear and nonlinear tools for classification, regression and data-driven modeling are being replaced at a rapid pace by newer techniques and tools based on artificial intelligence and machine learning. While linear techniques are not applicable to inherently nonlinear problems, the newer methods serve as attractive alternatives for solving real-life problems. Support Vector Machine (SVM) classifiers are a set of universal feed-forward-network-based classification algorithms formulated from statistical learning theory and the structural risk minimization principle. SVM regression closely follows the classification methodology. In this work, recent applications of SVM in chemo- and bioinformatics are described with suitable illustrative examples.
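As a minimal sketch of the classification side, the example below trains an RBF-kernel SVM on a synthetic nonlinear two-class problem using scikit-learn, an assumed tool not named by the authors.

```python
# Minimal SVM classification sketch on a nonlinearly separable dataset.
from sklearn.datasets import make_moons
from sklearn.model_selection import train_test_split
from sklearn.pipeline import make_pipeline
from sklearn.preprocessing import StandardScaler
from sklearn.svm import SVC

X, y = make_moons(n_samples=400, noise=0.2, random_state=0)
X_train, X_test, y_train, y_test = train_test_split(X, y, random_state=0)

# Feature scaling matters for SVMs; C and gamma trade margin width vs. fit.
clf = make_pipeline(StandardScaler(), SVC(kernel="rbf", C=1.0, gamma="scale"))
clf.fit(X_train, y_train)
print("test accuracy:", clf.score(X_test, y_test))
```

The kernel trick is what lets the maximum-margin machinery handle the inherently nonlinear problems the abstract alludes to.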
Formal and heuristic system decomposition methods in multidisciplinary synthesis. Ph.D. Thesis, 1991
NASA Technical Reports Server (NTRS)
Bloebaum, Christina L.
1991-01-01
The multidisciplinary interactions which exist in large scale engineering design problems provide a unique set of difficulties. These difficulties are associated primarily with unwieldy numbers of design variables and constraints, and with the interdependencies of the discipline analysis modules. Such obstacles require design techniques which account for the inherent disciplinary couplings in the analyses and optimizations. The objective of this work was to develop an efficient holistic design synthesis methodology that takes advantage of the synergistic nature of integrated design. A general decomposition approach for optimization of large engineering systems is presented. The method is particularly applicable for multidisciplinary design problems which are characterized by closely coupled interactions among discipline analyses. The advantage of subsystem modularity allows for implementation of specialized methods for analysis and optimization, computational efficiency, and the ability to incorporate human intervention and decision making in the form of an expert systems capability. The resulting approach is not a method applicable to only a specific situation, but rather, a methodology which can be used for a large class of engineering design problems in which the system is non-hierarchic in nature.
NASA Astrophysics Data System (ADS)
Lee, Hyunki; Kim, Min Young; Moon, Jeon Il
2017-12-01
Phase measuring profilometry and moiré methodology have been widely applied to the three-dimensional shape measurement of target objects because of their high measuring speed and accuracy. However, these methods suffer from an inherent limitation known as the correspondence problem, or 2π-ambiguity problem. Although a sensing method that combines well-known stereo vision with the phase measuring profilometry (PMP) technique has been developed to overcome this problem, it still requires definite improvement in sensing speed and measurement accuracy. We propose a dynamic programming-based stereo PMP method to acquire more reliable depth information in a relatively short time. The proposed method efficiently fuses information from two stereo sensors in terms of phase and intensity simultaneously, based on a newly defined cost function for dynamic programming. In addition, the important parameters are analyzed from the viewpoint of the 2π-ambiguity problem and measurement accuracy. To analyze the influence of important hardware and software parameters on measurement performance and to verify the method's efficiency, accuracy, and sensing speed, a series of experimental tests was performed with various objects and sensor configurations.
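A toy version of the central idea, assuming a single 1-D scanline and invented cost weights: dynamic programming picks a disparity path that jointly minimizes a combined phase-plus-intensity matching cost and a smoothness penalty. This is only a sketch of the general scheme, not the paper's cost function.

```python
# Scanline dynamic program fusing phase and intensity matching costs.
import numpy as np

def dp_disparity(phase_l, phase_r, inten_l, inten_r, max_disp,
                 w_phase=1.0, w_int=0.5, smooth=0.2):
    n, disps, INF = len(phase_l), np.arange(max_disp + 1), 1e18
    cost = np.full((n, max_disp + 1), INF)
    back = np.zeros((n, max_disp + 1), dtype=int)

    def data_cost(i, d):
        j = i - d
        if j < 0:
            return INF
        dphi = np.angle(np.exp(1j * (phase_l[i] - phase_r[j])))  # wrapped diff
        return w_phase * abs(dphi) + w_int * abs(inten_l[i] - inten_r[j])

    for d in disps:
        cost[0, d] = data_cost(0, d)
    for i in range(1, n):
        for d in disps:
            prev = cost[i - 1] + smooth * np.abs(disps - d)  # smoothness term
            back[i, d] = int(np.argmin(prev))
            cost[i, d] = data_cost(i, d) + prev[back[i, d]]

    path = np.zeros(n, dtype=int)                 # backtrack the best path
    path[-1] = int(np.argmin(cost[-1]))
    for i in range(n - 1, 0, -1):
        path[i - 1] = back[i, path[i]]
    return path

rng = np.random.default_rng(0)
phase_r, inten_r = rng.uniform(-np.pi, np.pi, 64), rng.uniform(0, 1, 64)
phase_l, inten_l = np.roll(phase_r, 3), np.roll(inten_r, 3)  # true disparity 3
print(dp_disparity(phase_l, phase_r, inten_l, inten_r, max_disp=5)[8:16])
```

Because the wrapped phase difference is ambiguous on its own, adding the intensity term and the smoothness prior is what disambiguates the 2π-ambiguity along the scanline.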
Evaluation of a methodology for model identification in the time domain
NASA Technical Reports Server (NTRS)
Beck, R. T.; Beck, J. L.
1988-01-01
A model identification methodology for structural dynamics has been applied to simulated vibrational data as a first step in evaluating its accuracy. The evaluation has taken into account a wide variety of factors which affect the accuracy of the procedure. The effects of each of these factors were observed in both the response time histories and the estimates of the parameters of the model by comparing them with the exact values of the system. Each factor was varied independently but combinations of these have also been considered in an effort to simulate real situations. The results of the tests have shown that for the chain model, the procedure yields robust estimates of the stiffness parameters under the conditions studied whenever uniqueness is ensured. When inaccuracies occur in the results, they are intimately related to non-uniqueness conditions inherent in the inverse problem and not to shortcomings in the methodology.
Analysis of harmonic spline gravity models for Venus and Mars
NASA Technical Reports Server (NTRS)
Bowin, Carl
1986-01-01
Methodology utilizing harmonic splines for determining the true gravity field from Line-Of-Sight (LOS) acceleration data from planetary spacecraft missions was tested. As is well known, the LOS data incorporate errors in the zero reference level that appear to be inherent in the processing procedure used to obtain the LOS vectors. The proposed method offers a solution to this problem. The harmonic spline program was converted from the VAX 11/780 to the Ridge 32C computer. A problem with the matrix inversion routine was solved, improving inversion of the data matrices used in the Optimum Estimate program for global Earth studies. The problem of obtaining a successful matrix inversion for a single rev supplemented by data for the two adjacent revs still remains.
Yu, Xue; Chen, Wei-Neng; Gu, Tianlong; Zhang, Huaxiang; Yuan, Huaqiang; Kwong, Sam; Zhang, Jun
2018-07-01
This paper studies a specific class of multiobjective combinatorial optimization problems (MOCOPs), namely the permutation-based MOCOPs. Many commonly seen MOCOPs, e.g., the multiobjective traveling salesman problem (MOTSP) and the multiobjective project scheduling problem (MOPSP), belong to this problem class, and they can be very different. However, as the permutation-based MOCOPs share the inherent similarity that the structure of their search space is usually in the shape of a permutation tree, this paper proposes a generic multiobjective set-based particle swarm optimization methodology based on decomposition, termed MS-PSO/D. To coordinate with the properties of permutation-based MOCOPs, MS-PSO/D utilizes an element-based representation and a constructive approach, through which feasible solutions under constraints can be generated step by step following the permutation-tree-shaped structure, with problem-related heuristic information introduced in the constructive approach for efficiency. To address the multiobjective optimization issues, a decomposition strategy is employed, in which the problem is converted into multiple single-objective subproblems according to a set of weight vectors. In addition, a flexible mechanism for diversity control is provided in MS-PSO/D. Extensive experiments have been conducted to study MS-PSO/D on two permutation-based MOCOPs, namely the MOTSP and the MOPSP. Experimental results validate that the proposed methodology is promising.
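The decomposition strategy can be shown independently of the swarm machinery: a set of weight vectors converts the multiobjective problem into scalar subproblems, each solved for its own best candidate. The Tchebycheff scalarization below is one common choice, assumed here for illustration, and the objective values are invented.

```python
# Decomposition sketch: weight vectors turn a bi-objective problem into
# single-objective subproblems via Tchebycheff scalarization.
import numpy as np

def tchebycheff(f, weights, ideal):
    """Scalarized cost of objective vector f under one weight vector."""
    return float(np.max(weights * np.abs(f - ideal)))

W = np.array([[w, 1.0 - w] for w in np.linspace(0.05, 0.95, 10)])  # weights
ideal = np.array([0.0, 0.0])               # (approximate) ideal point

# Candidate solutions' objectives (e.g., tour length vs. makespan), invented.
F = np.array([[1.0, 9.0], [3.0, 5.0], [5.0, 3.0], [9.0, 1.0]])

for w in W:                                # each subproblem picks its champion
    best = F[np.argmin([tchebycheff(f, w, ideal) for f in F])]
    print(w.round(2), "->", best)
```

Sweeping the weight vectors recovers a spread of trade-off solutions, which is the role the decomposition plays inside MS-PSO/D.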
Automated control of hierarchical systems using value-driven methods
NASA Technical Reports Server (NTRS)
Pugh, George E.; Burke, Thomas E.
1990-01-01
An introduction is given to the Value-driven methodology, which has been successfully applied to solve a variety of difficult decision, control, and optimization problems. Many real-world decision processes (e.g., those encountered in scheduling, allocation, and command and control) involve a hierarchy of complex planning considerations. For such problems it is virtually impossible to define a fixed set of rules that will operate satisfactorily over the full range of probable contingencies. Decision Science Applications' value-driven methodology offers a systematic way of automating the intuitive, common-sense approach used by human planners. The inherent responsiveness of value-driven systems to user-controlled priorities makes them particularly suitable for semi-automated applications in which the user must remain in command of the system's operation. Three examples of the practical application of the approach in the automation of hierarchical decision processes are discussed: the TAC Brawler air-to-air combat simulation is a four-level computerized hierarchy; the autonomous underwater vehicle mission planning system is a three-level control system; and the Space Station Freedom electrical power control and scheduling system is designed as a two-level hierarchy. The methodology is compared with rule-based systems and with other more widely-known optimization techniques.
Methodology for astronaut reconditioning research.
Beard, David J; Cook, Jonathan A
2017-01-01
Space medicine offers some unique challenges, especially in terms of research methodology. A specific challenge for astronaut reconditioning involves identification of what aspects of terrestrial research methodology hold and which require modification. This paper reviews this area and presents appropriate solutions where possible. It is concluded that spaceflight rehabilitation research should remain question/problem driven and is broadly similar to the terrestrial equivalent on small populations, such as rare diseases and various sports. Astronauts and Medical Operations personnel should be involved at all levels to ensure feasibility of research protocols. There is room for creative and hybrid methodology but careful systematic observation is likely to be more achievable and fruitful than complex trial based comparisons. Multi-space agency collaboration will be critical to pool data from small groups of astronauts with the accepted use of standardised outcome measures across all agencies. Systematic reviews will be an essential component. Most limitations relate to the inherent small sample size available for human spaceflight research. Early adoption of a co-operative model for spaceflight rehabilitation research is therefore advised. Copyright © 2016 Elsevier Ltd. All rights reserved.
Chambaron, Stéphanie; Ginhac, Dominique; Perruchet, Pierre
2008-05-01
Serial reaction time tasks and, more generally, the visual-motor sequential paradigms are increasingly popular tools in a variety of research domains, from studies on implicit learning in laboratory contexts to the assessment of residual learning capabilities of patients in clinical settings. A consequence of this success, however, is the increased variability in paradigms and the difficulty inherent in respecting the methodological principles that two decades of experimental investigations have made more and more stringent. The purpose of the present article is to address those problems. We present a user-friendly application that simplifies running classical experiments, but is flexible enough to permit a broad range of nonstandard manipulations for more specific objectives. Basic methodological guidelines are also provided, as are suggestions for using the software to explore unconventional directions of research. The most recent version of gSRT-Soft may be obtained for free by contacting the authors.
Distributed control topologies for deep space formation flying spacecraft
NASA Technical Reports Server (NTRS)
Hadaegh, F. Y.; Smith, R. S.
2002-01-01
A formation of satellites flying in deep space can be specified in terms of the relative satellite positions and absolute satellite orientations. The redundancy in the relative position specification generates a family of control topologies with equivalent stability and reference tracking performance, one of which can be implemented without requiring communication between the spacecraft. A relative position design formulation is inherently unobservable, and a methodology for circumventing this problem is presented. Additional redundancy in the control actuation space can be exploited for feed-forward control of the formation centroid's location in space, or for minimization of total fuel consumption.
A software engineering approach to expert system design and verification
NASA Technical Reports Server (NTRS)
Bochsler, Daniel C.; Goodwin, Mary Ann
1988-01-01
Software engineering design and verification methods for developing expert systems are not yet well defined. Integration of expert system technology into software production environments will require effective software engineering methodologies to support the entire life cycle of expert systems. The software engineering methods used to design and verify an expert system, RENEX, are discussed. RENEX demonstrates autonomous rendezvous and proximity operations, including replanning trajectory events and subsystem fault detection, onboard a space vehicle during flight. The RENEX designers utilized a number of software engineering methodologies to deal with the complex problems inherent in this system. An overview is presented of the methods utilized. Details of the verification process receive special emphasis. The benefits and weaknesses of the methods for supporting the development life cycle of expert systems are evaluated, and recommendations are made based on the overall experiences with the methods.
Current Trends in Modeling Research for Turbulent Aerodynamic Flows
NASA Technical Reports Server (NTRS)
Gatski, Thomas B.; Rumsey, Christopher L.; Manceau, Remi
2007-01-01
The engineering tools of choice for the computation of practical engineering flows have begun to migrate from those based on the traditional Reynolds-averaged Navier-Stokes approach to methodologies capable, in theory if not in practice, of accurately predicting some instantaneous scales of motion in the flow. The migration has largely been driven both by the success of Reynolds-averaged methods over a wide variety of flows and by the inherent limitations of the method itself. Practitioners, emboldened by their ability to predict a wide variety of statistically steady, equilibrium turbulent flows, have now turned their attention to flow control and non-equilibrium flows, that is, separation control. This review gives some current priorities in traditional Reynolds-averaged modeling research as well as some methodologies being applied to a new class of turbulent flow control problems.
Optimal placement of actuators and sensors in control augmented structural optimization
NASA Technical Reports Server (NTRS)
Sepulveda, A. E.; Schmit, L. A., Jr.
1990-01-01
A control-augmented structural synthesis methodology is presented in which actuator and sensor placement is treated in terms of (0,1) variables. Structural member sizes and control variables are treated simultaneously as design variables. A multiobjective utopian approach is used to obtain a compromise solution for inherently conflicting objective functions such as structural mass, control effort, and number of actuators. Constraints are imposed on transient displacements, natural frequencies, actuator forces and dynamic stability, as well as controllability and observability of the system. The combinatorial aspects of the mixed (0,1)-continuous variable design optimization problem are made tractable by combining approximation concepts with branch and bound techniques. Some numerical results for example problems are presented to illustrate the efficacy of the design procedure set forth.
A New Concurrent Multiscale Methodology for Coupling Molecular Dynamics and Finite Element Analyses
NASA Technical Reports Server (NTRS)
Yamakov, Vesselin; Saether, Erik; Glaessgen, Edward H.
2008-01-01
The coupling of molecular dynamics (MD) simulations with finite element methods (FEM) yields computationally efficient models that link fundamental material processes at the atomistic level with continuum field responses at higher length scales. The theoretical challenge involves developing a seamless connection along an interface between two inherently different simulation frameworks. Various specialized methods have been developed to solve particular classes of problems. Many of these methods link the kinematics of individual MD atoms with FEM nodes at their common interface, necessarily requiring that the finite element mesh be refined to atomic resolution. Some of these coupling approaches also require simulations to be carried out at 0 K and restrict modeling to two-dimensional material domains due to difficulties in simulating full three-dimensional material processes. In the present work, a new approach to MD-FEM coupling is developed based on a restatement of the standard boundary value problem used to define a coupled domain. The method replaces a direct linkage of individual MD atoms and finite element (FE) nodes with a statistical averaging of atomistic displacements in local atomic volumes associated with each FE node in an interface region. The FEM and MD computational systems are effectively independent and communicate only through an iterative update of their boundary conditions. With the use of statistical averages of the atomistic quantities to couple the two computational schemes, the developed approach is referred to as an embedded statistical coupling method (ESCM). ESCM provides an enhanced coupling methodology that is inherently applicable to three-dimensional domains, avoids discretization of the continuum model to atomic scale resolution, and permits finite temperature states to be applied.
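The interface averaging that distinguishes ESCM can be sketched in a few lines: each FE interface node receives the statistical mean displacement of the atoms in its local atomic volume, which smooths thermal fluctuations and is what permits finite-temperature coupling. Geometry, radii, and data below are invented.

```python
# Sketch of ESCM-style interface averaging: per-node means of atomistic
# displacements within a local atomic volume around each FE interface node.
import numpy as np

rng = np.random.default_rng(1)
atoms = rng.uniform(0.0, 10.0, size=(500, 3))              # atom positions
atom_disp = 0.01 * atoms + rng.normal(0, 0.002, (500, 3))  # field + thermal noise

fe_nodes = np.array([[2.0, 5.0, 5.0], [8.0, 5.0, 5.0]])    # interface FE nodes
r_avg = 2.0                                                # averaging radius

nodal_bc = []
for node in fe_nodes:
    mask = np.linalg.norm(atoms - node, axis=1) < r_avg    # local atomic volume
    nodal_bc.append(atom_disp[mask].mean(axis=0))          # statistical average
print(np.array(nodal_bc))   # displacement boundary conditions for the FE side
```

Because only these averages cross the interface, the MD and FE systems stay effectively independent and communicate solely through iterative boundary-condition updates, as the abstract describes.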
Challenges in conducting qualitative research in health: A conceptual paper.
Khankeh, Hamidreza; Ranjbar, Maryam; Khorasani-Zavareh, Davoud; Zargham-Boroujeni, Ali; Johansson, Eva
2015-01-01
Qualitative research focuses on social world and provides the tools to study health phenomena from the perspective of those experiencing them. Identifying the problem, forming the question, and selecting an appropriate methodology and design are some of the initial challenges that researchers encounter in the early stages of any research project. These problems are particularly common for novices. This article describes the practical challenges of using qualitative inquiry in the field of health and the challenges of performing an interpretive research based on professional experience as a qualitative researcher and on available literature. One of the main topics discussed is the nature of qualitative research, its inherent challenges, and how to overcome them. Some of those highlighted here include: identification of the research problem, formation of the research question/aim, and selecting an appropriate methodology and research design, which are the main concerns of qualitative researchers and need to be handled properly. Insights from real-life experiences in conducting qualitative research in health reveal these issues. The paper provides personal comments on the experiences of a researcher in conducting pure qualitative research in the field of health. It offers insights into the practical difficulties encountered when performing qualitative studies and offers solutions and alternatives applied by these authors, which may be of use to others.
Cancer heterogeneity: origins and implications for genetic association studies
Urbach, Davnah; Lupien, Mathieu; Karagas, Margaret R.; Moore, Jason H.
2012-01-01
Genetic association studies have become standard approaches to characterize the genetic and epigenetic variability associated with cancer development, including predispositions and mutations. However, the bewildering genetic and phenotypic heterogeneity inherent in cancer both magnifies the conceptual and methodological problems associated with these approaches and renders the translation of available genetic information into a knowledge that is both biologically sound and clinically relevant difficult. Here, we elaborate on the underlying causes of this complexity, illustrate why it represents a challenge for genetic association studies, and briefly discuss how it can be reconciled with the ultimate goal of identifying targetable disease pathways and successfully treating individual patients. PMID:22858414
Crowdsourcing biomedical research: leveraging communities as innovation engines
Saez-Rodriguez, Julio; Costello, James C.; Friend, Stephen H.; Kellen, Michael R.; Mangravite, Lara; Meyer, Pablo; Norman, Thea; Stolovitzky, Gustavo
2018-01-01
The generation of large-scale biomedical data is creating unprecedented opportunities for basic and translational science. Typically, the data producers perform initial analyses, but it is very likely that the most informative methods may reside with other groups. Crowdsourcing the analysis of complex and massive data has emerged as a framework to find robust methodologies. When the crowdsourcing is done in the form of collaborative scientific competitions, known as Challenges, the validation of the methods is inherently addressed. Challenges also encourage open innovation, create collaborative communities to solve diverse and important biomedical problems, and foster the creation and dissemination of well-curated data repositories. PMID:27418159
Multi-criteria analysis of potential recovery facilities in a reverse supply chain
NASA Astrophysics Data System (ADS)
Nukala, Satish; Gupta, Surendra M.
2005-11-01
The Analytic Hierarchy Process (AHP) has been employed by researchers for solving multi-criteria analysis problems. However, AHP is often criticized for its unbalanced scale of judgments and its failure to precisely handle the inherent uncertainty and vagueness in carrying out the pair-wise comparisons. With the objective of addressing these drawbacks, in this paper we employ a fuzzy approach to selecting potential recovery facilities in the strategic planning of a reverse supply chain network, one that addresses the decision maker's level of confidence in the fuzzy assessments and his/her attitude towards risk. A numerical example is considered to illustrate the methodology.
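For orientation, here is a crisp (non-fuzzy) AHP sketch: a priority vector from one pairwise comparison matrix via the geometric-mean method, with Saaty's consistency check. The fuzzy approach in the paper replaces these crisp judgments with fuzzy numbers; the matrix below is invented.

```python
# Crisp AHP sketch: priorities and consistency for one comparison matrix.
import numpy as np

A = np.array([[1.0, 3.0, 5.0],      # facility 1 vs. 2 vs. 3 on one criterion
              [1/3., 1.0, 2.0],
              [1/5., 1/2., 1.0]])

gm = A.prod(axis=1) ** (1.0 / A.shape[0])    # row geometric means
w = gm / gm.sum()                            # normalized priority vector

lam_max = float((A @ w / w).mean())          # principal eigenvalue estimate
n = A.shape[0]
CI = (lam_max - n) / (n - 1)                 # consistency index
RI = {3: 0.58, 4: 0.90, 5: 1.12}[n]          # Saaty's random index
print("weights:", w.round(3), " CR:", round(CI / RI, 3))  # CR < 0.1 acceptable
```

The fuzzy variant carries lower/modal/upper judgment values through the same pipeline, which is how the decision maker's confidence level and risk attitude enter.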
Gomez-Ramirez, Jaime; Sanz, Ricardo
2013-09-01
One of the most important scientific challenges today is the quantitative and predictive understanding of biological function. Classical mathematical and computational approaches have been enormously successful in modeling inert matter, but they may be inadequate to address inherent features of biological systems. We address the conceptual and methodological obstacles that lie in the inverse problem in biological systems modeling. We introduce a full Bayesian approach (FBA), a theoretical framework to study biological function, in which probability distributions are conditional on biophysical information that physically resides in the biological system that is studied by the scientist. Copyright © 2013 Elsevier Ltd. All rights reserved.
Research Challenges Inherent in Determining Improvement in University Teaching
ERIC Educational Resources Information Center
Devlin, Marcia
2008-01-01
Using a recent study that examined the effectiveness of a particular approach to improving individual university teaching as a case study, this paper examines some of the challenges inherent in educational research, particularly research examining the effects of interventions to improve teaching. Aspects of the research design and methodology and…
Rapid Design of Gravity Assist Trajectories
NASA Technical Reports Server (NTRS)
Carrico, J.; Hooper, H. L.; Roszman, L.; Gramling, C.
1991-01-01
Several International Solar Terrestrial Physics (ISTP) missions require the design of complex gravity-assisted trajectories in order to investigate the interaction of the solar wind with the Earth's magnetic field. These trajectories present a formidable trajectory design and optimization problem. The philosophy and methodology that enable an analyst to design and analyze such trajectories are discussed. So-called 'floating end point' targeting, which allows the inherently nonlinear multiple-body problem to be solved with simple linear techniques, is described. The combination of floating end point targeting and analytic approximations with a Newton-method targeter is demonstrated to achieve trajectory design goals quickly, even for the very sensitive double lunar swingby trajectories used by the ISTP missions. A multiconic orbit integration scheme allows fast and accurate orbit propagation. A prototype software tool, Swingby, built for trajectory design and launch window analysis, is described.
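The targeting loop itself is generic differential correction: propagate, measure the miss against the goals, and take a Newton step using finite-difference sensitivities. The toy ballistic `propagate` below merely stands in for Swingby's multiconic propagator; the model and numbers are invented.

```python
# Differential-correction (Newton) targeting sketch: adjust controls until
# the propagated end conditions hit the goals.
import numpy as np

def propagate(v):                        # v = [vx, vy] launch velocity
    g = 9.81
    t_flight = 2.0 * v[1] / g            # toy flat-Earth ballistic arc
    return np.array([v[0] * t_flight])   # achieved downrange distance

def newton_target(goal, v0, tol=1e-9, h=1e-6):
    v = v0.copy()
    for _ in range(25):
        miss = propagate(v) - goal
        if np.linalg.norm(miss) < tol:
            break
        # Finite-difference sensitivity of end conditions to the controls.
        J = np.column_stack([(propagate(v + h * e) - propagate(v)) / h
                             for e in np.eye(len(v))])
        v = v - np.linalg.pinv(J) @ miss  # minimum-norm Newton correction
    return v

v = newton_target(goal=np.array([1000.0]), v0=np.array([50.0, 50.0]))
print(v, propagate(v))    # a velocity that achieves the 1000 m target range
```

Floating the end point amounts to choosing goal variables in which the problem behaves nearly linearly, so a few corrections of this kind suffice even for sensitive double lunar swingby trajectories.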
Methodological Limitations of the Application of Expert Systems Methodology in Reading.
ERIC Educational Resources Information Center
Willson, Victor L.
Methodological deficiencies inherent in expert-novice reading research make it impossible to draw inferences about curriculum change. First, comparisons of intact groups are often used as a basis for making causal inferences about how observed characteristics affect behaviors. While comparing different groups is not by itself a useless activity,…
Innovations in design and technology. The story of hip arthroplasty.
Amstutz, H C
2000-09-01
The current study reviews the early history of surgeon-initiated trial-and-error development in hip joint arthroplasty and the subsequent methodological evolution to proper criteria for hypothesis testing using bioengineers and other research scientists. The interplay and relationships to industry, universities, scientific organizations, and the Food and Drug Administration with respect to device development in hip arthroplasty are reviewed. The ethics of and responsibilities to involved parties are outlined, citing the history of many contemporary developments. Examples are provided from the evolution and introduction of unsuccessful innovations, and the problems inherent in the current methodology of the Food and Drug Administration approval process using the 510(k), Investigational Device Exemption, and Premarket Approval protocols. The pros and cons of randomized trials for devices are outlined, with the conclusion that they are not appropriate for device introduction. The proper, rational methodology for introduction of new devices is a phased-in clinical trial process after pertinent bench testing. Finally, the ethical dilemmas created by managed care are addressed. Industry involvements of the surgeon-spokesmen are cited.
Travel into a fairy land: a critique of modern qualitative and mixed methods psychologies.
Toomela, Aaro
2011-03-01
In this article modern qualitative and mixed methods approaches are criticized from the standpoint of structural-systemic epistemology. It is suggested that modern qualitative methodologies suffer from several fallacies: some of them are grounded on inherently contradictory epistemology; others ask scientific questions after the methods have been chosen; conduct studies inductively, so that not only answers but even the questions are supposed to be discovered; do not create artificial situations and constraints on study situations; are adevelopmental by nature; study not external things and phenomena but symbols and representations, so that the object of study often turns out to be the researcher rather than the researched; rely on ambiguous data interpretation methods based to a large degree on feelings and opinions; aim to understand the unique, which is theoretically impossible; or have theoretical problems with sampling. Any one of these fallacies would be sufficient to exclude any possibility of achieving structural-systemic understanding of the studied things and phenomena. It also turns out that modern qualitative methodologies share several fallacies with the quantitative methodology. Therefore mixed methods approaches are not able to overcome the fundamental difficulties that characterize the component methods taken separately. It is proposed that the structural-systemic methodology that dominated psychological thought in pre-WWII continental Europe is philosophically and theoretically better grounded than the other methodologies that can be distinguished in psychology today. Future psychology should be based on structural-systemic methodology.
Environment, genes, and experience: lessons from behavior genetics.
Barsky, Philipp I
2010-11-01
The article reviews the theoretical analysis of the problems inherent in studying the environment within behavior genetics across several periods in the development of environmental studies in behavior genetics and proposes some possible alternatives to traditional approaches to studying the environment in behavior genetics. The first period (from the end of the 1920s to the end of the 1970s), when the environment was not actually studied, is called pre-environmental; during this time, the basic principles and theoretical models of understanding environmental effects in behavior genetics were developed. The second period is characterized by the development of studies on environmental influences within the traditional behavior genetics paradigm; several approaches to studying the environment emerged in behavior genetics during this period, from the beginning of the 1980s until today. At the present time, the field is undergoing paradigmatic changes, concerned with methodology, theory, and mathematical models of genotype-environment interplay; this might be the beginning of a third period of development of environmental studies in behavior genetics. In another part, the methodological problems related to environmental studies in behavior genetics are discussed. Although the methodology used in differential psychology is applicable for assessment of differences between individuals, it is insufficient to explain the sources of these differences. In addition, we stress that psychoanalytic studies of twins and their experiences, initiated in the 1930s and continued episodically until the 1980s, could bring an interesting methodology and contribute to the explanation of puzzling findings from environmental studies of behavior genetics. Finally, we will conclude with implications from the results of environmental studies in behavior genetics, including methodological issues. Copyright © 2010 Elsevier Ltd. All rights reserved.
ERIC Educational Resources Information Center
Seay, Jeffrey R.; Eden, Mario R.
2008-01-01
This paper introduces, via case study example, the benefit of including risk assessment methodology and inherently safer design practices into the curriculum for chemical engineering students. This work illustrates how these tools can be applied during the earliest stages of conceptual process design. The impacts of decisions made during…
Bringing the Unidata IDV to the Cloud
NASA Astrophysics Data System (ADS)
Fisher, W. I.; Oxelson Ganter, J.
2015-12-01
Maintaining software compatibility across new computing environments and the associated underlying hardware is a common problem for software engineers and scientific programmers. While traditional software engineering provides a suite of tools and methodologies which may mitigate this issue, they are typically ignored by developers lacking a background in software engineering. Causing further problems, these methodologies are best applied at the start of a project; trying to apply them to an existing, mature project can require an immense effort. Visualization software is particularly vulnerable to this problem, given the inherent dependency on particular graphics hardware and software APIs. As a result of these issues, there exists a large body of software which is simultaneously critical to the scientists who depend upon it and yet increasingly difficult to maintain. The solution to this problem was partially provided with the advent of Cloud Computing: Application Streaming. This technology allows a program to run entirely on a remote virtual machine while still allowing for interactivity and dynamic visualizations, with little to no re-engineering required. When coupled with containerization technology such as Docker, we are able to easily bring the same visualization software to a desktop, a netbook, a smartphone, and the next generation of hardware, whatever it may be. Unidata has been able to harness Application Streaming to provide a tablet-compatible version of our visualization software, the Integrated Data Viewer (IDV). This work will examine the challenges associated with adapting the IDV to an application streaming platform and include a brief discussion of the underlying technologies involved.
Challenges in conducting qualitative research in health: A conceptual paper
Khankeh, Hamidreza; Ranjbar, Maryam; Khorasani-Zavareh, Davoud; Zargham-Boroujeni, Ali; Johansson, Eva
2015-01-01
Background: Qualitative research focuses on social world and provides the tools to study health phenomena from the perspective of those experiencing them. Identifying the problem, forming the question, and selecting an appropriate methodology and design are some of the initial challenges that researchers encounter in the early stages of any research project. These problems are particularly common for novices. Materials and Methods: This article describes the practical challenges of using qualitative inquiry in the field of health and the challenges of performing an interpretive research based on professional experience as a qualitative researcher and on available literature. Results: One of the main topics discussed is the nature of qualitative research, its inherent challenges, and how to overcome them. Some of those highlighted here include: identification of the research problem, formation of the research question/aim, and selecting an appropriate methodology and research design, which are the main concerns of qualitative researchers and need to be handled properly. Insights from real-life experiences in conducting qualitative research in health reveal these issues. Conclusions: The paper provides personal comments on the experiences of a researcher in conducting pure qualitative research in the field of health. It offers insights into the practical difficulties encountered when performing qualitative studies and offers solutions and alternatives applied by these authors, which may be of use to others. PMID:26793245
A Framework for the Optimization of Discrete-Event Simulation Models
NASA Technical Reports Server (NTRS)
Joshi, B. D.; Unal, R.; White, N. H.; Morris, W. D.
1996-01-01
With the growing use of computer modeling and simulation in all aspects of engineering, the scope of traditional optimization has to be extended to include simulation models. Some unique aspects have to be addressed while optimizing via stochastic simulation models. The optimization procedure has to explicitly account for the randomness inherent in the stochastic measures predicted by the model. This paper outlines a general-purpose framework for optimization of terminating discrete-event simulation models. The methodology combines a chance constraint approach for problem formulation together with standard statistical estimation and analysis techniques. The applicability of the optimization framework is illustrated by minimizing the operation and support resources of a launch vehicle through a simulation model.
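A minimal sketch of the chance-constraint idea, with an invented stand-in for the simulation: the probability that a performance requirement is met is estimated from terminating replications, and the cheapest design clearing a reliability threshold is selected.

```python
# Chance-constrained selection over a stochastic simulation (toy stand-in).
import numpy as np

def simulate_ops_cycle(n_crews, rng):
    # One terminating replication: processing time shrinks with crews but
    # stays random. Distribution and numbers are invented.
    return rng.gamma(shape=4.0, scale=12.0 / n_crews)

def prob_feasible(n_crews, limit=30.0, reps=2000, seed=0):
    rng = np.random.default_rng(seed)
    times = np.array([simulate_ops_cycle(n_crews, rng) for _ in range(reps)])
    return (times <= limit).mean()        # Monte Carlo estimate of the chance

# Minimize resources subject to P(cycle time <= 30 h) >= 0.95.
for n_crews in range(1, 8):
    p = prob_feasible(n_crews)
    print(n_crews, round(p, 3))
    if p >= 0.95:
        break          # smallest crew count satisfying the chance constraint
```

Treating the constraint probabilistically, rather than on a single nominal run, is what makes the optimization honest about the randomness in the model's predicted measures.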
Concepts of formal concept analysis
NASA Astrophysics Data System (ADS)
Žáček, Martin; Homola, Dan; Miarka, Rostislav
2017-07-01
The aim of this article is to apply Formal Concept Analysis to the concept of the world. Formal concept analysis (FCA), as a methodology of data analysis, information management and knowledge representation, has the potential to be applied to a variety of linguistic problems. FCA is a mathematical theory of concepts and concept hierarchies that reflects an understanding of 'concept'. Formal concept analysis explicitly formalizes the extension and intension of a concept and their mutual relationships. A distinguishing feature of FCA is an inherent integration of three components of conceptual processing of data and knowledge, namely, the discovery of and reasoning with concepts in data, the discovery of and reasoning with dependencies in data, and the visualization of data, concepts, and dependencies with folding/unfolding capabilities.
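A small concrete illustration of the two derivation operators on a toy formal context: a formal concept is a pair (extent, intent) closed under both maps. The context below is invented; concepts are enumerated by brute force.

```python
# Formal Concept Analysis sketch: derivation operators and all formal
# concepts of a tiny object-attribute context.
from itertools import combinations

objects = ["duck", "eagle", "trout"]
attributes = ["flies", "swims", "has_feathers"]
incidence = {("duck", "flies"), ("duck", "swims"), ("duck", "has_feathers"),
             ("eagle", "flies"), ("eagle", "has_feathers"),
             ("trout", "swims")}

def intent(objs):
    """Attributes shared by every object in objs."""
    return {a for a in attributes if all((o, a) in incidence for o in objs)}

def extent(attrs):
    """Objects possessing every attribute in attrs."""
    return {o for o in objects if all((o, a) in incidence for a in attrs)}

concepts = set()
for r in range(len(objects) + 1):
    for objs in combinations(objects, r):
        B = intent(set(objs))      # close the object set upward...
        A = extent(B)              # ...and back down: (A, B) is a concept
        concepts.add((frozenset(A), frozenset(B)))

for A, B in sorted(concepts, key=lambda c: len(c[0])):
    print(sorted(A), "<->", sorted(B))
```

Ordering these concepts by extent inclusion yields the concept lattice, the hierarchy that FCA visualizes with folding/unfolding capabilities.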
Preparing Emerging Doctoral Scholars for Transdisciplinary Research: A Developmental Approach
Kemp, Susan P.; Nurius, Paula S.
2015-01-01
Research models that bridge disciplinary, theoretical, and methodological boundaries are increasingly common as funders and the public push for timely, effective, collaborative responses to pressing social and environmental problems. Although social work is inherently an integrative discipline, there is growing recognition of the need to better prepare emerging scholars for sophisticated transdisciplinary and translational research environments. This paper outlines a developmental, competency-oriented approach to enhancing the readiness of doctoral students and emerging scholars in social work and allied disciplines for transdisciplinary research, describes an array of pedagogical tools applicable in doctoral course work and other program elements, and urges coordinated attention to enhancing the field’s transdisciplinary training capacity. PMID:26005286
Alternative Methods of Base Level Demand Forecasting for Economic Order Quantity Items,
1975-12-01
Adaptive Single Exponential Smoothing … Choosing the Smoothing Constant … the methodology used in the study, an analysis of results, and a detailed summary. Chapter I, Methodology, contains a description of the data … Chapter IV, Detailed Summary, presents a detailed summary of the findings, lists the limitations inherent in the research methodology, and …
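Since only fragments of the report survive, the sketch below simply illustrates single exponential smoothing and the choice of smoothing constant that the recovered chapter titles refer to, on invented demand data.

```python
# Single exponential smoothing for item demand, with a simple alpha search.
def ses_forecast(demand, alpha):
    """One-step-ahead forecasts; forecast[t] predicts demand[t]."""
    f = [demand[0]]                      # initialize with first observation
    for d in demand[:-1]:
        f.append(alpha * d + (1 - alpha) * f[-1])
    return f

quarterly_demand = [12, 15, 11, 14, 18, 16, 13, 17]   # invented history
for alpha in (0.1, 0.3, 0.5):
    f = ses_forecast(quarterly_demand, alpha)
    mad = sum(abs(d - p) for d, p in zip(quarterly_demand, f)) / len(f)
    print(alpha, round(f[-1], 2), round(mad, 2))   # pick alpha by lowest MAD
```

An adaptive variant adjusts alpha on the fly as forecast errors drift, which is presumably what the 'Adaptive Single Exponential Smoothing' section covered.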
Speed Accuracy Tradeoffs in Human Speech Production
2017-05-01
… A theoretical framework for considering Fitts' law in the domain of speech production is elucidated. Methodological challenges in applying Fitts-style analysis are addressed in order to assess whether articulatory kinematics conform to Fitts' law. A second, associated goal is to address the methodological challenges inherent in performing Fitts-style analysis on rtMRI data of speech production. Methodological challenges include segmenting continuous speech into specific motor …
Speed-Accuracy Tradeoffs in Speech Production
2017-06-01
… real-time magnetic resonance imaging data of speech production. A theoretical framework for considering Fitts' law in the domain of speech production is elucidated. Methodological challenges in applying Fitts-style analysis are addressed in order to assess whether articulatory kinematics conform to Fitts' law. A second, associated goal is to address the methodological challenges inherent in performing Fitts-style analysis on rtMRI data of speech production. Methodological challenges include segmenting continuous speech into specific motor tasks, defining key …
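For reference, Fitts' law relates movement time to an index of difficulty, MT = a + b * log2(2D/W) for movement distance D and target width W. The sketch below fits the coefficients by least squares on fabricated data, not on the reports' articulatory measurements.

```python
# Fitts' law regression sketch: MT = a + b * ID, with ID = log2(2D / W).
import numpy as np

D = np.array([4.0, 8.0, 16.0, 32.0, 8.0, 16.0])      # movement distances
W = np.array([2.0, 2.0, 2.0, 2.0, 1.0, 1.0])         # target widths
MT = np.array([0.25, 0.32, 0.41, 0.49, 0.38, 0.47])  # movement times (s)

ID = np.log2(2.0 * D / W)               # index of difficulty, in bits
b, a = np.polyfit(ID, MT, 1)            # least-squares slope and intercept
print(f"MT = {a:.3f} + {b:.3f} * ID   (throughput ~ {1.0 / b:.1f} bits/s)")
```

In the speech setting the analogues of D and W must be extracted from articulator trajectories, which is precisely the segmentation and task-definition challenge the reports describe.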
Social Network Analysis: A New Methodology for Counseling Research.
ERIC Educational Resources Information Center
Koehly, Laura M.; Shivy, Victoria A.
1998-01-01
Social network analysis (SNA) uses indices of relatedness among individuals to produce representations of social structures and positions inherent in dyads or groups. SNA methods provide quantitative representations of ongoing transactional patterns in a given social environment. Methodological issues, applications and resources are discussed…
Evaluation of Reference Services--A Review
ERIC Educational Resources Information Center
Kuruppu, Pali U.
2007-01-01
Understanding the inherent deficiencies in reference service as provided is critical to providing effective, high quality service. Quantitative and qualitative research methodologies, as well as a combination of both, are being used to evaluate these services. The identification of appropriate research methodology is critical to an effective…
What lies behind crop decisions? Coming to terms with revealing farmers' preferences
NASA Astrophysics Data System (ADS)
Gomez, C.; Gutierrez, C.; Pulido-Velazquez, M.; López Nicolás, A.
2016-12-01
The paper offers a fully-fledged applied revealed-preference methodology to screen and represent farmers' choices as the solution of an optimal program involving trade-offs among the alternative welfare outcomes of crop decisions, such as profits, income security and ease of management. The recursive two-stage method is proposed as an alternative for coping with the methodological problems inherent in common-practice positive mathematical programming (PMP) methodologies. Differently from PMP, in the model proposed in this paper the non-linear costs that are required for both calibration and smooth adjustment are not at odds with the assumptions of linear Leontief technologies and fixed crop prices and input costs. The method frees the model from ad-hoc assumptions about costs. It thus recovers the potential of economic analysis as a means to understand the rationale behind observed and forecasted farmers' decisions, and it enhances the potential of the model to support policy making in relevant domains such as agricultural policy, water management, risk management and climate change adaptation. After the introduction, where the methodological drawbacks and challenges are set out, section two presents the theoretical model, section three develops its empirical application to a Spanish irrigation district, and section four concludes and makes suggestions for further research.
Extended cooperative control synthesis
NASA Technical Reports Server (NTRS)
Davidson, John B.; Schmidt, David K.
1994-01-01
This paper reports on research for extending the Cooperative Control Synthesis methodology to include a more accurate modeling of the pilot's controller dynamics. Cooperative Control Synthesis (CCS) is a methodology that addresses the problem of how to design control laws for piloted, high-order, multivariate systems and/or non-conventional dynamic configurations in the absence of flying qualities specifications. This is accomplished by emphasizing the parallel structure inherent in any pilot-controlled, augmented vehicle. The original CCS methodology is extended to include the Modified Optimal Control Model (MOCM), which is based upon the optimal control model of the human operator developed by Kleinman, Baron, and Levison in 1970. This model provides a modeling of the pilot's compensation dynamics that is more accurate than the simplified pilot dynamic representation currently in the CCS methodology. Inclusion of the MOCM into the CCS also enables the modeling of pilot-observation perception thresholds and pilot-observation attention allocation effects. This Extended Cooperative Control Synthesis (ECCS) allows for the direct calculation of pilot and system open- and closed-loop transfer functions in pole/zero form and is readily implemented in current software capable of analysis and design for dynamic systems. Example results based upon synthesizing an augmentation control law for an acceleration command system in a compensatory tracking task using the ECCS are compared with a similar synthesis performed using the original CCS methodology. The ECCS is shown to provide augmentation control laws that yield more favorable, predicted closed-loop flying qualities and tracking performance than those synthesized using the original CCS methodology.
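The optimal control model of the human operator at the core of this line of work descends from linear-quadratic regulator theory. The sketch below solves only that LQR sub-problem for a hypothetical second-order vehicle; it is not the MOCM, which additionally models neuromotor dynamics, observation noise, and time delay.

    import numpy as np
    from scipy.linalg import solve_continuous_are

    # Hypothetical vehicle model: state x = [attitude, attitude rate]
    A = np.array([[0.0, 1.0],
                  [0.0, -0.5]])
    B = np.array([[0.0],
                  [1.0]])
    Q = np.diag([10.0, 1.0])   # tracking-error weights
    R = np.array([[0.1]])      # control-effort weight

    P = solve_continuous_are(A, B, Q, R)
    K = np.linalg.solve(R, B.T @ P)    # optimal feedback gain, u = -K x
    print("LQR gain:", K)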
Neger, Emily N; Prinz, Ronald J
2015-07-01
Parental substance abuse is a serious problem affecting the well-being of children and families. The co-occurrence of parental substance abuse and problematic parenting is recognized as a major public health concern. This review focuses on 21 outcome studies that tested dual treatment of substance abuse and parenting. A summary of theoretical conceptualizations of the connections between substance abuse and parenting provides a backdrop for the review. Outcomes of the dual treatment studies were generally positive with respect to reduction of parental substance use and improvement of parenting. Research in this area varied in methodological rigor and needs to overcome challenges regarding design issues, sampling frame, and complexities inherent in such a high-risk population. This area of work can be strengthened by randomized controlled trials, use of mixed-methods outcome measures, consideration of parent involvement with child protective services, involvement of significant others in treatment, provision of concrete supports for treatment attendance and facilitative public policies. Copyright © 2015 Elsevier Ltd. All rights reserved.
Near-Nash targeting strategies for heterogeneous teams of autonomous combat vehicles
NASA Astrophysics Data System (ADS)
Galati, David G.; Simaan, Marwan A.
2008-04-01
Military strategists are currently seeking methodologies to control large numbers of autonomous assets. Automated planners based upon the Nash equilibrium concept in non-zero-sum games are one option. Because such planners inherently consider possible adversarial actions, assets are able to adapt to, and to some extent predict, potential enemy actions. However, these planners must function properly both in cases in which a pure Nash strategy does not exist and in scenarios possessing multiple Nash equilibria. Another issue that needs to be overcome is the scalability of the Nash equilibrium. That is, as the dimensionality of the problem increases, the Nash strategies become infeasible to compute using traditional methodologies. In this paper we introduce the concept of near-Nash strategies as a mechanism to overcome these difficulties. We then illustrate this concept by deriving the near-Nash strategies and using these strategies as the basis for an intelligent battle plan for heterogeneous teams of autonomous combat air vehicles in the Multi-Team Dynamic Weapon Target Assignment model.
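A near-Nash (epsilon-Nash) strategy pair is one from which neither player can gain more than epsilon by deviating unilaterally. A minimal sketch of computing that epsilon for a candidate pair in a small bimatrix game, with hypothetical payoffs:

    import numpy as np

    # Hypothetical bimatrix game: A = row player's payoffs, B = column player's
    A = np.array([[3.0, 0.0], [5.0, 1.0]])
    B = np.array([[3.0, 5.0], [0.0, 1.0]])

    def epsilon(x, y):
        # Largest unilateral gain available to either player
        best_row = (A @ y).max()      # best pure row reply against y
        best_col = (B.T @ x).max()    # best pure column reply against x
        return max(best_row - x @ A @ y, best_col - x @ B @ y)

    x = np.array([0.5, 0.5])          # candidate near-Nash mixed strategies
    y = np.array([0.5, 0.5])
    print("epsilon =", epsilon(x, y)) # 0 would be an exact Nash equilibrium

Exact solvers drive epsilon to zero; near-Nash planners accept a small positive epsilon in exchange for computations that remain feasible as the number of assets grows.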
Neger, Emily N.; Prinz, Ronald J.
2015-01-01
Parental substance abuse is a serious problem affecting the well-being of children and families. The co-occurrence of parental substance abuse and problematic parenting is recognized as a major public health concern. This review focuses on 21 outcome studies that tested dual treatment of substance abuse and parenting. A summary of theoretical conceptualizations of the connections between substance abuse and parenting provides a backdrop for the review. Outcomes of the dual treatment studies were generally positive with respect to reduction of parental substance use and improvement of parenting. Research in this area varied in methodological rigor and needs to overcome challenges regarding design issues, sampling frame, and complexities inherent in such a high-risk population. This area of work can be strengthened by randomized controlled trials, use of mixed-methods outcome measures, consideration of parent involvement with child protective services, involvement of significant others in treatment, provision of concrete supports for treatment attendance and facilitative public policies. PMID:25939033
ERIC Educational Resources Information Center
Schudde, Lauren
2018-01-01
To date, the theory of intersectionality has largely guided qualitative efforts in social science and education research. Translating the construct to new methodological approaches is inherently complex and challenging, but offers the possibility of breaking down silos that keep education researchers with similar interests--but different…
"Parents as Partners" in Research and Evaluation: Methodological and Ethical Issues and Solutions.
ERIC Educational Resources Information Center
Wolfendale, Sheila
1999-01-01
This article investigates parents' status within educational research and examines some research paradigms that have been used. A number of inherent methodological and ethical issues are identified and several fundamental aspects are explored. It is argued that researchers should adopt a partnership model for cooperative research on parental…
The Challenge of Researching Violent Societies: Navigating Complexities in Ethnography
ERIC Educational Resources Information Center
Tshabangu, Icarbord
2009-01-01
Through use of a recent study researching democratic education and citizenship in Zimbabwe, this paper examines the methodological dilemmas and challenges faced by an ethnographer, particularly by a research student in a violent context. The article posits a bricolage strategy to navigate some of the dangers and methodological dilemmas inherent so…
School Psychology as a Relational Enterprise: The Role and Process of Qualitative Methodology
ERIC Educational Resources Information Center
Newman, Daniel S.; Clare, Mary M.
2016-01-01
The purpose of this article is to explore the application of qualitative research to establishing a more complete understanding of relational processes inherent in school psychology practice. We identify the building blocks of rigorous qualitative research design through a conceptual overview of qualitative paradigms, methodologies, methods (i.e.,…
Practicing the Four Seasons of Ethnography Methodology while Searching for Identity in Mexico
ERIC Educational Resources Information Center
Pitts, Margaret Jane
2012-01-01
This narrative is an account of my field experiences and challenges practicing Gonzalez's (2000) Four Seasons of Ethnography methodology in Mexico City. I describe the complexities and tensions inherent in managing two scientific paradigms: Western scientific logic vs. a more organic ontology. The experiential knowledge produced in this text is…
The inherent paradox of clinical trials in psychiatry.
Helmchen, H; Müller-Oerlinghausen, B
1975-01-01
The authors sum up the central issue of ethics in the conduct of controlled clinical trials in these two paradoxes: 'first, it is unethical to use treatment the efficacy of which has not been examined scientifically; second, it is also unethical to examine the efficacy of treatment scientifically.' In this paper they set out to demonstrate how these antithetical statements apply in controlled trials conducted in psychiatric patients. In such trials the problem of obtaining informed consent may be acute, but in these patients giving 'informed' consent might contribute to a further exacerbation of the illness. Nevertheless the problem cannot be evaded, and scientific judgments must be applied to treatment for it to be sound and improved for the further benefit of patients. These problems in the case of psychiatric controlled trials are a part of the methodology, and in Germany a new drug law has been drafted to attempt to clarify the issue. The authors briefly discuss its application, and its consequences if such a law were enacted. British psychiatrists have exactly the same problems to face but so far no attempts have been made to establish a legal framework. PMID:775089
Root Source Analysis/ValuStream[Trade Mark] - A Methodology for Identifying and Managing Risks
NASA Technical Reports Server (NTRS)
Brown, Richard Lee
2008-01-01
Root Source Analysis (RoSA) is a systems engineering methodology that has been developed at NASA over the past five years. It is designed to reduce costs, schedule, and technical risks by systematically examining critical assumptions and the state of the knowledge needed to bring to fruition the products that satisfy mission-driven requirements, as defined for each element of the Work (or Product) Breakdown Structure (WBS or PBS). This methodology is sometimes referred to as the ValuStream method, as inherent in the process is the linking and prioritizing of uncertainties arising from knowledge shortfalls directly to the customer's mission-driven requirements. RoSA and ValuStream are synonymous terms. RoSA is not simply an alternate or improved method for identifying risks. It represents a paradigm shift. The emphasis is placed on identifying very specific knowledge shortfalls and assumptions that are the root sources of the risk (the why), rather than on assessing the WBS product(s) themselves (the what). In so doing RoSA looks forward to anticipate, identify, and prioritize knowledge shortfalls and assumptions that are likely to create significant uncertainties/risks (as compared to Root Cause Analysis, which is most often used to look back to discover what was not known, or was assumed, that caused the failure). Experience indicates that RoSA, with its primary focus on assumptions and the state of the underlying knowledge needed to define, design, build, verify, and operate the products, can identify critical risks that historically have been missed by the usual approaches (i.e., design review process and classical risk identification methods). Further, the methodology answers four critical questions for decision makers and risk managers: 1. What's been included? 2. What's been left out? 3. How has it been validated? 4. Has the real source of the uncertainty/risk been identified, i.e., is the perceived problem the real problem? Users of the RoSA methodology have characterized it as a true bottom-up risk assessment.
A Statistical Approach for the Concurrent Coupling of Molecular Dynamics and Finite Element Methods
NASA Technical Reports Server (NTRS)
Saether, E.; Yamakov, V.; Glaessgen, E.
2007-01-01
Molecular dynamics (MD) methods are opening new opportunities for simulating the fundamental processes of material behavior at the atomistic level. However, increasing the size of the MD domain quickly presents intractable computational demands. A robust approach to surmount this computational limitation has been to unite continuum modeling procedures such as the finite element method (FEM) with MD analyses thereby reducing the region of atomic scale refinement. The challenging problem is to seamlessly connect the two inherently different simulation techniques at their interface. In the present work, a new approach to MD-FEM coupling is developed based on a restatement of the typical boundary value problem used to define a coupled domain. The method uses statistical averaging of the atomistic MD domain to provide displacement interface boundary conditions to the surrounding continuum FEM region, which, in return, generates interface reaction forces applied as piecewise constant traction boundary conditions to the MD domain. The two systems are computationally disconnected and communicate only through a continuous update of their boundary conditions. With the use of statistical averages of the atomistic quantities to couple the two computational schemes, the developed approach is referred to as an embedded statistical coupling method (ESCM) as opposed to a direct coupling method where interface atoms and FEM nodes are individually related. The methodology is inherently applicable to three-dimensional domains, avoids discretization of the continuum model down to atomic scales, and permits arbitrary temperatures to be applied.
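The essence of the interface treatment is replacing atom-to-node linkage with local statistical averages. A minimal one-dimensional sketch of forming nodal displacement boundary conditions from atomistic data; all values are synthetic placeholders.

    import numpy as np

    rng = np.random.default_rng(0)

    # Hypothetical MD interface region: atom positions and displacements
    atom_x = rng.uniform(0.0, 10.0, size=500)
    atom_u = 0.01 * atom_x + rng.normal(0.0, 0.002, size=500)  # noisy field

    # FE interface nodes and their nodal averaging volumes (1D bins here)
    node_x = np.linspace(0.0, 10.0, 6)
    half_width = 1.0

    # Average atomic displacements near each node -> FE displacement BC
    node_u = np.array([atom_u[np.abs(atom_x - xn) < half_width].mean()
                       for xn in node_x])
    print("nodal displacement BCs:", np.round(node_u, 4))

Because each boundary condition is an average over many atoms, thermal fluctuations largely cancel, which is what lets a scheme of this kind operate away from idealized zero-temperature conditions.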
Theatre across Prison Walls: Using Democratizing Theatre Methodologies to Subvert Carceral Control
ERIC Educational Resources Information Center
Rocchio, Rivka
2017-01-01
In spite of the influx of articles on practitioner experience teaching in carceral settings, little has been written around the methodologies that best level the inherent inequity between practitioner and ensemble. This article seeks to respond to some of the questions and concerns around the balancing of power structures by describing the…
A Test Method for Monitoring Modulus Changes during Durability Tests on Building Joint Sealants
Christopher C. White; Donald L. Hunston; Kar Tean Tan; Gregory T. Schueneman
2012-01-01
The durability of building joint sealants is generally assessed using a descriptive methodology involving visual inspection of exposed specimens for defects. It is widely known that this methodology has inherent limitations, including that the results are qualitative. A new test method is proposed that provides more fundamental and quantitative information about...
Lara, Alvaro R; Galindo, Enrique; Ramírez, Octavio T; Palomares, Laura A
2006-11-01
The presence of spatial gradients in fundamental culture parameters, such as dissolved gases, pH, concentration of substrates, and shear rate, among others, is an important problem that frequently occurs in large-scale bioreactors. This problem is caused by a deficient mixing that results from limitations inherent to traditional scale-up methods and practical constraints during large-scale bioreactor design and operation. When cultured in a heterogeneous environment, cells are continuously exposed to fluctuating conditions as they travel through the various zones of a bioreactor. Such fluctuations can affect cell metabolism, yields, and quality of the products of interest. In this review, the theoretical analyses that predict the existence of environmental gradients in bioreactors and their experimental confirmation are reviewed. The origins of gradients in common culture parameters and their effects on various organisms of biotechnological importance are discussed. In particular, studies based on the scale-down methodology, a convenient tool for assessing the effect of environmental heterogeneities, are surveyed.
NASA Astrophysics Data System (ADS)
Haines, P. E.; Esler, J. G.; Carver, G. D.
2014-06-01
A new methodology for the formulation of an adjoint to the transport component of the chemistry transport model TOMCAT is described and implemented in a new model, RETRO-TOM. The Eulerian backtracking method is used, allowing the forward advection scheme (Prather's second-order moments) to be efficiently exploited in the backward adjoint calculations. Prather's scheme is shown to be time symmetric, suggesting the possibility of high accuracy. To attain this accuracy, however, it is necessary to make a careful treatment of the "density inconsistency" problem inherent to offline transport models. The results are verified using a series of test experiments. These demonstrate the high accuracy of RETRO-TOM when compared with direct forward sensitivity calculations, at least for problems in which flux limiters in the advection scheme are not required. RETRO-TOM therefore combines the flexibility and stability of a "finite difference of adjoint" formulation with the accuracy of an "adjoint of finite difference" formulation.
NASA Astrophysics Data System (ADS)
Haines, P. E.; Esler, J. G.; Carver, G. D.
2014-01-01
A new methodology for the formulation of an adjoint to the transport component of the chemistry transport model TOMCAT is described and implemented in a new model RETRO-TOM. The Eulerian backtracking method is used, allowing the forward advection scheme (Prather's second-order moments), to be efficiently exploited in the backward adjoint calculations. Prather's scheme is shown to be time-symmetric suggesting the possibility of high accuracy. To attain this accuracy, however, it is necessary to make a careful treatment of the "density inconsistency" problem inherent to offline transport models. The results are verified using a series of test experiments. These demonstrate the high accuracy of RETRO-TOM when compared with direct forward sensitivity calculations, at least for problems in which flux-limiters in the advection scheme are not required. RETRO-TOM therefore combines the flexibility and stability of a "finite difference of adjoint" formulation with the accuracy of an "adjoint of finite difference" formulation.
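The "adjoint of finite difference" idea shows up already in miniature. For a linear forward step x_{k+1} = M x_k, the gradient of a scalar response J = c.x_N with respect to the initial state is obtained by propagating c backward through the transpose M^T. A sketch with a hypothetical periodic upwind advection matrix, checked against a direct forward perturbation:

    import numpy as np

    n, steps, courant = 20, 30, 0.4

    # Forward model: periodic first-order upwind advection, x_{k+1} = M x_k
    M = (1 - courant) * np.eye(n) + courant * np.roll(np.eye(n), 1, axis=0)

    c = np.zeros(n); c[5] = 1.0      # response: concentration in cell 5
    x0 = np.exp(-0.5 * ((np.arange(n) - 10.0) / 2.0) ** 2)

    lam = c.copy()                   # adjoint (backward) run: (M^T)^N c
    for _ in range(steps):
        lam = M.T @ lam

    def run(x):                      # forward run returning J = c . x_N
        for _ in range(steps):
            x = M @ x
        return c @ x

    i, eps = 8, 1e-6                 # compare one gradient component
    fd = (run(x0 + eps * np.eye(n)[i]) - run(x0)) / eps
    print("adjoint:", lam[i], "finite difference:", fd)

For this linear, limiter-free scheme the two numbers agree to rounding error; as the abstract notes, flux limiters are where that exactness breaks down.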
Online interviewing with interpreters in humanitarian contexts
Chiumento, Anna; Rahman, Atif; Frith, Lucy
2018-01-01
Purpose: Recognising that one way to address the logistical and safety considerations of research conducted in humanitarian emergencies is to use internet communication technologies to facilitate interviews online, this article explores some practical and methodological considerations inherent to qualitative online interviewing. Method: Reflections from a case study of a multi-site research project conducted in post-conflict countries are presented. Synchronous online cross-language qualitative interviews were conducted in one country. Although only a small proportion of interviews were conducted online (six out of 35), it remains important to critically consider the impact upon data produced in this way. Results: A range of practical and methodological considerations are discussed, illustrated with examples. Results suggest that whilst online interviewing has methodological and ethical potential and versatility, there are inherent practical challenges in settings with poor internet and electricity infrastructure. Notable methodological limitations include barriers to building rapport due to partial visual and non-visual cues, and difficulties interpreting pauses or silences. Conclusions: Drawing upon experiences in this case study, strategies for managing the practical and methodological limitations of online interviewing are suggested, alongside recommendations for supporting future research practice. These are intended to act as a springboard for further reflection, and operate alongside other conceptual frameworks for online interviewing. PMID:29532739
Online interviewing with interpreters in humanitarian contexts.
Chiumento, Anna; Machin, Laura; Rahman, Atif; Frith, Lucy
2018-12-01
Recognising that one way to address the logistical and safety considerations of research conducted in humanitarian emergencies is to use internet communication technologies to facilitate interviews online, this article explores some practical and methodological considerations inherent to qualitative online interviewing. Reflections from a case study of a multi-site research project conducted in post-conflict countries are presented. Synchronous online cross-language qualitative interviews were conducted in one country. Although only a small proportion of interviews were conducted online (six out of 35), it remains important to critically consider the impact upon data produced in this way. A range of practical and methodological considerations are discussed, illustrated with examples. Results suggest that whilst online interviewing has methodological and ethical potential and versatility, there are inherent practical challenges in settings with poor internet and electricity infrastructure. Notable methodological limitations include barriers to building rapport due to partial visual and non-visual cues, and difficulties interpreting pauses or silences. Drawing upon experiences in this case study, strategies for managing the practical and methodological limitations of online interviewing are suggested, alongside recommendations for supporting future research practice. These are intended to act as a springboard for further reflection, and operate alongside other conceptual frameworks for online interviewing.
Network support for turn-taking in multimedia collaboration
NASA Astrophysics Data System (ADS)
Dommel, Hans-Peter; Garcia-Luna-Aceves, Jose J.
1997-01-01
The effectiveness of collaborative multimedia systems depends on the regulation of access to their shared resources, such as continuous media or instruments used concurrently by multiple parties. Existing applications use only simple protocols to mediate such resource contention. Their cooperative rules follow a strict agenda and are largely application-specific. The inherent problem of floor control lacks a systematic methodology. This paper presents a general model of floor control for correct, scalable, fine-grained and fair resource sharing that integrates user interaction with network conditions, and adaptation to various media types. The notion of turn-taking, known from psycholinguistic studies of discourse structure, is adapted for this framework. Viewed as a computational analogy to speech communication, online collaboration revolves around dynamically allocated access permissions called floors. The control semantics of floors derives from concurrency control methodology. An explicit specification and verification of a novel distributed Floor Control Protocol are presented. Hosts assume sharing roles that allow for efficient dissemination of control information, agreeing on a floor holder which is granted mutually exclusive access to a resource. Performance-analytic aspects of floor control protocols are also briefly discussed.
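A minimal, centralized sketch of the floor abstraction itself: mutually exclusive access with queued turn-taking. The paper's protocol is distributed and media-adaptive; this stand-in only illustrates the semantics.

    from collections import deque

    class FloorControl:
        # Grants one mutually exclusive floor for a shared resource
        def __init__(self):
            self.holder = None
            self.queue = deque()

        def request(self, site):
            if self.holder is None:
                self.holder = site
            elif site != self.holder and site not in self.queue:
                self.queue.append(site)      # queued for a later turn
            return self.holder

        def release(self, site):
            if site == self.holder:          # pass the floor on, FIFO
                self.holder = self.queue.popleft() if self.queue else None
            return self.holder

    fc = FloorControl()
    fc.request("A"); fc.request("B")         # A holds the floor, B queued
    print(fc.release("A"))                   # floor passes to B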
NASA Astrophysics Data System (ADS)
Wang, Lei; Xiong, Chuang; Wang, Xiaojun; Li, Yunlong; Xu, Menghui
2018-04-01
Considering that multi-source uncertainties, arising both from inherent nature and from the external environment, are unavoidable and severely affect controller performance, dynamic safety assessment with high confidence is of great significance for scientists and engineers. In view of this, uncertainty quantification analysis and time-variant reliability estimation for closed-loop control problems are conducted in this study under a mixture of random, interval, and convex uncertainties. By combining the state-space transformation and the natural set expansion, the bounding laws of controlled response histories are first confirmed, with specific treatment of the random terms. For nonlinear cases, the collocation set methodology and the fourth-order Runge-Kutta algorithm are introduced as well. Inspired by the first-passage model in random process theory as well as by static probabilistic reliability ideas, a new definition of the hybrid time-variant reliability measurement is provided for vibration control systems, and the related solution details are further expounded. Two engineering examples are finally presented to demonstrate the validity and applicability of the methodology developed.
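One ingredient is easy to isolate: propagating the closed-loop dynamics with a fourth-order Runge-Kutta integrator over samples of an interval-valued parameter and recording the response envelope. The sketch below uses a hypothetical controlled oscillator and plain parameter sampling as a crude stand-in for the collocation set methodology.

    import numpy as np

    def rk4_step(f, x, t, h):
        k1 = f(t, x)
        k2 = f(t + h / 2, x + h / 2 * k1)
        k3 = f(t + h / 2, x + h / 2 * k2)
        k4 = f(t + h, x + h * k3)
        return x + h / 6 * (k1 + 2 * k2 + 2 * k3 + k4)

    # Closed-loop oscillator with interval-uncertain stiffness k in [0.8, 1.2]
    def make_rhs(k):
        return lambda t, x: np.array([x[1], -k * x[0] - 0.3 * x[1]])

    h, n_steps = 0.01, 1000
    finals = []
    for k in np.linspace(0.8, 1.2, 9):       # samples over the interval
        x, f = np.array([1.0, 0.0]), make_rhs(k)
        for i in range(n_steps):
            x = rk4_step(f, x, i * h, h)
        finals.append(x[0])

    print("response bounds at t = 10 s:", min(finals), max(finals))

Those response bounds are the raw material that a first-passage-style reliability measure would then compare against a safety threshold over time.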
DOE Office of Scientific and Technical Information (OSTI.GOV)
Rivera, W. Gary; Robinson, David Gerald; Wyss, Gregory Dane
The charter for adversarial delay is to hinder access to critical resources through the use of physical systems increasing an adversary's task time. The traditional method for characterizing access delay has been a simple model focused on accumulating times required to complete each task with little regard to uncertainty, complexity, or decreased efficiency associated with multiple sequential tasks or stress. The delay associated with any given barrier or path is further discounted to worst-case, and often unrealistic, times based on a high-level adversary, resulting in a highly conservative calculation of total delay. This leads to delay systems that require significant funding and personnel resources in order to defend against the assumed threat, which for many sites and applications becomes cost prohibitive. A new methodology has been developed that considers the uncertainties inherent in the problem to develop a realistic timeline distribution for a given adversary path. This new methodology incorporates advanced Bayesian statistical theory and methodologies, taking into account small sample size, expert judgment, human factors and threat uncertainty. The result is an algorithm that can calculate a probability distribution function of delay times directly related to system risk. Through further analysis, the access delay analyst or end user can use the results in making informed decisions while weighing benefits against risks, ultimately resulting in greater system effectiveness with lower cost.
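The shift from a single worst-case time to a timeline distribution can be illustrated with a simple Monte Carlo over per-task delay distributions. The lognormal parameters below are hypothetical placeholders, not the report's Bayesian machinery, which additionally handles small samples, expert judgment, and threat uncertainty.

    import numpy as np

    rng = np.random.default_rng(42)

    # Hypothetical adversary path: (median seconds, dispersion) per task
    tasks = [(60.0, 0.4),    # breach outer fence
             (120.0, 0.6),   # defeat door
             (90.0, 0.5)]    # reach target

    n = 100_000
    total = np.zeros(n)
    for median, sigma in tasks:
        total += rng.lognormal(mean=np.log(median), sigma=sigma, size=n)

    print("mean total delay (s):", total.mean())
    print("10th percentile (risk-relevant short delays):", np.quantile(total, 0.10))

It is the lower tail of the total-delay distribution, not a single conservative point estimate, that drives the risk a defender actually faces.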
NASA Astrophysics Data System (ADS)
Dou, Zhi-Wu
2010-08-01
To address the inherent safety problem puzzling the coal mining industry, and drawing on an analysis of the characteristics and applications of distributed interactive simulation based on the High Level Architecture (DIS/HLA), a new method is proposed for developing coal mining inherent-safety distributed interactive simulation using HLA technology. After investigating the function and structure of the system, a simple coal mining inherent-safety scenario is modeled with HLA, the FOM and SOM are developed, and the mathematical models are suggested. The results of the case study show that HLA plays an important role in developing distributed interactive simulation of complicated distributed systems and that the method is valid for solving the problem facing the coal mining industry. For the coal mining industry, the conclusions show that a simulation system based on HLA helps to identify sources of hazard, to devise countermeasures for accidents, and to improve the level of management.
Benchmarking Strategies for Measuring the Quality of Healthcare: Problems and Prospects
Lovaglio, Pietro Giorgio
2012-01-01
Over the last few years, increasing attention has been directed toward the problems inherent to measuring the quality of healthcare and implementing benchmarking strategies. Besides offering accreditation and certification processes, recent approaches measure the performance of healthcare institutions in order to evaluate their effectiveness, defined as the capacity to provide treatment that modifies and improves the patient's state of health. This paper, dealing with hospital effectiveness, focuses on research methods for effectiveness analyses within a strategy comparing different healthcare institutions. The paper, after having introduced readers to the principal debates on benchmarking strategies, which depend on the perspective and type of indicators used, focuses on the methodological problems related to performing consistent benchmarking analyses. Particularly, statistical methods suitable for controlling case-mix, analyzing aggregate data, rare events, and continuous outcomes measured with error are examined. Specific challenges of benchmarking strategies, such as the risk of risk adjustment (case-mix fallacy, underreporting, risk of comparing noncomparable hospitals), selection bias, and possible strategies for the development of consistent benchmarking analyses, are discussed. Finally, to demonstrate the feasibility of the illustrated benchmarking strategies, an application focused on determining regional benchmarks for patient satisfaction (using the 2009 Lombardy Region Patient Satisfaction Questionnaire) is proposed. PMID:22666140
Benchmarking strategies for measuring the quality of healthcare: problems and prospects.
Lovaglio, Pietro Giorgio
2012-01-01
Over the last few years, increasing attention has been directed toward the problems inherent to measuring the quality of healthcare and implementing benchmarking strategies. Besides offering accreditation and certification processes, recent approaches measure the performance of healthcare institutions in order to evaluate their effectiveness, defined as the capacity to provide treatment that modifies and improves the patient's state of health. This paper, dealing with hospital effectiveness, focuses on research methods for effectiveness analyses within a strategy comparing different healthcare institutions. The paper, after having introduced readers to the principal debates on benchmarking strategies, which depend on the perspective and type of indicators used, focuses on the methodological problems related to performing consistent benchmarking analyses. Particularly, statistical methods suitable for controlling case-mix, analyzing aggregate data, rare events, and continuous outcomes measured with error are examined. Specific challenges of benchmarking strategies, such as the risk of risk adjustment (case-mix fallacy, underreporting, risk of comparing noncomparable hospitals), selection bias, and possible strategies for the development of consistent benchmarking analyses, are discussed. Finally, to demonstrate the feasibility of the illustrated benchmarking strategies, an application focused on determining regional benchmarks for patient satisfaction (using the 2009 Lombardy Region Patient Satisfaction Questionnaire) is proposed.
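A common building block of such benchmarking is indirect standardization: comparing observed events against the expectation from a case-mix model and flagging institutions outside control limits. A minimal sketch with synthetic risks and a rough normal-approximation limit, not the paper's statistical methods:

    import numpy as np

    rng = np.random.default_rng(1)

    # Hypothetical hospital: per-patient expected risk from a case-mix model
    p_expected = rng.uniform(0.02, 0.15, size=800)
    observed = 70                       # events actually observed

    E = p_expected.sum()
    oe = observed / E                   # standardized ratio (O/E)
    se = 1.0 / np.sqrt(E)               # approximate Poisson SE on log scale
    lcl, ucl = np.exp(-1.96 * se), np.exp(1.96 * se)
    verdict = "in control" if lcl < oe < ucl else "potential outlier"
    print(f"O/E = {oe:.2f}, 95% limits around 1: ({lcl:.2f}, {ucl:.2f}) -> {verdict}")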
NASA Astrophysics Data System (ADS)
Fekete, Tamás
2018-05-01
Structural integrity calculations play a crucial role in designing large-scale pressure vessels. Used in the electric power generation industry, these kinds of vessels undergo extensive safety analyses and certification procedures before being deemed feasible for future long-term operation. The calculations are nowadays directed and supported by international standards and guides based on state-of-the-art results of applied research and technical development. However, their ability to predict a vessel's behavior under accidental circumstances after long-term operation is largely limited by the strong dependence of the analysis methodology on empirical models that are correlated to the behavior of structural materials and their changes during material aging. Recently, a new scientific-engineering paradigm, structural integrity, has been developing; it is essentially a synergistic collaboration among a number of scientific and engineering disciplines, modeling, experiments, and numerics. Although the application of the structural integrity paradigm has contributed greatly to improving the accuracy of safety evaluations of large-scale pressure vessels, the predictive power of the analysis methodology has not yet improved significantly. This is because existing structural integrity calculation methodologies are based on the widespread and commonly accepted 'traditional' engineering thermal stress approach, which is essentially built on the weakly coupled model of thermomechanics and fracture mechanics. Recently, research has been initiated at MTA EK with the aim of reviewing and evaluating current methodologies and models applied in structural integrity calculations, including their scope of validity. The research intends to reach a better understanding of the physical problems that are inherently present in the pool of structural integrity problems of reactor pressure vessels, and ultimately to find a theoretical framework that could serve as a well-grounded foundation for a new modeling framework of structural integrity. This paper presents the first findings of the research project.
The development of an inherent safety approach to the prevention of domino accidents.
Cozzani, Valerio; Tugnoli, Alessandro; Salzano, Ernesto
2009-11-01
The severity of industrial accidents in which a domino effect takes place is well known in the chemical and process industry. The application of an inherent safety approach to the prevention of escalation events leading to domino accidents was explored in the present study. Reference primary scenarios were analyzed and escalation vectors were defined. Inherent safety distances were defined and proposed as a metric to express the intensity of the escalation vectors. Simple rules of thumb were presented for a preliminary screening of these distances. Swift reference indices for layout screening with respect to escalation hazard were also defined. Two case studies derived from existing layouts of oil refineries were selected to illustrate the potential of applying the methodology. The results showed that the approach allows a first comparative assessment of the actual domino hazard in a layout, and the identification of critical primary units with respect to escalation events. The methodology developed also represents a useful screening tool to identify where to dedicate major efforts in the design of add-on measures, optimizing conventional passive and active measures for the prevention of severe domino accidents.
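The screening logic reduces to comparing an escalation vector's intensity, which decays with distance, against a damage threshold for target units. The attenuation law and threshold below are purely hypothetical placeholders, not the paper's rules of thumb.

    import numpy as np

    def overpressure(d):
        # Hypothetical blast attenuation with distance d (m), in kPa
        return 2000.0 / d ** 1.5

    threshold_kpa = 20.0                 # assumed escalation threshold
    d = np.arange(5.0, 300.0, 1.0)
    safety_distance = d[overpressure(d) < threshold_kpa][0]
    print("inherent safety distance ~", safety_distance, "m")

    # Layout screening: flag unit pairs closer than the safety distance
    layout = {("tank_T1", "column_C2"): 35.0, ("tank_T1", "tank_T3"): 80.0}
    for pair, dist in layout.items():
        print(pair, "escalation credible" if dist < safety_distance else "ok")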
Boom Rendezvous Alternative Docking Approach
NASA Technical Reports Server (NTRS)
Bonometti, Joseph A.
2006-01-01
Space rendezvous and docking has almost always been attempted with a single philosophical methodology: the slow matching of one vehicle's orbit by a second vehicle, followed by a final closing sequence that ends in matching the orbits with perfect precision and near-zero relative velocities. The task is time consuming, propellant intensive, inherently risky (plume impingement, collisions, fuel depletion, etc.), and requires substantial hardware mass. The historical background and the rationale for this approach are discussed in terms of the path not taken and in light of an alternate methodology. Rendezvous and docking by boom extension is suggested to have inherent advantages that today's technology can readily exploit. Extension from the primary spacecraft, beyond its large inertia, allows low-inertia connections to be made rapidly and safely. Plume contamination issues are eliminated, as are the extra propellant mass and risk required for the final thruster (docking) operations. Space vehicle connection hardware can be significantly lightened. Also, docking sensors and controls require less fidelity, allowing them to be more robust and less sensitive. It is the potential safety advantage and mission risk reduction that make this approach attractive, besides the prospect of nominal time and mass savings.
Toward information management in corporations (2)
NASA Astrophysics Data System (ADS)
Shibata, Mitsuru
If the construction of in-house information management systems in an advanced information society is to be positioned alongside social information management, the groundwork begins with reviewing current paper filing systems. Since the problems inherent in in-house information management systems utilizing OA equipment also inhere in paper filing systems, the first step toward full-scale in-house information management should be to grasp and solve the fundamental problems in current filing systems. This paper describes an analysis of the fundamental problems in filing systems, the creation of new types of offices, an analysis of improvement needs in filing systems, and some key points in improving filing systems.
Intra-Campaign Changes in Voting Preferences: The Impact of Media and Party Communication
Johann, David; Königslöw, Katharina Kleinen-von; Kritzinger, Sylvia; Thomas, Kathrin
2018-01-01
An increasing number of citizens change and adapt their party preferences during the electoral campaign. We analyze which short-term factors explain intra-campaign changes in voting preferences, focusing on the visibility and tone of news media reporting and party canvassing. Our analyses rely on an integrative data approach, linking data from media content analysis to public opinion data. This enables us to investigate the relative impact of news media reporting as well as party communication. Inherently, we overcome previously identified methodological problems in the study of communication effects on voting behavior. Our findings reveal that campaigns matter: Especially interpersonal party canvassing increases voters’ likelihood to change their voting preferences in favor of the respective party, whereas media effects are limited to quality news outlets and depend on individual voters’ party ambivalence. PMID:29695892
Intra-Campaign Changes in Voting Preferences: The Impact of Media and Party Communication.
Johann, David; Königslöw, Katharina Kleinen-von; Kritzinger, Sylvia; Thomas, Kathrin
2018-01-01
An increasing number of citizens change and adapt their party preferences during the electoral campaign. We analyze which short-term factors explain intra-campaign changes in voting preferences, focusing on the visibility and tone of news media reporting and party canvassing. Our analyses rely on an integrative data approach, linking data from media content analysis to public opinion data. This enables us to investigate the relative impact of news media reporting as well as party communication. Inherently, we overcome previously identified methodological problems in the study of communication effects on voting behavior. Our findings reveal that campaigns matter: Especially interpersonal party canvassing increases voters' likelihood to change their voting preferences in favor of the respective party, whereas media effects are limited to quality news outlets and depend on individual voters' party ambivalence.
The adolescence of a thirteenth-century visionary nun.
Kroll, J; De Ganck, R
1986-11-01
Among the most notable features of the religious revival in western Europe in the early thirteenth century was the development of mysticism among the nuns and religious women of the lowlands. As scholarly attention becomes increasingly focused on this group of remarkable women, the question arises whether a psychiatric viewpoint has something of value to offer to the understanding of such individuals and the culture in which they struggled. The methodological and intellectual problems inherent in examining the life of a thirteenth-century mystic with a twentieth-century empirical frame of reference are illustrated in this study of the adolescence of Beatrice of Nazareth. Beatrice's stormy asceticism, ecstatic states and mood swings lend themselves to potentially competing hypotheses regarding the spiritual and psychopathological significance of her adolescent development and eventual life-course. Common grounds for reconciling these alternative models are discussed.
Financial options methodology for analyzing investments in new technology
DOE Office of Scientific and Technical Information (OSTI.GOV)
Wenning, B.D.
1994-12-31
The evaluation of investments in longer term research and development in emerging technologies, because of the nature of such subjects, must address inherent uncertainties. Most notably, future cash flow forecasts include substantial uncertainties. Conventional present value methodology, when applied to emerging technologies, severely penalizes cash flow forecasts, and strategic investment opportunities are at risk of being neglected. Use of options valuation methodology adapted from the financial arena has been introduced as having applicability in such technology evaluations. Indeed, characteristics of superconducting magnetic energy storage technology suggest that it is a candidate for the use of options methodology when investment decisions are being contemplated.
Financial options methodology for analyzing investments in new technology
NASA Technical Reports Server (NTRS)
Wenning, B. D.
1995-01-01
The evaluation of investments in longer term research and development in emerging technologies, because of the nature of such subjects, must address inherent uncertainties. Most notably, future cash flow forecasts include substantial uncertainties. Conventional present value methodology, when applied to emerging technologies severely penalizes cash flow forecasts, and strategic investment opportunities are at risk of being neglected. Use of options evaluation methodology adapted from the financial arena has been introduced as having applicability in such technology evaluations. Indeed, characteristics of superconducting magnetic energy storage technology suggest that it is a candidate for the use of options methodology when investment decisions are being contemplated.
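The classic translation of this idea treats a deferrable investment as a European call option on the project's cash flows and values it with the Black-Scholes formula. A minimal sketch with hypothetical figures for a storage-technology opportunity:

    from math import log, sqrt, exp, erf

    def norm_cdf(x):
        return 0.5 * (1.0 + erf(x / sqrt(2.0)))

    def real_option_value(S, K, T, r, sigma):
        # Black-Scholes call: S = PV of project cash flows, K = investment cost
        d1 = (log(S / K) + (r + 0.5 * sigma**2) * T) / (sigma * sqrt(T))
        d2 = d1 - sigma * sqrt(T)
        return S * norm_cdf(d1) - K * exp(-r * T) * norm_cdf(d2)

    # Hypothetical R&D opportunity: uncertain payoff, deferrable five years
    print(real_option_value(S=80e6, K=100e6, T=5.0, r=0.05, sigma=0.45))

Note that the static NPV here is negative (80M minus 100M), yet the option to wait is worth roughly 30M: exactly the strategic value that conventional present-value screening discards.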
NASA Astrophysics Data System (ADS)
Ehlmann, Bryon K.
Current scientific experiments are often characterized by massive amounts of very complex data and the need for complex data analysis software. Object-oriented database (OODB) systems have the potential of improving the description of the structure and semantics of this data and of integrating the analysis software with the data. This dissertation results from research to enhance OODB functionality and methodology to support scientific databases (SDBs) and, more specifically, to support a nuclear physics experiments database for the Continuous Electron Beam Accelerator Facility (CEBAF). This research to date has identified a number of problems related to the practical application of OODB technology to the conceptual design of the CEBAF experiments database and other SDBs: the lack of a generally accepted OODB design methodology, the lack of a standard OODB model, the lack of a clear conceptual level in existing OODB models, and the limited support in existing OODB systems for many common object relationships inherent in SDBs. To address these problems, the dissertation describes an Object-Relationship Diagram (ORD) and an Object-oriented Database Definition Language (ODDL) that provide tools that allow SDB design and development to proceed systematically and independently of existing OODB systems. These tools define multi-level, conceptual data models for SDB design, which incorporate a simple notation for describing common types of relationships that occur in SDBs. ODDL allows these relationships and other desirable SDB capabilities to be supported by an extended OODB system. A conceptual model of the CEBAF experiments database is presented in terms of ORDs and the ODDL to demonstrate their functionality and use and provide a foundation for future development of experimental nuclear physics software using an OODB approach.
Performance and state-space analyses of systems using Petri nets
NASA Technical Reports Server (NTRS)
Watson, James Francis, III
1992-01-01
The goal of any modeling methodology is to develop a mathematical description of a system that is accurate in its representation and also permits analysis of structural and/or performance properties. Inherently, trade-offs exist between the level of detail in the model and the ease with which analysis can be performed. Petri nets (PN's), a highly graphical modeling methodology for Discrete Event Dynamic Systems, permit representation of shared resources, finite capacities, conflict, synchronization, concurrency, and timing between state changes. By restricting the state transition time delays to the family of exponential density functions, Markov chain analysis of performance problems is possible. One major drawback of PN's is the tendency for the state-space to grow rapidly (exponential complexity) compared to increases in the PN constructs. It is the state space, or the Markov chain obtained from it, that is needed in the solution of many problems. The theory of state-space size estimation for PN's is introduced. The problem of state-space size estimation is defined, its complexities are examined, and estimation algorithms are developed. Both top-down and bottom-up approaches are pursued, and the advantages and disadvantages of each are described. Additionally, the author's research in non-exponential transition modeling for PN's is discussed. An algorithm for approximating non-exponential transitions is developed. Since only basic PN constructs are used in the approximation, theory already developed for PN's remains applicable. Comparison to results from entropy theory shows that the transition performance is close to the theoretical optimum. Inclusion of non-exponential transition approximations improves performance results at the expense of increased state-space size. The state-space size estimation theory provides insight and algorithms for evaluating this trade-off.
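The object whose growth causes the trouble is the reachability set. The sketch below enumerates it by breadth-first search for a tiny hypothetical place/transition net; this exhaustive computation is what becomes intractable and motivates estimation.

    from collections import deque

    # Hypothetical PN: transitions as (consume, produce) vectors over 3 places
    transitions = [((1, 0, 0), (0, 1, 0)),   # t1: p1 -> p2
                   ((0, 1, 0), (0, 0, 1)),   # t2: p2 -> p3
                   ((0, 0, 1), (1, 0, 0))]   # t3: p3 -> p1

    def fire(marking, consume, produce):
        if all(m >= c for m, c in zip(marking, consume)):
            return tuple(m - c + p for m, c, p in zip(marking, consume, produce))
        return None                           # transition not enabled

    initial = (2, 0, 0)
    seen, frontier = {initial}, deque([initial])
    while frontier:                           # breadth-first reachability
        m = frontier.popleft()
        for consume, produce in transitions:
            nxt = fire(m, consume, produce)
            if nxt is not None and nxt not in seen:
                seen.add(nxt)
                frontier.append(nxt)

    print("reachable markings:", len(seen))   # the quantity being estimated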
Chelliah, Pandian; Murgesan, Kasinathan; Samvel, Sosamma; Chelamchala, Babu Rao; Tammana, Jayakumar; Nagarajan, Murali; Raj, Baldev
2010-07-10
Optical-fiber-based sensors have inherent advantages, such as immunity to electromagnetic interference, compared to conventional sensors. Distributed optical fiber sensor (DOFS) systems, such as Raman and Brillouin distributed temperature sensors, are used for leak detection. The inherent noise of fiber-based systems leads to occasional false alarms. In this paper, a methodology is proposed to overcome this: it uses a looped-back fiber mode in DOFS, and voting logic is employed to considerably reduce the false alarm rate.
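The looped-back arrangement yields two quasi-independent readings of every fiber zone, so an alarm can be required to appear in both channels before it is raised. A sketch of such two-channel voting with synthetic temperature traces; the thresholds and noise levels are hypothetical, not the authors' parameters.

    import numpy as np

    rng = np.random.default_rng(7)

    n_zones, threshold = 200, 5.0      # alarm threshold (K above ambient)
    hot = 120                          # true leak location

    def trace():
        t = rng.normal(0.0, 1.8, n_zones)   # channel noise
        t[hot] += 8.0                       # genuine rise at the leak
        return t

    fwd = trace()                      # forward direction of the loop
    bwd = trace()                      # looped-back direction, re-indexed

    # Voting logic: alarm only where BOTH directions exceed the threshold
    alarms = np.where((fwd > threshold) & (bwd > threshold))[0]
    print("alarmed zones:", alarms)    # isolated noise spikes are suppressed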
A More Flexible Approach to Valuing Flexibility
2011-04-01
remaining life of the program? Almost certainly. Next is the cost assessment step. This is executed in the context of whatever design options we...valuing the inherent ability of a system or design to accommodate change. The proposed methodology is essentially a modification of the current life cycle model and is premised on the notion that the need for capability changes in a program...
Navas, Francisco Javier; Jordana, Jordi; León, José Manuel; Arando, Ander; Pizarro, Gabriela; McLean, Amy Katherine; Delgado, Juan Vicente
2017-08-01
New productive niches can offer new commercial perspectives linked to donkeys' products and human therapeutic or leisure applications. However, no assessment of selection criteria has yet been carried out. First, we assessed the animals' inherent features and the environmental factors that may potentially influence several cognitive processes in donkeys. Then, we aimed to describe a practical methodology to quantify such cognitive processes, seeking their inclusion in breeding and conservation programmes, through a multifactorial linear model. Sixteen cognitive process-related traits were scored in a problem-solving test in a sample of 300 Andalusian donkeys over three consecutive years, from 2013 to 2015. The linear model assessed the influence and interactions of four environmental factors, sex as an animal-inherent factor, age as a covariable, and the interactions between these factors. Analyses of variance were performed with the GLM procedure of SPSS Statistics for Windows, Version 24.0, to assess the relative importance of each factor. All traits were significantly (P<0.05) affected by all factors in the model, except for sex, which was not significant for some of the cognitive processes, and stimulus, which was not significant for any of them except the coping-style-related ones. The interaction between all factors within the model was non-significant (P<0.05) for almost all cognitive processes. The development of complex multifactorial models to study cognitive processes may counteract the inherent variability in behavior genetics and support the estimation and prediction of related breeding parameters, key for the implementation of successful conservation programmes in apparently functionally misplaced endangered breeds. Copyright © 2017 Elsevier Ltd. All rights reserved.
An Embedded Statistical Method for Coupling Molecular Dynamics and Finite Element Analyses
NASA Technical Reports Server (NTRS)
Saether, E.; Glaessgen, E.H.; Yamakov, V.
2008-01-01
The coupling of molecular dynamics (MD) simulations with finite element methods (FEM) yields computationally efficient models that link fundamental material processes at the atomistic level with continuum field responses at higher length scales. The theoretical challenge involves developing a seamless connection along an interface between two inherently different simulation frameworks. Various specialized methods have been developed to solve particular classes of problems. Many of these methods link the kinematics of individual MD atoms with FEM nodes at their common interface, necessarily requiring that the finite element mesh be refined to atomic resolution. Some of these coupling approaches also require simulations to be carried out at 0 K and restrict modeling to two-dimensional material domains due to difficulties in simulating full three-dimensional material processes. In the present work, a new approach to MD-FEM coupling is developed based on a restatement of the standard boundary value problem used to define a coupled domain. The method replaces a direct linkage of individual MD atoms and finite element (FE) nodes with a statistical averaging of atomistic displacements in local atomic volumes associated with each FE node in an interface region. The FEM and MD computational systems are effectively independent and communicate only through an iterative update of their boundary conditions. With the use of statistical averages of the atomistic quantities to couple the two computational schemes, the developed approach is referred to as an embedded statistical coupling method (ESCM). ESCM provides an enhanced coupling methodology that is inherently applicable to three-dimensional domains, avoids discretization of the continuum model to atomic scale resolution, and permits finite temperature states to be applied.
The Prisoner Problem--A Generalization.
ERIC Educational Resources Information Center
Gannon, Gerald E.; Martelli, Mario U.
2000-01-01
Presents a generalization to the classic prisoner problem, which is inherently interesting and has a solution within the reach of most high school mathematics students. Suggests the problem as a way to emphasize to students the final step in a problem-solver's tool kit, considering possible generalizations when a particular problem has been…
ERIC Educational Resources Information Center
Marzuk, Peter M.
1994-01-01
Reviews epidemiology of suicide among terminally ill. Discusses clinical assessment and management of suicidal terminally ill, emphasizing differences from evaluation and treatment of other suicidal individuals. Focuses on methodological issues inherent in studying treatment and characteristics of this population. Suggests blurring of line between…
[Argumentation and construction of validity in Carlos Matus' situational strategic planning].
Rivera, Francisco Javier Uribe
2011-09-01
This study analyzes the process of producing a situational plan against a benchmark from the philosophy of language and argumentation theory. The basic approach used in the analysis was developed by Carlos Matus. Specifically, the study seeks to identify the argumentative structure and patterns inherent in the situational explanation and in the regulatory design of a plan's operations, taking argumentative approaches from pragma-dialectics and informal logic as the analytical parameters. The explanation of a health problem is used to illustrate the study. Methodologically, the study is based on the existing literature on the subject and on case analyses. The study concludes by proposing that the use of these specific references introduces greater rigor into both the analysis of the validity of causal arguments and the design of intervention proposals, so that they are more conclusive in achieving a plan's objectives.
Application of kaizen methodology to foster departmental engagement in quality improvement.
Knechtges, Paul; Decker, Michael Christopher
2014-12-01
The Toyota Production System, also known as Lean, is a structured approach to continuous quality improvement that has been developed over the past 50 years to transform the automotive manufacturing process. In recent years, these techniques have been successfully applied to quality and safety improvement in the medical field. One of these techniques is kaizen, which is the Japanese word for "good change." The central tenet of kaizen is the quick analysis of the small, manageable components of a problem and the rapid implementation of a solution with ongoing, real-time reassessment. Kaizen adds an additional "human element": all stakeholders, not just management, must be involved in such change. Because of the small size of the changes involved in a kaizen event and the inherent focus on human factors and change management, a kaizen event can serve as a good introduction to continuous quality improvement for a radiology department. Copyright © 2014. Published by Elsevier Inc.
Is pretenure interdisciplinary research a career risk?
NASA Astrophysics Data System (ADS)
Fischer, E. V.; Mackey, K. R. M.; Cusack, D. F.; DeSantis, L. R. G.; Hartzell-Nichols, L.; Lutz, J. A.; Melbourne-Thomas, J.; Meyer, R.; Riveros-Iregui, D. A.; Sorte, C. J. B.; Taylor, J. R.; White, S. A.
2012-08-01
Despite initiatives to promote interdisciplinary research, early-career academics continue to perceive professional risks to working at the interface between traditional disciplines. Unexpectedly, the inherent practical challenges of interdisciplinary scholarship, such as new methodologies and lexicons, are not the chief source of the perceived risk. The perception of risk is pervasive across disciplines, and it persists despite efforts to support career development for individuals with common interests [Mitchell and Weiler, 2011]. Suggestions that interdisciplinary work can go unrewarded in academia [Clark et al., 2011] foster a concern that targeting interdisciplinary questions, such as those presented by climate change, will pose problems for acquiring and succeeding in a tenure-track position. If self-preservation limits the questions posed by early-career academics, a perceived career risk is as damaging as a real one to new transdisciplinary initiatives. Thus, institutions should address the source of this perception whether real or specious.
Object-oriented programming for the biosciences.
Wiechert, W; Joksch, B; Wittig, R; Hartbrich, A; Höner, T; Möllney, M
1995-10-01
The development of software systems for the biosciences is always closely connected to experimental practice. Programs must be able to handle the inherent complexity and heterogeneous structure of biological systems in combination with the measuring equipment. Moreover, a high degree of flexibility is required to treat rapidly changing experimental conditions. Object-oriented methodology seems to be well suited for this purpose. It enables an evolutionary approach to software development that still maintains a high degree of modularity. This paper presents experience with object-oriented technology gathered during several years of programming in the fields of bioprocess development and metabolic engineering. It concentrates on the aspects of experimental support, data analysis, interaction and visualization. Several examples are presented and discussed in the general context of the experimental cycle of knowledge acquisition, thus pointing out the benefits and problems of object-oriented technology in the specific application field of the biosciences. Finally, some strategies for future development are described.
The Influence of Social Media on Addictive Behaviors in College Students.
Steers, Mai-Ly N; Moreno, Megan A; Neighbors, Clayton
2016-12-01
Social media has become a primary way for college students to communicate aspects of their daily lives to those within their social network. Such communications often include substance use displays (e.g., selfies of college students drinking). Furthermore, students' substance use displays have been found to robustly predict not only the posters' substance use-related outcomes (e.g., consumption, problems) but also that of their social networking peers. The current review summarizes findings of recent literature exploring the intersection between social media and substance use. Specifically, we examine how and why such substance use displays might shape college students' internalized norms surrounding substance use and how it impacts their substance use-related behaviors. Additional social media-related interventions are needed in order to target reduction of consumption among this at-risk group. We discuss the technological and methodological challenges inherent to conducting research and devising interventions in this domain.
Standards for Environmental Measurement Using GIS: Toward a Protocol for Protocols.
Forsyth, Ann; Schmitz, Kathryn H; Oakes, Michael; Zimmerman, Jason; Koepp, Joel
2006-02-01
Interdisciplinary research regarding how the built environment influences physical activity has recently increased. Many research projects conducted jointly by public health and environmental design professionals are using geographic information systems (GIS) to objectively measure the built environment. Numerous methodological issues remain, however, and environmental measurements have not been well documented with accepted, common definitions of valid, reliable variables. This paper proposes how to create and document standardized definitions for measures of environmental variables using GIS with the ultimate goal of developing reliable, valid measures. Inherent problems with software and data that hamper environmental measurement can be offset by protocols combining clear conceptual bases with detailed measurement instructions. Examples demonstrate how protocols can more clearly translate concepts into specific measurement. This paper provides a model for developing protocols to allow high quality comparative research on relationships between the environment and physical activity and other outcomes of public health interest.
Vidal, Fernando
2018-03-01
Science in film, and usual equivalents such as science on film or science on screen, refer to the cinematographic representation, staging, and enactment of actors, information, and processes involved in any aspect or dimension of science and its history. Of course, boundaries are blurry, and films shot as research tools or documentation also display science on screen. Nonetheless, they generally count as scientific film, and science in and on film or screen tend to designate productions whose purpose is entertainment and education. Moreover, these two purposes are often combined, and inherently concern empirical, methodological, and conceptual challenges associated with popularization, science communication, and the public understanding of science. It is in these areas that the notion of the deficit model emerged to designate a point of view and a mode of understanding, as well as a set of practical and theoretical problems about the relationship between science and the public.
Mewes, H W
2013-05-01
Psychiatric diseases provoke human tragedies. Asocial behaviour, mood imbalance, uncontrolled affect, and cognitive malfunction are the price for the biological and social complexity of neurobiology. To understand the etiology and to influence the onset and progress of mental diseases remains of utmost importance, but despite much-improved care for patients, more than 100 years of research have not succeeded in understanding the basic disease mechanisms and enabling rational treatment. With the advent of genome-based technologies, much hope has been created that joining the various dimensions of -omics data will uncover the secrets of mental illness. Big Data as generated by -omics do not come with explanations. In this essay, I will discuss the inherent, not well understood methodological foundations and problems that seriously obstruct the striving for quick success and may render lucky strikes impossible. © Georg Thieme Verlag KG Stuttgart · New York.
Predicting Causes of Data Quality Issues in a Clinical Data Research Network.
Khare, Ritu; Ruth, Byron J; Miller, Matthew; Tucker, Joshua; Utidjian, Levon H; Razzaghi, Hanieh; Patibandla, Nandan; Burrows, Evanette K; Bailey, L Charles
2018-01-01
Clinical data research networks (CDRNs) invest substantially in identifying and investigating data quality problems. While identification is largely automated, the investigation and resolution are carried out manually at individual institutions. In the PEDSnet CDRN, we found that only approximately 35% of the identified data quality issues are resolvable as they are caused by errors in the extract-transform-load (ETL) code. Nonetheless, with no prior knowledge of issue causes, partner institutions end up spending significant time investigating issues that represent either inherent data characteristics or false alarms. This work investigates whether the causes (ETL, Characteristic, or False alarm) can be predicted before spending time investigating issues. We trained a classifier on the metadata from 10,281 real-world data quality issues, and achieved a cause prediction F1-measure of up to 90%. While initially tested on PEDSnet, the proposed methodology is applicable to other CDRNs facing similar bottlenecks in handling data quality results.
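A minimal sketch of such a metadata classifier, with invented features and synthetic labels (the abstract does not name the model; a random forest stands in here, and the random labels of course yield only chance-level F1 rather than the reported 90%):

```python
import numpy as np
from sklearn.ensemble import RandomForestClassifier
from sklearn.metrics import f1_score
from sklearn.model_selection import train_test_split

# Hypothetical issue metadata: [check_type_id, table_id, rows_flagged, prior_issues_at_site]
rng = np.random.default_rng(0)
X = rng.integers(0, 50, size=(1000, 4))
y = rng.choice(["ETL", "Characteristic", "FalseAlarm"], size=1000)

X_tr, X_te, y_tr, y_te = train_test_split(X, y, test_size=0.25, random_state=0)
clf = RandomForestClassifier(n_estimators=200, random_state=0).fit(X_tr, y_tr)
print("macro F1:", f1_score(y_te, clf.predict(X_te), average="macro"))
```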
Raising the "glass ceiling" for ethnic minority women in health care management.
Kumar, R; Johnston, G
1999-01-01
Ethnic minority women are well represented in the work force and in the health care system in general, but do not have a similar level of representation in the management sector. This paper explores three strategies for schools of health administration to consider to lessen the effect of a "glass ceiling" that may be encountered by ethnic minority women aspiring to positions of leadership in health services agencies. These strategies are advancing affirmative action, valuing ethnic women in health administration education, and investigating diversity management. Inherent in each of the three strategies is the need for acknowledgment and more open discussion of the "glass ceiling." Also needed are problem-solving in relation to the potential for systemic discrimination adversely affecting ethnic minority women in senior health care management positions, and greater study of the three strategies using both qualitative and quantitative methodologies.
Computer Programming: A Medium for Teaching Problem Solving.
ERIC Educational Resources Information Center
Casey, Patrick J.
1997-01-01
Argues that including computer programming in the curriculum as a medium for instruction is a feasible alternative for teaching problem solving. Discusses the nature of problem solving; the problem-solving elements of discovery, motivation, practical learning situations and flexibility which are inherent in programming; capabilities of computer…
MODELING METHODOLOGIES FOR OIL SPILLS
Oil spilled into aquatic environments is subject to a number of fates, including natural dispersion, emulsification and weathering. An oil slick moves due to the inherent spreading of the oil, currents, winds and waves. All of these processes influence the impacts of the oil on...
Feminist Research Methodologies as Collective Self-Education and Political Praxis.
ERIC Educational Resources Information Center
Joyappa, Vinitha; Self, Lois S.
1996-01-01
Opposing inherent biases in traditional research, feminist research methods acknowledge the worthiness of all human experience and emphasize changed relationships between researcher and researched. A more integrative feminist theory needs to avoid cultural imperialism and an implied universality of "women's experience." (SK)
Interpretational Confounding or Confounded Interpretations of Causal Indicators?
ERIC Educational Resources Information Center
Bainter, Sierra A.; Bollen, Kenneth A.
2014-01-01
In measurement theory, causal indicators are controversial and little understood. Methodological disagreement concerning causal indicators has centered on the question of whether causal indicators are inherently sensitive to interpretational confounding, which occurs when the empirical meaning of a latent construct departs from the meaning…
Development of fuzzy air quality index using soft computing approach.
Mandal, T; Gorai, A K; Pathak, G
2012-10-01
Proper assessment of air quality status in an atmosphere based on limited observations is an essential task for meeting the goals of environmental management. A number of classification methods are available for estimating the changing status of air quality. However, a discrepancy frequently arises from the air quality criteria employed and the vagueness or fuzziness embedded in decision-making output values. Owing to this inherent imprecision, difficulties always exist in conventional methodologies such as the air quality index when describing integrated air quality conditions with respect to various pollutant parameters and times of exposure. In recent years, fuzzy logic-based methods have been demonstrated to be appropriate for addressing uncertainty and subjectivity in environmental issues. In the present study, a methodology based on fuzzy inference systems (FIS) to assess air quality is proposed. This paper presents a comparative study assessing the status of air quality using the fuzzy logic technique against the conventional technique. The findings clearly indicate that the FIS may successfully harmonize inherent discrepancies and interpret complex conditions.
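A minimal single-pollutant sketch of the fuzzy inference idea, with invented membership breakpoints and Sugeno-style rule outputs (not the FIS calibrated in the paper):

```python
def tri(x, a, b, c):
    """Triangular membership function with breakpoints (a, b, c)."""
    if x <= a or x >= c:
        return 0.0
    return (x - a) / (b - a) if x <= b else (c - x) / (c - b)

def fuzzy_aqi(pm10):
    low  = tri(pm10, -1, 0, 60)      # degree of "low PM10"
    mid  = tri(pm10, 40, 90, 140)    # degree of "moderate PM10"
    high = tri(pm10, 120, 200, 400)  # degree of "high PM10"
    # Rules: low -> index 25, moderate -> 100, high -> 250 (illustrative values)
    w, z = [low, mid, high], [25.0, 100.0, 250.0]
    return sum(wi * zi for wi, zi in zip(w, z)) / (sum(w) or 1.0)

print(fuzzy_aqi(50.0))  # a reading that is partly "low", partly "moderate"
```

The weighted-average defuzzification blends overlapping classes instead of snapping to a single category, which is precisely how a fuzzy index avoids the sharp-boundary discrepancies of a conventional one.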
Modeling 4D Pathological Changes by Leveraging Normative Models
Wang, Bo; Prastawa, Marcel; Irimia, Andrei; Saha, Avishek; Liu, Wei; Goh, S.Y. Matthew; Vespa, Paul M.; Van Horn, John D.; Gerig, Guido
2016-01-01
With the increasing use of efficient multimodal 3D imaging, clinicians are able to access longitudinal imaging to stage pathological diseases, to monitor the efficacy of therapeutic interventions, or to assess and quantify rehabilitation efforts. Analysis of such four-dimensional (4D) image data presenting pathologies, including disappearing and newly appearing lesions, represents a significant challenge due to the presence of complex spatio-temporal changes. Image analysis methods for such 4D image data have to include not only a concept for joint segmentation of 3D datasets to account for inherent correlations of subject-specific repeated scans but also a mechanism to account for large deformations and the destruction and formation of lesions (e.g., edema, bleeding) due to underlying physiological processes associated with damage, intervention, and recovery. In this paper, we propose a novel joint segmentation-registration framework to tackle the inherent problem of image registration in the presence of objects not present in all images of the time series. Our methodology models 4D changes in pathological anatomy across time and also provides an explicit mapping of a healthy normative template to a subject's image data with pathologies. Since atlas-moderated segmentation methods cannot explain the appearance and location of pathological structures that are not represented in the template atlas, the new framework provides different options for initialization via a supervised learning approach, iterative semi-supervised active learning, and also transfer learning, which results in a fully automatic 4D segmentation method. We demonstrate the effectiveness of our novel approach with synthetic experiments and a 4D multimodal MRI dataset of severe traumatic brain injury (TBI), including validation via comparison to expert segmentations. However, the proposed methodology is generic in regard to different clinical applications requiring quantitative analysis of 4D imaging representing spatio-temporal changes of pathologies. PMID:27818606
Evaluating an Inquiry-Based Bioinformatics Course Using Q Methodology
ERIC Educational Resources Information Center
Ramlo, Susan E.; McConnell, David; Duan, Zhong-Hui; Moore, Francisco B.
2008-01-01
Faculty at a Midwestern metropolitan public university recently developed a course on bioinformatics that emphasized collaboration and inquiry. Bioinformatics, essentially the application of computational tools to biological data, is inherently interdisciplinary. Thus part of the challenge of creating this course was serving the needs and…
A new pre-loaded beam geometric stiffness matrix with full rigid body capabilities
NASA Astrophysics Data System (ADS)
Bosela, P. A.; Fertis, D. G.; Shaker, F. J.
1992-09-01
Space structures, such as the Space Station solar arrays, must be extremely light-weight, flexible structures. Accurate prediction of the natural frequencies and mode shapes is essential for determining the structural adequacy of components and for designing a controls system. The tension pre-load in the 'blanket' of photovoltaic solar collectors, and the free/free boundary conditions of a structure in space, raise serious reservations about the use of standard finite element solution techniques. In particular, a phenomenon known as 'grounding', or false stiffening, of the stiffness matrix occurs during rigid body rotation. The authors have previously shown that the grounding phenomenon is caused by a lack of rigid body rotational capability, and is typical in beam geometric stiffness matrices formulated by others, including those which contain higher order effects. The cause of the problem was identified as the force imbalance inherent in the formulations. In this paper, the authors develop a beam geometric stiffness matrix for a directed force problem, and show that the resultant global stiffness matrix contains complete rigid body mode capabilities and performs very well in the diagonalization methodology customarily used in dynamic analysis.
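The rigid-body test behind the 'grounding' diagnosis is easy to state: a stiffness matrix is free of false stiffening only if rigid-body displacement modes produce zero nodal forces. The sketch below applies the test to the familiar 2D elastic beam element; a pre-loaded geometric stiffness matrix like the paper's would be checked the same way.

```python
import numpy as np

E, A, I, L = 70e9, 1e-4, 1e-8, 2.0
k, a = E * I / L**3, E * A / L
# Standard 2D elastic beam element (dof order: u1, v1, th1, u2, v2, th2)
K = np.array([
    [ a,      0,         0,        -a,      0,         0        ],
    [ 0,      12*k,      6*k*L,     0,     -12*k,      6*k*L    ],
    [ 0,      6*k*L,     4*k*L**2,  0,     -6*k*L,     2*k*L**2 ],
    [-a,      0,         0,         a,      0,         0        ],
    [ 0,     -12*k,     -6*k*L,     0,      12*k,     -6*k*L    ],
    [ 0,      6*k*L,     2*k*L**2,  0,     -6*k*L,     4*k*L**2 ],
])
# Rigid-body modes: x-translation, y-translation, small rotation about node 1
modes = np.array([
    [1, 0, 0, 1, 0, 0],
    [0, 1, 0, 0, 1, 0],
    [0, 0, 1, 0, L, 1],
], dtype=float).T
print(np.max(np.abs(K @ modes)))  # ~0: no spurious "grounding" forces arise
```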
Adaptive multiresolution modeling of groundwater flow in heterogeneous porous media
NASA Astrophysics Data System (ADS)
Malenica, Luka; Gotovac, Hrvoje; Srzic, Veljko; Andric, Ivo
2016-04-01
The proposed methodology was originally developed by our scientific team in Split, who designed a multiresolution approach for analyzing flow and transport processes in highly heterogeneous porous media. The main properties of the adaptive Fup multiresolution approach are: 1) the computational capabilities of compactly supported Fup basis functions, able to resolve all spatial and temporal scales; 2) multiresolution representation of heterogeneity as well as of all other input and output variables; 3) an accurate, adaptive and efficient solution strategy; and 4) semi-analytical properties which increase our understanding of the usually complex flow and transport processes in porous media. The main computational idea behind this approach is to separately find the minimum number of basis functions and resolution levels necessary to describe each flow and transport variable with the desired accuracy on a particular adaptive grid. Therefore, each variable is analyzed separately, and the adaptive and multiscale nature of the methodology enables not only computational efficiency and accuracy but also a description of subsurface processes closely related to their physical interpretation. The methodology inherently supports a mesh-free procedure, avoiding classical numerical integration, and yields continuous velocity and flux fields, which is vitally important for flow and transport simulations. In this paper, we show recent improvements within the proposed methodology. Since the state-of-the-art multiresolution approach usually uses the method of lines and only a spatially adaptive procedure, temporal approximation has rarely been treated as multiscale. Therefore, a novel adaptive implicit Fup integration scheme is developed, resolving all time scales within each global time step. This means that the algorithm uses smaller time steps only along lines where solution changes are intensive. The application of Fup basis functions enables continuous approximation in time, simple interpolation across different temporal lines and local time-stepping control. A critical aspect of time integration accuracy is the construction of the spatial stencil needed for accurate calculation of spatial derivatives. Since the common approach for wavelets and splines uses a finite difference operator, we developed here a collocation operator that includes both solution values and the differential operator. In this way, the new algorithm is adaptive in space and time, enabling accurate solutions of groundwater flow problems, especially in highly heterogeneous porous media with large lnK variances and different correlation length scales. In addition, differences between the collocation and finite volume approaches are discussed. Finally, results show the application of the methodology to groundwater flow problems in highly heterogeneous confined and unconfined aquifers.
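The core adaptive idea, refining only where the current approximation misses a tolerance, can be sketched generically. Fup basis functions are well beyond a few lines, so plain linear interpolation stands in for them here; this is an assumption of the sketch, not the paper's scheme.

```python
import numpy as np

def adapt_grid(f, x, tol=1e-3, max_levels=12):
    """Dyadically refine a 1-D grid wherever linear interpolation of f fails tol."""
    for _ in range(max_levels):
        mid = 0.5 * (x[:-1] + x[1:])
        err = np.abs(f(mid) - 0.5 * (f(x[:-1]) + f(x[1:])))
        if not (err > tol).any():
            break
        x = np.sort(np.concatenate([x, mid[err > tol]]))  # refine flagged cells only
    return x

f = lambda x: np.tanh(50 * (x - 0.5))        # sharp front, like a strong lnK contrast
x = adapt_grid(f, np.linspace(0.0, 1.0, 5))
print(len(x))                                # points cluster near the front at x = 0.5
```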
Evidence Base Update for Psychosocial Treatments for Pediatric Obsessive-Compulsive Disorder
Freeman, Jennifer; Garcia, Abbe; Frank, Hannah; Benito, Kristen; Conelea, Christine; Walther, Michael; Edmunds, Julie
2013-01-01
Objective Pediatric Obsessive Compulsive Disorder (OCD) is a chronic and impairing condition that often persists into adulthood. Barrett and colleagues (2008), in this journal, provided a detailed review of evidence based psychosocial treatments for youth with OCD. The current review provides an evidence base update of the pediatric OCD psychosocial treatment literature with particular attention to advances in the field as well as to the methodological challenges inherent in evaluating such findings. Method Psychosocial treatment studies conducted since the last review are described and evaluated according to methodological rigor and evidence-based classification using the JCCAP evidence based treatment (EBT) evaluation criteria (Southam-Gerow and Prinstein, this issue). Results Findings from this review clearly converge in support of CBT as an effective and appropriate first line treatment for youth with OCD (either alone or in combination with medication). Although no treatment for pediatric OCD has yet to be designated as “well established”, both individual and individual family based treatments have been shown to be “probably efficacious.” Conclusions Moderators and predictors of treatment outcome are discussed as are the areas where we have advanced the field and the areas where we have room to grow. The methodological and clinical challenges inherent in a review of the evidence base are reviewed. Finally, future research directions are outlined. PMID:23746138
Rosendahl Appelquist, Lars; Balstrøm, Thomas
2015-04-01
This paper presents the application of a new methodology for coastal multi-hazard assessment and management under a changing global climate to the state of Karnataka, India. The recently published methodology, termed the Coastal Hazard Wheel (CHW), is designed for local, regional and national hazard screening in areas with limited data availability, and covers the hazards of ecosystem disruption, gradual inundation, salt water intrusion, erosion and flooding. The application makes use of published geophysical data and remote sensing information and showcases how the CHW framework can be applied at a scale relevant for regional planning purposes. It uses a GIS approach to develop regional and sub-regional hazard maps as well as to produce relevant hazard risk data, and includes a discussion of uncertainties, limitations and management perspectives. The hazard assessment shows that 61 percent of Karnataka's coastline has a high or very high inherent hazard of erosion, making erosion the most prevalent coastal hazard. The hazards of flooding and salt water intrusion are also relatively widespread, as 39 percent of Karnataka's coastline has a high or very high inherent hazard for both of these hazard types. Copyright © 2014 The Authors. Published by Elsevier Ltd. All rights reserved.
In silico strategies toward enzyme function and dynamics.
Estácio, Sílvia G
2012-01-01
Enzymes are outstanding biocatalysts involved in a plethora of chemical reactions occurring in the cell. Despite their incommensurable importance, a comprehensive understanding of enzyme catalysis is still missing. This task becomes more laborious given the unavoidability of including the inherent dynamic nature of enzymes into that description. As such, it is essential to ascertain the nature and contribution of enzyme conformational changes to catalysis and to evaluate the adequacy of the proposal associating protein internal motions to the rate enhancement achieved. Dynamic events in enzymes span a wide range of time- and length-scales which have led to a surge in multiscale methodologies targeting enzyme function and dynamics. Computational strategies assume a preponderant role in such studies by allowing the atomic detail investigation of the fundamental mechanisms of enzyme catalysis thus surpassing what is achievable through experiments. While high-accuracy quantum mechanical methods are indicated to uncover the details of the chemical reaction occurring at the active site, molecular mechanical force fields and molecular dynamics approaches provide powerful means to access the conformational energy landscape accessible to enzymes. This review outlines some of the most important in silico methodologies in this area, highlighting examples of problems tackled and the insights obtained. Copyright © 2012 Elsevier Inc. All rights reserved.
Direct metal laser sintering titanium dental implants: a review of the current literature.
Mangano, F; Chambrone, L; van Noort, R; Miller, C; Hatton, P; Mangano, C
2014-01-01
Statement of Problem. Direct metal laser sintering (DMLS) is a technology that allows fabrication of complex-shaped objects from powder-based materials, according to a three-dimensional (3D) computer model. With DMLS, it is possible to fabricate titanium dental implants with an inherently porous surface, a key property required of implantation devices. Objective. The aim of this review was to evaluate the evidence for the reliability of DMLS titanium dental implants and their clinical and histologic/histomorphometric outcomes, as well as their mechanical properties. Materials and Methods. Electronic database searches were performed. Inclusion criteria were clinical and radiographic studies, histologic/histomorphometric studies in humans and animals, mechanical evaluations, and in vitro cell culture studies on DMLS titanium implants. Meta-analysis could be performed only for randomized controlled trials (RCTs); to evaluate the methodological quality of observational human studies, the Newcastle-Ottawa scale (NOS) was used. Results. Twenty-seven studies were included in this review. No RCTs were found, and meta-analysis could not be performed. The outcomes of observational human studies were assessed using the NOS: these studies showed medium methodological quality. Conclusions. Several studies have demonstrated the potential for the use of DMLS titanium implants. However, further studies that demonstrate the benefits of DMLS implants over conventional implants are needed.
Mauro, John C; Loucks, Roger J; Balakrishnan, Jitendra; Raghavan, Srikanth
2007-05-21
The thermodynamics and kinetics of a many-body system can be described in terms of a potential energy landscape in multidimensional configuration space. The partition function of such a landscape can be written in terms of a density of states, which can be computed using a variety of Monte Carlo techniques. In this paper, a new self-consistent Monte Carlo method for computing density of states is described that uses importance sampling and a multiplicative update factor to achieve rapid convergence. The technique is then applied to compute the equilibrium quench probability of the various inherent structures (minima) in the landscape. The quench probability depends on both the potential energy of the inherent structure and the volume of its corresponding basin in configuration space. Finally, the methodology is extended to the isothermal-isobaric ensemble in order to compute inherent structure quench probabilities in an enthalpy landscape.
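The multiplicative update factor described is reminiscent of Wang-Landau sampling. The sketch below estimates the density of states of a small 1-D Ising ring in that style; the paper's self-consistent importance-sampling scheme differs in detail, so treat this only as an illustration of density-of-states Monte Carlo.

```python
import math, random

N = 10  # spins in a periodic 1-D Ising ring; E = -sum(s_i * s_{i+1})
def energy(s):
    return -sum(s[i] * s[(i + 1) % N] for i in range(N))

levels = list(range(-N, N + 1, 4))          # attainable energies: -10, -6, ..., 10
ln_g = {E: 0.0 for E in levels}             # running estimate of ln(density of states)
hist = {E: 0 for E in levels}
s = [random.choice((-1, 1)) for _ in range(N)]
E, ln_f = energy(s), 1.0

while ln_f > 1e-6:
    i = random.randrange(N)
    dE = 2 * s[i] * (s[i - 1] + s[(i + 1) % N])
    # Accept with probability g(E)/g(E'): biases the walk toward rare energies
    if random.random() < math.exp(min(0.0, ln_g[E] - ln_g[E + dE])):
        s[i] = -s[i]
        E += dE
    ln_g[E] += ln_f                          # multiplicative update (in log space)
    hist[E] += 1
    if min(hist.values()) > 0.8 * sum(hist.values()) / len(hist):  # histogram flat?
        hist = {k: 0 for k in hist}
        ln_f /= 2.0                          # tighten the update factor

print({E: round(v - ln_g[-N], 2) for E, v in ln_g.items()})  # ln g relative to ground state
```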
Classification versus inference learning contrasted with real-world categories.
Jones, Erin L; Ross, Brian H
2011-07-01
Categories are learned and used in a variety of ways, but the research focus has been on classification learning. Recent work contrasting classification with inference learning of categories found important later differences in category performance. However, theoretical accounts differ on whether this is due to an inherent difference between the tasks or to the implementation decisions. The inherent-difference explanation argues that inference learners focus on the internal structure of the categories--what each category is like--while classification learners focus on diagnostic information to predict category membership. In two experiments, using real-world categories and controlling for earlier methodological differences, inference learners learned more about what each category was like than did classification learners, as evidenced by higher performance on a novel classification test. These results suggest that there is an inherent difference between learning new categories by classifying an item versus inferring a feature.
Approaches to advancescientific understanding of macrosystems ecology
DOE Office of Scientific and Technical Information (OSTI.GOV)
Levy, Ofir; Ball, Becky; Bond-Lamberty, Benjamin
Macrosystem ecological studies inherently investigate processes that interact across multiple spatial and temporal scales, requiring intensive sampling and massive amounts of data from diverse sources to incorporate complex cross-scale and hierarchical interactions. Inherent challenges associated with these characteristics include high computational demands, data standardization and assimilation, identification of important processes and scales without prior knowledge, and the need for large, cross-disciplinary research teams that conduct long-term studies. Therefore, macrosystem ecology studies must utilize a unique set of approaches that are capable of encompassing these methodological characteristics and associated challenges. Several case studies demonstrate innovative methods used in current macrosystem ecology studies.
Jazz Researchers: Riding the Dissonance of Pedagogy and Inquiry
ERIC Educational Resources Information Center
Lozenski, Brian D.
2016-01-01
Drawing from a two-year ethnographic study, this article establishes jazz as an epistemological metaphor for critical participatory action research. The author juxtaposes the tensions inherent in jazz music and critical participatory research methodologies to provide a framework for understanding how dissonance can become a productive element for…
Relative radiometric calibration of LANDSAT TM reflective bands
NASA Technical Reports Server (NTRS)
Barker, J. L.
1984-01-01
A common scientific methodology and terminology is outlined for characterizing the radiometry of both TM sensors. The magnitude of the most significant sources of radiometric variability are discussed and methods are recommended for achieving the exceptional potential inherent in the radiometric precision and accuracy of the TM sensors.
Skepticism and Qualitative Research: A View from Inside.
ERIC Educational Resources Information Center
Smith, Richard
1980-01-01
Discusses the tendency to formalize qualitative research methodologies in order to clarify basic issues inherent in contemplating ethnographic research. Presents a critique of social phenomenological positions in educational research and suggests two alternative qualitative approaches, one conceived by A.W. Imershein and the other by J.W. Knight.…
Bridging Some Intercultural Gaps: Methodological Reflections from Afar
ERIC Educational Resources Information Center
Kama, Amit
2006-01-01
Identity formation and self construction are inherently cultural phenomena. Although it may seem that human psychology--e.g., basic traits, tendencies, "characteristics," or even the definition of self--are universal and ahistorical, this essentialist view is quite erroneous and needs to be recognized and avoided. The task of studying one's…
Radiation environment and shielding for early manned Mars missions
NASA Technical Reports Server (NTRS)
Hall, Stephen B.; Mccann, Michael E.
1986-01-01
The problem of shielding a crew during early manned Mars missions is discussed. Requirements for shielding are presented in the context of current astronaut exposure limits, natural ionizing radiation sources, and shielding inherent in a particular Mars vehicle configuration. An estimated range for shielding weight is presented based on the worst solar flare dose, mission duration, and inherent vehicle shielding.
NASA Technical Reports Server (NTRS)
Campbell, John P; Mckinney, Marion O , Jr
1954-01-01
Considerable interest has recently been shown in means of obtaining satisfactory stability of the Dutch roll oscillation for modern high-performance airplanes without resort to complicated artificial stabilizing devices. One approach to this problem is to lay out the airplane in the earliest stages of design so that it will have the greatest practicable inherent stability of the lateral oscillation. The present report presents some preliminary results of a theoretical analysis to determine the design features that appear most promising in providing adequate inherent stability. These preliminary results cover the case of fighter airplanes at subsonic speeds. The investigation indicated that it is possible to design fighter airplanes to have substantially better inherent stability than most current designs. Since the use of low-aspect-ratio swept-back wings is largely responsible for poor Dutch roll stability, it is important to design the airplane with the maximum aspect ratio and minimum sweep that will permit attainment of the desired performance. The radius of gyration in roll should be kept as low as possible and the nose-up inclination of the principal longitudinal axis of inertia should be made as great as practicable. (author)
Common Methodological Problems in Research on the Addictions.
ERIC Educational Resources Information Center
Nathan, Peter E.; Lansky, David
1978-01-01
Identifies common problems in research on the addictions and offers suggestions for remediating these methodological problems. The addictions considered include alcoholism and drug dependencies. Problems considered are those arising from inadequate, incomplete, or biased reviews of relevant literatures and methodological shortcomings of subject…
Nutritional status as an indicator of impending food stress*.
Galvin, K A
1988-06-01
Famine early warning systems benefit from a variety of indicators which together signal the initial stages of food stress for particular population groups. Anthropometry has been used as an indicator in early warning systems, but there are inherent problems in its use which should be understood. Using data from Turkana pastoralists of northwest Kenya, this paper discusses the problems of: time lag between food shortages and changes in body size and composition; use of reference points; accurate age assessment; and establishment of baseline data. Diet composition data are suggested to be an additional nutrition-oriented indicator of impending food stress and one in which problems associated with anthropometry are not inherent. Both measures may be useful in monitoring a population, but their strengths and weaknesses should be appreciated.
The Cloud-Based Integrated Data Viewer (IDV)
NASA Astrophysics Data System (ADS)
Fisher, Ward
2015-04-01
Maintaining software compatibility across new computing environments and the associated underlying hardware is a common problem for software engineers and scientific programmers. While there is a suite of tools and methodologies used in traditional software engineering environments to mitigate this issue, they are typically ignored by developers lacking a background in software engineering. The result is a large body of software which is simultaneously critical and difficult to maintain. Visualization software is particularly vulnerable to this problem, given its inherent dependency on particular graphics hardware and software APIs. The advent of cloud computing has provided a solution to this problem that was not previously practical on a large scale: application streaming. This technology allows a program to run entirely on a remote virtual machine while still allowing for interactivity and dynamic visualizations, with little to no re-engineering required. Through application streaming we are able to bring the same visualization to a desktop, a netbook, a smartphone, and the next generation of hardware, whatever it may be. Unidata has been able to harness application streaming to provide a tablet-compatible version of our visualization software, the Integrated Data Viewer (IDV). This work will examine the challenges associated with adapting the IDV to an application streaming platform, and include a brief discussion of the underlying technologies involved. We will also discuss the differences between local software and software-as-a-service.
Hydraulic Conductivity Estimation using Bayesian Model Averaging and Generalized Parameterization
NASA Astrophysics Data System (ADS)
Tsai, F. T.; Li, X.
2006-12-01
Non-uniqueness in the parameterization scheme is an inherent problem in groundwater inverse modeling due to limited data. To cope with this non-uniqueness, we introduce a Bayesian Model Averaging (BMA) method to integrate a set of selected parameterization methods. The estimation uncertainty in BMA includes the uncertainty in individual parameterization methods as the within-parameterization variance and the uncertainty from using different parameterization methods as the between-parameterization variance. Moreover, the generalized parameterization (GP) method is considered in the geostatistical framework in this study. The GP method aims at increasing the flexibility of parameterization through the combination of a zonation structure and an interpolation method. The use of BMA with GP avoids over-confidence in a single parameterization method. A normalized least-squares estimation (NLSE) is adopted to calculate the posterior probability for each GP. We employ the adjoint state method for the sensitivity analysis of the weighting coefficients in the GP method. The adjoint state method is also applied to the NLSE problem. The proposed methodology is applied to the Alamitos Barrier Project (ABP) in California, where the spatially distributed hydraulic conductivity is estimated. The optimal weighting coefficients embedded in GP are identified through maximum likelihood estimation (MLE), where the misfits between the observed and calculated groundwater heads are minimized. The conditional mean and conditional variance of the estimated hydraulic conductivity distribution using BMA are obtained to assess the estimation uncertainty.
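The within/between decomposition has a compact form. A sketch with illustrative numbers, not values from the ABP study:

```python
import numpy as np

# Per-parameterization estimates of lnK at one location (illustrative only)
mu  = np.array([-4.1, -3.6, -3.9])   # conditional means, one per GP method
var = np.array([0.20, 0.35, 0.15])   # within-parameterization variances
w   = np.array([0.50, 0.20, 0.30])   # posterior model weights from NLSE (sum to 1)

bma_mean    = np.sum(w * mu)
within_var  = np.sum(w * var)                    # uncertainty inside each method
between_var = np.sum(w * (mu - bma_mean) ** 2)   # spread across methods
print(bma_mean, within_var + between_var)        # BMA mean and total variance
```

The between-parameterization term is what a single-method inversion silently drops; BMA keeps it, which is how it avoids over-confidence.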
NASA Astrophysics Data System (ADS)
Hunter, C. K.; Bolster, D.; Gironas, J. A.
2014-12-01
Water resources are essential to development, not only economically but also socially, politically and ecologically. With growing demand and potentially shrinking supply, water scarcity is one of the most pressing socio-ecological problems of the 21st century. Considering the implications of global change and the complexity of interrelated systems, uncertain future conditions compound problems associated with water stress, requiring hydrologic models to re-examine traditional water resource planning and management. The Copiapó water basin, located in the Atacama Desert of northern Chile, exhibits a complex resource management scenario. With annual average precipitation of only 28 mm, water-intensive sectors such as export agriculture, extensive mining, and a growing population have depleted the aquifer's reserves to near-critical levels. Given that global climate change models predict a decrease in already scarce precipitation, and that growing populations and economies will likely increase demand, the real future situation might be even worse than that predicted. Water reuse and recycling, a viable option for alleviating water stress, has evolved through technological innovation to feasibly meet hydraulic needs with reclaimed water. For the proper application of these methods to resource management, however, stakeholders must possess tools by which to quantify hydrologic risk, understand its factors of causation, and choose between competing management scenarios and technologies so as to optimize productivity. While previous investigations have addressed similar problems, they often overlook aspects of forecasting uncertainty, proposing solutions that, while accurate under specific scenarios, lack the robustness to withstand future variations. Using the WEAP (Water Evaluation and Planning) platform for hydrologic modeling, this study proposes a methodology, applicable to other stressed watersheds, to quantify the inherent risk in water management positions while considering uncertainties in supply (climate change), demand (market variations), and measurement (risk definition). Applied to the Copiapó case study, this methodology proposes the solution of a 30% demand decrease within the agricultural sector through urban wastewater recycling and increased irrigation efficiency.
Fast and Efficient Discrimination of Traveling Salesperson Problem Stimulus Difficulty
ERIC Educational Resources Information Center
Dry, Matthew J.; Fontaine, Elizabeth L.
2014-01-01
The Traveling Salesperson Problem (TSP) is a computationally difficult combinatorial optimization problem. In spite of its relative difficulty, human solvers are able to generate close-to-optimal solutions in a close-to-linear time frame, and it has been suggested that this is due to the visual system's inherent sensitivity to certain geometric…
Why Inquiry Is Inherently Difficult...and Some Ways to Make It Easier
ERIC Educational Resources Information Center
Meyer, Daniel Z.; Avery, Leanne M.
2010-01-01
In this article, the authors offer a framework that identifies two critical problems in designing inquiry-based instruction and suggests three models for developing instruction that overcomes those problems. The Protocol Model overcomes the Getting on Board Problem by providing students an initial experience through clearly delineated steps with a…
Implications of the Social Web Environment for User Story Education
ERIC Educational Resources Information Center
Fancott, Terrill; Kamthan, Pankaj; Shahmir, Nazlie
2012-01-01
In recent years, user stories have emerged in academia, as well as industry, as a notable approach for expressing user requirements of interactive software systems that are developed using agile methodologies. There are social aspects inherent to software development, in general, and user stories, in particular. This paper presents directions and…
Higher Education Value Added Using Multiple Outcomes
ERIC Educational Resources Information Center
Milla, Joniada; Martín, Ernesto San; Van Bellegem, Sébastien
2016-01-01
In this article we develop a methodology for the joint value added analysis of multiple outcomes that takes into account the inherent correlation between them. This is especially crucial in the analysis of higher education institutions. We use a unique Colombian database on universities, which contains scores in five domains tested in a…
Comparison Groups in Autism Family Research: Down Syndrome, Fragile X Syndrome, and Schizophrenia
ERIC Educational Resources Information Center
Seltzer, Marsha Mailick; Abbeduto, Leonard; Krauss, Marty Wyngaarden; Greenberg, Jan; Swe, April
2004-01-01
This paper examines methodological challenges inherent in conducting research on families of children with autism and in comparing these families with others who are coping with different types of disabilities or who have nondisabled children. Although most comparative research has contrasted families whose child has autism with those whose child…
Team-Teaching a Current Events-Based Biology Course for Nonmajors
ERIC Educational Resources Information Center
Bondos, Sarah E.; Phillips, Dereth
2008-01-01
Rice University has created a team-taught interactive biology course for nonmajors with a focus on cutting edge biology in the news--advances in biotechnology, medicine, and science policy, along with the biological principles and methodology upon which these advances are based. The challenges inherent to teaching current topics were minimized by…
Introducing Sustainability into Business Education Contexts Using Active Learning
ERIC Educational Resources Information Center
MacVaugh, Jason; Norton, Mike
2012-01-01
Purpose: The purpose of this paper is to explore how active learning may help address the legitimacy and practicability issues inherent in introducing education for sustainability into business-related degree programs. Design/methodology/approach: The focus of this study is the experience of the authors in the development and implementation of…
Learning Gaps in a Learning Organization: Professionals' Values versus Management Values
ERIC Educational Resources Information Center
Parding, Karolina; Abrahamsson, Lena
2010-01-01
Purpose: The aim of this article is to challenge the concept of "the learning organization" as unproblematic and inherently good. Design/methodology/approach: The research looked at how teachers--as an example of public sector professionals in a work organization that claims to be a learning organization--view their conditions for…
Implicit and Explicit Preference Structures in Models of Labor Supply.
ERIC Educational Resources Information Center
Dickinson, Jonathan
The study of labor supply is directed to a theoretical methodology under which the choice of the general functional form of the income-leisure preference structure may be regarded as an empirical question. The author has reviewed the common functional forms employed in empirical labor supply models and has characterized the inherent preference…
Studying Young People's New Media Use: Methodological Shifts and Educational Innovations
ERIC Educational Resources Information Center
Pascoe, C. J.
2012-01-01
A lack of good information about what youth are doing with new media stimulates fears and hopes about the relationship between young people and digital technologies. This article focuses on new modes of inquiry into youth new media use, highlighting the challenges, complexities, and opportunities inherent in studying young people's digital…
Major Challenges for the Modern Chemistry in Particular and Science in General.
Uskokovíc, Vuk
2010-11-01
In the past few hundred years, science has exerted an enormous influence on the way the world appears to human observers. Despite phenomenal accomplishments of science, science nowadays faces numerous challenges that threaten its continued success. As scientific inventions become embedded within human societies, the challenges are further multiplied. In this critical review, some of the critical challenges for the field of modern chemistry are discussed, including: (a) interlinking theoretical knowledge and experimental approaches; (b) implementing the principles of sustainability at the roots of the chemical design; (c) defining science from a philosophical perspective that acknowledges both pragmatic and realistic aspects thereof; (d) instigating interdisciplinary research; (e) learning to recognize and appreciate the aesthetic aspects of scientific knowledge and methodology, and promote truly inspiring education in chemistry. In the conclusion, I recapitulate that the evolution of human knowledge inherently depends upon our ability to adopt creative problem-solving attitudes, and that challenges will always be present within the scope of scientific interests.
The relationship between seasonal mood change and personality: more apparent than real?
Jang, K L; Lam, R W; Livesley, W J; Vernon, P A
1997-06-01
A number of recent studies have reported significant relationships between seasonal mood change (seasonality) and personality. However, some of the results are difficult to interpret because of inherent methodological problems, the most important of which is the use of samples drawn from the southern as opposed to the northern hemisphere, where the phenomenon of seasonality may be quite different. The present study examined the relationship between personality and seasonality in a sample from the northern hemisphere (minimum latitude = 49 degrees N). A total of 297 adults drawn from the general population (112 male and 185 female subjects) completed the Seasonal Pattern Assessment Questionnaire, and the results obtained confirmed most of the previously reported relationships and showed that these are reliable across (i) different hemispheres, (ii) different measures of personality and (iii) clinical and general population samples. However, the impact of the relationship seems to be more apparent than real, with personality accounting for just under 15% of the total variance.
Bergen, P L; Nemec, D
1999-01-01
In December 1997, the authors completed an in-depth collection assessment project at the University of Wisconsin-Madison Health Sciences Libraries. The purpose was to develop a framework for future collection assessment projects by completing a multifaceted evaluation of the libraries' monograph and serial collections in the subject area of drug resistance. Evaluators adapted and synthesized several traditional collection assessment tools, including shelflist measurement, bibliography and standard list checking, and citation analysis. Throughout the project, evaluators explored strategies to overcome some of the problems inherent in the application of traditional collection assessment methods to the evaluation of biomedical collections. Their efforts resulted in the identification of standard monographs and core journals for the subject area, a measurement of the collections' strength relative to the collections of benchmark libraries, and a foundation for future collection development within the subject area. The project's primary outcome was a collection assessment methodology that has potential application to both internal and cooperative collection development in medical, pharmaceutical, and other health sciences libraries.
NASA Technical Reports Server (NTRS)
Swift, Daniel W.
1991-01-01
The primary methodology during the grant period has been the use of micro- or meso-scale simulations to address specific questions concerning magnetospheric processes related to the aurora and substorm morphology. This approach, while useful in providing some answers, has its limitations. Many of the problems relating to the magnetosphere are inherently global and kinetic. Effort during the last year of the grant period has increasingly focused on development of a global-scale hybrid code to model the entire, coupled magnetosheath-magnetosphere-ionosphere system. In particular, numerical procedures for curvilinear coordinate generation and exactly conservative differencing schemes for hybrid codes in curvilinear coordinates have been developed. The new computer algorithms and the massively parallel computer architectures now make this global code a feasible proposition. Support provided by this project has played an important role in laying the groundwork for the eventual development of a global-scale code to model and forecast magnetospheric weather.
Material recycling from renewable landfill and associated risks: A review.
Ziyang, Lou; Luochun, Wang; Nanwen, Zhu; Youcai, Zhao
2015-07-01
Landfill is the dominant disposal choice for non-classified waste, which results in a stockpile of materials after a long-term stabilization process. A novel landfill, namely the renewable landfill (RL), is developed and applied as a strategy to recycle residual materials and reuse the occupied land, aiming to reduce the inherent problems of the conventional landfill: large land occupation, wasted materials and long-term pollutant release. The principal means of RL are to accelerate the waste biodegradation process in the initial period, recover the various material resources after disposal and extend the landfill volume for waste re-landfilling after the waste is stabilized. The availability of residual materials and their risk assessment, the methodology of landfill excavation, the potential utilization routes for different materials, and the reclamation options for unsanitary landfills are proposed, and the integrated beneficial impacts are finally identified from economic, social and environmental perspectives. RL could serve as a future reservoir for resource extraction. Copyright © 2015 Elsevier Ltd. All rights reserved.
Simulating The Prompt Electromagnetic Pulse In 3D Using Vector Spherical Harmonics
NASA Astrophysics Data System (ADS)
Friedman, Alex; Cohen, Bruce I.; Eng, Chester D.; Farmer, William A.; Grote, David P.; Kruger, Hans W.; Larson, David J.
2017-10-01
We describe a new, efficient code for simulating the prompt electromagnetic pulse. In SHEMP ("Spherical Harmonic EMP"), we extend to 3-D the methods pioneered in C. Longmire's CHAP code. The geomagnetic field and air density are consistent with CHAP's assumed spherical symmetry only for narrow domains of influence about the line of sight, limiting validity to very early times. Also, we seek to model inherently 3-D situations. In CHAP and our own CHAP-lite, the independent coordinates are r (the distance from the source) and τ = t - r/c; the pulse varies slowly with r at fixed τ, so a coarse radial grid suffices. We add non-spherically-symmetric physics via a vector spherical harmonic decomposition. For each (l,m) harmonic, the radial equation is similar to that in CHAP and CHAP-lite. We present our methodology and results on model problems. This work was performed under the auspices of the U.S. DOE by Lawrence Livermore National Laboratory under Contract DE-AC52-07NA27344.
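The payoff of the (r, τ) coordinates can be seen with a toy outgoing spherical pulse: at fixed τ the field varies only through the slow 1/r fall-off, so a coarse radial grid suffices. This scalar sketch assumes none of the actual EMP source physics:

```python
import numpy as np

c = 3.0e8
g = lambda t: np.exp(-0.5 * (t / 10e-9) ** 2)   # 10 ns Gaussian source pulse

def field(r, tau):
    """Outgoing spherical pulse sampled in (r, tau), with tau = t - r/c."""
    return g(tau) / r                            # geometric 1/r fall-off only

r = np.linspace(100.0, 1000.0, 10)               # coarse radial grid
print(field(r, 5e-9) * r)                        # constant at fixed tau: no fast r-dependence
```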
Force-controlled absorption in a fully-nonlinear numerical wave tank
NASA Astrophysics Data System (ADS)
Spinneken, Johannes; Christou, Marios; Swan, Chris
2014-09-01
An active control methodology for the absorption of water waves in a numerical wave tank is introduced. This methodology is based upon a force-feedback technique which has previously been shown to be very effective in physical wave tanks. Unlike other methods, a priori knowledge of the wave conditions in the tank is not required; the absorption controller is designed to respond automatically to a wide range of wave conditions. In comparison to numerical sponge layers, effective wave absorption is achieved on the boundary, thereby minimising the spatial extent of the numerical wave tank. In contrast to the imposition of radiation conditions, the scheme is inherently capable of absorbing irregular waves. Most importantly, simultaneous generation and absorption can be achieved. This is an important advance when considering inclusion of reflective bodies within the numerical wave tank. In designing the absorption controller, an infinite impulse response filter is adopted, thereby eliminating the problem of non-causality in the controller optimisation. Two alternative controllers are considered, both implemented in a fully-nonlinear wave tank based on a multiple-flux boundary element scheme. To simplify the problem under consideration, the present analysis is limited to water waves propagating in a two-dimensional domain. The paper presents an extensive numerical validation which demonstrates the success of the method for a wide range of wave conditions including regular, focused and random waves. The numerical investigation also highlights some of the limitations of the method, particularly in simultaneously generating and absorbing large amplitude or highly-nonlinear waves. The findings of the present numerical study are directly applicable to related fields where optimum absorption is sought; these include physical wavemaking, wave power absorption and a wide range of numerical wave tank schemes.
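Per time step, a force-feedback absorber of this kind reduces to passing the measured force through a causal recursive (IIR) filter to obtain the wavemaker command. The coefficients below are placeholders, not the optimized controller from the paper:

```python
import numpy as np
from scipy.signal import lfilter

b = [0.045, 0.045]   # numerator coefficients (hypothetical)
a = [1.0, -0.90]     # denominator (feedback) coefficients (hypothetical)

dt = 0.01
t = np.arange(0.0, 20.0, dt)
force = np.sin(2 * np.pi * 0.5 * t)         # measured hydrodynamic force on the paddle
paddle_velocity = lfilter(b, a, force)      # causal IIR filter: runs sample-by-sample online
print(paddle_velocity[-3:])
```

Because the filter is causal and recursive, it can run in real time alongside wave generation, which is what permits simultaneous generation and absorption.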
An Integrated approach to the Space Situational Awareness Problem
2016-12-15
data coming from the sensors. We developed particle-based Gaussian Mixture Filters that are immune to the "curse of dimensionality"/"particle depletion" problem inherent in particle filtering. This method maps the data assimilation/filtering problem into an unsupervised learning problem. Results... Keywords: Gaussian Mixture Filters; particle depletion; Finite Set Statistics
A comparative study of the nonuniqueness problem of the potential equation
NASA Technical Reports Server (NTRS)
Salas, M. D.; Jameson, A.; Melnik, R. E.
1985-01-01
The nonuniqueness problem occurring at transonic speeds with the conservative potential equation is investigated numerically. The study indicates that the problem is not an inviscid phenomenon, but results from approximate treatment of shock waves inherent in the conservative potential model. A new bound on the limit of validity of the conservative potential model is proposed.
L.S. Bauer; J. Granett
1979-01-01
Black flies have been long-time residents of Maine and cause extensive nuisance problems for people, domestic animals, and wildlife. The black fly problem has no simple solution because of the multitude of species present, the diverse and ecologically sensitive habitats in which they are found, and the problems inherent in measuring the extent of the damage they cause...
APPLICATION OF A FINITE-DIFFERENCE TECHNIQUE TO THE HUMAN RADIOFREQUENCY DOSIMETRY PROBLEM
A powerful finite difference numerical technique has been applied to the human radiofrequency dosimetry problem. The method possesses inherent advantages over the method of moments approach in that its implementation requires much less computer memory. Consequently, it has the ca...
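Although the report's scheme is not reproduced here, a generic one-dimensional finite-difference time-domain update illustrates why the memory footprint stays small: the state is a pair of field arrays updated in place, rather than the dense interaction matrix a method-of-moments formulation assembles.

    # Generic 1-D FDTD update loop (illustrative only; not the report's
    # actual scheme). Storage is two short arrays, so memory grows
    # linearly with the grid, unlike dense method-of-moments matrices.
    import numpy as np

    n, steps = 200, 500
    ez = np.zeros(n)        # electric field samples
    hy = np.zeros(n - 1)    # magnetic field samples
    for t in range(steps):
        hy += 0.5 * np.diff(ez)                     # normalised update coefficients
        ez[1:-1] += 0.5 * np.diff(hy)
        ez[100] += np.exp(-((t - 30) / 10.0) ** 2)  # soft Gaussian source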
Sincere but naive: methodological queries concerning the British Columbia polygamy reference trial.
Ashley, Sean Matthew
2014-11-01
Academics frequently serve as expert witnesses in legal cases, yet their role as transmitters of social scientific knowledge remains under-examined. The present study analyzes the deployment of social science within British Columbia's polygamy reference trial where research is used to support the assertion that polygamy is inherently harmful to society. Within the trial record and the written decision, the protection of monogamy as an institution is performed in part through the marginalization of qualitative methodology and the concurrent privileging of quantitative studies that purportedly demonstrate widespread social harms associated with the practice of polygyny.
Save money by understanding variance and tolerancing.
Stuart, K
2007-01-01
Manufacturing processes are inherently variable, which results in component and assembly variance. Unless process capability, variance and tolerancing are fully understood, incorrect design tolerances may be applied, which will lead to more expensive tooling, inflated production costs, high reject rates, product recalls and excessive warranty costs. A methodology is described for correctly allocating tolerances and performing appropriate analyses.
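The gap between naive and statistical tolerance allocation is easy to show numerically. The sketch below contrasts a worst-case stack-up with a root-sum-square (RSS) stack-up for a chain of components; the tolerance values are invented, and the RSS result assumes independent, roughly normal component variation.

    # Worst-case vs statistical (RSS) stack-up for a tolerance chain.
    # Tolerances are invented for illustration (mm).
    import math

    tolerances = [0.05, 0.02, 0.10, 0.03]

    worst_case = sum(tolerances)                      # all extremes align
    rss = math.sqrt(sum(t ** 2 for t in tolerances))  # independent variation

    print(f"worst case: +/-{worst_case:.3f} mm, RSS: +/-{rss:.3f} mm")

Designing to the worst-case number when the process statistics justify the RSS number is one way the over-tight tolerances and inflated tooling costs described above arise.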
ERIC Educational Resources Information Center
Wharton, Tracy; Alexander, Neil
2013-01-01
This article describes lessons learned about implementing evaluations in hospital settings. In order to overcome the methodological dilemmas inherent in this environment, we used a practical participatory evaluation (P-PE) strategy to engage as many stakeholders as possible in the process of evaluating a clinical demonstration project.…
A Holistic Approach to Science Education: Disciplinary, Affective, and Equitable
ERIC Educational Resources Information Center
Mehta, Rohit; Mehta, Swati; Seals, Christopher
2017-01-01
In this chapter, we argue that science education is more than the high stakes, rigorous practices and methodology that students often find dull and uninspiring. We present that aesthetic and humanistic motivations, such as wonder, curiosity, and social justice, are also inherent reasons for doing science. In the MSUrbanSTEM program, we designed an…
Evaluation of self-combustion risk in tire derived aggregate fills.
Arroyo, Marcos; San Martin, Ignacio; Olivella, Sebastian; Saaltink, Maarten W
2011-01-01
Lightweight tire derived aggregate (TDA) fills are a proven recycling outlet for waste tires, requiring relatively low-cost waste processing and being competitively priced against other lightweight fill alternatives. However, the technique's value has been marred as several TDA fills self-combusted during its early applications. An empirical review of these cases led to prescriptive guidelines from ASTM aimed at avoiding this problem, and this approach has been successful in preventing further incidents of self-combustion. At present, however, there remains no rational method to quantify self-combustion risk in TDA fills, so it is not clear which aspects of the ASTM guidelines are essential and which are accessory. This hinders the practical use of TDA fills despite their inherent advantages as lightweight fill. Here a quantitative approach to self-combustion risk evaluation is developed and illustrated with a parametric analysis of an embankment case. This is later particularized to model a reported field self-combustion case. The approach is based on the available experimental observations and incorporates well-tested methodological (ISO corrosion evaluation) and theoretical (finite element analysis of coupled heat and mass flow) tools. The results obtained offer clear insights into the critical aspects of the problem, already allowing some meaningful recommendations for guideline revision. Copyright © 2011 Elsevier Ltd. All rights reserved.
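The character of the risk can be seen even in a zero-dimensional caricature of the coupled analysis: exothermic (Arrhenius-type) heat generation competing with heat loss to the surroundings. Every parameter below is invented; the study itself uses finite-element analysis of coupled heat and mass flow calibrated against ISO corrosion measurements.

    # Zero-dimensional self-heating caricature for a TDA fill: Arrhenius
    # heat generation vs linear heat loss. All parameter values are
    # invented; runaway occurs only if generation outpaces loss.
    import numpy as np

    A, Ea, R = 1e6, 6.0e4, 8.314   # source prefactor (W/m3), activation energy (J/mol)
    h = 5.0                        # lumped heat-loss coefficient (W/m3/K)
    T_env, rho_cp = 293.0, 1.2e6   # ambient (K), volumetric heat capacity (J/m3/K)

    T, dt = T_env, 3600.0          # start at ambient, one-hour steps
    for step in range(24 * 365):   # simulate one year
        q_gen = A * np.exp(-Ea / (R * T))
        T += dt * (q_gen - h * (T - T_env)) / rho_cp
        if T > 500.0:              # crude ignition criterion
            print(f"runaway after {step} hours")
            break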
[Follow-up of children conceived by assisted reproductive technologies].
Bouillon, C; Fauque, P
2013-05-01
Since the birth of the first baby conceived by in vitro fertilization (IVF) 30 years ago (Louise Brown in 1978), there has been a rapid and constant increase in the number of couples using assisted reproductive technologies (ART). Around four million children have been born to couples experiencing fertility problems through the use of ART, comprising roughly 2-3% of all births in Europe and the U.S. This highlights that these modes of fertilization are now well accepted by our societies. However, several questions about the health of these children remain to be elucidated. As discussed in this review, even if methodological limitations exist, numerous studies have reported increased risks of birth defects, such as prematurity, foetal hypotrophy, neonatal complications, congenital malformations and epigenetic diseases, among ART-conceived children compared to naturally conceived children. Nowadays, it is difficult to determine whether these increased risks found in ART infants are a consequence of the ART procedures or are inherent to the infertility problems per se. However, the absolute risks remain moderate and reassuring, as do the data on follow-up into infancy and early childhood. Nevertheless, because effects may emerge in adulthood, there is a need for long-term follow-up of children born after ART. Copyright © 2013 Elsevier Masson SAS. All rights reserved.
Psychotherapy Outcome Research: Issues and Questions.
Shean, Glenn
2016-03-01
Emphasis on identifying evidence-based therapies (EBTs) has increased markedly. Lists of EBTs are the rationale for recommendations for how psychotherapy provider training programs should be evaluated, professional competence assessed, and licensure and reimbursement policies structured. There are, however, methodological concerns that limit the external validity of EBTs. Among the most salient is the circularity inherent in randomized control trials (RCTs) of psychotherapy, which constrains the manner in which psychological problems are defined, psychotherapy can be practiced, and change evaluated. RCT studies favor therapies that focus on specific symptoms and can be described in a manual, administered reliably across patients, completed in relatively few sessions, and involve short-term evaluations of outcome. The epistemological assumptions of a natural science approach to psychotherapy research limit how studies are conducted and assessed in ways that advantage symptom-focused approaches and disadvantage approaches that seek to bring broad recovery-based changes. Research methods that are not limited to RCTs and include methodology to minimize the effects of "therapist allegiance" are necessary for valid evaluations of therapeutic approaches that seek to facilitate changes broader than symptom reduction. Recent proposals to adopt policies that dictate training, credentialing, and reimbursement based on lists of EBTs unduly limit how psychotherapy can be conceptualized and practiced, and are not in the best interests of the profession or of individuals seeking psychotherapy services.
Hassmiller Lich, Kristen; Urban, Jennifer Brown; Frerichs, Leah; Dave, Gaurav
2017-02-01
Group concept mapping (GCM) has been successfully employed in program planning and evaluation for over 25 years. The broader set of systems thinking methodologies (of which GCM is one), have only recently found their way into the field. We present an overview of systems thinking emerging from a system dynamics (SD) perspective, and illustrate the potential synergy between GCM and SD. As with GCM, participatory processes are frequently employed when building SD models; however, it can be challenging to engage a large and diverse group of stakeholders in the iterative cycles of divergent thinking and consensus building required, while maintaining a broad perspective on the issue being studied. GCM provides a compelling resource for overcoming this challenge, by richly engaging a diverse set of stakeholders in broad exploration, structuring, and prioritization. SD provides an opportunity to extend GCM findings by embedding constructs in a testable hypothesis (SD model) describing how system structure and changes in constructs affect outcomes over time. SD can be used to simulate the hypothesized dynamics inherent in GCM concept maps. We illustrate the potential of the marriage of these methodologies in a case study of BECOMING, a federally-funded program aimed at strengthening the cross-sector system of care for youth with severe emotional disturbances. Copyright © 2016 Elsevier Ltd. All rights reserved.
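The hand-off from concept map to simulation can be made concrete with a single stock-and-flow model integrated over time; each GCM construct would become a stock, a flow, or a parameter. The structure and numbers below are hypothetical and are not the BECOMING model.

    # Minimal system-dynamics sketch: one stock ("families engaged in
    # care") with an assumed referral inflow and attrition outflow.
    stock = 100.0                      # initial families engaged (assumed)
    dt, horizon, t = 0.25, 10.0, 0.0   # quarter-year steps over ten years

    while t < horizon:
        inflow = 12.0              # referrals per year (assumed parameter)
        outflow = 0.15 * stock     # attrition proportional to the stock
        stock += dt * (inflow - outflow)
        t += dt

    print(f"families engaged after {horizon:.0f} years: {stock:.1f}")

Running such a model forward is what lets the hypothesized dynamics in a concept map be tested against outcomes over time rather than inspected statically.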
NASA Astrophysics Data System (ADS)
Batzias, Dimitris F.; Ifanti, Konstantina
2012-12-01
Process simulation models are usually empirical, so there is an inherent difficulty in their serving as carriers for knowledge acquisition and technology transfer, since their parameters have no physical meaning to facilitate verification of the dependence on production conditions; in such a case, a 'black box' regression model or a neural network might be used to simply connect input-output characteristics. In several cases, scientific/mechanistic models may prove valid, in which case parameter identification is required to find the independent/explanatory variables and the parameters on which each model parameter depends. This is a difficult task, since the phenomenological level at which each parameter is defined differs. In this paper, we have developed a methodological framework, in the form of an algorithmic procedure, to solve this problem. The main parts of this procedure are: (i) stratification of relevant knowledge in discrete layers immediately adjacent to the layer that the initial model under investigation belongs to, (ii) design of the ontology corresponding to these layers, (iii) elimination of the less relevant parts of the ontology by thinning, (iv) retrieval of the stronger interrelations between the remaining nodes within the revised ontological network, and (v) parameter identification taking into account the most influential interrelations revealed in (iv). The functionality of this methodology is demonstrated with two representative case examples on wastewater treatment.
Relaxation therapies for asthma: a systematic review
Huntley, A; White, A; Ernst, E
2002-01-01
Background: Emotional stress can either precipitate or exacerbate both acute and chronic asthma. There is a large body of literature available on the use of relaxation techniques for the treatment of asthma symptoms. The aim of this systematic review was to determine if there is any evidence for or against the clinical efficacy of such interventions. Methods: Four independent literature searches were performed on Medline, Cochrane Library, CISCOM, and Embase. Only randomised clinical trials (RCTs) were included. There were no restrictions on the language of publication. The data from trials that statistically compared the treatment group with that of the control were extracted in a standardised predefined manner and assessed critically by two independent reviewers. Results: Fifteen trials were identified, of which nine compared the treatment group with the control group appropriately. Five RCTs tested progressive muscle relaxation or mental and muscular relaxation, two of which showed significant effects of therapy. One RCT investigating hypnotherapy, one of autogenic training, and two of biofeedback techniques revealed no therapeutic effects. Overall, the methodological quality of the studies was poor. Conclusions: There is a lack of evidence for the efficacy of relaxation therapies in the management of asthma. This deficiency is due to the poor methodology of the studies as well as the inherent problems of conducting such trials. There is some evidence that muscular relaxation improves lung function of patients with asthma but no evidence for any other relaxation technique. PMID:11828041
Clarity of objectives and working principles enhances the success of biomimetic programs.
Wolff, Jonas O; Wells, David; Reid, Chris R; Blamires, Sean J
2017-09-26
Biomimetics, the transfer of functional principles from living systems into product designs, is increasingly being utilized by engineers. Nevertheless, recurring problems must be overcome if it is to avoid becoming a short-lived fad. Here we assess the efficiency and suitability of methods typically employed by examining three flagship examples of biomimetic design approaches from different disciplines: (1) the creation of gecko-inspired adhesives; (2) the synthesis of spider silk, and (3) the derivation of computer algorithms from natural self-organizing systems. We find that identification of the elemental working principles is the most crucial step in the biomimetic design process. It bears the highest risk of failure (e.g. losing the target function) due to false assumptions about the working principle. Common problems that hamper successful implementation are: (i) a discrepancy between biological functions and the desired properties of the product, (ii) uncertainty about objectives and applications, (iii) inherent limits in methodologies, and (iv) false assumptions about the biology of the models. Projects that aim for multi-functional products are particularly challenging to accomplish. We suggest a simplification, modularisation and specification of objectives, and a critical assessment of the suitability of the model. Comparative analyses, experimental manipulation, and numerical simulations followed by tests of artificial models have led to the successful extraction of working principles. A searchable database of biological systems would optimize the choice of a model system in top-down approaches that start at an engineering problem. Only when biomimetic projects become more predictable will there be wider acceptance of biomimetics as an innovative problem-solving tool among engineers and industry.
Integrative Potential of Architectural Activities
NASA Astrophysics Data System (ADS)
Davydova, O. V.
2017-11-01
The integrative potential of architectural activity is considered through the combination and organization of universal human and professional, artificial and natural, and social and individual architectural activities in the multidimensional unity of its components, which both reflect and influence public thinking through the artistic-figurative language of international communication. Experimental form-building, interactive presentations, and theatrical and gaming expressiveness are used, together with methods of design and advertising, to establish easier contact with the consumer. The methodology serves to reflect the mutual influence of personal and social problems through globalization, to identify these problems in the public sphere, to study existing methods of problem solving and analyze their effectiveness, and to search for current problems and new solutions using the latest achievements of technological progress and artistic patterns, creating a holistic architectural image that reflects the author's worldview within the general picture of the modern world and its inherent tendencies of "Surah" and "entertainment". Operative means of communication are developed in the chain of social experience - the teacher - the trainee - the new educational result, used to transmit updated information in a generalized form, with current and final control through feedback sheets, supporting summaries, and info cards. The paper also considers the efficiency of study time achieved by organizing research activity that allows students to obtain generalized theoretical information in the process of filling in or compiling informative and diagnostic maps; these provide the theoretical framework for creative activity through gaming activity that turns into work activity with a diagnosable result.
Space Transportation System Availability Relationships to Life Cycle Cost
NASA Technical Reports Server (NTRS)
Rhodes, Russel E.; Donahue, Benjamin B.; Chen, Timothy T.
2009-01-01
Future space transportation architectures and designs must be affordable. Consequently, their Life Cycle Cost (LCC) must be controlled. For the LCC to be controlled, it is necessary to identify all the requirements and elements of the architecture at the beginning of the concept phase. Controlling LCC requires the establishment of the major operational cost drivers. Two of these major cost drivers are reliability and maintainability, in other words, the system's availability (responsiveness). Potential reasons that may drive the inherent availability requirement are the need to control the number of unique parts and the spare parts required to support the transportation system's operation. For more typical space transportation systems used to place satellites in space, the productivity of the system will drive the launch cost. This system productivity is the resultant output of the system availability. Availability is equal to the mean uptime divided by the sum of the mean uptime plus the mean downtime. Since many operational factors cannot be projected early in the definition phase, the focus will be on inherent availability, which is equal to the mean time between failures (MTBF) divided by the MTBF plus the mean time to repair (MTTR) the system. The MTBF is a function of reliability, or the expected frequency of failures. When the system experiences failures, the result is added operational flow time, parts consumption, and increased labor, with an impact to responsiveness resulting in increased LCC. The other component of availability is the MTTR, or maintainability: how accessible the failed hardware requiring replacement is, and what operational functions are required before and after change-out to make the system operable. This paper will describe how the MTTR can be equated to additional labor, additional operational flow time, and additional structural access capability, all of which drive up the LCC. A methodology will be presented that provides the decision makers with the understanding necessary to place constraints on the design definition. This methodology for the major drivers will determine the inherent availability, safety, reliability, maintainability, and the life cycle cost of the fielded system. This methodology will focus on the achievement of an affordable, responsive space transportation system. It is the intent of this paper not only to provide visibility of the relationships of these major attribute drivers (variables) to each other and to the resultant system inherent availability, but also to provide the capability to bound the variables, thus providing the insight required to control the system's engineering solution. An example of this visibility is the need to provide integration of similar discipline functions to allow control of the total parts count of the space transportation system. Also, selecting a reliability requirement will place a constraint on parts count to achieve a given inherent availability requirement, or require accepting a larger parts count with the resulting higher individual part reliability requirements. This paper will provide an understanding of the relationship of mean repair time (mean downtime) to maintainability (accessibility for repair), and of both mean time between failures (reliability of hardware) and the system inherent availability.
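The two defining relationships in the paper reduce to a few lines of arithmetic, shown below with illustrative numbers. The series-system extension at the end is an assumption added here to show how a fixed availability target couples parts count and part-level reliability, as the paper argues.

    # Inherent availability exactly as defined in the paper:
    # Ai = MTBF / (MTBF + MTTR). Numbers are illustrative.
    def inherent_availability(mtbf_hours, mttr_hours):
        return mtbf_hours / (mtbf_hours + mttr_hours)

    print(f"Ai = {inherent_availability(400.0, 20.0):.3f}")   # 400 h MTBF, 20 h MTTR

    # Series-system view (an added assumption): with n similar parts of
    # failure rate lam each, system MTBF ~ 1/(n*lam), so holding Ai fixed
    # trades total parts count against part-level reliability.
    n, lam = 2000, 1e-6            # parts, failures per hour per part
    mtbf_sys = 1.0 / (n * lam)     # 500 h for the whole system
    print(f"Ai = {inherent_availability(mtbf_sys, 20.0):.3f}")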
Problem solving using soft systems methodology.
Land, L
This article outlines a method of problem solving which considers holistic solutions to complex problems. Soft systems methodology allows people involved in the problem situation to have control over the decision-making process.
Designing a fuzzy scheduler for hard real-time systems
NASA Technical Reports Server (NTRS)
Yen, John; Lee, Jonathan; Pfluger, Nathan; Natarajan, Swami
1992-01-01
In hard real-time systems, tasks have to be performed not only correctly but also in a timely fashion. If timing constraints are not met, there might be severe consequences. Task scheduling is the most important problem in designing a hard real-time system, because the scheduling algorithm ensures that tasks meet their deadlines. However, the uncertainty inherent in dynamic hard real-time systems compounds the scheduling problem. In an effort to alleviate these problems, we have developed a fuzzy scheduler to facilitate the search for a feasible schedule. A set of fuzzy rules is proposed to guide the search. The situation we are trying to address is the performance of the system when no feasible solution can be found and, therefore, certain tasks will not be executed. We wish to limit the number of important tasks that are not scheduled.
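One fuzzy rule of the kind described can be sketched directly: grade how tight a task's slack is, and let that grade scale the task's dispatch priority. The membership shape and the single rule below are invented stand-ins for the paper's rule set.

    # Toy fuzzy-priority rule: "the tighter the slack, the higher the
    # priority." Membership shape and weights are invented examples.
    def tight(slack, full_scale=10.0):
        """Membership of 'slack is tight' on [0, 1]."""
        return max(0.0, min(1.0, 1.0 - slack / full_scale))

    def priority(slack, importance):
        return tight(slack) * importance   # rule: tight slack boosts priority

    tasks = [("telemetry", 2.0, 0.9), ("logging", 8.0, 0.4)]  # (name, slack, importance)
    for name, slack, imp in sorted(tasks, key=lambda x: -priority(x[1], x[2])):
        print(name, round(priority(slack, imp), 2))

Graded priorities of this sort let the search degrade gracefully: when no fully feasible schedule exists, the least important tasks are the ones left unscheduled.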
Edgar Schein's Process versus Content Consultation Models.
ERIC Educational Resources Information Center
Rockwood, Gary F.
1993-01-01
Describes Schein's three models of consultation based on assumptions inherent in different helping styles: purchase of expertise and doctor-patient models, which focus on content of organization problems; and process consultation model, which focuses on how organizational problems are solved. Notes that Schein has suggested that consultants begin…
The Drop-Outs and the Dilatory on the Road to the Doctorate.
ERIC Educational Resources Information Center
Rudd, Ernest
1986-01-01
Results of a survey of British doctoral students not completing their graduate programs are reported. Students' reasons for dropping out include individual characteristics, personal problems and accidents, problems inherent in research projects, and poor supervision. Faculty attitudes and government policy are discussed. (MSE)
ERIC Educational Resources Information Center
Heyne, David A.; Vreeke, Leonie J.; Maric, Marija; Boelens, Harrie; Van Widenfelt, Brigit M.
2017-01-01
The "School Refusal Assessment Scale" (SRAS) was developed to identify four factors that might maintain a youth's school attendance problem (SAP), and thus be targeted for treatment. There is still limited support for the four-factor model inherent to the SRAS and its revision (SRAS-R). Recent studies indicate problems with the wording…
Preschoolers Grow Their Brains: Shifting Mindsets for Greater Resiliency and Better Problem Solving
ERIC Educational Resources Information Center
Pawlina, Shelby; Stanford, Christie
2011-01-01
Challenges, mistakes, and problems are inherent every day in learning activities and social interactions. How children think about and respond to those difficult situations has an impact on how they see themselves as being able to shape their own learning and on how they handle the next problem that comes their way. Building resilience means…
The Roles of Women in the Army and Their Impact on Military Operations and Organizations.
ERIC Educational Resources Information Center
Batts, John H.; And Others
Problems inherent in the expanded utilization of female soldiers in the U.S. Army are numerous. Attitudes of a wide sample of Army personnel, men and women, enlisted and officer, were surveyed pertaining to those problems. Some problems such as uniforms, billeting, assignments, and training are obvious and with proper planning can and will be…
From data to evidence: evaluative methods in evidence-based medicine.
Landry, M D; Sibbald, W J
2001-11-01
The amount of published information is increasing exponentially, and recent technologic advances have created systems whereby mass distribution of this information can occur at an infinite rate. This is particularly true in the broad field of medicine, as the absolute volume of data available to the practicing clinician is creating new challenges in the management of relevant information flow. Evidence-based medicine (EBM) is an information management and learning strategy that seeks to integrate clinical expertise with the best evidence available in order to make effective clinical decisions that will ultimately improve patient care. The systematic approach underlying EBM encourages the clinician to formulate specific and relevant questions, which are answered in an iterative manner through accessing the best available published evidence. The arguments against EBM stem from the idea that there are inherent weaknesses in research methodologies and that emphasis placed on published research may ignore clinical skills and individual patient needs. Despite these arguments, EBM is gaining momentum and is consistently used as a method of learning and improving health care delivery. However, if EBM is to be effective, the clinician needs to have a critical understanding of research methodology in order to judge the value and level of a particular data source. Without critical analysis of research methodology, there is an inherent risk of drawing incorrect conclusions that may affect clinical decision-making. Currently, there is a trend toward using secondary pre-appraised data rather than primary sources as best evidence. We review the qualitative and quantitative methodology commonly used in EBM and argue that it is necessary for the clinician to preferentially use primary rather than secondary sources in making clinically relevant decisions.
The confounding problem of polydrug use in recreational ecstasy/MDMA users: a brief overview.
Gouzoulis-Mayfrank, Euphrosyne; Daumann, Jörg
2006-03-01
The popular dance drug ecstasy (3,4-methylenedioxymethamphetamine, MDMA) is neurotoxic upon central serotonergic neurons in laboratory animals and possibly also in humans. In recent years, several studies reported alterations of serotonergic transmission and neuropsychiatric abnormalities in ecstasy users which might be related to MDMA-induced neurotoxic brain damage. To date, the most consistent findings associate subtle cognitive, particularly memory, deficits with heavy ecstasy use. However, most studies have important inherent methodological problems. One of the most serious confounds is the widespread pattern of polydrug use, which makes it difficult to relate the findings in user populations to one specific drug. The present paper provides a brief overview of this issue. The most commonly co-used substances are alcohol, cannabis and stimulants (amphetamines and cocaine). Stimulants are also neurotoxic upon both serotonergic and dopaminergic neurons. Hence, they may act synergistically with MDMA and enhance its long-term adverse effects. The interactions between MDMA and cannabis use may be more complex: cannabis use is a well-recognized risk factor for neuropsychiatric disorders and it was shown to contribute to psychological problems and cognitive failures in ecstasy users. However, at the cellular level, cannabinoids have neuroprotective actions and they were shown to (partially) block MDMA-induced neurotoxicity in laboratory animals. In the future, longitudinal and prospective research designs should hopefully lead to a better understanding of the relation between drug use and subclinical psychological symptoms or neurocognitive failures and, also, of questions around interactions between the various substances of abuse.
A low emission vehicle procurement approach for Washington state
NASA Astrophysics Data System (ADS)
McCoy, G. A.; Lyons, J. K.; Ware, G.
1992-06-01
The Clean Air Washington Act of 1991 directs the Department of Ecology to establish a clean-fuel vehicle standard. The Department of General Administration shall purchase vehicles based on this standard beginning in the fall of 1992. The following summarizes the major issues affecting vehicle emissions and their regulation, and presents a methodology for procuring clean-fuel vehicles for the State of Washington. Washington State's air quality problems are much less severe than in other parts of the country such as California, the East Coast and parts of the Midwest. Ozone, which is arguably the dominant air quality problem in the US, is a recent and relatively minor issue in Washington. Carbon monoxide (CO) represents a more immediate problem in Washington, with most of the state's urban areas exceeding national CO air quality standards. Since the mid-1960s, vehicle tailpipe hydrocarbon and carbon monoxide emissions have been reduced by 96 percent relative to pre-control vehicles. Nitrogen oxide emissions have been reduced by 76 percent. Emissions from currently available vehicles are quite low with respect to in-place exhaust emission standards. Cold-start emissions constitute about 75 percent of the total emissions measured with the Federal Test Procedure used to certify motor vehicles. There is no currently available 'inherently clean burning fuel'. In 1991, 3052 vehicles were purchased under Washington State contract. Provided that the same number are acquired in 1993, the state will need to purchase 915 vehicles which meet the definition of a 'clean-fueled vehicle'.
Problems and Solutions in Evaluating Child Outcomes of Large-Scale Educational Programs.
ERIC Educational Resources Information Center
Abrams, Allan S.; And Others
1979-01-01
Evaluation of large-scale programs is problematical because of inherent bias in assignment of treatment and control groups, resulting in serious regression artifacts even with the use of analysis of covariance designs. Nonuniformity of program implementation across sites and classrooms is also a problem. (Author/GSK)
Quantitative Relationships Involving Additive Differences: Numerical Resilience
ERIC Educational Resources Information Center
Ramful, Ajay; Ho, Siew Yin
2014-01-01
This case study describes the ways in which problems involving additive differences with unknown starting quantities constrain the problem solver in articulating the inherent quantitative relationship. It gives empirical evidence to show how numerical reasoning takes over as a Grade 6 student instantiates the quantitative relation by resorting to…
English/Japanese Professional Interpretation: Its Linguistic and Conceptual Problems.
ERIC Educational Resources Information Center
Ishikawa, Luli
1995-01-01
A study of simultaneous interpretation from Japanese to English focused on problems inherent in simultaneous language processing. Data were drawn from a discussion session at an international conference of physicians concerning nuclear war. Transcription of the Japanese source text (romanized), English product, and a gloss of lexical equivalents…
ERIC Educational Resources Information Center
LaHart, David, Ed.
Fossil fuels, upon which we now depend almost exclusively, are finite resources. Because the environmental problems inherent in large scale fossil fuel consumption are increasingly apparent, the reality of developing alternative energy sources must be faced. Solar energy is the obvious solution to the problem. It is a renewable, clean source that…
Analysis and control of high-speed wheeled vehicles
NASA Astrophysics Data System (ADS)
Velenis, Efstathios
In this work we reproduce driving techniques to mimic expert race drivers and obtain the open-loop control signals that may be used by auto-pilot agents driving autonomous ground wheeled vehicles. Race drivers operate their vehicles at the limits of the acceleration envelope. An accurate characterization of the acceleration capacity of the vehicle is required. Understanding and reproduction of such complex maneuvers also require a physics-based mathematical description of the vehicle dynamics. While most of the modeling issues of ground-vehicles/automobiles are already well established in the literature, lack of understanding of the physics associated with friction generation results in ad-hoc approaches to tire friction modeling. In this work we revisit this aspect of the overall vehicle modeling and develop a tire friction model that provides physical interpretation of the tire forces. The new model is free of those singularities at low vehicle speed and wheel angular rate that are inherent in the widely used empirical static models. In addition, the dynamic nature of the tire model proposed herein allows the study of dynamic effects such as transients and hysteresis. The trajectory-planning problem for an autonomous ground wheeled vehicle is formulated in an optimal control framework aiming to minimize the time of travel and maximize the use of the available acceleration capacity. The first approach to solve the optimal control problem is using numerical techniques. Numerical optimization allows incorporation of a vehicle model of high fidelity and generates realistic solutions. Such an optimization scheme provides an ideal platform to study the limit operation of the vehicle, which would not be possible via straightforward simulation. In this work we emphasize the importance of online applicability of the proposed methodologies. This underlines the need for optimal solutions that require little computational cost and are able to incorporate real, unpredictable environments. A semi-analytic methodology is developed to generate the optimal velocity profile for minimum time travel along a prescribed path. The semi-analytic nature ensures minimal computational cost while a receding horizon implementation allows application of the methodology in uncertain environments. Extensions to increase fidelity of the vehicle model are finally provided.
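One widely used semi-analytic recipe for the velocity-profile subproblem runs in three sweeps over a discretised path: cap speed by lateral grip at each point, then forward and backward passes enforce the longitudinal acceleration and braking limits. The sketch below follows that generic recipe with invented parameters; it is not the authors' exact formulation.

    # Friction-limited minimum-time velocity profile along a fixed path:
    # grip cap, then forward (accel) and backward (braking) passes.
    # Curvature profile and limits are invented for illustration.
    import numpy as np

    mu, g, ds, a_max = 0.9, 9.81, 1.0, 4.0   # friction, gravity, step (m), accel limit
    kappa = 0.02 * np.abs(np.sin(np.linspace(0, np.pi, 200)))  # path curvature (1/m)

    v = np.sqrt(mu * g / np.maximum(kappa, 1e-6))   # lateral-grip speed cap
    for i in range(1, len(v)):                      # forward pass: acceleration
        v[i] = min(v[i], np.sqrt(v[i-1]**2 + 2*a_max*ds))
    for i in range(len(v) - 2, -1, -1):             # backward pass: braking
        v[i] = min(v[i], np.sqrt(v[i+1]**2 + 2*a_max*ds))

Each sweep is a single pass over the path, which is what keeps the computational cost low enough for the receding horizon implementation mentioned above.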
ERIC Educational Resources Information Center
Rönnerman, Karin; Salo, Petri; Furu, Eli Moksnes; Lund, Torbjørn; Olin, Anette; Jakhelln, Rachel
2016-01-01
In this article we present the Nordic Network for Action Research, established in 2004. We describe how the network has explored, bridged and nurtured the inherent action research dynamics of ideology and methodology. This has been done through an understanding anchored in educational traditions, and by focus on three important ideal-shaping…
Cooper, P David; Smart, David R
2017-03-01
In an era of ever-increasing medical costs, the identification and prohibition of ineffective medical therapies is of considerable economic interest to healthcare funding bodies. Likewise, the avoidance of interventions with an unduly elevated clinical risk/benefit ratio would be similarly advantageous for patients. Regrettably, the identification of such therapies has proven problematic. A recent paper from the Grattan Institute in Australia (identifying five hospital procedures as having the potential for disinvestment on these grounds) serves as a timely illustration of the difficulties inherent in non-clinicians attempting to accurately recognize such interventions using non-clinical, indirect or poorly validated datasets. We aimed to evaluate the Grattan Institute report and associated publications, and to determine the validity of their assertions regarding hyperbaric oxygen treatment (HBOT) utilisation in Australia. Critical analysis of the HBOT metadata included in the Grattan Institute study was undertaken and compared against other publicly available Australian Government and independent data sources. The consistency, accuracy and reproducibility of data definitions and terminology across the various publications were appraised and the authors' methodology was reviewed. Reference sources were examined for relevance and temporal eligibility. Review of the Grattan publications demonstrated multiple problems, including (but not limited to): confusing patient-treatments with total patient numbers; incorrect identification of 'appropriate' vs. 'inappropriate' indications for HBOT; reliance upon a compromised primary dataset; lack of appropriate clinical input; muddled methodology; and use of inapplicable references. These errors resulted in a more than seventy-fold over-estimation of the number of patients potentially treated inappropriately with HBOT in Australia that year. Numerous methodological flaws and factual errors have been identified in this Grattan Institute study. Its conclusions are not valid and a formal retraction is required.
Evidence-based medicine for neurosurgeons: introduction and methodology.
Linskey, Mark E
2006-01-01
Evidence-based medicine is a tool of considerable value for medicine and neurosurgery that provides a secure base for clinical practice and practice improvement, but it is not without inherent drawbacks, weaknesses and limitations. EBM finds answers to only those questions open to its techniques, and the best available evidence can be a far cry from scientific truth. With the support and backing of governmental agencies, professional medical societies, the AAMC, the ACGME, and the ABMS, EBM is likely here to stay. The facts that: (1) EBM philosophy and critical appraisal techniques have become fully integrated into the training and culture of our younger colleagues, (2) maintenance of certification will require individuals to demonstrate personal evidence-based practice based on tracking and critical analysis of personal practice outcomes as part of the performance-based learning and improvement competency, and (3) progressively growing national healthcare expenditures will necessitate increasingly basing reimbursement and funding on evidence-based effectiveness and guidelines, all point to the likelihood that complete immersion of neurosurgical practice in EBM is inevitable. This article thoroughly explores the history of EBM in medicine in general and in neurosurgery in particular. Emphasis is placed on identifying the legislative and regulatory motive forces at work behind its promulgation and the role that organized medicine has taken to facilitate and foster its acceptance and implementation. An accounting of resources open to neurosurgeons and a detailed description of EBM clinical decision-making methodology are presented. Special emphasis is placed on outlining the methodology as well as the limitations of meta-analyses, randomized clinical trials, and clinical practice parameter guidelines. Commonly perceived objections, as well as substantive problems and limitations of EBM assumptions, tools, and approaches both for individual clinical practice and health policy design and implementation, are explored in detail.
Fitzpatrick, Anne; Tumlinson, Katherine
2017-03-24
The use of simulated clients or "mystery clients" is a data collection approach in which a study team member presents at a health care facility or outlet pretending to be a real customer, patient, or client. Following the visit, the shopper records her observations. The use of mystery clients can overcome challenges of obtaining accurate measures of health care quality and improve the validity of quality assessments, particularly in low- and middle-income countries. However, mystery client studies should be carefully designed and monitored to avoid problems inherent to this data collection approach. In this article, we discuss our experiences with the mystery client methodology in studies conducted in public- and private-sector health facilities in Kenya and in private-sector facilities in Uganda. We identify both the benefits and the challenges in using this methodology to guide other researchers interested in using this technique. Recruitment of appropriate mystery clients who accurately represent the facility's clientele, have strong recall of recent events, and are comfortable in their role as undercover data collectors are key to successful implementation of this methodology. Additionally, developing detailed training protocols can help ensure mystery clients behave identically and mimic real patrons accurately while short checklists can help ensure mystery client responses are standardized. Strict confidentiality and protocols to avoid unnecessary exams or procedures should also be stressed during training and monitored carefully throughout the study. Despite these challenges, researchers should consider mystery client designs to measure actual provider behavior and to supplement self-reported provider behavior. Data from mystery client studies can provide critical insight into the quality of service provision unavailable from other data collection methods. The unique information available from the mystery client approach far outweighs the cost. © Fitzpatrick and Tumlinson.
Promoting Post-Formal Thinking in a U.S. History Survey Course: A Problem-Based Approach
ERIC Educational Resources Information Center
Wynn, Charles T.; Mosholder, Richard S.; Larsen, Carolee A.
2016-01-01
This article presents a problem-based learning (PBL) model for teaching a college U.S. history survey course (U.S. history since 1890) designed to promote postformal thinking skills and identify and explain thinking systems inherent in adult complex problem-solving. We also present the results of a study in which the outcomes of the PBL model were…
A hierarchical-multiobjective framework for risk management
NASA Technical Reports Server (NTRS)
Haimes, Yacov Y.; Li, Duan
1991-01-01
A broad hierarchical-multiobjective framework is established and utilized to methodologically address the management of risk. United into the framework are the hierarchical character of decision-making, the multiple decision-makers at separate levels within the hierarchy, the multiobjective character of large-scale systems, the quantitative/empirical aspects, and the qualitative/normative/judgmental aspects. The methodological components essentially consist of hierarchical-multiobjective coordination, risk of extreme events, and impact analysis. Examples of applications of the framework are presented. It is concluded that complex and interrelated forces require an analysis of trade-offs between engineering analysis and societal preferences, as in the hierarchical-multiobjective framework, to successfully address inherent risk.
Evaluation of the HARDMAN comparability methodology for manpower, personnel and training
NASA Technical Reports Server (NTRS)
Zimmerman, W.; Butler, R.; Gray, V.; Rosenberg, L.
1984-01-01
The methodology evaluation and recommendation are part of an effort to improve the Hardware versus Manpower (HARDMAN) methodology for projecting manpower, personnel, and training (MPT) to support new acquisitions. Several different validity tests are employed to evaluate the methodology. The methodology conforms fairly well with both the MPT user needs and other accepted manpower modeling techniques. Audits of three completed HARDMAN applications reveal only a small number of potential problem areas compared to the total number of issues investigated. The reliability study results conform well with the problem areas uncovered through the audits. The results of the accuracy studies suggest that the manpower life-cycle cost component is only marginally sensitive to changes in other related cost variables. Even with some minor problems, the methodology seems sound and has good near-term utility to the Army. Recommendations are provided to firm up the problem areas revealed through the evaluation.
Production system chunking in SOAR: Case studies in automated learning
NASA Technical Reports Server (NTRS)
Allen, Robert
1989-01-01
A preliminary study of SOAR, a general intelligent architecture for automated problem solving and learning, is presented. The underlying principles of universal subgoaling and chunking were applied to a simple, yet representative, problem in artificial intelligence. A number of problem space representations were examined and compared. It is concluded that learning is an inherent and beneficial aspect of problem solving. Additional studies are suggested in domains relevant to mission planning and to SOAR itself.
Creativity and Inspiration for Problem Solving in Engineering Education
ERIC Educational Resources Information Center
Nordstrom, Katrina; Korpelainen, Paivi
2011-01-01
Problem solving is a critical skill for engineering students and essential to the development of creativity and innovativeness. Essential to such learning is an ease of communication, allowing students to address the issues at hand via the terminology, attitudes, humor and empathy that are inherent to their frame of mind as novices, without the…
The Effects of Athletic Competition on Character Development in College Student Athletes
ERIC Educational Resources Information Center
Stoll, Sharon Kay
2012-01-01
This article argues that there are inherent problems in athletic competition relating to character development in college student athletes. A review of the research supports the claim that athletic competitions do not build character. The author proposes ways to address this problem and provides personal observations and published research to…
The formaldehyde problem in wood-based products : an annotated bibliography
F. H. Max Nestler
1977-01-01
Urea-formaldehyde-type adhesives have the inherent characteristic of giving off free formaldehyde under some conditions of use. The vapor can build up to concentrations which can be a nuisance, uncomfortable, or an actual health hazard. The "formaldehyde problem" is reviewed, from literature sources, in five respects: origins, analytical, control and removal...
A Complementary Measure of Heterogeneity on Mathematical Skills
ERIC Educational Resources Information Center
Fedriani, Eugenio M.; Moyano, Rafael
2012-01-01
Finding educational truths is an inherently multivariate problem. There are many factors affecting each student and their performance. Because of this, both measuring skills and assessing students are always complex processes. This is a well-known problem, and a number of solutions have been proposed by specialists. One of its ramifications is…
The Chemistry of Paper Preservation Part 4. Alkaline Paper.
ERIC Educational Resources Information Center
Carter, Henry A.
1997-01-01
Discusses the problem of the inherent instability of paper due to the presence of acids that catalyze the hydrolytic degradation of cellulose. Focuses on the chemistry involved in the sizing of both acid and alkaline papers and the types of fillers used. Discusses advantages and problems of alkaline papermaking. Contains 48 references. (JRH)
Educational Malpractice: Can the Judiciary Remedy the Growing Problem of Functional Illiteracy?
ERIC Educational Resources Information Center
Klein, Alice J.
1979-01-01
Investigates the viability of a negligence action for inadequate public school education. Explores the problems inherent in proving each element of negligence, the available defense, and the potential consequences for plaintiffs, defendants, and educational policy-making that would flow from judicial recognition of a cause of action. Journal…
Problems Inherent in Attempting Standardization of Libraries.
ERIC Educational Resources Information Center
Port, Idelle
In setting standards for a large and geographically dispersed library system, one must reconcile the many varying practices that affect what is being measured or discussed. The California State University and Colleges (CSUC) consists of 19 very distinct campuses. The problems and solutions of one type of CSUC library are not likely to be those of…
Zigzag laser with reduced optical distortion
Albrecht, G.F.; Comaskey, B.; Sutton, S.B.
1994-04-19
The architecture of the present invention has been driven by the need to solve the beam quality problems inherent in Brewster's angle tipped slab lasers. The entrance and exit faces of a solid state slab laser are cut perpendicular with respect to the pump face, thus intrinsically eliminating distortion caused by the unpumped Brewster's angled faces. For a given zigzag angle, the residual distortions inherent in the remaining unpumped or lightly pumped ends may be reduced further by tailoring the pump intensity at these ends. 11 figures.
Field-Programmable Gate Array Computer in Structural Analysis: An Initial Exploration
NASA Technical Reports Server (NTRS)
Singleterry, Robert C., Jr.; Sobieszczanski-Sobieski, Jaroslaw; Brown, Samuel
2002-01-01
This paper reports on an initial assessment of using a Field-Programmable Gate Array (FPGA) computational device as a new tool for solving structural mechanics problems. A FPGA is an assemblage of binary gates arranged in logical blocks that are interconnected via software in a manner dependent on the algorithm being implemented and can be reprogrammed thousands of times per second. In effect, this creates a computer specialized for the problem that automatically exploits all the potential for parallel computing intrinsic in an algorithm. This inherent parallelism is the most important feature of the FPGA computational environment. It is therefore important that if a problem offers a choice of different solution algorithms, an algorithm of a higher degree of inherent parallelism should be selected. It is found that in structural analysis, an 'analog computer' style of programming, which solves problems by direct simulation of the terms in the governing differential equations, yields a more favorable solution algorithm than current solution methods. This style of programming is facilitated by a 'drag-and-drop' graphic programming language that is supplied with the particular type of FPGA computer reported in this paper. Simple examples in structural dynamics and statics illustrate the solution approach used. The FPGA system also allows linear scalability in computing capability. As the problem grows, the number of FPGA chips can be increased with no loss of computing efficiency due to data flow or algorithmic latency that occurs when a single problem is distributed among many conventional processors that operate in parallel. This initial assessment finds the FPGA hardware and software to be in their infancy in regard to the user conveniences; however, they have enormous potential for shrinking the elapsed time of structural analysis solutions if programmed with algorithms that exhibit inherent parallelism and linear scalability. This potential warrants further development of FPGA-tailored algorithms for structural analysis.
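The 'analog computer' style of programming described above amounts to evaluating each term of the governing equation directly at every step, so that each operation could occupy its own block of gates and run concurrently. The sketch below does this for a one-degree-of-freedom structure, m*x'' + c*x' + k*x = f(t), with invented parameters.

    # Direct term-by-term simulation of m*x'' + c*x' + k*x = f(t).
    # Each arithmetic operation below could map to its own FPGA logic
    # block and execute in parallel. Parameters are illustrative.
    m, c, k = 1.0, 0.1, 40.0
    x, v, dt = 0.0, 0.0, 1e-3
    for n in range(5000):                  # 5 s of response
        f = 1.0 if n * dt < 0.5 else 0.0   # step load released at 0.5 s
        a = (f - c * v - k * x) / m        # acceleration from force balance
        v += a * dt                        # integrate velocity
        x += v * dt                        # integrate displacement
    print(f"displacement at t = 5 s: {x:.4f}")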
Applications of decision analysis and related techniques to industrial engineering problems at KSC
NASA Technical Reports Server (NTRS)
Evans, Gerald W.
1995-01-01
This report provides: (1) a discussion of the origination of decision analysis problems (well-structured problems) from ill-structured problems; (2) a review of the various methodologies and software packages for decision analysis and related problem areas; (3) a discussion of how the characteristics of a decision analysis problem affect the choice of modeling methodologies, thus providing a guide as to when to choose a particular methodology; and (4) examples of applications of decision analysis to particular problems encountered by the IE Group at KSC. With respect to the specific applications at KSC, particular emphasis is placed on the use of the Demos software package (Lumina Decision Systems, 1993).
Drug discovery chemistry: a primer for the non-specialist.
Jordan, Allan M; Roughley, Stephen D
2009-08-01
Like all scientific disciplines, drug discovery chemistry is rife with terminology and methodology that can seem intractable to those outside the sphere of synthetic chemistry. Derived from a successful in-house workshop, this Foundation Review aims to demystify some of this inherent terminology, providing the non-specialist with a general insight into the nomenclature, terminology and workflow of medicinal chemists within the pharmaceutical industry.
Kofler, Michael J; Spiegel, Jamie A; Soto, Elia F; Irwin, Lauren N; Wells, Erica L; Austin, Kristin E
2018-06-19
Reading problems are common in children with ADHD and show strong covariation with these children's underdeveloped working memory abilities. In contrast, working memory training does not appear to improve reading performance for children with ADHD or neurotypical children. The current study bridges the gap between these conflicting findings, and combines dual-task methodology with Bayesian modeling to examine the role of working memory for explaining ADHD-related reading problems. Children ages 8-13 (M = 10.50, SD = 1.59) with and without ADHD (N = 78; 29 girls; 63% Caucasian/Non-Hispanic) completed a counterbalanced series of reading tasks that systematically manipulated concurrent working memory demands. Adding working memory demands produced disproportionate decrements in reading comprehension for children with ADHD (d = -0.67) relative to Non-ADHD children (d = -0.18); comprehension was significantly reduced in both groups when working memory demands were increased. These effects were robust to controls for foundational reading skills (decoding, sight word vocabulary) and comorbid reading disability. Concurrent working memory demands did not slow reading speed for either group. The ADHD group showed lower comprehension (d = 1.02) and speed (d = 0.69) even before adding working memory demands beyond those inherently required for reading. Exploratory conditional effects analyses indicated that underdeveloped working memory overlapped with 41% (comprehension) and 85% (speed) of these between-group differences. Reading problems in ADHD appear attributable, at least in part, to their underdeveloped working memory abilities. Combined with prior cross-sectional and longitudinal findings, the current experimental evidence positions working memory as a potential causal mechanism that is necessary but not sufficient for effectively understanding written language.
Holburn, Steve
1997-01-01
After a slow start, the popularity of applied behavior analysis for people with severe behavior problems peaked in the 1970s and was then battered down by the effects of methodological behaviorism, the aversives controversy, overregulation, and the inherent limitations of congregate living. Despite the ethical, technical, and conceptual advancements in behavior analysis, many people with challenging behavior live in futile environments in which the behavior analyst can only tinker. A radically behavioristic approach has become available that has the power to change these conditions, to restore the reciprocity necessary for new learning, and to bring residential behavior analysts more in contact with the contingencies of helping and teaching. The approach is consistent with alternatives that behaviorists have suggested for years to improve the image and effectiveness of applied behavior analysis, although it will take the behaviorist far from the usual patterns of practice. Finally, the approach promotes its own survival by promoting access to interlocking organizational contingencies, but its antithetical nature presents many conceptual and practical challenges to agency adoption. PMID:22478282
Lebo, J.A.; Huckins, J.N.; Petty, J.D.; Ho, K.T.
1999-01-01
Work was performed to determine the feasibility of selectively detoxifying organic contaminants in sediments. The results of this research will be used to aid in the development of a scheme for whole-sediment toxicity identification evaluations (TIEs). The context in which the method will be used inherently restricts the treatments to which the sediments can be subjected: Sediments cannot be significantly altered physically or chemically and the presence and bioavailabilities of other toxicants must not be changed. The methodological problem is daunting because of the requirement that the detoxification method be relatively fast and convenient together with the stipulation that only innocuous and minimally invasive treatments be used. Some of the experiments described here dealt with degrees of decontamination (i.e., detoxification as predicted from instrumental measurements) of spiked sediments rather than with degrees of detoxification as gauged by toxicity tests (e.g., 48-h toxicity tests with amphipods). Although the larger TIE scheme itself is mostly outside the scope of this paper, theoretical aspects of bioavailability and of the desorption of organic contaminants from sediments are discussed.
The use of narrative in Jewish medical ethics.
Jotkowitz, Alan
2013-09-01
Anne Jones has pointed out that over the last three decades, stories have been important to medical ethics in at least three ways: (1) stories as cases for teaching principle-based medical ethics; (2) narratives as moral guides on what is considered living a good life; and (3) stories as testimonials written by both patients and physicians. A pioneer in this effort, particularly in regard to using narratives as moral guides, has been the ethicist and philosopher Stanley Hauerwas. Heavily influenced by virtue ethics, Hauerwas believes that it is a person's particular narrative tradition that provides one with convictions that form the basis of one's morality. Befitting a Protestant theologian, he is particularly concerned with the Christian narrative. From a Jewish perspective, there has been much less written on the use of narrative in medical ethics. However, it is a mistake to think that narrative has little, if any, role in Rabbinic ethical decision making. The purpose of this article is to demonstrate the centrality of narrative in the thought of Orthodox Jewish decisors and the problems inherent in this methodology.
Statistical Projections for Multi-resolution, Multi-dimensional Visual Data Exploration and Analysis
DOE Office of Scientific and Technical Information (OSTI.GOV)
Hoa T. Nguyen; Stone, Daithi; E. Wes Bethel
2016-01-01
An ongoing challenge in visual exploration and analysis of large, multi-dimensional datasets is how to present useful, concise information to a user for some specific visualization tasks. Typical approaches to this problem have proposed either reduced-resolution versions of data, or projections of data, or both. These approaches still have limitations, such as high computational cost or susceptibility to error. In this work, we explore the use of a statistical metric as the basis for both projections and reduced-resolution versions of data, with a particular focus on preserving one key trait in data, namely variation. We use two different case studies to explore this idea, one that uses a synthetic dataset, and another that uses a large ensemble collection produced by an atmospheric modeling code to study long-term changes in global precipitation. The primary finding of our work is that a statistical measure more faithfully preserves the variation signal inherent in data across both multi-dimensional projections and multi-resolution representations than does a methodology based upon averaging.
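A minimal sketch of the central idea, under assumptions of ours rather than the authors': when a 2D field is reduced block by block, summarizing each block with a variation statistic (here the standard deviation) retains the variation signal that a block mean smooths away. The block size and the synthetic field are illustrative.

import numpy as np

# Hedged sketch: reduce a field block-by-block with a chosen statistic.
# Using np.std preserves variation; np.mean smooths it away.
def reduce_blocks(data, block, stat):
    h, w = data.shape
    view = data[:h - h % block, :w - w % block]
    view = view.reshape(h // block, block, w // block, block)
    return stat(view, axis=(1, 3))

rng = np.random.default_rng(0)
field = rng.normal(size=(256, 256)) * np.linspace(0.1, 2.0, 256)  # varying spread

by_mean = reduce_blocks(field, 8, np.mean)   # variation signal is lost
by_std  = reduce_blocks(field, 8, np.std)    # variation signal is kept
print(by_mean.std(), by_std.mean())          # compare what each map encodes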
Bergen, P L; Nemec, D
1999-01-01
In December 1997, the authors completed an in-depth collection assessment project at the University of Wisconsin-Madison Health Sciences Libraries. The purpose was to develop a framework for future collection assessment projects by completing a multifaceted evaluation of the libraries' monograph and serial collections in the subject area of drug resistance. Evaluators adapted and synthesized several traditional collection assessment tools, including shelflist measurement, bibliography and standard list checking, and citation analysis. Throughout the project, evaluators explored strategies to overcome some of the problems inherent in the application of traditional collection assessment methods to the evaluation of biomedical collections. Their efforts resulted in the identification of standard monographs and core journals for the subject area, a measurement of the collections' strength relative to the collections of benchmark libraries, and a foundation for future collection development within the subject area. The project's primary outcome was a collection assessment methodology that has potential application to both internal and cooperative collection development in medical, pharmaceutical, and other health sciences libraries. PMID:9934527
Influence of Attitudes Toward Curriculum on Dishonest Academic Behavior
Austin, Zubin; Collins, David; Remillard, Alfred; Kelcher, Sheila; Chui, Stephanie
2006-01-01
Objectives The objective of this study was to examine possible associations between students' self-reported behaviors and opinions towards academic dishonesty, and their attitudes towards curriculum, assessment, and teaching within the pharmacy program. Methods A questionnaire was developed and distributed to undergraduate (pre-licensure) students at 4 schools of pharmacy in Canada, including students enrolled in the international pharmacy graduate program. Results More than 80% of respondents indicated they had participated in one or more of the acts of academic dishonesty described in the questionnaire. A weak to moderate correlation was found between students' attitudes towards pharmacy education and their self-reported behaviors related to academic dishonesty. Conclusions This study confirmed previous findings suggesting widespread academic dishonesty as well as a hierarchy of values with respect to students' perceptions regarding the severity and importance of academic dishonesty. Despite the methodological limitations inherent in examining academic dishonesty, there is a definite need to continue to examine this important issue. While this study indicated only a moderate correlation between attitudes towards curriculum and dishonest behaviors, the problem of academic misconduct is multifactorial and will require ongoing study. PMID:17136171
General Methodology for Designing Spacecraft Trajectories
NASA Technical Reports Server (NTRS)
Condon, Gerald; Ocampo, Cesar; Mathur, Ravishankar; Morcos, Fady; Senent, Juan; Williams, Jacob; Davis, Elizabeth C.
2012-01-01
A methodology for designing spacecraft trajectories in any gravitational environment within the solar system has been developed. The methodology facilitates modeling and optimization for problems ranging from that of a single spacecraft orbiting a single celestial body to that of a mission involving multiple spacecraft and multiple propulsion systems operating in gravitational fields of multiple celestial bodies. The methodology consolidates almost all spacecraft trajectory design and optimization problems into a single conceptual framework requiring solution of either a system of nonlinear equations or a parameter-optimization problem with equality and/or inequality constraints.
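The sketch below shows, in miniature, the second of the two problem forms the abstract names: a trajectory design posed as a parameter-optimization problem with equality constraints. The two-burn LEO-to-GEO transfer, its constraint formulation, and the use of SciPy's SLSQP solver are our own illustrative assumptions, not the methodology's actual implementation.

import numpy as np
from scipy.optimize import minimize

# Hedged toy instance of trajectory design as constrained parameter
# optimization. The 'mission' (two burn magnitudes pinned by a Hohmann
# transfer constraint) is invented for illustration only.
mu, r1, r2 = 398600.4418, 6678.0, 42164.0   # km^3/s^2; LEO and GEO radii

def total_dv(x):                 # objective: total delta-v of the two burns
    return x[0] + x[1]

def transfer_constraint(x):      # equality constraints: reach GEO via Hohmann
    a_t = (r1 + r2) / 2.0        # transfer-ellipse semi-major axis
    v1 = np.sqrt(mu / r1)
    v2 = np.sqrt(mu / r2)
    vp = np.sqrt(mu * (2 / r1 - 1 / a_t))   # perigee speed on transfer orbit
    va = np.sqrt(mu * (2 / r2 - 1 / a_t))   # apogee speed on transfer orbit
    return np.array([x[0] - (vp - v1), x[1] - (v2 - va)])

sol = minimize(total_dv, x0=[1.0, 1.0], method="SLSQP",
               constraints=[{"type": "eq", "fun": transfer_constraint}])
print(sol.x, sol.fun)            # classical Hohmann burns (~2.43, ~1.47 km/s)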
Decision-problem state analysis methodology
NASA Technical Reports Server (NTRS)
Dieterly, D. L.
1980-01-01
A methodology for analyzing a decision-problem state is presented. The methodology is based on the analysis of an incident in terms of the set of decision-problem conditions encountered. By decomposing the events that preceded an unwanted outcome, such as an accident, into the set of decision-problem conditions that were resolved, a more comprehensive understanding is possible. Not all human-error accidents are caused by faulty decision-problem resolutions, but this appears to be one of the major areas of accidents cited in the literature. A three-phase methodology is presented which accommodates a wide spectrum of events. It allows for a systems content analysis of the available data to establish: (1) the resolutions made, (2) alternatives not considered, (3) resolutions missed, and (4) possible conditions not considered. The product is a map of the decision-problem conditions that were encountered as well as a projected, assumed set of conditions that should have been considered. The application of this methodology introduces a systematic approach to decomposing the events that transpired prior to the accident. The initial emphasis is on decision and problem resolution. The technique allows for a standardized method of decomposing an accident into a scenario which may be used for review or the development of a training simulation.
Freedom to Make Choices for Health: Plus 40 Years
ERIC Educational Resources Information Center
Airhihenbuwa, Collins O.; Iwelunmor, Juliet
2010-01-01
In the 1969 inaugural issue of the "School Health Review", Douglass J.H. examined four major issues he felt were central to the question of choices one has about health: (1) problems with health care delivery methods; (2) persistent poverty in our population and its impact on health; (3) systemic problems inherent in social and institutional…
The Community College in the Twenty-First Century. A Systems Approach;.
ERIC Educational Resources Information Center
Cain, Michael Scott
The thesis of the book is that the problems community colleges face were inherent from the beginning and became more prominent because the particular vantage points from which the schools were viewed prohibited the taking of any action on the problems. Chapter one presents a portrait of what the colleges have become, establishing the governing…
ERIC Educational Resources Information Center
Doleck, Tenzin; Jarrell, Amanda; Poitras, Eric G.; Chaouachi, Maher; Lajoie, Susanne P.
2016-01-01
Clinical reasoning is a central skill in diagnosing cases. However, diagnosing a clinical case poses several challenges that are inherent to solving multifaceted ill-structured problems. In particular, when solving such problems, the complexity stems from the existence of multiple paths to arriving at the correct solution (Lajoie, 2003). Moreover,…
Street Youth: Adaptation and Survival in the AIDS Decade.
ERIC Educational Resources Information Center
Luna, G. Cajetan
Street youth remain at the fringes of society, reflecting larger inherent social problems. Whether due to the death of parents as a result of war, poverty, famine, disease, abandonment, or abuse, the health and social problems of the world's 100 million street youth are profound. By 1987 it was accepted that street youth were a high risk population…
ERIC Educational Resources Information Center
Kaungamno, E. E.
This paper discusses the role of information in national development, addressing such issues as for whom and for what purposes information is needed in developing countries, the impact of the information explosion on the Third World, and the problems inherent in current national and international information infrastructures. A series of statements…
Balard, Frédéric; Corre, Stéphanie Pin Le; Trouvé, Hélène; Saint-Jean, Olivier; Somme, Dominique
2013-01-01
By matching needs to resource services, case management could be a useful tool for improving the care of older people with complex living conditions. Collecting and analysing the users' experiences represents a good way to evaluate the effectiveness and efficiency of a case-management service. However, in the literature, fieldwork is very rarely considered and the users included in qualitative research seem to be the most accessible. This study was undertaken to describe the challenges of conducting qualitative research with older people with complex living conditions in order to understand their experiences with case-management services. Reflective analysis was applied to describe the process of recruiting and interviewing older people with complex living conditions in private homes, describing the protocol with respect to fieldwork chronology. The practical difficulties inherent in this type of study are addressed, particularly in terms of defining a sample, the procedure for contacting the users and conducting the interview. The users are people who suffer from a loss of autonomy because of cognitive impairment, severe disease and/or psychiatric or social problems. Notably, most of them refuse care and assistance. Reflective analysis of our protocol showed that the methodology and difficulties encountered constituted the first phase of data analysis. Understanding the experience of users of case management to analyse the outcomes of case-management services requires a clear methodology for the fieldwork.
Dynamic Loads Generation for Multi-Point Vibration Excitation Problems
NASA Technical Reports Server (NTRS)
Shen, Lawrence
2011-01-01
A random-force method has been developed to predict dynamic loads produced by rocket-engine random vibrations for new rocket-engine designs. The method develops random forces at multiple excitation points based on random vibration environments scaled from accelerometer data obtained during hot-fire tests of existing rocket engines. This random-force method applies random forces to the model and creates expected dynamic response in a manner that simulates the way the operating engine applies self-generated random vibration forces (random pressure acting on an area) with the resulting responses that we measure with accelerometers. This innovation includes the methodology (implementation sequence), the computer code, two methods to generate the random-force vibration spectra, and two methods to reduce some of the inherent conservatism in the dynamic loads. This methodology would be implemented to generate the random-force spectra at excitation nodes without requiring the use of artificial boundary conditions in a finite element model. More accurate random dynamic loads than those predicted by current industry methods can then be generated using the random force spectra. The scaling method used to develop the initial power spectral density (PSD) environments for deriving the random forces for the rocket engine case is based on the Barrett Criteria developed at Marshall Space Flight Center in 1963. This invention approach can be applied in the aerospace, automotive, and other industries to obtain reliable dynamic loads and responses from a finite element model for any structure subject to multipoint random vibration excitations.
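As a rough illustration of one standard way to realize a random force time history from a prescribed power spectral density, the sketch below assigns each frequency line an amplitude consistent with the PSD and a uniformly random phase, then inverse-FFTs. The flat band-limited PSD is an invented stand-in; it is not the Barrett-criteria scaling or the paper's spectra.

import numpy as np

# Hedged sketch: synthesize a random force history from a one-sided PSD by
# amplitude-from-PSD plus random phase, then inverse real FFT. Toy values.
fs, n = 8192.0, 8192                 # sample rate [Hz], number of samples
df = fs / n
freqs = np.fft.rfftfreq(n, d=1.0 / fs)
psd = np.where((freqs > 20) & (freqs < 2000), 10.0, 0.0)  # N^2/Hz, toy band

rng = np.random.default_rng(1)
amp = np.sqrt(psd * df / 2.0) * n    # line amplitude for numpy's irfft scaling
phase = rng.uniform(0, 2 * np.pi, freqs.size)
spectrum = amp * np.exp(1j * phase)
force = np.fft.irfft(spectrum, n)    # zero-mean random force history [N]

print(force.std()**2, psd.sum() * df)  # variance should match the PSD area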
Environmental care in agricultural catchments: Toward the communicative catchment
NASA Astrophysics Data System (ADS)
Martin, Peter
1991-11-01
Substantial land degradation of agricultural catchments in Australia has resulted from the importation of European farming methods and the large-scale clearing of land. Rural communities are now being encouraged by government to take responsibility for environmental care. The importance of community involvement is supported by the view that environmental problems are a function of interactions between people and their environment. It is suggested that the commonly held view that community groups cannot care for their resources is due to inappropriate social institutions rather than any inherent disability in people. The communicative catchment is developed as a vision for environmental care into the future. This concept emerges from a critique of resource management through the catchment metaphors of the reduced, mechanical, and the complex, evolving catchment, which reflect the development of systemic and people-centered approaches to environmental care. The communicative catchment is one where both community and resource managers participate collaboratively in environmental care. A methodology based on action research and systemic thinking (systemic action research) is proposed as a way of moving towards the communicative catchment of the future. Action research is a way of taking action in organizations and communities that is participative and informed by theory, while systemic thinking takes into account the interconnections and relationships between social and natural worlds. The proposed vision, methodology, and practical operating principles stem from involvement in an action research project looking at extension strategies for the implementation of total catchment management in the Hunter Valley, New South Wales.
NASA Technical Reports Server (NTRS)
Palosz, B.; Stelmakh, S.; Grzanka, E.; Gierlotka, S.; Zhao, Y.; Palosz, W.
2003-01-01
The real atomic structure of nanocrystals determines key properties of the materials. For such materials the serious experimental problem lies in obtaining sufficiently accurate measurements of the structural parameters of the crystals, since very small crystals constitute a two-phase rather than a uniform crystallographic phase system. As a result, elastic properties of nanograins may be expected to reflect the dual nature of their structure, with a corresponding set of different elastic property parameters. We studied those properties by an in-situ high-pressure powder diffraction technique. For nanocrystalline, even one-phase, materials such measurements are particularly difficult to make, since determination of the lattice parameters of very small crystals presents a challenge due to inherent limitations of standard elaboration of powder diffractograms. In this investigation we used our methodology of structural analysis, the 'apparent lattice parameter' (alp) concept. The methodology allowed us to avoid the traps (if applied to nanocrystals) of standard powder diffraction evaluation techniques. The experiments were performed for nanocrystalline SiC and GaN powders using synchrotron sources. We applied both hydrostatic and isostatic pressures in the range of up to 40 GPa. Elastic properties of the samples were examined based on the measurements of the change of the lattice parameters with pressure. The results show a dual nature of the mechanical properties (compressibilities) of the materials, indicating a complex, core-shell structure of the grains.
NASA Astrophysics Data System (ADS)
Akram, Muhammad Farooq Bin
The management of technology portfolios is an important element of aerospace system design. New technologies are often applied to new product designs to ensure their competitiveness at the time they are introduced to market. The future performance of yet-to-be-designed components is inherently uncertain, necessitating subject matter expert knowledge, statistical methods and financial forecasting. Estimates of the appropriate parameter settings often come from disciplinary experts, who may disagree with each other because of varying experience and background. Due to the inherently uncertain nature of expert elicitation in the technology valuation process, appropriate uncertainty quantification and propagation is very critical. The uncertainty in defining the impact of an input on the performance parameters of a system makes it difficult to use traditional probability theory. Often the available information is not enough to assign appropriate probability distributions to uncertain inputs. Another problem faced during technology elicitation pertains to technology interactions in a portfolio. When multiple technologies are applied simultaneously to a system, their cumulative impact is often non-linear. Current methods assume that technologies are either incompatible or linearly independent. It is observed that in cases where knowledge about the problem is lacking, epistemic uncertainty is the most suitable representation of the process. It reduces the number of assumptions during the elicitation process, where experts would otherwise be forced to assign probability distributions to their opinions without sufficient knowledge. Epistemic uncertainty can be quantified by many techniques. In the present research it is proposed that interval analysis and the Dempster-Shafer theory of evidence are better suited for quantification of epistemic uncertainty in the technology valuation process. The proposed technique seeks to offset some of the problems faced by deterministic or traditional probabilistic approaches to uncertainty propagation. Non-linear behavior in technology interactions is captured through expert-elicitation-based technology synergy matrices (TSM). The proposed TSMs increase the fidelity of current technology forecasting methods by including higher-order technology interactions. A test case for quantification of epistemic uncertainty was selected: a large-scale combined cycle power generation system. A detailed multidisciplinary modeling and simulation environment was adopted for this problem. Results have shown that the evidence theory based technique provides more insight into the uncertainties arising from incomplete information or lack of knowledge than deterministic or probability theory methods. Margin analysis was also carried out for both techniques. A detailed description of TSMs and their usage in conjunction with technology impact matrices and technology compatibility matrices is discussed. Various combination methods are also proposed for higher-order interactions, which can be applied according to expert opinion or historical data. The introduction of the technology synergy matrix enabled capturing the higher-order technology interactions, and an improvement in predicted system performance.
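For readers unfamiliar with the evidence-theory machinery invoked here, the sketch below implements Dempster's rule of combination, the standard operation for fusing two expert mass functions. The frame of discernment ('low'/'high' technology impact) and the mass values are invented for illustration; they are not the thesis's elicitation data.

from itertools import product

# Hedged sketch of Dempster's rule of combination over frozenset focal
# elements. The frame and mass assignments below are invented.
def combine(m1, m2):
    """Combine two mass functions; conflict mass is renormalized away."""
    combined, conflict = {}, 0.0
    for (a, pa), (b, pb) in product(m1.items(), m2.items()):
        inter = a & b
        if inter:
            combined[inter] = combined.get(inter, 0.0) + pa * pb
        else:
            conflict += pa * pb                 # mass falling on the empty set
    return {k: v / (1.0 - conflict) for k, v in combined.items()}

low, high = frozenset({"low"}), frozenset({"high"})
either = low | high                              # 'don't know' focal element
expert1 = {low: 0.6, either: 0.4}                # belief masses from expert 1
expert2 = {high: 0.3, either: 0.7}               # belief masses from expert 2
print(combine(expert1, expert2))                 # fused belief masses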
NASA Technical Reports Server (NTRS)
Tamma, Kumar K.; Railkar, Sudhir B.
1988-01-01
This paper describes new and recent advances in the development of a hybrid transfinite element computational methodology applicable to conduction/convection/radiation heat transfer problems. The transfinite element methodology, while retaining the modeling versatility of contemporary finite element formulations, is a hybrid approach based on the application of transform techniques in conjunction with classical Galerkin schemes. The purpose of this paper is to provide a viable hybrid computational methodology applicable to general transient thermal analysis. Highlights and features of the methodology are described and developed via generalized formulations and applications to several test problems. The proposed transfinite element methodology provides a viable computational approach, and the numerical test problems validate the proposed developments for conduction/convection/radiation thermal analysis.
An Architectural Model of Visual Motion Understanding
1989-08-01
...of the Center for Visual Sciences of the University of Rochester. Their courage in the face of the overwhelming complexity of the human visual... analysis should perform better than either approach by itself. Notice that the problems of the two approaches are non-overlapping. Continuous methods face no... success. This is not terribly surprising, as the problem is inherently very difficult. Consider the problems faced by a unit that is trying to compute the...
Ellingson, Roger M.; Gallun, Frederick J.; Bock, Guillaume
2015-01-01
It can be problematic to measure a stationary acoustic sound pressure level in any environment when the target level approaches or lies below the minimum measurable sound pressure level of the measurement system itself. This minimum measurable level, referred to as the inherent measurement system noise floor, is generally established by the noise emission characteristics of measurement system components such as microphones, preamplifiers, and other system circuitry. In this paper, methods are presented and shown to be accurate for measuring stationary levels within 20 dB above and below this system noise floor. The methodology includes (1) measuring the inherent measurement system noise, (2) subtractive, energy-based adjustment of levels affected by the system noise floor, and (3) verifying the accuracy of the inherent-noise adjustment technique. While generalizable to other purposes, the techniques presented here were specifically developed to quantify ambient noise levels in very quiet rooms used to evaluate free-field human hearing thresholds. Results obtained applying the methods to objectively measure and verify the ambient noise level in an extremely quiet room, using various measurement system noise floors and analysis bandwidths, are presented and discussed. The verified results demonstrate that the adjustment method can accurately extend the measurement range to 20 dB below the measurement system noise floor, and how the measurement system frequency bandwidth can affect the accuracy of reported noise levels. PMID:25786932
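A minimal sketch of the subtractive, energy-based adjustment in step (2), under our reading of the abstract: the total measured level and the separately measured system noise floor are converted to energy-like quantities, subtracted, and converted back to decibels. The numerical values are illustrative.

import math

# Hedged sketch of energy-based noise-floor subtraction on dB levels.
def noise_adjusted_level(l_total_db, l_noise_db):
    e_total = 10.0 ** (l_total_db / 10.0)
    e_noise = 10.0 ** (l_noise_db / 10.0)
    if e_total <= e_noise:
        raise ValueError("measured level at or below the system noise floor")
    return 10.0 * math.log10(e_total - e_noise)

# A source 3 dB above the floor contributes as much energy as the floor:
print(noise_adjusted_level(13.0, 10.0))   # -> approximately 10.0 dB ambient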
Classical problems in computational aero-acoustics
NASA Technical Reports Server (NTRS)
Hardin, Jay C.
1996-01-01
In the early development of computational aeroacoustics (CAA), the preliminary applications were to classical problems whose known analytical solutions could be used to validate the numerical results. Such comparisons were used to overcome the numerical problems inherent in these calculations. Comparisons were made between the various numerical approaches to the problems, such as direct simulations, acoustic analogies, and acoustic/viscous splitting techniques. The aim was to demonstrate the applicability of CAA as a tool in the same class as computational fluid dynamics. The scattering problems that occur are considered and simple sources are discussed.
Code of Federal Regulations, 2010 CFR
2010-10-01
... performance of an inherently governmental function (see subpart 7.5). (d) Non-personal service contracts are... describing the need to be filled, or problem to be resolved, through service contracting in a manner that...
Code of Federal Regulations, 2011 CFR
2011-10-01
... performance of an inherently governmental function (see subpart 7.5). (d) Non-personal service contracts are... describing the need to be filled, or problem to be resolved, through service contracting in a manner that...
Systems identification technology development for large space systems
NASA Technical Reports Server (NTRS)
Armstrong, E. S.
1982-01-01
A methodology for synthesizing systems identification (both parameter and state estimation) and related control schemes for flexible aerospace structures is developed, with emphasis on the Maypole hoop-column antenna as a real-world application. Modeling studies of the Maypole cable hoop membrane type antenna are conducted using a transfer matrix numerical analysis approach. This methodology was chosen as particularly well suited for handling a large number of antenna configurations of a generic type. A dedicated transfer matrix analysis, both by virtue of its specialization and the inherently easy compartmentalization of the formulation and numerical procedures, is significantly more efficient not only in the computer time required but, more importantly, in the time needed to review and interpret the results.
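To illustrate the transfer-matrix style of analysis mentioned above, the sketch below propagates a [displacement, force] state through the 2x2 element matrices of a fixed-free spring-mass chain and scans for the natural frequencies at which the fixed-boundary condition is satisfied. The chain, its properties, and the root-scanning approach are our own toy assumptions, not the Maypole antenna model.

import numpy as np

# Hedged sketch of a transfer-matrix analysis: propagate [u, N] through
# point-mass and spring matrices of a fixed-free chain. Toy parameters.
k, m, n = 1.0e4, 1.0, 5                 # spring stiffness, mass, element count

def base_displacement(omega):
    """Base displacement for unit tip displacement of the free-tip chain."""
    state = np.array([1.0, 0.0])        # free tip: u = 1, N = 0
    for _ in range(n):
        mass = np.array([[1.0, 0.0], [-m * omega**2, 1.0]])  # point mass
        spring = np.array([[1.0, 1.0 / k], [0.0, 1.0]])      # massless spring
        state = spring @ (mass @ state)
    return state[0]                     # fixed base requires u = 0 here

# scan for sign changes of the boundary residual -> natural frequencies
ws = np.linspace(1.0, 300.0, 3000)
res = np.array([base_displacement(w) for w in ws])
roots = ws[:-1][np.sign(res[:-1]) != np.sign(res[1:])]
print(roots)   # approximate natural frequencies [rad/s]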
ERIC Educational Resources Information Center
Gauthier, Benoit; And Others
1997-01-01
Identifies the more representative problem-solving models in environmental education. Suggests the addition of a strategy for defining a problem situation using Soft Systems Methodology to environmental education activities explicitly designed for the development of critical thinking. Contains 45 references. (JRH)
Spectrally-balanced chromatic approach-lighting system
NASA Technical Reports Server (NTRS)
Chase, W. D.
1977-01-01
Approach lighting system employing combinations of red and blue lights reduces the problem of color-based optical illusions. The system exploits the inherent chromatic aberration of the eye to create a three-dimensional effect, giving the pilot visual cues of position.
A Rationale for Relating Salaries to Learner Outcomes.
ERIC Educational Resources Information Center
Benedict, Gary C.; Gerardi, Robert J.
1985-01-01
Presents a formula for relating teacher salaries to student achievement. Discusses the problems inherent in measuring student achievement and the importance of the principal's administrative training. Includes a chart and diagrams. (MD)
ERIC Educational Resources Information Center
Paz, Benito Castejon; And Others
The major aim of this study is to devise a model for rationalizing sports policies by defining the basic concepts that should be inherent in any proper sports policy despite the infinite diversity that characterizes actual sport situations. The first part of the study discusses three concepts which are basic to the model: a) the "level of sport"…
ERIC Educational Resources Information Center
Calhoun, Shawn P.
2012-01-01
Information literacy is a complex knowledge domain. Cognitive processing theory describes the effects an instructional subject and the learning environment have on working memory. Essential processing is one component of cognitive processing theory that explains the inherent complexity of knowledge domains such as information literacy. Prior…
2003-02-01
Holistic Life Prediction Methodology. Engineering is a profession based in science, but in the face of limited data or resources, the application of... the process (see Table 1). HLPM uses continuum mechanics but defines limits of applicability; it is material- and process-specific. HLPM defines... [the remainder of this snippet is table residue covering fracture regimes (LEFM/EPFM), nucleated versus structure-dominated behavior, databases, and tensile/compressive discontinuity type and size]
Power relations in qualitative research.
Karnieli-Miller, Orit; Strier, Roni; Pessach, Liat
2009-02-01
This article focuses on the tensions between the commitment to power redistribution of the qualitative paradigm and the ethical and methodological complexity inherent in clinical research. Qualitative inquiry, in general, though there are significant variations between its different paradigms and traditions, proposes to reduce power differences and encourages disclosure and authenticity between researchers and participants. It clearly departs from the traditional conception of quantitative research, whereby the researcher is the ultimate source of authority and promotes the participants' equal participation in the research process. But it is precisely this admirable desire to democratize the research process, and the tendency to question traditional role boundaries, that raises multiple ethical dilemmas and serious methodological challenges. In this article, we offer a conceptual frame for addressing questions of power distribution in qualitative research through a developmental analysis of power relations across the different stages of the research process. We discuss ethical and methodological issues.
A new methodology for vibration error compensation of optical encoders.
Lopez, Jesus; Artes, Mariano
2012-01-01
Optical encoders are sensors based on grating interference patterns. Tolerances inherent to the manufacturing process can induce errors in position accuracy as the measurement signals depart from ideal conditions. When the encoder is working under vibration, the oscillating movement of the scanning head is registered by the encoder system as a displacement, introducing an error into the counter that adds to graduation, system, and installation errors. Behavior can be improved by different techniques that try to compensate the error through processing of the measurement signals. In this work a new 'ad hoc' methodology is presented to compensate the error of the encoder when it is working under the influence of vibration. The methodology is based on fitting techniques applied to the Lissajous figure of the deteriorated measurement signals and the use of a look-up table, resulting in a compensation procedure that yields higher sensor accuracy.
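A minimal sketch of the Lissajous-fitting idea, under simplified assumptions of ours (a Heydemann-style parametric correction rather than the paper's exact fitting-plus-look-up-table procedure): offsets, gains, and the quadrature phase error are estimated from the distorted signal pair, the ideal sine/cosine pair is restored, and the interpolated angle is recomputed. The distortion values are simulated.

import numpy as np

# Hedged sketch of quadrature-signal correction from the Lissajous figure.
# The offset/gain/phase distortions below are simulated, not measured.
rng = np.random.default_rng(2)
theta = np.linspace(0, 20 * np.pi, 20000)            # true interpolation angle
a = 1.00 * np.cos(theta) + 0.10                      # offset error
b = 0.80 * np.sin(theta + 0.15) + 0.05               # gain + phase + offset

p, q = a.mean(), b.mean()                            # offset estimates
A = np.sqrt(2) * (a - p).std()                       # amplitude estimates
B = np.sqrt(2) * (b - q).std()
sin_phi = 2 * np.mean((a - p) * (b - q)) / (A * B)   # phase-error estimate
phi = np.arcsin(sin_phi)

x = (a - p) / A                                      # corrected cosine
y = ((b - q) / B - x * np.sin(phi)) / np.cos(phi)    # corrected sine
angle = np.unwrap(np.arctan2(y, x))
print(np.abs(angle - theta).max())                   # residual angle error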
Word-based Morphology: Some Problems from a Polysynthetic Language.
ERIC Educational Resources Information Center
Axelrod, Melissa
Some of the problems inherent in a word-based hypothesis asserting that the word/stem is taken as the minimal sign not only for syntax but also for morphology are examined in an analysis of a polysynthetic language, Koyukon, an Athabaskan language of Alaska. Data from the Central dialect is considered in the analysis. A brief sketch of the verbal…
Toward 2D and 3D imaging of magnetic nanoparticles using EPR measurements.
Coene, A; Crevecoeur, G; Leliaert, J; Dupré, L
2015-09-01
Magnetic nanoparticles (MNPs) are an important asset in many biomedical applications. An effective working of these applications requires an accurate knowledge of the spatial MNP distribution. A promising, noninvasive, and sensitive technique to visualize MNP distributions in vivo is electron paramagnetic resonance (EPR). Currently only 1D MNP distributions can be reconstructed. In this paper, the authors propose extending 1D EPR toward 2D and 3D using computer simulations to allow accurate imaging of MNP distributions. To find the MNP distribution belonging to EPR measurements, an inverse problem needs to be solved, and the solution highly depends on the stability of the inverse problem. The authors adapt 1D EPR imaging to realize the imaging of multidimensional MNP distributions. Furthermore, the authors introduce partial volume excitation, in which only parts of the volume are imaged to increase stability of the inverse solution and to speed up the measurements. The authors simulate EPR measurements of different 2D and 3D MNP distributions and solve the inverse problem. The stability is evaluated by calculating the condition measure and by comparing the actual MNP distribution to the reconstructed MNP distribution. Based on these simulations, the authors define requirements for the EPR system to cope with the added dimensions. Moreover, the authors investigate how EPR measurements should be conducted to improve the stability of the associated inverse problem and to increase reconstruction quality. The approach used in 1D EPR can only be employed for the reconstruction of small volumes in 2D and 3D EPR due to numerical instability of the inverse solution. The authors performed EPR measurements of increasing cylindrical volumes and evaluated the condition measure. This showed that a reduction of the inherent symmetry in the EPR methodology is necessary. By reducing the symmetry of the EPR setup, quantitative images of larger volumes can be obtained. The authors found that, by selectively exciting parts of the volume, they could increase the reconstruction quality even further while reducing the number of measurements. Additionally, the inverse solution of this activation method degrades more slowly for increasing volumes. Finally, the methodology was applied to noisy EPR measurements: using the reduced EPR setup's symmetry and the partial activation method, an increase in reconstruction quality of ≈ 80% can be seen with a speedup of the measurements of 10%. Applying the aforementioned requirements to the EPR setup and stabilizing the EPR measurements showed a tremendous increase in noise robustness, thereby making EPR a valuable method for quantitative imaging of multidimensional MNP distributions.
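The sketch below is a toy, linear-algebra version of the reconstruction problem described here, under our own assumptions: a forward matrix of invented Gaussian sensitivity kernels maps a voxelized MNP distribution to measurements, the condition number serves as the stability ('condition measure') check, and a Tikhonov-regularized least-squares solve recovers the distribution. It is not the authors' EPR system model.

import numpy as np

# Hedged toy inverse problem: forward matrix -> condition check ->
# regularized reconstruction. Kernels and sizes are invented.
rng = np.random.default_rng(3)
nx = 16                                   # 16x16 voxel grid
grid = np.stack(np.meshgrid(np.arange(nx), np.arange(nx)), -1).reshape(-1, 2)

centers = rng.uniform(0, nx, size=(300, 2))          # measurement positions
d = np.linalg.norm(grid[None, :, :] - centers[:, None, :], axis=2)
A = np.exp(-(d / 3.0) ** 2)                          # forward (system) matrix
print("condition measure:", np.linalg.cond(A))

x_true = np.zeros(nx * nx)
x_true[rng.choice(nx * nx, 20, replace=False)] = 1.0 # sparse MNP deposits
b = A @ x_true + 0.01 * rng.normal(size=A.shape[0])  # noisy measurements

lam = 1e-2                                           # Tikhonov weight
x_rec = np.linalg.solve(A.T @ A + lam * np.eye(nx * nx), A.T @ b)
print("relative error:", np.linalg.norm(x_rec - x_true) / np.linalg.norm(x_true))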
Participatory Literacy Education: A Complex Phenomenon.
ERIC Educational Resources Information Center
Demetrion, George
1993-01-01
A case study of the Bob Steele Reading Center in Connecticut demonstrates problems inherent in participatory literacy education when the learners, the program's culture, and the sociocultural context are not grounded in the participatory democratic ethic. (SK)
ERIC Educational Resources Information Center
Shaw, Richard
2001-01-01
Examines the maintenance management problems inherent in cleaning multiple flooring materials revealing the need for school officials to keep it simple when choosing flooring types. Also highlighted is a carpet recycling program used by Wright State University (Ohio). (GR)
NASA Astrophysics Data System (ADS)
Ning, Boda; Jin, Jiong; Zheng, Jinchuan; Man, Zhihong
2018-06-01
This paper is concerned with finite-time and fixed-time consensus of multi-agent systems in a leader-following framework. Different from conventional leader-following tracking approaches, where the inherent dynamics is required to satisfy a Lipschitz continuity condition, a more generalised case is investigated: discontinuous inherent dynamics. Using nonsmooth techniques, a nonlinear protocol is first proposed to achieve finite-time leader-following consensus. Then, based on fixed-time stability strategies, the fixed-time leader-following consensus problem is solved. An upper bound on the settling time is obtained by using a new protocol, and such a bound is independent of initial states, thereby providing additional options for designers in practical scenarios where initial conditions are unavailable. Finally, numerical simulations are provided to demonstrate the effectiveness of the theoretical results.
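As a rough illustration of the protocol family discussed (not the paper's exact design), the sketch below simulates a finite-time leader-following consensus law u_i = -k*sig(e_i)^alpha with 0 < alpha < 1, where sig(z)^a = |z|^a * sign(z) and e_i is agent i's local tracking error over an undirected follower graph with one informed follower. The graph, gains, and leader state are invented.

import numpy as np

# Hedged sketch of a finite-time leader-following consensus protocol of the
# general sig-function form; parameters and topology are illustrative.
def sig(z, a):
    return np.abs(z) ** a * np.sign(z)

n, k, alpha, dt = 5, 4.0, 0.5, 1e-3
A = np.array([[0, 1, 0, 0, 1],           # follower adjacency (undirected ring)
              [1, 0, 1, 0, 0],
              [0, 1, 0, 1, 0],
              [0, 0, 1, 0, 1],
              [1, 0, 0, 1, 0]], float)
b = np.array([1.0, 0, 0, 0, 0])          # only follower 1 sees the leader

rng = np.random.default_rng(4)
x = rng.uniform(-5, 5, n)                # follower states
x0 = 1.0                                 # static leader state
for step in range(20000):
    e = A.sum(1) * x - A @ x + b * (x - x0)   # local consensus errors
    x += dt * (-k * sig(e, alpha))
print(np.abs(x - x0).max())              # tracking error after 20 s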
Eco-analytical Methodology in Environmental Problems Monitoring
NASA Astrophysics Data System (ADS)
Agienko, M. I.; Bondareva, E. P.; Chistyakova, G. V.; Zhironkina, O. V.; Kalinina, O. I.
2017-01-01
Among the problems common to all mankind, whose solutions influence the prospects of civilization, the monitoring of the ecological situation occupies a very important place. Solution of this problem requires a specific methodology based on eco-analytical comprehension of global issues. Eco-analytical methodology should help in searching for the optimum balance between environmental problems and accelerating scientific and technical progress. The fact that governments, corporations, scientists and nations focus on the production and consumption of material goods causes great damage to the environment. As a result, the activity of environmentalists is developing quite spontaneously, as a complement to productive activities. Therefore, the challenge posed by environmental problems for science is the formation of eco-analytical reasoning and the monitoring of global problems common to the whole of humanity. The aim is thus to find the optimal trajectory of industrial development to prevent irreversible problems in the biosphere that could stop the progress of civilization.
Heuristic decomposition for non-hierarchic systems
NASA Technical Reports Server (NTRS)
Bloebaum, Christina L.; Hajela, P.
1991-01-01
Design and optimization is substantially more complex in multidisciplinary and large-scale engineering applications due to the existing inherently coupled interactions. The paper introduces a quasi-procedural methodology for multidisciplinary optimization that is applicable for nonhierarchic systems. The necessary decision-making support for the design process is provided by means of an embedded expert systems capability. The method employs a decomposition approach whose modularity allows for implementation of specialized methods for analysis and optimization within disciplines.
Modern Instrumental Methods in Forensic Toxicology*
Smith, Michael L.; Vorce, Shawn P.; Holler, Justin M.; Shimomura, Eric; Magluilo, Joe; Jacobs, Aaron J.; Huestis, Marilyn A.
2009-01-01
This article reviews modern analytical instrumentation in forensic toxicology for identification and quantification of drugs and toxins in biological fluids and tissues. A brief description of the theory and inherent strengths and limitations of each methodology is included. The focus is on new technologies that address current analytical limitations. A goal of this review is to encourage innovations to improve our technological capabilities and to encourage use of these analytical techniques in forensic toxicology practice. PMID:17579968
Federal Register 2010, 2011, 2012, 2013, 2014
2011-05-12
..., low precipitation, and wind scour, features they predicted would persist over time, especially on... Christmas Bird Count (CBC), suffer from a variety of problems, including the inherent difficulties...
Some Observations on Cost-Effectiveness Analysis in Education.
ERIC Educational Resources Information Center
Geske, Terry G.
1979-01-01
The general nature of cost-effectiveness analysis is discussed, analytical frameworks for conducting cost-effectiveness studies are described, and some of the problems inherent in measuring educational costs and in assessing program effectiveness are addressed. (Author/IRT)
Study on thick film spin-on carbon hardmask
NASA Astrophysics Data System (ADS)
Kim, Taeho; Kim, Youngmin; Hwang, Sunmin; Lee, Hyunsoo; Han, Miyeon; Lim, Sanghak
2017-03-01
A thick spin-on carbon hardmask (SOH) material is designed to overcome inherent problems of the amorphous deposited carbon layer (ACL) and thick photoresist. For ACL in use in semiconductor production processes, especially when a film thickness from sub-micrometer up to a few micrometers is required, not only does its inherently low transparency at long wavelengths often cause alignment problems with underlayers, but considerable variation of film thickness within a wafer can also cause patterning problems. To avoid these issues, a thick SOH is designed with monomers of high transparency and good solubility at the same time. In comparison with photoresist, the SOH has good etch resistance and high thermal stability, and it provides a wide process window of decreased film thickness and increased thermal budget up to 400°C after processes such as high-temperature deposition of SiON. In order to achieve a thick yet uniform film, many solvent factors were considered, such as solubility parameter, surface tension, vapor pressure, and others. By optimizing these solvent factors, we were able to develop a product with good coating performance.
Iterative Region-of-Interest Reconstruction from Limited Data Using Prior Information
NASA Astrophysics Data System (ADS)
Vogelgesang, Jonas; Schorr, Christian
2017-12-01
In practice, computed tomography and computed laminography applications suffer from incomplete data. In particular, when inspecting large objects with extremely different diameters in longitudinal and transversal directions or when high resolution reconstructions are desired, the physical conditions of the scanning system lead to restricted data and truncated projections, also known as the interior or region-of-interest (ROI) problem. To recover the searched-for density function of the inspected object, we derive a semi-discrete model of the ROI problem that inherently allows the incorporation of geometrical prior information in an abstract Hilbert space setting for bounded linear operators. Assuming that the attenuation inside the object is approximately constant, as for fibre reinforced plastics parts or homogeneous objects where one is interested in locating defects like cracks or porosities, we apply the semi-discrete Landweber-Kaczmarz method to recover the inner structure of the object inside the ROI from the measured data resulting in a semi-discrete iteration method. Finally, numerical experiments for three-dimensional tomographic applications with both an inherent restricted source and ROI problem are provided to verify the proposed method for the ROI reconstruction.
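A minimal, fully discrete sketch of the iteration named above, under toy assumptions of ours: Kaczmarz-style sweeps over blocks of measurements with a Landweber (gradient-type) update per block, plus the geometric prior that the object's attenuation is approximately constant, imposed here by clipping to [0, 1]. The random forward operator stands in for the restricted CT/laminography geometry.

import numpy as np

# Hedged toy Landweber-Kaczmarz iteration with a constant-attenuation prior.
# The random forward blocks stand in for the restricted scan geometry.
rng = np.random.default_rng(5)
m_blocks, rows, nvox = 8, 40, 100
A_blocks = [rng.normal(size=(rows, nvox)) / np.sqrt(nvox) for _ in range(m_blocks)]
omegas = [1.0 / np.linalg.norm(A, 2) ** 2 for A in A_blocks]  # step sizes

x_true = (rng.uniform(size=nvox) > 0.7).astype(float)   # homogeneous object
b_blocks = [A @ x_true for A in A_blocks]                # block measurements

x = np.zeros(nvox)
for sweep in range(200):
    for A, b, w in zip(A_blocks, b_blocks, omegas):      # Kaczmarz sweep
        x = x + w * A.T @ (b - A @ x)                    # Landweber update
        x = np.clip(x, 0.0, 1.0)                         # attenuation prior
print(np.linalg.norm(x - x_true) / np.linalg.norm(x_true))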
Comparison of the CENTRM resonance processor to the NITAWL resonance processor in SCALE
DOE Office of Scientific and Technical Information (OSTI.GOV)
Hollenbach, D.F.; Petrie, L.M.
1998-01-01
This report compares the NITAWL and CENTRM resonance processors in the SCALE code system. The cases examined consist of the International OECD/NEA Criticality Working Group Benchmark 20 problem. These cases represent fuel pellets partially dissolved in a borated solution. The assumptions inherent to the Nordheim Integral Treatment, used in NITAWL, are not valid for these problems. CENTRM resolves this limitation by explicitly calculating a problem-dependent point flux from point cross sections, which is then used to create group cross sections.
Dynamic Decision Making under Uncertainty and Partial Information
2017-01-30
...order to address these problems, we investigated efficient computational methodologies for dynamic decision making under uncertainty and partial information. In the course of this research, we (i) developed and studied efficient simulation-based methodologies for dynamic decision making under uncertainty and partial information; (ii) studied the application of these decision making models and methodologies to practical problems, such as those...
Probabilistic Simulation of Stress Concentration in Composite Laminates
NASA Technical Reports Server (NTRS)
Chamis, C. C.; Murthy, P. L. N.; Liaw, D. G.
1994-01-01
A computational methodology is described to probabilistically simulate the stress concentration factors (SCF's) in composite laminates. This new approach consists of coupling probabilistic composite mechanics with probabilistic finite element structural analysis. The composite mechanics is used to probabilistically describe all the uncertainties inherent in composite material properties, whereas the finite element analysis is used to probabilistically describe the uncertainties associated with methods to experimentally evaluate SCF's, such as loads, geometry, and supports. The effectiveness of the methodology is demonstrated by using it to simulate the SCF's in three different composite laminates. Simulated results match experimental data for probability density and for cumulative distribution functions. The sensitivity factors indicate that the SCF's are influenced by local stiffness variables, by load eccentricities, and by initial stress fields.
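To make the probabilistic-simulation idea concrete, the sketch below propagates uncertainty in hole geometry and load eccentricity through a closed-form SCF approximation by Monte Carlo sampling, yielding the SCF's probability distribution. The Heywood-type polynomial for a finite-width plate with a central circular hole and all distribution parameters are our own illustrative stand-ins for the paper's coupled composite-mechanics/finite-element simulation.

import numpy as np

# Hedged Monte Carlo sketch: sample uncertain inputs, push them through a
# textbook-style SCF approximation, and summarize the output distribution.
rng = np.random.default_rng(6)
n = 100_000
d_over_w = rng.normal(0.25, 0.01, n)          # hole diameter / plate width
ecc = rng.normal(1.0, 0.03, n)                # load-eccentricity multiplier

r = d_over_w                                  # Heywood-type fit (assumed form)
kt = (3.0 - 3.14 * r + 3.667 * r**2 - 1.527 * r**3) * ecc   # SCF samples

print("mean SCF:", kt.mean(), " std:", kt.std())
print("99th percentile:", np.quantile(kt, 0.99))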
Polar exponential sensor arrays unify iconic and Hough space representation
NASA Technical Reports Server (NTRS)
Weiman, Carl F. R.
1990-01-01
The log-polar coordinate system, inherent in both polar exponential sensor arrays and log-polar remapped video imagery, is identical to the coordinate system of its corresponding Hough transform parameter space. The resulting unification of iconic and Hough domains simplifies computation for line recognition and eliminates the slope quantization problems inherent in the classical Cartesian Hough transform. The geometric organization of the algorithm is more amenable to massively parallel architectures than that of the Cartesian version. The neural architecture of the human visual cortex meets the geometric requirements to execute 'in-place' log-Hough algorithms of the kind described here.
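A minimal sketch of the coordinate identity being exploited, under our own notation: points are remapped as (x, y) -> (log r, theta), and a straight line with normal-form parameters (rho, a) satisfies log(rho) = log r + log cos(theta - a) in those coordinates, so line evidence can be accumulated 'in place'. The line parameters below are illustrative.

import numpy as np

# Hedged sketch of the log-polar / Hough coordinate identity. The line
# parameters (a, rho) and sample points are invented for illustration.
def to_log_polar(x, y):
    r = np.hypot(x, y)
    return np.log(r), np.arctan2(y, x)        # (log radius, angle)

# points on the line x*cos(a) + y*sin(a) = rho, with a = 30 deg, rho = 5
a, rho = np.deg2rad(30), 5.0
t = np.linspace(-20, 20, 9)
x = rho * np.cos(a) - t * np.sin(a)
y = rho * np.sin(a) + t * np.cos(a)

u, v = to_log_polar(x, y)
# in log-polar coordinates the line satisfies log(rho) = u + log(cos(v - a)):
print(np.allclose(np.log(rho), u + np.log(np.cos(v - a))))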
Autocalibration method for non-stationary CT bias correction.
Vegas-Sánchez-Ferrero, Gonzalo; Ledesma-Carbayo, Maria J; Washko, George R; Estépar, Raúl San José
2018-02-01
Computed tomography (CT) is a widely used imaging modality for screening and diagnosis. However, the deleterious effects of radiation exposure inherent in CT imaging require the development of image reconstruction methods which can reduce exposure levels. The development of iterative reconstruction techniques is now enabling the acquisition of low-dose CT images whose quality is comparable to that of CT images acquired with much higher radiation dosages. However, the characterization and calibration of the CT signal due to changes in dosage and reconstruction approaches is crucial to provide clinically relevant data. Although CT scanners are calibrated as part of the imaging workflow, the calibration is limited to select global reference values and does not consider other inherent factors of the acquisition that depend on the subject scanned (e.g. photon starvation, partial volume effect, beam hardening) and result in a non-stationary noise response. In this work, we analyze the effect of reconstruction biases caused by non-stationary noise and propose an autocalibration methodology to compensate it. Our contributions are: 1) the derivation of a functional relationship between observed bias and non-stationary noise, 2) a robust and accurate method to estimate the local variance, 3) an autocalibration methodology that does not necessarily rely on a calibration phantom, attenuates the bias caused by noise and removes the systematic bias observed in devices from different vendors. The validation of the proposed methodology was performed with a physical phantom and clinical CT scans acquired with different configurations (kernels, doses, algorithms including iterative reconstruction). The results confirmed the suitability of the proposed methods for removing the intra-device and inter-device reconstruction biases. Copyright © 2017 Elsevier B.V. All rights reserved.
Provencher, Steeve; Archer, Stephen L; Ramirez, F Daniel; Hibbert, Benjamin; Paulin, Roxane; Boucherat, Olivier; Lacasse, Yves; Bonnet, Sébastien
2018-03-30
Despite advances in our understanding of the pathophysiology and the management of pulmonary arterial hypertension (PAH), significant therapeutic gaps remain for this devastating disease. Yet, few innovative therapies beyond the traditional pathways of endothelial dysfunction have reached clinical trial phases in PAH. Although there are inherent limitations of the currently available models of PAH, the leaky pipeline of innovative therapies relates, in part, to flawed preclinical research methodology, including lack of rigour in trial design, incomplete invasive hemodynamic assessment, and lack of careful translational studies that replicate randomized controlled trials in humans with attention to adverse effects and benefits. Rigorous methodology should include the use of prespecified eligibility criteria, sample sizes that permit valid statistical analysis, randomization, blinded assessment of standardized outcomes, and transparent reporting of results. Better design and implementation of preclinical studies can minimize inherent flaws in the models of PAH, reduce the risk of bias, and enhance external validity and our ability to distinguish truly promising therapies from many false-positive or overstated leads. Ideally, preclinical studies should use advanced imaging, study several preclinical pulmonary hypertension models, or correlate rodent and human findings and consider the fate of the right ventricle, which is the major determinant of prognosis in human PAH. Although these principles are widely endorsed, empirical evidence suggests that such rigor is often lacking in pulmonary hypertension preclinical research. The present article discusses the pitfalls in the design of preclinical pulmonary hypertension trials and discusses opportunities to create preclinical trials with improved predictive value in guiding early-phase drug development in patients with PAH, which will need support not only from researchers, peer reviewers, and editors but also from academic institutions, funding agencies, and animal ethics authorities. © 2018 American Heart Association, Inc.
Castillo, Edward; Castillo, Richard; Fuentes, David; Guerrero, Thomas
2014-01-01
Purpose: Block matching is a well-known strategy for estimating corresponding voxel locations between a pair of images according to an image similarity metric. Though robust to issues such as image noise and large magnitude voxel displacements, the estimated point matches are not guaranteed to be spatially accurate. However, the underlying optimization problem solved by the block matching procedure is similar in structure to the class of optimization problem associated with B-spline based registration methods. By exploiting this relationship, the authors derive a numerical method for computing a global minimizer to a constrained B-spline registration problem that incorporates the robustness of block matching with the global smoothness properties inherent to B-spline parameterization. Methods: The method reformulates the traditional B-spline registration problem as a basis pursuit problem describing the minimal l1-perturbation to block match pairs required to produce a B-spline fitting error within a given tolerance. The sparsity pattern of the optimal perturbation then defines a voxel point cloud subset on which the B-spline fit is a global minimizer to a constrained variant of the B-spline registration problem. As opposed to traditional B-spline algorithms, the optimization step involving the actual image data is addressed by block matching. Results: The performance of the method is measured in terms of spatial accuracy using ten inhale/exhale thoracic CT image pairs (available for download at www.dir-lab.com) obtained from the COPDgene dataset and corresponding sets of expert-determined landmark point pairs. The results of the validation procedure demonstrate that the method can achieve a high spatial accuracy on a significantly complex image set. Conclusions: The proposed methodology is demonstrated to achieve a high spatial accuracy and is generalizable in that it can employ any displacement field parameterization described as a least squares fit to block match generated estimates. Thus, the framework allows for a wide range of image similarity block match metric and physical modeling combinations. PMID:24694135
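The sketch below is a toy version of the l1-perturbation idea, under assumptions of ours: a generic smooth basis (a polynomial Vandermonde matrix standing in for B-splines) is fit to perturbed block-match estimates, and a penalized variant of the basis-pursuit problem, min ||P(m + p)||^2 + lambda*||p||_1 with P the fit-residual projector, is solved by ISTA. It illustrates how the sparse perturbation absorbs gross block-match outliers; it is not the authors' exact algorithm.

import numpy as np

# Hedged toy of the minimal-l1-perturbation reformulation, solved by ISTA.
# A polynomial basis stands in for B-splines; all sizes are illustrative.
rng = np.random.default_rng(7)
t = np.linspace(0, 1, 200)
B = np.vander(t, 6)                          # smooth basis (columns)
m = B @ rng.normal(size=6)                   # smooth true displacements
out = rng.choice(200, 15, replace=False)
m[out] += rng.normal(0, 2.0, 15)             # gross block-match outliers

P = np.eye(200) - B @ np.linalg.pinv(B)      # residual of the basis fit
lam, L = 0.05, np.linalg.norm(P, 2) ** 2     # l1 weight, Lipschitz constant
p = np.zeros(200)
for it in range(500):                        # ISTA: min ||P(m+p)||^2 + lam*|p|_1
    g = 2 * P.T @ (P @ (m + p))              # gradient of the smooth term
    z = p - g / (2 * L)
    p = np.sign(z) * np.maximum(np.abs(z) - lam / (2 * L), 0.0)

flagged = np.where(np.abs(p) > 1e-3)[0]      # outliers absorbed by p
print(sorted(flagged), sorted(out))          # compare flagged vs true outliers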
Resin additive improves performance of high-temperature hydrocarbon lubricants
NASA Technical Reports Server (NTRS)
Johnson, R. L.; Loomis, W. R.
1971-01-01
Paraffinic resins, in high temperature applications, improve strength of thin lubricant film in Hertzian contacts even though they do not increase bulk oil viscosity. Use of resin circumvents corrosivity and high volatility problems inherent with many chemical additives.
Linking ecosystem services, rehabilitation and river hydrogeomorphology
Assignment of values for natural ecological benefits and anthropocentric ecosystem services in riverine landscapes has been problematic, because a firm scientific basis linking these to the river's physical structure has been absent. We highlight some inherent problems in this pr...
Methodological issues in the study of violence against women
Ruiz‐Pérez, Isabel; Plazaola‐Castaño, Juncal; Vives‐Cases, Carmen
2007-01-01
The objective of this paper is to review the methodological issues that arise when studying violence against women as a public health problem, focusing on intimate partner violence (IPV), since this is the form of violence that has the greatest consequences at a social and political level. The paper focuses first on the problems of defining what is meant by IPV. Secondly, the paper describes the difficulties in assessing the magnitude of the problem. Obtaining reliable data on this type of violence is a complex task, because of the methodological issues derived from the very nature of the phenomenon, such as the private, intimate context in which this violence often takes place, which means the problem cannot be directly observed. Finally, the paper examines the limitations and bias in research on violence, including the lack of consensus with regard to measuring events that may or may not represent a risk factor for violence against women or the methodological problem related to the type of sampling used in both aetiological and prevalence studies. PMID:18000113
Review and evaluation of innovative technologies for measuring diet in nutritional epidemiology.
Illner, A-K; Freisling, H; Boeing, H; Huybrechts, I; Crispim, S P; Slimani, N
2012-08-01
The use of innovative technologies is deemed to improve dietary assessment in various research settings. However, their relative merits in nutritional epidemiological studies, which require accurate quantitative estimates of the usual intake at individual level, still need to be evaluated. To report on the inventory of available innovative technologies for dietary assessment and to critically evaluate their strengths and weaknesses as compared with the conventional methodologies (i.e. Food Frequency Questionnaires, food records, 24-hour dietary recalls) used in epidemiological studies. A list of currently available technologies was identified from English-language journals, using PubMed and Web of Science. The search criteria were principally based on the date of publication (between 1995 and 2011) and pre-defined search keywords. Six main groups of innovative technologies were identified ('Personal Digital Assistant-', 'Mobile-phone-', 'Interactive computer-', 'Web-', 'Camera- and tape-recorder-' and 'Scan- and sensor-based' technologies). Compared with the conventional food records, Personal Digital Assistant and mobile phone devices seem to improve the recording through the possibility for 'real-time' recording at eating events, but their validity to estimate individual dietary intakes was low to moderate. In 24-hour dietary recalls, there is still limited knowledge regarding the accuracy of fully automated approaches; and methodological problems, such as the inaccuracy in self-reported portion sizes might be more critical than in interview-based applications. In contrast, measurement errors in innovative web-based and in conventional paper-based Food Frequency Questionnaires are most likely similar, suggesting that the underlying methodology is unchanged by the technology. Most of the new technologies in dietary assessment were seen to have overlapping methodological features with the conventional methods predominantly used for nutritional epidemiology. Their main potential to enhance dietary assessment is through more cost- and time-effective, less laborious ways of data collection and higher subject acceptance, though their integration in epidemiological studies would need additional considerations, such as the study objectives, the target population and the financial resources available. However, even in innovative technologies, the inherent individual bias related to self-reported dietary intake will not be resolved. More research is therefore crucial to investigate the validity of innovative dietary assessment technologies.
Training effectiveness assessment: Methodological problems and issues
NASA Technical Reports Server (NTRS)
Cross, Kenneth D.
1992-01-01
The U.S. military uses a large number of simulators to train and sustain the flying skills of helicopter pilots. Despite the enormous resources required to purchase, maintain, and use those simulators, little effort has been expended in assessing their training effectiveness. One reason for this is the lack of an evaluation methodology that yields comprehensive and valid data at a practical cost. The methodological problems and issues that arise in assessing simulator training effectiveness, as well as problems with the classical transfer-of-learning paradigm, are discussed.
NASA Astrophysics Data System (ADS)
Lee, Dong-Sup; Cho, Dae-Seung; Kim, Kookhyun; Jeon, Jae-Jin; Jung, Woo-Jin; Kang, Myeng-Hwan; Kim, Jae-Ho
2015-01-01
Independent Component Analysis (ICA), one of the blind source separation methods, can be applied to extract unknown source signals from received signals alone. This is accomplished by finding statistical independence of signal mixtures, and it has been successfully applied in myriad fields such as medical science and image processing. Nevertheless, inherent problems have been reported when using this technique: instability and invalid ordering of the separated signals, particularly when a conventional ICA technique is used for vibratory source signal identification of complex structures. In this study, a simple iterative variant of conventional ICA is proposed to mitigate these problems. To extract more stable source signals in a valid order, the method iteratively reorders the extracted mixing matrix, guided by the magnitudes of the correlation coefficients between the intermediately separated signals and signals measured on or near the sources, until the reconstructed source signals converge. In order to review the problems of the conventional ICA technique and to validate the proposed method, numerical analyses have been carried out for a virtual response model and a 30 m class submarine model. Moreover, in order to investigate the applicability of the proposed method to real problems of complex structures, an experiment has been carried out on a scaled submarine mockup. The results show that the proposed method resolves the inherent problems of the conventional ICA technique.
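The correlation-guided reordering lends itself to a brief illustration. The sketch below is our reading of that step, not the authors' code: it runs one FastICA pass on synthetic two-source mixtures (numpy and scikit-learn assumed) and then permutes and sign-corrects the separated components according to their correlation with noisy near-source reference signals; the paper iterates this kind of step until the reconstructed sources converge.

```python
import numpy as np
from sklearn.decomposition import FastICA

def reorder_by_reference(separated, references):
    # correlation of every separated component with every reference channel
    n = separated.shape[1]
    corr = np.corrcoef(separated.T, references.T)[:n, n:]
    order = np.full(n, -1, dtype=int)
    used = set()
    # greedy assignment by descending |correlation|
    for i, j in sorted(np.ndindex(n, n), key=lambda ij: -abs(corr[ij])):
        if i not in used and order[j] < 0:
            order[j] = i
            used.add(i)
    signs = np.sign([corr[order[k], k] for k in range(n)])
    return separated[:, order] * signs   # valid order, consistent polarity

# synthetic demo: two sources, random mixing, noisy near-source references
rng = np.random.default_rng(0)
t = np.linspace(0.0, 1.0, 2000)
S = np.c_[np.sin(2 * np.pi * 5 * t), np.sign(np.sin(2 * np.pi * 11 * t))]
X = S @ rng.normal(size=(2, 2)).T
separated = FastICA(n_components=2, random_state=0).fit_transform(X)
references = S + 0.1 * rng.normal(size=S.shape)
recovered = reorder_by_reference(separated, references)
```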
Ngo, Joy; Gurinovic, Mirjana; Frost-Andersen, Lene; Serra-Majem, Lluís
2009-07-01
Immigrants comprise a noteworthy segment of the European population whose numbers are increasing. Research on the dietary habits of immigrants is critical for correctly providing diet counselling and implementing effective interventions. The aim of the present study was to identify the presently used methods and the adaptations required for measuring dietary intake in European immigrant groups. A comprehensive review strategy included a structured MEDLINE search, related references and key expert consultations. The review targeted adults from non-European Union ethnic groups (relative to the EU-15 countries) having the largest populations in Europe. As studies evaluating nutrient intake were scarce, papers evaluating intake at the level of foods were included. Forty-six papers were selected. Although Eastern Europe, Turkey, Africa (North, Sub-Saharan and Afro-Caribbean), Asia and Latin America represented the most numerous immigrant groups, papers on dietary intake were not available for all populations. Interviewer-administered FFQs and repeated 24-hour recalls were the most frequently applied instruments. Inclusion of ethnic foods, and quantification of specific portion sizes of traditional foods and dishes, in assessment tools as well as in food composition databases were commonly identified problems. For FFQs, food list elaboration required particular consideration to reflect key ethnic foods and their relative contribution to nutrient intake. Extra efforts were observed to overcome cultural barriers to study participation. Evaluating the dietary intake of immigrant populations requires special attention to various methodological aspects (sampling, recruiting, instruments used, method of administration, food composition database, acculturation, etc.) so as to adequately address the range of socio-cultural factors inherent in these nutritionally at-risk target groups.
Barallon, Rita; Bauer, Steven R.; Butler, John; Capes-Davis, Amanda; Dirks, Wilhelm G.; Elmore, Eugene; Furtado, Manohar; Kline, Margaret C.; Kohara, Arihiro; Los, Georgyi V.; MacLeod, Roderick A. F.; Masters, John R. W.; Nardone, Mark; Nardone, Roland M.; Nims, Raymond W.; Price, Paul J.; Reid, Yvonne A.; Shewale, Jaiprakash; Sykes, Gregory; Steuer, Anton F.; Storts, Douglas R.; Thomson, Jim; Taraporewala, Zenobia; Alston-Roberts, Christine; Kerrigan, Liz
2010-01-01
Cell misidentification and cross-contamination have plagued biomedical research for as long as cells have been employed as research tools. Examples of misidentified cell lines continue to surface to this day. Efforts to eradicate the problem by raising awareness of the issue and by asking scientists voluntarily to take appropriate actions have not been successful. Unambiguous cell authentication is an essential step in the scientific process and should be an inherent consideration during peer review of papers submitted for publication or during review of grants submitted for funding. In order to facilitate proper identity testing, accurate, reliable, inexpensive, and standardized methods for authentication of cells and cell lines must be made available. To this end, an international team of scientists is, at this time, preparing a consensus standard on the authentication of human cells using short tandem repeat (STR) profiling. This standard, which will be submitted for review and approval as an American National Standard by the American National Standards Institute, will provide investigators guidance on the use of STR profiling for authenticating human cell lines. Such guidance will include methodological detail on the preparation of the DNA sample, the appropriate numbers and types of loci to be evaluated, and the interpretation and quality control of the results. Associated with the standard itself will be the establishment and maintenance of a public STR profile database under the auspices of the National Center for Biotechnology Information. The consensus standard is anticipated to be adopted by granting agencies and scientific journals as appropriate methodology for authenticating human cell lines, stem cells, and tissues. PMID:20614197
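For readers unfamiliar with STR-based authentication, profile comparison typically reduces to a match score over shared alleles. The sketch below implements the widely used Tanabe score (2 × shared alleles divided by the total allele count of both profiles); the loci and alleles shown are toy values, and the standard discussed above specifies the authoritative loci, thresholds, and quality controls.

```python
def tanabe_score(profile_a, profile_b):
    """Percent match between two STR profiles (Tanabe algorithm):
    2 x shared alleles / (alleles in A + alleles in B).
    Profiles map locus name -> set of alleles, e.g. {"TH01": {"6", "9.3"}}.
    """
    shared = sum(len(profile_a[loc] & profile_b[loc])
                 for loc in profile_a.keys() & profile_b.keys())
    total = (sum(len(v) for v in profile_a.values())
             + sum(len(v) for v in profile_b.values()))
    return 100.0 * 2 * shared / total

# toy profiles over three loci
a = {"TH01": {"6", "9.3"}, "TPOX": {"8", "11"}, "vWA": {"16", "18"}}
b = {"TH01": {"6", "9.3"}, "TPOX": {"8"},       "vWA": {"16", "17"}}
print(round(tanabe_score(a, b), 1))   # 72.7 for this toy pair
```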
A Strategy for a Parametric Flood Insurance Using Proxies
NASA Astrophysics Data System (ADS)
Haraguchi, M.; Lall, U.
2017-12-01
Traditionally, the design of flood control infrastructure and flood plain zoning requires the estimation of return periods, which have been calculated by river hydraulic models coupled with rainfall-runoff models. However, this multi-step modeling process introduces significant uncertainty into inundation assessment. In addition, land use change and a changing climate alter the potential losses, as well as make modeling results obsolete. For these reasons, there is a strong need for parametric indexes for the financial transfer of risk from large flood events, to enable rapid response and recovery. Hence, this study examines the possibility of developing a parametric flood index at the national or regional level in Asia, which can be quickly mobilized after catastrophic floods. Specifically, we compare a single trigger based on a rainfall index with multiple triggers using rainfall and streamflow indices, through case studies in Bangladesh and Thailand. The proposed methodology consists of 1) selecting suitable indices of rainfall and streamflow (if available), 2) identifying trigger levels for specified return periods of losses using stepwise and logistic regressions, 3) measuring the performance of the indices, and 4) deriving return periods for the selected windows and trigger levels. Based on this methodology, actual trigger levels were identified for Bangladesh and Thailand. Models based on multiple triggers reduced basis risk, an inherent problem in index insurance. The proposed parametric flood index can be applied to countries with similar geographic and meteorological characteristics, and serves as a promising method for ex-ante risk financing for developing countries. This work is intended as a preliminary study supporting future work on pricing risk transfer mechanisms in ex-ante risk finance.
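Step 2 of the methodology can be illustrated with a toy computation. The sketch below uses synthetic data; the variable names and the 7-day rainfall framing are our assumptions, not the authors'. It fits a logistic regression of loss exceedance on rainfall and streamflow indices and backs out a rainfall trigger at the 50% exceedance probability; a multi-trigger design would additionally require the streamflow index to exceed its own threshold.

```python
import numpy as np
from sklearn.linear_model import LogisticRegression

# synthetic 200-"year" record of a 7-day rainfall index (mm) and a peak
# streamflow index, with a binary indicator of payout-level losses
rng = np.random.default_rng(1)
rain = rng.gamma(4.0, 60.0, size=200)
flow = 0.8 * rain + rng.normal(0.0, 30.0, size=200)
loss = (0.6 * rain + 0.4 * flow + rng.normal(0.0, 25.0, size=200) > 300)

model = LogisticRegression().fit(np.c_[rain, flow], loss.astype(int))

# single-trigger reading: rainfall level at which P(loss exceedance) = 0.5,
# holding the streamflow index at its median
b0 = model.intercept_[0]
b_rain, b_flow = model.coef_[0]
trigger_rain = -(b0 + b_flow * np.median(flow)) / b_rain
print(f"rainfall trigger level ~ {trigger_rain:.0f} mm")
```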
Ethical and Legal Implications of the Methodological Crisis in Neuroimaging.
Kellmeyer, Philipp
2017-10-01
Currently, many scientific fields such as psychology or biomedicine face a methodological crisis concerning the reproducibility, replicability, and validity of their research. In neuroimaging, similar methodological concerns have taken hold of the field, and researchers are working frantically toward finding solutions for the methodological problems specific to neuroimaging. This article examines some ethical and legal implications of this methodological crisis in neuroimaging. With respect to ethical challenges, the article discusses the impact of flawed methods in neuroimaging research in cognitive and clinical neuroscience, particularly with respect to faulty brain-based models of human cognition, behavior, and personality. Specifically examined is whether such faulty models, when they are applied to neurological or psychiatric diseases, could put patients at risk, and whether this places special obligations on researchers using neuroimaging. In the legal domain, the actual use of neuroimaging as evidence in United States courtrooms is surveyed, followed by an examination of ways that the methodological problems may create challenges for the criminal justice system. Finally, the article reviews and promotes some promising ideas and initiatives from within the neuroimaging community for addressing the methodological problems.
A hierarchical anatomical classification schema for prediction of phenotypic side effects
Wadhwa, Somin; Gupta, Aishwarya; Dokania, Shubham; Kanji, Rakesh; Bagler, Ganesh
2018-01-01
Prediction of adverse drug reactions is an important problem in drug discovery endeavors which can be addressed with data-driven strategies. SIDER is one of the most reliable and frequently used datasets for identification of key features as well as building machine learning models for side effects prediction. The inherently unbalanced nature of this data presents a difficult multi-label multi-class problem for the prediction of drug side effects. We highlight the intrinsic issue with SIDER data and methodological flaws in relying on performance measures such as AUC while attempting to predict side effects. We argue for the use of metrics that are robust to class imbalance for the evaluation of classifiers. Importantly, we present a 'hierarchical anatomical classification schema' which aggregates side effects into organs, sub-systems, and systems. With the help of a weighted performance measure, using 5-fold cross-validation we show that this strategy facilitates biologically meaningful side effects prediction at different levels of the anatomical hierarchy. By implementing various machine learning classifiers we show that the Random Forest model yields the best classification accuracy at each level of coarse-graining. The manually curated, hierarchical schema for side effects can also serve as the basis of future studies towards prediction of adverse reactions and identification of key features linked to specific organ systems. Our study provides a strategy for hierarchical classification of side effects rooted in the anatomy and can pave the way for calibrated expert systems for multi-level prediction of side effects. PMID:29494708
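The evaluation point, that AUC can flatter a classifier on heavily imbalanced labels, can be illustrated with a small experiment. The sketch below is a generic stand-in, not the paper's pipeline or the SIDER data: it trains a multi-label Random Forest under 5-fold cross-validation on synthetic data and reports a per-label imbalance-robust metric (Matthews correlation) next to AUC.

```python
import numpy as np
from sklearn.datasets import make_multilabel_classification
from sklearn.ensemble import RandomForestClassifier
from sklearn.metrics import matthews_corrcoef, roc_auc_score
from sklearn.model_selection import KFold

X, Y = make_multilabel_classification(n_samples=400, n_features=30,
                                      n_classes=12, n_labels=2, random_state=0)
mcc, auc = [], []
for tr, te in KFold(n_splits=5, shuffle=True, random_state=0).split(X):
    clf = RandomForestClassifier(n_estimators=200, random_state=0)
    clf.fit(X[tr], Y[tr])
    pred = clf.predict(X[te])
    proba = clf.predict_proba(X[te])          # list: one (n, 2) array per label
    for k in range(Y.shape[1]):
        if len(np.unique(Y[te][:, k])) < 2 or proba[k].shape[1] < 2:
            continue                          # metric undefined for one class
        mcc.append(matthews_corrcoef(Y[te][:, k], pred[:, k]))
        auc.append(roc_auc_score(Y[te][:, k], proba[k][:, 1]))
print(f"per-label MCC {np.mean(mcc):.2f} vs per-label AUC {np.mean(auc):.2f}")
```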
A hierarchical anatomical classification schema for prediction of phenotypic side effects.
Wadhwa, Somin; Gupta, Aishwarya; Dokania, Shubham; Kanji, Rakesh; Bagler, Ganesh
2018-01-01
Prediction of adverse drug reactions is an important problem in drug discovery endeavors which can be addressed with data-driven strategies. SIDER is one of the most reliable and frequently used datasets for identification of key features as well as building machine learning models for side effects prediction. The inherently unbalanced nature of this data presents with a difficult multi-label multi-class problem towards prediction of drug side effects. We highlight the intrinsic issue with SIDER data and methodological flaws in relying on performance measures such as AUC while attempting to predict side effects.We argue for the use of metrics that are robust to class imbalance for evaluation of classifiers. Importantly, we present a 'hierarchical anatomical classification schema' which aggregates side effects into organs, sub-systems, and systems. With the help of a weighted performance measure, using 5-fold cross-validation we show that this strategy facilitates biologically meaningful side effects prediction at different levels of anatomical hierarchy. By implementing various machine learning classifiers we show that Random Forest model yields best classification accuracy at each level of coarse-graining. The manually curated, hierarchical schema for side effects can also serve as the basis of future studies towards prediction of adverse reactions and identification of key features linked to specific organ systems. Our study provides a strategy for hierarchical classification of side effects rooted in the anatomy and can pave the way for calibrated expert systems for multi-level prediction of side effects.
Yuval, Yuval; Rimon, Yaara; Graber, Ellen R; Furman, Alex
2014-08-01
A large fraction of the fresh water available for human use is stored in groundwater aquifers. Since human activities such as mining, agriculture, industry and urbanisation often result in incursion of various pollutants to groundwater, routine monitoring of water quality is an indispensable component of judicious aquifer management. Unfortunately, groundwater pollution monitoring is expensive and usually cannot cover an aquifer with the spatial resolution necessary for making adequate management decisions. Interpolation of monitoring data is thus an important tool for supplementing monitoring observations. However, interpolating routine groundwater pollution data poses a special problem due to the nature of the observations. The data from a producing aquifer usually include many zero pollution concentration values from the clean parts of the aquifer but may span a wide range of values (up to a few orders of magnitude) in the polluted areas. This manuscript presents a methodology that can cope with such datasets and use them to produce maps that present the pollution plumes but also delineate the clean areas that are fit for production. A method for assessing the quality of mapping in a way which is suitable to the data's dynamic range of values is also presented. A local variant of inverse distance weighting is employed to interpolate the data. Inclusion zones around the interpolation points ensure that only relevant observations contribute to each interpolated concentration. Using inclusion zones improves the accuracy of the mapping but results in interpolation grid points which are not assigned a value. The inherent trade-off between the interpolation accuracy and coverage is demonstrated using both circular and elliptical inclusion zones. A leave-one-out cross testing is used to assess and compare the performance of the interpolations. The methodology is demonstrated using groundwater pollution monitoring data from the coastal aquifer along the Israeli shoreline. The implications for aquifer management are discussed.
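The interpolation scheme is simple enough to sketch. The code below is our illustration, with toy coordinates and concentrations: it implements inverse distance weighting restricted to a circular inclusion zone, and grid points with no observation inside the radius are left unassigned, which is exactly the accuracy/coverage trade-off the abstract describes. An elliptical zone would replace the Euclidean distance test with an anisotropic one.

```python
import numpy as np

def idw_with_inclusion(xy_obs, z_obs, xy_grid, radius, power=2.0):
    """Inverse distance weighting restricted to a circular inclusion zone.

    Grid points with no observation inside `radius` are returned as NaN,
    reflecting the accuracy/coverage trade-off described above.
    """
    out = np.full(len(xy_grid), np.nan)
    for i, p in enumerate(xy_grid):
        d = np.linalg.norm(xy_obs - p, axis=1)
        inside = d < radius
        if not inside.any():
            continue                        # unassigned: outside all zones
        if d[inside].min() < 1e-12:         # grid point coincides with a well
            out[i] = z_obs[inside][np.argmin(d[inside])]
            continue
        w = 1.0 / d[inside] ** power
        out[i] = np.sum(w * z_obs[inside]) / np.sum(w)
    return out

# toy example: mostly clean aquifer (zeros) with one plume
obs = np.array([[0, 0], [1, 0], [0, 1], [3, 3], [3.2, 3.1]])
conc = np.array([0.0, 0.0, 0.0, 250.0, 900.0])   # e.g. micrograms/litre
grid = np.array([[0.5, 0.5], [3.1, 3.0], [8.0, 8.0]])
print(idw_with_inclusion(obs, conc, grid, radius=1.5))  # last point is NaN
```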
NASA Astrophysics Data System (ADS)
Harmon, T. C.; Villamizar, S. R.; Conde, D.; Rusak, J.; Reid, B.; Astorga, A.; Perillo, G. M.; Piccolo, M. C.; Zilio, M.; London, S.; Velez, M.; Hoyos, N.; Escobar, J.
2014-12-01
Freshwater ecosystems and the services they provide are under increasing anthropogenic pressure at local (e.g., irrigation diversions, wastewater discharge) and global scales (e.g., climate change, global trading). The impact depends on an ecosystem's sensitivity, which is determined by its geophysical and ecological settings, and the population and activities in its surrounding watershed. Given the importance of ecosystem services, it is critical that we improve our ability to identify and understand changes in aquatic ecosystems, and translate them to risk of service loss. Furthermore, to inspire changes in human behavior, it is equally critical that we learn to communicate risk, and pose risk mitigation strategies, in a manner acceptable to a broad spectrum of stakeholders. Quantifying the nature and timing of the risk is difficult because (1) we often fail to understand the connection between anthropogenic pressures and the timing and extent of ecosystem changes; and (2) the concept of risk is inherently coupled to human perception, which generally differs with cultural and socio-economic conditions. In this study, we endeavor to assess aquatic ecosystem risks across an international array of six study sites. The challenge is to construct a methodology capable of capturing the marked biogeographical, socioeconomic, and cultural differences among the sites, which include: (1) Muskoka River watershed in humid continental Ontario, Canada; (2) Lower San Joaquin River, an impounded snow-fed river in semi-arid Central California; (3) Ciénaga Grande de Santa Marta, a tropical coastal lagoon in Colombia; (4) Senguer River basin in the semi-arid part of Argentina; (5) Laguna de Rocha watershed in humid subtropical Uruguay; and (6) Palomas Lake complex in oceanic Chilean Patagonia. Results will include a characterization of the experimental gradient over the six sites, an overview of the risk assessment methodology, and preliminary findings for several of the sites.
Quantitative evaluation of photoplethysmographic artifact reduction for pulse oximetry
NASA Astrophysics Data System (ADS)
Hayes, Matthew J.; Smith, Peter R.
1999-01-01
Motion artefact corruption of pulse oximeter output, causing both measurement inaccuracies and false alarm conditions, is a primary restriction in the current clinical practice and future applications of this useful technique. Artefact reduction in photoplethysmography (PPG), and therefore by application in pulse oximetry, is demonstrated using a novel non-linear methodology recently proposed by the authors. The significance of these processed PPG signals for pulse oximetry measurement is discussed, with particular attention to the normalization inherent in the artefact reduction process. Quantitative experimental investigation of the performance of PPG artefact reduction is then utilized to evaluate this technology for application to pulse oximetry. While the successfully demonstrated reduction of severe artefacts may widen the applicability of all PPG technologies and decrease the occurrence of pulse oximeter false alarms, the observed reduction of slight artefacts suggests that many such effects may go unnoticed in clinical practice. The signal processing and output averaging used in most commercial oximeters can incorporate these artefact errors into the output, while masking the true PPG signal corruption. It is therefore suggested that PPG artefact reduction should be incorporated into conventional pulse oximetry measurement, even in the absence of end-user artefact problems.
Foresight for commanders: a methodology to assist planning for effects-based operations
NASA Astrophysics Data System (ADS)
Davis, Paul K.; Kahan, James P.
2006-05-01
Looking at the battlespace as a system of systems is a cornerstone of Effects-Based Operations and a key element in the planning of such operations, and in developing the Commander's Predictive Environment. Instead of a physical battleground to be approached with weapons of force, the battlespace is an interrelated super-system of political, military, economic, social, information and infrastructure systems to be approached with diplomatic, informational, military and economic actions. A concept that has proved useful in policy arenas other than defense, such as research and development for information technology, addressing cybercrime, and providing appropriate and cost-effective health care, is foresight. In this paper, we provide an overview of how the foresight approach addresses the inherent uncertainties in planning courses of action, present a set of steps for the conduct of foresight, and then illustrate the application of foresight to a commander's decision problem. We conclude that the foresight approach we describe is consistent with current doctrinal intelligence preparation of the battlespace and operational planning, but represents an advance in that it explicitly addresses the uncertainties in the environment and planning in a way that identifies strategies that are robust across different possible ground truths. It should supplement other planning methods.
The acoustics of ducted propellers
NASA Astrophysics Data System (ADS)
Ali, Sherif F.
The return of the propeller to long-haul commercial service may be rapidly approaching in the form of advanced "prop fans". It is believed that the advanced turboprop will considerably reduce operational costs. However, such aircraft will come into general use only if their noise levels meet the standards of community acceptability currently applied to existing aircraft. In this work a time-marching boundary-element technique is developed and used to study the acoustics of ducted propellers. The numerical technique developed in this work eliminates the inherent instability suffered by conventional approaches. The methodology is validated against other numerical and analytical results. The results show excellent agreement with the analytical solution and show no indication of unstable behavior. For the ducted propeller problem, the propeller is modeled by rotating source-sink pairs, and the duct is modeled by a rigid annular body of elliptical cross-section. Using the model and the developed technique, the effect of different parameters on the acoustic field is predicted and analyzed. This includes the effect of duct length, propeller axial location, and source Mach number. The results of this study show that installing a short duct around the propeller can reduce the noise that reaches an observer on a sideline.
Measuring water affordability in developed economies. The added value of a needs-based approach.
Vanhille, Josefine; Goedemé, Tim; Penne, Tess; Van Thielen, Leen; Storms, Bérénice
2018-07-01
In developed countries, water affordability problems remain high on the agenda as the increasing financial costs of water services can impede the realisation of equal access to water. More than ever, public authorities that define water tariffs face the challenge of reconciling environmental and cost recovery objectives with equity and financial accessibility for all. Indicators of water affordability can be helpful in this regard. Conventional affordability indicators often rely on the actual amount that households spend on water use. In contrast, we propose a needs-based indicator that measures the risk of being unable to afford the amount of water necessary to fulfill essential needs, i.e. needs that should be fulfilled for adequate participation in society. In this paper we set forth the methodological choices inherent in constructing a needs-based affordability indicator. Using a micro-dataset on households in Flanders (Belgium), we compare its results with the outcomes of a more common actual-expenses indicator. The paper illustrates how the constructed needs-based indicator can complement existing affordability indicators, and its capacity to reveal important risk groups. Copyright © 2018 Elsevier Ltd. All rights reserved.
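The contrast between the actual-expenses and needs-based indicators can be made concrete with a toy calculation. In the sketch below every number is an illustrative assumption; the essential volume, the stylized tariff, and the 3% threshold are not taken from the paper, which derives its needs estimates from reference budgets for Flanders. The point carried over from the abstract is that a household restricting its consumption below the essential level looks affordable on actual expenses but is flagged by the needs-based indicator.

```python
# needs-based vs actual-expenses affordability; all constants are assumptions
ESSENTIAL_M3_PER_PERSON = 40        # assumed yearly essential water use (m3)
THRESHOLD = 0.03                    # assumed affordability cutoff (3% of income)

def water_bill(m3, fixed=50.0, rate=4.0):
    """Stylized tariff: fixed charge plus volumetric rate (EUR/year)."""
    return fixed + rate * m3

def affordability(income, household_size, actual_m3):
    needed = ESSENTIAL_M3_PER_PERSON * household_size
    return {
        "actual_share": water_bill(actual_m3) / income,   # conventional view
        "needs_share": water_bill(needed) / income,       # needs-based view
    }

res = affordability(income=12_000, household_size=4, actual_m3=70)
print({k: round(v, 3) for k, v in res.items()},
      "at risk:", res["needs_share"] > THRESHOLD)
# actual_share ~ 0.028 (looks fine); needs_share ~ 0.058 (flagged at risk)
```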
Measuring cancer in indigenous populations.
Sarfati, Diana; Garvey, Gail; Robson, Bridget; Moore, Suzanne; Cunningham, Ruth; Withrow, Diana; Griffiths, Kalinda; Caron, Nadine R; Bray, Freddie
2018-05-01
It is estimated that there are 370 million indigenous peoples in 90 countries globally. Indigenous peoples generally face substantial disadvantage and poorer health status compared with nonindigenous peoples. Population-level cancer surveillance provides data to set priorities, inform policies, and monitor progress over time. Measuring the cancer burden of vulnerable subpopulations, particularly indigenous peoples, is problematic. There are a number of practical and methodological issues potentially resulting in substantial underestimation of cancer incidence and mortality rates, and biased survival rates, among indigenous peoples. This, in turn, may result in a deprioritization of cancer-related programs and policies among these populations. This commentary describes key issues relating to cancer surveillance among indigenous populations including 1) suboptimal identification of indigenous populations, 2) numerator-denominator bias, 3) problems with data linkage in survival analysis, and 4) statistical analytic considerations. We suggest solutions that can be implemented to strengthen the visibility of indigenous peoples around the world. These include acknowledgment of the central importance of full engagement of indigenous peoples with all data-related processes, encouraging the use of indigenous identifiers in national and regional data sets and mitigation and/or careful assessment of biases inherent in cancer surveillance methods for indigenous peoples. Copyright © 2018 Elsevier Inc. All rights reserved.
NASA Technical Reports Server (NTRS)
Ryabenkii, V. S.; Turchaninov, V. I.; Tsynkov, S. V.
1999-01-01
We propose a family of algorithms for solving numerically a Cauchy problem for the three-dimensional wave equation. The sources that drive the equation (i.e., the right-hand side) are compactly supported in space for any given time; they may, however, move in space with a subsonic speed. The solution is calculated inside a finite domain (e.g., a sphere) that also moves with a subsonic speed and always contains the support of the right-hand side. The algorithms employ a standard consistent and stable explicit finite-difference scheme for the wave equation. They allow one to calculate the solution for arbitrarily long time intervals without error accumulation and with a fixed, non-growing amount of CPU time and memory required for advancing one time step. The algorithms are inherently three-dimensional; they rely on the presence of lacunae in the solutions of the wave equation in oddly dimensional spaces. The methodology presented in the paper is, in fact, a building block for constructing nonlocal, highly accurate, unsteady artificial boundary conditions to be used for the numerical simulation of waves propagating with finite speed over unbounded domains.
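For readers outside numerical analysis, the lacuna property the algorithms exploit can be stated compactly; the following is a standard formulation (our summary, not the paper's notation).

```latex
% Cauchy problem solved by the algorithms (our summary):
\begin{align*}
  &\frac{\partial^2 u}{\partial t^2} - c^2 \Delta u = f(\mathbf{x},t),
   \qquad \mathbf{x}\in\mathbb{R}^3,\; t>0,\\
  &u\big|_{t=0} = u_0(\mathbf{x}), \qquad
   \left.\frac{\partial u}{\partial t}\right|_{t=0} = u_1(\mathbf{x}).
\end{align*}
% Lacuna property (strong Huygens principle; odd space dimensions only): the
% solution at $(\mathbf{x},t)$ depends only on sources on the back
% characteristic cone $|\mathbf{x}-\mathbf{y}| = c\,(t-\tau)$, so once the
% waves emitted by a compactly supported source have passed a region, the
% solution there is identically zero. This is what lets the scheme discard
% old data without error accumulation.
```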
The questioned p value: clinical, practical and statistical significance.
Jiménez-Paneque, Rosa
2016-09-09
The use of the p-value and statistical significance has been questioned from the early 1980s to the present day. Much has been discussed about this in the field of statistics and its applications, especially in epidemiology and public health. In fact, the p-value and its equivalent, statistical significance, are difficult concepts to grasp for the many health professionals involved in some way in research applied to their work areas. Nevertheless, although based on theoretical concepts from the field of statistics, its meaning should be clear in intuitive terms. This paper attempts to present the p-value as a concept that applies to everyday life and is therefore intuitively simple, but whose proper use cannot be separated from theoretical and methodological elements of inherent complexity. The reasons behind the criticism of the p-value and of its use in isolation are explained intuitively, mainly the need to demarcate statistical significance from clinical significance, and some of the recommended remedies for these problems are discussed as well. The paper finally refers to the current trend to vindicate the p-value, appealing to the convenience of its use in certain situations, and to the recent statement of the American Statistical Association in this regard.
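The demarcation between statistical and clinical significance is easy to demonstrate numerically. The sketch below (synthetic blood-pressure data; scipy assumed) shows how a clinically negligible 0.5 mmHg difference becomes "statistically significant" once the sample is large enough, which is precisely the isolated-use pitfall discussed above.

```python
import numpy as np
from scipy import stats

rng = np.random.default_rng(0)
# systolic blood pressure: a 0.5 mmHg mean difference is clinically negligible
control = rng.normal(140.0, 15.0, 200_000)
treated = rng.normal(139.5, 15.0, 200_000)

res = stats.ttest_ind(treated, control)
print(f"p-value: {res.pvalue:.1e}")                   # tiny: 'significant'
print(f"difference: {control.mean() - treated.mean():.2f} mmHg")  # trivial
```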
A Comprehensive Planning Model
ERIC Educational Resources Information Center
Temkin, Sanford
1972-01-01
Combines elements of the problem solving approach inherent in methods of applied economics and operations research and the structural-functional analysis common in social science modeling to develop an approach for economic planning and resource allocation for schools and other public sector organizations. (Author)
Digital Ethics: Computers, Photographs, and the Manipulation of Pixels.
ERIC Educational Resources Information Center
Mercedes, Dawn
1996-01-01
Summarizes negative aspects of computer technology and problems inherent in the field of digital imaging. Considers the postmodernist response that borrowing and alteration are essential characteristics of the technology. Discusses the implications of this for education and research. (MJP)
Role of the ruminal microbiome in the production and composition of milk
USDA-ARS?s Scientific Manuscript database
Environmental problems associated with animal agriculture arise primarily from inherent inefficiencies in converting energy in feeds to useful products. Attenuating these inefficiencies provides substantial potential to improve both the economics and environmental footprint of animal agriculture, pa...
Improving vegetable oil properties for lubrication methods
USDA-ARS?s Scientific Manuscript database
The inherent problems of vegetable oils, such as poor oxidation and low-temperature properties, can be improved by attaching functional groups at the sites of unsaturation through chemical modifications. In this article, you will see how functionalization helps overcome these disadvantages....
NASA Technical Reports Server (NTRS)
Chen, Xiaoqin; Tamma, Kumar K.; Sha, Desong
1993-01-01
The present paper describes a new explicit virtual-pulse time integral methodology for nonlinear structural dynamics problems. The purpose of the paper is to provide the theoretical basis of the methodology and to demonstrate the applicability of the proposed formulations to nonlinear dynamic structures. Different from existing numerical methods such as direct time integration or mode superposition techniques, the proposed methodology offers new perspectives and a new methodology of development, and possesses several unique and attractive computational characteristics. The methodology is tested and compared with the implicit Newmark method (trapezoidal rule) on nonlinear softening and hardening spring dynamic models. The numerical results indicate that the proposed explicit virtual-pulse time integral methodology is an excellent alternative for solving general nonlinear dynamic problems.
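The abstract does not reproduce the virtual-pulse formulation itself, so the sketch below substitutes a plainly named stand-in: a minimal explicit central-difference integration of the same kind of test problem, a hardening spring m x'' + k x + k3 x^3 = 0, of the sort the paper uses to benchmark against the implicit Newmark (trapezoidal) method. Parameter values are illustrative.

```python
import numpy as np

# hardening spring: m x'' + k x + k3 x^3 = 0 (parameter values illustrative)
m, k, k3 = 1.0, 100.0, 1.0e4
dt, n_steps = 1.0e-4, 50_000
x = np.empty(n_steps)
x[0], v0 = 0.05, 0.0
a0 = -(k * x[0] + k3 * x[0] ** 3) / m
x[1] = x[0] + dt * v0 + 0.5 * dt ** 2 * a0      # Taylor bootstrap
for n in range(1, n_steps - 1):
    a = -(k * x[n] + k3 * x[n] ** 3) / m         # nonlinear restoring force
    x[n + 1] = 2.0 * x[n] - x[n - 1] + dt ** 2 * a
# bounded amplitude over the 5 s run confirms a stable explicit step size
print(f"max |x| = {np.abs(x).max():.4f}, x(0) = {x[0]:.4f}")
```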
Conceptual, Methodological, and Ethical Problems in Communicating Uncertainty in Clinical Evidence
Han, Paul K. J.
2014-01-01
The communication of uncertainty in clinical evidence is an important endeavor that poses difficult conceptual, methodological, and ethical problems. Conceptual problems include logical paradoxes in the meaning of probability and “ambiguity”— second-order uncertainty arising from the lack of reliability, credibility, or adequacy of probability information. Methodological problems include questions about optimal methods for representing fundamental uncertainties and for communicating these uncertainties in clinical practice. Ethical problems include questions about whether communicating uncertainty enhances or diminishes patient autonomy and produces net benefits or harms. This article reviews the limited but growing literature on these problems and efforts to address them and identifies key areas of focus for future research. It is argued that the critical need moving forward is for greater conceptual clarity and consistent representational methods that make the meaning of various uncertainties understandable, and for clinical interventions to support patients in coping with uncertainty in decision making. PMID:23132891
A New Methodology for Vibration Error Compensation of Optical Encoders
Lopez, Jesus; Artes, Mariano
2012-01-01
Optical encoders are sensors based on grating interference patterns. Tolerances inherent in the manufacturing process can induce errors in position accuracy as the measurement signals depart from ideal conditions. When the encoder is working under vibration, the oscillating movement of the scanning head is registered by the encoder system as a displacement, introducing an error into the counter that adds to the graduation, system, and installation errors. Behavior can be improved by techniques that compensate the error through processing of the measurement signals. In this work a new “ad hoc” methodology is presented to compensate the error of the encoder when it is working under the influence of vibration. The methodology is based on fitting techniques applied to the Lissajous figure of the deteriorated measurement signals and the use of a look-up table, resulting in a compensation procedure that yields higher sensor accuracy. PMID:22666067
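The Lissajous-based correction can be approximated in a few lines. The sketch below is a simplified moment-based stand-in for the paper's fitting-plus-look-up-table procedure: it estimates the offsets, gains, and quadrature phase error of the two encoder channels (assuming the record covers whole signal periods) and returns the corrected position angle.

```python
import numpy as np

def correct_quadrature(u, v):
    """Moment-based correction of encoder quadrature signals (simplified
    stand-in for Lissajous fitting; assumes whole numbers of periods)."""
    u0, v0 = u - u.mean(), v - v.mean()              # remove offsets
    u0 = u0 / (np.sqrt(2.0) * u0.std())              # normalize amplitudes
    v0 = v0 / (np.sqrt(2.0) * v0.std())
    sin_a = 2.0 * np.mean(u0 * v0)                   # E[cos t sin(t+a)] = sin(a)/2
    v_corr = (v0 - u0 * sin_a) / np.sqrt(1.0 - sin_a ** 2)
    return np.unwrap(np.arctan2(v_corr, u0))         # corrected phase angle

# distorted Lissajous: offsets, unequal gains, 5-degree quadrature error
t = np.linspace(0.0, 20.0 * np.pi, 20_000)
u = 1.00 * np.cos(t) + 0.10
v = 0.80 * np.sin(t + np.deg2rad(5.0)) - 0.05
theta = correct_quadrature(u, v)
print(f"max residual: {np.abs(theta - theta[0] - (t - t[0])).max():.4f} rad")
```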
Flores, Walter
2010-01-01
Governance refers to decision-making processes in which power relationships and actors and institutions' particular interests converge. Situations of consensus and conflict are inherent to such processes. Furthermore, decision-making happens within a framework of ethical principles, motivations and incentives which could be explicit or implicit. Health systems in most Latin-American and Caribbean countries take the principles of equity, solidarity, social participation and the right to health as their guiding principles; such principles must thus rule governance processes. However, this is not always the case and this is where the importance of investigating governance in health systems lies. Making advances in investigating governance involves conceptual and methodological implications. Clarifying and integrating normative and analytical approaches is relevant at conceptual level as both are necessary for an approach seeking to investigate and understand social phenomena's complexity. In relation to methodological level, there is a need to expand the range of variables, sources of information and indicators for studying decision-making aimed to greater equity, health citizenship and public policy efficiency.
Signal and noise extraction from analog memory elements for neuromorphic computing.
Gong, N; Idé, T; Kim, S; Boybat, I; Sebastian, A; Narayanan, V; Ando, T
2018-05-29
Dense crossbar arrays of non-volatile memory (NVM) can potentially enable massively parallel and highly energy-efficient neuromorphic computing systems. The key requirements for the NVM elements are continuous (analog-like) conductance tuning capability and switching symmetry with acceptable noise levels. However, most NVM devices show non-linear and asymmetric switching behaviors. Such non-linear behaviors render the separation of signal and noise extremely difficult with conventional characterization techniques. In this study, we establish a practical methodology based on Gaussian process regression to address this issue. The methodology is agnostic to switching mechanisms and applicable to various NVM devices. We show the tradeoff between switching symmetry and signal-to-noise ratio for HfO2-based resistive random access memory. Then, we characterize 1000 phase-change memory devices based on Ge2Sb2Te5 and separate the total variability into device-to-device variability and the inherent randomness of individual devices. These results highlight the usefulness of our methodology for realizing ideal NVM devices for neuromorphic computing.
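As an illustration of the regression step, the sketch below (scikit-learn assumed; the saturating exponential and noise level are toy choices, not measured device data) fits a Gaussian process with an RBF-plus-white-noise kernel to a simulated conductance-tuning sweep: the smooth posterior mean is the extracted switching signal, and the fitted WhiteKernel level estimates the noise power.

```python
import numpy as np
from sklearn.gaussian_process import GaussianProcessRegressor
from sklearn.gaussian_process.kernels import RBF, WhiteKernel

# toy analogue of a conductance-tuning sweep: a smooth, non-linear switching
# response plus device noise; the fitted WhiteKernel level is the noise power
rng = np.random.default_rng(0)
pulse = np.linspace(0.0, 1.0, 200)[:, None]        # normalized pulse count
g_true = 1.0 - np.exp(-3.0 * pulse.ravel())        # saturating conductance
g_meas = g_true + 0.05 * rng.normal(size=g_true.size)

kernel = 1.0 * RBF(length_scale=0.2) + WhiteKernel(noise_level=0.01)
gpr = GaussianProcessRegressor(kernel=kernel).fit(pulse, g_meas)

signal = gpr.predict(pulse)                        # de-noised switching curve
noise_sigma = np.sqrt(gpr.kernel_.k2.noise_level)  # separated noise estimate
print(f"estimated noise sigma: {noise_sigma:.3f} (true 0.05)")
```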
[Methodological problems in the use of information technologies in physical education].
Martirosov, E G; Zaĭtseva, G A
2000-01-01
The paper considers methodological problems in the use of computer technologies in physical education by applying diagnostic and consulting systems, educational and educational-and-training process automation systems, and control and self-control programmes for athletes and others.
BIOMOLECULAR SENSING FOR BIOLOGICAL PROCESSES AND ENVIRONMENTAL MONITORING APPLICATIONS
Biomolecular recognition is being increasingly employed as the basis for a variety of analytical methods such as biosensors. The sensitivity, selectivity, and format versatility inherent in these methods may allow them to be adapted to solving a number of analytical problems. Altho...
Assessment Strategies for Minority Groups.
ERIC Educational Resources Information Center
Sharma, Sarla
1986-01-01
The far-reaching ramifications of psychological assessment for minority children warrant that it be accurate, fair, and valid. This article addresses: (1) problems inherent in standardized testing; (2) a moratorium on intelligence testing; (3) alternate approaches to testing; and (4) guidelines for assessing ethnic minority groups. (LHW)
The Integration of Social-Ecological Resilience and Law
Growing recognition of the inherent uncertainty associated with the dynamics of ecological systems and their often non-linear and surprising behavior, however, presents a set of problems outside the scope of classic environmental law, and has led to a fundamental understanding a...
Advances in Arachis genomics for peanut improvement
USDA-ARS?s Scientific Manuscript database
Peanut genomics is very challenging due to the inherent problems of its genetic architecture. Blockage of gene flow from diploid wild relatives to tetraploid cultivated peanut, recent polyploidization combined with self-pollination, and the narrow genetic base of the primary gene pool have resulted in low genetic dive...
Design and Analysis of Cognitive Interviews for Comparative Multinational Testing
Fitzgerald, Rory; Padilla, José-Luis; Willson, Stephanie; Widdop, Sally; Caspar, Rachel; Dimov, Martin; Gray, Michelle; Nunes, Cátia; Prüfer, Peter; Schöbi, Nicole; Schoua-Glusberg, Alisú
2011-01-01
This article summarizes the work of the Comparative Cognitive Testing Workgroup, an international coalition of survey methodologists interested in developing an evidence-based methodology for examining the comparability of survey questions within cross-cultural or multinational contexts. To meet this objective, it was necessary to ensure that the cognitive interviewing (CI) method itself did not introduce method bias. Therefore, the workgroup first identified specific characteristics inherent in CI methodology that could undermine the comparability of CI evidence. The group then developed and implemented a protocol addressing those issues. In total, 135 cognitive interviews were conducted by participating countries. Through the process, the group identified various interpretive patterns resulting from sociocultural and language-related differences among countries as well as other patterns of error that would impede comparability of survey data. PMID:29081719
Probabilistic simulation of stress concentration in composite laminates
NASA Technical Reports Server (NTRS)
Chamis, C. C.; Murthy, P. L. N.; Liaw, L.
1993-01-01
A computational methodology is described to probabilistically simulate the stress concentration factors in composite laminates. This new approach consists of coupling probabilistic composite mechanics with probabilistic finite element structural analysis. Probabilistic composite mechanics is used to describe all the uncertainties inherent in composite material properties, while probabilistic finite element analysis is used to describe the uncertainties associated with the methods used to experimentally evaluate stress concentration factors, such as loads, geometry, and supports. The effectiveness of the methodology is demonstrated by using it to simulate the stress concentration factors in composite laminates made from three different composite systems. Simulated results match experimental data for probability density and for cumulative distribution functions. The sensitivity factors indicate that the stress concentration factors are influenced by local stiffness variables, by load eccentricities, and by initial stress fields.
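The flavor of the approach can be conveyed with a much cruder stand-in than the paper's coupled simulation: plain Monte Carlo sampling of ply-property scatter pushed through Lekhnitskii's closed-form stress concentration factor for an open hole in an infinite orthotropic plate. The distributions below are illustrative assumptions, not the paper's data.

```python
import numpy as np

# Monte Carlo propagation of material-property scatter (illustrative values)
# through Kt = 1 + sqrt(2(sqrt(E1/E2) - nu12) + E1/G12) for an open hole
# in an infinite orthotropic plate loaded along the fiber direction.
rng = np.random.default_rng(0)
n = 100_000
E1 = rng.normal(140e9, 7e9, n)       # longitudinal modulus, Pa
E2 = rng.normal(10e9, 0.8e9, n)      # transverse modulus, Pa
G12 = rng.normal(5e9, 0.5e9, n)      # in-plane shear modulus, Pa
nu12 = rng.normal(0.30, 0.02, n)     # major Poisson's ratio

Kt = 1.0 + np.sqrt(2.0 * (np.sqrt(E1 / E2) - nu12) + E1 / G12)
print(f"Kt: mean {Kt.mean():.2f}, 99th percentile {np.percentile(Kt, 99):.2f}")
```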
An embodied perspective on expertise in solving the problem of making a geologic map
NASA Astrophysics Data System (ADS)
Callahan, Caitlin Norah
The task of constructing a geologic map is a cognitively and physically demanding field-based problem. The map produced is understood to be an individual's two-dimensional interpretation or mental model of the three-dimensional underlying geology. A popular view within the geoscience community is that teaching students how to make a geologic map is valuable for preparing them to deal with disparate and incomplete data sets, for helping them develop problem-solving skills, and for acquiring expertise in geology. Few previous studies have focused specifically on expertise in geologic mapping. Drawing from literature related to expertise, to problem solving, and to mental models, two overarching research questions were identified: How do geologists of different levels of expertise constrain and solve an ill-structured problem such as making a geologic map? How do geologists address the uncertainties inherent to the processes and interpretations involved in solving a geologic mapping problem? These questions were answered using a methodology that captured the physical actions, expressed thoughts, and navigation paths of geologists as they made a geologic map. Eight geologists, from novice to expert, wore a head-mounted video camera with an attached microphone to record those actions and thoughts, creating "video logs" while in the field. The video logs were also time-stamped, which allowed the visual and audio data to be synchronized with the GPS data that tracked participants' movements in the field. Analysis of the video logs yielded evidence that all eight participants expressed thoughts that reflected the process of becoming mentally situated in the mapping task (e.g. relating between distance on a map and distance in three-dimensional space); the prominence of several of these early thoughts waned in the expressed thoughts later in the day. All participants collected several types of data while in the field; novices, however, did so more continuously throughout the day whereas the experts collected more of their data earlier in the day. Experts and novices also differed in that experts focused more on evaluating certainty in their interpretations; the novices focused more on evaluating the certainty of their observations and sense of location.
Design Optimization Toolkit: Users' Manual
DOE Office of Scientific and Technical Information (OSTI.GOV)
Aguilo Valentin, Miguel Alejandro
The Design Optimization Toolkit (DOTk) is a stand-alone C++ software package intended to solve complex design optimization problems. The DOTk software package provides a range of solution methods suited for gradient/nongradient-based optimization, large-scale constrained optimization, and topology optimization. DOTk was designed with a flexible user interface to allow easy access to DOTk solution methods from external engineering software packages. This inherent flexibility makes DOTk minimally intrusive to other engineering software packages. As part of this flexibility, the DOTk software package provides an easy-to-use MATLAB interface that enables users to call DOTk solution methods directly from the MATLAB command window.
Perfect and broadband acoustic absorption by critically coupled sub-wavelength resonators.
Romero-García, V; Theocharis, G; Richoux, O; Merkel, A; Tournat, V; Pagneux, V
2016-01-19
Perfect absorption is an interdisciplinary topic with a large number of applications, the challenge of which consists of broadening its inherently narrow frequency-band performance. We experimentally and analytically report perfect and broadband absorption for audible sound, by the mechanism of critical coupling, with a sub-wavelength multi-resonant scatterer (SMRS) made of a plate-resonator/closed waveguide structure. In order to introduce the role of the key parameters, we first present the case of a single resonant scatterer (SRS) made of a Helmholtz resonator/closed waveguide structure. In both cases the controlled balance between the energy leakage of the several resonances and the inherent losses of the system leads to perfect absorption peaks. In the case of the SMRS we show that systems with large inherent losses can be critically coupled using resonances with large leakage. In particular, we show that in the SMRS system, with a thickness of λ/12 and diameter of λ/7, several perfect absorption peaks overlap to produce absorption bigger than 93% for frequencies that extend over a factor of 2 in audible frequencies. The reported concepts and methodology provide guidelines for the design of broadband perfect absorbers which could contribute to solve the major issue of noise reduction. PMID:26781863
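The critical-coupling condition invoked in the abstract has a compact coupled-mode statement; the following one-port formulation is our paraphrase (sign conventions vary across the literature).

```latex
% One-port coupled-mode picture (our paraphrase): a resonance at \omega_0
% with leakage rate \gamma_l and inherent loss rate \gamma_i reflects with
\[
  R(\omega) \;=\;
  \frac{i(\omega-\omega_0) + \gamma_{l} - \gamma_{i}}
       {i(\omega-\omega_0) + \gamma_{l} + \gamma_{i}},
  \qquad \alpha(\omega) = 1 - |R(\omega)|^2 ,
\]
% so on resonance the absorption reaches unity exactly when
% \gamma_l = \gamma_i (the critical-coupling balance); overlapping several
% critically coupled resonances broadens the absorption band.
```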
Perfect and broadband acoustic absorption by critically coupled sub-wavelength resonators
NASA Astrophysics Data System (ADS)
Romero-García, V.; Theocharis, G.; Richoux, O.; Merkel, A.; Tournat, V.; Pagneux, V.
2016-01-01
Perfect absorption is an interdisciplinary topic with a large number of applications, the challenge of which consists of broadening its inherently narrow frequency-band performance. We experimentally and analytically report perfect and broadband absorption for audible sound, by the mechanism of critical coupling, with a sub-wavelength multi-resonant scatterer (SMRS) made of a plate-resonator/closed waveguide structure. In order to introduce the role of the key parameters, we first present the case of a single resonant scatterer (SRS) made of a Helmholtz resonator/closed waveguide structure. In both cases the controlled balance between the energy leakage of the several resonances and the inherent losses of the system leads to perfect absorption peaks. In the case of the SMRS we show that systems with large inherent losses can be critically coupled using resonances with large leakage. In particular, we show that in the SMRS system, with a thickness of λ/12 and diameter of λ/7, several perfect absorption peaks overlap to produce absorption bigger than 93% for frequencies that extend over a factor of 2 in audible frequencies. The reported concepts and methodology provide guidelines for the design of broadband perfect absorbers which could contribute to solve the major issue of noise reduction.
Perfect and broadband acoustic absorption by critically coupled sub-wavelength resonators
Romero-García, V.; Theocharis, G.; Richoux, O.; Merkel, A.; Tournat, V.; Pagneux, V.
2016-01-01
Perfect absorption is an interdisciplinary topic with a large number of applications, the challenge of which consists of broadening its inherently narrow frequency-band performance. We experimentally and analytically report perfect and broadband absorption for audible sound, by the mechanism of critical coupling, with a sub-wavelength multi-resonant scatterer (SMRS) made of a plate-resonator/closed waveguide structure. In order to introduce the role of the key parameters, we first present the case of a single resonant scatterer (SRS) made of a Helmholtz resonator/closed waveguide structure. In both cases the controlled balance between the energy leakage of the several resonances and the inherent losses of the system leads to perfect absorption peaks. In the case of the SMRS we show that systems with large inherent losses can be critically coupled using resonances with large leakage. In particular, we show that in the SMRS system, with a thickness of λ/12 and diameter of λ/7, several perfect absorption peaks overlap to produce absorption bigger than 93% for frequencies that extend over a factor of 2 in audible frequencies. The reported concepts and methodology provide guidelines for the design of broadband perfect absorbers which could contribute to solve the major issue of noise reduction. PMID:26781863
Development and Operation of a Modern Information Portal for the ISS Medical Groups
NASA Technical Reports Server (NTRS)
Damann, V.; Johnson, MaGee; Sargsyan, Ashot; McDonald, P. Vernon; Armstrong, C.; Scheer, M.; Duncan, J. Michael
2007-01-01
This viewgraph presentation begins with a review of some of the problems inherent in running medical services for the International Space Station. Part of the solution to these problems is the development of an information portal for the ISS medical groups. The presentation shows the tools that have been developed to support collaboration among the medical services, the portal's security system, and the capabilities of the portal.
NASA Technical Reports Server (NTRS)
Dominick, Wayne D. (Editor); Farooq, Mohammad U.
1986-01-01
The definition of proposed research addressing the development and validation of a methodology for the design and evaluation of user interfaces for interactive information systems is given. The major objectives of this research are: the development of a comprehensive, objective, and generalizable methodology for the design and evaluation of user interfaces for information systems; the development of equations and/or analytical models to characterize user behavior and the performance of a designed interface; the design of a prototype system for the development and administration of user interfaces; and the design and use of controlled experiments to support the research and test/validate the proposed methodology. The proposed design methodology views the user interface as a virtual machine composed of three layers: an interactive layer, a dialogue manager layer, and an application interface layer. A command language model of user-system interactions is presented because of its inherent simplicity and structured approach based on interaction events. All interaction events have a common structure based on common generic elements necessary for a successful dialogue. It is shown that, using this model, various types of interfaces could be designed and implemented to accommodate various categories of users. The implementation methodology is discussed in terms of how to store and organize the information.
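A minimal sketch of the three-layer virtual machine may help fix ideas; the class and field names below are illustrative, not taken from the proposal. Every dialogue step is packaged as a structured interaction event, and the logged events are what would feed the performance equations mentioned above.

```python
from dataclasses import dataclass

@dataclass
class InteractionEvent:
    prompt: str       # what the system asked or displayed
    response: str     # what the user entered
    status: str       # outcome of handling the event

class ApplicationInterface:          # wraps the underlying information system
    def run(self, command: str) -> str:
        return f"executed: {command}"

class DialogueManager:               # maps user input to application commands
    def __init__(self, app: ApplicationInterface):
        self.app = app
    def handle(self, prompt: str, response: str) -> InteractionEvent:
        if not response.strip():
            return InteractionEvent(prompt, response, "error: empty input")
        return InteractionEvent(prompt, response, self.app.run(response))

class InteractiveLayer:              # terminal-style front end
    def __init__(self, manager: DialogueManager):
        self.manager = manager
        self.log: list[InteractionEvent] = []   # events support evaluation
    def step(self, prompt: str, user_input: str) -> str:
        event = self.manager.handle(prompt, user_input)
        self.log.append(event)                  # per-event performance data
        return event.status

ui = InteractiveLayer(DialogueManager(ApplicationInterface()))
print(ui.step("> ", "search author=Dominick"))
```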
Comparing healthcare outcomes.
Orchard, C.
1994-01-01
Governments are increasingly concerned to compare the quality and effectiveness of healthcare interventions but find this a complex matter. Crude hospital statistics can be dangerously misleading and need adjusting for case mix, but identifying and weighting the patient characteristics which affect prognosis are problematical for conceptual, methodological, and practical reasons. These include the inherently uncertain nature of prognosis itself and the practical difficulties of collecting and quantifying data on the outcomes of interest for specific healthcare interventions and known risk factors such as severity. PMID:8019285
EHV systems technology - A look at the principles and current status. [Electric and Hybrid Vehicle
NASA Technical Reports Server (NTRS)
Kurtz, D. W.; Levin, R. R.
1983-01-01
An examination of the basic principles and practices of systems engineering is undertaken in the context of their application to the component and subsystem technologies involved in electric and hybrid vehicle (EHV) development. The limitations of purely electric vehicles are contrasted with hybrid, heat engine-incorporating vehicle technology, which is inherently more versatile. A hybrid vehicle concept assessment methodology is presented which employs current technology and yet fully satisfies U.S. Department of Energy petroleum displacement goals.
Conjugate gradient based projection - A new explicit methodology for frictional contact
NASA Technical Reports Server (NTRS)
Tamma, Kumar K.; Li, Maocheng; Sha, Desong
1993-01-01
With special attention towards the applicability to parallel computation or vectorization, a new and effective explicit approach for linear complementary formulations involving a conjugate gradient based projection methodology is proposed in this study for contact problems with Coulomb friction. The overall objectives are focussed towards providing an explicit methodology of computation for the complete contact problem with friction. In this regard, the primary idea for solving the linear complementary formulations stems from an established search direction which is projected to a feasible region determined by the non-negative constraint condition; this direction is then applied to the Fletcher-Reeves conjugate gradient method resulting in a powerful explicit methodology which possesses high accuracy, excellent convergence characteristics, fast computational speed and is relatively simple to implement for contact problems involving Coulomb friction.
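A compact sketch of the idea, under stated simplifications: the code below minimizes a convex quadratic subject to the non-negativity constraint by combining Fletcher-Reeves directions with projection onto the feasible region, restarting whenever the projection changes the active set (projection breaks conjugacy). It illustrates the projection strategy only, not the paper's full frictional-contact formulation.

```python
import numpy as np

def projected_fletcher_reeves(A, b, x0, tol=1e-10, max_iter=500):
    """Minimize 0.5 x^T A x - b^T x subject to x >= 0 (A symmetric
    positive definite), via projected Fletcher-Reeves directions."""
    x = np.maximum(x0, 0.0)
    g = A @ x - b                          # gradient of the quadratic
    d = -g
    for _ in range(max_iter):
        Ad = A @ d
        dAd = d @ Ad
        if dAd <= 0.0:
            break
        alpha = -(g @ d) / dAd             # exact line search for quadratics
        trial = x + alpha * d
        x_new = np.maximum(trial, 0.0)     # project onto the feasible region
        g_new = A @ x_new - b
        # complementarity residual: x >= 0, g >= 0, x * g = 0 at the solution
        if np.linalg.norm(np.minimum(x_new, g_new)) < tol:
            return x_new
        if np.any(trial < 0.0):            # projection was active: restart
            d = -g_new
        else:
            beta = (g_new @ g_new) / (g @ g)   # Fletcher-Reeves update
            d = -g_new + beta * d
        x, g = x_new, g_new
    return x

A = np.array([[4.0, 1.0], [1.0, 3.0]])
b = np.array([1.0, -2.0])
print(projected_fletcher_reeves(A, b, np.zeros(2)))   # -> [0.25, 0.0]
```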
Spinal Cord Injury-Induced Dysautonomia via Plasticity in Paravertebral Sympathetic Postganglionic
2017-10-01
their near anatomical inaccessibility. We have solved the accessibility problem with a strategic methodological advance. We will determine the extent to which paravertebral
Human Prenatal Effects: Methodological Problems and Some Suggested Solutions
ERIC Educational Resources Information Center
Copans, Stuart A.
1974-01-01
Briefly reviews the relevant literature on human prenatal effects, describes some of the possible designs for such studies, and discusses some of the methodological problem areas: sample choice, measurement of prenatal variables, monitoring of labor and delivery, and neonatal assessment. (CS)
Does finite-temperature decoding deliver better optima for noisy Hamiltonians?
NASA Astrophysics Data System (ADS)
Ochoa, Andrew J.; Nishimura, Kohji; Nishimori, Hidetoshi; Katzgraber, Helmut G.
The minimization of an Ising spin-glass Hamiltonian is an NP-hard problem. Because many problems across disciplines can be mapped onto this class of Hamiltonian, novel efficient computing techniques are highly sought after. The recent development of quantum annealing machines promises to minimize these difficult problems more efficiently. However, the inherent noise found in these analog devices makes the minimization procedure difficult. While the machine might be working correctly, it might be minimizing a different Hamiltonian due to the inherent noise. This means that, in general, the ground-state configuration that correctly minimizes a noisy Hamiltonian might not minimize the noise-less Hamiltonian. Inspired by rigorous results that the energy of the noise-less ground-state configuration is equal to the expectation value of the energy of the noisy Hamiltonian at the (nonzero) Nishimori temperature [J. Phys. Soc. Jpn. 62 (1993)], we numerically study the decoding probability of the original noise-less ground state with noisy Hamiltonians in two space dimensions, as well as the D-Wave Inc. Chimera topology. Our results suggest that thermal fluctuations might be beneficial during the optimization process in analog quantum annealing machines.
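As a toy illustration of the decoding question being studied (not the authors' setup, which used two-dimensional lattices and the Chimera topology), the sketch below brute-forces the ground state of a small Ising chain with and without Gaussian coupling noise and checks whether the noisy ground state still decodes the noise-less one:

```python
import itertools
import numpy as np

rng = np.random.default_rng(1)
n = 10                                   # small chain so brute force works
J = rng.choice([-1.0, 1.0], size=n - 1)  # noise-less +/-J couplings
J_noisy = J + rng.normal(0.0, 0.3, size=n - 1)   # analog coupling noise

def energy(spins, couplings):
    # H = -sum_i J_i s_i s_{i+1} on an open chain (a sketch, not Chimera)
    return -np.sum(couplings * spins[:-1] * spins[1:])

def ground_state(couplings):
    best, best_e = None, np.inf
    for s in itertools.product([-1, 1], repeat=n):
        e = energy(np.array(s), couplings)
        if e < best_e:
            best, best_e = np.array(s), e
    return best

gs_clean = ground_state(J)
gs_noisy = ground_state(J_noisy)
# Global spin flip leaves the energy unchanged, so check both signs.
decoded = (np.array_equal(gs_clean, gs_noisy)
           or np.array_equal(gs_clean, -gs_noisy))
print("noisy ground state decodes the noise-less one:", decoded)
```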
Zhang, Yong-Feng; Chiang, Hsiao-Dong
2017-09-01
A novel three-stage methodology, termed the "consensus-based particle swarm optimization (PSO)-assisted Trust-Tech methodology," to find global optimal solutions for nonlinear optimization problems is presented. It is composed of Trust-Tech methods, consensus-based PSO, and local optimization methods that are integrated to compute a set of high-quality local optimal solutions that can contain the global optimal solution. The proposed methodology compares very favorably with several recently developed PSO algorithms based on a set of small-dimension benchmark optimization problems and 20 large-dimension test functions from the CEC 2010 competition. The analytical basis for the proposed methodology is also provided. Experimental results demonstrate that the proposed methodology can rapidly obtain high-quality optimal solutions that can contain the global optimal solution. The scalability of the proposed methodology is promising.
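The Trust-Tech stage itself is not sketched here; as a hedged stand-in for the overall three-stage flavor, the following example runs a plain global-best PSO and then polishes the most promising particles with a local optimizer (all parameter values are illustrative, and the consensus and Trust-Tech components of the actual methodology are omitted):

```python
import numpy as np
from scipy.optimize import minimize

def rastrigin(x):
    return 10 * len(x) + np.sum(x**2 - 10 * np.cos(2 * np.pi * x))

def pso(f, dim=5, n_particles=30, iters=200, seed=0):
    """Plain global-best PSO; returns personal bests and their values."""
    rng = np.random.default_rng(seed)
    x = rng.uniform(-5.12, 5.12, (n_particles, dim))
    v = np.zeros_like(x)
    pbest, pbest_f = x.copy(), np.array([f(p) for p in x])
    gbest = pbest[pbest_f.argmin()].copy()
    for _ in range(iters):
        r1, r2 = rng.random(x.shape), rng.random(x.shape)
        v = 0.7 * v + 1.5 * r1 * (pbest - x) + 1.5 * r2 * (gbest - x)
        x = x + v
        fx = np.array([f(p) for p in x])
        improved = fx < pbest_f
        pbest[improved], pbest_f[improved] = x[improved], fx[improved]
        gbest = pbest[pbest_f.argmin()].copy()
    return pbest, pbest_f

pbest, pbest_f = pso(rastrigin)
# Stand-in for stages 2-3: refine the most promising particles locally.
seeds = pbest[np.argsort(pbest_f)[:5]]
results = [minimize(rastrigin, s, method="BFGS") for s in seeds]
best = min(results, key=lambda r: r.fun)
print(best.fun, best.x.round(3))
```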
Overcoming an obstacle in expanding a UMLS semantic type extent.
Chen, Yan; Gu, Huanying; Perl, Yehoshua; Geller, James
2012-02-01
This paper strives to overcome a major problem encountered by a previous expansion methodology for discovering concepts highly likely to be missing a specific semantic type assignment in the UMLS. This methodology is the basis for an algorithm that presents the discovered concepts to a human auditor for review and possible correction. We analyzed the problem of the previous expansion methodology and discovered that it was due to an obstacle constituted by one or more concepts assigned the UMLS Semantic Network semantic type Classification. A new methodology was designed that bypasses such an obstacle without a combinatorial explosion in the number of concepts presented to the human auditor for review. The new expansion methodology with obstacle avoidance was tested with the semantic type Experimental Model of Disease and found over 500 concepts missed by the previous methodology that are in need of this semantic type assignment. Furthermore, other semantic types suffering from the same major problem were discovered, indicating that the methodology is of more general applicability. The algorithmic discovery of concepts that are likely missing a semantic type assignment is possible even in the face of obstacles, without an explosion in the number of processed concepts. Copyright © 2011 Elsevier Inc. All rights reserved.
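On a toy graph (not real UMLS data), the bypass idea can be sketched as a traversal that passes through obstacle concepts typed Classification without ever presenting them to the auditor; all concept names below are hypothetical:

```python
from collections import deque

# Toy concept hierarchy (child -> parents); names are invented.
parents = {
    "ConceptB": ["ConceptA"],
    "ConceptA": ["Classification_Node"],
    "Classification_Node": ["EMD_Root"],
}
semantic_types = {
    "EMD_Root": {"Experimental Model of Disease"},
    "Classification_Node": {"Classification"},
}

def expand_bypassing_obstacles(seed, target="Experimental Model of Disease"):
    """Traverse the hierarchy from a seed; concepts typed Classification act
    as obstacles that are traversed (bypassed) but never reported, and only
    concepts not already carrying the target type reach the auditor."""
    report, seen, queue = [], {seed}, deque([seed])
    while queue:
        c = queue.popleft()
        types = semantic_types.get(c, set())
        if c != seed and "Classification" not in types and target not in types:
            report.append(c)          # candidate missing the assignment
        for p in parents.get(c, []):  # keep expanding, even through obstacles
            if p not in seen:
                seen.add(p)
                queue.append(p)
    return report

print(expand_bypassing_obstacles("ConceptB"))   # -> ['ConceptA']
```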
The problems inherent in teaching technical writing and report writing to native Americans
NASA Technical Reports Server (NTRS)
Zukowski/faust, J.
1981-01-01
Teaching technical writing to Native Americans contending with a second language and culture is addressed. Learning difficulties arising from differences between native and acquired language and cultural systems are examined. Compartmentalized teaching, which presents the ideals of technical writing in minimal units, and skills development are considered. Rhetorical problems treated include logic of arrangement, selection of support and scope of detail, and time and space. Specific problems selected include the concept of promptness, the contextualization of purpose, interpersonal relationships, wordiness, mixture of registers, and the problem of abstracting. Four inductive procedures for students having writing and perception problems are included. Four sample exercises and a bibliography of 13 references are also included.
Mind over matter? I: philosophical aspects of the mind-brain problem.
Schimmel, P
2001-08-01
To conceptualize the essence of the mind-body or mind-brain problem as one of metaphysics rather than science, and to propose a formulation of the problem in the context of current scientific knowledge and its limitations. The background and conceptual parameters of the mind-body problem are delineated, and the limitations of brain research in formulating a solution identified. The problem is reformulated and stated in terms of two propositions. These constitute a 'double aspect theory'. The problem appears to arise as a consequence of the conceptual limitations of the human mind, and hence remains essentially a metaphysical one. A 'double aspect theory' recognizes the essential unity of mind and brain, while remaining consistent with the dualism inherent in human experience.
Teaching Old French Literature to Undergraduates.
ERIC Educational Resources Information Center
Stewart, Harry E.
As a prelude to graduate-level work for French majors, medieval studies are proposed for undergraduate students. Problems inherent in the establishment of the undergraduate program are identified with some suggested solutions. Concepts related to historical grammar, teaching materials, literature, and linguistics are developed. A logical course…
Micromechanical Behavior and Modelling of Granular Soil
1989-07-01
DiMaggio and Sandler 1971, Baladi and Rohani 1979). The problem of inherent (structural) anisotropy - especially important for anisotropically...Republic of Germany. Baladi, G.Y. and Rohani, B. (1979), "Elastic-Plastic Model for Saturated Sand," Journal of the Geotechnical Engineering Division, ASCE
International Education and Multicultural Interdisciplinary Team Training.
ERIC Educational Resources Information Center
Oomkes, Frank R.
Problems inherent in international rural development cooperative efforts are those caused by the international mix of participants, difficulties in intercultural communication, and cultural biases of western teaching contents and methods. A 6-month program established in 1982 at the International Agricultural Center at Wageningen, Netherlands,…
Native Employment in a Frontier Region.
ERIC Educational Resources Information Center
Farnsworth, J. M.
By employing southern strategies and preconceptions to develop the north, southern Canadians have complicated northern Canadian development problems. Assuming that the only recognizable work is "paid" work and that welfare recipients do not want to work, southern Canadians have failed to recognize the inherent relationship between…
Departments: Problems and Alternatives.
ERIC Educational Resources Information Center
Faricy, William H.
This paper deals with the existing inter- and intra-departmental phenomena. Current practices of forming departments and aggregating departments into colleges are investigated and are shown to have inherent difficulties which inhibit the performance of a university. These difficulties are illustrated with empirical data from a representative set…
ERIC Educational Resources Information Center
Murphy, Linda; Della Corte, Suzanne
1987-01-01
The newsletter's main article focuses on hyperactivity and attention deficit disorder. The causes of hyperactivity, which affects 3-5 percent of all children, are elusive but may include neurological immaturity, inherent genetic problems, or fetal exposure to harmful substances. Patterns of behavior that typify a hyperactive child include a short…
Evolutionary Multiobjective Design Targeting a Field Programmable Transistor Array
NASA Technical Reports Server (NTRS)
Aguirre, Arturo Hernandez; Zebulum, Ricardo S.; Coello, Carlos Coello
2004-01-01
This paper introduces the ISPAES algorithm for circuit design targeting a Field Programmable Transistor Array (FPTA). The use of evolutionary algorithms is common in circuit design problems, where a single fitness function drives the evolution process. Frequently, the design problem is subject to several goals or operating constraints; thus, designing a suitable fitness function that captures all requirements becomes an issue. Such a problem is amenable to multi-objective optimization; however, evolutionary algorithms lack an inherent mechanism for constraint handling. This paper introduces ISPAES, an evolutionary optimization algorithm enhanced with a constraint-handling technique. Several design problems targeting a FPTA show the potential of our approach.
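ISPAES's actual constraint-handling technique is not detailed in the abstract; the sketch below illustrates the general idea of bolting a constraint-handling rule onto an evolutionary loop, using Deb-style feasibility rules on a toy constrained problem (both the rules and the problem are stand-ins, not the paper's method):

```python
import numpy as np

rng = np.random.default_rng(0)

def objective(x):                       # toy stand-in for a circuit metric
    return (x[0] - 1.0) ** 2 + (x[1] - 2.0) ** 2

def violation(x):                       # 0 means feasible
    return max(0.0, x[0] + x[1] - 2.5)  # toy operating constraint

def better(a, b):
    """Deb-style feasibility rules: feasible beats infeasible; two feasible
    points compare by objective; two infeasible points by violation."""
    va, vb = violation(a), violation(b)
    if va == 0.0 and vb == 0.0:
        return objective(a) < objective(b)
    if (va == 0.0) != (vb == 0.0):
        return va == 0.0
    return va < vb

pop = rng.uniform(-3, 3, (40, 2))
for _ in range(300):
    children = pop + rng.normal(0, 0.1, pop.shape)   # Gaussian mutation
    pop = np.array([c if better(c, p) else p for c, p in zip(children, pop)])

best = pop[0]
for x in pop[1:]:
    if better(x, best):
        best = x
print(best.round(3))   # approaches the constrained optimum near (0.75, 1.75)
```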
E-therapy for mental health problems: a systematic review.
Postel, Marloes G; de Haan, Hein A; De Jong, Cor A J
2008-09-01
The widespread availability of the Internet offers opportunities for improving access to therapy for people with mental health problems. There is a seemingly infinite supply of Internet-based interventions available on the World Wide Web. The aim of the present study is to systematically assess the methodological quality of randomized controlled trials (RCTs) concerning e-therapy for mental health problems. Two reviewers independently assessed the methodological quality of the RCTs, based on a list of criteria for the methodological quality assessment as recommended by the Cochrane Back Review Group. The search yielded 14 papers that reported RCTs concerning e-therapy for mental-health problems. The methodological quality of studies included in this review was generally low. It is concluded that e-therapy may turn out to be an appropriate therapeutic entity, but the evidence needs to be more convincing. Recommendations are made concerning the method of reporting RCTs and the need to add some content items to an e-therapy study.
Fitting methods to paradigms: are ergonomics methods fit for systems thinking?
Salmon, Paul M; Walker, Guy H; M Read, Gemma J; Goode, Natassia; Stanton, Neville A
2017-02-01
The issues being tackled within ergonomics problem spaces are shifting. Although existing paradigms appear relevant for modern day systems, it is worth questioning whether our methods are. This paper asks whether the complexities of systems thinking, a currently ubiquitous ergonomics paradigm, are outpacing the capabilities of our methodological toolkit. This is achieved through examining the contemporary ergonomics problem space and the extent to which ergonomics methods can meet the challenges posed. Specifically, five key areas within the ergonomics paradigm of systems thinking are focused on: normal performance as a cause of accidents, accident prediction, system migration, systems concepts and ergonomics in design. The methods available for pursuing each line of inquiry are discussed, along with their ability to respond to key requirements. In doing so, a series of new methodological requirements and capabilities are identified. It is argued that further methodological development is required to provide researchers and practitioners with appropriate tools to explore both contemporary and future problems. Practitioner Summary: Ergonomics methods are the cornerstone of our discipline. This paper examines whether our current methodological toolkit is fit for purpose given the changing nature of ergonomics problems. The findings provide key research and practice requirements for methodological development.
Increasing accuracy in the assessment of motion sickness: A construct methodology
NASA Technical Reports Server (NTRS)
Stout, Cynthia S.; Cowings, Patricia S.
1993-01-01
The purpose is to introduce a new methodology that should improve the accuracy of the assessment of motion sickness. This construct methodology utilizes both subjective reports of motion sickness and objective measures of physiological correlates to assess motion sickness. Current techniques and methods used in the framework of a construct methodology are inadequate. Current assessment techniques for diagnosing motion sickness and space motion sickness are reviewed, and attention is called to the problems with the current methods. Further, principles of psychophysiology that when applied will probably resolve some of these problems are described in detail.
Problem Solving in Biology: A Methodology
ERIC Educational Resources Information Center
Wisehart, Gary; Mandell, Mark
2008-01-01
A methodology is described that teaches science process by combining informal logic and a heuristic for rating factual reliability. This system facilitates student hypothesis formation, testing, and evaluation of results. After problem solving with this scheme, students are asked to examine and evaluate arguments for the underlying principles of…
SOME POSSIBLE APPLICATIONS OF PROJECT OUTCOMES RESEARCH METHODOLOGY
Section I refers to the possibility of using the theory and methodology of Project Outcomes for problems of strategic information. It is felt that...purposes of assessing present and future organizational effectiveness. Section IV refers to the applications that our study may have for problems of
[Problem-based learning in cardiopulmonary resuscitation: basic life support].
Sardo, Pedro Miguel Garcez; Dal Sasso, Grace Terezinha Marcon
2008-12-01
Descriptive and exploratory study, aimed to develop an educational practice of Problem-Based Learning in CPR/BLS with 24 students in the third stage of the Nursing Undergraduate Course in a University in the Southern region of Brazil. The study used the PBL methodology, focused on problem situations of cardiopulmonary arrest, and was approved by the CONEP. The methodological strategies for data collection, such as participative observation and questionnaires to evaluate the learning, the educational practices and their methodology, allowed for grouping the results in: students' expectations; group activities; individual activities; practical activities; evaluation of the meetings and their methodology. The study showed that PBL allows the educator to evaluate the academic learning process in several dimensions, functioning as a motivating factor for both the educator and the student, because it allows the theoretical-practical integration in an integrated learning process.
Wightman, Jade; Julio, Flávia; Virués-Ortega, Javier
2014-05-01
Experimental functional analysis is an assessment methodology to identify the environmental factors that maintain problem behavior in individuals with developmental disabilities and in other populations. Functional analysis provides the basis for the development of reinforcement-based approaches to treatment. This article reviews the procedures, validity, and clinical implementation of the methodological variations of functional analysis and function-based interventions. We present six variations of functional analysis methodology in addition to the typical functional analysis: brief functional analysis, single-function tests, latency-based functional analysis, functional analysis of precursors, and trial-based functional analysis. We also present the three general categories of function-based interventions: extinction, antecedent manipulation, and differential reinforcement. Functional analysis methodology is a valid and efficient approach to the assessment of problem behavior and the selection of treatment strategies.
A software tool for dataflow graph scheduling
NASA Technical Reports Server (NTRS)
Jones, Robert L., III
1994-01-01
A graph-theoretic design process and software tool is presented for selecting a multiprocessing scheduling solution for a class of computational problems. The problems of interest are those that can be described using a dataflow graph and are intended to be executed repetitively on multiple processors. The dataflow paradigm is very useful in exposing the parallelism inherent in algorithms. It provides a graphical and mathematical model which describes a partial ordering of algorithm tasks based on data precedence.
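As a rough sketch of the kind of scheduling such a tool automates (not the paper's actual algorithm), the following list scheduler assigns dataflow tasks to processors as their data predecessors complete:

```python
def list_schedule(tasks, deps, durations, n_procs):
    """Minimal list-scheduling sketch: a task becomes ready when all of its
    data predecessors finish; each ready task runs on the earliest-free
    processor. Repetitive (periodic) execution is not modeled here."""
    indeg = {t: 0 for t in tasks}
    succs = {t: [] for t in tasks}
    for a, b in deps:                     # edge a -> b: b consumes a's output
        indeg[b] += 1
        succs[a].append(b)
    finish, ready = {}, [t for t in tasks if indeg[t] == 0]
    procs = [0.0] * n_procs               # next-free time of each processor
    while ready:
        t = ready.pop(0)
        p = min(range(n_procs), key=lambda i: procs[i])
        data_ready = max((finish[a] for a, b in deps if b == t), default=0.0)
        finish[t] = max(procs[p], data_ready) + durations[t]
        procs[p] = finish[t]
        for s in succs[t]:
            indeg[s] -= 1
            if indeg[s] == 0:
                ready.append(s)
    return finish

deps = [("A", "C"), ("B", "C"), ("C", "D")]
durs = {"A": 2.0, "B": 3.0, "C": 1.0, "D": 2.0}
print(list_schedule(["A", "B", "C", "D"], deps, durs, n_procs=2))
```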
Methodological Problems on the Way to Integrative Human Neuroscience.
Kotchoubey, Boris; Tretter, Felix; Braun, Hans A; Buchheim, Thomas; Draguhn, Andreas; Fuchs, Thomas; Hasler, Felix; Hastedt, Heiner; Hinterberger, Thilo; Northoff, Georg; Rentschler, Ingo; Schleim, Stephan; Sellmaier, Stephan; Tebartz Van Elst, Ludger; Tschacher, Wolfgang
2016-01-01
Neuroscience is a multidisciplinary effort to understand the structures and functions of the brain and brain-mind relations. This effort results in an increasing amount of data, generated by sophisticated technologies. However, these data enhance our descriptive knowledge, rather than improve our understanding of brain functions. This is caused by methodological gaps both within and between subdisciplines constituting neuroscience, and the atomistic approach that limits the study of macro- and mesoscopic issues. Whole-brain measurement technologies do not resolve these issues, but rather aggravate them by the complexity problem. The present article is devoted to methodological and epistemic problems that obstruct the development of human neuroscience. We neither discuss ontological questions (e.g., the nature of the mind) nor review data, except when it is necessary to demonstrate a methodological issue. As regards intradisciplinary methodological problems, we concentrate on those within neurobiology (e.g., the gap between electrical and chemical approaches to neurophysiological processes) and psychology (missing theoretical concepts). As regards interdisciplinary problems, we suggest that core disciplines of neuroscience can be integrated using systemic concepts that also entail human-environment relations. We emphasize the necessity of a meta-discussion that should entail a closer cooperation with philosophy as a discipline of systematic reflection. The atomistic reduction should be complemented by the explicit consideration of the embodiedness of the brain and the embeddedness of humans. The discussion is aimed at the development of an explicit methodology of integrative human neuroscience, which will not only link different fields and levels, but also help in understanding clinical phenomena.
Case study of a problem-based learning course of physics in a telecommunications engineering degree
NASA Astrophysics Data System (ADS)
Macho-Stadler, Erica; Jesús Elejalde-García, Maria
2013-08-01
Active learning methods can be appropriate in engineering, as their methodology promotes meta-cognition, independent learning and problem-solving skills. Problem-based learning is the educational process by which problem-solving activities and instructor's guidance facilitate learning. Its key characteristic involves posing a 'concrete problem' to initiate the learning process, generally implemented by small groups of students. Many universities have developed and used active methodologies successfully in the teaching-learning process. During the past few years, the University of the Basque Country has promoted the use of active methodologies through several teacher training programmes. In this paper, we describe and analyse the results of the educational experience using the problem-based learning (PBL) method in a physics course for undergraduates enrolled in the technical telecommunications engineering degree programme. From an instructors' perspective, PBL strengths include better student attitude in class and increased instructor-student and student-student interactions. The students emphasised developing teamwork and communication skills in a good learning atmosphere as positive aspects.
Spray drift and off-target loss reduction with a precision air-assisted sprayer
USDA-ARS?s Scientific Manuscript database
Spray drift and off-target losses are inherent problems of conventional air-assisted sprayers. Their low efficiencies cause environmental pollution, resulting in public anxiety. A new drift reduction technology incorporating laser scanning capabilities with a variable-rate air-assisted sprayer w...
Intermetropolitan Migration and the Rise of the Sunbelt.
ERIC Educational Resources Information Center
Watkins, Alfred J.
1978-01-01
Examines the historical patterns and components of the population flows that have contributed to recent metropolitan development in the sunbelt. Examines potential problems and fundamental contradictions inherent in both the continuation and cessation of the population influx. Suggests that these demographic and economic trends may seriously…
The Fifth Skill: Hearing the Unspoken Language.
ERIC Educational Resources Information Center
Vilarrubla, Montserrat
Aspects of nonverbal communication are examined as they relate to business communication and to the instruction of business language. Relevant literature on nonverbal communication is reviewed, focusing on gestures and body language and the problems inherent in interpretation of their meaning. Suggestions for educators include: training students…
Electronic Banking and the Death of Privacy.
ERIC Educational Resources Information Center
McLuhan, Marshall; Powers, Bruce
1981-01-01
Describes some of the problems for the individual inherent in the rapidly expanding computerized field of credit and banking. Proposes that electronic fund transfer systems could virtually replace the use of cash. Warns that while such systems offer wide advantages to business, they threaten the individual's privacy. (JMF)
Why Is It Harder to Lead an English Department than to Be CEO of IBM?
ERIC Educational Resources Information Center
Booth, Wayne C.
1994-01-01
Describes the problems inherent in finding competent leadership for today's college English departments. Outlines the basic kinds of traits and characteristics that would mark both ideal leaders and followers in the academy. Discusses the different kinds of leaders. (HB)
Personal Transferable Skills in Higher Education: The Problems of Implementing Good Practice.
ERIC Educational Resources Information Center
Drummond, Ian; Nixon, Iain; Wiltshire, John
1998-01-01
Promotion of effective development of personal transferable skills has had limited success in British higher education. Difficulties inherent in implementing established good practice include institutional inertia, issues of academic freedom, resources, high levels of commitment to a particular discipline, and modularization. (SK)
Searching for Accountability, Productivity, Etc., Etc. Accounting and Financial Reporting.
ERIC Educational Resources Information Center
Piotrowski, Craig
1988-01-01
School districts are being asked to provide expanded variety and quality of educational services with fewer tax dollars. Discusses the search for accountability, productivity, quality, and equity in education and alerts school business officials to the problems and conflicts inherent in such a search. (MLF)
ICCE Policy Statement on Network and Multiple Machine Software.
ERIC Educational Resources Information Center
Computing Teacher, 1983
1983-01-01
Issued to provide guidance for the resolution of problems inherent in providing and securing good educational software, this statement outlines responsibilities of educators, hardware vendors, and software developers/vendors. Sample policy statements for school districts and community colleges, suggested format for software licenses, and technical…
Discrimination against Black Students
ERIC Educational Resources Information Center
Aloud, Ashwaq; Alsulayyim, Maryam
2016-01-01
Discrimination is a structured way of abusing people based on racial differences, hence barring them from accessing wealth, political participation and engagement in many spheres of human life. Racism and discrimination are inherently rooted in the institutions of society; the problem has spread across many social segments of society, including…
ERIC Educational Resources Information Center
Kuntz, Aaron M.; Petrovic, John E.
2018-01-01
In this article we consider the material dimensions of schooling as constitutive of the possibilities inherent in "fixing" education. We begin by mapping out the problem of "fixing education," pointing to the necrophilic tendencies of contemporary education--a desire to kill what otherwise might be life-giving. In this sense,…
The Ethos, Habits, and Prerogatives of Professionalism.
ERIC Educational Resources Information Center
Rennie, Drummond
1991-01-01
Twelve cases in which physicians and medical journal editors have overstepped the bounds of their professions and abused professional privileges are presented. Many, when challenged, contended they did not see the problems inherent in the situations, suggesting the professions are not adequately defining and teaching professional ethical…
Design of Inhouse Automated Library Systems.
ERIC Educational Resources Information Center
Cortez, Edwin M.
1984-01-01
Examines six steps inherent to development of in-house automated library system: (1) problem definition, (2) requirement specifications, (3) analysis of alternatives and solutions, (4, 5) design and implementation of hardware and software, and (6) evaluation. Practical method for comparing and weighting options is illustrated and explained. A…
Layer Stripping Solutions of Inverse Seismic Problems.
1985-03-21
problems--more so than has generally been recognized. The subject of this thesis is the theoretical development of the . layer-stripping methodology , and...medium varies sharply at each interface, which would be expected to cause difficulties for the algorithm, since it was designed for a smoothy varying... methodology was applied in a novel way. The inverse problem considered in this chapter was that of reconstructing a layered medium from measurement of its
Researching Street Children: Methodological and Ethical Issues.
ERIC Educational Resources Information Center
Hutz, Claudio S.; And Others
This paper describes the ethical and methodological problems associated with studying prosocial moral reasoning of street children and children of low and high SES living with their families, and problems associated with studying sexual attitudes and behavior of street children and their knowledge of sexually transmitted diseases, especially AIDS.…
Problem-Based Learning: Lessons for Administrators, Educators and Learners
ERIC Educational Resources Information Center
Yeo, Roland
2005-01-01
Purpose: The paper aims to explore the challenges of problem-based learning (PBL) as an unconventional teaching methodology experienced by a higher learning institute in Singapore. Design/methodology/approach: The exploratory study was conducted using focus group discussions and semi-structured interviews. Four groups of people were invited to…
The Speaker Respoken: Material Rhetoric as Feminist Methodology.
ERIC Educational Resources Information Center
Collins, Vicki Tolar
1999-01-01
Presents a methodology based on the concept of "material rhetoric" that can help scholars avoid problems as they reclaim women's historical texts. Defines material rhetoric and positions it theoretically in relation to other methodologies, including bibliographical studies, reception theory, and established feminist methodologies. Illustrates…
NASA Astrophysics Data System (ADS)
Lakshmi, A.; Faheema, A. G. J.; Deodhare, Dipti
2016-05-01
Pedestrian detection is a key problem in night vision processing, with dozens of applications that will positively impact the performance of autonomous systems. Despite significant progress, our study shows that the performance of state-of-the-art thermal image pedestrian detectors still has much room for improvement. The purpose of this paper is to overcome the challenges faced by thermal image pedestrian detectors, which employ intensity-based Region Of Interest (ROI) extraction followed by feature-based validation. The most striking disadvantage of the first module, ROI extraction, is the failed detection of cloth-insulated parts. To overcome this setback, this paper employs a region-growing algorithm tuned to the scale of the pedestrian. The statistics subtended by the pedestrian vary drastically with scale, and a deviation-from-normality approach facilitates scale detection. Further, the paper offers an adaptive mathematical threshold to resolve the problem of subtracting the background while still extracting cloth-insulated parts. The inherent false positives of the ROI extraction module are limited by the choice of good features in the pedestrian validation step. One such feature is the curvelet feature, which has been used extensively in optical images but has as yet no reported results in thermal images; it is used here to arrive at a pedestrian detector with a reduced false positive rate. This work is the first venture to scrutinize the utility of curvelets for characterizing pedestrians in thermal images. An attempt has also been made to improve the speed of the curvelet transform computation. The classification task is realized with the well-known methodology of Support Vector Machines (SVMs). The proposed method is substantiated with qualified evaluation methodologies that permit probing and informative comparisons across state-of-the-art features, including deep learning methods, on six standard and in-house databases. With reference to deep learning, our algorithm exhibits comparable performance; more importantly, it has significantly lower compute and memory requirements, making it more relevant for deployment in resource-constrained platforms with significant size, weight, and power constraints.
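The validation stage can be caricatured in a few lines: placeholder vectors stand in for curvelet descriptors, and an RBF-kernel SVM separates pedestrian from background ROIs (synthetic data; the real pipeline would extract curvelet features from thermal-image windows):

```python
import numpy as np
from sklearn.pipeline import make_pipeline
from sklearn.preprocessing import StandardScaler
from sklearn.svm import SVC

rng = np.random.default_rng(0)

# Synthetic stand-ins for curvelet descriptors of candidate ROIs.
X_pos = rng.normal(1.0, 1.0, (200, 64))   # pedestrian ROIs
X_neg = rng.normal(0.0, 1.0, (200, 64))   # background ROIs
X = np.vstack([X_pos, X_neg])
y = np.array([1] * 200 + [0] * 200)

# Validation stage: the SVM filters false positives passed on by the
# intensity-based ROI extraction stage.
clf = make_pipeline(StandardScaler(), SVC(kernel="rbf", C=1.0, gamma="scale"))
clf.fit(X, y)
print("training accuracy:", clf.score(X, y))
```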
DOE Office of Scientific and Technical Information (OSTI.GOV)
al-Saffar, Sinan; Joslyn, Cliff A.; Chappell, Alan R.
As semantic datasets grow to be very large and divergent, there is a need to identify and exploit their inherent semantic structure for discovery and optimization. Towards that end, we present here a novel methodology to identify the semantic structures inherent in an arbitrary semantic graph dataset. We first present the concept of an extant ontology as a statistical description of the semantic relations present amongst the typed entities modeled in the graph. This serves as a model of the underlying semantic structure to aid in discovery and visualization. We then describe a method of ontological scaling in which the ontology is employed as a hierarchical scaling filter to infer different resolution levels at which the graph structures are to be viewed or analyzed. We illustrate these methods on three large and publicly available semantic datasets containing more than one billion edges each. Keywords: Semantic Web; Visualization; Ontology; Multi-resolution Data Mining.
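At its core, the extant-ontology idea reduces to tallying the typed relations actually present in the instance graph; a minimal sketch on a toy triple store (the data and type map are invented):

```python
from collections import Counter

# Toy RDF-like edge list: (subject, predicate, object), plus a type map.
triples = [
    ("alice", "authored", "paper1"),
    ("bob", "authored", "paper2"),
    ("paper1", "cites", "paper2"),
]
rdf_type = {"alice": "Person", "bob": "Person",
            "paper1": "Article", "paper2": "Article"}

# The "extant ontology" as a statistical description: counts of the typed
# relations actually instantiated in the graph.
extant = Counter((rdf_type[s], p, rdf_type[o]) for s, p, o in triples)
for (st, p, ot), n in extant.items():
    print(f"{st} --{p}--> {ot}: {n} edge(s)")
```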
Polcari, J.
2013-08-16
The signal processing concept of signal-to-noise ratio (SNR), in its role as a performance measure, is recast within the more general context of information theory, leading to a series of useful insights. Establishing generalized SNR (GSNR) as a rigorous information theoretic measure inherent in any set of observations significantly strengthens its quantitative performance pedigree while simultaneously providing a specific definition under general conditions. This directly leads to consideration of the log likelihood ratio (LLR): first, as the simplest possible information-preserving transformation (i.e., signal processing algorithm) and subsequently, as an absolute, comparable measure of information for any specific observation exemplar. Furthermore, the information accounting methodology that results permits practical use of both GSNR and LLR as diagnostic scalar performance measurements, directly comparable across alternative system/algorithm designs, applicable at any tap point within any processing string, in a form that is also comparable with the inherent performance bounds due to information conservation.
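For the simple Gaussian detection model N(mu, sigma^2) versus N(0, sigma^2) (an illustrative special case, not the report's general development), the LLR is an affine map of each observation, and its mean shift between hypotheses recovers the classical SNR:

```python
import numpy as np

rng = np.random.default_rng(0)
mu, sigma = 1.0, 2.0      # signal mean and noise std (assumed model)
# Observations under "signal present" (H1).
x = mu + sigma * rng.standard_normal(100_000)

# LLR for N(mu, sigma^2) vs N(0, sigma^2): an information-preserving
# scalar transformation of each observation.
llr = (mu * x - mu**2 / 2) / sigma**2

# E[LLR | H1] - E[LLR | H0] = mu^2 / sigma^2, the classical SNR.
print("classical SNR:", mu**2 / sigma**2)
print("estimated LLR mean shift:", llr.mean() - (-mu**2 / (2 * sigma**2)))
```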
A methodology for the assessment of manned flight simulator fidelity
NASA Technical Reports Server (NTRS)
Hess, Ronald A.; Malsbury, Terry N.
1989-01-01
A relatively simple analytical methodology for assessing the fidelity of manned flight simulators for specific vehicles and tasks is offered. The methodology is based upon an application of a structural model of the human pilot, including motion cue effects. In particular, predicted pilot/vehicle dynamic characteristics are obtained with and without simulator limitations. A procedure for selecting model parameters can be implemented, given a probable pilot control strategy. In analyzing a pair of piloting tasks for which flight and simulation data are available, the methodology correctly predicted the existence of simulator fidelity problems. The methodology permitted the analytical evaluation of a change in simulator characteristics and indicated that a major source of the fidelity problems was a visual time delay in the simulation.
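A heavily simplified stand-in for this kind of analysis (the classical crossover model, not Hess's structural pilot model) shows how an added simulator visual delay erodes the predicted pilot/vehicle phase margin; all numbers are assumptions:

```python
import numpy as np

# Crossover-model approximation: the pilot/vehicle open loop behaves like
# L(s) = w_c * exp(-tau * s) / s near the crossover frequency w_c.
w_c = 2.0    # crossover frequency, rad/s (assumed)
tau = 0.3    # effective pilot/vehicle delay, s (assumed)

def phase_margin_deg(extra_delay_s):
    # |L(j w_c)| = 1; phase there is -90 deg (integrator) minus the delay term.
    return 90.0 - np.degrees(w_c * (tau + extra_delay_s))

for tau_d in (0.0, 0.05, 0.1, 0.2):       # candidate simulator visual delays
    print(f"visual delay {tau_d * 1000:4.0f} ms -> "
          f"phase margin {phase_margin_deg(tau_d):5.1f} deg")
```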
Validation of landsurface processes in the AMIP models
DOE Office of Scientific and Technical Information (OSTI.GOV)
Phillips, T J
The Atmospheric Model Intercomparison Project (AMIP) is a commonly accepted protocol for testing the performance of the world's atmospheric general circulation models (AGCMs) under common specifications of radiative forcings (in solar constant and carbon dioxide concentration) and observed ocean boundary conditions (Gates 1992, Gates et al. 1999). From the standpoint of landsurface specialists, the AMIP affords an opportunity to investigate the behaviors of a wide variety of land-surface schemes (LSS) that are coupled to their "native" AGCMs (Phillips et al. 1995, Phillips 1999). In principle, therefore, the AMIP permits consideration of an overarching question: "To what extent does an AGCM's performance in simulating continental climate depend on the representations of land-surface processes by the embedded LSS?" There are, of course, some formidable obstacles to satisfactorily addressing this question. First, there is the dilemma of how to effectively validate simulation performance, given the present dearth of global land-surface data sets. Even if this data problem were to be alleviated, some inherent methodological difficulties would remain: in the context of the AMIP, it is not possible to validate a given LSS per se, since the associated land-surface climate simulation is a product of the coupled AGCM/LSS system. Moreover, aside from the intrinsic differences in LSS across the AMIP models, the varied representations of land-surface characteristics (e.g. vegetation properties, surface albedos and roughnesses, etc.) and related variations in land-surface forcings further complicate such an attribution process. Nevertheless, it may be possible to develop validation methodologies/statistics that are sufficiently penetrating to reveal "signatures" of particular LSS representations (e.g. "bucket" vs more complex parameterizations of hydrology) in the AMIP land-surface simulations.
Eilam, David; Portugali, Juval; Blumenfeld-Lieberthal, Efrat
2012-01-01
Background We set out to solve two inherent problems in the study of animal spatial cognition: (i) what is a “place”? and (ii) can behaviors that are not revealed as differing by one methodology be revealed as different when analyzed using a different approach? Methodology We applied network analysis to scrutinize spatial behavior of rats tested in either a symmetrical or asymmetrical layout of 4, 8, or 12 objects placed along the perimeter of a round arena. We considered locations as the units of the network (nodes), and passes between locations as the links within the network. Principal Findings While there were only minor activity differences between rats tested in the symmetrical or asymmetrical object layouts, network analysis revealed substantial differences. Viewing ‘location’ as a cluster of stopping coordinates, the key locations (large clusters of stopping coordinates) were at the objects in both layouts with 4 objects. However, in the asymmetrical layout with 4 objects, additional key locations were spaced by the rats between the objects, forming symmetry among the key locations. It was as if the rats had behaviorally imposed symmetry on the physically asymmetrical environment. Based on a previous finding that wayfinding is easier in symmetrical environments, we suggest that when the physical attributes of the environment were not symmetrical, the rats established a symmetric layout of key locations, thereby acquiring a more legible environment despite its complex physical structure. Conclusions and Significance The present study adds a behavioral definition for “location”, a term that so far has been mostly discussed according to its physical attributes or neurobiological correlates (e.g., place and grid neurons). Moreover, network analysis enabled the assessment of the importance of a location, even when that location did not display any distinctive physical properties. PMID:22815808
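The nodes-from-stops construction can be sketched with synthetic tracking data: cluster stopping coordinates into "locations" and count passes between successive locations as directed links (DBSCAN and every parameter below are illustrative choices, not the authors' exact procedure):

```python
import numpy as np
from collections import Counter
from sklearn.cluster import DBSCAN

rng = np.random.default_rng(0)
# Synthetic stop coordinates near four points on an arena perimeter.
stops = np.vstack([rng.normal(c, 0.05, (30, 2))
                   for c in [(1, 0), (0, 1), (-1, 0), (0, -1)]])
order = rng.permutation(len(stops))      # temporal order of the stops

# Node = "location" = spatial cluster of stopping coordinates.
labels = DBSCAN(eps=0.2, min_samples=5).fit_predict(stops)

# Link = a pass between two successive stop locations (noise label -1 dropped).
visits = labels[order]
edges = Counter((a, b) for a, b in zip(visits, visits[1:])
                if a != b and a >= 0 and b >= 0)

# Key locations: weighted degree, i.e., total passes in and out.
degree = Counter()
for (a, b), w in edges.items():
    degree[a] += w
    degree[b] += w
print("key locations by passes:", degree.most_common())
```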
Predicting sample lifetimes in creep fracture of heterogeneous materials
NASA Astrophysics Data System (ADS)
Koivisto, Juha; Ovaska, Markus; Miksic, Amandine; Laurson, Lasse; Alava, Mikko J.
2016-08-01
Materials flow—under creep or constant loads—and, finally, fail. The prediction of sample lifetimes is an important and highly challenging problem because of the inherently heterogeneous nature of most materials that results in large sample-to-sample lifetime fluctuations, even under the same conditions. We study creep deformation of paper sheets as one heterogeneous material and thus show how to predict lifetimes of individual samples by exploiting the "universal" features in the sample-inherent creep curves, particularly the passage to an accelerating creep rate. Using simulations of a viscoelastic fiber bundle model, we illustrate how deformation localization controls the shape of the creep curve and thus the degree of lifetime predictability.
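A minimal sketch of lifetime prediction from a sample-inherent creep curve: locate the passage to accelerating creep (the minimum strain rate) and assume it occurs at a fixed fraction of the lifetime (the fraction 0.5 and the toy curve are assumptions, not the paper's fitted values):

```python
import numpy as np

def predict_lifetime(t, strain, ratio=0.5):
    """Locate the onset of accelerating (tertiary) creep, i.e. the minimum
    strain rate, and assume it occurs at fraction `ratio` of the lifetime."""
    rate = np.gradient(strain, t)
    k = np.argmin(rate)
    return t[k] / ratio

# Synthetic creep curve: decelerating, then accelerating strain rate,
# with failure near t = 1 (toy data only).
t = np.linspace(0.01, 1.0, 500)
strain = t**0.3 + 0.5 * t**8
print("predicted lifetime:", round(predict_lifetime(t, strain), 3))
```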
Economic competition in health care: a moral assessment.
Menzel, P T
1987-02-01
Economic competition threatens equity in the delivery of health care. This essay examines four of the various ways in which it does that: the reduction of charity care, increased patient cost-sharing, "cream-skimming" of healthy subscribers, and lack of information to patients about rationed care that is not prescribed. In all four cases, society must guard against distinct inequities and injustices, but also in all four, either the particular problem is not inherent in competition or, though inherent, it is not irremediable. Competition therefore cannot be finally morally accepted or rejected as an economic structure for delivering health care without knowing what among a wide range of supplementary things our society is actually going to do with it.
Leigh, Barbara C.; Stall, Ron
2008-01-01
Recent reports have suggested that the use of alcohol or drugs is related to sexual behavior that is high-risk for HIV infection. If substance use leads to unsafe sexual activity, understanding the dynamics of this relationship can contribute to research, preventive and education efforts to contain the spread of AIDS. In this paper, we review research on the relationship between substance use and high-risk sexual behavior. We then consider the inherent limitations of the research designs used to study this relationship, outline some methodological concerns including measurement and sampling issues, and comment on causal interpretations of correlational research findings. We end with a consideration of potential avenues for future research and a discussion of implications of these findings for current AIDS prevention policies. PMID:8256876
Simulation and Analysis of Converging Shock Wave Test Problems
DOE Office of Scientific and Technical Information (OSTI.GOV)
Ramsey, Scott D.; Shashkov, Mikhail J.
2012-06-21
Results and analysis pertaining to the simulation of the Guderley converging shock wave test problem (and associated code verification hydrodynamics test problems involving converging shock waves) in the LANL ASC radiation-hydrodynamics code xRAGE are presented. One-dimensional (1D) spherical and two-dimensional (2D) axi-symmetric geometric setups are utilized and evaluated in this study, as is an instantiation of the xRAGE adaptive mesh refinement capability. For the 2D simulations, a 'Surrogate Guderley' test problem is developed and used to obviate subtleties inherent to the true Guderley solution's initialization on a square grid, while still maintaining a high degree of fidelity to the original problem, and minimally straining the general credibility of associated analysis and conclusions.
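Code-verification studies of this kind typically report observed convergence orders against the exact solution; a generic sketch follows (the error values are illustrative, not xRAGE results):

```python
import numpy as np

def observed_order(errors, h):
    """Code-verification staple: observed convergence order from error norms
    on successively refined grids, p = log(E1/E2) / log(h1/h2)."""
    e, h = np.asarray(errors), np.asarray(h)
    return np.log(e[:-1] / e[1:]) / np.log(h[:-1] / h[1:])

# Hypothetical L1 errors of the computed shock position against the exact
# Guderley solution on grids refined by 2x (numbers are illustrative only).
h = [0.04, 0.02, 0.01]
errors = [3.2e-3, 1.7e-3, 8.8e-4]
print("observed order:", observed_order(errors, h).round(2))
```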
Against Teleology in an Honors Great Books Curriculum
ERIC Educational Resources Information Center
Harlan-Haughey, Sarah
2014-01-01
Chronologically presented courses that span centuries often catalyze unwitting buy-in to unexamined narratives of progress. While useful for helping students make connections between the human past, present, and future, Great Books honors curricula like the one used at the University of Maine have a few inherent problems that require careful…
ERIC Educational Resources Information Center
Kuo, Ben C. H.
2004-01-01
This article uses the case of Asians to highlight the collectivistic elements inherent in Asian help-seeking patterns, problem-solving styles, and stress-coping responses. It then offers recommendations for specific counseling strategies that complement Asians' collectivistic orientation. It should be noted that although generalized observations…
Teaching Analytical Chemistry to Pharmacy Students: A Combined, Iterative Approach
ERIC Educational Resources Information Center
Masania, Jinit; Grootveld, Martin; Wilson, Philippe B.
2018-01-01
Analytical chemistry has often been a difficult subject to teach in a classroom or lecture-based context. Numerous strategies for overcoming the inherently practical-based difficulties have been suggested, each with differing pedagogical theories. Here, we present a combined approach to tackling the problem of teaching analytical chemistry, with…
Secure E-Examination Systems Compared: Case Studies from Two Countries
ERIC Educational Resources Information Center
Fluck, Andrew; Adebayo, Olawale S.; Abdulhamid, Shafi'i M.
2017-01-01
Aim/Purpose: Electronic examinations have some inherent problems. Students have expressed negative opinions about electronic examinations (e-examinations) due to a fear of, or unfamiliarity with, the technology of assessment, and a lack of knowledge about the methods of e-examinations. Background: Electronic examinations are now a viable…
A Structural Approach to Unresolved Mourning in Single Parent Family Systems.
ERIC Educational Resources Information Center
Fulmer, Richard H.
1983-01-01
Considers the mother's depression as a special problem in therapy of single-parent families, resulting from unresolved mourning maintained by the family system. Offers reasons why the single-parent family's structure seems inherently vulnerable to unresolved mourning. Suggests techniques of Structural Family Therapy to facilitate mourning in such…
Genuine Inquiry: Widely Espoused Yet Rarely Enacted
ERIC Educational Resources Information Center
Le Fevre, Deidre M.; Robinson, Viviane M. J.; Sinnema, Claire E. L.
2015-01-01
The concept of inquiry is central to contemporary discussions of teacher and leader professional learning and problem solving in interpersonal contexts. However, while few would debate its value, there has been little discussion of the significant challenges inherent in engaging in genuine inquiry. In this article, we distinguish between genuine…
READING MACHINES FOR THE BLIND.
ERIC Educational Resources Information Center
FREIBERGER, HOWARD; MURPHY, EUGENE F.
AT A TECHNICAL SESSION, SIXTY-ONE PARTICIPANTS FROM THE FIELDS OF EDUCATION, INDUSTRY, GOVERNMENT, AND AGENCIES OF THE BLIND DISCUSSED RECENT DEVELOPMENTS IN THE PRODUCTION AND USE OF READING MACHINES WHICH PERMIT BLIND PERSONS GREATER INDEPENDENCE IN READING THE PRINTED PAGE. THEY ALSO EXPLORED PROBLEMS INHERENT IN THESE EFFORTS AND PROPOSED…
Political Education in the Former German Democratic Republic.
ERIC Educational Resources Information Center
Dumas, Wayne; Dumas, Alesia
1996-01-01
Investigates civic education curricular reform in the former German Democratic Republic (GDR). Discusses the problems inherent in reforming an entire educational system, from textbooks to teachers, originally designed for Marxist-Leninist purposes. Examines the German state educational structure and the role that the main political parties play in…
Marginality, Credibility, and Impression Management: The Asian Sociologist in America.
ERIC Educational Resources Information Center
Unnithan, N. Prabha
1988-01-01
Relates personal experiences of a sociologist of Asian origin in an effort to illustrate problems inherent in the process of becoming accepted as an academic sociologist. Identifies important themes of marginality, credibility, and impression management. Points out ways in which the Asian sociologists can go about achieving credibility. (KO)
The Laboratory School: Its Rise and Fall?
ERIC Educational Resources Information Center
Van Til, William
Inherent in the dream of the campus laboratory school were conflicting functions proposed for the school and conflicting perceptions on the part of the human beings involved. Students, supposedly representative, are more often more prosperous or bright or problem-prone than their age group in the general population. Parents, perceiving the school…
Approaches to Cross-Cultural Research in Art Education.
ERIC Educational Resources Information Center
Anderson, Frances E.
1979-01-01
The author defines the aims of cross-cultural research in art education and examines the problems inherent in such research, using as an illustration a summary chart of Child's cross-cultural studies of esthetic sensitivity. Emphasis is placed on the need for rigor in research design and execution. (SJL)
New Designs for Correctional Education and Training Programs.
ERIC Educational Resources Information Center
McCollum, Sylvia G.
1973-01-01
The challenge confronting creative educators concerned with using the correctional experience in positive ways is to structure an educational delivery system which takes into account the wide range of individual differences among people whose only common denominator is "serving time." Inherent is the problem of staff and public resistance to…
Cognitive Processes and Theory Development: A Reply to Spencer and Karmiloff-Smith.
ERIC Educational Resources Information Center
Gellatly, Angus
1997-01-01
Focuses on the role of enculturation in children's cognitive development by distinguishing between, and elaborating upon, three factors: (1) cultural context; (2) cognitive contents; and (3) cognitive processes. Suggests problems inherent in positing homologies between children's cognitive development and the historical development of scientific…
Changing Concepts in Forensics.
ERIC Educational Resources Information Center
Zarefsky, David
This paper discusses five theoretical concepts in general and two theoretical models in particular that are involved in forensics. The five concepts are: (1) causation, an inquiry into the reasons for ongoing processes or problems; (2) inherency, the division of a universe into its necessary features and its accidental features; (3) presumption, a…
Visual resources and the public: an empirical approach
Rachel Kaplan
1979-01-01
Visual resource management systems incorporate many assumptions about how people see the landscape. While these assumptions are not articulated, they nonetheless affect the decision process. Problems inherent in some of these assumptions are examined. Extensive research based on people's preference ratings of different settings provides insight into people's...
Ethical Implications of Technological Advances on Business Communication.
ERIC Educational Resources Information Center
Herschel, Richard T.; Andrews, Patricia Hayes
1997-01-01
Explores ethical issues heightened by use of technology, and examines a means for managing these ethical concerns. Argues that ethical problems are not inherent in technological advances, but rather it is how human beings choose to use these new tools that may lead to ethical dilemmas in business contexts. (SR)
Modeling aspen and red pine shoot growth to daily weather variations.
Donald A. Perala
1983-01-01
Quantifies daily shoot growth of quaking aspen and red pine in response to daily variation in air temperature, soil moisture, solar radiation, evapotranspiration, and inherent seasonal plant growth rhythm. Discusses potential application of shoot growth equations to silvicultural problems related to microclimatic variation. Identifies limitations and areas for...
Peer Mentoring: Encouraging Persistence in Native American Postsecondary Students
ERIC Educational Resources Information Center
Lee, Susan D.
2013-01-01
Native Americans have endured historical and contemporary challenges that have adversely affected their achievement, including in the realm of postsecondary education. The difficulties have included, but are not limited to, the problems inherent in the process of assimilation into Caucasian culture, the repercussions of Indian Boarding Schools,…
A New Metaphor for Teaching: Science Teacher as Anthropologist.
ERIC Educational Resources Information Center
Hodson, Derek
This paper addresses problems inherent in traditional science teaching and argues that the pitfalls of assimilation and exclusion can be avoided by adopting an anthropological approach: regarding scientists as a sub-cultural group with its own language and ways of thinking about, investigating, and explaining phenomena and events, its distinctive…
A Visual Test for Visual "Literacy."
ERIC Educational Resources Information Center
Messaris, Paul
Four different principles of visual manipulation constitute a minimal list of what a visually "literate" viewer should know about, but certain problems exist which are inherent in measuring viewers' awareness of each of them. The four principles are: (1) paraproxemics, or camera work which derives its effectiveness from an analogy to the…
Effective Parenting in Contemporary America: Some Cautions and Some Prescriptions.
ERIC Educational Resources Information Center
Lamb, Michael E.
This paper summarizes the components of effective parenting for which substantial empirical support is available and discusses the problems inherent in attempts to determine the characteristics of effective parents in order to amend the process of socialization through the modification of parent styles. The aspects of effective parenting discussed…
Photonic reservoir computing: a new approach to optical information processing
NASA Astrophysics Data System (ADS)
Vandoorne, Kristof; Fiers, Martin; Verstraeten, David; Schrauwen, Benjamin; Dambre, Joni; Bienstman, Peter
2010-06-01
Despite ever increasing computational power, recognition and classification problems remain challenging to solve. Recently, advances have been made by the introduction of the new concept of reservoir computing. This is a methodology coming from the field of machine learning and neural networks that has been successfully used in several pattern classification problems, like speech and image recognition. Thus far, most implementations have been in software, limiting their speed and power efficiency. Photonics could be an excellent platform for a hardware implementation of this concept because of its inherent parallelism and unique nonlinear behaviour. Moreover, a photonic implementation offers the promise of massively parallel information processing with low power and high speed. We propose using a network of coupled Semiconductor Optical Amplifiers (SOA) and show in simulation that it could be used as a reservoir by comparing it to conventional software implementations using a benchmark speech recognition task. In spite of the differences with classical reservoir models, the performance of our photonic reservoir is comparable to that of conventional implementations and sometimes slightly better. As our implementation uses coherent light for information processing, we find that phase tuning is crucial to obtain high performance. In parallel we investigate the use of a network of photonic crystal cavities. The coupled mode theory (CMT) is used to investigate these resonators. A new framework is designed to model networks of resonators and SOAs. The same network topologies are used, but feedback is added to control the internal dynamics of the system. By adjusting the readout weights of the network in a controlled manner, we can generate arbitrary periodic patterns.
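A software stand-in for such a reservoir is the standard echo-state setup: a fixed random recurrent network plus a trained linear readout (the SOA dynamics, coherence, and phase effects discussed above are not modeled here):

```python
import numpy as np

rng = np.random.default_rng(0)
N, T = 200, 2000                           # reservoir size, time steps

# Fixed random reservoir: a software stand-in for the coupled-SOA network.
W = rng.normal(0, 1, (N, N))
W *= 0.9 / max(abs(np.linalg.eigvals(W)))  # scale spectral radius below 1
W_in = rng.normal(0, 0.5, N)

u = rng.uniform(-1, 1, T)                  # input signal
target = np.roll(u, 3)                     # toy task: recall input 3 steps back

x, states = np.zeros(N), np.empty((T, N))
for t in range(T):
    x = np.tanh(W @ x + W_in * u[t])       # the reservoir itself is untrained
    states[t] = x

# Only the linear readout is trained (ridge regression), skipping the
# initial transient.
S, y = states[50:], target[50:]
W_out = np.linalg.solve(S.T @ S + 1e-6 * np.eye(N), S.T @ y)
pred = S @ W_out
print("NRMSE:", np.sqrt(np.mean((pred - y) ** 2)) / np.std(y))
```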
A methodology to enhance electromagnetic compatibility in joint military operations
NASA Astrophysics Data System (ADS)
Buckellew, William R.
The development and validation of an improved methodology to identify, characterize, and prioritize potential joint EMI (electromagnetic interference) interactions, and to identify and develop solutions that reduce the effects of the interference, are discussed. The methodology identifies potential EMI problems using results from field operations, historical databases, and analytical modeling. Operational expertise, engineering analysis, and testing are used to characterize and prioritize the potential EMI problems. Results can be used to resolve potential EMI during the development and acquisition of new systems and to develop engineering fixes and operational workarounds for systems already deployed. The analytical modeling portion of the methodology is a predictive process that uses progressive refinement of the analysis and of the operational electronic environment to eliminate noninterfering equipment pairs, defer further analysis on pairs lacking operational significance, and resolve the remaining EMI problems. Tests are conducted on equipment pairs to ensure that the analytical models provide a realistic description of the predicted interference.
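The progressive-refinement stage is essentially a cascade of cheap screens that discard equipment pairs before expensive analysis. The sketch below shows one plausible first screen, frequency-band overlap plus a deliberately optimistic free-space link budget, in Python; the data classes, thresholds, and example systems are hypothetical, not taken from the report.

```python
import math
from dataclasses import dataclass
from itertools import product

@dataclass
class Emitter:
    name: str
    f_lo: float        # transmit band, MHz
    f_hi: float
    eirp_dbm: float

@dataclass
class Receiver:
    name: str
    f_lo: float        # tuned band, MHz
    f_hi: float
    sensitivity_dbm: float

def bands_overlap(tx, rx, guard_mhz=0.0):
    return tx.f_lo - guard_mhz <= rx.f_hi and rx.f_lo <= tx.f_hi + guard_mhz

def coarse_margin_db(tx, rx, distance_m):
    """Optimistic free-space link budget, so only clearly safe pairs are culled."""
    f_mhz = 0.5 * (rx.f_lo + rx.f_hi)
    fspl = 32.44 + 20*math.log10(f_mhz) + 20*math.log10(distance_m / 1000.0)
    return (tx.eirp_dbm - fspl) - rx.sensitivity_dbm

emitters = [Emitter("UHF radio", 225, 400, 44.0)]
receivers = [Receiver("GPS receiver", 1215, 1240, -130.0),
             Receiver("SATCOM receiver", 243, 270, -110.0)]

for tx, rx in product(emitters, receivers):
    if not bands_overlap(tx, rx, guard_mhz=20.0):
        continue                     # cull: no co-channel or adjacent overlap
    if coarse_margin_db(tx, rx, distance_m=100.0) < 0.0:
        continue                     # defer: no margin even optimistically
    print("retain for detailed analysis:", tx.name, "->", rx.name)
```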
Binary phase locked loops for Omega receivers
NASA Technical Reports Server (NTRS)
Chamberlin, K.
1974-01-01
An all-digital phase lock loop (PLL) is considered because of a number of problems inherent in the use of analog PLLs. The digital PLL design presented solves these problems: a single loop measures all eight Omega time slots. The use of memory aiding gives the design its name, the memory-aided phase lock loop (MAPLL). Basic operating principles are discussed, and the superiority of the MAPLL over the conventional digital phase lock loop in operational efficiency for Omega applications is demonstrated.
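The abstract gives the idea but not the loop equations. The sketch below is a guess at the minimal form of the mechanism, one first-order loop filter whose stored state is swapped per time slot, in Python; the loop gain and the wrapped-phase detector are assumed details, not taken from the report.

```python
import numpy as np

N_SLOTS = 8          # the Omega format time-multiplexes eight transmissions
GAIN = 0.1           # first-order loop gain (assumed value)

phase_mem = np.zeros(N_SLOTS)        # one stored phase estimate per time slot

def mapll_update(slot, measured_phase):
    """One iteration of a single shared loop for the given Omega time slot.

    The per-slot memory is what lets one loop filter track all eight
    signals instead of requiring eight parallel hardware loops.
    """
    # Wrapped phase error between the measurement and the slot's stored estimate.
    err = np.angle(np.exp(1j * (measured_phase - phase_mem[slot])))
    phase_mem[slot] += GAIN * err
    return phase_mem[slot]
```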
System dynamics and simulation of LSS
NASA Technical Reports Server (NTRS)
Ryan, R. F.
1978-01-01
Large Space Structures have many unique problems arising from mission objectives and the resulting configuration. Inherent in these configurations is a strong coupling among several of the designing disciplines. In particular, the coupling between structural dynamics and control is a key design consideration. The solution to these interactive problems requires efficient and accurate analysis, simulation and test techniques, and properly planned and conducted design trade studies. The discussion presented deals with these subjects and concludes with a brief look at some NASA capabilities which can support these technology studies.
Methodological Issues and Practical Problems in Conducting Research on Abused Children.
ERIC Educational Resources Information Center
Kinard, E. Milling
In order to inform policy and programs, research on child abuse must be not only methodologically rigorous, but also practically feasible. However, practical problems make child abuse research difficult to conduct. Definitions of abuse must be explicit and different types of abuse must be assessed separately. Study samples should be as…
ERIC Educational Resources Information Center
Soh, Kaycheng
2013-01-01
Recent research into university ranking methodologies has uncovered several methodological problems in the systems currently in vogue. One of these is the discrepancy between nominal and attained weights. The problem arises from summing unstandardized indicators to obtain the total scores used in ranking. It is demonstrated that weight discrepancy…
A Methodological Critique of "Interventions for Boys with Conduct Problems"
ERIC Educational Resources Information Center
Kent, Ronald; And Others
1976-01-01
Kent criticizes Patterson's study on treating the behavior problems of boys on several methodological grounds, concluding that more rigorous research is required in this field. Patterson answers Kent's criticisms, arguing that they are not well founded, and offers further evidence to support the efficacy of his treatment procedures.…
Research Methodology in Second Language Studies: Trends, Concerns, and New Directions
ERIC Educational Resources Information Center
King, Kendall A.; Mackey, Alison
2016-01-01
The field of second language studies is using increasingly sophisticated methodological approaches to address a growing number of urgent, real-world problems. These methodological developments bring both new challenges and opportunities. This article briefly reviews recent ontological and methodological debates in the field, then builds on these…
NASA Astrophysics Data System (ADS)
Ajtai, Tibor; Pinter, Mate; Utry, Noemi; Kiss-Albert, Gergely; Palagyi, Andrea; Manczinger, Laszlo; Vagvölgyi, Csaba; Szabo, Gabor; Bozoki, Zoltan
2016-04-01
In this study we present results of field measurement campaigns focusing on the in-situ characterization of the absorption spectra and the health relevance of light-absorbing carbonaceous (LAC) aerosol in ambient air. The absorption spectra are measured at 266, 355, 532 and 1064 nm by our state-of-the-art four-wavelength photoacoustic instrument, while for health relevance the eco-, cyto- and genotoxicity parameters are measured using standardized methodologies. We experimentally demonstrate a correlation between the toxicities and the measured absorption spectra as quantified by their wavelength dependency. Based on this correlation, we present novel possibilities for real-time air quality monitoring. LAC is extensively studied not only because of its considerable climate effects but also as a serious air pollutant. A growing number of studies has demonstrated experimentally that the health effect of LAC is more serious than expected from its share of total atmospheric aerosol mass. Furthermore, during many local pollution events LAC is not only dominant but nearly exclusive. Altogether, owing to its climate and health effects, many studies and proposed regulations focus on the physical, chemical and toxicological properties of LAC as well as on its source apportionment. Despite its importance, there is not yet a widely accepted standard methodology for the real-time and selective identification of LAC. There are several reasons for this, ranging from its complex inherent physicochemical features, which include many unknown constituents, through the masking effect that the ambient environment exerts on those inherent properties even after a short residence time, to the lack of reliable instrumentation for its health- or source-relevant parameters. Therefore, methodology and instrument development for the selective and reliable identification of LAC is a timely and important issue in climate and air quality research. Recently, many studies have demonstrated correlations between the chemical composition and the absorption features of LAC, which open up novel possibilities in real-time source apportionment and air quality monitoring.
Berne, Rosalyn W; Raviv, Daniel
2004-04-01
This paper introduces the Eight Dimensional Methodology for Innovative Thinking (the Eight Dimensional Methodology) for innovative problem solving, as a unified approach to case analysis that builds on comprehensive problem-solving knowledge from industry, business, marketing, math, science, engineering, technology, the arts, and daily life. It is designed to stimulate innovation by quickly generating unique, "out of the box," unexpected and high-quality solutions. It gives new insights and thinking strategies for solving everyday problems faced in the workplace by helping decision makers see otherwise obscure alternatives and solutions. Daniel Raviv, the engineer who developed the Eight Dimensional Methodology and a co-author of this paper, and technology ethicist Rosalyn Berne suggest that this tool can be especially useful in identifying solutions and alternatives for particular problems of engineering, and for the ethical challenges that arise with them. First, the Eight Dimensional Methodology helps to elucidate how what may appear to be a basic engineering problem also has ethical dimensions. In addition, it offers the engineer a methodology for penetrating and seeing new dimensions of those problems. To demonstrate the effectiveness of the Eight Dimensional Methodology as an analytical tool for thinking about ethical challenges in engineering, the paper presents the case of the construction of the Large Binocular Telescope (LBT) on Mount Graham in Arizona. Analysis of the case shows how decision makers can use the Eight Dimensional Methodology to consider alternative solutions for how to proceed in their goals of exploring space, and then follows that same process through a second stage of exploring the ethics of each of those solutions. The LBT project pools resources from an international partnership of universities and research institutes for the construction and maintenance of a highly sophisticated, powerful new telescope. It will soon mark the erection of the world's largest and most powerful optical telescope, designed to see fine detail otherwise visible only from space. It also represents a controversial engineering project being undertaken on land considered sacred by the local, native Apache people. As presented, the case features the University of Virginia and its challenges in considering whether and how to join the LBT project consortium.
Aeroelastic optimization methodology for viscous and turbulent flows
NASA Astrophysics Data System (ADS)
Barcelos Junior, Manuel Nascimento Dias
2007-12-01
In recent years, the development of faster computers and parallel processing has allowed the application of high-fidelity analysis methods to the aeroelastic design of aircraft. However, these methods are restricted to final design verification, mainly because of the computational cost involved in iterative design processes. Therefore, this work is concerned with the creation of a robust and efficient aeroelastic optimization methodology for inviscid, viscous and turbulent flows using high-fidelity analysis and sensitivity analysis techniques. Most research in aeroelastic optimization, for practical reasons, treats the aeroelastic system as a quasi-static inviscid problem. In this work, as a first step toward the creation of a more complete aeroelastic optimization methodology for realistic problems, an analytical sensitivity computation technique was developed and tested for quasi-static aeroelastic viscous and turbulent flow configurations. Viscous and turbulent effects are included by using an averaged discretization of the Navier-Stokes equations coupled with an eddy-viscosity turbulence model. For quasi-static aeroelastic problems, the traditional staggered solution strategy performs unsatisfactorily when applied to cases with strong fluid-structure coupling. Consequently, this work also proposes a solution methodology for aeroelastic and sensitivity analyses of quasi-static problems, based on the fixed point of an iterative nonlinear block Gauss-Seidel scheme. The methodology can also be interpreted as the solution of the Schur complement of the linearized systems of equations for the aeroelastic and sensitivity analyses. The methodologies developed in this work are tested and verified on realistic aeroelastic systems.
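To make the fixed-point idea concrete: in a nonlinear block Gauss-Seidel scheme, the fluid and structure blocks are solved in turn, each using the other's latest state, until the exchanged quantities stop changing. A minimal Python sketch with scalar stand-ins for the two solvers follows; the toy load and stiffness models, tolerance, and relaxation factor are all assumptions for illustration.

```python
def solve_fluid(displacement):
    """Stand-in for the flow solve: aerodynamic load for the current shape."""
    return 1.0 / (1.0 + displacement**2)      # toy nonlinear load model

def solve_structure(load):
    """Stand-in for the structural solve: static deflection under the load."""
    return load / 2.0                          # toy stiffness k = 2

def aeroelastic_fixed_point(tol=1e-10, max_iter=100, relax=0.8):
    u = 0.0                                    # structural displacement
    for it in range(1, max_iter + 1):
        f = solve_fluid(u)                     # fluid block uses latest structure state
        u_new = solve_structure(f)             # structure block uses latest fluid state
        if abs(u_new - u) < tol:
            return u_new, it
        u = (1.0 - relax)*u + relax*u_new      # under-relaxation helps strong coupling
    raise RuntimeError("no fixed point within max_iter")

u_star, iters = aeroelastic_fixed_point()
print(f"converged displacement {u_star:.6f} in {iters} iterations")
```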
Wood, Richard T A; Griffiths, Mark D
2015-12-01
This study is one of the first to explore in detail the behaviors, attitudes and motivations of players that show no signs of at-risk or problem gambling behavior (so-called 'positive players'). Via an online survey, 1484 positive players were compared with 209 problem players identified using the Lie/Bet screen. The study identified two distinct groups of positive players defined according to their motivations to play and their engagement with responsible gambling (RG) practices. Those positive players that played most frequently employed the most personal RG strategies. Reasons that positive players gave for gambling were focused on leisure (e.g., playing for fun, being entertained, and/or winning a prize). By contrast, problem gamblers were much more focused upon modifying mood states (e.g., excitement, relaxation, depression and playing when bored or upset). The present study also suggests that online gambling is not, by default, inherently riskier than gambling in more traditional ways, as online gambling was the most popular medium by which positive players gambled. Furthermore, most positive players reported that it was easier to stick to their limits when playing the National Lottery online compared to traditional retail purchasing of tickets. Problem players were significantly more likely than positive players to gamble with family and friends, suggesting that, contrary to a popular RG message, social play may not be inherently safer than gambling alone. It is proposed that players (generally) may identify more with the term 'positive play' than the term 'RG' which is frequently interpreted as being aimed at people with gambling problems, rather than all players.
NMR contributions to structural dynamics studies of intrinsically disordered proteins
Konrat, Robert
2014-01-01
Intrinsically disordered proteins (IDPs) are characterized by substantial conformational plasticity. Given their inherent structural flexibility, X-ray crystallography is not applicable to the study of these proteins. In contrast, NMR spectroscopy offers unique opportunities for structural and dynamic studies of IDPs. The past two decades have witnessed significant developments in NMR spectroscopy that couple advances in spin physics and chemistry with a broad range of applications. This article summarizes key advances in basic physical chemistry and NMR methodology, outlines their limitations and envisions future R&D directions. PMID:24656082
Solid-state NMR and computational studies of 4-methyl-2-nitroacetanilide.
Harris, Robin K; Ghi, Phuong Y; Hammond, Robert B; Ma, Cai Yun; Roberts, Kevin J; Yates, Jonathan R; Pickard, Chris J
2006-03-01
Studies on the solid-state structure of two polymorphs of 4-methyl-2-nitroacetanilide (MNA) were conducted using magic-angle spinning ¹³C, ¹⁵N and ¹H NMR spectroscopy, together with first-principles computations of NMR shielding (including use of a program that takes explicit account of the translational symmetry inherent in crystalline structures). The effects of side-chain rotations on ¹³C chemical shifts have been explored. Information derived from these studies was then incorporated within a systematic space-search methodology for the elucidation of trial crystallographic structures from powder XRD.
Human error and the search for blame
NASA Technical Reports Server (NTRS)
Denning, Peter J.
1989-01-01
Human error is a frequent topic in discussions about risks in using computer systems. A rational analysis of human error leads through the consideration of mistakes to standards that designers use to avoid mistakes that lead to known breakdowns. The irrational side, however, is more interesting. It conditions people to think that breakdowns are inherently wrong and that there is ultimately someone who is responsible. This leads to a search for someone to blame which diverts attention from: learning from the mistakes; seeing the limitations of current engineering methodology; and improving the discourse of design.
The historiography of medical history: from great men to archaeology.
King, C. R.
1991-01-01
The history of medicine is always written from the perspective of the historian. Contemporary historiography provides an understanding of the major methods of historical analysis and their influence on the writing of medical history. Medical history in the 20th century has emphasized the historiographic methods of the history of great men, historicism, social history, and intellectual history. Each methodology has inherent biases that influence the historian's analysis of the past. Understanding the historian's biases provides the reader with important tools for the interpretation of medical history. PMID:1933068
IMSF: Infinite Methodology Set Framework
NASA Astrophysics Data System (ADS)
Ota, Martin; Jelínek, Ivan
Software development is usually an integration task in an enterprise environment - few software applications work autonomously now. It is usually a collaboration of heterogeneous and unstable teams. One serious problem is lack of resources, a popular consequence being outsourcing and 'body shopping', and, indirectly, team and team-member fluctuation. Outsourced sub-deliveries easily become black boxes with no clear development method used, which has a negative impact on supportability. Such environments then often face problems of quality assurance and enterprise know-how management. The methodology used is one of the key factors. Each methodology was created as a generalization of a number of completed projects, and each methodology is thus more or less tied to a set of task types. When a methodology is applied to a task type it does not suit, problems arise that usually result in an undocumented ad-hoc solution. This was the motivation behind formalizing a simple process for collaborative software engineering. The Infinite Methodology Set Framework (IMSF) defines the ICT business process of adaptively selecting methods for classified types of tasks. The article introduces IMSF and briefly comments on its meta-model.
Model identification methodology for fluid-based inerters
NASA Astrophysics Data System (ADS)
Liu, Xiaofu; Jiang, Jason Zheng; Titurus, Branislav; Harrison, Andrew
2018-06-01
The inerter is the mechanical dual of the capacitor via the force-current analogy. It has the property that the force across its terminals is proportional to their relative acceleration. Compared with flywheel-based inerters, fluid-based forms have the advantages of improved durability, inherent damping and simplicity of design. In order to improve understanding of the physical behaviour of this fluid-based device, especially the behaviour caused by hydraulic resistance and inertial effects in the external tube, this work proposes a comprehensive model identification methodology. Firstly, a modelling procedure is established that allows the topological arrangement of the mechanical networks to be obtained by mapping the damping, inertance and stiffness effects directly to their respective hydraulic counterparts. Secondly, an experimental sequence is followed that separates the identification of friction, stiffness and various damping effects. Furthermore, an experimental set-up is introduced in which two pressure gauges are used to accurately measure the pressure drop across the external tube. Theoretical models with improved confidence are obtained using the proposed methodology for a helical-tube fluid inerter prototype. The sources of the remaining discrepancies are further analysed.
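For orientation, the defining relation is F = b(a2 - a1), with the inertance b set by the fluid and the geometry. Below is a hedged back-of-envelope sketch of the standard kinetic-energy estimate for a helical-tube fluid inerter; all dimensions and the damping coefficient are illustrative assumptions, not the prototype's identified values.

```python
import numpy as np

rho = 1000.0                 # fluid density, kg/m^3 (water, assumed)
l   = 2.0                    # external helical tube length, m (assumed)
A1  = np.pi * 0.025**2       # piston area, m^2 (assumed bore)
A2  = np.pi * 0.005**2       # tube cross-section, m^2 (assumed)

# Kinetic-energy argument: fluid in the tube moves (A1/A2) times faster
# than the piston, so the device stores 0.5*(rho*l*A1**2/A2)*v**2 and
# behaves as an inerter with inertance
b = rho * l * A1**2 / A2
print(f"inertance b = {b:.1f} kg")

def terminal_force(rel_accel, rel_vel, c_lam=50.0):
    """Ideal inerter in parallel with a laminar damping term -- the kind of
    mechanical network the identification procedure maps hydraulic effects
    onto (the damping coefficient here is an assumed placeholder)."""
    return b*rel_accel + c_lam*rel_vel
```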
Prestigious Science Journals Struggle to Reach Even Average Reliability
Brembs, Björn
2018-01-01
In which journal a scientist publishes is considered one of the most crucial factors determining their career. The underlying common assumption is that only the best scientists manage to publish in a highly selective tier of the most prestigious journals. However, data from several lines of evidence suggest that the methodological quality of scientific experiments does not increase with increasing rank of the journal. On the contrary, an accumulating body of evidence suggests the inverse: methodological quality and, consequently, reliability of published research works in several fields may be decreasing with increasing journal rank. The data supporting these conclusions circumvent confounding factors such as increased readership and scrutiny for these journals, focusing instead on quantifiable indicators of methodological soundness in the published literature, relying on, in part, semi-automated data extraction from often thousands of publications at a time. With the accumulating evidence over the last decade grew the realization that the very existence of scholarly journals, due to their inherent hierarchy, constitutes one of the major threats to publicly funded science: hiring, promoting and funding scientists who publish unreliable science eventually erodes public trust in science. PMID:29515380
System Engineering for J-2X Development: The Simpler, the Better
NASA Technical Reports Server (NTRS)
Kelly, William M.; Greasley, Paul; Greene, William D.; Ackerman, Peter
2008-01-01
The Ares I and Ares V vehicles will use the J-2X rocket engine, developed for NASA by the Pratt and Whitney Rocketdyne Company (PWR), as the upper stage engine (USE). The J-2X is an improved, higher-power version of the original J-2 engine used for Apollo. System Engineering (SE) facilitates direct and open discussion of issues and problems. This simple idea is often overlooked in large, complex engineering development programs. Definition and distribution of requirements from the engine level to the component level is controlled by Allocation Reports, which break down numerical design objectives (weight, reliability, etc.) into quantified goals for each component area. Linked databases of design and verification requirements help eliminate the redundancy and potential mistakes inherent in separated systems. Another tool, the Architecture Design Description (ADD), is used to control the J-2X system architecture and effectively communicate configuration changes to those involved in the design process. But the proof of an effective process is in successful program accomplishment: SE is the methodology being used to meet the challenge of completing J-2X engine certification two years faster than any engine program ever developed at PWR. This paper describes the simple, better SE tools and techniques used to achieve this success.
Good coupling for the multiscale patch scheme on systems with microscale heterogeneity
NASA Astrophysics Data System (ADS)
Bunder, J. E.; Roberts, A. J.; Kevrekidis, I. G.
2017-05-01
Computational simulation of microscale detailed systems is frequently only feasible over spatial domains much smaller than the macroscale of interest. The 'equation-free' methodology couples many small patches of microscale computations across space to empower efficient computational simulation over macroscale domains of interest. Motivated by molecular or agent simulations, we analyse the performance of various coupling schemes for patches when the microscale is inherently 'rough'. As a canonical problem in this universality class, we systematically analyse the case of heterogeneous diffusion on a lattice. Computer algebra explores how the dynamics of coupled patches predict the large scale emergent macroscale dynamics of the computational scheme. We determine good design for the coupling of patches by comparing the macroscale predictions from patch dynamics with the emergent macroscale on the entire domain, thus minimising the computational error of the multiscale modelling. The minimal error on the macroscale is obtained when the coupling utilises averaging regions which are between a third and a half of the patch. Moreover, when the symmetry of the inter-patch coupling matches that of the underlying microscale structure, patch dynamics predicts the desired macroscale dynamics to any specified order of error. The results confirm that the patch scheme is useful for macroscale computational simulation of a range of systems with microscale heterogeneity.
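To make the patch idea concrete, here is a deliberately minimal 1-D sketch in Python: small patches of a heterogeneous-diffusion lattice are coupled only through their core averages, using the lowest-order (linear) inter-patch interpolation. The paper's analysis concerns higher-order coupling and the optimal averaging region; all sizes and parameters below are illustrative assumptions.

```python
import numpy as np

rng = np.random.default_rng(1)

# Macroscale: M patches centred H apart on a periodic domain.
M = 8
H = 2*np.pi / M
# Microscale: n lattice points per patch, spacing dx << H.
n, dx = 9, 0.01

# 'Rough' microscale: random log-normal diffusivity on each micro interval.
kappa = np.exp(rng.normal(0.0, 0.3, (M, n + 1)))

X = np.arange(M) * H                     # patch centres
u = np.sin(X)[:, None] * np.ones(n)      # initial macroscale profile in each patch
r = (n + 1) / 2 * dx                     # centre-to-ghost-edge distance

dt = 0.1 * dx**2                         # explicit-Euler stability margin
for step in range(2000):
    # Patch core averages over the middle third (the paper finds averaging
    # regions of a third to a half of the patch minimise macroscale error).
    core = u[:, n//3:2*n//3 + 1].mean(axis=1)
    # Lowest-order inter-patch coupling: set each patch's ghost edge values
    # from the macroscale field interpolated through neighbouring core averages.
    grad = (np.roll(core, -1) - np.roll(core, 1)) / (2*H)
    ub = np.hstack([(core - r*grad)[:, None], u, (core + r*grad)[:, None]])
    # Microscale heterogeneous diffusion inside every patch.
    flux = kappa * (ub[:, 1:] - ub[:, :-1]) / dx
    u = u + dt * (flux[:, 1:] - flux[:, :-1]) / dx

print("patch core averages:", np.round(core, 4))
```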
Design and optimization of input shapers for liquid slosh suppression
NASA Astrophysics Data System (ADS)
Aboel-Hassan, Ameen; Arafa, Mustafa; Nassef, Ashraf
2009-02-01
The need for fast maneuvering and accurate positioning of flexible structures poses a control challenge. The inherent flexibility in these lightly damped systems creates large undesirable residual vibrations in response to rapid excitations. Several control approaches have been proposed to tackle this class of problems, of which the input shaping technique is appealing in many aspects. While input shaping has been widely investigated to attenuate residual vibrations in flexible structures, less attention was granted to expand its viability in further applications. The aim of this work is to develop a methodology for applying input shaping techniques to suppress sloshing effects in open moving containers to facilitate safe and fast point-to-point movements. The liquid behavior is modeled using finite element analysis. The input shaper parameters are optimized to find the commands that would result in minimum residual vibration. Other objectives, such as improved robustness, and motion constraints such as deflection limiting are also addressed in the optimization scheme. Numerical results are verified on an experimental setup consisting of a small motor-driven water tank undergoing rectilinear motion, while measuring both the tank motion and free surface displacement of the water. The results obtained suggest that input shaping is an effective method for liquid slosh suppression.
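The paper optimizes shaper parameters numerically; the classical starting point such work builds on is the two-impulse zero-vibration (ZV) shaper, whose amplitudes and timing follow directly from the slosh mode's natural frequency and damping ratio. A minimal sketch (the mode values and the command profile are assumed, not from the paper):

```python
import numpy as np

def zv_shaper(wn, zeta):
    """Two-impulse zero-vibration (ZV) shaper for one mode (wn, zeta)."""
    K = np.exp(-zeta*np.pi / np.sqrt(1.0 - zeta**2))
    t2 = np.pi / (wn*np.sqrt(1.0 - zeta**2))    # half the damped period
    return np.array([1.0, K]) / (1.0 + K), np.array([0.0, t2])

# Shape a rest-to-rest velocity command by convolving it with the impulses.
wn, zeta, dt = 2*np.pi*1.2, 0.02, 1e-3          # assumed slosh-mode values
amps, times = zv_shaper(wn, zeta)
t = np.arange(0.0, 3.0, dt)
v_raw = np.where(t < 1.0, 0.3, 0.0)             # unshaped 1 s velocity pulse
v_shaped = sum(a * np.where(t >= ti, np.interp(t - ti, t, v_raw), 0.0)
               for a, ti in zip(amps, times))
# v_shaped reaches the same displacement but deposits (ideally) zero
# residual energy in the slosh mode at the end of the move.
```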
1997-01-01
An estimated 1 of 3 Americans uses some form of complementary and alternative medicine (CAM), such as acupuncture, homeopathy, or herbal medicine. In 1995, the National Institutes of Health Office of Alternative Medicine convened an expert panel to examine the role of clinical practice guidelines in CAM. The panel concluded that CAM practices currently are unsuitable for the development of evidence-based practice guidelines, in part because of the lack of relevant outcomes data from well-designed clinical trials. Moreover, the notions of standardization and appropriateness, inherent in guideline development, face challenging methodologic problems when applied to CAM, which considers many different treatment practices appropriate and encourages highly individualized care. Due to different belief systems and divergent theories about the nature of health and illness, CAM disciplines have fundamental differences in how they define target conditions, causes of disease, interventions, and outcome measures of effectiveness. These differences are even more striking when compared with those used by Western medicine. The panel made a series of recommendations on strategies to strengthen the evidence base for future guideline development in CAM and to meet better the current information needs of clinicians, patients, and guideline developers who seek information about CAM treatments.
Data-free and data-driven spectral perturbations for RANS UQ
NASA Astrophysics Data System (ADS)
Edeling, Wouter; Mishra, Aashwin; Iaccarino, Gianluca
2017-11-01
Despite recent developments in high-fidelity turbulent flow simulations, RANS modeling is still vastly used by industry, due to its inherent low cost. Since accuracy is a concern in RANS modeling, model-form UQ is an essential tool for assessing the impacts of this uncertainty on quantities of interest. Applying the spectral decomposition to the modeled Reynolds-Stress Tensor (RST) allows for the introduction of decoupled perturbations into the baseline intensity (kinetic energy), shape (eigenvalues), and orientation (eigenvectors). This constitutes a natural methodology for evaluating the model-form uncertainty associated with different aspects of RST modeling. In a predictive setting, one frequently encounters an absence of any relevant reference data. To make data-free predictions with quantified uncertainty, we employ physical bounds to define maximum spectral perturbations a priori. When propagated, these perturbations yield intervals of engineering utility. High-fidelity data open up the possibility of inferring a distribution of uncertainty by means of various data-driven machine-learning techniques. We will demonstrate our framework on a number of flow problems where RANS models are prone to failure. This research was partially supported by the Defense Advanced Research Projects Agency under the Enabling Quantification of Uncertainty in Physical Systems (EQUiPS) project (technical monitor: Dr Fariba Fahroo), and the DOE PSAAP-II program.
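The decoupled perturbations act on the three factors of the RST's spectral decomposition. The sketch below shows only the eigenvalue part: mapping the anisotropy eigenvalues to barycentric coordinates and shifting them toward a limiting state. It is a hedged reading of this family of methods; the perturbations of kinetic energy and eigenvector orientation, and the data-driven choice of the magnitude, are omitted.

```python
import numpy as np

def perturb_rst(R, delta, corner):
    """Shift a modeled Reynolds-stress tensor toward a limiting state.

    R      : 3x3 modeled Reynolds-stress tensor
    delta  : perturbation magnitude in [0, 1]
    corner : barycentric target, e.g. (1,0,0) one-component turbulence,
             (0,1,0) two-component, (0,0,1) isotropic
    """
    k = 0.5 * np.trace(R)                       # turbulent kinetic energy
    a = R / (2.0*k) - np.eye(3) / 3.0           # anisotropy tensor
    lam, V = np.linalg.eigh(a)                  # shape (eigenvalues) + orientation
    lam, V = lam[::-1], V[:, ::-1]              # sort descending
    # Barycentric coordinates of the eigenvalues, shifted toward the corner.
    C = np.array([lam[0] - lam[1], 2.0*(lam[1] - lam[2]), 3.0*lam[2] + 1.0])
    C = (1.0 - delta)*C + delta*np.asarray(corner, float)
    # Back to eigenvalues, then recompose with the original eigenvectors.
    lam3 = C[2]/3.0 - 1.0/3.0
    lam2 = C[1]/2.0 + lam3
    lam1 = C[0] + lam2
    a_new = V @ np.diag([lam1, lam2, lam3]) @ V.T
    return 2.0*k * (a_new + np.eye(3)/3.0)

R = np.diag([0.6, 0.3, 0.1])                    # toy modeled stresses
print(perturb_rst(R, delta=0.5, corner=(1.0, 0.0, 0.0)))
```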
3D modeling of building indoor spaces and closed doors from imagery and point clouds.
Díaz-Vilariño, Lucía; Khoshelham, Kourosh; Martínez-Sánchez, Joaquín; Arias, Pedro
2015-02-03
3D models of indoor environments are increasingly gaining importance due to the wide range of applications to which they can be put: from redesign and visualization to monitoring and simulation. These models usually exist only for newly constructed buildings; therefore, the development of automatic approaches for reconstructing 3D indoor models from imagery and/or point clouds can make the process easier, faster and cheaper. Among the constructive elements defining a building interior, doors are very common, and their detection can be useful for understanding the environment's structure, performing efficient navigation, or planning appropriate evacuation routes. The fact that doors are topologically connected to walls by being coplanar, together with the unavoidable presence of clutter and occlusions indoors, increases the inherent complexity of automating the recognition process. In this work, we present a pipeline of techniques for the reconstruction and interpretation of building interiors based on point clouds and images. The methodology analyses the visibility problem of indoor environments and examines door-candidate detection in depth. The presented approach is tested on real data sets, showing its potential, with a high door detection rate and applicability to robust and efficient envelope reconstruction.
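The coplanarity cue mentioned above can be made concrete with a small geometric test: a door candidate is retained only if most of its points lie, within tolerance, on a detected wall plane. A minimal sketch, not the paper's pipeline; the tolerance and acceptance fraction are assumed values.

```python
import numpy as np

def coplanar_with_wall(points, normal, d, tol=0.02, frac=0.9):
    """Flag a door-candidate point cluster as coplanar with a wall plane.

    points : (N, 3) candidate points (e.g. from an image-guided region)
    normal : wall plane normal; d : plane offset, so normal.x + d = 0 on the wall
    tol    : point-to-plane distance tolerance in metres (assumed)
    frac   : fraction of points that must satisfy the tolerance (assumed)
    """
    n = np.asarray(normal, float)
    n = n / np.linalg.norm(n)
    dist = np.abs(points @ n + d)
    return np.mean(dist < tol) >= frac
```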
Methodology to estimate the relative pressure field from noisy experimental velocity data
NASA Astrophysics Data System (ADS)
Bolin, C. D.; Raguin, L. G.
2008-11-01
The determination of intravascular pressure fields is important to the characterization of cardiovascular pathology. We present a two-stage method that solves the inverse problem of estimating the relative pressure field from noisy velocity fields measured by phase contrast magnetic resonance imaging (PC-MRI) on an irregular domain with limited spatial resolution, and includes a filter for the experimental noise. For the pressure calculation, the Poisson pressure equation is solved by embedding the irregular flow domain in a regular domain. To lessen the propagation of the noise inherent in the velocity measurements, three filters - a median filter and two physics-based filters - are evaluated using a 2-D Couette flow. The two physics-based filters outperform the median filter in estimating the relative pressure field for realistic signal-to-noise ratios (SNR = 5 to 30). The most accurate pressure field results from a filter that applies three constraints simultaneously in a least-squares sense: consistency between the measured and filtered velocity fields, a divergence-free condition, and an additional smoothness condition. This filter leads to a 5-fold gain in accuracy for the estimated relative pressure field compared with no noise filtering, under conditions consistent with PC-MRI of the carotid artery: SNR = 5, 20 x 20 discretized flow domain (25 x 25 computational domain).
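The divergence-free constraint in the best-performing filter can be illustrated in isolation: project the measured field onto its divergence-free part, discarding that component of the noise. The periodic-domain spectral sketch below is only an illustration; the paper's filter additionally enforces data consistency and smoothness in a single least-squares problem on an irregular domain.

```python
import numpy as np

def divergence_free_filter(u, v, dx):
    """Project a noisy 2-D velocity field onto its divergence-free part.

    Solves lap(phi) = div(u_meas) spectrally on a periodic square grid
    and subtracts grad(phi), so the returned field is divergence-free.
    """
    n = u.shape[0]
    k = 2*np.pi*np.fft.fftfreq(n, d=dx)
    kx, ky = np.meshgrid(k, k, indexing='ij')
    div_hat = 1j*kx*np.fft.fft2(u) + 1j*ky*np.fft.fft2(v)
    k2 = kx**2 + ky**2
    k2[0, 0] = 1.0                     # avoid division by zero at the mean mode
    phi_hat = -div_hat / k2            # lap(phi) = div  =>  -k^2 phi_hat = div_hat
    u_f = u - np.real(np.fft.ifft2(1j*kx*phi_hat))
    v_f = v - np.real(np.fft.ifft2(1j*ky*phi_hat))
    return u_f, v_f

# Demo: a divergence-free field plus noise on a 64x64 periodic grid.
n = 64; dx = 2*np.pi/n
xx, yy = np.meshgrid(np.arange(n)*dx, np.arange(n)*dx, indexing='ij')
rng = np.random.default_rng(3)
u = -np.sin(yy) + 0.1*rng.standard_normal((n, n))
v =  np.sin(xx) + 0.1*rng.standard_normal((n, n))
u_f, v_f = divergence_free_filter(u, v, dx)
```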
The Beliefs of Teachers and Daycare Staff regarding Children of Divorce: A Q Methodological Study
ERIC Educational Resources Information Center
Overland, Klara; Thorsen, Arlene Arstad; Storksen, Ingunn
2012-01-01
This Q methodological study explores beliefs of daycare staff and teachers regarding young children's reactions related to divorce. The Q factor analysis resulted in two viewpoints. Participants on the viewpoint "Child problems" believe that children show various emotional and behavioral problems related to divorce, while those on the "Structure…
The Ranking of Higher Education Institutions in Russia: Some Methodological Problems.
ERIC Educational Resources Information Center
Filinov, Nikolay B.; Ruchkina, Svetlana
2002-01-01
The ranking of higher education institutions in Russia is examined from two points of view: as a social phenomenon and as a multi-criteria decision-making problem. The first point of view introduces the idea of interested and involved parties; the second introduces certain principles on which a rational ranking methodology should be based.…
Integration of PBL Methodologies into Online Learning Courses and Programs
ERIC Educational Resources Information Center
van Oostveen, Roland; Childs, Elizabeth; Flynn, Kathleen; Clarkson, Jessica
2014-01-01
Problem-based learning (PBL) challenges traditional views of teaching and learning as the learner determines, to a large extent with support from a skilled facilitator, what topics will be explored, to what depth and which processes will be used. This paper presents the implementation of problem-based learning methodologies in an online Bachelor's…
NASA Astrophysics Data System (ADS)
Ajtai, Tibor; Utry, Noemi; Pinter, Mate; Kiss-Albert, Gergely; Smausz, Tomi; Konya, Zoltan; Hopp, Bela; Szabo, Gabor; Bozoki, Zoltan
2016-04-01
We present an investigation of the inherent spectral features of laser-generated and chemically characterized residential coal aerosols produced in our recently introduced laser-ablation-based LAC generator. The optical absorption and scattering features of the generated aerosol were investigated by our state-of-the-art multi-wavelength photoacoustic instrument (4λ-PAS) and a multi-wavelength cosine sensor (Aurora 3000). The quantified wavelength dependencies (AAE and SAE) are deduced from the measured data. Finally, the relationship between the optical and thermochemical characteristics is revealed. Atmospheric light-absorbing carbonaceous particulate matter (LAC) is at the centre of scientific interest, especially because of its climatic and adverse health relevance. The latest scientific assessments identified atmospheric soot as the second most important anthropogenic emission in terms of its climatic effect and as one of the most harmful atmospheric constituents in terms of its health aspects. LAC originates predominantly from anthropogenic sources, so its real-time and selective identification is also essential for its legal regulation. Despite its significance, the inherent properties of LAC are rarely described, and the available data are widely scattered even for the most intensively studied black or elemental carbon. Therefore, the investigation of the inherent climate- and health-relevant properties of atmospheric soot is a highly topical issue. Moreover, investigation of the optical and toxic properties of LAC originating from the combustion of household coals is almost completely missing from the literature. There are two major reasons for this. First, the characteristic parameters of soot are complex and vary over a wide range, depending not only on the initial burning conditions and the type of fuel but also on ambient factors. Second, there is a lack of a standard soot material and of a generator suitable for modelling real atmospheric black carbon and enabling controlled generation of realistic atmospheric soot particulates. The most commonly used and commercially available methodologies only partially fulfil these requirements; therefore, introducing alternative, improved methodologies is a highly relevant scientific goal.
NASA Technical Reports Server (NTRS)
Nakajima, Yukio; Padovan, Joe
1987-01-01
In a three-part series of papers, a generalized finite element methodology is formulated to handle traveling load problems involving large deformation fields in structures composed of viscoelastic media. The main thrust of this paper is to develop an overall finite element methodology and associated solution algorithms to handle the transient aspects of moving-load problems involving contact-impact loading fields. Based on the methodology and algorithms formulated, several numerical experiments are considered, including the rolling/sliding impact of tires with road obstructions.
Safety Isn't Always First: A Disturbing Look at Chemistry Books.
ERIC Educational Resources Information Center
Manning, Pat; Newman, Alan R.
1986-01-01
Discusses the problem of serious dangers in current and backlist chemistry experiment books. Discarding of older books and careful evaluation of the dangers inherent in newer books are recommended. Safe alternatives are suggested, including some criteria for evaluating dangers, and a safer approach used by a current author. (EM)
Homeostatic Systems--Mechanisms for Survival. Science IV.
ERIC Educational Resources Information Center
Pfeiffer, Carl H.
The two student notebooks in this set provide the basic outline and assignments for the fourth and last year of a senior high school unified science program which builds on the technical third year course, Science IIIA (see SE 012 149). An introductory section considers the problems of survival inherent in living systems, matter-energy…
The Use of Electronic Book Theft Detection Systems in Libraries.
ERIC Educational Resources Information Center
Witt, Thomas B.
1996-01-01
Although electronic book theft detection systems can be a deterrent to library material theft, no electronic system is foolproof, and a total security program is necessary to ensure collection security. Describes how book theft detection systems work, their effectiveness, and the problems inherent in technology. A total security program considers…
American Colleges' Missteps Raise Questions about Overseas Partnerships
ERIC Educational Resources Information Center
Fischer, Karin
2012-01-01
Several stumbles by American colleges in setting up programs with foreign partners have called attention to problems inherent in making such arrangements. State University of New York Empire State College has allowed a university in Albania to deliver diplomas in its name. In North Dakota, state auditors issued a scathing review of dual-degree…
A Strategy for Program Evaluation.
ERIC Educational Resources Information Center
Leinhardt, Gaea
This paper proposes a strategy for in-house evaluations in the context of an educational research and development facility. The obstacles in conducting an evaluation of colleagues' programs are discussed; these fall into two categories. First, there is a set of problems that relate to conflicts inherent in judging the work of a colleague without…
Minority Language Standardisation and the Role of Users
ERIC Educational Resources Information Center
Lane, Pia
2015-01-01
Developing a standard for a minority language is not a neutral process; this has consequences for the status of the language and how the language users relate to the new standard. A potential inherent problem with standardisation is whether the language users themselves will accept and identify with the standard. When standardising minority…
Problems and Prospects of Open and Distance Education in Nigeria
ERIC Educational Resources Information Center
Yusuf, Mudasiru Olalere
2006-01-01
Distance education as a means of providing access to education, particularly tertiary-level education, has gained great prominence in the world. Nigeria has recently taken giant steps to introduce an open and distance education programme. This paper explores the major terms inherent in open and distance education, its potentials, possible factors…
No Small World: Visions and Revisions of World Literature.
ERIC Educational Resources Information Center
Carroll, Michael Thomas, Ed.
This collection of essays deals with world literature. The essays are focused on four primary goals: to map the conceptual and cultural problems inherent in common educational approaches to the subject which sometimes see world literature as a metanarrative of Western culture; to suggest new genres and perspectives; to consider specific curricular…
The Analysis of Discourse as Evaluation of Productive Thinking.
ERIC Educational Resources Information Center
Tripp, D. H.
This paper provides a thorough description of a method of analyzing and scoring group discussions from a particular point of view. After discussing shortcomings of traditional methods of reporting data from group discussions and problems inherent in the use of paper-and-pencil creativity tests, the author describes a method which was developed as…
Applied Mathematics, Tenth Grade. A Resource Manual.
ERIC Educational Resources Information Center
Baltimore County Public Schools, Towson, MD.
This resource manual is designed for use with tenth grade boys whose main interest lies in the shop and industrial arts areas. The course emphasizes mathematical problems inherent in various trades and industries. The primary objective is to motivate the student to apply, improve, and increase his computational skills. The manual is divided into…
Virtual Cerebral Ventricular System: An MR-Based Three-Dimensional Computer Model
ERIC Educational Resources Information Center
Adams, Christina M.; Wilson, Timothy D.
2011-01-01
The inherent spatial complexity of the human cerebral ventricular system, coupled with its deep position within the brain, poses a problem for conceptualizing its anatomy. Cadaveric dissection, while considered the gold standard of anatomical learning, may be inadequate for learning the anatomy of the cerebral ventricular system; even with…
An Action Learning Project on Diversity: Pitfalls and Possibilities.
ERIC Educational Resources Information Center
Hite, Linda M.
1997-01-01
In a college course on diversity in the workplace, students' experiences with conducting a cultural audit of the university as a workplace illustrate the dilemmas that can arise when students conduct action research in a real client system. Despite the inherent problems, the project resulted in significant student learning about the subject and…
USDA-ARS?s Scientific Manuscript database
Soil erosion and nutrient loss from surface runoff and sub-surface flows are critical problems for croplands in the United States. Assessing cropland vulnerability to runoff and leaching is needed for watershed or regional land use and land management planning and conservation resources allocation. ...
USDA-ARS?s Scientific Manuscript database
Mixing models have been used to predict sediment source contributions. An inherent problem of mixing models has limited the number of sediment sources that can be considered. The objective of this study is to develop and evaluate a new method using Discriminant Function Analysis (DFA) to fingerprint sediment source contr...
Medical Student and Junior Doctors' Tolerance of Ambiguity: Development of a New Scale
ERIC Educational Resources Information Center
Hancock, Jason; Roberts, Martin; Monrouxe, Lynn; Mattick, Karen
2015-01-01
The practice of medicine involves inherent ambiguity, arising from limitations of knowledge, diagnostic problems, complexities of treatment and outcome and unpredictability of patient response. Research into doctors' tolerance of ambiguity is hampered by poor conceptual clarity and inadequate measurement scales. We aimed to create and pilot a…
Policies and Practices in the Bibliographic Control of United States Government Publications.
ERIC Educational Resources Information Center
Crowers, Clifford P., Ed.
1974-01-01
In an attempt to clarify the indexing and announcing controls for government documents, this issue of the Drexel Library Quarterly presents background information on several of the information controlling and access agencies, describes their operations, and points out their inherent problems and weaknesses. The agencies covered are the Government…
What Do You Find? Students Investigating Patterns in Pascal's Triangle
ERIC Educational Resources Information Center
Obara, Samuel
2012-01-01
In this paper, students used problem-solving skills to investigate what patterns exist in Pascal's triangle, incorporating technology using Geometer's Sketchpad (GSP) in the process. Students came up with patterns such as natural numbers, triangular numbers, and Fibonacci numbers. Although the patterns inherent in Pascal's triangle may seem…
Update on Research and Leadership, Fall 2001-Spring 2002.
ERIC Educational Resources Information Center
Barnett, Elisabeth, Ed.
2001-01-01
This issue of On Research and Leadership Update (v13 n1) focuses on the concerns surrounding dual enrollment and dual credit. "Dual Enrollment Programs: Assessing the American Dream," by Katherine Boswell, addresses the problems inherent in development of these programs when institutions fail to collaborate with one another in an effective way.…
ERIC Educational Resources Information Center
Barfield, Kenny D.
One of three related documents exploring the problems inherent in current high school forensic coaching, this paper explores the issue of risk in debate and how this risk can be reduced. The paper first examines how the 'risk of losing' affects coaches and debaters alike, noting that in providing adequate direction by helping to test the evidence…
DOE Office of Scientific and Technical Information (OSTI.GOV)
Smith, R.R.
1986-01-01
This report presents information on the Integral Fast Reactor and its role in the future. Information is presented in the areas of inherent safety, other virtues of the sodium-cooled breeder, and solving LWR fuel-cycle problems with IFR technologies. (JDB)
ERIC Educational Resources Information Center
Nielsen, Mathias Wullum
2016-01-01
Academic debates addressing the persistent gender gap in science reveal considerable contestation of the relevance and extent of the problem. Particular attention has been given to the question of whether women's high attrition rates should be ascribed to the structural and cultural barriers inherent to the academic system or instead…
Integrating Pharmacology Topics in High School Biology and Chemistry Classes Improves Performance
ERIC Educational Resources Information Center
Schwartz-Bloom, Rochelle D.; Halpin, Myra J.
2003-01-01
Although numerous programs have been developed for kindergarten through grade 12 science education, evaluation has been difficult owing to the inherent problems of conducting controlled experiments in the typical classroom. Using a rigorous experimental design, we developed and tested a novel program containing a series of pharmacology modules (e.g.,…
Shaping the Army’s Information Technology Acquisition Workforce in an Era of Outsourcing
2009-03-12
United States testified before the House of Representatives Armed Services Committee on this problem, stating, “Unless the federal government pays the... borderline inherently governmental.” 21 Whether or not the work should be done in-house had become a moot question. There weren’t enough skilled
ERIC Educational Resources Information Center
Fortuin, Karen P. J.; van Koppen, C. S. A.; Leemans, Rik
2011-01-01
Conceptual models are useful to environmental sciences curriculum and course developers, and to students, in facing the challenges before them. These challenges are inherent in the interdisciplinary and problem-oriented character of environmental sciences curricula. In this article, we review the merits of conceptual models in facing these challenges. These…
Issues Raised by the Follow Through Evaluation.
ERIC Educational Resources Information Center
House, Ernest R.; Hutchins, Elizabeth J.
This paper presents a discussion of issues raised in the evaluation of Project Follow Through reported by Abt Associates. The paper suggests that many of the problems inherent in the design of both the program and the evaluation stem from the underlying assumption that one educational model could be found which would best alleviate the educational…
Language and Culture in the Multiethnic Community: Spoken Language Assessment.
ERIC Educational Resources Information Center
Matluck, Joseph H.; Mace-Matluck, Betty J.
This paper discusses the sociolinguistic problems inherent in multilingual testing, and the accompanying dangers of cultural bias in either the visuals or the language used in a given test. The first section discusses English-speaking Americans' perception of foreign speakers in terms of: (1) physical features; (2) speech, specifically vocabulary,…
The Net Neutrality Debate: The Basics
ERIC Educational Resources Information Center
Greenfield, Rich
2006-01-01
Rich Greenfield examines the basics of today's net neutrality debate that is likely to be an ongoing issue for society. Greenfield states the problems inherent in the definition of "net neutrality" used by Common Cause: "Network neutrality is the principle that Internet users should be able to access any web content they choose and…
ERIC Educational Resources Information Center
Geber, Beverly
1993-01-01
There are inherent problems when unskilled or semiskilled workers are retrained for high-skill jobs that do not and will not exist. Although the consensus is that smarter workers will make the nation more competitive in the world market, the occupation that will add the most jobs by the year 2005 is retail clerk. (JOW)
Summary appraisals of the Nation's ground-water resources; Ohio region
Bloyd, Richard M.
1974-01-01
Rapid advance of techniques in ground-water hydrology during recent years has provided methods which the hydrologist can use for evaluating planned ground-water development. Therefore, the manager can resolve the inherent problems that historically have bred caution when this part of our total water resource was considered for development.
Unintended consequences of carbon enhancement in agricultural soils: The N2O problem
USDA-ARS?s Scientific Manuscript database
The potential of agricultural soils to accumulate C as a means of removing greenhouse gases (GHGs) from the atmosphere is complicated by the inherent coupling of the C and N cycles in soil. Practices that increase soil C content can have the unintended consequence of stimulating N mineralization, ni...
On the Validity of Educational Evaluation and Its Construction
ERIC Educational Resources Information Center
Huang, Xiaoping; Hu, Zhongfeng
2015-01-01
The main problem with the validity of educational evaluation is that it simply copies the conceptual framework of validity from educational measurement into its own conceptual system. A validity conceptual system that fits the needs of the theory and practice of educational evaluation has not yet been established. According to the inherent attributive…
Multiscale Reactive Molecular Dynamics
2012-08-15
biology cannot be described without considering electronic and nuclear-level dynamics and their coupling to slower, cooperative motions of the system. These inherently multiscale problems require computationally efficient and accurate methods to… condensed phase systems with computational efficiency orders of magnitude greater than currently possible with ab initio simulation methods, thus…
Integration Moves Backward in the 70s
ERIC Educational Resources Information Center
Coffin, Gregory C.
1973-01-01
Suggests that the failure of (1) school boards and superintendents to recognize the evil inherent in segregated schools -- both black and white -- and their lack of courage in dealing with the problem and (2) educators to recognize the subtle and not-so-subtle racial bias of their curriculum, curricular materials, personnel, and staffing practices…
American college of gastroenterology monograph on the management of irritable bowel syndrome.
Camilleri, Michael
2015-04-01
This editorial reviews a recently published guideline on management of irritable bowel syndrome. The guideline illustrates problems arising from the quality of clinical trials used in systematic reviews and the potential impact of the inherent weaknesses of those trials on rating the strength of evidence and the resulting recommendations.
Effects of risk attitudes on extended attack fire management decisionmaking
Donald G. MacGregor; Armando González-Cabán
2009-01-01
Fire management inherently involves the assessment and management of risk, and decision making under uncertainty. Although organizational standards and guides are an important determinant of how decision problems are structured and framed, decision makers may view risk-based decisions from a perspective that is unique to their background and experience. Previous...
Iron Deficiency in Preschool Children with Autistic Spectrum Disorders
ERIC Educational Resources Information Center
Bilgic, Ayhan; Gurkan, Kagan; Turkoglu, Serhat; Akca, Omer Faruk; Kilic, Birim Gunay; Uslu, Runa
2010-01-01
Iron deficiency (ID) causes negative outcomes on psychomotor and behavioral development of infants and young children. Children with autistic spectrum disorders (ASD) are under risk for ID and this condition may increase the severity of psychomotor and behavioral problems, some of which already inherently exist in these children. In the present…
ERIC Educational Resources Information Center
Shawer, Saad
2013-01-01
This paper examines why communicative language teaching (CLT) fails to improve student learning in certain contexts by assessing two adult educators' communicative and noncommunicative practices through qualitative case studies, interviews, and participant observations. Results show no inherent CLT problems that prevent teachers from grasping…
The Integration of Multimedia and Field Experience.
ERIC Educational Resources Information Center
Dawson, George
A professor of science education at Florida State University shares his experiences with the growth of the field of environmental education and the problems inherent in trying to teach formal environmental education outdoors. Although field experience is best, it must be limited in most situations since logistics get in the way. Technology can…
ERIC Educational Resources Information Center
Horner, Beth
1984-01-01
Discusses aspects inherent in maintaining a library position while developing a separate freelance career as exemplified by personal experiences as a children's librarian and freelance storyteller. Potential problems (fatigue, clear boundaries, scheduling) and advantages for the individual (financial security, professional contacts) and the…
[Feminists' approach to population problems: new paradigm or Utopia?].
Kono, S
1997-05-01
The author first notes that, partly because of events occurring at the International Conference on Population and Development that took place in Cairo in 1994, a consensus has emerged that population programs based on a philosophy of empowering women and focusing on reproductive health are more likely to be effective than programs that focus on providing family planning services and achieving demographic targets. Some reservations about this consensus are then expressed. The author points out the difficulties inherent in widening the mandate of family planning programs in an era of diminished resources for international assistance, the past success of such programs in reducing fertility with limited resources, and the inherent contradictions in adopting a laissez-faire attitude toward reproduction in regions such as Sub-Saharan Africa, where economies and political systems are often in crisis, health services are minimal, and desired levels of fertility are both way above current levels and far above the replacement level. While not challenging the value of the Cairo philosophy, the author stresses the need to move from rhetoric to reality in the face of the world's current population problems.
Shakeri, Heman; Sahneh, Faryad Darabi; Scoglio, Caterina; Poggi-Corradini, Pietro; Preciado, Victor M
2015-06-01
Launching a prevention campaign to contain the spread of infection requires substantial financial investment; therefore, a trade-off exists between suppressing the epidemic and containing costs. Information exchange among individuals can occur through physical contacts (e.g., word of mouth, gatherings), which carry an inherent possibility of disease transmission, and through non-physical contacts (e.g., email, social networks), through which information can be transmitted but infection cannot. The contact network (CN) incorporates physical contacts, and the information dissemination network (IDN) represents non-physical contacts, generating a multilayer network structure. Inherent differences between these two layers cause alerting through the CN to be more effective but more expensive than alerting through the IDN. The constraint for the epidemic to die out is derived from a nonlinear Perron-Frobenius problem, which is transformed into a semidefinite matrix inequality and serves as a constraint in a convex optimization problem. This method guarantees a dying-out epidemic by choosing the best nodes for adopting preventive behaviors with minimum monetary resources. Various numerical simulations with network models and a real-world social network validate our method.
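The reduction at the heart of the method, turning a spectral die-out condition into a semidefinite constraint inside a budget-minimization problem, can be sketched for a single symmetric contact layer as below. The two-layer coupling and the nonlinear Perron-Frobenius analysis of the paper are not reproduced, and all rates, costs, and the network itself are assumed for illustration.

```python
import numpy as np
import cvxpy as cp

rng = np.random.default_rng(2)
n = 20
A = np.triu((rng.random((n, n)) < 0.2).astype(float), 1)
A = A + A.T                                  # symmetric contact network (assumed)

beta, delta0, eps = 0.3, 0.2, 1e-3           # infection/curing rates (assumed)
cost = rng.uniform(1.0, 3.0, n)              # per-node cost of preventive behavior

x = cp.Variable(n, nonneg=True)              # extra curing rate bought per node
# Die-out condition: all eigenvalues of beta*A - diag(delta) below -eps,
# written as a linear matrix inequality.
die_out = beta*A - cp.diag(delta0 + x) << -eps*np.eye(n)
prob = cp.Problem(cp.Minimize(cost @ x), [die_out])
prob.solve()
print("minimum budget:", round(prob.value, 3))
```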
Introducing soft systems methodology plus (SSM+): why we need it and what it can contribute.
Braithwaite, Jeffrey; Hindle, Don; Iedema, Rick; Westbrook, Johanna I
2002-01-01
There are many complicated and seemingly intractable problems in the health care sector. Past ways to address them have involved political responses, economic restructuring, biomedical and scientific studies, and managerialist or business-oriented tools. Few methods have enabled us to develop a systematic response to problems. Our version of soft systems methodology, SSM+, seems to improve problem solving processes by providing an iterative, staged framework that emphasises collaborative learning and systems redesign involving both technical and cultural fixes.
Sinc-Galerkin estimation of diffusivity in parabolic problems
NASA Technical Reports Server (NTRS)
Smith, Ralph C.; Bowers, Kenneth L.
1991-01-01
A fully Sinc-Galerkin method for the numerical recovery of spatially varying diffusion coefficients in linear partial differential equations is presented. Because the parameter recovery problems are inherently ill-posed, an output error criterion in conjunction with Tikhonov regularization is used to formulate them as infinite-dimensional minimization problems. The forward problems are discretized with a sinc basis in both the spatial and temporal domains, thus yielding an approximate solution which displays an exponential convergence rate and is valid on the infinite time interval. The minimization problems are then solved via a quasi-Newton/trust region algorithm. The L-curve technique for determining an approximate value of the regularization parameter is briefly discussed, and numerical examples are given which show the applicability of the method both for problems with noise-free data and for problems whose data contain white noise.
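As a minimal illustration of the regularization step only (not the Sinc-Galerkin discretization), the sketch below solves a generic discrete ill-posed problem with Tikhonov regularization and records the residual and solution norms whose log-log plot forms the L-curve. The operator, data, and noise level are made-up stand-ins.

```python
# Tikhonov regularization with an L-curve scan over lambda.
import numpy as np

def tikhonov(A, b, lam):
    """Solve min ||A q - b||^2 + lam^2 ||q||^2 via the normal equations."""
    n = A.shape[1]
    return np.linalg.solve(A.T @ A + lam**2 * np.eye(n), A.T @ b)

def l_curve(A, b, lams):
    """Return (residual norm, solution norm) pairs, one per lambda."""
    pts = []
    for lam in lams:
        q = tikhonov(A, b, lam)
        pts.append((np.linalg.norm(A @ q - b), np.linalg.norm(q)))
    return pts

rng = np.random.default_rng(1)
A = rng.standard_normal((60, 30)) / 10          # mildly ill-conditioned toy operator
q_true = np.ones(30)
b = A @ q_true + 0.01 * rng.standard_normal(60) # noisy data
lams = np.logspace(-4, 1, 6)
for lam, (r, s) in zip(lams, l_curve(A, b, lams)):
    print(f"lambda={lam:.1e}  residual={r:.3e}  ||q||={s:.3e}")
```

The regularization parameter is then taken near the "corner" of this curve, balancing the residual against the solution norm.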
Designing an IMAC system using TeraNet
NASA Astrophysics Data System (ADS)
Mun, In K.; Hilal, S. K.; Andrews, M. C.; Gidron, Rafael
1992-07-01
Even though considerable progress has been made in communication technology, one of the more difficult problems in installing a comprehensive, clinically effective Image Management and Communication (IMAC) system for a hospital is communication. Most existing systems are based on Ethernet or Token Ring networks. Some of the newer systems are being installed using FDDI. All these systems have inherent problems such as limited communication speed, poor control of bandwidth usage, and/or poor performance under heavy traffic. In order to overcome these difficulties, we are designing a complete IMAC system based on a novel network known as TeraNet, being developed at the Center for Telecommunications Research, Columbia University.
NASA flight cell and battery issues
NASA Technical Reports Server (NTRS)
Schulze, N. R.
1989-01-01
The author presents the important battery and cell problems, encompassing both test failures and accidents, which were encountered during the past year. Practical issues facing programs, which have to be considered in the development of a battery program strategy, are addressed. The problems encountered by one program, the Gamma Ray Observatory (GRO), during the past year are highlighted to illustrate the fundamental types of battery problems that occur. Problems encountered by other programs are briefly mentioned to complete the accounting. Two major categories of issues are defined: those which are quality and design related, i.e., problems having inherent manufacturing-process-related aspects with an impact on cell reliability, and those which are accident-triggered or man-induced, i.e., operational issues having an impact on battery and cell reliability.
NASA Astrophysics Data System (ADS)
Torres-Arredondo, M.-A.; Sierra-Pérez, Julián; Cabanes, Guénaël
2016-05-01
The process of measuring and analysing the data from a distributed sensor network all over a structural system in order to quantify its condition is known as structural health monitoring (SHM). For the design of a trustworthy health monitoring system, a vast amount of information regarding the inherent physical characteristics of the sources and their propagation and interaction across the structure is crucial. Moreover, any SHM system which is expected to transition to field operation must take into account the influence of environmental and operational changes which cause modifications in the stiffness and damping of the structure and consequently modify its dynamic behaviour. On that account, special attention is paid in this paper to the development of an efficient SHM methodology where robust signal processing and pattern recognition techniques are integrated for the correct interpretation of complex ultrasonic waves within the context of damage detection and identification. The methodology is based on an acousto-ultrasonics technique where the discrete wavelet transform is evaluated for feature extraction and selection, linear principal component analysis for data-driven modelling and self-organising maps for a two-level clustering under the principle of local density. At the end, the methodology is experimentally demonstrated and results show that all the damages were detectable and identifiable.
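A compressed sketch of that processing chain appears below: wavelet sub-band energies as features, linear PCA for data-driven modelling, then clustering. K-means stands in for the paper's self-organising maps, the signals are synthetic, and PyWavelets and scikit-learn are assumed available.

```python
# Wavelet features -> PCA -> clustering, as a stand-in for the SHM pipeline.
import numpy as np
import pywt
from sklearn.decomposition import PCA
from sklearn.cluster import KMeans

def wavelet_energy_features(signal, wavelet="db4", level=4):
    """Relative energy of each wavelet sub-band as a feature vector."""
    coeffs = pywt.wavedec(signal, wavelet, level=level)
    e = np.array([np.sum(c**2) for c in coeffs])
    return e / e.sum()

rng = np.random.default_rng(2)
t = np.linspace(0, 1, 1024)
healthy = [np.sin(2*np.pi*50*t) + 0.1*rng.standard_normal(t.size)
           for _ in range(20)]
damaged = [np.sin(2*np.pi*50*t) + np.sin(2*np.pi*180*t)
           + 0.1*rng.standard_normal(t.size) for _ in range(20)]
X = np.array([wavelet_energy_features(s) for s in healthy + damaged])

Z = PCA(n_components=2).fit_transform(X)           # data-driven model
labels = KMeans(n_clusters=2, n_init=10).fit_predict(Z)
print(labels)                                      # healthy vs. damaged clusters
```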
Research in assessment: consensus statement and recommendations from the Ottawa 2010 Conference.
Schuwirth, Lambert; Colliver, Jerry; Gruppen, Larry; Kreiter, Clarence; Mennin, Stewart; Onishi, Hirotaka; Pangaro, Louis; Ringsted, Charlotte; Swanson, David; Van Der Vleuten, Cees; Wagner-Menghin, Michaela
2011-01-01
Medical education research in general is a young scientific discipline which is still finding its own position among the scientific disciplines. It is rooted in both the biomedical sciences and the social sciences, each with their own scientific language. A more unique feature of medical education (and assessment) research is that it has to be both locally and internationally relevant. This is not always easy and sometimes leads to purely idiographic descriptions of an assessment procedure with insufficient general lessons or generalised scientific knowledge being generated, or vice versa. For medical educational research, a plethora of methodologies is available to cater to many different research questions. This article contains consensus positions and suggestions on various elements of medical education (assessment) research. Overarching is the position that without a good theoretical underpinning and good knowledge of the existing literature, good research and sound conclusions are impossible to produce, and that there is no inherently superior methodology, but that the best methodology is the one most suited to answer the research question unambiguously. Although the positions should not be perceived as dogmas, they should be taken as very serious recommendations. Topics covered are: types of research, theoretical frameworks, designs and methodologies, instrument properties or psychometrics, costs/acceptability, ethics, infrastructure and support.
Shen, Chun; Hu, Yan; Li, Fei
2018-04-16
We have read Shadmani et al.'s comments with appreciation for their interest in our study [1]. They pointed out three methodological issues. The first one is the inherent limitation of cross-sectional studies. We absolutely agree with them that it is not possible to establish a true cause-and-effect relationship in cross-sectional studies. That is why we stated "a cross-sectional study" in the title, never used confusing terms such as "predictor" or "risk factor" in the paper, and discussed this limitation in the Discussion. However, cross-sectional studies with large sample sizes are helpful to identify risk factors of health-related status and are widely used in epidemiological studies. This article is protected by copyright. All rights reserved.
Fundamental Algorithms of the Goddard Battery Model
NASA Technical Reports Server (NTRS)
Jagielski, J. M.
1985-01-01
The Goddard Space Flight Center (GSFC) is currently producing a computer model to predict Nickel Cadmium (NiCd) performance in a Low Earth Orbit (LEO) cycling regime. The model proper is currently still in development, but the inherent, fundamental algorithms (or methodologies) of the model are defined. At present, the model is closely dependent on empirical data, and the data base currently used is of questionable accuracy. Even so, very good correlations have been determined between model predictions and actual cycling data. A more accurate and encompassing data base has been generated to serve dual functions: to show the limitations of the current data base, and to be embedded in the model proper for more accurate predictions. The fundamental algorithms of the model, and the present data base and its limitations, are described, and a brief preliminary analysis of the new data base and its verification of the model's methodology are presented.
ERPs and Psychopathology. I. Behavioral process issues.
Roth, W T; Tecce, J J; Pfefferbaum, A; Rosenbloom, M; Callaway, E
1984-01-01
The clinical study of ERPs has an inherent defect--a self-selection of clinical populations that hampers equating of clinically defined groups on factors extraneous to the independent variables. Such ex post facto studies increase the likelihood of confounding variables in the interpretation of findings. Hence, the development of lawful relationships between clinical variables and ERPs is impeded and the fulfillment of description, explanation, prediction, and control in brain science is thwarted. Proper methodologies and theory development can increase the likelihood of establishing these lawful relationships. One methodology of potential value in the clinical application of ERPs, particularly in studies of aging, is that of divided attention. Two promising theoretical developments in the understanding of brain functioning and aging are the distraction-arousal hypothesis and the controlled-automatic attention model. The evaluation of ERPs in the study of brain-behavior relations in clinical populations might be facilitated by the differentiation of concurrent, predictive, content, and construct validities.
NASA Technical Reports Server (NTRS)
Palosz, B.; Grzanka, E.; Stelmakh, S.; Gierlotka, S.; Weber, H.-P.; Proffen, T.; Palosz, W.
2002-01-01
The real atomic structure of nanocrystals determines unique, key properties of the materials. Determination of the structure presents a challenge due to inherent limitations of standard powder diffraction techniques when applied to nanocrystals. An alternate methodology for the structural analysis of nanocrystals (several nanometers in size), based on Bragg-like scattering and called the "apparent lattice parameter" (alp) method, is proposed. Application of the alp methodology to examination of the core-shell model of nanocrystals will be presented. The results of application of the alp method to structural analysis of several nanopowders were complemented by those obtained by determination of the Atomic Pair Distribution Function, PDF. Based on synchrotron and neutron diffraction data measured over a large diffraction-vector range of up to Q = 25 Å⁻¹, the surface stresses in nanocrystalline diamond and SiC were evaluated.
NASA Astrophysics Data System (ADS)
Welty, N.; Rudolph, M.; Schäfer, F.; Apeldoorn, J.; Janovsky, R.
2013-07-01
This paper presents a computational methodology to predict the satellite system-level effects resulting from impacts of untrackable space debris particles. This approach seeks to improve on traditional risk assessment practices by looking beyond the structural penetration of the satellite and predicting the physical damage to internal components and the associated functional impairment caused by untrackable debris impacts. The proposed method combines a debris flux model with the Schäfer-Ryan-Lambert ballistic limit equation (BLE), which accounts for the inherent shielding of components positioned behind the spacecraft structure wall. Individual debris particle impact trajectories and component shadowing effects are considered and the failure probabilities of individual satellite components as a function of mission time are calculated. These results are correlated to expected functional impairment using a Boolean logic model of the system functional architecture considering the functional dependencies and redundancies within the system.
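A minimal sketch of the probabilistic combination step follows: per-component penetration counts from an assumed debris flux are treated as Poisson, the structure-wall shielding enters as a crude multiplicative factor in place of the Schäfer-Ryan-Lambert BLE, and a small Boolean expression encodes one redundancy. Every number and component name is illustrative.

```python
# Component failure probabilities from debris flux, combined by a Boolean
# functional model (one single-point failure plus one redundant pair).
import numpy as np

def p_fail(flux_per_m2_yr, area_m2, years, shielding_factor):
    """Probability of at least one penetrating impact, assuming Poisson counts."""
    n_expected = flux_per_m2_yr * area_m2 * years * shielding_factor
    return 1.0 - np.exp(-n_expected)

# hypothetical components over a 7-year mission
p_tank   = p_fail(2e-4, 1.5, 7, 0.30)   # shielding factor would come from a BLE
p_gyro_a = p_fail(2e-4, 0.2, 7, 0.10)
p_gyro_b = p_fail(2e-4, 0.2, 7, 0.10)

# mission lost if the tank fails OR both redundant gyros fail
p_system = 1 - (1 - p_tank) * (1 - p_gyro_a * p_gyro_b)
print(f"system failure probability: {p_system:.2e}")
```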
Emerging and recurrent issues in drug development.
Anello, C
This paper reviews several emerging and recurrent issues relating to the drug development process. These emerging issues include changes to the FDA regulatory environment, internationalization of drug development, advances in computer technology and visualization tools, and efforts to incorporate meta-analysis methodology. Recurrent issues include: renewed interest in statistical methods for handling subgroups in the design and analysis of clinical trials; renewed interest in alternatives to the 'intention-to-treat' analysis in the presence of non-compliance in randomized clinical trials; renewed interest in methodology to address the multiplicities resulting from a variety of sources inherent in the drug development process, and renewed interest in methods to assure data integrity. These emerging and recurrent issues provide a continuing challenge to the international community of statisticians involved in drug development. Moreover, the involvement of statisticians with different perspectives continues to enrich the field and contributes to improvement in the public health.
A facile route to ketene-functionalized polymers for general materials applications
NASA Astrophysics Data System (ADS)
Leibfarth, Frank A.; Kang, Minhyuk; Ham, Myungsoo; Kim, Joohee; Campos, Luis M.; Gupta, Nalini; Moon, Bongjin; Hawker, Craig J.
2010-03-01
Function matters in materials science, and methodologies that provide paths to multiple functionality in a single step are to be prized. Therefore, we introduce a robust and efficient strategy for exploiting the versatile reactivity of ketenes in polymer chemistry. New monomers for both radical and ring-opening metathesis polymerization have been developed, which take advantage of Meldrum's acid as both a synthetic building block and a thermolytic precursor to dialkyl ketenes. The ketene-functionalized polymers are directly detected by their characteristic infrared absorption and are found to be stable under ambient conditions. The inherent ability of ketenes to provide crosslinking via dimerization and to act as reactive chemical handles via addition, provides simple methodology for application in complex materials challenges. Such versatile characteristics are illustrated by covalently attaching and patterning a dye through microcontact printing. The strategy highlights the significant opportunities afforded by the traditionally neglected ketene functional group in polymer chemistry.
Cuthbertson, Philip; Lauder, William; Steele, Rebekah; Cleary, Sonja; Bradshaw, Julie
2004-07-01
This study reports a comparative survey of mature students undertaking pre-registration undergraduate nursing education in Australia and Scotland. The study aimed to explore comparisons between the course-related problems and course-related financial difficulties faced by mature students in two very different educational and funding systems. Financial stress is a predictor of both physical and mental health problems. A similar pattern of course-related problems was reported by both Australian and Scottish students, with the exception that Scottish students experienced more problems with childcare and caring for elderly relatives. Course-related problems may be inherent in the nature of undergraduate nursing education, although the relatively time-intensive nature of Scottish curricula may explain the childcare and elderly relatives difference. Scottish students reported higher overall financial-related problems but Australian students reported more problems with funding placements. These findings have implications for both curriculum designers and policy makers.
Deb, Kalyanmoy; Sinha, Ankur
2010-01-01
Bilevel optimization problems involve two optimization tasks (upper and lower level), in which every feasible upper level solution must correspond to an optimal solution to a lower level optimization problem. These problems commonly appear in many practical problem solving tasks including optimal control, process optimization, game-playing strategy developments, transportation problems, and others. However, they are commonly converted into a single level optimization problem by using an approximate solution procedure to replace the lower level optimization task. Although there exist a number of theoretical, numerical, and evolutionary optimization studies involving single-objective bilevel programming problems, not many studies look at the context of multiple conflicting objectives in each level of a bilevel programming problem. In this paper, we address certain intricate issues related to solving multi-objective bilevel programming problems, present challenging test problems, and propose a viable and hybrid evolutionary-cum-local-search based algorithm as a solution methodology. The hybrid approach performs better than a number of existing methodologies and scales well up to 40-variable difficult test problems used in this study. The population sizing and termination criteria are made self-adaptive, so that no additional parameters need to be supplied by the user. The study indicates a clear niche of evolutionary algorithms in solving such difficult problems of practical importance compared to their usual solution by a computationally expensive nested procedure. The study opens up many issues related to multi-objective bilevel programming and hopefully this study will motivate EMO and other researchers to pay more attention to this important and difficult problem solving activity.
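The "computationally expensive nested procedure" the study benchmarks against can be shown in a few lines: for every upper-level candidate, the lower-level problem is solved to optimality before the upper objective is evaluated. The toy single-objective instance below is an assumption for illustration, not one of the paper's test problems.

```python
# Nested bilevel solve on a toy problem: the inner argmin is computed at
# every upper-level function evaluation, which is what makes nesting costly.
from scipy.optimize import minimize_scalar, minimize

def lower_level(x):
    """y*(x) = argmin_y (y - x)^2 + 0.1*y^2 (toy lower-level task)."""
    return minimize_scalar(lambda y: (y - x)**2 + 0.1 * y**2).x

def upper_objective(xv):
    x = xv[0]
    y = lower_level(x)                  # inner solve at every evaluation
    return (x - 3.0)**2 + (y - 2.0)**2

res = minimize(upper_objective, x0=[0.0], method="Nelder-Mead")
print("x* =", res.x[0], " y*(x*) =", lower_level(res.x[0]))
```

The hybrid evolutionary-cum-local-search approach in the paper exists precisely to avoid paying this inner-solve cost at every candidate.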
The bedrock electrical conductivity map of the UK
NASA Astrophysics Data System (ADS)
Beamish, David
2013-09-01
Airborne electromagnetic (AEM) surveys, when regionally extensive, may sample a wide-range of geological formations. The majority of AEM surveys can provide estimates of apparent (half-space) conductivity and such derived data provide a mapping capability. Depth discrimination of the geophysical mapping information is controlled by the bandwidth of each particular system. The objective of this study is to assess the geological information contained in accumulated frequency-domain AEM survey data from the UK where existing geological mapping can be considered well-established. The methodology adopted involves a simple GIS-based, spatial join of AEM and geological databases. A lithology-based classification of bedrock is used to provide an inherent association with the petrophysical rock parameters controlling bulk conductivity. At a scale of 1:625k, the UK digital bedrock geological lexicon comprises just 86 lithological classifications compared with 244 standard lithostratigraphic assignments. The lowest common AEM survey frequency of 3 kHz is found to provide an 87% coverage (by area) of the UK formations. The conductivities of the unsampled classes have been assigned on the basis of inherent lithological associations between formations. The statistical analysis conducted uses over 8 M conductivity estimates and provides a new UK national scale digital map of near-surface bedrock conductivity. The new baseline map, formed from central moments of the statistical distributions, allows assessments/interpretations of data exhibiting departures from the norm. The digital conductivity map developed here is believed to be the first such UK geophysical map compilation for over 75 years. The methodology described can also be applied to many existing AEM data sets.
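Once the spatial join has attached a lithology class to each conductivity estimate, the baseline map reduces to per-class central moments. A minimal sketch of that aggregation step, on synthetic data with invented class names and values, follows.

```python
# Per-lithology conductivity statistics after a (hypothetical) spatial join.
import numpy as np
import pandas as pd

rng = np.random.default_rng(3)
classes = rng.choice(["mudstone", "sandstone", "granite"], size=10_000)
base = {"mudstone": 50.0, "sandstone": 10.0, "granite": 1.0}  # mS/m, illustrative
cond = np.array([base[c] for c in classes]) * rng.lognormal(0, 0.5, 10_000)

df = pd.DataFrame({"lith_class": classes, "conductivity_mS_per_m": cond})
baseline = df.groupby("lith_class")["conductivity_mS_per_m"].agg(
    ["count", "median", "mean", "std"])
print(baseline)   # central moments per class form the baseline map values
```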
Opfer, Roland; Suppa, Per; Kepp, Timo; Spies, Lothar; Schippling, Sven; Huppertz, Hans-Jürgen
2016-05-01
Fully-automated regional brain volumetry based on structural magnetic resonance imaging (MRI) plays an important role in quantitative neuroimaging. In clinical trials as well as in clinical routine multiple MRIs of individual patients at different time points need to be assessed longitudinally. Measures of inter- and intrascanner variability are crucial to understand the intrinsic variability of the method and to distinguish volume changes due to biological or physiological effects from inherent noise of the methodology. To measure regional brain volumes an atlas based volumetry (ABV) approach was deployed using a highly elastic registration framework and an anatomical atlas in a well-defined template space. We assessed inter- and intrascanner variability of the method in 51 cognitively normal subjects and 27 Alzheimer dementia (AD) patients from the Alzheimer's Disease Neuroimaging Initiative by studying volumetric results of repeated scans for 17 compartments and brain regions. Median percentage volume differences of scan-rescans from the same scanner ranged from 0.24% (whole brain parenchyma in healthy subjects) to 1.73% (occipital lobe white matter in AD), with generally higher differences in AD patients as compared to normal subjects (e.g., 1.01% vs. 0.78% for the hippocampus). Minimum percentage volume differences detectable with an error probability of 5% were in the one-digit percentage range for almost all structures investigated, with most of them being below 5%. Intrascanner variability was independent of magnetic field strength. The median interscanner variability was up to ten times higher than the intrascanner variability. Copyright © 2016 Elsevier Inc. All rights reserved.
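A minimal sketch of the scan-rescan statistics follows: median percentage volume differences, plus one common approximation of the smallest change detectable at a 5% error probability. The volumes, noise level, and normality assumption are all illustrative.

```python
# Scan-rescan variability and minimum detectable change (rough approximation).
import numpy as np

rng = np.random.default_rng(4)
scan   = rng.normal(3500, 300, 50)                # e.g. hippocampus volumes, mm^3
rescan = scan * (1 + rng.normal(0, 0.005, 50))    # ~0.5% rescan noise, assumed

pct_diff = 200 * np.abs(scan - rescan) / (scan + rescan)
print(f"median scan-rescan difference: {np.median(pct_diff):.2f}%")

# two-sided alpha = 0.05, assuming approximately normal differences
mdc = 1.96 * np.sqrt(2) * np.std(pct_diff, ddof=1)
print(f"minimum detectable change: {mdc:.2f}%")
```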
William H. Cooke; Dennis M. Jacobs
2002-01-01
FIA annual inventories require rapid updating of pixel-based Phase 1 estimates. Scientists at the Southern Research Station are developing an automated methodology that uses a Normalized Difference Vegetation Index (NDVI) for identifying and eliminating problem FIA plots from the analysis. Problem plots are those that have questionable land use/land cover information…
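The NDVI screen itself is short: compute (NIR - red) / (NIR + red) per plot and flag plots whose recorded cover class disagrees with the index. The reflectances, labels, and the 0.4 threshold below are invented for illustration.

```python
# NDVI-based flagging of plots with questionable land use/land cover labels.
import numpy as np

def ndvi(red, nir):
    return (nir - red) / (nir + red + 1e-9)   # small epsilon avoids 0/0

red = np.array([0.08, 0.30, 0.10])            # hypothetical band reflectances
nir = np.array([0.50, 0.32, 0.45])
labels = np.array(["forest", "forest", "forest"])  # recorded cover class

v = ndvi(red, nir)
suspect = (labels == "forest") & (v < 0.4)    # forest plots should be green
print(v.round(2), "problem plots:", np.where(suspect)[0])
```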
Capturing security requirements for software systems.
El-Hadary, Hassan; El-Kassas, Sherif
2014-07-01
Security is often an afterthought during software development. Realizing security early, especially in the requirement phase, is important so that security problems can be tackled early enough before going further in the process and avoid rework. A more effective approach for security requirement engineering is needed to provide a more systematic way for eliciting adequate security requirements. This paper proposes a methodology for security requirement elicitation based on problem frames. The methodology aims at early integration of security with software development. The main goal of the methodology is to assist developers elicit adequate security requirements in a more systematic way during the requirement engineering process. A security catalog, based on the problem frames, is constructed in order to help identifying security requirements with the aid of previous security knowledge. Abuse frames are used to model threats while security problem frames are used to model security requirements. We have made use of evaluation criteria to evaluate the resulting security requirements concentrating on conflicts identification among requirements. We have shown that more complete security requirements can be elicited by such methodology in addition to the assistance offered to developers to elicit security requirements in a more systematic way.
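As a loose illustration of how a problem-frames-style security catalog could assist elicitation, the sketch below maps frame types to candidate security requirements and runs a toy pairwise conflict check; the catalog entries and conflict table are invented, not the authors'.

```python
# Toy security catalog keyed by problem-frame type, with a conflict check.
CATALOG = {
    "information_display": ["confidentiality of displayed data"],
    "commanded_behaviour": ["authentication of the operator",
                            "authorization of each command"],
    "model_building":      ["integrity of stored model data"],
}

# hypothetical pairs of requirements known to conflict
CONFLICTS = {("availability of service", "authorization of each command")}

def elicit(frames):
    """Collect candidate requirements for the given frames and flag conflicts."""
    reqs = [r for f in frames for r in CATALOG.get(f, [])]
    clashes = [(a, b) for a in reqs for b in reqs if (a, b) in CONFLICTS]
    return reqs, clashes

reqs, clashes = elicit(["commanded_behaviour", "information_display"])
print("requirements:", reqs)
print("conflicts:", clashes)
```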
Roca, Judith; Reguant, Mercedes; Canet, Olga
2016-11-01
Teaching strategies are essential in order to facilitate meaningful learning and the development of high-level thinking skills in students. To compare three teaching methodologies (problem-based learning, case-based teaching and traditional methods) in terms of the learning outcomes achieved by nursing students. This quasi-experimental research was carried out in the Nursing Degree programme in a group of 74 students who explored the subject of The Oncology Patient through the aforementioned strategies. A performance test was applied based on Bloom's Revised Taxonomy. A significant correlation was found between the intragroup theoretical and theoretical-practical dimensions. Likewise, intergroup differences were related to each teaching methodology. Hence, significant differences were estimated between the traditional methodology (mean = 9.13), case-based teaching (mean = 12.96) and problem-based learning (mean = 14.84). Problem-based learning was shown to be the most successful learning method, followed by case-based teaching and the traditional methodology. Copyright © 2016 Elsevier Ltd. All rights reserved.
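The abstract reports only group means. Assuming the raw scores were compared with something like a one-way ANOVA (an assumption; the test used is not stated here), the comparison looks like the sketch below, with simulated scores centred on the reported means.

```python
# One-way ANOVA over three simulated groups centred on the reported means.
import numpy as np
from scipy.stats import f_oneway

rng = np.random.default_rng(5)
traditional = rng.normal( 9.13, 2.0, 25)   # group sizes and SDs are assumptions
case_based  = rng.normal(12.96, 2.0, 25)
pbl         = rng.normal(14.84, 2.0, 24)

stat, p = f_oneway(traditional, case_based, pbl)
print(f"F = {stat:.2f}, p = {p:.4f}")
```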
Modelling of the rotational moulding process for the manufacture of plastic products
NASA Astrophysics Data System (ADS)
Khoon, Lim Kok
The present research is mainly focused on two-dimensional non-linear thermal modelling, numerical procedures and software development for the rotational moulding process. The RotoFEM program is developed for the rotational moulding process using finite element procedures. The program is written in the MATLAB environment. The research includes the development of new slip flow models, phase change study, warpage study and process analyses. A new slip flow methodology is derived for the heat transfer problem inside the enclosed rotating mould during the heating stage of the tumbling powder. The methodology enables the discontinuous powder to be modelled by the continuous-based finite element method. The Galerkin Finite Element Method is incorporated with the lumped-parameter system and the coincident node technique in finding the multi-interacting heat transfer solutions inside the mould. Two slip flow models arise from the slip flow methodology; they are SDM (single-layered deposition method) and MDM (multi-layered deposition method). These two models have differences in their thermal description for the internal air energy balance and the computational procedure for the deposition of the molten polymer. The SDM model assumes the macroscopic deposition of the molten polymer bed exists only between the bed and the inner mould surface. On the other hand, the MDM model allows the layer-by-layer deposition of the molten polymer bed macroscopically. In addition, the latter has a more detailed heat transfer description for the internal air inside the mould during the powder heating cycle. In slip flow models, the semi-implicit approach has been introduced to solve the final quasi-equilibrium internal air temperature during the heating cycle. A notable feature of this slip flow methodology is that the slip flow models are capable of producing good results for the internal air at the powder heating stage, without the consideration of the powder movement and changeable powder mass. This makes the modelling of the rotational moulding process much simpler. In the simulation of the cooling stage in rotational moulding, the thermal aspects of the inherent warpage problem and the external-internal cooling method have been explored. The predicted internal air temperature profiles have shown that the less apparent crystallization plateau in the experimental internal air in practice could be related to warpage. Various phase change algorithms have been reviewed and compared, and thus the most convenient and effective algorithm is proposed. The dimensional analysis method, expressed by means of dimensionless combinations of physical, boundary, and time variables, is utilized to study the dependence of the key thermal parameters on the processing times of rotational moulding. Lastly, the predicted results have been compared with the experimental results from two different external resources. The predicted temperature profiles of the internal air, oven times and other process conditions are consistent with the available data.
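The semi-implicit idea mentioned for the internal-air energy balance can be shown on a lumped one-node model: taking the exchange term at the new time level keeps the update stable for large steps. The sketch below shows that idea only; the single-node model and coefficients are assumptions, not the thesis's finite element formulation.

```python
# Semi-implicit update for a lumped internal-air temperature:
# (T^{n+1} - T^n)/dt = k (T_wall^n - T^{n+1})  =>  closed-form T^{n+1}.
import numpy as np

def internal_air_history(T_wall, T0=20.0, hA_over_mc=0.05, dt=1.0):
    """March the internal air temperature against a prescribed wall history."""
    k = hA_over_mc
    T = np.empty(len(T_wall) + 1)
    T[0] = T0
    for n, Tw in enumerate(T_wall):
        T[n + 1] = (T[n] + dt * k * Tw) / (1.0 + dt * k)  # implicit exchange term
    return T

oven = np.full(300, 250.0)          # mould wall held at oven temperature, assumed
print(internal_air_history(oven)[::60].round(1))
```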
NASA Technical Reports Server (NTRS)
Banks, H. T.; Brown, D. E.; Metcalf, Vern L.; Silcox, R. J.; Smith, Ralph C.; Wang, Yun
1994-01-01
A problem of continued interest concerns the control of vibrations in a flexible structure and the related problem of reducing structure-borne noise in structural acoustic systems. In both cases, piezoceramic patches bonded to the structures have been successfully used as control actuators. Through the application of a controlling voltage, the patches can be used to reduce structural vibrations which in turn lead to methods for reducing structure-borne noise. A PDE-based methodology for modeling, estimating physical parameters, and implementing a feedback control scheme for problems of this type is discussed. While the illustrating example is a circular plate, the methodology is sufficiently general so as to be applicable in a variety of structural and structural acoustic systems.
Problems related to the integration of fault tolerant aircraft electronic systems
NASA Technical Reports Server (NTRS)
Bannister, J. A.; Adlakha, V.; Trivedi, K.; Alspaugh, T. A., Jr.
1982-01-01
Problems related to the design of the hardware for an integrated aircraft electronic system are considered. Taxonomies of concurrent systems are reviewed and a new taxonomy is proposed. An informal methodology intended to identify feasible regions of the taxonomic design space is described. Specific tools are recommended for use in the methodology. Based on the methodology, a preliminary strawman integrated fault tolerant aircraft electronic system is proposed. Next, problems related to the programming and control of integrated aircraft electronic systems are discussed. Issues of system resource management, including the scheduling and allocation of real time periodic tasks in a multiprocessor environment, are treated in detail. The role of software design in integrated fault tolerant aircraft electronic systems is discussed. Conclusions and recommendations for further work are included.
Development/Modernization of an Advanced Non-Light Water Reactor Probabilistic Risk Assessment
DOE Office of Scientific and Technical Information (OSTI.GOV)
Henneke, Dennis W.; Robinson, James
In 2015, GE Hitachi Nuclear Energy (GEH) teamed with Argonne National Laboratory (Argonne) to perform Research and Development (R&D) of next-generation Probabilistic Risk Assessment (PRA) methodologies for the modernization of an advanced non-Light Water Reactor (non-LWR) PRA. This effort built upon a PRA developed in the early 1990s for GEH's Power Reactor Inherently Safe Module (PRISM) Sodium Fast Reactor (SFR). The work had four main tasks: internal events development, modeling the risk from the reactor for hazards occurring at-power internal to the plant; an all-hazards scoping review to analyze the risk at a high level from external hazards such as earthquakes and high winds; an all-modes scoping review to understand the risk at a high level from operating modes other than at-power; and risk insights to integrate the results from each of the three phases above. To achieve these objectives, GEH and Argonne used and adapted proven PRA methodologies and techniques to build a modern non-LWR all hazards/all modes PRA. The teams also advanced non-LWR PRA methodologies, which is an important outcome from this work. This report summarizes the project outcomes in two major phases. The first phase presents the methodologies developed for non-LWR PRAs. The methodologies are grouped by scope, from Internal Events At-Power (IEAP) to hazards analysis to modes analysis. The second phase presents details of the PRISM PRA model which was developed as a validation of the non-LWR methodologies. The PRISM PRA was performed in detail for IEAP, and at a broader level for hazards and modes. In addition to contributing methodologies, this project developed risk insights applicable to non-LWR PRA, including focus areas for future R&D, and conclusions about the PRISM design.
Meaning and Problems of Planning
ERIC Educational Resources Information Center
Brieve, Fred J.; Johnston, A. P.
1973-01-01
Examines the educational planning process. Discusses what planning is, how methodological planning can work in education, misunderstandings about planning, and difficulties in applying the planning methodology. (DN)
Moisture adsorption in optical coatings
NASA Technical Reports Server (NTRS)
Macleod, H. Angus
1988-01-01
The thin film filter is a very large aperture component which is exceedingly useful because of its small size, flexibility and ease of mounting. Thin film components, however, do have defects of performance and especially of stability which can cause problems in systems, particularly where long-term measurements are being made. Of all of the problems, those associated with moisture adsorption are the most serious. Moisture adsorption occurs in the pore-shaped voids inherent in the columnar structure of the layers. Ion-assisted deposition is a promising technique for substantially reducing moisture adsorption effects in thin film structures.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Dasso, C.H.; Gallardo, M.
2006-01-15
The conclusions extracted from a recent study of the excitation of giant dipole resonances in nuclei at relativistic bombarding energies open the way for a further simplification of the problem. It consists in the elimination of the relativistic scalar and vector electromagnetic potentials and the familiar numerical difficulties associated with their presence in the calculation scheme. The inherent advantage of a reformulation of the problem of relativistic Coulomb excitation of giant dipole resonances along these lines is discussed.
Content-addressable read/write memories for image analysis
NASA Technical Reports Server (NTRS)
Snyder, W. E.; Savage, C. D.
1982-01-01
The commonly encountered image analysis problems of region labeling and clustering are found to be cases of the search-and-rename problem, which can be solved in parallel by a system architecture that is inherently suitable for VLSI implementation. This architecture is a novel form of content-addressable memory (CAM) which provides parallel search and update functions, reducing the time required to as little as constant time per operation. It has been proposed in related investigations by Hall (1981) that, with VLSI, CAM-based structures with enhanced instruction sets for general purpose processing will be feasible.
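A serial analogue of search-and-rename is classic union-find component labeling: merging equivalent labels is exactly the "rename" step the CAM performs in parallel. The sketch below labels a small binary image; 4-connectivity and the tiny test image are illustrative choices.

```python
# Region labeling as search-and-rename, done serially with union-find.
import numpy as np

def label_regions(img):
    """4-connected component labeling of a binary image via union-find."""
    parent = {}
    def find(a):
        while parent[a] != a:
            parent[a] = parent[parent[a]]   # path halving
            a = parent[a]
        return a
    def union(a, b):
        parent[find(a)] = find(b)

    lab = np.zeros(img.shape, dtype=int)
    nxt = 1
    for i, j in np.argwhere(img):
        up, left = (i - 1, j), (i, j - 1)
        neigh = [lab[p] for p in (up, left)
                 if p[0] >= 0 and p[1] >= 0 and lab[p] > 0]
        if not neigh:                        # new provisional label
            lab[i, j] = nxt; parent[nxt] = nxt; nxt += 1
        else:                                # adopt a neighbor label, merge rest
            lab[i, j] = min(neigh)
            for n in neigh:
                union(n, lab[i, j])
    for i, j in np.argwhere(lab):            # the "rename" pass
        lab[i, j] = find(lab[i, j])
    return lab

img = np.array([[1,1,0,0], [0,1,0,1], [0,0,0,1], [1,0,1,1]])
print(label_regions(img))
```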
School-Based Methylphenidate Placebo Protocols: Methodological and Practical Issues.
ERIC Educational Resources Information Center
Hyman, Irwin A.; Wojtowicz, Alexandra; Lee, Kee Duk; Haffner, Mary Elizabeth; Fiorello, Catherine A.; And Others
1998-01-01
Focuses on methodological issues involved in choosing instruments to monitor behavior, once a comprehensive evaluation has suggested trials on Ritalin. Case examples illustrate problems of teacher compliance in filling out measures, supplying adequate placebos, and obtaining physical cooperation. Emerging school-based methodologies are discussed…
The Research of Improving the Particleboard Glue Dosing Process Based on TRIZ Analysis
NASA Astrophysics Data System (ADS)
Yu, Huiling; Fan, Delin; Zhang, Yizhuo
This research creates a design methodology by synthesizing the Theory of Inventive Problem Solving (TRIZ) with Smith-predictor-based cascade control. A case study of the particleboard glue supplying and dosing system defines the problem and the solution using the methodology proposed in the paper. Status differences in the glue dosing process of particleboard production often make the dosed glue volume inaccurate. To solve this problem, we applied TRIZ technical contradictions and inventive principles to improve this key process of particleboard production. The improvement method mapped the inaccuracy problem to a TRIZ technical contradiction, and the resulting "prior action" principle led to a Smith predictor as the control algorithm in the glue dosing system. This research examines the usefulness of a TRIZ-based problem-solving process designed to improve the problem-solving ability of users in addressing difficult or recurring problems, and also demonstrates the practicality and validity of TRIZ. Several suggestions are presented on how to approach this problem.
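A discrete-time Smith predictor can be sketched directly: a delay-free internal model supplies the feedback signal so the PI controller does not fight the transport delay. The first-order plant, gains, and 10-step delay below are illustrative stand-ins for the glue dosing loop, not the paper's parameters.

```python
# Discrete-time Smith predictor around a first-order plant with input delay.
import numpy as np

a, b, d = 0.9, 0.1, 10           # plant: y[k+1] = a*y[k] + b*u[k-d], assumed
Kp, Ki = 2.0, 0.15               # PI gains, assumed
N, sp = 200, 1.0                 # steps, setpoint (target glue flow)

y = np.zeros(N); ym = np.zeros(N)     # plant output, delay-free model output
u = np.zeros(N); integ = 0.0
for k in range(N - 1):
    # Smith feedback: measurement corrected by (delay-free model - delayed model)
    ym_delayed = ym[k - d] if k >= d else 0.0
    e = sp - (y[k] + ym[k] - ym_delayed)
    integ += Ki * e
    u[k] = Kp * e + integ
    u_del = u[k - d] if k >= d else 0.0
    y[k + 1] = a * y[k] + b * u_del       # true plant sees delayed input
    ym[k + 1] = a * ym[k] + b * u[k]      # internal model sees it immediately

print(y[::40].round(3))                   # converges to the setpoint
```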
NASA Astrophysics Data System (ADS)
Alemany, Kristina
Electric propulsion has recently become a viable technology for spacecraft, enabling shorter flight times, fewer required planetary gravity assists, larger payloads, and/or smaller launch vehicles. With the maturation of this technology, however, comes a new set of challenges in the area of trajectory design. Because low-thrust trajectory optimization has historically required long run-times and significant user-manipulation, mission design has relied on expert-based knowledge for selecting departure and arrival dates, times of flight, and/or target bodies and gravitational swing-bys. These choices are generally based on known configurations that have worked well in previous analyses or simply on trial and error. At the conceptual design level, however, the ability to explore the full extent of the design space is imperative to locating the best solutions in terms of mass and/or flight times. Beginning in 2005, the Global Trajectory Optimization Competition posed a series of difficult mission design problems, all requiring low-thrust propulsion and visiting one or more asteroids. These problems all had large ranges on the continuous variables---launch date, time of flight, and asteroid stay times (when applicable)---as well as being characterized by millions or even billions of possible asteroid sequences. Even with recent advances in low-thrust trajectory optimization, full enumeration of these problems was not possible within the stringent time limits of the competition. This investigation develops a systematic methodology for determining a broad suite of good solutions to the combinatorial, low-thrust, asteroid tour problem. The target application is for conceptual design, where broad exploration of the design space is critical, with the goal being to rapidly identify a reasonable number of promising solutions for future analysis. The proposed methodology has two steps. The first step applies a three-level heuristic sequence developed from the physics of the problem, which allows for efficient pruning of the design space. The second phase applies a global optimization scheme to locate a broad suite of good solutions to the reduced problem. The global optimization scheme developed combines a novel branch-and-bound algorithm with a genetic algorithm and an industry-standard low-thrust trajectory optimization program to solve for the following design variables: asteroid sequence, launch date, times of flight, and asteroid stay times. The methodology is developed based on a small sample problem, which is enumerated and solved so that all possible discretized solutions are known. The methodology is then validated by applying it to a larger intermediate sample problem, which also has a known solution. Next, the methodology is applied to several larger combinatorial asteroid rendezvous problems, using previously identified good solutions as validation benchmarks. These problems include the 2nd and 3rd Global Trajectory Optimization Competition problems. The methodology is shown to be capable of achieving a reduction in the number of asteroid sequences of 6-7 orders of magnitude, in terms of the number of sequences that require low-thrust optimization as compared to the number of sequences in the original problem. More than 70% of the previously known good solutions are identified, along with several new solutions that were not previously reported by any of the competitors. 
Overall, the methodology developed in this investigation provides an organized search technique for the low-thrust mission design of asteroid rendezvous problems.
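The pruning-plus-search idea can be miniaturized as a depth-first branch-and-bound over sequences, bounding each partial tour with an optimistic cheapest-remaining-leg estimate; the random leg costs below stand in for calls to a low-thrust trajectory optimizer, and the problem size is illustrative.

```python
# Branch-and-bound over asteroid sequences with an optimistic delta-v bound.
import numpy as np

rng = np.random.default_rng(6)
n = 8                                           # hypothetical asteroid count
dv = rng.uniform(1, 5, (n, n))                  # stand-in leg costs (delta-v)
np.fill_diagonal(dv, np.inf)
cheapest_leg = dv[np.isfinite(dv)].min()
best = {"cost": np.inf, "seq": None}

def bnb(seq, cost, remaining):
    # bound: even if every remaining leg were the cheapest one, can we win?
    if cost + remaining * cheapest_leg >= best["cost"]:
        return                                  # prune this whole branch
    if remaining == 0:
        best["cost"], best["seq"] = cost, seq
        return
    for nxt in range(n):
        if nxt not in seq:
            bnb(seq + [nxt], cost + dv[seq[-1], nxt], remaining - 1)

for start in range(n):
    bnb([start], 0.0, 3)                        # tours visiting 3 more asteroids
print(best)
```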
Use of Invariant Manifolds for Transfers Between Three-Body Systems
NASA Technical Reports Server (NTRS)
Beckman, Mark; Howell, Kathleen
2003-01-01
The Lunar L1 and L2 libration points have been proposed as gateways granting inexpensive access to interplanetary space. To date, only individual solutions to the transfer between three-body systems have been found. The methodology to solve the problem for arbitrary three-body systems and entire families of orbits does not exist. This paper presents the initial approaches to solve the general problem for single and multiple impulse transfers. Two different methods of representing and storing 7-dimensional invariant manifold data are presented. Some particular solutions are presented for the transfer problem, though the emphasis is on developing methodology for solving the general problem.
Representations of Invariant Manifolds for Applications in Three-Body Systems
NASA Technical Reports Server (NTRS)
Howell, K.; Beckman, M.; Patterson, C.; Folta, D.
2004-01-01
The Lunar L1 and L2 libration points have been proposed as gateways granting inexpensive access to interplanetary space. To date, only individual solutions to the transfer between three-body systems have been found. The methodology to solve the problem for arbitrary three-body systems and entire families of orbits is currently being studied. This paper presents an initial approach to solve the general problem for single and multiple impulse transfers. Two different methods of representing and storing the invariant manifold data are presented. Some particular solutions are presented for two types of transfer problems, though the emphasis is on developing the methodology for solving the general problem.
Hypothesis Testing as an Act of Rationality
NASA Astrophysics Data System (ADS)
Nearing, Grey
2017-04-01
Statistical hypothesis testing is ad hoc in two ways. First, setting probabilistic rejection criteria is, as Neyman (1957) put it, an act of will rather than an act of rationality. Second, physical theories like conservation laws do not inherently admit probabilistic predictions, and so we must use what are called epistemic bridge principles to connect model predictions with the actual methods of hypothesis testing. In practice, these bridge principles are likelihood functions, error functions, or performance metrics. I propose that the reason we are faced with these problems is because we have historically failed to account for a fundamental component of basic logic - namely the portion of logic that explains how epistemic states evolve in the presence of empirical data. This component of Cox's (1946) logical calculus is called information theory (Knuth, 2005), and adding information theory to our hypothetico-deductive account of science yields straightforward solutions to both of the above problems. This also yields a straightforward method for dealing with Popper's (1963) problem of verisimilitude by facilitating a quantitative approach to measuring process isomorphism. In practice, this involves data assimilation. Finally, information theory allows us to reliably bound measures of epistemic uncertainty, thereby avoiding the problem of Bayesian incoherency under misspecified priors (Grünwald, 2006). I therefore propose solutions to four of the fundamental problems inherent in both hypothetico-deductive and/or Bayesian hypothesis testing. - Neyman (1957) Inductive Behavior as a Basic Concept of Philosophy of Science. - Cox (1946) Probability, Frequency and Reasonable Expectation. - Knuth (2005) Lattice Duality: The Origin of Probability and Entropy. - Grünwald (2006) Bayesian Inconsistency under Misspecification. - Popper (1963) Conjectures and Refutations: The Growth of Scientific Knowledge.
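One way to make the information-theoretic scoring concrete is a discrete divergence between a model's predictive distribution and the empirical data distribution; the sketch below computes a KL divergence over shared bins. The distributions are invented, and KL is only one of several divergences such an account could use.

```python
# KL divergence of an empirical histogram from a model's predictive
# distribution, as a stand-in for an information-theoretic model score.
import numpy as np

def kl_divergence(p, q, eps=1e-12):
    """D_KL(p || q) in nats for discrete distributions over the same bins."""
    p = np.asarray(p, float) + eps     # epsilon guards against log(0)
    q = np.asarray(q, float) + eps
    p, q = p / p.sum(), q / q.sum()
    return float(np.sum(p * np.log(p / q)))

obs_hist   = [0.10, 0.40, 0.35, 0.15]  # empirical frequencies of the data
model_pred = [0.05, 0.45, 0.40, 0.10]  # model's predictive distribution
print(f"D_KL = {kl_divergence(obs_hist, model_pred):.4f} nats")
```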
[Methodological problems in the scientific research on HIV /AIDS in Bolivia].
Hita, Susana Ramírez
2013-05-01
This paper discusses the methodological problems in the scientific research on HIV/AIDS in Bolivia, both in the areas of epidemiology and social sciences. Studies associated with this research served as the basis for the implementation of health programs run by The Global Fund, The Pan-American Health Organization, International Cooperation, Non-Governmental Organizations and the Bolivian Ministry of Health and Sports. An analysis of the methodological contradictions and weaknesses was made by reviewing the bibliography of the studies and by conducting qualitative methodological research, that was focused on the quality of health care available to people living with HIV/AIDS in public hospitals and health centers, and looked at how programs targeted at this sector of the population are designed and delivered. In this manner, it was possible to observe the shortcomings of the methodological design in the epidemiological and social science studies which serve as the basis for the implementation of these health programs.
In Defense of Educators: The Problem of Idea Quality, Not "Teacher Quality"
ERIC Educational Resources Information Center
Hirsch, E. D., Jr.
2017-01-01
People who emphasize teaching quality and the central importance of teachers are right to do so. Where some go wrong is in thinking that teacher quality is an innate characteristic. The effectiveness of a teacher is not some inherent competence, as the phrase "teacher quality" suggests. Teacher effectiveness is contextual. Why has the…
USDA-ARS?s Scientific Manuscript database
One of the most important and least understood properties of carbohydrates is their conformational profile in solution. The study of carbohydrates in solution is a most difficult computational problem, a result of the many soft conformational variables (hydroxyl groups) inherent in the structures of...
"Well, You Are the One Who Decides": Attempting Shared Decision Making at the End of Aphasia Therapy
ERIC Educational Resources Information Center
Isaksen, Jytte
2018-01-01
Clinical borderlands manifest themselves through encounters between people deemed to be in need of health care and health care providers (Mattingly, 2010). This article addresses the problem of inherent asymmetry in the clinical discourse between clinical providers, such as speech-language pathologists (SLPs), and persons with aphasia.…
ERIC Educational Resources Information Center
Velten, Justin; Mokhtari, Kouider
2016-01-01
In this brief report, we share three challenges we encountered when designing and implementing an after school intervention program for an ethnically diverse group of middle grade underachieving readers. We also offer practical solutions to help guide middle school teams in anticipating and addressing potential problems when putting in place…
Teacher Renewal. Professional Issues, Personal Choices.
ERIC Educational Resources Information Center
Bolin, Frances S., Ed.; Falk, Judith McConnell, Ed.
The 16 essays in this book assess the problems inherent in the role of the teacher and offer perspectives on issues affecting life in the classroom. The three chapters in part one, "The Teacher and Teaching as a Vocation," are addressed to improving the teacher by attending to the inner needs of the individual. Teachers are seen as persons in…
CLON: Overlay Networks and Gossip Protocols for Cloud Environments
NASA Astrophysics Data System (ADS)
Matos, Miguel; Sousa, António; Pereira, José; Oliveira, Rui; Deliot, Eric; Murray, Paul
Although epidemic or gossip-based multicast is a robust and scalable approach to reliable data dissemination, its inherent redundancy results in high resource consumption on both links and nodes. This problem is aggravated in settings that have costlier or resource-constrained links, as happens in Cloud Computing infrastructures composed of several interconnected data centers across the globe.
The Politics of Storytelling: Unfolding the Multiple Layers of Politics in (P)AR Publications
ERIC Educational Resources Information Center
Santos, Doris
2012-01-01
In the social sciences, inquiry into the relationship between storytelling and politics is based on a notion of historical continuity. One problem is the possible trap of inevitability inherent in this notion--that something which happened "had to happen". Hannah Arendt's conception of political theory as storytelling overcomes this trap,…
Modeling, Simulation, and Characterization of Distributed Multi-Agent Systems
2012-01-01
capabilities (vision, LIDAR, differential global positioning, ultrasonic proximity sensing, etc.), the agents comprising a MAS tend to have somewhat lesser...on the simultaneous localization and mapping (SLAM) problem [19]. SLAM acknowledges that externally-provided localization information is not...continually-updated mapping databases, generates a comprehensive representation of the spatial and spectral environment. Many times though, inherent SLAM
ERIC Educational Resources Information Center
Martin, Nina
2017-01-01
Many dance artists in their first encounters with improvisational dance making begin not only to learn how to compose spontaneously, but also to gain skills for coping with the uncertainties inherent in the form. This article suggests helpful dance scores for beginning students of physical improvisation and those who teach improvisational…
Transition Processes from College to Career.
ERIC Educational Resources Information Center
Hettich, Paul
The transition from college to career is one of the most challenging jobs an individual will experience. This is particularly true for students who have limited work experience. The fact that 50-80% of new college graduates leave their first job within three years may be due to poor career planning and problems inherent in the college-to-work…