Roca, Judith; Reguant, Mercedes; Canet, Olga
2016-11-01
Teaching strategies are essential to facilitate meaningful learning and the development of high-level thinking skills in students. The aim was to compare three teaching methodologies (problem-based learning, case-based teaching and traditional methods) in terms of the learning outcomes achieved by nursing students. This quasi-experimental study was carried out in the Nursing Degree programme with a group of 74 students who explored the subject of The Oncology Patient through the aforementioned strategies. A performance test based on Bloom's Revised Taxonomy was applied. A significant correlation was found between the intragroup theoretical and theoretical-practical dimensions. Likewise, intergroup differences were related to each teaching methodology: significant differences were found between the traditional methodology (x̄ = 9.13), case-based teaching (x̄ = 12.96) and problem-based learning (x̄ = 14.84). Problem-based learning was shown to be the most successful learning method, followed by case-based teaching and the traditional methodology. Copyright © 2016 Elsevier Ltd. All rights reserved.
A methodology to enhance electromagnetic compatibility in joint military operations
NASA Astrophysics Data System (ADS)
Buckellew, William R.
The development and validation of an improved methodology to identify, characterize, and prioritize potential joint EMI (electromagnetic interference) interactions and identify and develop solutions to reduce the effects of the interference are discussed. The methodology identifies potential EMI problems using results from field operations, historical databases, and analytical modeling. Operational expertise, engineering analysis, and testing are used to characterize and prioritize the potential EMI problems. Results can be used to resolve potential EMI during the development and acquisition of new systems and to develop engineering fixes and operational workarounds for systems already employed. The analytic modeling portion of the methodology is a predictive process that uses progressive refinement of the analysis and the operational electronic environment to eliminate noninterfering equipment pairs, defer further analysis on pairs lacking operational significance, and resolve the remaining EMI problems. Tests are conducted on equipment pairs to ensure that the analytical models provide a realistic description of the predicted interference.
Montecinos, P; Rodewald, A M
1994-06-01
The aim of this work was to assess and compare the achievements of medical students subjected to problem-based learning methodology. The information and comprehension categories of Bloom were tested in 17 medical students on four different occasions during the physiopathology course, using a multiple choice knowledge test. There was a significant improvement in the number of correct answers towards the end of the course. It is concluded that these medical students obtained adequate learning achievements in the information subcategory of Bloom using problem-based learning methodology during the physiopathology course.
Methodological Issues Related to the Use of P Less than 0.05 in Health Behavior Research
ERIC Educational Resources Information Center
Duryea, Elias; Graner, Stephen P.; Becker, Jeremy
2009-01-01
This paper reviews methodological issues related to the use of P less than 0.05 in health behavior research and suggests how application and presentation of statistical significance may be improved. Assessment of sample size and P less than 0.05, the file drawer problem, the Law of Large Numbers and the statistical significance arguments in…
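The paper's point about P < 0.05 and the Law of Large Numbers can be illustrated numerically: with a large enough sample, an arbitrarily small effect becomes "statistically significant". The sketch below is illustrative only (a one-sample z test with an invented effect size and invented sample sizes, not data from the paper):

```python
import math

def two_sided_p_from_z(z):
    """Two-sided p-value for a standard normal test statistic."""
    return math.erfc(abs(z) / math.sqrt(2.0))

def p_for_mean_difference(effect_size, n):
    """p-value for a one-sample z test of a standardized mean difference
    (Cohen's d) observed with sample size n."""
    return two_sided_p_from_z(effect_size * math.sqrt(n))

# The same tiny effect (d = 0.05) fails P < 0.05 at n = 100 but crosses it
# at n = 2000 -- significance tracks sample size, not practical importance.
p_small = p_for_mean_difference(0.05, 100)
p_large = p_for_mean_difference(0.05, 2000)
print(round(p_small, 3), round(p_large, 3))
```

This is one concrete reason the paper argues that reporting P < 0.05 alone, without effect sizes or sample-size context, can mislead.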
2017-03-06
...problems in the field of electromagnetic propagation and scattering, with applicability to the design of antenna and radar systems and to energy absorption and scattering by rough surfaces. This work has led to significant new methodologies, including the introduction of a certain Windowed Green Function...
Approach to Teaching Research Methodology for Information Technology
ERIC Educational Resources Information Center
Steenkamp, Annette Lerine; McCord, Samual Alan
2007-01-01
The paper reports on an approach to teaching a course in information technology research methodology in a doctoral program, the Doctor of Management in Information Technology (DMIT), in which research, with focus on finding innovative solutions to problems found in practice, comprises a significant part of the degree. The approach makes a…
Mining EEG with SVM for Understanding Cognitive Underpinnings of Math Problem Solving Strategies
Bosch, Paul; Herrera, Mauricio; López, Julio; Maldonado, Sebastián
2018-01-01
We have developed a new methodology for examining and extracting patterns from brain electric activity by using data mining and machine learning techniques. Data were collected from experiments focused on the study of cognitive processes that might evoke different specific strategies in the resolution of math problems. A binary classification problem was constructed using correlations and phase synchronization between different electroencephalographic channels as characteristics and, as labels or classes, the math performances of individuals participating in specially designed experiments. The proposed methodology is based on using well-established procedures of feature selection, which were used to determine a suitable brain functional network size related to math problem solving strategies and also to discover the most relevant links in this network without including noisy connections or excluding significant connections. PMID:29670667
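As a rough illustration of the filter-style feature selection the abstract describes, the sketch below ranks synthetic "inter-channel" features by their absolute correlation with a binary performance label and keeps the strongest few. All numbers are invented stand-ins; the original work pairs such selection with an SVM classifier on real EEG synchronization measures:

```python
import numpy as np

rng = np.random.default_rng(0)

# Synthetic stand-in for inter-channel synchronization features:
# 40 "links", of which only the first 5 actually differ between the two
# performance groups (the rest are noise).
n_per_class, n_links, n_informative = 60, 40, 5
X0 = rng.normal(0.0, 1.0, (n_per_class, n_links))
X1 = rng.normal(0.0, 1.0, (n_per_class, n_links))
X1[:, :n_informative] += 1.2          # informative links shift for class 1
X = np.vstack([X0, X1])
y = np.array([0] * n_per_class + [1] * n_per_class)

# Filter-style feature selection: rank links by absolute correlation with
# the class label and keep the strongest k as the "functional network".
corr = np.array([abs(np.corrcoef(X[:, j], y)[0, 1]) for j in range(n_links)])
k = 5
selected = np.argsort(corr)[::-1][:k]
print(sorted(int(j) for j in selected))
```

With this construction the selected network should recover (most of) the informative links while excluding the noisy connections, mirroring the goal stated in the abstract.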
ERIC Educational Resources Information Center
Bachore, Zelalem
2012-01-01
Ontology not only is considered to be the backbone of the semantic web but also plays a significant role in distributed and heterogeneous information systems. However, ontology still faces limited application and adoption to date. One of the major problems is that prevailing engineering-oriented methodologies for building ontologies do not…
Common Methodological Problems in Research on the Addictions.
ERIC Educational Resources Information Center
Nathan, Peter E.; Lansky, David
1978-01-01
Identifies common problems in research on the addictions and offers suggestions for remediating these methodological problems. The addictions considered include alcoholism and drug dependencies. Problems considered are those arising from inadequate, incomplete, or biased reviews of relevant literatures and methodological shortcomings of subject…
Choi, Keum-Hyeong; Buskey, Wendy; Johnson, Bonita
2010-07-01
The main purpose of this study was to investigate how receiving personal counseling at a university counseling center helps students deal with their personal problems and facilitates academic functioning. To that end, this study used both clinical and academic outcome measures that are relevant to the practice of counseling provided at a counseling center and its unique function in an institution of higher education. In addition, this study used the clinical significance methodology (N. S. Jacobson & P. Truax, 1991), which takes into account clients' differences in making clinically reliable and significant change. Pre-intake and post-termination surveys, including the Outcome Questionnaire (M. J. Lambert, K. Lunnen, V. Umphress, N. Hansen, & G. Burlingame, 1994), were completed by 78 clients, and the responses were analyzed using clinical significance methodology. The results revealed that those who made clinically reliable and significant change (i.e., the recovered group) reported the highest level of improvement in academic commitment to their educational goals and problem resolution, compared with those who did not make clinically significant change. The implications of the findings for counseling practice at university counseling centers and for administrators in higher education institutions are discussed. (c) 2010 APA, all rights reserved.
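The clinical significance methodology cited above (Jacobson & Truax, 1991) rests on the reliable change index: the pre-post difference divided by the standard error of the difference, with |RCI| > 1.96 marking reliable change. A minimal sketch, with illustrative values (not the actual Outcome Questionnaire norms) for the standard deviation, reliability, and clinical cutoff:

```python
import math

def reliable_change_index(pre, post, sd_pre, reliability):
    """Jacobson & Truax (1991) RCI: change divided by the standard error
    of the difference score; |RCI| > 1.96 indicates reliable change."""
    se_measurement = sd_pre * math.sqrt(1.0 - reliability)
    s_diff = math.sqrt(2.0) * se_measurement
    return (post - pre) / s_diff

def classify(pre, post, sd_pre, reliability, cutoff):
    """'recovered' = reliable improvement AND ending below the clinical
    cutoff (higher scores = more distress, as on distress inventories)."""
    rci = reliable_change_index(pre, post, sd_pre, reliability)
    if rci < -1.96 and post < cutoff:
        return "recovered"
    if rci < -1.96:
        return "improved"
    if rci > 1.96:
        return "deteriorated"
    return "unchanged"

# Illustrative (not OQ-normed) values: sd = 20, reliability = 0.84, cutoff = 63.
print(classify(pre=85, post=50, sd_pre=20, reliability=0.84, cutoff=63))
print(classify(pre=85, post=80, sd_pre=20, reliability=0.84, cutoff=63))
```

The "recovered" category in the study corresponds to the first case: a change large enough to exceed measurement noise that also crosses the clinical cutoff.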
2001-01-01
...might incorporate airbags, under the used vehicle provision. NHTSA has not developed such standards because it has not identified significant problems with occupant restraint systems... Appendix I: Scope and Methodology; Appendix II: State Legislation Governing Aftermarket Crash Parts and Recycled Airbags; Figures: Figure 1: Replacement...
Methodological problems with gamma-ray burst hardness/intensity correlations
NASA Technical Reports Server (NTRS)
Schaefer, Bradley E.
1993-01-01
The hardness and intensity are easily measured quantities for all gamma-ray bursts (GRBs), and so, many past and current studies have sought correlations between them. This Letter presents many serious methodological problems with the practical definitions for both hardness and intensity. These difficulties are such that significant correlations can be easily introduced as artifacts of the reduction procedure. In particular, cosmological models of GRBs cannot be tested with hardness/intensity correlations with current instrumentation and the time evolution of the hardness in a given burst may be correlated with intensity for reasons that are unrelated to intrinsic change in the spectral shape.
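A small simulation can show how a selection effect alone manufactures a hardness/intensity correlation of the kind the Letter warns about. In this sketch, intrinsic hardness and intensity are drawn independently (zero true correlation by construction), but an invented trigger on high-band counts only (all numbers are arbitrary, not modeled on any real instrument) leaves a clearly anti-correlated detected sample:

```python
import math
import random

random.seed(7)

def pearson(xs, ys):
    """Pearson correlation coefficient of two equal-length sequences."""
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    cov = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
    vx = sum((x - mx) ** 2 for x in xs)
    vy = sum((y - my) ** 2 for y in ys)
    return cov / math.sqrt(vx * vy)

# Intrinsic hardness and intensity are independent, so the true
# hardness/intensity correlation is zero by construction.
bursts = []
for _ in range(20000):
    intensity = 10 ** random.uniform(0.5, 2.5)  # total counts, wide range
    hardness = random.uniform(0.2, 0.8)         # fraction of counts in high band
    high_band = intensity * hardness
    # The "instrument" triggers only on the high-energy band: weak bursts
    # are detected only when they happen to be hard -- pure selection.
    if high_band > 25:
        bursts.append((hardness, intensity))

r = pearson([h for h, i in bursts], [i for h, i in bursts])
print(round(r, 3))
```

The detected sample shows a significant negative correlation that is entirely an artifact of the trigger, echoing the Letter's caution that reduction and detection procedures can introduce spurious hardness/intensity correlations.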
ERIC Educational Resources Information Center
Tuohilampi, Laura
2016-01-01
Mathematics-related affect turns from positive to negative during the comprehensive school years worldwide. There is a clear need to find solutions to the problem. However, some gaps and problems appear in the methodologies and the common approaches used in the field. This article discusses five studies addressing affective development, challenges some…
An Existential Perspective on Death Anxiety, Retirement, and Related Research Problems.
Osborne, John W
2017-06-01
Aspects of existentialism relevant to existence and death anxiety (DA) are discussed. Included are the "thrownness" of existence, being-with-others, the motivational influence of inevitable death, the search for meaning, making the most of existence by taking responsibility for one's own life, and coping with existential isolation. The attempted separation of DA from object anxiety is a significant difficulty. The correlations among age, gender, and DA are variable. Personality and role-oriented problems in the transition to retirement are discussed along with Erikson's notion of "generativity" as an expression of the energy and purpose of mid-life. Furthermore, methodological and linguistic problems in DA research are considered. The article suggests qualitative methodologies as an interpersonal means of exploring DA within the contexts of psychotherapy and counselling.
Robust Feedback Control of Flow Induced Structural Radiation of Sound
NASA Technical Reports Server (NTRS)
Heatwole, Craig M.; Bernhard, Robert J.; Franchek, Matthew A.
1997-01-01
A significant component of the interior noise of aircraft and automobiles is a result of turbulent boundary layer excitation of the vehicular structure. In this work, active robust feedback control of the noise due to this non-predictable excitation is investigated. Both an analytical model and experimental investigations are used to determine the characteristics of the flow-induced structural sound radiation problem. The problem is shown to be broadband in nature with large system uncertainties associated with the various operating conditions. Furthermore, the delay associated with sound propagation is shown to restrict the use of microphone feedback. The state-of-the-art control methodologies, μ-synthesis and adaptive feedback control, are evaluated and shown to have limited success for solving this problem. A robust frequency domain controller design methodology is developed for the problem of sound radiated from turbulent flow driven plates. The control design methodology uses frequency domain sequential loop shaping techniques. System uncertainty, sound pressure level reduction performance, and actuator constraints are included in the design process. Using this design method, phase lag was added using non-minimum phase zeros such that the beneficial plant dynamics could be used. This general control approach has application to lightly damped vibration and sound radiation problems where there are high bandwidth control objectives requiring a low controller DC gain and controller order.
NASA Technical Reports Server (NTRS)
Paul, Arthur S.; Gill, Tepper L.; Maclin, Arlene P.
1989-01-01
A study of NASA's Systems Management Policy (SMP) concluded that the primary methodology being used by the Mission Operations and Data Systems Directorate and its subordinate, the Networks Division, is very effective. Still some unmet needs were identified. This study involved evaluating methodologies, tools, and techniques with the potential for resolving the previously identified deficiencies. Six preselected methodologies being used by other organizations with similar development problems were studied. The study revealed a wide range of significant differences in structure. Each system had some strengths but none will satisfy all of the needs of the Networks Division. Areas for improvement of the methodology being used by the Networks Division are listed with recommendations for specific action.
Pedagogy and/or technology: Making difference in improving students' problem solving skills
NASA Astrophysics Data System (ADS)
Hrepic, Zdeslav; Lodder, Katherine; Shaw, Kimberly A.
2013-01-01
Pen input computers combined with interactive software may have substantial potential for promoting active instructional methodologies and for facilitating students' problem solving ability. An excellent example is a study in which introductory physics students improved retention, conceptual understanding and problem solving abilities when one of three weekly lectures was replaced with group problem solving sessions facilitated with Tablet PCs and DyKnow software [1,2]. The research goal of the present study was to isolate the effect of the methodology itself (using additional time to teach problem solving) from that of the involved technology. In Fall 2011 we compared the performance of students taking the same introductory physics lecture course while enrolled in two separate problem-solving sections. One section used pen-based computing to facilitate group problem solving while the other section used low-tech methods for one third of the semester (covering Kinematics), and then traded technologies for the middle third of the term (covering Dynamics). Analysis of quiz, exam and standardized pre-post test results indicated no significant difference in scores of the two groups. Combining this result with those of previous studies implies primacy of pedagogy (collaborative problem solving itself) over technology for student learning in problem solving recitations.
NASA Technical Reports Server (NTRS)
Sankararaman, Shankar
2016-01-01
This paper presents a computational framework for uncertainty characterization and propagation, and sensitivity analysis under the presence of aleatory and epistemic uncertainty, and develops a rigorous methodology for efficient refinement of epistemic uncertainty by identifying important epistemic variables that significantly affect the overall performance of an engineering system. The proposed methodology is illustrated using the NASA Langley Uncertainty Quantification Challenge (NASA-LUQC) problem that deals with uncertainty analysis of a generic transport model (GTM). First, Bayesian inference is used to infer subsystem-level epistemic quantities using the subsystem-level model and corresponding data. Second, tools of variance-based global sensitivity analysis are used to identify four important epistemic variables (this limitation specified in the NASA-LUQC is reflective of practical engineering situations where not all epistemic variables can be refined due to time/budget constraints) that significantly affect system-level performance. The most significant contribution of this paper is the development of the sequential refinement methodology, where epistemic variables for refinement are not identified all at once. Instead, only one variable is first identified, and then Bayesian inference and global sensitivity calculations are repeated to identify the next important variable. This procedure is continued until all four variables are identified and the refinement in the system-level performance is computed. The advantages of the proposed sequential refinement methodology over the all-at-once uncertainty refinement approach are explained, and the methodology is then applied to the NASA Langley Uncertainty Quantification Challenge problem.
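The variance-based global sensitivity step can be sketched with a brute-force first-order Sobol index, Var(E[f | x_i]) / Var(f), on a toy model whose coefficients (invented here, not from the NASA-LUQC problem) make the correct variable ranking known in advance:

```python
import random

random.seed(0)

def model(x):
    """Hypothetical system response; the coefficients encode how strongly
    each of the four variables drives the output variance."""
    return 4.0 * x[0] + 2.0 * x[1] + 1.0 * x[2] + 0.1 * x[3]

def first_order_index(i, n_outer=200, n_inner=200):
    """Brute-force first-order Sobol index Var_{x_i}(E[f | x_i]) / Var(f),
    estimated by a double Monte Carlo loop with uniform [0,1) inputs."""
    cond_means, all_vals = [], []
    for _ in range(n_outer):
        xi = random.random()
        vals = []
        for _ in range(n_inner):
            x = [random.random() for _ in range(4)]
            x[i] = xi                      # freeze variable i in the inner loop
            vals.append(model(x))
        cond_means.append(sum(vals) / n_inner)
        all_vals.extend(vals)
    mean_all = sum(all_vals) / len(all_vals)
    var_all = sum((v - mean_all) ** 2 for v in all_vals) / len(all_vals)
    mean_cm = sum(cond_means) / len(cond_means)
    var_cm = sum((m - mean_cm) ** 2 for m in cond_means) / len(cond_means)
    return var_cm / var_all

indices = [first_order_index(i) for i in range(4)]
ranking = sorted(range(4), key=lambda i: indices[i], reverse=True)
print(ranking)
```

In the sequential refinement methodology, one would refine only the top-ranked variable, then repeat the inference and sensitivity calculation before choosing the next.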
Problem solving using soft systems methodology.
Land, L
This article outlines a method of problem solving which considers holistic solutions to complex problems. Soft systems methodology allows people involved in the problem situation to have control over the decision-making process.
Interpretation methodology and analysis of in-flight lightning data
NASA Technical Reports Server (NTRS)
Rudolph, T.; Perala, R. A.
1982-01-01
A methodology is presented whereby electromagnetic measurements of inflight lightning stroke data can be understood and extended to other aircraft. Recent measurements made on the NASA F106B aircraft indicate that sophisticated numerical techniques and new developments in corona modeling are required to fully understand the data. Thus the problem is nontrivial and successful interpretation can lead to a significant understanding of the lightning/aircraft interaction event. This is of particular importance because of the problem of lightning induced transient upset of new technology low level microcircuitry which is being used in increasing quantities in modern and future avionics. Inflight lightning data is analyzed and lightning environments incident upon the F106B are determined.
Assessing the Need for Semi-Dependent Housing for the Elderly
ERIC Educational Resources Information Center
Newcomer, Robert J.; And Others
1976-01-01
The need for quantitative information on the depth of the semi-dependent housing problems of the elderly is significant. This paper reports the findings and methodology of a 21-state market feasibility analysis. (Author)
Analysis and Reduction of Complex Networks Under Uncertainty.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Ghanem, Roger G
2014-07-31
This effort was a collaboration with Youssef Marzouk of MIT, Omar Knio of Duke University (at the time at Johns Hopkins University) and Habib Najm of Sandia National Laboratories. The objective of this effort was to develop the mathematical and algorithmic capacity to analyze complex networks under uncertainty. Of interest were chemical reaction networks and smart grid networks. The statements of work for USC focused on the development of stochastic reduced models for uncertain networks. The USC team was led by Professor Roger Ghanem and consisted of one graduate student and a postdoc. The contributions completed by the USC team consisted of 1) methodology and algorithms to address the eigenvalue problem, a problem of significance in the stability of networks under stochastic perturbations, 2) methodology and algorithms to characterize probability measures on graph structures with random flows. This is an important problem in characterizing random demand (encountered in smart grid) and random degradation (encountered in infrastructure systems), as well as modeling errors in Markov Chains (with ubiquitous relevance!). 3) methodology and algorithms for treating inequalities in uncertain systems. This is an important problem in the context of models for material failure and network flows under uncertainty where conditions of failure or flow are described in the form of inequalities between the state variables.
PROBLEMS AND METHODOLOGY OF THE PETROLOGIC ANALYSIS OF COAL FACIES.
Chao, Edward C.T.
1983-01-01
This condensed synthesis gives a broad outline of the methodology of coal facies analysis, procedures for constructing sedimentation and geochemical formation curves, and micro- and macrostratigraphic analysis. The hypothetical coal bed profile has a 3-fold cycle of material characteristics. Based on studies of other similar profiles of the same coal bed, and on field studies of the sedimentary rock types and their facies interpretation, one can assume that the 3-fold subdivision is of regional significance.
An integer programming approach to a real-world recyclable waste collection problem in Argentina.
Braier, Gustavo; Durán, Guillermo; Marenco, Javier; Wesner, Francisco
2017-05-01
This article reports on the use of mathematical programming techniques to optimise the routes of a recyclable waste collection system servicing Morón, a large municipality outside Buenos Aires, Argentina. The truck routing problem posed by the system is a particular case of the generalised directed open rural postman problem. An integer programming model is developed with a solving procedure built around a subtour-merging algorithm and the addition of subtour elimination constraints. The route solutions generated by the proposed methodology perform significantly better than the previously used, manually designed routes, the main improvement being that coverage of blocks within the municipality with the model solutions is 100% by construction, whereas with the manual routes as much as 16% of the blocks went unserviced. The model-generated routes were adopted by the municipality in 2014 and the national government is planning to introduce the methodology elsewhere in the country.
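A core ingredient of the solving procedure described above is detecting subtours in a candidate solution so that elimination constraints can be added. A minimal sketch of that detection step only (union-find over the arcs of an invented 6-block "solution"; the actual integer programming model and solver are not shown):

```python
def find_subtours(arcs):
    """Group the nodes touched by a set of directed arcs into weakly
    connected components; each component that does not span all nodes
    would yield one subtour-elimination constraint in the cutting loop."""
    parent = {}

    def find(u):
        parent.setdefault(u, u)
        while parent[u] != u:
            parent[u] = parent[parent[u]]  # path halving
            u = parent[u]
        return u

    def union(u, v):
        parent[find(u)] = find(v)

    for u, v in arcs:
        union(u, v)
    comps = {}
    for u in parent:
        comps.setdefault(find(u), set()).add(u)
    return sorted(comps.values(), key=min)

# A relaxed "solution" visiting 6 blocks with two disjoint loops: both
# components would trigger subtour-elimination cuts before re-solving.
arcs = [(0, 1), (1, 2), (2, 0), (3, 4), (4, 5), (5, 3)]
print(find_subtours(arcs))
```

In the article's procedure, such components are either merged by the subtour-merging algorithm or cut off by added constraints until one route covers all required blocks.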
Evaluation of the HARDMAN comparability methodology for manpower, personnel and training
NASA Technical Reports Server (NTRS)
Zimmerman, W.; Butler, R.; Gray, V.; Rosenberg, L.
1984-01-01
The methodology evaluation and recommendations are part of an effort to improve the Hardware versus Manpower (HARDMAN) methodology for projecting manpower, personnel, and training (MPT) to support new acquisitions. Several different validity tests are employed to evaluate the methodology. The methodology conforms fairly well with both the MPT user needs and other accepted manpower modeling techniques. Audits of three completed HARDMAN applications reveal only a small number of potential problem areas compared to the total number of issues investigated. The reliability study results conform well with the problem areas uncovered through the audits. The results of the accuracy studies suggest that the manpower life-cycle cost component is only marginally sensitive to changes in other related cost variables. Even with some minor problems, the methodology seems sound and has good near-term utility to the Army. Recommendations are provided to firm up the problem areas revealed through the evaluation.
Waugh, Sheldon
2015-02-05
The use of detailed methodologies and legitimate justifications of settings in spatial analysis is imperative to locating areas of significance. Studies that omit these steps may enact interventions in improper areas.
NASA Astrophysics Data System (ADS)
Alemany, Kristina
Electric propulsion has recently become a viable technology for spacecraft, enabling shorter flight times, fewer required planetary gravity assists, larger payloads, and/or smaller launch vehicles. With the maturation of this technology, however, comes a new set of challenges in the area of trajectory design. Because low-thrust trajectory optimization has historically required long run-times and significant user-manipulation, mission design has relied on expert-based knowledge for selecting departure and arrival dates, times of flight, and/or target bodies and gravitational swing-bys. These choices are generally based on known configurations that have worked well in previous analyses or simply on trial and error. At the conceptual design level, however, the ability to explore the full extent of the design space is imperative to locating the best solutions in terms of mass and/or flight times. Beginning in 2005, the Global Trajectory Optimization Competition posed a series of difficult mission design problems, all requiring low-thrust propulsion and visiting one or more asteroids. These problems all had large ranges on the continuous variables---launch date, time of flight, and asteroid stay times (when applicable)---as well as being characterized by millions or even billions of possible asteroid sequences. Even with recent advances in low-thrust trajectory optimization, full enumeration of these problems was not possible within the stringent time limits of the competition. This investigation develops a systematic methodology for determining a broad suite of good solutions to the combinatorial, low-thrust, asteroid tour problem. The target application is for conceptual design, where broad exploration of the design space is critical, with the goal being to rapidly identify a reasonable number of promising solutions for future analysis. The proposed methodology has two steps. 
The first step applies a three-level heuristic sequence developed from the physics of the problem, which allows for efficient pruning of the design space. The second phase applies a global optimization scheme to locate a broad suite of good solutions to the reduced problem. The global optimization scheme developed combines a novel branch-and-bound algorithm with a genetic algorithm and an industry-standard low-thrust trajectory optimization program to solve for the following design variables: asteroid sequence, launch date, times of flight, and asteroid stay times. The methodology is developed based on a small sample problem, which is enumerated and solved so that all possible discretized solutions are known. The methodology is then validated by applying it to a larger intermediate sample problem, which also has a known solution. Next, the methodology is applied to several larger combinatorial asteroid rendezvous problems, using previously identified good solutions as validation benchmarks. These problems include the 2nd and 3rd Global Trajectory Optimization Competition problems. The methodology is shown to be capable of achieving a reduction in the number of asteroid sequences of 6-7 orders of magnitude, in terms of the number of sequences that require low-thrust optimization as compared to the number of sequences in the original problem. More than 70% of the previously known good solutions are identified, along with several new solutions that were not previously reported by any of the competitors. Overall, the methodology developed in this investigation provides an organized search technique for the low-thrust mission design of asteroid rendezvous problems.
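The genetic-algorithm stage of the methodology can be caricatured on a problem small enough to enumerate, mirroring the way the investigation validates against a fully enumerated sample problem. The sketch below uses a stripped-down (1+1) evolutionary search with swap mutation over a random symmetric "delta-v" matrix (all values invented), and compares the heuristic's sequence against the enumerated optimum:

```python
import itertools
import random

random.seed(3)

# Toy "delta-v" costs between 6 hypothetical asteroids (symmetric, arbitrary).
n = 6
cost = [[0.0] * n for _ in range(n)]
for i in range(n):
    for j in range(i + 1, n):
        cost[i][j] = cost[j][i] = random.uniform(1.0, 10.0)

def tour_cost(seq):
    """Total cost of visiting the asteroids in the given order (open path)."""
    return sum(cost[a][b] for a, b in zip(seq, seq[1:]))

# (1+1)-style evolutionary search with swap mutation: a stripped-down
# stand-in for the genetic-algorithm stage of the methodology.
best = list(range(n))
random.shuffle(best)
for _ in range(5000):
    cand = best[:]
    i, j = random.sample(range(n), 2)
    cand[i], cand[j] = cand[j], cand[i]
    if tour_cost(cand) < tour_cost(best):
        best = cand

# The sample problem is small enough to enumerate all 720 sequences, so
# the heuristic's answer can be benchmarked against the true optimum.
optimum = min(itertools.permutations(range(n)), key=tour_cost)
print(round(tour_cost(best), 3), round(tour_cost(list(optimum)), 3))
```

On the real problem, enumeration is impossible; the methodology's heuristic pruning and branch-and-bound stages exist precisely to shrink the sequence space to something a search like this can cover.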
Applications of decision analysis and related techniques to industrial engineering problems at KSC
NASA Technical Reports Server (NTRS)
Evans, Gerald W.
1995-01-01
This report provides: (1) a discussion of the origination of decision analysis problems (well-structured problems) from ill-structured problems; (2) a review of the various methodologies and software packages for decision analysis and related problem areas; (3) a discussion of how the characteristics of a decision analysis problem affect the choice of modeling methodologies, thus providing a guide as to when to choose a particular methodology; and (4) examples of applications of decision analysis to particular problems encountered by the IE Group at KSC. With respect to the specific applications at KSC, particular emphasis is placed on the use of the Demos software package (Lumina Decision Systems, 1993).
Fractional Programming for Communication Systems—Part II: Uplink Scheduling via Matching
NASA Astrophysics Data System (ADS)
Shen, Kaiming; Yu, Wei
2018-05-01
This two-part paper develops novel methodologies for using fractional programming (FP) techniques to design and optimize communication systems. Part I of this paper proposes a new quadratic transform for FP and treats its application for continuous optimization problems. In this Part II of the paper, we study discrete problems, such as those involving user scheduling, which are considerably more difficult to solve. Unlike the continuous problems, discrete or mixed discrete-continuous problems normally cannot be recast as convex problems. In contrast to the common heuristic of relaxing the discrete variables, this work reformulates the original problem in an FP form amenable to distributed combinatorial optimization. The paper illustrates this methodology by tackling the important and challenging problem of uplink coordinated multi-cell user scheduling in wireless cellular systems. Uplink scheduling is more challenging than downlink scheduling, because uplink user scheduling decisions significantly affect the interference pattern in nearby cells. Further, the discrete scheduling variable needs to be optimized jointly with continuous variables such as transmit power levels and beamformers. The main idea of the proposed FP approach is to decouple the interaction among the interfering links, thereby permitting a distributed and joint optimization of the discrete and continuous variables with provable convergence. The paper shows that the well-known weighted minimum mean-square-error (WMMSE) algorithm can also be derived from a particular use of FP; but our proposed FP-based method significantly outperforms WMMSE when discrete user scheduling variables are involved, both in terms of run-time efficiency and optimization results.
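The quadratic transform from Part I replaces a ratio A(x)/B(x) with 2y√A(x) − y²B(x) and alternates closed-form updates of x and the auxiliary variable y. A scalar sketch on an invented ratio with a known maximizer (A(x) = x, B(x) = x² + 1, so x* = 1 and the optimal ratio is 0.5):

```python
import math

def quadratic_transform_max_ratio(n_iter=100):
    """Maximize A(x)/B(x) with A(x)=x, B(x)=x^2+1 over x>0 via the
    quadratic transform: the ratio is replaced by 2*y*sqrt(A) - y^2*B,
    and x and the auxiliary variable y are updated alternately.
    (Known optimum: x* = 1, ratio 0.5.)"""
    x = 4.0
    for _ in range(n_iter):
        y = math.sqrt(x) / (x ** 2 + 1)        # optimal y for fixed x: sqrt(A)/B
        x = (1.0 / (2.0 * y)) ** (2.0 / 3.0)   # maximizer of 2y*sqrt(x) - y^2*(x^2+1)
    return x, x / (x ** 2 + 1)

x_star, ratio = quadratic_transform_max_ratio()
print(round(x_star, 6), round(ratio, 6))
```

In the paper, the same alternating structure is what decouples the interfering links, allowing the discrete scheduling variables and continuous powers/beamformers to be optimized jointly and distributedly.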
Overset Grid Methods Applied to Nonlinear Potential Flows
NASA Technical Reports Server (NTRS)
Holst, Terry; Kwak, Dochan (Technical Monitor)
2000-01-01
The objectives of this viewgraph presentation are to develop a Chimera-based potential methodology which is compatible with OVERFLOW and the OVERFLOW infrastructure, creating options for an advanced problem solving environment, and to significantly reduce turnaround time for aerodynamic analysis and design (primarily cruise conditions).
General Methodology for Designing Spacecraft Trajectories
NASA Technical Reports Server (NTRS)
Condon, Gerald; Ocampo, Cesar; Mathur, Ravishankar; Morcos, Fady; Senent, Juan; Williams, Jacob; Davis, Elizabeth C.
2012-01-01
A methodology for designing spacecraft trajectories in any gravitational environment within the solar system has been developed. The methodology facilitates modeling and optimization for problems ranging from that of a single spacecraft orbiting a single celestial body to that of a mission involving multiple spacecraft and multiple propulsion systems operating in gravitational fields of multiple celestial bodies. The methodology consolidates almost all spacecraft trajectory design and optimization problems into a single conceptual framework requiring solution of either a system of nonlinear equations or a parameter-optimization problem with equality and/or inequality constraints.
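The "system of nonlinear equations" half of the framework reduces, at its core, to Newton-type root finding on targeting conditions. A minimal sketch with an invented two-equation system (not any actual trajectory model), solved with an analytic Jacobian and Cramer's rule:

```python
def newton2(f, jac, x0, y0, tol=1e-12, max_iter=50):
    """Newton's method for a 2-equation nonlinear system f(x, y) = 0 --
    the kind of kernel a trajectory-targeting framework solves repeatedly."""
    x, y = x0, y0
    for _ in range(max_iter):
        f1, f2 = f(x, y)
        a, b, c, d = jac(x, y)          # Jacobian rows: [[a, b], [c, d]]
        det = a * d - b * c
        dx = (-f1 * d + f2 * b) / det   # Cramer's rule for J * delta = -F
        dy = (-a * f2 + c * f1) / det
        x, y = x + dx, y + dy
        if abs(f1) + abs(f2) < tol:
            break
    return x, y

# Toy "targeting" conditions: hit a radius-2 circle while satisfying x*y = 1.
f = lambda x, y: (x * x + y * y - 4.0, x * y - 1.0)
jac = lambda x, y: (2 * x, 2 * y, y, x)
x, y = newton2(f, jac, 2.0, 1.0)
print(round(x * x + y * y, 9), round(x * y, 9))
```

In the consolidated framework, the equations encode mission constraints (arrival states, patch conditions between bodies), and the same solve is embedded inside a constrained parameter optimization when inequality constraints are present.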
Decision-problem state analysis methodology
NASA Technical Reports Server (NTRS)
Dieterly, D. L.
1980-01-01
A methodology for analyzing a decision-problem state is presented. The methodology is based on the analysis of an incident in terms of the set of decision-problem conditions encountered. By decomposing the events that preceded an unwanted outcome, such as an accident, into the set of decision-problem conditions that were resolved, a more comprehensive understanding is possible. Not all human-error accidents are caused by faulty decision-problem resolutions, but such resolutions appear to be one of the major accident areas cited in the literature. A three-phase methodology is presented which accommodates a wide spectrum of events. It allows for a systematic content analysis of the available data to establish: (1) the resolutions made, (2) alternatives not considered, (3) resolutions missed, and (4) possible conditions not considered. The product is a map of the decision-problem conditions that were encountered as well as a projected, assumed set of conditions that should have been considered. The application of this methodology introduces a systematic approach to decomposing the events that transpired prior to the accident. The initial emphasis is on decision and problem resolution. The technique provides a standardized method for decomposing an accident into a scenario which may be used for review or for the development of a training simulation.
NASA Technical Reports Server (NTRS)
Tamma, Kumar K.; Railkar, Sudhir B.
1988-01-01
This paper describes new and recent advances in the development of a hybrid transfinite element computational methodology for conduction/convection/radiation heat transfer problems. The transfinite element methodology, while retaining the modeling versatility of contemporary finite element formulations, is a hybrid approach based on the application of transform techniques in conjunction with classical Galerkin schemes. The purpose of this paper is to provide a viable hybrid computational methodology for general transient thermal analysis. Highlights and features of the methodology are described and developed via generalized formulations and applications to several test problems. The proposed transfinite element methodology provides a viable computational approach, and numerical test problems validate the proposed developments for conduction/convection/radiation thermal analysis.
A NEW APPROACH AND METHODOLOGIES FOR CHARACTERIZING THE HYDROGEOLOGIC PROPERTIES OF AQUIFERS
In the authors' opinion, the ability of hydrologists to perform field measurements of aquifer hydraulic properties must be enhanced if we are to improve significantly our capacity to solve ground water contamination problems at Superfund and other sites. Therefore, the primar...
Software life cycle methodologies and environments
NASA Technical Reports Server (NTRS)
Fridge, Ernest
1991-01-01
Products of this project will significantly improve the quality and productivity of Space Station Freedom Program software processes by improving software reliability and safety and by broadening the range of problems that can be solved with computational solutions. The project brings in Computer Aided Software Engineering (CASE) technology in the form of environments, such as the Engineering Script Language/Parts Composition System (ESL/PCS) application generator; an intelligent user interface for cost avoidance in setting up operational computer runs; the Framework programmable platform for defining process and software development work flow control; a process for bringing CASE technology into an organization's culture; and the CLIPS/CLIPS Ada language for developing expert systems. It also brings in methodologies, such as a method for developing fault-tolerant, distributed systems and a method for developing systems for common-sense reasoning and for solving expert-system problems when only approximate truths are known.
ERIC Educational Resources Information Center
Gauthier, Benoit; And Others
1997-01-01
Identifies the more representative problem-solving models in environmental education. Suggests the addition of a strategy for defining a problem situation using Soft Systems Methodology to environmental education activities explicitly designed for the development of critical thinking. Contains 45 references. (JRH)
How to Select the most Relevant Roughness Parameters of a Surface: Methodology Research Strategy
NASA Astrophysics Data System (ADS)
Bobrovskij, I. N.
2018-01-01
In this paper, the foundations for creating a new methodology are considered, one that resolves the conflict between the huge number of surface-texture parameters introduced by new standards and the need to reduce measurement complexity by using only the parameters that are actually required. At the moment, there is no single assessment of the importance of these parameters. Applying the presented methodology to the surfaces of aerospace-cluster components makes it possible to create the necessary foundation, to develop a scientific estimation of surface-texture parameters, and to obtain material for investigators of the chosen technological procedure. The methods necessary for further work, for the creation of a fundamental reserve, and for developing the assessment of the significance of microgeometry parameters as a scientific direction are selected.
Hitchings, Julia E.; Spoth, Richard L.
2010-01-01
Conduct problems are strong positive predictors of substance use and problem substance use among teens, whereas predictive associations of depressed mood with these outcomes are mixed. Conduct problems and depressed mood often co-occur, and such co-occurrence may heighten risk for negative outcomes. Thus, this study examined the interaction of conduct problems and depressed mood at age 11 in relation to substance use and problem use at age 18, and possible mediation through peer substance use at age 16. Analyses of multirater longitudinal data collected from 429 rural youths (222 girls) and their families were conducted using a methodology for testing latent variable interactions. The link between the conduct problems X depressed mood interaction and adolescent substance use was negative and statistically significant. Unexpectedly, positive associations of conduct problems with substance use were stronger at lower levels of depressed mood. A significant negative interaction in relation to peer substance use also was observed, and the estimated indirect effect of the interaction on adolescent use through peer use as a mediator was statistically significant. Findings illustrate the complexity of multiproblem youth. PMID:18455886
Anderson, Devon E; Watts, Bradley V
2013-09-01
Despite innumerable attempts to eliminate the postoperative retention of surgical sponges, the medical error persists in operating rooms worldwide and places significant burden on patient safety, quality of care, financial resources, and hospital/physician reputation. The failure of countless solutions, from new sponge counting methods to radio labeled sponges, to truly eliminate the event in the operating room requires that the emerging field of health-care delivery science find innovative ways to approach the problem. Accordingly, the VA National Center for Patient Safety formed a unique collaboration with a team at the Thayer School of Engineering at Dartmouth College to evaluate the retention of surgical sponges after surgery and find a solution. The team used an engineering problem solving methodology to develop the best solution. To make the operating room a safe environment for patients, the team identified a need to make the sponge itself safe for use as opposed to resolving the relatively innocuous counting methods. In evaluation of this case study, the need for systematic engineering evaluation to resolve problems in health-care delivery becomes clear.
Analysis of individual risk belief structures
DOE Office of Scientific and Technical Information (OSTI.GOV)
Tonn, B.E.; Travis, C.B.; Arrowood, L.
An interactive computer program developed at Oak Ridge National Laboratory is presented as a methodology to model individualized belief structures. The logic and general strategy of the model are presented for two risk topics: AIDS and toxic waste. Subjects identified desirable and undesirable consequences for each topic and formulated an associative rule linking topic and consequence in either a causal or correlational framework. Likelihood estimates, generated by subjects in several formats (probability, odds statements, etc.), constituted one outcome measure. Additionally, source of belief (personal experience, news media, etc.) and perceived personal and societal impact are reviewed. Briefly, subjects believe that AIDS causes significant emotional problems and, to a lesser degree, physical health problems, whereas toxic waste causes significant environmental problems.
On multi-site damage identification using single-site training data
NASA Astrophysics Data System (ADS)
Barthorpe, R. J.; Manson, G.; Worden, K.
2017-11-01
This paper proposes a methodology for developing multi-site damage location systems for engineering structures that can be trained using single-site damaged-state data only. The methodology involves training a sequence of binary classifiers based upon single-site damage data and combining the developed classifiers into a robust multi-class damage locator. In this way, the multi-site damage identification problem may be decomposed into a sequence of binary decisions. In this paper Support Vector Classifiers are adopted as the means of making these binary decisions. The proposed methodology represents an advancement on the state of the art in multi-site damage identification, in which existing approaches require either: (1) full damaged-state data from single- and multi-site damage cases or (2) the development of a physics-based model to make multi-site predictions. The potential benefit of the proposed methodology is that a significantly reduced number of recorded damage states may be required in order to train a multi-site damage locator without recourse to physics-based model predictions. In this paper it is first demonstrated that Support Vector Classification represents an appropriate approach to the multi-site damage location problem, with methods for combining binary classifiers discussed. Next, the proposed methodology is demonstrated and evaluated through application to a real engineering structure - a Piper Tomahawk trainer aircraft wing - with its performance compared to classifiers trained using the full damaged-state dataset.
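The core idea, training one binary classifier per damage site and combining them into a multi-class locator, can be sketched in Python with scikit-learn. The three-feature synthetic data and the max-score combination rule below are illustrative assumptions; the actual features and combination scheme come from the aircraft-wing study.

```python
import numpy as np
from sklearn.svm import SVC

rng = np.random.default_rng(42)

def make_data(site, n=60):
    """Synthetic features: damage at `site` shifts that feature's mean."""
    x = rng.normal(0.0, 1.0, (n, 3))
    if site is not None:
        x[:, site] += 3.0
    return x

# One binary SVC per site, each trained on single-site damage data only
undamaged = make_data(None)
classifiers = []
for site in range(3):
    damaged = make_data(site)
    X = np.vstack([undamaged, damaged])
    y = np.r_[np.zeros(len(undamaged)), np.ones(len(damaged))]
    classifiers.append(SVC(kernel="linear").fit(X, y))

def locate(x):
    """Combine binary scores: the most confident 'damaged' vote wins."""
    scores = [clf.decision_function(x.reshape(1, -1))[0] for clf in classifiers]
    return int(np.argmax(scores))
```

Each classifier only ever sees its own damage site during training, yet the combined `locate` function resolves a multi-class decision, which is the decomposition the methodology relies on.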
Twenty Years of Cultural Imperialism Research: Some Conceptual and Methodological Problems.
ERIC Educational Resources Information Center
Burrowes, Carl Patrick
While the notion of "cultural imperialism" has received significant attention in communication studies since the early 1970s, researchers have ignored analyses of message systems and audience cultivation in favor of institutional analysis. Likewise, researchers have concentrated on the technologies, media products and processes of…
Breastfeeding Is Positively Associated with Child Intelligence Even Net of Parental IQ
ERIC Educational Resources Information Center
Kanazawa, Satoshi
2015-01-01
Some previous reviews conclude that breastfeeding is not significantly associated with increased intelligence in children once mother's IQ is statistically controlled. The conclusion may potentially have both theoretical and methodological problems. The National Child Development Study allows the examination of the effect of breastfeeding on…
Eco-analytical Methodology in Environmental Problems Monitoring
NASA Astrophysics Data System (ADS)
Agienko, M. I.; Bondareva, E. P.; Chistyakova, G. V.; Zhironkina, O. V.; Kalinina, O. I.
2017-01-01
Among the problems common to all mankind, whose solutions influence the prospects of civilization, the monitoring of the ecological situation occupies a very important place. Solving this problem requires a specific methodology based on an eco-analytical comprehension of global issues. Eco-analytical methodology should help in searching for the optimum balance between environmental problems and accelerating scientific and technical progress. The fact that governments, corporations, scientists and nations focus on the production and consumption of material goods causes great damage to the environment. As a result, the activity of environmentalists develops quite spontaneously, as a complement to productive activities. Therefore, the challenge posed by environmental problems for science is the formation of eco-analytical reasoning and the monitoring of global problems common to the whole of humanity. It is thus expected to find the optimal trajectory of industrial development so as to prevent irreversible problems in the biosphere that could halt the progress of civilization.
A variable-gain output feedback control design methodology
NASA Technical Reports Server (NTRS)
Halyo, Nesim; Moerder, Daniel D.; Broussard, John R.; Taylor, Deborah B.
1989-01-01
A digital control system design technique is developed in which the control system gain matrix varies with the plant operating point parameters. The design technique is obtained by formulating the problem as an optimal stochastic output feedback control law with variable gains. This approach provides a control theory framework within which the operating range of a control law can be significantly extended. Furthermore, the approach avoids the major shortcomings of the conventional gain-scheduling techniques. The optimal variable gain output feedback control problem is solved by embedding the Multi-Configuration Control (MCC) problem, previously solved at ICS. An algorithm to compute the optimal variable gain output feedback control gain matrices is developed. The algorithm is a modified version of the MCC algorithm improved so as to handle the large dimensionality which arises particularly in variable-gain control problems. The design methodology developed is applied to a reconfigurable aircraft control problem. A variable-gain output feedback control problem was formulated to design a flight control law for an AFTI F-16 aircraft which can automatically reconfigure its control strategy to accommodate failures in the horizontal tail control surface. Simulations of the closed-loop reconfigurable system show that the approach produces a control design which can accommodate such failures with relative ease. The technique can be applied to many other problems including sensor failure accommodation, mode switching control laws and super agility.
Dynamic Decision Making under Uncertainty and Partial Information
2017-01-30
In order to address these problems, we investigated efficient computational methodologies for dynamic decision making under uncertainty and partial information. In the course of this research, we (i) developed and studied efficient simulation-based methodologies for dynamic decision making under uncertainty and partial information; and (ii) studied the application of these decision-making models and methodologies to practical problems, such as those
Standing on the shoulders of giants: improving medical image segmentation via bias correction.
Wang, Hongzhi; Das, Sandhitsu; Pluta, John; Craige, Caryne; Altinay, Murat; Avants, Brian; Weiner, Michael; Mueller, Susanne; Yushkevich, Paul
2010-01-01
We propose a simple strategy to improve automatic medical image segmentation. The key idea is that without deep understanding of a segmentation method, we can still improve its performance by directly calibrating its results with respect to manual segmentation. We formulate the calibration process as a bias correction problem, which is addressed by machine learning using training data. We apply this methodology on three segmentation problems/methods and show significant improvements for all of them.
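A minimal Python sketch of the calibration idea follows. The thresholding "host" segmenter, the 1-D intensities, and the logistic-regression corrector are all illustrative assumptions; the paper applies the idea to real segmentation methods, learning the bias correction from manual segmentations.

```python
import numpy as np
from sklearn.linear_model import LogisticRegression

rng = np.random.default_rng(0)

def host_segment(img):
    """Stand-in 'host' method with a systematic bias (threshold too low)."""
    return (img > 0.3).astype(int)

# Training data: intensities with manual (ground-truth) segmentations
train = rng.uniform(0, 1, 5000)
manual = (train > 0.5).astype(int)          # 'manual segmentation'
feats = np.c_[train, host_segment(train)]   # intensity + host output
corrector = LogisticRegression().fit(feats, manual)

def corrected_segment(img):
    """Calibrate the host output against manual labels, no host internals needed."""
    return corrector.predict(np.c_[img, host_segment(img)])
```

The corrector never inspects how the host method works; it only learns the mapping from (input, host output) to the manual label, which is the bias-correction formulation described above.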
Wang, Long; Zou, Wei; Chi, Qing-bin
2009-06-01
In order to explore the problems and countermeasures in the methodology of current clinical research on acupuncture and moxibustion, clinical research literature about acupuncture and moxibustion (Acup-Mox) published in recent years in our country was reviewed. Given the urgent need for the current internationalization of Acup-Mox, the authors propose a model of clinical research on Acup-Mox that strictly adheres to international standards while fully embodying traditional Chinese medicine characteristics in the acupuncture intervention measures. It is indicated that innovation in the methodology of clinical research on Acup-Mox has great significance for improving the quality of clinical research on Acup-Mox in our country and promoting the internationalization of Acup-Mox.
Determining Training Device Requirements in Army Aviation Systems
NASA Technical Reports Server (NTRS)
Poumade, M. L.
1984-01-01
A decision making methodology which applies the systems approach to the training problem is discussed. Training is viewed as a total system instead of a collection of individual devices and unrelated techniques. The core of the methodology is the use of optimization techniques such as the transportation algorithm and multiobjective goal programming with training task and training device specific data. The role of computers, especially automated data bases and computer simulation models, in the development of training programs is also discussed. The approach can provide significant training enhancement and cost savings over the more traditional, intuitive form of training development and device requirements process. While given from an aviation perspective, the methodology is equally applicable to other training development efforts.
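The transportation-algorithm component of such an optimization can be sketched as a small linear program in Python with SciPy. The device "supplies", task "demands", and cost matrix below are invented toy numbers, not data from the methodology.

```python
import numpy as np
from scipy.optimize import linprog

# cost[i][j]: cost of covering training task j with training device i
cost = np.array([[1.0, 2.0],
                 [3.0, 1.0]])
supply = [10, 20]   # available device hours per device
demand = [15, 15]   # required training hours per task

# Flatten x[i][j] into a vector; equality rows fix row and column sums
A_eq = [[1, 1, 0, 0],   # device 0 supply used in full
        [0, 0, 1, 1],   # device 1 supply used in full
        [1, 0, 1, 0],   # task 0 demand met
        [0, 1, 0, 1]]   # task 1 demand met
res = linprog(cost.ravel(), A_eq=A_eq, b_eq=supply + demand,
              bounds=[(0, None)] * 4, method="highs")
```

For this instance the optimum assigns each device mostly to its cheap task (total cost 40), illustrating how device-to-task allocation can be posed as a standard transportation problem rather than decided intuitively.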
NASA Astrophysics Data System (ADS)
Mashood, K. K.; Singh, Vijay A.
2013-09-01
Research suggests that problem-solving skills are transferable across domains. This claim, however, needs further empirical substantiation. We suggest correlation studies as a methodology for making preliminary inferences about transfer. The correlation of the physics performance of students with their performance in chemistry and mathematics in highly competitive problem-solving examinations was studied using a massive database. The sample sizes ranged from hundreds to a few hundred thousand. Encouraged by the presence of significant correlations, we interviewed 20 students to explore the pedagogic potential of physics in imparting transferable problem-solving skills. We report strategies and practices relevant to physics employed by these students which foster transfer.
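The correlation methodology itself is straightforward to sketch in Python; here synthetic scores with a shared "skill" component stand in for the actual examination database, so the numbers are illustrative only.

```python
import numpy as np
from scipy.stats import pearsonr

rng = np.random.default_rng(3)
n = 300
skill = rng.normal(0, 1, n)                  # shared problem-solving ability
physics = skill + rng.normal(0, 0.5, n)      # domain score = skill + noise
chemistry = skill + rng.normal(0, 0.5, n)

# A significant positive correlation is consistent with (though not proof of)
# a transferable problem-solving component shared across domains
r, p_value = pearsonr(physics, chemistry)
```

As the abstract notes, such a correlation supports only a preliminary inference about transfer; interviews or controlled designs are needed to rule out confounds.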
Bhalli, Muhammad Asif; Khan, Ishtiaq Ali; Sattar, Abdul
2015-01-01
Researchers have categorized learning styles in many ways. Kolb proposed a classification of learners as convergers, divergers, assimilators and accommodators. Honey and Mumford simplified learning styles as activists, reflectors, theorists and pragmatists. Neil Fleming's VARK model (Visual, Auditory, Read/write and Kinesthetic) is also popular. This study was carried out to determine the frequency of learning styles (Honey and Mumford) among medical students and their correlation with preferred teaching methodologies and academic achievements. A total of 77 medical students of 4th year MBBS were selected through non-probability convenience sampling. Honey and Mumford's learning style questionnaire and a second questionnaire on preferences for different teaching methodologies were distributed to the students. Learning styles were identified and correlated with preferred teaching methodologies and academic achievements by Chi-square test. Mean age of the medical students was 22.75 ± 1.05 years. Twenty-one (27.3%) participants were males and 56 (72.7%) females. By learning style, 7 (9.1%) medical students were activists, 36 (46.8%) reflectors, 13 (16.9%) theorists and 21 (27.3%) pragmatists. Out of 77 students, 22 preferred interactive lectures; 16, small group discussion; 20, problem-based learning; and 10, demonstration on models. Only one student preferred one-way lectures as the best teaching methodology. No significant correlation was found between learning styles and preferred teaching methodologies or between learning styles and academic scores. Most of the medical students had reflector (46.8%) or pragmatist (27.3%) learning styles. The majority preferred interactive lectures (28.57%) and problem-based learning (25.98%) as teaching methodologies. Aligning instructional strategies with the learning styles of medical students should improve learning and academic performance.
Methodological issues in the study of violence against women
Ruiz‐Pérez, Isabel; Plazaola‐Castaño, Juncal; Vives‐Cases, Carmen
2007-01-01
The objective of this paper is to review the methodological issues that arise when studying violence against women as a public health problem, focusing on intimate partner violence (IPV), since this is the form of violence that has the greatest consequences at a social and political level. The paper focuses first on the problems of defining what is meant by IPV. Secondly, the paper describes the difficulties in assessing the magnitude of the problem. Obtaining reliable data on this type of violence is a complex task, because of the methodological issues derived from the very nature of the phenomenon, such as the private, intimate context in which this violence often takes place, which means the problem cannot be directly observed. Finally, the paper examines the limitations and bias in research on violence, including the lack of consensus with regard to measuring events that may or may not represent a risk factor for violence against women or the methodological problem related to the type of sampling used in both aetiological and prevalence studies. PMID:18000113
Employee Concerns and Counselor Role: A Factor Analysis.
ERIC Educational Resources Information Center
Mazer, Gilbert E.
A significant step in the direction of improving employee counselor services in an industrial setting is offered. The research combined survey and factor analytic methodology to empirically identify sources of employee concerns or stress and to measure the tendency of employees to use counseling services in connection with these problem areas. The…
Training effectiveness assessment: Methodological problems and issues
NASA Technical Reports Server (NTRS)
Cross, Kenneth D.
1992-01-01
The U.S. military uses a large number of simulators to train and sustain the flying skills of helicopter pilots. Despite the enormous resources required to purchase, maintain, and use those simulators, little effort has been expended in assessing their training effectiveness. One reason for this is the lack of an evaluation methodology that yields comprehensive and valid data at a practical cost. Some of the methodological problems and issues that arise in assessing simulator training effectiveness are discussed, along with problems with the classical transfer-of-learning paradigm.
Tractenberg, Saulo G; Levandowski, Mateus L; de Azeredo, Lucas Araújo; Orso, Rodrigo; Roithmann, Laura G; Hoffmann, Emerson S; Brenhouse, Heather; Grassi-Oliveira, Rodrigo
2016-09-01
Early life stress (ELS) developmental effects have been widely studied by preclinical researchers. Despite the growing body of evidence from ELS models, such as the maternal separation paradigm, the reported results have marked inconsistencies. The maternal separation model has several methodological pitfalls that could influence the reliability of its results. Here, we critically review 94 mice studies that addressed the effects of maternal separation on behavioural outcomes. We also discuss methodological issues related to the heterogeneity of separation protocols and the quality of reporting methods. Our findings indicate a lack of consistency in maternal separation effects: major studies of behavioural and biological phenotypes failed to find significant deleterious effects. Furthermore, we identified several specific variations in separation methodological procedures. These methodological variations could contribute to the inconsistency of maternal separation effects by producing different degrees of stress exposure in maternal separation-reared pups. These methodological problems, together with insufficient reporting, might lead to inaccurate and unreliable effect estimates in maternal separation studies. Copyright © 2016 Elsevier Ltd. All rights reserved.
Ethical and Legal Implications of the Methodological Crisis in Neuroimaging.
Kellmeyer, Philipp
2017-10-01
Currently, many scientific fields such as psychology or biomedicine face a methodological crisis concerning the reproducibility, replicability, and validity of their research. In neuroimaging, similar methodological concerns have taken hold of the field, and researchers are working frantically toward finding solutions for the methodological problems specific to neuroimaging. This article examines some ethical and legal implications of this methodological crisis in neuroimaging. With respect to ethical challenges, the article discusses the impact of flawed methods in neuroimaging research in cognitive and clinical neuroscience, particularly with respect to faulty brain-based models of human cognition, behavior, and personality. Specifically examined is whether such faulty models, when they are applied to neurological or psychiatric diseases, could put patients at risk, and whether this places special obligations on researchers using neuroimaging. In the legal domain, the actual use of neuroimaging as evidence in United States courtrooms is surveyed, followed by an examination of ways that the methodological problems may create challenges for the criminal justice system. Finally, the article reviews and promotes some promising ideas and initiatives from within the neuroimaging community for addressing the methodological problems.
Investigating gender differences in alcohol problems: a latent trait modeling approach.
Nichol, Penny E; Krueger, Robert F; Iacono, William G
2007-05-01
Inconsistent results have been found in research investigating gender differences in alcohol problems. Previous studies of gender differences used a wide range of methodological techniques, as well as limited assortments of alcohol problems. Parents (1,348 men and 1,402 women) of twins enrolled in the Minnesota Twin Family Study answered questions about a wide range of alcohol problems. A latent trait modeling technique was used to evaluate gender differences in the probability of endorsement at the problem level and for the overall 105-problem scale. Of the 34 problems that showed significant gender differences, 29 were more likely to be endorsed by men than women with equivalent overall alcohol problem levels. These male-oriented symptoms included measures of heavy drinking, duration of drinking, tolerance, and acting out behaviors. Nineteen symptoms were denoted for removal to create a scale that favored neither gender in assessment. Significant gender differences were found in approximately one-third of the symptoms assessed and in the overall scale. Further examination of the nature of gender differences in alcohol problem symptoms should be undertaken to investigate whether a gender-neutral scale should be created or if men and women should be assessed with separate criteria for alcohol dependence and abuse.
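The item-level comparison at equivalent overall problem levels can be sketched as a differential-item-functioning check in Python. The simulated latent trait, the coefficient values, and the use of plain logistic regression are illustrative assumptions; the study itself used a latent trait (IRT-style) model on real interview data.

```python
import numpy as np
from sklearn.linear_model import LogisticRegression

rng = np.random.default_rng(1)
n = 4000
trait = rng.normal(0, 1, n)        # overall alcohol-problem level
male = rng.integers(0, 2, n)       # 1 = male
# Simulate an item endorsed more readily by men at the SAME trait level
logit = 1.5 * trait + 1.0 * male - 0.5
prob = 1 / (1 + np.exp(-logit))
endorsed = (rng.uniform(size=n) < prob).astype(int)

# Regress endorsement on trait AND gender: a nonzero gender coefficient
# flags an item that favors one gender after equating overall severity
model = LogisticRegression().fit(np.c_[trait, male], endorsed)
trait_coef, gender_coef = model.coef_[0]
```

Items flagged this way (here, a positive `gender_coef`) correspond to the "male-oriented symptoms" the study identified and considered removing to build a gender-neutral scale.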
Ma, Nylanda; Roberts, Rachel; Winefield, Helen; Furber, Gareth
2015-02-01
While the importance of looking at the entire family system in the context of child and adolescent mental health is well recognised, siblings of children with mental health problems (MHPs) are often overlooked. The existing literature on the mental health of these siblings needs to be reviewed. A systematic search located publications from 1990 to 2011 in four electronic databases. Thirty-nine relevant studies reported data on the prevalence of psychopathology in siblings of target children with MHPs. Siblings of target children had higher rates of at least one type of psychopathology than comparison children. Risk of psychopathology varied across the type of MHP in the target child. Other covariates included sibling age and gender and parental psychopathology. Significant variations and limitations in methodology were found in the existing literature. Methodological guidelines for future studies are outlined. Implications for clinicians, parents, and for future research are discussed.
Bialas, Andrzej
2011-01-01
Intelligent sensors experience security problems very similar to those inherent to other kinds of IT products or systems. The assurance for these products or systems creation methodologies, like Common Criteria (ISO/IEC 15408) can be used to improve the robustness of the sensor systems in high risk environments. The paper presents the background and results of the previous research on patterns-based security specifications and introduces a new ontological approach. The elaborated ontology and knowledge base were validated on the IT security development process dealing with the sensor example. The contribution of the paper concerns the application of the knowledge engineering methodology to the previously developed Common Criteria compliant and pattern-based method for intelligent sensor security development. The issue presented in the paper has a broader significance in terms that it can solve information security problems in many application domains. PMID:22164064
Bialas, Andrzej
2011-01-01
NASA Technical Reports Server (NTRS)
Chen, Xiaoqin; Tamma, Kumar K.; Sha, Desong
1993-01-01
The present paper describes a new explicit virtual-pulse time integral methodology for nonlinear structural dynamics problems. The purpose of the paper is to provide the theoretical basis of the methodology and to demonstrate the applicability of the proposed formulations to nonlinear dynamic structures. Different from existing numerical methods such as direct time integration or mode superposition techniques, the proposed methodology offers a new perspective and development methodology, and possesses several unique and attractive computational characteristics. The methodology is tested and compared with the implicit Newmark method (trapezoidal rule) on nonlinear softening- and hardening-spring dynamic models. The numerical results indicate that the proposed explicit virtual-pulse time integral methodology is an excellent alternative for solving general nonlinear dynamic problems.
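The implicit Newmark (trapezoidal-rule) baseline that the comparison uses can be sketched for a hardening-spring oscillator in Python. The specific spring law, step size, and fixed-point solver are illustrative choices, and the virtual-pulse method itself is not reproduced here.

```python
import numpy as np

def trapezoidal_step(x, v, dt, force, m=1.0, iters=30):
    """One implicit trapezoidal (Newmark average-acceleration) step for
    m*x'' = -force(x), solved by fixed-point iteration. Baseline sketch only."""
    a = -force(x) / m
    xn, vn = x, v
    for _ in range(iters):
        an = -force(xn) / m
        vn = v + 0.5 * dt * (a + an)
        xn = x + 0.5 * dt * (v + vn)
    return xn, vn

def hardening(x):
    return x + 0.5 * x**3            # nonlinear hardening-spring restoring force

x, v, dt = 1.0, 0.0, 0.01
energy0 = 0.5 * v**2 + 0.5 * x**2 + 0.125 * x**4   # kinetic + potential
for _ in range(1000):
    x, v = trapezoidal_step(x, v, dt, hardening)
energy = 0.5 * v**2 + 0.5 * x**2 + 0.125 * x**4
```

The implicit solve at every step is precisely the cost that an explicit scheme such as the virtual-pulse methodology aims to avoid while retaining good accuracy.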
Conceptual, Methodological, and Ethical Problems in Communicating Uncertainty in Clinical Evidence
Han, Paul K. J.
2014-01-01
The communication of uncertainty in clinical evidence is an important endeavor that poses difficult conceptual, methodological, and ethical problems. Conceptual problems include logical paradoxes in the meaning of probability and “ambiguity”— second-order uncertainty arising from the lack of reliability, credibility, or adequacy of probability information. Methodological problems include questions about optimal methods for representing fundamental uncertainties and for communicating these uncertainties in clinical practice. Ethical problems include questions about whether communicating uncertainty enhances or diminishes patient autonomy and produces net benefits or harms. This article reviews the limited but growing literature on these problems and efforts to address them and identifies key areas of focus for future research. It is argued that the critical need moving forward is for greater conceptual clarity and consistent representational methods that make the meaning of various uncertainties understandable, and for clinical interventions to support patients in coping with uncertainty in decision making. PMID:23132891
Advanced Methodology for Simulation of Complex Flows Using Structured Grid Systems
NASA Technical Reports Server (NTRS)
Steinthorsson, Erlendur; Modiano, David
1995-01-01
Detailed simulations of viscous flows in complicated geometries pose a significant challenge to current capabilities of Computational Fluid Dynamics (CFD). To enable routine application of CFD to this class of problems, advanced methodologies are required that employ (a) automated grid generation, (b) adaptivity, (c) accurate discretizations and efficient solvers, and (d) advanced software techniques. Each of these ingredients contributes to increased accuracy, efficiency (in terms of human effort and computer time), and/or reliability of CFD software. In the long run, methodologies employing structured grid systems will remain a viable choice for routine simulation of flows in complex geometries only if genuinely automatic grid generation techniques for structured grids can be developed and if adaptivity is employed more routinely. More research in both these areas is urgently needed.
[Methodological problems in the use of information technologies in physical education].
Martirosov, E G; Zaĭtseva, G A
2000-01-01
The paper considers methodological problems in the use of computer technologies in physical education by applying diagnostic and consulting systems, educational and educational-and-training process automation systems, and control and self-control programmes for athletes and others.
Engineering management of large scale systems
NASA Technical Reports Server (NTRS)
Sanders, Serita; Gill, Tepper L.; Paul, Arthur S.
1989-01-01
The organization of high technology and engineering problem solving has given rise to an emerging concept. Reasoning principles for integrating traditional engineering problem solving with systems theory, management science, behavioral decision theory, and planning and design approaches can be incorporated into a methodological approach to solving problems with a long-range perspective. Long-range planning has great potential to improve productivity through a systematic and organized approach. Thus, efficiency and cost effectiveness are the driving forces behind the organization of engineering problems. Aspects of systems engineering that provide an understanding of the management of large-scale systems are broadly covered here. Due to the focus and application of the research, other significant factors (e.g., human behavior, decision making) are not emphasized but are considered.
ERIC Educational Resources Information Center
Lennon, Mary Clare; Appelbaum, Lauren D.; Aber, J. Lawrence; McCaskie, Katherine
This study examined public attitudes toward the most vulnerable of the poor--those who experience significant personal or situational problems that can create obstacles to employment. Using both a factorial survey methodology and a general attitude survey, researchers gathered information about public opinion toward people in need, low-income…
Methodology in Training Future Technology and Engineering Teachers in the USA
ERIC Educational Resources Information Center
Androshchuk, Iryna; Androshchuk, Ihor
2017-01-01
In the article, the problem under study has been justified and the significance of studying foreign experience in training future technology and engineering teachers in the USA has been determined. Particular attention has been paid to explaining the methods and forms of organization of future technology and engineering teachers' training in the USA.…
Conjugate gradient based projection - A new explicit methodology for frictional contact
NASA Technical Reports Server (NTRS)
Tamma, Kumar K.; Li, Maocheng; Sha, Desong
1993-01-01
With special attention to applicability to parallel computation and vectorization, a new and effective explicit approach involving a conjugate gradient based projection methodology is proposed in this study for linear complementary formulations of contact problems with Coulomb friction. The overall objective is to provide an explicit computational methodology for the complete contact problem with friction. In this regard, the primary idea for solving the linear complementary formulations stems from an established search direction which is projected onto a feasible region determined by the non-negativity constraint; this direction is then applied within the Fletcher-Reeves conjugate gradient method, resulting in a powerful explicit methodology which possesses high accuracy, excellent convergence characteristics and fast computational speed, and is relatively simple to implement for contact problems involving Coulomb friction.
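As a rough illustration of the projection idea in this abstract, step along a search direction and then project back onto the feasible region z >= 0 of the linear complementarity problem, here is a minimal sketch. It uses a plain steepest-descent direction rather than the paper's Fletcher-Reeves conjugate direction, and the 2x2 test problem is invented:

```python
def solve_lcp_projected(M, q, alpha=0.25, iters=500):
    """Projected-gradient sketch for the LCP:  w = M z + q,  z >= 0, w >= 0, z'w = 0.

    For symmetric positive semidefinite M this is equivalent to minimizing
    0.5*z'Mz + q'z over z >= 0: step along -(M z + q), then project onto
    the non-negative orthant (the feasibility projection named in the abstract).
    """
    n = len(q)
    z = [0.0] * n
    for _ in range(iters):
        grad = [sum(M[i][j] * z[j] for j in range(n)) + q[i] for i in range(n)]
        z = [max(z[i] - alpha * grad[i], 0.0) for i in range(n)]  # projection step
    return z

# Small invented complementarity test problem with known solution z = [1, 0]
M = [[2.0, 0.0], [0.0, 2.0]]
q = [-2.0, 1.0]
z = solve_lcp_projected(M, q)
print(z)  # close to [1.0, 0.0]
```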
Spinal Cord Injury-Induced Dysautonomia via Plasticity in Paravertebral Sympathetic Postganglionic
2017-10-01
their near anatomical inaccessibility. We have solved the accessibility problem with a strategic methodological advance. We will determine the extent to which paravertebral
Human Prenatal Effects: Methodological Problems and Some Suggested Solutions
ERIC Educational Resources Information Center
Copans, Stuart A.
1974-01-01
Briefly reviews the relevant literature on human prenatal effects, describes some of the possible designs for such studies; and discusses some of the methodological problem areas: sample choice, measurement of prenatal variables, monitoring of labor and delivery, and neonatal assessment. (CS)
Zhang, Yong-Feng; Chiang, Hsiao-Dong
2017-09-01
A novel three-stage methodology, termed the "consensus-based particle swarm optimization (PSO)-assisted Trust-Tech methodology," to find global optimal solutions for nonlinear optimization problems is presented. It is composed of Trust-Tech methods, consensus-based PSO, and local optimization methods that are integrated to compute a set of high-quality local optimal solutions that can contain the global optimal solution. The proposed methodology compares very favorably with several recently developed PSO algorithms based on a set of small-dimension benchmark optimization problems and 20 large-dimension test functions from the CEC 2010 competition. The analytical basis for the proposed methodology is also provided. Experimental results demonstrate that the proposed methodology can rapidly obtain high-quality optimal solutions that can contain the global optimal solution. The scalability of the proposed methodology is promising.
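The abstract's three-stage pipeline (Trust-Tech, consensus-based PSO, local refinement) is not reproduced here, but the PSO core on which it builds can be sketched; the coefficients and the sphere benchmark are conventional choices, not taken from the paper:

```python
import random

def pso(f, dim, n_particles=30, iters=200, bounds=(-5.0, 5.0)):
    """Minimal particle swarm optimizer (the plain PSO core only; the paper's
    consensus strategy and Trust-Tech stages are not reproduced here)."""
    lo, hi = bounds
    w, c1, c2 = 0.7, 1.5, 1.5            # inertia and acceleration coefficients
    xs = [[random.uniform(lo, hi) for _ in range(dim)] for _ in range(n_particles)]
    vs = [[0.0] * dim for _ in range(n_particles)]
    pbest = [x[:] for x in xs]            # each particle's best-seen position
    pbest_val = [f(x) for x in xs]
    g = min(range(n_particles), key=lambda i: pbest_val[i])
    gbest, gbest_val = pbest[g][:], pbest_val[g]
    for _ in range(iters):
        for i in range(n_particles):
            for d in range(dim):
                r1, r2 = random.random(), random.random()
                vs[i][d] = (w * vs[i][d]
                            + c1 * r1 * (pbest[i][d] - xs[i][d])
                            + c2 * r2 * (gbest[d] - xs[i][d]))
                xs[i][d] = min(max(xs[i][d] + vs[i][d], lo), hi)
            val = f(xs[i])
            if val < pbest_val[i]:        # update personal and global bests
                pbest[i], pbest_val[i] = xs[i][:], val
                if val < gbest_val:
                    gbest, gbest_val = xs[i][:], val
    return gbest, gbest_val

random.seed(0)
best, best_val = pso(lambda x: sum(xi * xi for xi in x), dim=5)
print(best_val)  # near 0 for the sphere benchmark
```

In the paper's scheme, multiple such swarms would feed their consensus solutions into Trust-Tech methods to escape the basins of local optima; that stage is omitted here.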
Overcoming an obstacle in expanding a UMLS semantic type extent.
Chen, Yan; Gu, Huanying; Perl, Yehoshua; Geller, James
2012-02-01
This paper strives to overcome a major problem encountered by a previous expansion methodology for discovering concepts highly likely to be missing a specific semantic type assignment in the UMLS. This methodology is the basis for an algorithm that presents the discovered concepts to a human auditor for review and possible correction. We analyzed the problem of the previous expansion methodology and discovered that it was due to an obstacle constituted by one or more concepts assigned the UMLS Semantic Network semantic type Classification. A new methodology was designed that bypasses such an obstacle without a combinatorial explosion in the number of concepts presented to the human auditor for review. The new expansion methodology with obstacle avoidance was tested with the semantic type Experimental Model of Disease and found over 500 concepts missed by the previous methodology that are in need of this semantic type assignment. Furthermore, other semantic types suffering from the same major problem were discovered, indicating that the methodology is of more general applicability. The algorithmic discovery of concepts that are likely missing a semantic type assignment is possible even in the face of obstacles, without an explosion in the number of processed concepts. Copyright © 2011 Elsevier Inc. All rights reserved.
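The expansion-with-obstacle-avoidance idea can be illustrated with a toy breadth-first traversal: concepts carrying the obstacle semantic type (here Classification) are traversed through but never reported, so the expansion continues past them without blowing up the candidate list. The graph, concept names, and candidate test below are invented for illustration and are not the UMLS algorithm itself:

```python
from collections import deque

def expand_extent(graph, types, seeds, target_type, obstacle_type="Classification"):
    """Breadth-first expansion over a toy IS-A concept graph.

    Concepts assigned the obstacle semantic type are expanded *through* but
    not reported; other descendants lacking the target semantic type are
    reported as candidates for auditor review.
    """
    candidates, seen = [], set(seeds)
    queue = deque(seeds)
    while queue:
        concept = queue.popleft()
        for child in graph.get(concept, []):
            if child in seen:
                continue
            seen.add(child)
            queue.append(child)           # keep expanding past obstacles
            child_types = types.get(child, set())
            if obstacle_type in child_types:
                continue                   # obstacle concept: bypass, do not report
            if target_type not in child_types:
                candidates.append(child)   # likely missing the assignment
    return candidates

# Invented toy hierarchy: an obstacle node sits between seed and a candidate
graph = {"Disease": ["ModelGroup", "Flu"], "ModelGroup": ["MouseModel"]}
types = {"ModelGroup": {"Classification"}, "Flu": {"Disease or Syndrome"},
         "MouseModel": {"Disease or Syndrome"}}
print(expand_extent(graph, types, ["Disease"], "Experimental Model of Disease"))
```

The point of the sketch is that the visited set, not the candidate list, controls traversal cost, so bypassing an obstacle does not cause a combinatorial explosion in what the auditor must review.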
Artificial intelligence and design: Opportunities, research problems and directions
NASA Technical Reports Server (NTRS)
Amarel, Saul
1990-01-01
The issues of industrial productivity and economic competitiveness are of major significance in the U.S. at present. By advancing the science of design, and by creating a broad computer-based methodology for automating the design of artifacts and of industrial processes, we can attain dramatic improvements in productivity. It is our thesis that developments in computer science, especially in Artificial Intelligence (AI) and in related areas of advanced computing, provide us with a unique opportunity to push beyond the present level of computer aided automation technology and to attain substantial advances in the understanding and mechanization of design processes. To attain these goals, we need to build on top of the present state of AI, and to accelerate research and development in areas that are especially relevant to design problems of realistic complexity. We propose an approach to the special challenges in this area, which combines 'core work' in AI with the development of systems for handling significant design tasks. We discuss the general nature of design problems, the scientific issues involved in studying them with the help of AI approaches, and the methodological/technical issues that one must face in developing AI systems for handling advanced design tasks. Looking at basic work in AI from the perspective of design automation, we identify a number of research problems that need special attention. These include finding solution methods for handling multiple interacting goals, formation problems, problem decompositions, and redesign problems; choosing representations for design problems with emphasis on the concept of a design record; and developing approaches for the acquisition and structuring of domain knowledge with emphasis on finding useful approximations to domain theories. 
Progress in handling these research problems will have major impact both on our understanding of design processes and their automation, and also on several fundamental questions that are of intrinsic concern to AI. We present examples of current AI work on specific design tasks, and discuss new directions of research, both as extensions of current work and in the context of new design tasks where domain knowledge is either intractable or incomplete. The domains discussed include Digital Circuit Design, Mechanical Design of Rotational Transmissions, Design of Computer Architectures, Marine Design, Aircraft Design, and Design of Chemical Processes and Materials. Work in these domains is significant on technical grounds, and it is also important for economic and policy reasons.
NASA Astrophysics Data System (ADS)
Delahunty, Thomas; Seery, Niall; Lynch, Raymond
2018-04-01
Currently, there is significant interest being directed towards the development of STEM education to meet economic and societal demands. While economic concerns can be a powerful driving force in advancing the STEM agenda, care must be taken that such economic imperative does not promote research approaches that overemphasize pragmatic application at the expense of augmenting the fundamental knowledge base of the discipline. This can be seen in the predominance of studies investigating problem solving approaches and procedures, while neglecting representational and conceptual processes, within the literature. Complementing concerns about STEM graduates' problem solving capabilities, raised within the pertinent literature, this paper discusses a novel methodological approach aimed at investigating the cognitive elements of problem conceptualization. The intention is to demonstrate a novel method of data collection that overcomes some of the limitations cited in classic problem solving research while balancing a search for fundamental understanding with the possibility of application. The methodology described in this study employs an electroencephalographic (EEG) headset, as part of a mixed methods approach, to gather objective evidence of students' cognitive processing during problem solving epochs. The method described provides rich evidence of students' cognitive representations of problems during episodes of applied reasoning. The reliability and validity of the EEG method is supported by the stability of the findings across the triangulated data sources. The paper presents a novel method in the context of research within STEM education and demonstrates an effective procedure for gathering rich evidence of cognitive processing during the early stages of problem conceptualization.
Use of multilevel modeling for determining optimal parameters of heat supply systems
NASA Astrophysics Data System (ADS)
Stennikov, V. A.; Barakhtenko, E. A.; Sokolov, D. V.
2017-07-01
The problem of finding optimal parameters of a heat supply system (HSS) consists in ensuring the required throughput capacity of a heat network by determining pipeline diameters and the characteristics and location of pumping stations. Effective methods for solving this problem, i.e., the method of stepwise optimization based on the concept of dynamic programming and the method of multicircuit optimization, were proposed in the context of the hydraulic circuit theory developed at Melentiev Energy Systems Institute (Siberian Branch, Russian Academy of Sciences). These methods make it possible to determine optimal parameters of various types of piping systems owing to the flexible adaptability of the calculation procedure to intricate nonlinear mathematical models describing the features of the equipment used and the methods of its construction and operation. The new and most significant results achieved in developing methodological support and software for finding optimal parameters of complex heat supply systems are presented: a new procedure for solving the problem based on multilevel decomposition of a heat network model that makes it possible to proceed from the initial problem to a set of interrelated, less cumbersome subproblems with reduced dimensionality; a new algorithm implementing the method of multicircuit optimization and focused on the calculation of a hierarchical model of a heat supply system; and the SOSNA software system for determining optimal parameters of intricate heat supply systems and implementing the developed methodological foundation. The proposed procedure and algorithm make it possible to solve engineering problems of finding the optimal parameters of multicircuit heat supply systems of large (real) dimensionality, and they are applied in solving urgent problems related to the optimal development and reconstruction of these systems. 
The developed methodological foundation and software can be used for designing heat supply systems in the Central and the Admiralty regions in St. Petersburg, the city of Bratsk, and the Magistral'nyi settlement.
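The stepwise (dynamic-programming) flavor of the parameter optimization can be sketched for the simplest case, a serial pipeline: choose one diameter per segment to minimize pipe cost while total head loss stays within a budget, with the used-up head discretized onto a grid. The head-loss constant, prices, and flows below are illustrative assumptions, not the SOSNA formulation:

```python
import math

def optimize_diameters(segments, diameters, head_budget, grid=200):
    """Dynamic-programming sketch of serial pipeline diameter selection.

    State = discretized head already consumed; value = (cheapest cost so far,
    diameter choices). Head loss uses a simplified Darcy-type law
    h ~ L*Q^2/D^5 with an assumed constant, purely for illustration.
    """
    step = head_budget / grid
    best = {0: (0.0, [])}                 # grid index of used head -> (cost, picks)
    for length, flow in segments:
        nxt = {}
        for used, (cost, picks) in best.items():
            for d, price in diameters:
                h = 0.0017 * length * flow ** 2 / d ** 5   # assumed loss model
                u = used + math.ceil(h / step)
                if u > grid:
                    continue               # this choice would exceed the budget
                c = cost + price * length
                if u not in nxt or c < nxt[u][0]:
                    nxt[u] = (c, picks + [d])
        best = nxt
    return min(best.values(), key=lambda t: t[0])  # cheapest feasible design

segments = [(500.0, 0.05), (300.0, 0.05)]                 # (length m, flow m^3/s)
diameters = [(0.15, 90.0), (0.20, 140.0), (0.25, 210.0)]  # (D in m, cost per m)
cost, picks = optimize_diameters(segments, diameters, head_budget=20.0)
print(cost, picks)
```

The multilevel decomposition described in the abstract would apply such a subproblem solver level by level over a hierarchically reduced network model rather than to the whole system at once.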
E-therapy for mental health problems: a systematic review.
Postel, Marloes G; de Haan, Hein A; De Jong, Cor A J
2008-09-01
The widespread availability of the Internet offers opportunities for improving access to therapy for people with mental health problems. There is a seemingly infinite supply of Internet-based interventions available on the World Wide Web. The aim of the present study is to systematically assess the methodological quality of randomized controlled trials (RCTs) concerning e-therapy for mental health problems. Two reviewers independently assessed the methodological quality of the RCTs, based on a list of criteria for the methodological quality assessment as recommended by the Cochrane Back Review Group. The search yielded 14 papers that reported RCTs concerning e-therapy for mental-health problems. The methodological quality of studies included in this review was generally low. It is concluded that e-therapy may turn out to be an appropriate therapeutic entity, but the evidence needs to be more convincing. Recommendations are made concerning the method of reporting RCTs and the need to add some content items to an e-therapy study.
Fitting methods to paradigms: are ergonomics methods fit for systems thinking?
Salmon, Paul M; Walker, Guy H; M Read, Gemma J; Goode, Natassia; Stanton, Neville A
2017-02-01
The issues being tackled within ergonomics problem spaces are shifting. Although existing paradigms appear relevant for modern day systems, it is worth questioning whether our methods are. This paper asks whether the complexities of systems thinking, a currently ubiquitous ergonomics paradigm, are outpacing the capabilities of our methodological toolkit. This is achieved through examining the contemporary ergonomics problem space and the extent to which ergonomics methods can meet the challenges posed. Specifically, five key areas within the ergonomics paradigm of systems thinking are focused on: normal performance as a cause of accidents, accident prediction, system migration, systems concepts and ergonomics in design. The methods available for pursuing each line of inquiry are discussed, along with their ability to respond to key requirements. In doing so, a series of new methodological requirements and capabilities are identified. It is argued that further methodological development is required to provide researchers and practitioners with appropriate tools to explore both contemporary and future problems. Practitioner Summary: Ergonomics methods are the cornerstone of our discipline. This paper examines whether our current methodological toolkit is fit for purpose given the changing nature of ergonomics problems. The findings provide key research and practice requirements for methodological development.
Increasing accuracy in the assessment of motion sickness: A construct methodology
NASA Technical Reports Server (NTRS)
Stout, Cynthia S.; Cowings, Patricia S.
1993-01-01
The purpose is to introduce a new methodology that should improve the accuracy of the assessment of motion sickness. This construct methodology utilizes both subjective reports of motion sickness and objective measures of physiological correlates to assess motion sickness. Existing techniques and methods are inadequate for use within the framework of a construct methodology. Current assessment techniques for diagnosing motion sickness and space motion sickness are reviewed, and attention is called to the problems with the current methods. Further, principles of psychophysiology that, when applied, are likely to resolve some of these problems are described in detail.
Effect of a PBL teaching method on learning about nursing care for patients with depression.
Arrue, Marta; Ruiz de Alegría, Begoña; Zarandona, Jagoba; Hoyos Cillero, Itziar
2017-05-01
Depression is a worldwide public health problem that requires the attention of qualified health professionals. The training of skilled nurses is a challenge for nursing instructors due to the complexity of this pathology. The aim was to analyse the declarative and argumentative knowledge about depression acquired by students receiving traditional expository instruction versus students receiving problem-based learning instruction. Quasi-experimental study with pre-test and post-test design in experimental and control groups to measure differences in the improvement of declarative and argumentative knowledge. Nonparametric tests were used to compare the scores between the experimental group and the control group, and between the pre-test and post-test in each group. 114 students participated in the study. Implementation of the study took place during the 2014-2015 academic year in the third year of the Nursing undergraduate degree courses at the University of the Basque Country (UPV/EHU) as part of the Mental Health Nursing subject. The data indicated that there were no statistically significant differences between the two methodologies in regard to declarative knowledge in the care of patients with depression. Nevertheless, the argumentative capacity of the experimental group improved significantly with the problem-based learning methodology (p=0.000). The results of the implementation indicated that problem-based learning was a satisfactory tool for the acquisition of argumentative capacity in depression nursing care. Still, working examples of teaching sequences that bridge the gap between general clinical practice and classroom practice remain an important goal for continuing research in nursing education. Copyright © 2017 Elsevier Ltd. All rights reserved.
Problem Solving in Biology: A Methodology
ERIC Educational Resources Information Center
Wisehart, Gary; Mandell, Mark
2008-01-01
A methodology is described that teaches science process by combining informal logic and a heuristic for rating factual reliability. This system facilitates student hypothesis formation, testing, and evaluation of results. After problem solving with this scheme, students are asked to examine and evaluate arguments for the underlying principles of…
SOME POSSIBLE APPLICATIONS OF PROJECT OUTCOMES RESEARCH METHODOLOGY
Section I refers to the possibility of using the theory and methodology of Project Outcomes for problems of strategic information. It is felt that...purposes of assessing present and future organizational effectiveness. Section IV refers to the applications that our study may have for problems of
A problem-oriented approach to journal selection for hospital libraries.
Delman, B S
1982-01-01
This paper describes a problem-oriented approach to journal selection (PAJS), including general methodology, theoretical terms, and a brief description of results when the system was applied in three different hospitals. The PAJS system relates the objective information which the MEDLARS data base offers about the universe of biomedical literature to objective, problem-oriented information supplied by the hospital's medical records. The results were manipulated quantitatively to determine (1) the relevance of various journals to each of the hospital's defined significant information problems and (2) the overall utility of each journal to the institution as a whole. The utility information was plotted on a graph to identify the collection of journal titles which would be most useful to the given hospital. Attempts made to verify certain aspects of the whole process are also described. The results suggest that the methodology is generally able to provide an effective library response. The system optimizes resources vis-a-vis information and can be used for both budget allocation and justification. It offers an algorithm to which operations researchers can apply any one of a variety of mathematical programming methods. Although originally intended for librarians in the community hospital environment, the PAJS system is generalizable and has application potential in a variety of special library settings. PMID:6758893
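The quantitative step of the PAJS system described above, weighting each journal's relevance to the hospital's defined information problems by problem significance and ranking journals by overall utility, can be sketched as follows; the journals, problems, and scores are invented for illustration:

```python
def journal_utilities(relevance, problem_weights):
    """Rank journals by overall utility to the institution.

    Utility = sum over problems of (journal's relevance to the problem) x
    (the problem's significance weight). A sketch of the PAJS idea only;
    all data here are invented, not drawn from MEDLARS or medical records.
    """
    scores = {}
    for journal, rel in relevance.items():
        scores[journal] = sum(rel.get(p, 0.0) * w for p, w in problem_weights.items())
    return sorted(scores.items(), key=lambda kv: kv[1], reverse=True)

# Hypothetical problem weights (from medical records) and relevance scores
problem_weights = {"diabetes": 0.5, "cardiology": 0.3, "obstetrics": 0.2}
relevance = {"J Clin Endocrinol": {"diabetes": 0.9},
             "Circulation": {"cardiology": 0.8, "diabetes": 0.1},
             "Obstet Gynecol": {"obstetrics": 0.7}}
print(journal_utilities(relevance, problem_weights))
```

Plotting the resulting ranked utilities, as the paper does, then lets a librarian pick the cut-off point where marginal utility no longer justifies a subscription.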
[Problem-based learning in cardiopulmonary resuscitation: basic life support].
Sardo, Pedro Miguel Garcez; Dal Sasso, Grace Terezinha Marcon
2008-12-01
Descriptive and exploratory study, aimed to develop an educational practice of Problem-Based Learning in CPR/BLS with 24 students in the third stage of the Nursing Undergraduate Course in a University in the Southern region of Brazil. The study used the PBL methodology, focused on problem situations of cardiopulmonary arrest, and was approved by the CONEP. The methodological strategies for data collection, such as participative observation and questionnaires to evaluate the learning, the educational practices and their methodology, allowed for grouping the results in: students' expectations; group activities; individual activities; practical activities; evaluation of the meetings and their methodology. The study showed that PBL allows the educator to evaluate the academic learning process in several dimensions, functioning as a motivating factor for both the educator and the student, because it allows the theoretical-practical integration in an integrated learning process.
Wightman, Jade; Julio, Flávia; Virués-Ortega, Javier
2014-05-01
Experimental functional analysis is an assessment methodology to identify the environmental factors that maintain problem behavior in individuals with developmental disabilities and in other populations. Functional analysis provides the basis for the development of reinforcement-based approaches to treatment. This article reviews the procedures, validity, and clinical implementation of the methodological variations of functional analysis and function-based interventions. We present six variations of functional analysis methodology in addition to the typical functional analysis: brief functional analysis, single-function tests, latency-based functional analysis, functional analysis of precursors, and trial-based functional analysis. We also present the three general categories of function-based interventions: extinction, antecedent manipulation, and differential reinforcement. Functional analysis methodology is a valid and efficient approach to the assessment of problem behavior and the selection of treatment strategies.
Methodological Problems on the Way to Integrative Human Neuroscience.
Kotchoubey, Boris; Tretter, Felix; Braun, Hans A; Buchheim, Thomas; Draguhn, Andreas; Fuchs, Thomas; Hasler, Felix; Hastedt, Heiner; Hinterberger, Thilo; Northoff, Georg; Rentschler, Ingo; Schleim, Stephan; Sellmaier, Stephan; Tebartz Van Elst, Ludger; Tschacher, Wolfgang
2016-01-01
Neuroscience is a multidisciplinary effort to understand the structures and functions of the brain and brain-mind relations. This effort results in an increasing amount of data, generated by sophisticated technologies. However, these data enhance our descriptive knowledge, rather than improve our understanding of brain functions. This is caused by methodological gaps both within and between subdisciplines constituting neuroscience, and the atomistic approach that limits the study of macro- and mesoscopic issues. Whole-brain measurement technologies do not resolve these issues, but rather aggravate them by the complexity problem. The present article is devoted to methodological and epistemic problems that obstruct the development of human neuroscience. We neither discuss ontological questions (e.g., the nature of the mind) nor review data, except when it is necessary to demonstrate a methodological issue. As regards intradisciplinary methodological problems, we concentrate on those within neurobiology (e.g., the gap between electrical and chemical approaches to neurophysiological processes) and psychology (missing theoretical concepts). As regards interdisciplinary problems, we suggest that core disciplines of neuroscience can be integrated using systemic concepts that also entail human-environment relations. We emphasize the necessity of a meta-discussion that should entail a closer cooperation with philosophy as a discipline of systematic reflection. The atomistic reduction should be complemented by the explicit consideration of the embodiedness of the brain and the embeddedness of humans. The discussion is aimed at the development of an explicit methodology of integrative human neuroscience, which will not only link different fields and levels, but also help in understanding clinical phenomena.
Case study of a problem-based learning course of physics in a telecommunications engineering degree
NASA Astrophysics Data System (ADS)
Macho-Stadler, Erica; Jesús Elejalde-García, Maria
2013-08-01
Active learning methods can be appropriate in engineering, as their methodology promotes meta-cognition, independent learning and problem-solving skills. Problem-based learning is the educational process by which problem-solving activities and instructor's guidance facilitate learning. Its key characteristic involves posing a 'concrete problem' to initiate the learning process, generally implemented by small groups of students. Many universities have developed and used active methodologies successfully in the teaching-learning process. During the past few years, the University of the Basque Country has promoted the use of active methodologies through several teacher training programmes. In this paper, we describe and analyse the results of the educational experience using the problem-based learning (PBL) method in a physics course for undergraduates enrolled in the technical telecommunications engineering degree programme. From an instructor's perspective, PBL strengths include better student attitude in class and increased instructor-student and student-student interactions. The students emphasised developing teamwork and communication skills in a good learning atmosphere as positive aspects.
Layer Stripping Solutions of Inverse Seismic Problems.
1985-03-21
problems--more so than has generally been recognized. The subject of this thesis is the theoretical development of the layer-stripping methodology, and ... medium varies sharply at each interface, which would be expected to cause difficulties for the algorithm, since it was designed for a smoothly varying ... methodology was applied in a novel way. The inverse problem considered in this chapter was that of reconstructing a layered medium from measurement of its
Kumar, Dinesh; Singh, Uday Shankar; Solanki, Rajanikant
2015-07-01
Early undergraduate exposure to research helps in producing physicians who are better equipped to meet their professional needs, especially analytical skills. To assess the effectiveness and acceptability of the small group method in teaching research methodology. Sixth-semester medical undergraduates (III MBBS, part 1) of a self-financed rural medical college. The workshop was of two full days' duration, consisting of two daily 30-minute sessions by faculty, followed by about four hours of group activity and a presentation by students at the end of the day. A simple 8-step approach was used. These steps are: Identify a Problem, Refine the Problem, Determine a Solution, Frame the Question, Develop a Protocol, Take Action, Write the Report and Share your Experience. Pre-test and post-test assessments were carried out using a questionnaire, followed by anonymous feedback at the end of the workshop. The responses were evaluated by a blinded evaluator. There were 95 (94.8%) valid responses out of the 99 students who attended the workshop. The mean pre-test and post-test scores were 4.21 and 10.37 respectively, and the difference was found to be significant using the Wilcoxon signed-rank test (p<0.001). The median feedback score regarding relevance, skill learning, quality of facilitation and gain in knowledge was 4, and that for the experience of group learning was 5, on a Likert scale of 1-5. There were no significant differences between male and female students in terms of pre-test scores, post-test scores and overall gain in scores. A participatory research methodology workshop can play a significant role in teaching research to undergraduate students in an interesting manner. However, the long-term effect of such workshops needs to be evaluated.
Combining users' activity survey and simulators to evaluate human activity recognition systems.
Azkune, Gorka; Almeida, Aitor; López-de-Ipiña, Diego; Chen, Liming
2015-04-08
Evaluating human activity recognition systems usually implies following expensive and time-consuming methodologies, where experiments with humans are run with the consequent ethical and legal issues. We propose a novel evaluation methodology to overcome these problems, based on user surveys and a synthetic dataset generator tool. Surveys capture how different users perform activities of daily living, while the synthetic dataset generator is used to create properly labelled activity datasets modelled with the information extracted from the surveys. Important aspects, such as sensor noise, varying time lapses and erratic user behaviour, can also be simulated using the tool. The proposed methodology is shown to have very important advantages that allow researchers to carry out their work more efficiently. To evaluate the approach, a synthetic dataset generated following the proposed methodology is compared to a real dataset by computing the similarity between sensor occurrence frequencies. It is concluded that the similarity between the two datasets is substantial.
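The generator tool itself is not reproduced in the abstract; as an illustration only, a minimal sketch of such a synthetic labelled-dataset generator might look like the following, where the activity models, sensor names and noise model are all hypothetical stand-ins for the survey-derived information:

```python
import random
from datetime import datetime, timedelta

# Hypothetical activity models: each activity triggers a typical sensor sequence.
ACTIVITY_SENSORS = {
    "make_coffee": ["kitchen_motion", "cupboard", "coffee_machine"],
    "watch_tv": ["lounge_motion", "sofa_pressure", "tv_power"],
}

def generate_dataset(n_activities, noise_prob=0.05, seed=0):
    """Create a labelled synthetic sensor dataset.

    Simulates sensor noise (missed and spurious events) and varying
    time lapses between events, as the methodology describes.
    """
    rng = random.Random(seed)
    clock = datetime(2015, 4, 8, 7, 0, 0)
    all_sensors = [s for seq in ACTIVITY_SENSORS.values() for s in seq]
    events = []
    for _ in range(n_activities):
        activity = rng.choice(list(ACTIVITY_SENSORS))
        for sensor in ACTIVITY_SENSORS[activity]:
            clock += timedelta(seconds=rng.randint(5, 120))  # varying time lapse
            if rng.random() < noise_prob:
                continue  # sensor failed to fire (missed event)
            events.append((clock.isoformat(), sensor, activity))  # labelled event
        if rng.random() < noise_prob:  # spurious event from erratic behaviour
            clock += timedelta(seconds=rng.randint(1, 30))
            events.append((clock.isoformat(), rng.choice(all_sensors), "noise"))
    return events

dataset = generate_dataset(50)
```

Fixing the seed makes runs reproducible, so the same "user" can be replayed against different recognition systems for a fair comparison.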
NASA Technical Reports Server (NTRS)
Hopkins, Dale A.; Patnaik, Surya N.
2000-01-01
A preliminary aircraft engine design methodology is being developed that utilizes a cascade optimization strategy together with neural network and regression approximation methods. The cascade strategy employs different optimization algorithms in a specified sequence. The neural network and regression methods are used to approximate solutions obtained from the NASA Engine Performance Program (NEPP), which implements engine thermodynamic cycle and performance analysis models. The new methodology is proving to be more robust and computationally efficient than the conventional optimization approach of using a single optimization algorithm with direct reanalysis. The methodology has been demonstrated on a preliminary design problem for a novel subsonic turbofan engine concept that incorporates a wave rotor as a cycle-topping device. Computations of maximum thrust were obtained for a specific design point in the engine mission profile. The results (depicted in the figure) show a significant improvement in the maximum thrust obtained using the new methodology in comparison to benchmark solutions obtained using NEPP in a manual design mode.
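NEPP itself is not available here, but the cascade idea (running different optimization algorithms in a fixed sequence, each refining the previous stage's best design) can be sketched on a toy objective; the objective function and both optimizers below are illustrative assumptions, not NASA code:

```python
import random

def objective(x):
    # Toy stand-in for an expensive engine-performance analysis such as NEPP:
    # a smooth function of two design variables, to be minimized.
    return (x[0] - 1.5) ** 2 + (x[1] + 0.5) ** 2

def random_search(f, x, rng, iters=200, scale=2.0):
    """Stage 1: cheap global exploration by random perturbation."""
    best = list(x)
    for _ in range(iters):
        cand = [xi + rng.uniform(-scale, scale) for xi in best]
        if f(cand) < f(best):
            best = cand
    return best

def coordinate_descent(f, x, iters=50, step=0.1):
    """Stage 2: local refinement, one design variable at a time."""
    best = list(x)
    for _ in range(iters):
        for i in range(len(best)):
            for d in (-step, step):
                cand = list(best)
                cand[i] += d
                if f(cand) < f(best):
                    best = cand
    return best

def cascade(f, x0, seed=0):
    """Cascade strategy: hand the best design from one optimizer
    to the next in a specified sequence."""
    rng = random.Random(seed)
    x = random_search(f, x0, rng)
    return coordinate_descent(f, x)

x_opt = cascade(objective, [0.0, 0.0])
```

In the methodology described above, each stage would additionally call a neural-network or regression surrogate instead of the full analysis, reserving direct NEPP evaluations for verification.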
Pillai, Goonaseelan Colin; Mentré, France; Steimer, Jean-Louis
2005-04-01
Few scientific contributions have made significant impact unless there was a champion who had the vision to see the potential for their use in seemingly disparate areas--and who then drove active implementation. In this paper, we present a historical summary of the development of non-linear mixed effects (NLME) modeling up to the more recent extensions of this statistical methodology. The paper places strong emphasis on the pivotal role played by Lewis B. Sheiner (1940-2004), who used this statistical methodology to elucidate solutions to real problems identified in clinical practice and in medical research, and on how he drove implementation of the proposed solutions. A succinct overview of the evolution of the NLME modeling methodology is presented, as well as ideas on how its expansion helped to provide guidance for a more scientific view of (model-based) drug development that reduces empiricism in favor of critical quantitative thinking and decision making.
Integrated Controls-Structures Design Methodology: Redesign of an Evolutionary Test Structure
NASA Technical Reports Server (NTRS)
Maghami, Peiman G.; Gupta, Sandeep; Elliot, Kenny B.; Joshi, Suresh M.
1997-01-01
An optimization-based integrated controls-structures design methodology for a class of flexible space structures is described, and the phase-0 Controls-Structures-Integration evolutionary model (CEM), a laboratory testbed at NASA Langley, is redesigned using this integrated design methodology. The integrated controls-structures design is posed as a nonlinear programming problem to minimize the control effort required to maintain a specified line-of-sight pointing performance under persistent white noise disturbance. Static and dynamic dissipative control strategies are employed for feedback control, and parameters of these controllers are considered as the control design variables. Sizes of strut elements in various sections of the CEM are used as the structural design variables. Design guides for the struts are developed and employed in the integrated design process to ensure that the redesigned structure can be effectively fabricated. The superiority of the integrated design methodology over the conventional design approach is demonstrated analytically by observing a significant reduction in the average control power needed to maintain the specified pointing performance with the integrated design approach.
Researching Street Children: Methodological and Ethical Issues.
ERIC Educational Resources Information Center
Hutz, Claudio S.; And Others
This paper describes the ethical and methodological problems associated with studying prosocial moral reasoning of street children and children of low and high SES living with their families, and problems associated with studying sexual attitudes and behavior of street children and their knowledge of sexually transmitted diseases, especially AIDS.…
Problem-Based Learning: Lessons for Administrators, Educators and Learners
ERIC Educational Resources Information Center
Yeo, Roland
2005-01-01
Purpose: The paper aims to explore the challenges of problem-based learning (PBL) as an unconventional teaching methodology experienced by a higher learning institute in Singapore. Design/methodology/approach: The exploratory study was conducted using focus group discussions and semi-structured interviews. Four groups of people were invited to…
The Speaker Respoken: Material Rhetoric as Feminist Methodology.
ERIC Educational Resources Information Center
Collins, Vicki Tolar
1999-01-01
Presents a methodology based on the concept of "material rhetoric" that can help scholars avoid problems as they reclaim women's historical texts. Defines material rhetoric and positions it theoretically in relation to other methodologies, including bibliographical studies, reception theory, and established feminist methodologies. Illustrates…
A methodology for the assessment of manned flight simulator fidelity
NASA Technical Reports Server (NTRS)
Hess, Ronald A.; Malsbury, Terry N.
1989-01-01
A relatively simple analytical methodology for assessing the fidelity of manned flight simulators for specific vehicles and tasks is offered. The methodology is based upon an application of a structural model of the human pilot, including motion cue effects. In particular, predicted pilot/vehicle dynamic characteristics are obtained with and without simulator limitations. A procedure for selecting model parameters can be implemented, given a probable pilot control strategy. In analyzing a pair of piloting tasks for which flight and simulation data are available, the methodology correctly predicted the existence of simulator fidelity problems. The methodology permitted the analytical evaluation of a change in simulator characteristics and indicated that a major source of the fidelity problems was a visual time delay in the simulation.
Overview of Meta-Analyses of the Prevention of Mental Health, Substance Use and Conduct Problems
Sandler, Irwin; Wolchik, Sharlene A.; Cruden, Gracelyn; Mahrer, Nicole E.; Ahn, Soyeon; Brincks, Ahnalee; Brown, C. Hendricks
2014-01-01
This paper presents findings from an overview of meta-analyses of the effects of prevention and promotion programs to prevent mental health, substance use and conduct problems. The review of 48 meta-analyses found small but significant effects to reduce depression, anxiety, anti-social behavior and substance use. Further, the effects are sustained over time. Meta-analyses often found that the effects were heterogeneous. A conceptual model is proposed to guide the study of moderators of program effects in future meta-analyses and methodological issues in synthesizing findings across preventive interventions are discussed. PMID:24471372
A Comparison of Methods to Analyze Aquatic Heterotrophic Flagellates of Different Taxonomic Groups.
Jeuck, Alexandra; Nitsche, Frank; Wylezich, Claudia; Wirth, Olaf; Bergfeld, Tanja; Brutscher, Fabienne; Hennemann, Melanie; Monir, Shahla; Scherwaß, Anja; Troll, Nicole; Arndt, Hartmut
2017-08-01
Heterotrophic flagellates contribute significantly to the matter flux in aquatic and terrestrial ecosystems. Still today, their quantification and taxonomic classification pose several problems in field studies, though these methodological problems seem to be increasingly ignored in current ecological studies. Here we describe and test different methods: the live-counting technique, different fixation techniques, cultivation methods like the liquid aliquot method (LAM), and a molecular single-cell survey called aliquot PCR (aPCR). All these methods have been tested using either aquatic field samples or cultures of freshwater and marine taxa. Each of the described methods has its advantages and disadvantages, which have to be considered in every single case. With the live-counting technique, detection of living cells up to the morphospecies level is possible. Fixation of cells and staining methods are advantageous due to the possibility of long-term storage and observation of samples. Cultivation methods (LAM) offer the possibility of subsequent molecular analyses, and aPCR tools might compensate for the deficiency of LAM in terms of its missing detection of non-cultivable flagellates. In summary, we propose a combination of several investigation techniques to offset the different methodological problems. Copyright © 2017 Elsevier GmbH. All rights reserved.
Ergonomic initiatives at Inmetro: measuring occupational health and safety.
Drucker, L; Amaral, M; Carvalheira, C
2012-01-01
This work studies biomechanical hazards to which the workforce of the Instituto Nacional de Metrologia, Qualidade e Tecnologia Industrial (Inmetro) is exposed. It suggests a model for the ergonomic evaluation of work, based on the concepts of resilience engineering, which takes into consideration the institute's ability to manage risk and deal with its consequences. The methodology includes the stages of identification, inventory, analysis and risk management. Diagnosis of the workplace uses as parameters the minimal criteria stated in Brazilian legislation. The approach encompasses several perspectives, including the points of view of public management, safety engineering, physical therapy and ergonomics-oriented design. The suggested solution integrates all aspects of the problem: biological, psychological, sociological and organizational. Results obtained from a pilot project allowed us to build a significant sample of Inmetro's workforce, identifying problems and validating the methodology employed as a tool to be applied to the whole institution. Finally, this work intends to draw risk maps and to support goals and methods based on resilience engineering to assess environmental and ergonomic risk management.
McKay, J R; Weiss, R V
2001-04-01
This article is an initial report from a review of alcohol and drug treatment studies with follow-ups of 2 years or more. The goals of the review are to examine the stability of substance use outcomes and the factors that moderate or mediate these outcomes. Results from 12 studies that generated multiple research reports are presented, and methodological problems encountered in the review are discussed. Substance use outcomes at the group level were generally stable, although moderate within-subject variation in substance use status over time was observed. Of factors assessed at baseline, psychiatric severity was a significant predictor of outcome in the highest percentage of reports, although the nature of the relationship varied. Stronger motivation and coping at baseline also consistently predicted better drinking outcomes. Better progress while in treatment, and the performance of pro-recovery behaviors and low problem severity in associated areas following treatment, consistently predicted better substance use outcomes.
Methodological Issues and Practical Problems in Conducting Research on Abused Children.
ERIC Educational Resources Information Center
Kinard, E. Milling
In order to inform policy and programs, research on child abuse must be not only methodologically rigorous, but also practically feasible. However, practical problems make child abuse research difficult to conduct. Definitions of abuse must be explicit and different types of abuse must be assessed separately. Study samples should be as…
ERIC Educational Resources Information Center
Soh, Kaycheng
2013-01-01
Recent research into university ranking methodologies uncovered several methodological problems among the systems currently in vogue. One of these is the discrepancy between the nominal and attained weights, which arises from summing unstandardized indicators to form the total scores used in ranking. It is demonstrated that weight discrepancy…
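The weight-discrepancy problem is easy to reproduce numerically. In the hypothetical example below (invented scores, not data from the study), two indicators carry equal nominal weights, yet the widely spread indicator dominates the attained weights because the indicators are summed without standardization:

```python
from statistics import pstdev

# Hypothetical scores for five universities on two indicators (0-100 scale).
research = [90, 70, 50, 30, 10]   # widely spread indicator
teaching = [58, 60, 61, 59, 62]   # narrowly spread indicator

nominal = {"research": 0.5, "teaching": 0.5}  # equal nominal weights

totals = [nominal["research"] * r + nominal["teaching"] * t
          for r, t in zip(research, teaching)]

# The attained (effective) weight of an unstandardized indicator is
# proportional to its nominal weight times its spread:
contrib = {
    "research": nominal["research"] * pstdev(research),
    "teaching": nominal["teaching"] * pstdev(teaching),
}
total_contrib = sum(contrib.values())
attained = {k: v / total_contrib for k, v in contrib.items()}
```

Here the ranking by `totals` is identical to the ranking by `research` alone: teaching, despite its 50% nominal weight, barely moves the totals. Standardizing each indicator (e.g., to z-scores) before weighting removes the discrepancy.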
A Methodological Critique of "Interventions for Boys with Conduct Problems"
ERIC Educational Resources Information Center
Kent, Ronald; And Others
1976-01-01
Kent criticizes Patterson's study on treating the behavior problems of boys, on several methodological bases concluding that more rigorous research is required in this field. Patterson answers Kent's criticisms arguing that they are not based on sound grounds. Patterson offers further evidence to support the efficacy of his treatment procedures.…
Research Methodology in Second Language Studies: Trends, Concerns, and New Directions
ERIC Educational Resources Information Center
King, Kendall A.; Mackey, Alison
2016-01-01
The field of second language studies is using increasingly sophisticated methodological approaches to address a growing number of urgent, real-world problems. These methodological developments bring both new challenges and opportunities. This article briefly reviews recent ontological and methodological debates in the field, then builds on these…
Software engineering methodologies and tools
NASA Technical Reports Server (NTRS)
Wilcox, Lawrence M.
1993-01-01
Over the years many engineering disciplines have developed, including chemical, electronic and others. Common to all engineering disciplines is the use of rigor, models, metrics, and predefined methodologies. Recently, a new engineering discipline has appeared on the scene, called software engineering. For over thirty years computer software has been developed, and the track record has not been good. Software development projects often miss schedules, run over budget, do not give the user what is wanted, and produce defects. One estimate is that there are one to three defects per 1000 lines of deployed code. More and more systems are requiring larger and more complex software for support. As this requirement grows, the software development problems grow exponentially. It is believed that software quality can be improved by applying engineering principles. Another compelling reason to bring the engineering disciplines to software development is productivity. It has been estimated that software development productivity has increased only one to two percent a year over the last thirty years. Ironically, the computer and its software have contributed significantly to industry-wide productivity, but computer professionals have done a poor job of using the computer to do their own job. Engineering disciplines and methodologies are now emerging, supported by software tools that address the problems of software development. This paper addresses some of the current software engineering methodologies as a backdrop for a general evaluation of computer-assisted software engineering (CASE) tools, based on actual installation of and experimentation with some specific tools.
ERIC Educational Resources Information Center
De Vecchis, Gino; Pasquinelli D'Allegra, Daniela; Pesaresi, Cristiano
2011-01-01
During the last few years the Italian school system has seen significant changes, but geography continues to be considered a boring and unhelpful discipline by public institutions. The main problem is widespread geographic illiteracy and the fact that very often people do not know the objectives, methodology and tools of geographical studies.…
Berne, Rosalyn W; Raviv, Daniel
2004-04-01
This paper introduces the Eight Dimensional Methodology for Innovative Thinking (the Eight Dimensional Methodology), for innovative problem solving, as a unified approach to case analysis that builds on comprehensive problem solving knowledge from industry, business, marketing, math, science, engineering, technology, arts, and daily life. It is designed to stimulate innovation by quickly generating unique "out of the box" unexpected and high quality solutions. It gives new insights and thinking strategies to solve everyday problems faced in the workplace, by helping decision makers to see otherwise obscure alternatives and solutions. Daniel Raviv, the engineer who developed the Eight Dimensional Methodology, and paper co-author, technology ethicist Rosalyn Berne, suggest that this tool can be especially useful in identifying solutions and alternatives for particular problems of engineering, and for the ethical challenges which arise with them. First, the Eight Dimensional Methodology helps to elucidate how what may appear to be a basic engineering problem also has ethical dimensions. In addition, it offers to the engineer a methodology for penetrating and seeing new dimensions of those problems. To demonstrate the effectiveness of the Eight Dimensional Methodology as an analytical tool for thinking about ethical challenges to engineering, the paper presents the case of the construction of the Large Binocular Telescope (LBT) on Mount Graham in Arizona. Analysis of the case offers to decision makers the use of the Eight Dimensional Methodology in considering alternative solutions for how they can proceed in their goals of exploring space. It then follows that same process through the second stage of exploring the ethics of each of those different solutions. The LBT project pools resources from an international partnership of universities and research institutes for the construction and maintenance of a highly sophisticated, powerful new telescope. 
It will soon mark the erection of the world's largest and most powerful optical telescope, designed to see fine detail otherwise visible only from space. It also represents a controversial engineering project that is being undertaken on land considered to be sacred by the local, native Apache people. As presented, the case features the University of Virginia, and its challenges in consideration of whether and how to join the LBT project consortium.
Aeroelastic optimization methodology for viscous and turbulent flows
NASA Astrophysics Data System (ADS)
Barcelos Junior, Manuel Nascimento Dias
2007-12-01
In recent years, the development of faster computers and parallel processing has allowed the application of high-fidelity analysis methods to the aeroelastic design of aircraft. However, these methods are restricted to final design verification, mainly due to the computational cost involved in iterative design processes. This work is therefore concerned with the creation of a robust and efficient aeroelastic optimization methodology for inviscid, viscous and turbulent flows using high-fidelity analysis and sensitivity analysis techniques. Most research in aeroelastic optimization, for practical reasons, treats the aeroelastic system as a quasi-static inviscid problem. In this work, as a first step toward the creation of a more complete aeroelastic optimization methodology for realistic problems, an analytical sensitivity computation technique was developed and tested for quasi-static aeroelastic viscous and turbulent flow configurations. Viscous and turbulent effects are included by using an averaged discretization of the Navier-Stokes equations, coupled with an eddy viscosity turbulence model. For quasi-static aeroelastic problems, the traditional staggered solution strategy performs unsatisfactorily when applied to cases with strong fluid-structure coupling. Consequently, this work also proposes a solution methodology for aeroelastic and sensitivity analyses of quasi-static problems, based on the fixed point of an iterative nonlinear block Gauss-Seidel scheme. The methodology can also be interpreted as the solution of the Schur complement of the linearized systems of equations for the aeroelastic and sensitivity analyses. The methodologies developed in this work are tested and verified on realistic aeroelastic systems.
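The staggered (nonlinear block Gauss-Seidel) fixed-point idea can be illustrated on a deliberately tiny coupled system. The one-equation "fluid" and "structure" models below are invented for illustration and bear no relation to the actual solvers in the work; the point is the alternating-block iteration to a fixed point:

```python
def fluid_load(u):
    # Toy aerodynamic model: load grows with structural displacement.
    return 1.0 + 0.3 * u

def structure_displacement(load, stiffness=2.0):
    # Toy linear structure: stiffness * u = load.
    return load / stiffness

def gauss_seidel_aeroelastic(tol=1e-10, max_iter=100):
    """Nonlinear block Gauss-Seidel: alternately solve the fluid block
    (structure frozen) and the structure block (load frozen) until the
    coupled state stops changing, i.e., reaches a fixed point."""
    u = 0.0
    for _ in range(max_iter):
        load = fluid_load(u)                  # fluid block
        u_new = structure_displacement(load)  # structure block
        if abs(u_new - u) < tol:
            return u_new
        u = u_new
    raise RuntimeError("staggered iteration did not converge")

u_star = gauss_seidel_aeroelastic()
```

Because the composed update here is a contraction, the iteration converges to the unique aeroelastic equilibrium; the strongly coupled cases mentioned in the abstract are precisely those where such contraction properties weaken and plain staggering struggles.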
Quality of systematic reviews in pediatric oncology--a systematic review.
Lundh, Andreas; Knijnenburg, Sebastiaan L; Jørgensen, Anders W; van Dalen, Elvira C; Kremer, Leontien C M
2009-12-01
To ensure evidence-based decision making in pediatric oncology, systematic reviews are necessary. The objective of our study was to evaluate the methodological quality of all currently existing systematic reviews in pediatric oncology. We identified eligible systematic reviews through a systematic search of the literature. Data on the clinical and methodological characteristics of the included systematic reviews were extracted. The methodological quality of the included systematic reviews was assessed using the overview quality assessment questionnaire, a validated 10-item quality assessment tool. We compared the methodological quality of systematic reviews published in regular journals with that of Cochrane systematic reviews. We included 117 systematic reviews: 99 published in regular journals and 18 Cochrane systematic reviews. The average methodological quality of the systematic reviews was low on all ten items, but the quality of Cochrane systematic reviews was significantly higher than that of systematic reviews published in regular journals. On a 1-7 scale, the median overall quality score for all systematic reviews was 2 (range 1-7), with a score of 1 (range 1-7) for systematic reviews in regular journals compared to 6 (range 3-7) for Cochrane systematic reviews (p<0.001). Most systematic reviews in the field of pediatric oncology seem to have serious methodological flaws leading to a high risk of bias. While Cochrane systematic reviews were of higher methodological quality than systematic reviews in regular journals, some of them also had methodological problems. Therefore, the methodology of each individual systematic review should be scrutinized before accepting its results.
IMSF: Infinite Methodology Set Framework
NASA Astrophysics Data System (ADS)
Ota, Martin; Jelínek, Ivan
Software development is usually an integration task in an enterprise environment - few software applications work autonomously now. It is usually a collaboration of heterogeneous and unstable teams. One serious problem is lack of resources, a popular response being outsourcing and 'body shopping', which indirectly cause team and team-member fluctuation. Outsourced sub-deliveries easily become black boxes with no clear development method used, which has a negative impact on supportability. Such environments then often face problems of quality assurance and enterprise know-how management. The methodology used is one of the key factors. Each methodology was created as a generalization of a number of solved projects, and each methodology is thus more or less connected with a set of task types. When the task type is not suitable, problems arise that usually result in an undocumented ad-hoc solution. This was the motivation behind formalizing a simple process for collaborative software engineering. The Infinite Methodology Set Framework (IMSF) defines the ICT business process of adaptive use of methods for classified types of tasks. The article introduces IMSF and briefly comments on its meta-model.
Mining data from hemodynamic simulations for generating prediction and explanation models.
Bosnić, Zoran; Vračar, Petar; Radović, Milos D; Devedžić, Goran; Filipović, Nenad D; Kononenko, Igor
2012-03-01
One of the most common causes of human death is stroke, which can be caused by carotid bifurcation stenosis. In our work, we aim at proposing a prototype of a medical expert system that could significantly aid medical experts in detecting hemodynamic abnormalities (increased artery wall shear stress). Based on the acquired simulated data, we apply several methodologies for 1) predicting magnitudes and locations of maximum wall shear stress in the artery, 2) estimating the reliability of computed predictions, and 3) providing a user-friendly explanation of the model's decision. The obtained results indicate that the evaluated methodologies can provide a useful tool for the given problem domain. © 2012 IEEE
Application of the HARDMAN methodology to the single channel ground-airborne radio system (SINCGARS)
NASA Astrophysics Data System (ADS)
Balcom, J.; Park, J.; Toomer, L.; Feng, T.
1984-12-01
The HARDMAN methodology is designed to assess the human resource requirements early in the weapon system acquisition process. In this case, the methodology was applied to the family of radios known as SINCGARS (Single Channel Ground-Airborne Radio System). At the time of the study, SINCGARS was approaching the Full-Scale Development phase, with 2 contractors in competition. Their proposed systems were compared with a composite baseline comparison (reference) system. The systems' manpower, personnel and training requirements were compared. Based on RAM data, the contractors' MPT figures showed a significant reduction from the figures derived for the baseline comparison system. Differences between the two contractors were relatively small. Impact and some tradeoff analyses were hindered by data access problems. Tactical radios, manpower and personnel requirements analysis, impact and tradeoff analysis, human resource sensitivity, training requirements analysis, human resources in LCSMM, and logistics analyses are discussed.
An analysis of IGBP global land-cover characterization process
Loveland, Thomas R.; Zhu, Zhiliang; Ohlen, Donald O.; Brown, Jesslyn F.; Reed, Bradley C.; Yang, Limin
1999-01-01
The International Geosphere-Biosphere Programme (IGBP) has called for the development of improved global land-cover data for use in increasingly sophisticated global environmental models. To meet this need, the staff of the U.S. Geological Survey and the University of Nebraska-Lincoln developed and applied a global land-cover characterization methodology using 1992-1993 1-km resolution Advanced Very High Resolution Radiometer (AVHRR) and other spatial data. The methodology, based on unsupervised classification with extensive postclassification refinement, yielded a multi-layer database consisting of eight land-cover data sets, descriptive attributes, and source data. An independent IGBP accuracy assessment reports a global accuracy of 73.5 percent, and continental results vary from 63 percent to 83 percent. Although data quality, methodology, interpreter performance, and logistics affected the results, significant problems were associated with the relationship between AVHRR data and fine-scale, spectrally similar land-cover patterns in complex natural or disturbed landscapes.
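The unsupervised classification at the core of such a methodology is, in essence, clustering of per-pixel spectral composites. A minimal one-dimensional k-means sketch conveys the idea; the NDVI values and class labels below are hypothetical, and the real procedure operates on multi-temporal, multi-band AVHRR composites:

```python
def kmeans_1d(points, k, iters=20):
    """Minimal 1-D k-means: the unsupervised-classification core of a
    land-cover clustering, stripped down to one spectral dimension."""
    srt = sorted(points)
    # Simple quantile initialization: spread centers across the value range.
    centers = [srt[i * len(srt) // k] for i in range(k)]
    for _ in range(iters):
        clusters = [[] for _ in range(k)]
        for p in points:
            nearest = min(range(k), key=lambda j: abs(p - centers[j]))
            clusters[nearest].append(p)
        centers = [sum(c) / len(c) if c else centers[j]
                   for j, c in enumerate(clusters)]
    return sorted(centers)

# Hypothetical per-pixel NDVI values: bare ground, grassland, forest.
pixels = [0.08, 0.12, 0.11, 0.39, 0.42, 0.41, 0.68, 0.72, 0.71]
centers = kmeans_1d(pixels, 3)
```

In the actual process, each resulting cluster is then interpreted and relabelled by analysts, which is the "extensive postclassification refinement" the abstract mentions.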
Interactive multi-mode blade impact analysis
NASA Technical Reports Server (NTRS)
Alexander, A.; Cornell, R. W.
1978-01-01
The theoretical methodology used in developing an analysis for the response of turbine engine fan blades subjected to soft-body (bird) impacts is reported, and the computer program developed using this methodology as its basis is described. This computer program is an outgrowth of two programs that were previously developed for the purpose of studying problems of a similar nature (a 3-mode beam impact analysis and a multi-mode beam impact analysis). The present program utilizes an improved missile model that is interactively coupled with blade motion which is more consistent with actual observations. It takes into account local deformation at the impact area, blade camber effects, and the spreading of the impacted missile mass on the blade surface. In addition, it accommodates plate-type mode shapes. The analysis capability in this computer program represents a significant improvement in the development of the methodology for evaluating potential fan blade materials and designs with regard to foreign object impact resistance.
Invited Commentary: The Need for Cognitive Science in Methodology.
Greenland, Sander
2017-09-15
There is no complete solution for the problem of abuse of statistics, but methodological training needs to cover cognitive biases and other psychosocial factors affecting inferences. The present paper discusses 3 common cognitive distortions: 1) dichotomania, the compulsion to perceive quantities as dichotomous even when dichotomization is unnecessary and misleading, as in inferences based on whether a P value is "statistically significant"; 2) nullism, the tendency to privilege the hypothesis of no difference or no effect when there is no scientific basis for doing so, as when testing only the null hypothesis; and 3) statistical reification, treating hypothetical data distributions and statistical models as if they reflect known physical laws rather than speculative assumptions for thought experiments. As commonly misused, null-hypothesis significance testing combines these cognitive problems to produce highly distorted interpretation and reporting of study results. Interval estimation has so far proven to be an inadequate solution because it involves dichotomization, an avenue for nullism. Sensitivity and bias analyses have been proposed to address reproducibility problems (Am J Epidemiol. 2017;186(6):646-647); these methods can indeed address reification, but they can also introduce new distortions via misleading specifications for bias parameters. P values can be reframed to lessen distortions by presenting them without reference to a cutoff, providing them for relevant alternatives to the null, and recognizing their dependence on all assumptions used in their computation; they nonetheless require rescaling for measuring evidence. I conclude that methodological development and training should go beyond coverage of mechanistic biases (e.g., confounding, selection bias, measurement error) to cover distortions of conclusions produced by statistical methods and psychosocial forces. © The Author(s) 2017. 
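Greenland's suggestion to present P values for relevant alternatives, not only the null, can be illustrated with a normal-approximation sketch (the estimate and standard error below are invented for illustration):

```python
import math

def two_sided_p(estimate, se, hypothesis):
    """Two-sided P value for a hypothesized effect size (normal approximation)."""
    z = abs(estimate - hypothesis) / se
    return 2.0 * (1.0 - 0.5 * (1.0 + math.erf(z / math.sqrt(2.0))))

# Hypothetical study: log risk-ratio estimate 0.405 (RR ~ 1.5), SE 0.25.
est, se = 0.405, 0.25
p_null = two_sided_p(est, se, 0.0)            # H: RR = 1
p_alt = two_sided_p(est, se, math.log(2.0))   # H: RR = 2
print(round(p_null, 2), round(p_alt, 2))  # 0.11 0.25: compatible with both
```

Neither hypothesis is "rejected" at the usual cutoff, so declaring the result "not significant" and inferring no effect (nullism) hides that these data are about as compatible with a doubled risk as with no effect.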
Trajectory Correction and Locomotion Analysis of a Hexapod Walking Robot with Semi-Round Rigid Feet
Zhu, Yaguang; Jin, Bo; Wu, Yongsheng; Guo, Tong; Zhao, Xiangmo
2016-01-01
To solve the problem of body-trajectory deviation caused by the rolling of semi-round rigid feet when a robot is walking, a leg kinematic trajectory correction methodology based on the Least Squares Support Vector Machine (LS-SVM) is proposed. The concept of the ideal foothold is put forward for modifying the three-dimensional kinematic model of a robot leg, and the deviation between the ideal foothold and the real foothold is analyzed. The forward/inverse kinematic solutions between the ideal foothold and the joint angular vectors are formulated, and the nonlinear direct/inverse kinematic mapping problem is solved using the LS-SVM. Compared with the previous approximation method, this correction methodology achieves better accuracy and faster calculation of inverse kinematic solutions. Experiments on a leg platform and a hexapod walking robot are conducted with multiple sensors to analyze foot-tip trajectory, base-joint vibration, contact-force impact, direction deviation, and power consumption. The comparative analysis shows that the trajectory correction methodology effectively corrects the joint trajectory, eliminating the contact-force influence of semi-round rigid feet, significantly improving the locomotion of the walking robot and reducing the total power consumption of the system. PMID:27589766
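The LS-SVM regression at the core of such a correction can be sketched as solving a single linear system with an RBF kernel (a one-dimensional stand-in for the foothold-to-joint-angle mapping; the kernel width and regularization values are illustrative assumptions, not the paper's):

```python
import numpy as np

def lssvm_fit(X, y, gamma=100.0, sigma=0.5):
    """LS-SVM regression with an RBF kernel: the training problem reduces to
    one linear system in the bias b and the support values alpha."""
    n = len(X)
    K = np.exp(-((X[:, None] - X[None, :]) ** 2) / (2.0 * sigma ** 2))
    A = np.zeros((n + 1, n + 1))
    A[0, 1:] = 1.0
    A[1:, 0] = 1.0
    A[1:, 1:] = K + np.eye(n) / gamma   # gamma is the regularization weight
    sol = np.linalg.solve(A, np.concatenate([[0.0], y]))
    b, alpha = sol[0], sol[1:]
    def predict(xq):
        k = np.exp(-((xq[:, None] - X[None, :]) ** 2) / (2.0 * sigma ** 2))
        return k @ alpha + b
    return predict

# One-dimensional stand-in: recover a smooth "joint angle" curve from noisy samples.
rng = np.random.default_rng(0)
X = np.linspace(0.0, np.pi, 40)
y = np.sin(X) + rng.normal(0.0, 0.01, 40)
predict = lssvm_fit(X, y)
print(round(float(predict(np.array([np.pi / 2]))[0]), 2))  # near 1.0
```

Because training is a single linear solve rather than a quadratic program, LS-SVM fits are cheap, which is consistent with the speed advantage the abstract reports.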
Evaluation of Healthcare Interventions and Big Data: Review of Associated Data Issues.
Asche, Carl V; Seal, Brian; Kahler, Kristijan H; Oehrlein, Elisabeth M; Baumgartner, Meredith Greer
2017-08-01
Although the analysis of 'big data' holds tremendous potential to improve patient care, there remain significant challenges before it can be realized. Accuracy and completeness of data, linkage of disparate data sources, and access to data are areas that require particular focus. This article discusses these areas and shares strategies to promote progress. Improvement in clinical coding, innovative matching methodologies, and investment in data standardization are potential solutions to data validation and linkage problems. Challenges to data access still require significant attention with data ownership, security needs, and costs representing significant barriers to access.
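An "innovative matching methodology" can be as simple as deterministic linkage on a normalized key; a minimal sketch (all field names and records below are invented for illustration):

```python
# Minimal deterministic record linkage across two data sources on a
# normalized (patient_id, dob) key. Fields and records are hypothetical.
claims = [{"patient_id": "A12", "dob": "1960-01-02", "drug": "metformin"},
          {"patient_id": "B07", "dob": "1955-06-30", "drug": "statin"}]
labs = [{"pid": "a12 ", "dob": "1960-01-02", "hba1c": 7.9},
        {"pid": "C99", "dob": "1970-12-12", "hba1c": 5.4}]

def key(pid, dob):
    """Normalize the linkage key: strip whitespace, unify case."""
    return (pid.strip().upper(), dob)

lab_index = {key(r["pid"], r["dob"]): r for r in labs}
linked = [{**c, **lab_index[k]} for c in claims
          if (k := key(c["patient_id"], c["dob"])) in lab_index]
print(len(linked), linked[0]["hba1c"])  # 1 7.9
```

Even this toy shows why standardization matters: without the key normalization, the "a12 " record would silently fail to link.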
Learning physics: A comparative analysis between instructional design methods
NASA Astrophysics Data System (ADS)
Mathew, Easow
The purpose of this research was to determine whether there were differences in academic performance between students who participated in traditional versus collaborative problem-based learning (PBL) instructional design approaches to physics curricula. This study utilized a quantitative quasi-experimental design methodology to determine the significance of differences in pre- and posttest introductory physics exam performance between students who participated in traditional (i.e., control group) versus collaborative PBL (i.e., experimental group) instructional design approaches to physics curricula over a college semester in 2008. There were 42 student participants (N = 42) enrolled in an introductory physics course at the research site in the Spring 2008 semester who agreed to participate in this study after reading and signing informed consent documents. A total of 22 participants were assigned to the experimental group (n = 22), which participated in a PBL-based teaching methodology along with traditional lecture methods. The other 20 students were assigned to the control group (n = 20), which participated in the traditional lecture teaching methodology. Both courses were taught by experienced, doctorally qualified professors. The results indicated statistically significant differences (p < .01) in academic performance between students who participated in the traditional approach (i.e., lower physics posttest scores and smaller differences between pre- and posttest scores) and those in the collaborative approach (i.e., higher physics posttest scores and larger differences between pre- and posttest scores).
Despite slight differences in control- and experimental-group demographic characteristics (gender, ethnicity, and age), in the control group female average academic improvement was significantly higher (p = .04) than male average academic improvement (by roughly 63%), which may indicate that traditional teaching methods are more effective for females; no significant difference between male and female participants was noted in the experimental group. There was a statistically significant negative relationship (r = -.61, p = .01) between age and physics pretest scores in the control group. No statistical analyses yielded significantly different average academic performance in either group when delineated by ethnicity.
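The group comparison described above reduces to a two-sample test on gain (post minus pre) scores; a minimal sketch with Welch's t statistic (the gain scores below are synthetic, not the study's data):

```python
import math

def welch_t(a, b):
    """Welch's t statistic for two independent samples with unequal variances."""
    ma, mb = sum(a) / len(a), sum(b) / len(b)
    va = sum((v - ma) ** 2 for v in a) / (len(a) - 1)
    vb = sum((v - mb) ** 2 for v in b) / (len(b) - 1)
    return (ma - mb) / math.sqrt(va / len(a) + vb / len(b))

# Synthetic gain scores (posttest minus pretest), NOT the study's data.
pbl_gains = [22, 15, 19, 24, 17, 20, 18, 21, 16, 23, 19, 18]
lecture_gains = [12, 8, 11, 14, 9, 10, 13, 7, 12, 10]
t = welch_t(pbl_gains, lecture_gains)
print(round(t, 1))  # 8.2: a large positive t favors the PBL group
```

Welch's form is the safer default here because quasi-experimental groups of different sizes rarely justify the equal-variance assumption of the pooled t test.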
What is the evidence for retrieval problems in the elderly?
White, N; Cunningham, W R
1982-01-01
To determine whether older adults experience particular problems with retrieval, groups of young and elderly adults were given free recall and recognition tests of supraspan lists of unrelated words. Analysis of number of words correctly recalled and recognized yielded a significant age by retention test interaction: greater age differences were observed for recall than for recognition. In a second analysis of words recalled and recognized, corrected for guessing, the interaction disappeared. It was concluded that previous interpretations that age by retention test interactions are indicative of retrieval problems of the elderly may have been confounded by methodological problems. Furthermore, it was suggested that researchers in aging and memory need to be explicit in identifying their underlying models of error processes when analyzing recognition scores: different error models may lead to different results and interpretations.
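The correction for guessing at issue can be sketched with the standard two-high-threshold adjustment; note how the choice of error model changes the corrected scores, which is exactly the authors' point (all numbers are hypothetical):

```python
def corrected_score(hits, misses, false_alarms, correct_rejections):
    """Two-high-threshold guessing correction: hit rate minus false-alarm rate."""
    hit_rate = hits / (hits + misses)
    fa_rate = false_alarms / (false_alarms + correct_rejections)
    return hit_rate - fa_rate

# Hypothetical recognition data: equal hit rates, different guessing rates.
young = corrected_score(45, 5, 2, 48)   # H = .90, FA = .04
old = corrected_score(45, 5, 10, 40)    # H = .90, FA = .20
print(round(young, 2), round(old, 2))   # 0.86 0.7
```

An uncorrected comparison of raw hits would show no age difference in this example, while the corrected scores differ; a signal-detection model (d') would give yet another answer, illustrating why the underlying error model must be made explicit.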
The Beliefs of Teachers and Daycare Staff regarding Children of Divorce: A Q Methodological Study
ERIC Educational Resources Information Center
Overland, Klara; Thorsen, Arlene Arstad; Storksen, Ingunn
2012-01-01
This Q methodological study explores beliefs of daycare staff and teachers regarding young children's reactions related to divorce. The Q factor analysis resulted in two viewpoints. Participants on the viewpoint "Child problems" believe that children show various emotional and behavioral problems related to divorce, while those on the "Structure…
The Ranking of Higher Education Institutions in Russia: Some Methodological Problems.
ERIC Educational Resources Information Center
Filinov, Nikolay B.; Ruchkina, Svetlana
2002-01-01
The ranking of higher education institutions in Russia is examined from two points of view: as a social phenomenon and as a multi-criteria decision-making problem. The first point of view introduces the idea of interested and involved parties; the second introduces certain principles on which a rational ranking methodology should be based.…
Integration of PBL Methodologies into Online Learning Courses and Programs
ERIC Educational Resources Information Center
van Oostveen, Roland; Childs, Elizabeth; Flynn, Kathleen; Clarkson, Jessica
2014-01-01
Problem-based learning (PBL) challenges traditional views of teaching and learning as the learner determines, to a large extent with support from a skilled facilitator, what topics will be explored, to what depth and which processes will be used. This paper presents the implementation of problem-based learning methodologies in an online Bachelor's…
NASA Technical Reports Server (NTRS)
Nakajima, Yukio; Padovan, Joe
1987-01-01
In a three-part series of papers, a generalized finite element methodology is formulated to handle traveling load problems involving large deformation fields in structures composed of viscoelastic media. The main thrust of this paper is to develop an overall finite element methodology and associated solution algorithms to handle the transient aspects of moving-load problems involving contact-impact loading fields. Based on the methodology and algorithms formulated, several numerical experiments are considered, including the rolling/sliding impact of tires with road obstructions.
Immediate-type hypersensitivity reactions and hypnosis: problems in methodology.
Laidlaw, T M; Richardson, D H; Booth, R J; Large, R G
1994-08-01
Hypnosis has been used to ameliorate skin test reactivity in studies dating back to the 1930s. This study, using modern methodology and statistical analyses, set out to test the hypothesis that it is possible to decrease reactions to histamine by hypnotic suggestion. Five subjects, all asthmatic and untrained in hypnosis, were given three hypnotic sessions in which they were asked to control their reactions to histamine administered by the Pepys technique to forearm skin. These sessions were compared with three non-hypnotic sessions. Flare sizes, but not wheal sizes, were significantly reduced after the hypnosis sessions compared to sessions without hypnosis. Skin temperature was correlated with the size of reactions. The day on which the sessions took place contributed a significant amount of the remaining unexplained variance, raising questions about what could cause these day-to-day changes.
Ethical issues in cancer screening and prevention.
Plutynski, Anya
2012-06-01
November 2009's announcement of the USPSTF's recommendations for screening for breast cancer raised a firestorm of objections. Chief among them were that the panel had insufficiently valued patients' lives or allowed cost considerations to influence recommendations. The publicity about the recommendations, however, often either simplified the actual content of the recommendations or bypassed significant methodological issues, which a philosophical examination of both the science behind screening recommendations and their import reveals. In this article, I discuss two of the leading ethical considerations at issue in screening recommendations: respect for patient autonomy and beneficence and then turn to the most significant methodological issues raised by cancer screening: the potential biases that may infect a trial of screening effectiveness, the problem of base rates in communicating risk, and the trade-offs involved in a judgment of screening effectiveness. These issues reach more broadly, into the use of "evidence-based" medicine generally, and have important implications for informed consent.
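The base-rate problem mentioned above is concrete: even an accurate screen yields mostly false positives when prevalence is low. A minimal Bayes' rule sketch (the test characteristics and prevalence are illustrative, not the USPSTF's figures):

```python
def ppv(sensitivity, specificity, prevalence):
    """Positive predictive value of a screening test via Bayes' rule."""
    true_pos = sensitivity * prevalence
    false_pos = (1.0 - specificity) * (1.0 - prevalence)
    return true_pos / (true_pos + false_pos)

# A 90%-sensitive, 90%-specific screen applied at 1% disease prevalence:
print(round(ppv(0.90, 0.90, 0.01), 2))  # 0.08: most positive results are false
```

Communicating "a positive result means about an 8% chance of disease" rather than "the test is 90% accurate" is precisely the risk-communication challenge the article raises.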
[Road traffic injuries among youth: measuring the impact of an educational intervention].
Hidalgo-Solórzano, Elisa; Híjar, Martha; Mora-Flores, Gerardo; Treviño-Siller, Sandra; Inclán-Valadez, Cristina
2008-01-01
To analyze the impact of an educational intervention intended to increase knowledge of the causes and risk factors associated with road traffic injuries in the city of Cuernavaca. A quasi-experimental study design was applied to students aged 16 to 19 in colleges and universities in the city of Cuernavaca. The educational intervention included radio spots, banners, pamphlets, posters and cards. The measure of impact was defined as change in knowledge about speed, alcohol and seat belt use, using factor analysis methodologies. A significant change in the level of knowledge (p < 0.001) was observed in 700 students from 16 institutions. Educational interventions represent an initial strategy for changing knowledge and population behaviours. The present study offers an appropriate methodology for measuring short-term changes in knowledge about risk factors associated with a significant problem affecting Mexican youth.
New methodology for fast prediction of wheel wear evolution
NASA Astrophysics Data System (ADS)
Apezetxea, I. S.; Perez, X.; Casanueva, C.; Alonso, A.
2017-07-01
In railway applications, wear prediction at the wheel-rail interface is fundamental to studying problems such as wheel lifespan and the evolution of vehicle dynamic characteristics over time. However, one of the principal drawbacks of existing methodologies for calculating wear evolution is their computational cost. This paper proposes a new wear prediction methodology with a reduced computational cost, based on two main simplifications. The first replaces calculations over the whole network with the calculation of contact conditions at certain characteristic points, from whose results the wheel wear evolution can be inferred. The second replaces dynamic (time-integration) calculations with quasi-static calculations (solving the quasi-static situation of a vehicle at a certain point, which is equivalent to neglecting the acceleration terms in the dynamic equations). These simplifications yield a significant reduction in computational cost while maintaining an acceptable level of accuracy (errors on the order of 5-10%). Several case studies are analysed in the paper to assess the proposed methodology. The results obtained in the case studies support the conclusion that the proposed methodology is valid for an arbitrary vehicle running through an arbitrary track layout.
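The characteristic-point idea can be sketched as accumulating a wear increment at a few representative contact conditions weighted by their share of the route; the Archard-type wear law and every number below are assumptions for illustration, not values from the paper:

```python
# Characteristic-point wear sketch: accumulate an Archard-type wear increment
# at a few representative contact conditions, weighted by their route share.
# The wear law, coefficient, and all numbers are illustrative assumptions.
K_WEAR = 1e-16  # assumed wear coefficient

def wear_depth_per_km(force_n, sliding_m_per_km, area_m2):
    """Archard-type wear depth accumulated per km: h = K * F * s / A."""
    return K_WEAR * force_n * sliding_m_per_km / area_m2

# (normal force N, sliding per km m, contact area m^2, share of the route)
characteristic_points = [
    (60_000, 0.8, 1.0e-4, 0.70),  # tangent track
    (80_000, 2.5, 1.0e-4, 0.20),  # wide curve
    (90_000, 6.0, 1.0e-4, 0.10),  # tight curve
]

km_run = 10_000
depth_m = sum(share * km_run * wear_depth_per_km(f, s, a)
              for f, s, a, share in characteristic_points)
print(f"estimated wear depth after {km_run} km: {depth_m * 1000:.2f} mm")
```

The saving is structural: three contact evaluations stand in for a full dynamic simulation of every kilometre of the network.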
Introducing soft systems methodology plus (SSM+): why we need it and what it can contribute.
Braithwaite, Jeffrey; Hindle, Don; Iedema, Rick; Westbrook, Johanna I
2002-01-01
There are many complicated and seemingly intractable problems in the health care sector. Past ways to address them have involved political responses, economic restructuring, biomedical and scientific studies, and managerialist or business-oriented tools. Few methods have enabled us to develop a systematic response to problems. Our version of soft systems methodology, SSM+, seems to improve problem solving processes by providing an iterative, staged framework that emphasises collaborative learning and systems redesign involving both technical and cultural fixes.
Childhood constipation as an emerging public health problem
Rajindrajith, Shaman; Devanarayana, Niranga Manjuri; Crispus Perera, Bonaventure Jayasiri; Benninga, Marc Alexander
2016-01-01
Functional constipation (FC) is a significant health problem in children and, contrary to common belief, has serious ramifications on the lives of children and their families. It is defined by the Rome criteria, which encourage the use of multiple clinical features for diagnosis. FC in children has a high prevalence (0.7%-29%) worldwide, in both developed and developing countries. Biopsychosocial risk factors such as psychological stress, poor dietary habits, obesity and child maltreatment are commonly identified predisposing factors for FC. FC poses a significant healthcare burden on the already overstretched health budgets of many countries in terms of out-patient care, in-patient care, and expenditure for investigations and prescriptions. Complications are common and range from minor psychological disturbances to lower health-related quality of life. FC in children also has a significant impact on families. Many paediatric clinical trials have poor methodological quality, and drugs proven to be useful in adults are not effective in relieving symptoms in children. A significant proportion of inadequately treated children have similar symptoms as adults. These factors show that constipation is an increasing public health problem across the world with a significant medical, social and economic impact. This article highlights the potential public health impact of FC and the possibility of overcoming this problem by concentrating on modifiable risk factors rather than expending resources on high-cost investigations and therapeutic modalities. PMID:27570423
NASA Astrophysics Data System (ADS)
Ceberio, Mikel; Almudí, José Manuel; Franco, Ángel
2016-08-01
In recent years, interactive computer simulations have been progressively integrated in the teaching of the sciences and have contributed significant improvements in the teaching-learning process. Practicing problem-solving is a key factor in science and engineering education. The aim of this study was to design simulation-based problem-solving teaching materials and assess their effectiveness in improving students' ability to solve problems in university-level physics. Firstly, we analyze the effect of using simulation-based materials in the development of students' skills in employing procedures that are typically used in the scientific method of problem-solving. We found that a significant percentage of the experimental students used expert-type scientific procedures such as qualitative analysis of the problem, making hypotheses, and analysis of results. At the end of the course, only a minority of the students persisted with habits based solely on mathematical equations. Secondly, we compare the effectiveness in terms of problem-solving of the experimental group students with the students who are taught conventionally. We found that the implementation of the problem-solving strategy improved experimental students' results regarding obtaining a correct solution from the academic point of view, in standard textbook problems. Thirdly, we explore students' satisfaction with simulation-based problem-solving teaching materials and we found that the majority appear to be satisfied with the methodology proposed and took on a favorable attitude to learning problem-solving. The research was carried out among first-year Engineering Degree students.
Deb, Kalyanmoy; Sinha, Ankur
2010-01-01
Bilevel optimization problems involve two optimization tasks (upper and lower level), in which every feasible upper level solution must correspond to an optimal solution to a lower level optimization problem. These problems commonly appear in many practical problem solving tasks including optimal control, process optimization, game-playing strategy developments, transportation problems, and others. However, they are commonly converted into a single level optimization problem by using an approximate solution procedure to replace the lower level optimization task. Although there exist a number of theoretical, numerical, and evolutionary optimization studies involving single-objective bilevel programming problems, not many studies look at the context of multiple conflicting objectives in each level of a bilevel programming problem. In this paper, we address certain intricate issues related to solving multi-objective bilevel programming problems, present challenging test problems, and propose a viable and hybrid evolutionary-cum-local-search based algorithm as a solution methodology. The hybrid approach performs better than a number of existing methodologies and scales well up to 40-variable difficult test problems used in this study. The population sizing and termination criteria are made self-adaptive, so that no additional parameters need to be supplied by the user. The study indicates a clear niche of evolutionary algorithms in solving such difficult problems of practical importance compared to their usual solution by a computationally expensive nested procedure. The study opens up many issues related to multi-objective bilevel programming and hopefully this study will motivate EMO and other researchers to pay more attention to this important and difficult problem solving activity.
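The "computationally expensive nested procedure" that the evolutionary approach aims to replace can be sketched on a toy bilevel problem: a complete lower-level optimization is performed for every upper-level candidate (the grid-search solvers and quadratic objectives are illustrative choices, not the paper's test problems):

```python
import numpy as np

def lower_level(x, ys):
    """Follower: for the leader's choice x, solve the lower-level problem."""
    return ys[np.argmin((ys - x) ** 2)]   # toy lower objective: y tracks x

def upper_objective(x, y):
    """Leader's objective, evaluated at the follower's optimal response."""
    return x ** 2 + (y - 2.0) ** 2

xs = np.linspace(-5.0, 5.0, 2001)
ys = np.linspace(-5.0, 5.0, 2001)
# Nested procedure: a FULL lower-level solve for every upper-level candidate.
F_best, x_best = min((upper_objective(x, lower_level(x, ys)), x) for x in xs)
print(round(float(x_best), 2), round(float(F_best), 2))  # 1.0 2.0
```

Even this toy performs millions of lower-level evaluations; with realistic simulations at each level, the nesting cost explodes, which is the niche the hybrid evolutionary approach targets.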
Galvez, Gino; Turbin, Mitchel B.; Thielman, Emily J.; Istvan, Joseph A.; Andrews, Judy A.; Henry, James A.
2012-01-01
Objectives Measurement of outcomes has become increasingly important to assess the benefit of audiologic rehabilitation, including hearing aids, in adults. Data from questionnaires, however, are based on retrospective recall of events and experiences, and often can be inaccurate. Questionnaires also do not capture the daily variation that typically occurs in relevant events and experiences. Clinical researchers in a variety of fields have turned to a methodology known as ecological momentary assessment (EMA) to assess quotidian experiences associated with health problems. The objective of this study was to determine the feasibility of using EMA to obtain real-time responses from hearing aid users describing their experiences with challenging hearing situations. Design This study required three phases: (1) develop EMA methodology to assess hearing difficulties experienced by hearing aid users; (2) utilize focus groups to refine the methodology; and (3) test the methodology with 24 hearing aid users. Phase 3 participants carried a personal digital assistant (PDA) 12 hr per day for 2 wk. The PDA alerted participants to respond to questions four times a day. Each assessment started with a question to determine if a hearing problem was experienced since the last alert. If “yes,” then up to 23 questions (depending on contingent response branching) obtained details about the situation. If “no,” then up to 11 questions obtained information that would help to explain why hearing was not a problem. Each participant completed the Hearing Handicap Inventory for the Elderly (HHIE) both before and after the 2-wk EMA testing period to evaluate for “reactivity” (exacerbation of self-perceived hearing problems that could result from the repeated assessments). Results Participants responded to the alerts with a 77% compliance rate, providing a total of 991 completed momentary assessments (mean = 43.1 per participant). A substantial amount of data was obtained with the methodology. 
Notably, participants reported a “hearing problem situation since the last alert” 37.6% of the time (372 responses). The most common problem situation involved “face-to-face conversation” (53.8% of the time). The next most common problem situation was “telephone conversation” (17.2%) followed by “TV, radio, iPod, etc.” (15.3%), “environmental sounds” (9.7%), and “movies, lecture, etc.” (4.0%). Comparison of pre- and post-EMA mean HHIE scores revealed no significant difference (p>.05), indicating that reactivity did not occur for this group. It should be noted, however, that 37.5% of participants reported a greater sense of awareness regarding their hearing loss and use of hearing aids. Conclusions Results showed participants were compliant, gave positive feedback, and did not demonstrate reactivity based on pre- and post-HHIE scores. We conclude that EMA methodology is feasible with patients who use hearing aids and could potentially inform hearing healthcare (HHC) services. The next step is to develop and evaluate EMA protocols that provide detailed daily patient information to audiologists at each stage of HHC. The advantages of such an approach would be to obtain real-life outcome measures, and to determine within- and between-day variability in outcomes and associated factors. Such information currently is not available from patients who seek and use HHC services. PMID:22531573
[Genetic variation of geographical provenance of Pinus massoniana--review and analysis].
Li, D; Peng, S
2000-04-01
Pinus massoniana is an important tree species constituting the subtropical forests of China. The genetic variation of geographical provenances of P. massoniana and its distribution were reviewed at the morphological, physio-ecological, chromosomal, and molecular levels, and the methodologies for assessing genetic diversity and the patterns of genetic variation among geographical provenances of P. massoniana were synthetically analyzed. Key problems in the molecular ecology of P. massoniana are discussed.
William H. Cooke; Dennis M. Jacobs
2002-01-01
FIA annual inventories require rapid updating of pixel-based Phase 1 estimates. Scientists at the Southern Research Station are developing an automated methodology that uses a Normalized Difference Vegetation Index (NDVI) for identifying and eliminating problem FIA plots from the analysis. Problem plots are those that have questionable land use/land cover information....
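The NDVI screen described can be sketched in a few lines (the reflectance values and flagging threshold are illustrative assumptions, not the Station's):

```python
import numpy as np

def ndvi(nir, red):
    """Normalized Difference Vegetation Index, guarding against a zero denominator."""
    denom = nir + red
    safe = np.where(denom == 0.0, 1.0, denom)
    return np.where(denom == 0.0, 0.0, (nir - red) / safe)

# Hypothetical plot reflectances; plots labeled "forest" should score high.
nir = np.array([0.45, 0.40, 0.12, 0.50])
red = np.array([0.08, 0.10, 0.10, 0.09])
scores = ndvi(nir, red)
problem_plots = np.where(scores < 0.3)[0]  # threshold is an assumption
print(scores.round(2), problem_plots)  # plot 2 contradicts a forest label
```

Plots whose index contradicts their recorded land use/land cover would be routed for review or excluded, automating the screening step the abstract describes.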
Capturing security requirements for software systems.
El-Hadary, Hassan; El-Kassas, Sherif
2014-07-01
Security is often an afterthought during software development. Addressing security early, especially in the requirement phase, is important so that security problems can be tackled before the process advances, avoiding rework. A more effective approach to security requirement engineering is needed to provide a more systematic way of eliciting adequate security requirements. This paper proposes a methodology for security requirement elicitation based on problem frames, which aims at early integration of security with software development. The main goal of the methodology is to assist developers in eliciting adequate security requirements in a more systematic way during the requirement engineering process. A security catalog, based on the problem frames, is constructed to help identify security requirements with the aid of previous security knowledge. Abuse frames are used to model threats, while security problem frames are used to model security requirements. We used evaluation criteria to evaluate the resulting security requirements, concentrating on the identification of conflicts among requirements. We show that such a methodology can elicit more complete security requirements, in addition to assisting developers in eliciting security requirements in a more systematic way.
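The catalog-driven elicitation can be sketched as a lookup from identified threats to candidate security requirements (the threat names and requirements below are invented for illustration, not taken from the paper's catalog):

```python
# Hypothetical security catalog: maps an abuse-frame threat to candidate
# security requirements (all entries invented for illustration).
CATALOG = {
    "interception": ["encrypt data in transit", "mutual authentication"],
    "tampering":    ["integrity checks (MAC)", "audit logging"],
    "spoofing":     ["strong authentication", "session binding"],
}

def elicit_requirements(threats):
    """Collect candidate security requirements for the identified threats,
    de-duplicating so overlaps and conflicts can be reviewed once."""
    seen, requirements = set(), []
    for threat in threats:
        for req in CATALOG.get(threat, []):
            if req not in seen:
                seen.add(req)
                requirements.append(req)
    return requirements

print(elicit_requirements(["interception", "tampering"]))
```

The value of the catalog is that prior security knowledge is reused systematically: a developer models the threats, and candidate requirements follow mechanically, ready for conflict analysis.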
Self-Contained Automated Methodology for Optimal Flow Control
NASA Technical Reports Server (NTRS)
Joslin, Ronald D.; Gunzburger, Max D.; Nicolaides, Roy A.; Erlebacher, Gordon; Hussaini, M. Yousuff
1997-01-01
This paper describes a self-contained, automated methodology for active flow control which couples the time-dependent Navier-Stokes system with an adjoint Navier-Stokes system and optimality conditions from which optimal states, i.e., unsteady flow fields and controls (e.g., actuators), may be determined. The problem of boundary layer instability suppression through wave cancellation is used as the initial validation case to test the methodology. Here, the objective of control is to match the stress vector along a portion of the boundary to a given vector; instability suppression is achieved by choosing the given vector to be that of a steady base flow. Control is effected through the injection or suction of fluid through a single orifice on the boundary. The results demonstrate that instability suppression can be achieved without any a priori knowledge of the disturbance, which is significant because other control techniques have required some knowledge of the flow unsteadiness such as frequencies, instability type, etc. The present methodology has been extended to three dimensions and may potentially be applied to separation control, re-laminarization, and turbulence control applications using one to many sensors and actuators.
Seed defective reduction in automotive Electro-Deposition Coating Process of truck cabin
NASA Astrophysics Data System (ADS)
Sonthilug, Aekkalag; Chutima, Parames
2018-02-01
The case study company is one of the players in Thailand's automotive industry, manufacturing trucks and buses for both domestic and international markets. This research focuses on a product quality problem concerning seed defects occurring in the electro-deposition coating process of truck cabins. The five-phase Six Sigma methodology (Define, Measure, Analyze, Improve, Control) is applied to identify the root causes of the problem and to set new parameters for each significant factor. After the improvement, seed defects in this process were reduced from 9,178 defects per unit to 876 defects per unit (a 90% improvement).
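The reported improvement figure can be checked directly from the defect counts given in the abstract:

```python
# Defects-per-unit figures reported in the abstract
before, after = 9178, 876
improvement = (before - after) / before
print(f"{improvement:.1%}")  # -> 90.5%, consistent with the reported ~90%
```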
On estimation of secret message length in LSB steganography in spatial domain
NASA Astrophysics Data System (ADS)
Fridrich, Jessica; Goljan, Miroslav
2004-06-01
In this paper, we present a new method for estimating the secret message length of bit-streams embedded using the Least Significant Bit embedding (LSB) at random pixel positions. We introduce the concept of a weighted stego image and then formulate the problem of determining the unknown message length as a simple optimization problem. The methodology is further refined to obtain more stable and accurate results for a wide spectrum of natural images. One of the advantages of the new method is its modular structure and a clean mathematical derivation that enables elegant estimator accuracy analysis using statistical image models.
NASA Technical Reports Server (NTRS)
Banks, H. T.; Brown, D. E.; Metcalf, Vern L.; Silcox, R. J.; Smith, Ralph C.; Wang, Yun
1994-01-01
A problem of continued interest concerns the control of vibrations in a flexible structure and the related problem of reducing structure-borne noise in structural acoustic systems. In both cases, piezoceramic patches bonded to the structures have been successfully used as control actuators. Through the application of a controlling voltage, the patches can be used to reduce structural vibrations which in turn lead to methods for reducing structure-borne noise. A PDE-based methodology for modeling, estimating physical parameters, and implementing a feedback control scheme for problems of this type is discussed. While the illustrating example is a circular plate, the methodology is sufficiently general so as to be applicable in a variety of structural and structural acoustic systems.
Problems related to the integration of fault tolerant aircraft electronic systems
NASA Technical Reports Server (NTRS)
Bannister, J. A.; Adlakha, V.; Trivedi, K.; Alspaugh, T. A., Jr.
1982-01-01
Problems related to the design of the hardware for an integrated aircraft electronic system are considered. Taxonomies of concurrent systems are reviewed and a new taxonomy is proposed. An informal methodology intended to identify feasible regions of the taxonomic design space is described. Specific tools are recommended for use in the methodology. Based on the methodology, a preliminary strawman integrated fault tolerant aircraft electronic system is proposed. Next, problems related to the programming and control of integrated aircraft electronic systems are discussed. Issues of system resource management, including the scheduling and allocation of real-time periodic tasks in a multiprocessor environment, are treated in detail. The role of software design in integrated fault tolerant aircraft electronic systems is discussed. Conclusions and recommendations for further work are included.
Meaning and Problems of Planning
ERIC Educational Resources Information Center
Brieve, Fred J.; Johnston, A. P.
1973-01-01
Examines the educational planning process. Discusses what planning is, how methodological planning can work in education, misunderstandings about planning, and difficulties in applying the planning methodology. (DN)
School-Based Methylphenidate Placebo Protocols: Methodological and Practical Issues.
ERIC Educational Resources Information Center
Hyman, Irwin A.; Wojtowicz, Alexandra; Lee, Kee Duk; Haffner, Mary Elizabeth; Fiorello, Catherine A.; And Others
1998-01-01
Focuses on methodological issues involved in choosing instruments to monitor behavior once a comprehensive evaluation has suggested trials on Ritalin. Case examples illustrate problems of teacher compliance in filling out measures, supplying adequate placebos, and obtaining physician cooperation. Emerging school-based methodologies are discussed…
The Research of Improving the Particleboard Glue Dosing Process Based on TRIZ Analysis
NASA Astrophysics Data System (ADS)
Yu, Huiling; Fan, Delin; Zhang, Yizhuo
This research creates a design methodology by synthesizing the Theory of Inventive Problem Solving (TRIZ) with cascade control based on a Smith predictor. A case study of a particleboard glue supplying and dosing system defines the problem and the solution using the methodology proposed in the paper. Status differences in the glue dosing process of particleboard production usually make the gluing volume inaccurate. To solve this problem, we applied TRIZ technical contradictions and inventive principles to improve this key process of particleboard production. The improvement method mapped the inaccuracy problem to a TRIZ technical contradiction, and the prior-action principle suggested a Smith predictor as the control algorithm for the glue dosing system. This research examines the usefulness of a TRIZ-based problem-solving process designed to improve users' ability to address difficult or recurring problems, and also demonstrates the practicality and validity of TRIZ. Several suggestions are presented on how to approach this problem.
Use of Invariant Manifolds for Transfers Between Three-Body Systems
NASA Technical Reports Server (NTRS)
Beckman, Mark; Howell, Kathleen
2003-01-01
The Lunar L1 and L2 libration points have been proposed as gateways granting inexpensive access to interplanetary space. To date, only individual solutions to the transfer between three-body systems have been found. The methodology to solve the problem for arbitrary three-body systems and entire families of orbits does not exist. This paper presents the initial approaches to solve the general problem for single and multiple impulse transfers. Two different methods of representing and storing 7-dimensional invariant manifold data are presented. Some particular solutions are presented for the transfer problem, though the emphasis is on developing methodology for solving the general problem.
Representations of Invariant Manifolds for Applications in Three-Body Systems
NASA Technical Reports Server (NTRS)
Howell, K.; Beckman, M.; Patterson, C.; Folta, D.
2004-01-01
The Lunar L1 and L2 libration points have been proposed as gateways granting inexpensive access to interplanetary space. To date, only individual solutions to the transfer between three-body systems have been found. The methodology to solve the problem for arbitrary three-body systems and entire families of orbits is currently being studied. This paper presents an initial approach to solve the general problem for single and multiple impulse transfers. Two different methods of representing and storing the invariant manifold data are presented. Some particular solutions are presented for two types of transfer problems, though the emphasis is on developing the methodology for solving the general problem.
Lean six sigma methodologies improve clinical laboratory efficiency and reduce turnaround times.
Inal, Tamer C; Goruroglu Ozturk, Ozlem; Kibar, Filiz; Cetiner, Salih; Matyar, Selcuk; Daglioglu, Gulcin; Yaman, Akgun
2018-01-01
Organizing work flow is a major task of laboratory management. Recently, clinical laboratories have started to adopt methodologies such as Lean Six Sigma and some successful implementations have been reported. This study used Lean Six Sigma to simplify the laboratory work process and decrease the turnaround time by eliminating non-value-adding steps. The five-stage Six Sigma system known as define, measure, analyze, improve, and control (DMAIC) is used to identify and solve problems. The laboratory turnaround time for individual tests, total delay time in the sample reception area, and percentage of steps involving risks of medical errors and biological hazards in the overall process are measured. The pre-analytical process in the reception area was improved by eliminating 3 h and 22.5 min of non-value-adding work. Turnaround time also improved for stat samples from 68 to 59 min after applying Lean. Steps prone to medical errors and posing potential biological hazards to receptionists were reduced from 30% to 3%. Successful implementation of Lean Six Sigma significantly improved all of the selected performance metrics. This quality-improvement methodology has the potential to significantly improve clinical laboratories. © 2017 Wiley Periodicals, Inc.
Sparse and stable Markowitz portfolios.
Brodie, Joshua; Daubechies, Ingrid; De Mol, Christine; Giannone, Domenico; Loris, Ignace
2009-07-28
We consider the problem of portfolio selection within the classical Markowitz mean-variance framework, reformulated as a constrained least-squares regression problem. We propose to add to the objective function a penalty proportional to the sum of the absolute values of the portfolio weights. This penalty regularizes (stabilizes) the optimization problem, encourages sparse portfolios (i.e., portfolios with only a few active positions), and allows accounting for transaction costs. Our approach recovers the no-short-positions portfolio as a special case, but also allows for a limited number of short positions. We implement this methodology on two benchmark data sets constructed by Fama and French. Using only a modest amount of training data, we construct portfolios whose out-of-sample performance, as measured by the Sharpe ratio, is consistently and significantly better than that of the naïve evenly weighted portfolio.
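The penalized formulation is an l1-regularized least-squares problem, so a standard solver for it is proximal gradient descent (ISTA) with soft thresholding. The sketch below uses synthetic returns and omits the budget and transaction-cost details of the paper; all names and parameter values are illustrative assumptions, not the authors' implementation.

```python
import numpy as np

rng = np.random.default_rng(1)

def soft_threshold(x, t):
    """Proximal operator of t * ||.||_1."""
    return np.sign(x) * np.maximum(np.abs(x) - t, 0.0)

def sparse_weights(R, target, tau, steps=3000):
    """ISTA for min_w 0.5*||R w - target||^2 + tau*||w||_1 (no budget constraint)."""
    w = np.zeros(R.shape[1])
    eta = 1.0 / np.linalg.norm(R, 2) ** 2  # 1 / Lipschitz constant of the gradient
    for _ in range(steps):
        grad = R.T @ (R @ w - target)
        w = soft_threshold(w - eta * grad, eta * tau)
    return w

T, n = 200, 20                     # 200 return observations, 20 assets
R = rng.normal(0.0, 0.02, (T, n))  # synthetic daily returns
target = np.full(T, 0.001)         # desired constant portfolio return
w = sparse_weights(R, target, tau=1e-4)
print(np.count_nonzero(w), "active positions out of", n)
```

Raising `tau` drives more weights exactly to zero, which is the sparsity mechanism the abstract describes.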
[Methodological problems in the scientific research on HIV /AIDS in Bolivia].
Hita, Susana Ramírez
2013-05-01
This paper discusses the methodological problems in the scientific research on HIV/AIDS in Bolivia, both in the areas of epidemiology and social sciences. Studies associated with this research served as the basis for the implementation of health programs run by The Global Fund, The Pan-American Health Organization, International Cooperation, Non-Governmental Organizations and the Bolivian Ministry of Health and Sports. An analysis of the methodological contradictions and weaknesses was made by reviewing the bibliography of the studies and by conducting qualitative methodological research, that was focused on the quality of health care available to people living with HIV/AIDS in public hospitals and health centers, and looked at how programs targeted at this sector of the population are designed and delivered. In this manner, it was possible to observe the shortcomings of the methodological design in the epidemiological and social science studies which serve as the basis for the implementation of these health programs.
Eichler, Anna K; Glaubitz, Katharina A; Hartmann, Luisa C; Spangler, Gottfried
2014-07-01
Parental stress is increased in clinical contexts (e.g., child psychiatry) and correlates with behavioral and emotional problems in children. In addition, parental stress can result in a biased parental perception of the child's behavior and emotions. These interrelations were examined in a normal (N = 320) and a clinical (N = 75) sample. The "Eltern-Belastungs-Screening zur Kindeswohlgefährdung" (EBSK; Deegener, Spangler, Körner & Becker, 2009) was used to assess parental stress. As expected, increased EBSK scores were overrepresented in the clinical sample. In both samples, stressed parents reported that their children had more behavioral and emotional problems. The children of stressed parents, in turn, reported significantly fewer problems than their parents did. Ratings from independent third parties, e.g., teachers, were not available and should be added in future research. Methodological restrictions and conclusions for practice are discussed.
Lees-Haley, Paul R; Greiffenstein, M Frank; Larrabee, Glenn J; Manning, Edward L
2004-08-01
Recently, Kaiser (2003) raised concerns over the increase in brain damage claims reportedly due to exposure to welding fumes. In the present article, we discuss methodological problems in conducting neuropsychological research on the effects of welding exposure, using a recent paper by Bowler et al. (2003) as an example to illustrate problems common in the neurotoxicity literature. Our analysis highlights difficulties in conducting such quasi-experimental investigations, including subject selection bias, litigation effects on symptom report and neuropsychological test performance, response bias, and scientifically inadequate causal reasoning.
Systematic Review of Empirically Evaluated School-Based Gambling Education Programs.
Keen, Brittany; Blaszczynski, Alex; Anjoul, Fadi
2017-03-01
Adolescent problem gambling prevalence rates are reportedly five times higher than in the adult population. Several school-based gambling education programs have been developed in an attempt to reduce problem gambling among adolescents; however few have been empirically evaluated. The aim of this review was to report the outcome of studies empirically evaluating gambling education programs across international jurisdictions. A systematic review following guidelines outlined in the Preferred Reporting Items for Systematic Reviews and Meta-Analyses (PRISMA) Statement searching five academic databases: PubMed, Scopus, Medline, PsycINFO, and ERIC, was conducted. A total of 20 papers and 19 studies were included after screening and exclusion criteria were applied. All studies reported intervention effects on cognitive outcomes such as knowledge, perceptions, and beliefs. Only nine of the studies attempted to measure intervention effects on behavioural outcomes, and only five of those reported significant changes in gambling behaviour. Of these five, methodological inadequacies were commonly found including brief follow-up periods, lack of control comparison in post hoc analyses, and inconsistencies and misclassifications in the measurement of gambling behaviour, including problem gambling. Based on this review, recommendations are offered for the future development and evaluation of school-based gambling education programs relating to both methodological and content design and delivery considerations.
Design and Diagnosis Problem Solving with Multifunctional Technical Knowledge Bases
1992-09-29
Design problem solving is a complex activity involving a number of subtasks, and a number of alternative methods are potentially available...
William H. Cooke; Dennis M. Jacobs
2005-01-01
FIA annual inventories require rapid updating of pixel-based Phase 1 estimates. Scientists at the Southern Research Station are developing an automated methodology that uses a Normalized Difference Vegetation Index (NDVI) for identifying and eliminating problem FIA plots from the analysis. Problem plots are those that have questionable land use/land cover information....
Science and Television Commercials: Adding Relevance to the Research Methodology Course.
ERIC Educational Resources Information Center
Solomon, Paul R.
1979-01-01
Contends that research methodology courses can be relevant to issues outside of psychology and describes a method which relates the course to consumer problems. Students use experimental methodology to test claims made in television commercials advertising deodorant, bathroom tissues, and soft drinks. (KC)
Koh, Y. W.; Chui, C. Y.; Tang, C. S. K.; Lee, A. M.
2014-01-01
Introduction. Although maternal perinatal mental health problems have been extensively studied and recognized as a significant health problem, the literature on paternal perinatal mental health problems is relatively scarce. The present study aims to determine the prevalence of paternal perinatal depression and to identify the risk factors and the relationship between antenatal and postpartum depression. Methodology. 622 expectant fathers were recruited from regional maternity clinics. The expectant fathers were assessed using standardized and validated psychological instruments at three time points: early pregnancy, late pregnancy, and six weeks postpartum. Results. A significant proportion of expectant fathers manifested depressive symptoms during the perinatal period. Paternal antenatal depression significantly predicted a higher level of paternal postpartum depression. Psychosocial risk factors were consistently associated with paternal depression at the different time points. Conclusions. The present study points to the need for greater research and clinical attention to paternal depression, given that it is a highly prevalent problem that could be detrimental to spouses and to children's development. The present findings contribute to the theoretical basis of the prevalence and risk factors of paternal perinatal depression and have implications for the design of effective identification, prevention, and intervention for these clinical problems. PMID:24600517
Hospital cost control in Norway: a decade's experience with prospective payment.
Crane, T S
1985-01-01
Under Norway's prospective payment system, which was in existence from 1972 to 1980, hospital costs increased 15.8 percent annually, compared with 15.3 percent in the United States. In 1980 the Norwegian national government started paying for all institutional services according to a population-based, morbidity-adjusted formula. Norway's prospective payment system provides important insights into problems of controlling hospital costs despite significant differences, including ownership of medical facilities and payment and spending as a percent of GNP. Yet striking similarities exist. Annual real growth in health expenditures from 1972 to 1980 in Norway was 2.2 percent, compared with 2.4 percent in the United States. In both countries, public demands for cost control were accompanied by demands for more services. And problems of geographic dispersion of new technology and distribution of resources were similar. Norway's experience in the 1970s demonstrates that prospective payment is no panacea. The annual budget process created disincentives to hospitals to control costs. But Norway's changes in 1980 to a population-based methodology suggest a useful approach to achieve a more equitable distribution of resources. This method of payment provides incentives to control variations in both admissions and cost per case. In contrast, the Medicare approach based on Diagnostic Related Groups (DRGs) is limited, and it does not affect variations in admissions and capital costs. Population-based methodologies can be used in adjusting DRG rates to control both problems. In addition, the DRG system only applies to Medicare payments; the Norwegian experience demonstrates that this system may result in significant shifting of costs onto other payors. PMID:3927385
NASA Technical Reports Server (NTRS)
Smalley, Kurt B.; Tinker, Michael L.; Fischer, Richard T.
2001-01-01
This paper provides an introduction and a set of guidelines for a methodology for NASTRAN eigenvalue modeling of thin film inflatable structures. It is hoped that this paper will spare the reader the problems and headaches the authors confronted during their investigation by presenting not only an introduction and verification of the methodology, but also a discussion of the problems the methodology can entail. Our goal in this investigation was to verify the basic methodology through the creation and correlation of a simple model. An overview of thin film structures, their history, and their applications is given. Previous modeling work is then briefly discussed. An introduction to the method of modeling is then given. The specific mechanics of the method are discussed in parallel with a basic discussion of NASTRAN's implementation of these mechanics. The problems encountered with the method are then given, along with suggestions for their workarounds. The methodology is verified through the correlation between an analytical model and modal test results for a thin film strut. Recommendations are given for advancing our understanding of this method and our ability to accurately model thin film structures. Finally, conclusions are drawn regarding the usefulness of the methodology.
Unique Applications for Artificial Neural Networks. Phase 1
1991-08-08
significance. For the VRP, a problem that has received considerable attention in the literature, the new NGO-VRP methodology generates better solutions...represent the stop assignments of each route. The effect of the genetic recombinations is to make simple local exchanges to the relative positions of the...technique for representing a computer-based associative memory [Arbib, 1987]. In our routing system, the basic job of the neural network system is to accept
The Utilization of the Behavioral Sciences in Long Range Forecasting and Policy Planning
1973-07-30
73 - December 31, 1973. The report will be divided into six major sections. The first will describe the analysis initiated and completed during...the half year. Results of special significance will be highlighted. Methodological problems that have arisen during the analysis will be discussed...of this analysis has been the development of a modular computer simulation of such operations. The oil simulation module is then to be used
Recent Advances in Agglomerated Multigrid
NASA Technical Reports Server (NTRS)
Nishikawa, Hiroaki; Diskin, Boris; Thomas, James L.; Hammond, Dana P.
2013-01-01
We report recent advancements of the agglomerated multigrid methodology for complex flow simulations on fully unstructured grids. An agglomerated multigrid solver is applied to a wide range of test problems from simple two-dimensional geometries to realistic three- dimensional configurations. The solver is evaluated against a single-grid solver and, in some cases, against a structured-grid multigrid solver. Grid and solver issues are identified and overcome, leading to significant improvements over single-grid solvers.
NASA Technical Reports Server (NTRS)
Hermann, Robert
1997-01-01
The aim of this research is to develop new mathematical methodology for the analysis of hybrid systems of the type involved in Air Traffic Control (ATC) problems. Two directions of investigation were initiated. The first used the methodology of nonlinear generalized functions, whose mathematical foundations were initiated by Colombeau and developed further by Oberguggenberger; it has been extended to apply to ordinary differential systems of the type encountered in control, in joint work with the PI and M. Oberguggenberger. This involved a 'mixture' of 'continuous' and 'discrete' methodology. ATC clearly involves a mixture of two sorts of mathematical problems: (1) the 'continuous' dynamics of standard control type, described by ordinary differential equations (ODEs) of the form dx/dt = f(x, u); and (2) the discrete lattice dynamics of cellular automata (CA). Most of the CA literature involves a discretization of a partial differential equation (PDE) system of the type encountered in physics problems (e.g., fluid and gas problems). Both of these directions require much thinking and new development of mathematical fundamentals before they may be utilized in the ATC work. Rather than consider CA as 'discretizations' of PDE systems, I believe that the ATC applications will require a completely different and new mathematical methodology, a sort of discrete analogue of jet bundles and/or the sheaf-theoretic techniques of topologists. Here too, I have begun work on virtually 'virgin' mathematical ground (at least from an 'applied' point of view) which will require considerable preliminary work.
Was Feyerabend a Popperian? Methodological issues in the History of the Philosophy of Science.
Collodel, Matteo
2016-06-01
For more than three decades, there has been significant debate about the relation between Feyerabend and Popper. The discussion has been nurtured and complicated by the rift that opened up between the two and by the later Feyerabend's controversial portrayal of his earlier self. The first part of the paper provides an overview of the accounts of the relation that have been proposed over the years, disentangles the problems they deal with, and analyses the evidence supporting their conclusions as well as the methodological approaches used to process that evidence. Rather than advancing a further speculative account of the relation based on Feyerabend's philosophical work or autobiographical recollections, the second part of the paper strives to clarify the problems at issue by making use of a wider range of evidence. It outlines a historical reconstruction of the social context within which Feyerabend's intellectual trajectory developed, putting a special emphasis on the interplay between the perceived intellectual identity of Feyerabend, Feyerabend's own intellectual self-concept, and the peculiar features of the evolving Popperian research group. Copyright © 2015 Elsevier Ltd. All rights reserved.
Financial Support for the Humanities: A Special Methodological Report.
ERIC Educational Resources Information Center
Gomberg, Irene L.; Atelsek, Frank J.
Findings and methodological problems of a survey on financial support for humanities in higher education are discussed. Usable data were gathered from 351 of 671 Higher Education Panel member institutions. Two weighting methodologies were employed. The conventional method assumed that nonrespondents were similar to respondents, whereas a…
An automated methodology development. [software design for combat simulation
NASA Technical Reports Server (NTRS)
Hawley, L. R.
1985-01-01
The design methodology employed in testing the applicability of Ada in large-scale combat simulations is described. Ada was considered as a substitute for FORTRAN to lower life cycle costs and ease the program development efforts. An object-oriented approach was taken, which featured definitions of military targets, the capability of manipulating their condition in real-time, and one-to-one correlation between the object states and real world states. The simulation design process was automated by the problem statement language (PSL)/problem statement analyzer (PSA). The PSL/PSA system accessed the problem data base directly to enhance the code efficiency by, e.g., eliminating non-used subroutines, and provided for automated report generation, besides allowing for functional and interface descriptions. The ways in which the methodology satisfied the responsiveness, reliability, transportability, modifiability, timeliness and efficiency goals are discussed.
GIS and Multi-criteria evaluation (MCE) for landform geodiversity assessment
NASA Astrophysics Data System (ADS)
Najwer, Alicja; Reynard, Emmanuel; Zwoliński, Zbigniew
2014-05-01
In geomorphology, at the contemporary stage of methodological development, it is important to undertake new research problems from both theoretical and applied points of view. One example of applying geoconservation results in landscape studies and environmental conservation is the problem of landform geodiversity. The concept of geodiversity was created relatively recently and, therefore, little progress has been made in its objective assessment and mapping. To ensure clarity and coherence, it is recommended that the evaluation process be rigorous; multi-criteria evaluation meets these criteria well. The main objective of this presentation is to demonstrate a new methodology for the assessment of selected natural environment components in response to the definition of geodiversity, as well as the visualization of landform geodiversity, using the opportunities offered by the geoinformation environment. The study area consists of two peculiar alpine valleys, Illgraben and Derborence, located in the Swiss Alps. Apart from glacial and fluvial landforms, the morphology of these two sites is largely due to extreme phenomena (rockslides, torrential processes). Both valleys are recognized as geosites of national importance. The basis of the assessment is the selection of geographical environment features. First, six factor maps were prepared for each area: landform energy, landform fragmentation, contemporary landform preservation, geological settings, and hydrographic elements (lakes and streams). Input maps were then standardized, and the resulting maps were derived through map algebra operations carried out by multi-criteria evaluation (MCE) with the GIS-based Weighted Sum technique. Weights for particular classes were calculated using the pairwise-comparison matrix method. The final stage of deriving the landform geodiversity maps was a reclassification procedure using the natural breaks method.
The final maps of landform geodiversity were generated with the same methodological algorithm, multiplying each factor map by its given weight, with a consistency ratio of 0.07. However, the results obtained were radically different. The map of geodiversity for Derborence is characterized by much more significant fragmentation, and areas of low geodiversity constitute a greater share. In the Illgraben site, there is a significant contribution of the high and very high geodiversity classes. The maps were reviewed during field exploration with positive results, which supports the conclusion that the methodology is sound and can be applied to other similar areas. It is therefore important to develop an objective methodology that can be implemented at the local and regional scales, while also giving satisfactory results for landscapes different from the alpine one. Maps of landform geodiversity may be used for environmental conservation management, preservation of specific features within geosite perimeters, spatial planning, or tourism management.
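The Weighted Sum step described above can be sketched in a few lines; the factor rasters, weights, and five-class breaks below are illustrative stand-ins (quantile breaks approximate the natural breaks method), not the study's actual data:

```python
import numpy as np

# Illustrative standardized factor maps (values in [0, 1]) on a common grid;
# in the study these are the landform energy, fragmentation, preservation,
# geology, and hydrographic (lakes, streams) rasters.
rng = np.random.default_rng(0)
names = ["energy", "fragmentation", "preservation", "geology", "lakes", "streams"]
factors = {name: rng.random((4, 4)) for name in names}

# Illustrative weights, e.g. derived from a pairwise-comparison matrix (sum to 1).
weights = {"energy": 0.30, "fragmentation": 0.25, "preservation": 0.15,
           "geology": 0.15, "lakes": 0.08, "streams": 0.07}

# Weighted Sum: multiply each standardized factor map by its weight and add.
geodiversity = sum(w * factors[k] for k, w in weights.items())

# Reclassify into five classes; quantile breaks stand in for natural breaks.
breaks = np.quantile(geodiversity, [0.2, 0.4, 0.6, 0.8])
classes = np.digitize(geodiversity, breaks) + 1   # 1 = low ... 5 = very high
```

In a real GIS workflow the same algebra runs over full rasters; only the weighting and reclassification logic is shown here.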
Technology transfer methodology
NASA Technical Reports Server (NTRS)
Labotz, Rich
1991-01-01
Information on technology transfer methodology is given in viewgraph form. Topics covered include problems in economics, technology drivers, inhibitors to using improved technology in development, technology application opportunities, and co-sponsorship of technology.
Engineering tradeoff problems viewed as multiple objective optimizations and the VODCA methodology
NASA Astrophysics Data System (ADS)
Morgan, T. W.; Thurgood, R. L.
1984-05-01
This paper summarizes a rational model for making engineering tradeoff decisions. The model is a hybrid from the fields of social welfare economics, communications, and operations research. A solution methodology (Vector Optimization Decision Convergence Algorithm - VODCA) firmly grounded in the economic model is developed both conceptually and mathematically. The primary objective for developing the VODCA methodology was to improve the process for extracting relative value information about the objectives from the appropriate decision makers. This objective was accomplished by employing data filtering techniques to increase the consistency of the relative value information and decrease the amount of information required. VODCA is applied to a simplified hypothetical tradeoff decision problem. Possible uses of multiple-objective analysis concepts and the VODCA methodology in product-line development and market research are discussed.
Six-Degree-of-Freedom Trajectory Optimization Utilizing a Two-Timescale Collocation Architecture
NASA Technical Reports Server (NTRS)
Desai, Prasun N.; Conway, Bruce A.
2005-01-01
Six-degree-of-freedom (6DOF) trajectory optimization of a reentry vehicle is solved using a two-timescale collocation methodology. This class of 6DOF trajectory problems is characterized by two distinct timescales in the governing equations, where a subset of the states have high-frequency dynamics (the rotational equations of motion) while the remaining states (the translational equations of motion) vary comparatively slowly. With conventional collocation methods, the 6DOF problem size becomes extraordinarily large and difficult to solve. Utilizing the two-timescale collocation architecture, the problem size is reduced significantly. The converged solution shows a realistic landing profile and captures the appropriate high-frequency rotational dynamics. A large reduction in the overall problem size (by 55%) is attained with the two-timescale architecture as compared to the conventional single-timescale collocation method. Consequently, optimum 6DOF trajectory problems can now be solved efficiently using collocation, which was not previously possible for a system with two distinct timescales in the governing states.
Is there an "abortion trauma syndrome"? Critiquing the evidence.
Robinson, Gail Erlick; Stotland, Nada L; Russo, Nancy Felipe; Lang, Joan A; Occhiogrosso, Mallay
2009-01-01
The objective of this review is to identify and illustrate methodological issues in studies used to support claims that induced abortion results in an "abortion trauma syndrome" or a psychiatric disorder. After identifying key methodological issues to consider when evaluating such research, we illustrate these issues by critically examining recent empirical studies that are widely cited in legislative and judicial testimony in support of the existence of adverse psychiatric sequelae of induced abortion. Recent studies that have been used to assert a causal connection between abortion and subsequent mental disorders are marked by methodological problems that include, but are not limited to: poor sample and comparison group selection; inadequate conceptualization and control of relevant variables; poor quality and lack of clinical significance of outcome measures; inappropriateness of statistical analyses; and errors of interpretation, including misattribution of causal effects. By way of contrast, we review some recent major studies that avoid these methodological errors. The most consistent predictor of mental disorders after abortion remains preexisting disorders, which, in turn, are strongly associated with exposure to sexual abuse and intimate violence. Educating researchers, clinicians, and policymakers in how to assess the methodological quality of research about abortion outcomes is crucial. Further, methodologically sound research is needed to evaluate not only the psychological outcomes of abortion, but also the impact of existing legislation and the effects of social attitudes and behaviors on women who have abortions.
Peirlinck, Mathias; De Beule, Matthieu; Segers, Patrick; Rebelo, Nuno
2018-05-28
Patient-specific biomechanical modeling of the cardiovascular system is complicated by the presence of a physiological pressure load, given that the imaged tissue is in a pre-stressed and pre-strained state. Neglecting this prestressed state in solid tissue mechanics models leads to erroneous metrics (e.g. wall deformation, peak stress, wall shear stress), which in turn are used for device design choices, risk assessment (e.g. procedure, rupture) and surgery planning. It is thus of utmost importance to incorporate this deformed and loaded tissue state into the computational models, which implies solving an inverse problem (calculating an undeformed geometry given the load and the deformed geometry). Methodologies to solve this inverse problem can be categorized as iterative or direct, each with inherent advantages and disadvantages. Direct methodologies are typically based on the inverse elastostatics (IE) approach and offer a computationally efficient single-shot way to compute the in vivo stress state. However, cumbersome and problem-specific derivations of the formulations, together with non-trivial access to the finite element analysis (FEA) code, especially for commercial products, hinder broad implementation of these methodologies. For that reason, we developed a novel, modular IE approach and implemented it in a commercial FEA solver with minor user-subroutine interventions. The accuracy of this methodology was demonstrated in an arterial tube and porcine biventricular myocardium model. The computational power and efficiency of the methodology was shown by computing the in vivo stress and strain state, and the corresponding unloaded geometry, for two models containing multiple interacting incompressible, anisotropic (fiber-embedded) and hyperelastic material behaviors: a patient-specific abdominal aortic aneurysm and a full 4-chamber heart model. Copyright © 2018 Elsevier Ltd. All rights reserved.
Varieties of second modernity: the cosmopolitan turn in social and political theory and research.
Beck, Ulrich; Grande, Edgar
2010-09-01
The theme of this special issue is the necessity of a cosmopolitan turn in social and political theory. The question at the heart of this introductory chapter takes the challenge of 'methodological cosmopolitanism', already addressed in a Special Issue on Cosmopolitan Sociology in this journal (Beck and Sznaider 2006), an important step further: How can social and political theory be opened up, theoretically as well as methodologically and normatively, to a historically new, entangled Modernity which threatens its own foundations? How can it account for the fundamental fragility, the mutability of societal dynamics (of unintended side effects, domination and power), shaped by the globalization of capital and risks at the beginning of the twenty-first century? What theoretical and methodological problems arise and how can they be addressed in empirical research? In the following, we will develop this 'cosmopolitan turn' in four steps: firstly, we present the major conceptual tools for a theory of cosmopolitan modernities; secondly, we de-construct Western modernity by using examples taken from research on individualization and risk; thirdly, we address the key problem of methodological cosmopolitanism, namely the problem of defining the appropriate unit of analysis; and finally, we discuss normative questions, perspectives, and dilemmas of a theory of cosmopolitan modernities, in particular problems of political agency and prospects of political realization.
A Bayesian approach to truncated data sets: An application to Malmquist bias in Supernova Cosmology
NASA Astrophysics Data System (ADS)
March, Marisa Cristina
2018-01-01
A problem commonly encountered in statistical analysis of data is that of truncated data sets. A truncated data set is one in which a number of data points are completely missing from a sample; this is in contrast to a censored sample, in which partial information is missing from some data points. In astrophysics this problem commonly arises in a magnitude-limited survey, where the survey is incomplete at fainter magnitudes; that is, certain faint objects are simply not observed. The effect of this `missing data' is manifested as Malmquist bias and can result in biased parameter inference if it is not accounted for. In Frequentist methodologies, the Malmquist bias is often corrected for by analysing many simulations and computing the appropriate correction factors. One problem with this approach is that the corrections are model dependent. In this poster we derive a Bayesian methodology for accounting for truncated data sets in problems of parameter inference and model selection. We first present the methodology for a simple Gaussian linear model and then show how to account for a truncated data set in the case of cosmological parameter inference with a magnitude-limited supernova Ia survey.
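A minimal sketch of the truncation-aware likelihood on a toy Gaussian problem (the simple-model case, not the poster's full supernova analysis); all numbers below are illustrative:

```python
import math
import numpy as np

# Toy magnitude-limited survey: true magnitudes are N(mu=20, sigma=1), but
# only objects brighter than (below) m_lim = 20.5 are detected.
rng = np.random.default_rng(1)
full = rng.normal(20.0, 1.0, 5000)
m_lim = 20.5
data = full[full < m_lim]                 # truncated sample

def log_norm_cdf(x):
    return math.log(0.5 * (1.0 + math.erf(x / math.sqrt(2.0))))

def loglike(mu, sigma=1.0):
    # Each detected point's density is renormalized by the detection
    # probability P(m < m_lim | mu); this is what removes the Malmquist bias.
    logpdf = -0.5 * ((data - mu) / sigma) ** 2 \
             - math.log(sigma * math.sqrt(2.0 * math.pi))
    return logpdf.sum() - data.size * log_norm_cdf((m_lim - mu) / sigma)

mus = np.linspace(19.5, 20.5, 201)
logpost = np.array([loglike(m) for m in mus])   # flat prior over the grid
mu_hat = mus[np.argmax(logpost)]

naive = data.mean()   # ignores truncation: biased toward bright magnitudes
```

The naive sample mean lands well below the true value of 20, while the truncation-aware posterior peaks near it.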
NASA Astrophysics Data System (ADS)
Kernicky, Timothy; Whelan, Matthew; Al-Shaer, Ehab
2018-06-01
A methodology is developed for the estimation of internal axial force and boundary restraints within in-service, prismatic axial force members of structural systems using interval arithmetic and contractor programming. The determination of the internal axial force and end restraints in tie rods and cables using vibration-based methods has been a long-standing problem in the area of structural health monitoring and performance assessment. However, for structural members with low slenderness, where the dynamics are significantly affected by the boundary conditions, few existing approaches allow simultaneous identification of internal axial force and end restraints, and none permits quantifying the uncertainties in the parameter estimates due to measurement uncertainty. This paper proposes a new technique for approaching this challenging inverse problem that leverages the Set Inversion Via Interval Analysis algorithm to solve for the unknown axial forces and end restraints using natural frequency measurements. The framework developed offers the ability to completely enclose the feasible solutions to the parameter identification problem, given specified measurement uncertainties for the natural frequencies. This ability to propagate measurement uncertainty into the parameter space is critical to quantifying the confidence in the individual parameter estimates to inform decision-making within structural health diagnosis and prognostication applications. The methodology is first verified with simulated data for a case with unknown rotational end restraints and then extended to a case with unknown translational and rotational end restraints. A laboratory experiment is then presented to demonstrate the application of the methodology to an axially loaded rod with progressively increased end restraint at one end.
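A toy version of the set-inversion idea can be sketched for a taut string, whose fundamental frequency is monotone in the axial force; the physical constants, measured frequency interval, and the simple one-dimensional bisection scheme below are illustrative assumptions, not the paper's beam formulation:

```python
import math

# Taut string: fundamental frequency f(N) = sqrt(N / mu) / (2 L).
# Given a measured frequency interval, enclose the feasible axial forces N.
L_len, mu = 1.0, 0.5                      # length [m], mass per length [kg/m]
f_meas = (70.0, 71.0)                     # measured frequency interval [Hz]

def f_interval(Nlo, Nhi):
    """Image of the force interval [Nlo, Nhi] under f (monotone increasing)."""
    g = lambda N: math.sqrt(N / mu) / (2.0 * L_len)
    return g(Nlo), g(Nhi)

def sivia(box, eps=1e-3, inside=None, boundary=None):
    # Minimal SIVIA: discard infeasible boxes, keep fully feasible ones,
    # bisect undecided boxes until they are smaller than eps.
    inside = [] if inside is None else inside
    boundary = [] if boundary is None else boundary
    lo, hi = box
    flo, fhi = f_interval(lo, hi)
    if fhi < f_meas[0] or flo > f_meas[1]:
        return inside, boundary            # infeasible: discard
    if flo >= f_meas[0] and fhi <= f_meas[1]:
        inside.append(box)                 # entirely feasible
    elif hi - lo < eps:
        boundary.append(box)               # undecided, too small to split
    else:
        mid = 0.5 * (lo + hi)
        sivia((lo, mid), eps, inside, boundary)
        sivia((mid, hi), eps, inside, boundary)
    return inside, boundary

inside, boundary = sivia((1.0, 20000.0))
N_enclosure = (min(b[0] for b in inside + boundary),
               max(b[1] for b in inside + boundary))
```

For these numbers the analytic feasible set is N = mu (2 L f)^2 for f in [70, 71], i.e. roughly [9800, 10082] N, and the returned enclosure brackets it to within eps.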
Leighton, Angela; Weinborn, Michael; Maybery, Murray
2014-10-01
Bigler (2012) and Larrabee (2012) recently addressed the state of the science surrounding performance validity tests (PVTs) in a dialogue highlighting evidence for the valid and increased use of PVTs, but also for unresolved problems. Specifically, Bigler criticized the lack of guidance from neurocognitive processing theory in the PVT literature. For example, individual PVTs have applied the simultaneous forced-choice methodology using a variety of test characteristics (e.g., word vs. picture stimuli) with known neurocognitive processing implications (e.g., the "picture superiority effect"). However, the influence of such variations on classification accuracy has been inadequately evaluated, particularly among cognitively impaired individuals. The current review places the PVT literature in the context of neurocognitive processing theory, and identifies potential methodological factors to account for the significant variability we identified in classification accuracy across current PVTs. We subsequently evaluated the utility of a well-known cognitive manipulation to provide a Clinical Analogue Methodology (CAM), that is, to alter the PVT performance of healthy individuals to be similar to that of a cognitively impaired group. Initial support was found, suggesting the CAM may be useful alongside other approaches (analogue malingering methodology) for the systematic evaluation of PVTs, particularly the influence of specific neurocognitive processing components on performance.
NASA Astrophysics Data System (ADS)
Fulkerson, David E.
2010-02-01
This paper describes a new methodology for characterizing the electrical behavior and soft error rate (SER) of CMOS and SiGe HBT integrated circuits that are struck by ions. A typical engineering design problem is to calculate the SER of a critical path that commonly includes several circuits such as an input buffer, several logic gates, logic storage, clock tree circuitry, and an output buffer. Using multiple 3D TCAD simulations to solve this problem is too costly and time-consuming for general engineering use. The new and simple methodology handles the problem with ease by simple SPICE simulations. The methodology accurately predicts the measured threshold linear energy transfer (LET) of a bulk CMOS SRAM. It solves for circuit currents and voltage spikes that are close to those predicted by expensive 3D TCAD simulations. It accurately predicts the measured event cross-section vs. LET curve of an experimental SiGe HBT flip-flop. The experimental cross section vs. frequency behavior and other subtle effects are also accurately predicted.
A hybrid approach to select features and classify diseases based on medical data
NASA Astrophysics Data System (ADS)
AbdelLatif, Hisham; Luo, Jiawei
2018-03-01
Feature selection is a popular problem in the classification of diseases in clinical medicine. Here, we develop a hybrid methodology to classify diseases, based on three medical datasets: Arrhythmia, Breast Cancer, and Hepatitis. This methodology, called k-means ANOVA Support Vector Machine (K-ANOVA-SVM), uses k-means clustering with the ANOVA statistic to preprocess the data and select significant features, and Support Vector Machines for the classification step. To compare and evaluate performance, we chose three classification algorithms (decision tree, Naïve Bayes, and Support Vector Machines) and applied the medical datasets directly to them. Our methodology gave much better classification accuracy: 98% on the Arrhythmia dataset, 92% on Breast Cancer, and 88% on Hepatitis, compared with applying the data directly to the decision tree, Naïve Bayes, and Support Vector Machine classifiers. K-ANOVA-SVM also achieved better ROC-curve and precision results than the other algorithms.
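A pipeline of this shape can be sketched with scikit-learn; the dataset, the choice to append the k-means cluster label as an engineered feature, and keeping 10 ANOVA-selected features are our assumptions for illustration, not the paper's exact configuration:

```python
import numpy as np
from sklearn.datasets import load_breast_cancer
from sklearn.model_selection import train_test_split
from sklearn.pipeline import make_pipeline
from sklearn.preprocessing import StandardScaler
from sklearn.feature_selection import SelectKBest, f_classif
from sklearn.cluster import KMeans
from sklearn.svm import SVC

X, y = load_breast_cancer(return_X_y=True)

# K-means step (our reading of the method): append the cluster label as an
# extra engineered feature before ANOVA-based selection.
km = KMeans(n_clusters=2, n_init=10, random_state=0).fit(X)
X_aug = np.column_stack([X, km.labels_])

Xtr, Xte, ytr, yte = train_test_split(X_aug, y, random_state=0, stratify=y)

# ANOVA F-test selection of the most significant features, then an SVM.
model = make_pipeline(StandardScaler(),
                      SelectKBest(f_classif, k=10),
                      SVC(kernel="rbf", C=1.0))
model.fit(Xtr, ytr)
acc = model.score(Xte, yte)
```

Swapping `SVC` for a decision tree or Naïve Bayes classifier in the same pipeline gives the baseline comparison the abstract describes.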
A self-contained, automated methodology for optimal flow control validated for transition delay
NASA Technical Reports Server (NTRS)
Joslin, Ronald D.; Gunzburger, Max D.; Nicolaides, R. A.; Erlebacher, Gordon; Hussaini, M. Yousuff
1995-01-01
This paper describes a self-contained, automated methodology for flow control along with a validation of the methodology for the problem of boundary layer instability suppression. The objective of control is to match the stress vector along a portion of the boundary to a given vector; instability suppression is achieved by choosing the given vector to be that of a steady base flow, e.g., Blasius boundary layer. Control is effected through the injection or suction of fluid through a single orifice on the boundary. The present approach couples the time-dependent Navier-Stokes system with an adjoint Navier-Stokes system and optimality conditions from which optimal states, i.e., unsteady flow fields, and control, e.g., actuators, may be determined. The results demonstrate that instability suppression can be achieved without any a priori knowledge of the disturbance, which is significant because other control techniques have required some knowledge of the flow unsteadiness such as frequencies, instability type, etc.
Fluence-based and microdosimetric event-based methods for radiation protection in space
NASA Technical Reports Server (NTRS)
Curtis, Stanley B.; Meinhold, C. B. (Principal Investigator)
2002-01-01
The National Council on Radiation Protection and Measurements (NCRP) has recently published a report (Report #137) that discusses various aspects of the concepts used in radiation protection and the difficulties in measuring the radiation environment in spacecraft for the estimation of radiation risk to space travelers. Two novel dosimetric methodologies, fluence-based and microdosimetric event-based methods, are discussed and evaluated, along with the more conventional quality factor/LET method. It was concluded that, for the present, there is no compelling reason to switch to a new methodology, but that because of certain drawbacks in the conventional method, these alternative methodologies should be kept in mind. As new data become available and dosimetric techniques become more refined, the question should be revisited, and significant improvement might be realized in the future. In addition, such concepts as equivalent dose and organ dose equivalent are discussed, and various problems regarding the measurement and estimation of these quantities are presented.
NASA Technical Reports Server (NTRS)
Garg, Sanjay; Mattern, Duane L.; Bright, Michelle M.; Ouzts, Peter J.
1990-01-01
Results are presented from an application of H-infinity control design methodology to a centralized integrated flight/propulsion control (IFPC) system design for a supersonic Short Take-Off and Vertical Landing (STOVL) fighter aircraft in transition flight. The overall design methodology consists of a centralized IFPC controller design with controller partitioning. Only the feedback controller design portion of the methodology is addressed. Design and evaluation vehicle models are summarized, and insight is provided into formulating the H-infinity control problem such that it reflects the IFPC design objectives. The H-infinity controller is shown to provide decoupled command tracking for the design model. The controller order could be significantly reduced by modal residualization of the fast controller modes without any deterioration in performance. A discussion is presented of the areas in which the controller performance needs to be improved, and ways in which these improvements can be achieved within the framework of an H-infinity based linear control design.
Schedule Risks Due to Delays in Advanced Technology Development
NASA Technical Reports Server (NTRS)
Reeves, John D. Jr.; Kayat, Kamal A.; Lim, Evan
2008-01-01
This paper discusses a methodology and modeling capability that probabilistically evaluates the likelihood and impacts of delays in advanced technology development prior to the start of design, development, test, and evaluation (DDT&E) of complex space systems. The challenges of understanding and modeling advanced technology development considerations are first outlined, followed by a discussion of the problem in the context of lunar surface architecture analysis. The current and planned methodologies to address the problem are then presented along with sample analyses and results. The methodology discussed herein provides decision-makers a thorough understanding of the schedule impacts resulting from the inclusion of various enabling advanced technology assumptions within system design.
Methodological Problems of Soviet Pedagogy
ERIC Educational Resources Information Center
Noah, Harold J., Ed.; Beach, Beatrice S., Ed.
1974-01-01
Selected papers presented at the First Scientific Conference of Pedagogical Scholars of Socialist Countries, Moscow, 1971, deal with methodology in relation to science, human development, sociology, psychology, cybernetics, and the learning process. (KM)
Inverse problems in quantum chemistry
NASA Astrophysics Data System (ADS)
Karwowski, Jacek
Inverse problems constitute a branch of applied mathematics with well-developed methodology and formalism. A broad family of tasks met in theoretical physics, in civil and mechanical engineering, as well as in various branches of medical and biological sciences has been formulated as specific implementations of the general theory of inverse problems. In this article, it is pointed out that a number of approaches met in quantum chemistry can (and should) be classified as inverse problems. Consequently, the methodology used in these approaches may be enriched by applying ideas and theorems developed within the general field of inverse problems. Several examples, including the RKR method for the construction of potential energy curves, determining parameter values in semiempirical methods, and finding external potentials for which the pertinent Schrödinger equation is exactly solvable, are discussed in detail.
Sarvet, Barry D; Wegner, Lynn
2010-01-01
By working in collaboration with pediatric primary care providers, child and adolescent psychiatrists have the opportunity to address significant levels of unmet need for the majority of children and teenagers with serious mental health problems who have been unable to gain access to care. Effective collaboration with primary care represents a significant change from practice-as-usual for many child and adolescent psychiatrists. Implementation of progressive levels of collaborative practice, from the improvement of provider communication through the development of comprehensive collaborative systems, may be possible with sustained management efforts and application of process improvement methodology.
Borodulin, V I; Gliantsev, S P
2017-07-01
The article considers key methodological aspects of the problem of the scientific clinical school in national medicine. These concern the notion of a school, its profile, the roles of teachers, teachings, and followers, subsidiary schools, and the ethical component of a scientific school. The article is polemical, so it offers no definitive answers to the questions raised; the reader is invited to ponder the answers independently, weighing examples pro and contra. It concludes that scientific schools in other areas of medicine should be studied and the problem elaborated further.
Electromagnetic Simulation of the Near-Field Distribution around a Wind Farm
Yang, Shang-Te; Ling, Hao
2013-01-01
An efficient approach to compute the near-field distribution around and within a wind farm under plane wave excitation is proposed. To make the problem computationally tractable, several simplifying assumptions are made based on the geometry of the problem. By comparing the approximations against full-wave simulations at 500 MHz, it is shown that the assumptions do not introduce significant errors into the resulting near-field distribution. The near fields around a 3 × 3 wind farm are computed using the developed methodology at 150 MHz, 500 MHz, and 3 GHz. Both the multipath interference patterns and the forward shadows are predicted by the proposed method.
Disease-mongering through clinical trials.
González-Moreno, María; Saborido, Cristian; Teira, David
2015-06-01
Our goal in this paper is to articulate a precise concept of at least a certain kind of disease-mongering, showing how pharmaceutical marketing can commercially exploit certain diseases when their best definition is given through the success of a treatment in a clinical trial. We distinguish two types of disease-mongering according to the way they exploit the definition of the trial population for marketing purposes. We argue that behind these two forms of disease-mongering there are two well-known problems in the statistical methodology of clinical trials (the reference class problem and the distinction between statistical and clinical significance). Overcoming them is far from simple. Copyright © 2015 Elsevier Ltd. All rights reserved.
Prevention Interventions of Alcohol Problems in the Workplace
Ames, Genevieve M.; Bennett, Joel B.
2011-01-01
The workplace offers advantages as a setting for interventions that result in primary prevention of alcohol abuse. Such programs have the potential to reach broad audiences and populations that would otherwise not receive prevention programs and, thereby, benefit both the employee and employer. Researchers have implemented and evaluated a variety of workplace alcohol problem prevention efforts in recent years, including programs focused on health promotion, social health promotion, brief interventions, and changing the work environment. Although some studies reported significant reductions in alcohol use outcomes, additional research with a stronger and integrated methodological approach is needed. The field of workplace alcohol prevention also might benefit from a guiding framework, such as the one proposed in this article. PMID:22330216
Collaborative Visual Analytics: A Health Analytics Approach to Injury Prevention.
Al-Hajj, Samar; Fisher, Brian; Smith, Jennifer; Pike, Ian
2017-09-12
Background: Accurate understanding of complex health data is critical in order to deal with wicked health problems and make timely decisions. Wicked problems refer to ill-structured and dynamic problems that combine multidimensional elements, which often preclude the conventional problem-solving approach. This pilot study introduces visual analytics (VA) methods to multi-stakeholder decision-making sessions about child injury prevention; Methods: Inspired by the Delphi method, we introduced a novel methodology, group analytics (GA). GA was pilot-tested to evaluate the impact of collaborative visual analytics on facilitating problem solving and supporting decision-making. We conducted two GA sessions. Collected data included stakeholders' observations, audio and video recordings, questionnaires, and follow-up interviews. The GA sessions were analyzed using the Joint Activity Theory protocol analysis methods; Results: The GA methodology triggered the emergence of 'common ground' among stakeholders. This common ground evolved throughout the sessions to enhance stakeholders' verbal and non-verbal communication, as well as coordination of joint activities and ultimately collaboration on problem solving and decision-making; Conclusions: Understanding complex health data is necessary for informed decisions. Equally important, in this case, is the use of the group analytics methodology to achieve 'common ground' among diverse stakeholders about health data and their implications.
ERIC Educational Resources Information Center
Riazi, A. Mehdi
2016-01-01
Mixed-methods research (MMR), as an inter-discourse (quantitative and qualitative) methodology, can provide applied linguistics researchers the opportunity to draw on and integrate the strengths of the two research methodological approaches in favour of making more rigorous inferences about research problems. In this article, the argument is made…
Active Methodologies in a Queueing Systems Course for Telecommunication Engineering Studies
ERIC Educational Resources Information Center
Garcia, J.; Hernandez, A.
2010-01-01
This paper presents the results of a one-year experiment in incorporating active methodologies in a Queueing Systems course as part of the Telecommunication Engineering degree at the University of Zaragoza, Spain, during the period of adaptation to the European Higher Education Area. A problem-based learning methodology has been introduced, and…
Harris, Alex; Reeder, Rachelle; Hyun, Jenny
2011-01-01
The authors surveyed 21 editors and reviewers from major psychology journals to identify and describe the statistical and design errors they encounter most often and to get their advice regarding prevention of these problems. Content analysis of the text responses revealed themes in 3 major areas: (a) problems with research design and reporting (e.g., lack of an a priori power analysis, lack of congruence between research questions and study design/analysis, failure to adequately describe statistical procedures); (b) inappropriate data analysis (e.g., improper use of analysis of variance, too many statistical tests without adjustments, inadequate strategy for addressing missing data); and (c) misinterpretation of results. If researchers attended to these common methodological and analytic issues, the scientific quality of manuscripts submitted to high-impact psychology journals might be significantly improved.
Developing close combat behaviors for simulated soldiers using genetic programming techniques.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Pryor, Richard J.; Schaller, Mark J.
2003-10-01
Genetic programming is a powerful methodology for automatically producing solutions to problems in a variety of domains. It has been used successfully to develop behaviors for RoboCup soccer players and simple combat agents. We will attempt to use genetic programming to solve a problem in the domain of strategic combat, keeping in mind the end goal of developing sophisticated behaviors for compound defense and infiltration. The simplified problem at hand is that of two armed agents in a small room, containing obstacles, fighting against each other for survival. The base case and three changes are considered: a memory of positions using stacks, context-dependent genetic programming, and strongly typed genetic programming. Our work demonstrates slight improvements from the first two techniques, and no significant improvement from the last.
Sparse and stable Markowitz portfolios
Brodie, Joshua; Daubechies, Ingrid; De Mol, Christine; Giannone, Domenico; Loris, Ignace
2009-01-01
We consider the problem of portfolio selection within the classical Markowitz mean-variance framework, reformulated as a constrained least-squares regression problem. We propose to add to the objective function a penalty proportional to the sum of the absolute values of the portfolio weights. This penalty regularizes (stabilizes) the optimization problem, encourages sparse portfolios (i.e., portfolios with only few active positions), and allows accounting for transaction costs. Our approach recovers the no-short-positions portfolios as a special case, but also allows for a limited number of short positions. We implement this methodology on two benchmark data sets constructed by Fama and French. Using only a modest amount of training data, we construct portfolios whose out-of-sample performance, as measured by Sharpe ratio, is consistently and significantly better than that of the naïve evenly weighted portfolio. PMID:19617537
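The penalized reformulation can be sketched with a plain iterative soft-thresholding (ISTA) solver on synthetic returns; the data, the penalty value, and the known sparse "true" portfolio are illustrative, not the Fama-French benchmarks used in the paper:

```python
import numpy as np

# l1-penalized least-squares regression: minimize
#   (1/2) * ||y - R w||^2 + tau * ||w||_1
# over portfolio weights w, where R is the T x N matrix of asset returns and
# y is the target return series. Synthetic data with a sparse true portfolio.
rng = np.random.default_rng(2)
T, N = 200, 50
R = rng.standard_normal((T, N))
w_true = np.zeros(N)
w_true[:5] = [0.4, 0.25, 0.15, 0.12, 0.08]        # 5 active positions
y = R @ w_true + 0.01 * rng.standard_normal(T)    # observed portfolio returns

tau = 2.0                                          # l1 penalty strength
w = np.zeros(N)
step = 1.0 / np.linalg.norm(R, 2) ** 2             # 1 / Lipschitz constant
for _ in range(2000):
    grad = R.T @ (R @ w - y)                       # gradient of the smooth part
    z = w - step * grad
    w = np.sign(z) * np.maximum(np.abs(z) - step * tau, 0.0)  # soft threshold

active = np.flatnonzero(np.abs(w) > 1e-6)          # sparse support recovered
```

The soft-thresholding step is what drives most weights exactly to zero, yielding the sparse, stable portfolios the abstract describes; the paper's budget and no-short constraints are omitted in this sketch.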
Improved finite element methodology for integrated thermal structural analysis
NASA Technical Reports Server (NTRS)
Dechaumphai, P.; Thornton, E. A.
1982-01-01
An integrated thermal-structural finite element approach for efficient coupling of thermal and structural analysis is presented. New thermal finite elements which yield exact nodal and element temperatures for one dimensional linear steady state heat transfer problems are developed. A nodeless variable formulation is used to establish improved thermal finite elements for one dimensional nonlinear transient and two dimensional linear transient heat transfer problems. The thermal finite elements provide detailed temperature distributions without using additional element nodes and permit a common discretization with lower order congruent structural finite elements. The accuracy of the integrated approach is evaluated by comparisons with analytical solutions and conventional finite element thermal structural analyses for a number of academic and more realistic problems. Results indicate that the approach provides a significant improvement in the accuracy and efficiency of thermal stress analysis for structures with complex temperature distributions.
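The claim that linear elements yield exact nodal temperatures for one-dimensional linear steady-state conduction can be checked with a few lines of standard finite-element assembly. This is a generic textbook sketch, not the nodeless-variable formulation of the paper.

```python
import numpy as np

def solve_1d_heat(n_el, k, q, u0, u1):
    """Linear finite elements for -k u'' = q on [0, 1] with u(0)=u0, u(1)=u1."""
    n = n_el + 1
    h = 1.0 / n_el
    K = np.zeros((n, n))
    f = np.zeros(n)
    for e in range(n_el):
        # Element stiffness and consistent load for a linear 2-node element.
        K[e:e + 2, e:e + 2] += (k / h) * np.array([[1, -1], [-1, 1]])
        f[e:e + 2] += q * h / 2.0
    # Dirichlet boundary conditions imposed by row replacement.
    K[0, :] = 0; K[0, 0] = 1; f[0] = u0
    K[-1, :] = 0; K[-1, -1] = 1; f[-1] = u1
    return np.linalg.solve(K, f)

u = solve_1d_heat(n_el=4, k=1.0, q=10.0, u0=0.0, u1=1.0)
x = np.linspace(0, 1, 5)
exact = x + 10.0 / 2.0 * x * (1 - x)     # analytic solution for constant q
print(np.max(np.abs(u - exact)))         # exact at the nodes, up to round-off
```

Even on a coarse 4-element mesh the nodal values match the analytic solution to round-off, which is the property the improved elements of the paper generalize.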
DOE Office of Scientific and Technical Information (OSTI.GOV)
Zhu, Xiangqi; Zhang, Yingchen
This paper presents an optimal voltage control methodology with coordination among different voltage-regulating resources, including controllable loads, distributed energy resources such as energy storage and photovoltaics (PV), and utility voltage-regulating devices such as voltage regulators and capacitors. The proposed methodology could effectively tackle the overvoltage and voltage regulation device distortion problems brought by high penetrations of PV to improve grid operation reliability. A voltage-load sensitivity matrix and voltage-regulator sensitivity matrix are used to deploy the resources along the feeder to achieve the control objectives. Mixed-integer nonlinear programming is used to solve the formulated optimization control problem. The methodology has been tested on the IEEE 123-feeder test system, and the results demonstrate that the proposed approach could actively tackle the voltage problem brought about by high penetrations of PV and improve the reliability of distribution system operation.
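The role of a voltage-load sensitivity matrix can be illustrated with a linearized toy feeder: since dV = S dP to first order, the injection changes needed to remove an overvoltage follow from a linear solve. The matrix and voltages below are hypothetical; the paper's mixed-integer formulation and regulator coordination are not reproduced.

```python
import numpy as np

# Hypothetical 4-node feeder: S[i, j] = dV_i / dP_j (per-unit volts per kW).
S = np.array([[0.020, 0.008, 0.006, 0.004],
              [0.008, 0.022, 0.009, 0.006],
              [0.006, 0.009, 0.024, 0.010],
              [0.004, 0.006, 0.010, 0.026]])

V = np.array([1.02, 1.04, 1.07, 1.08])   # per-unit voltages; nodes 3-4 too high
V_max = 1.05

# Voltage change needed at each node (nonzero only where the limit is violated).
dV_needed = np.minimum(V_max - V, 0.0)

# Least-squares active-power curtailment achieving that voltage change.
dP, *_ = np.linalg.lstsq(S, dV_needed, rcond=None)
V_new = V + S @ dP
print(np.round(V_new, 4))
```

In this linearized sketch the curtailment brings the violating nodes exactly to the limit while leaving compliant nodes untouched; the real problem adds discrete regulator taps, hence the MINLP in the paper.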
Topology optimization for nonlinear dynamic problems: Considerations for automotive crashworthiness
NASA Astrophysics Data System (ADS)
Kaushik, Anshul; Ramani, Anand
2014-04-01
Crashworthiness of automotive structures is most often engineered after an optimal topology has been arrived at using other design considerations. This study is an attempt to incorporate crashworthiness requirements upfront in the topology synthesis process using a mathematically consistent framework. It proposes the use of equivalent linear systems from the nonlinear dynamic simulation in conjunction with a discrete-material topology optimizer. Velocity and acceleration constraints are consistently incorporated in the optimization set-up. Issues specific to crash problems due to the explicit solution methodology employed, nature of the boundary conditions imposed on the structure, etc. are discussed and possible resolutions are proposed. A demonstration of the methodology on two-dimensional problems that address some of the structural requirements and the types of loading typical of frontal and side impact is provided in order to show that this methodology has the potential for topology synthesis incorporating crashworthiness requirements.
Fontoura, Francisca Pinheiro; Gonçalves, Cláudia Giglio de Oliveira; Willig, Mariluci Hautsch; Lüders, Debora
2018-02-19
Evaluate the effectiveness of educational interventions on hearing health developed at a hospital laundry. Quantitative assessment conducted at a hospital laundry. The study sample comprised 80 workers of both genders divided into two groups: Study Group (SG) and Control Group (CG). The educational interventions in hearing preservation were evaluated based on a theoretical approach using the Participatory Problem-based Methodology in five workshops. To assess the results of the workshops, an instrument containing 36 questions on knowledge, attitudes, and practices in hearing preservation at work was used. Questionnaires A and B were applied prior to and one month after intervention, respectively. The answers to both questionnaires were analyzed by group according to gender and schooling. Results of the pre-intervention phase showed low scores regarding knowledge about hearing health in the work setting for both groups, but significant improvement in knowledge was observed after intervention in the SG, with 77.7% of the answers presenting significant difference between the groups. There was also an improvement in the mean scores, with 35 responses (95.22%) presenting scores >4 (considered adequate). The women presented lower knowledge scores than the men; however, these differences were not observed in the SG after the workshops. Schooling was not a relevant factor in the assessment. The educational proposal grounded in the Participatory Problem-based Methodology expanded knowledge about hearing health at work among the participants.
NASA Astrophysics Data System (ADS)
Vatutin, Eduard
2017-12-01
The article addresses the problem of analyzing the effectiveness of heuristic methods with limited depth-first search for obtaining solutions to the test problem of finding the shortest path in a graph. It briefly describes a group of methods based on limiting the number of branches of the combinatorial search tree and the depth of the analyzed subtree. A methodology for comparing experimental data to estimate solution quality is considered, based on computational experiments with samples of pseudo-random graphs of selected vertex and arc counts, run on the BOINC platform. The experimental results identify the regions where the selected subset of heuristic methods is preferable, depending on problem size and constraint strength. The considered pair of methods is shown to be ineffective on this problem, yielding solutions of significantly lower quality than those provided by the ant colony optimization method and its modification with combinatorial returns.
Building an adaptive agent to monitor and repair the electrical power system of an orbital satellite
NASA Technical Reports Server (NTRS)
Tecuci, Gheorghe; Hieb, Michael R.; Dybala, Tomasz
1995-01-01
Over several years we have developed a multistrategy apprenticeship learning methodology for building knowledge-based systems. Recently we have developed and applied our methodology to building intelligent agents. This methodology allows a subject matter expert to build an agent in the same way in which the expert would teach a human apprentice. The expert will give the agent specific examples of problems and solutions, explanations of these solutions, or supervise the agent as it solves new problems. During such interactions, the agent learns general rules and concepts, continuously extending and improving its knowledge base. In this paper we present initial results on applying this methodology to build an intelligent adaptive agent for monitoring and repair of the electrical power system of an orbital satellite, stressing the interaction with the expert during apprenticeship learning.
Singh, Shantanu; Prakash, Jyoti; Das, R. C.; Srivastava, Kalpana
2016-01-01
Background: Medical students undergo significant stress during training, which may lead to their own suffering or to problems in patient care. High levels of burnout and depression are also not uncommon. The transition from preclinical to clinical training has been regarded as a crucial period for students in relation to stress. Methodology: An assessment of perceived stress and its relation to general psychopathology, the pattern of coping, and burnout in final-year medical students was done to bring out the clear nature, pattern, and extent of the problem. Results: Perceived stress had a statistically significant association with general psychopathology and with the depressive-anxiety component of burnout. Acceptance, positive reframing, humor, planning, and active coping correlated with lower scores on perceived stress. Conclusion: Higher scores on perceived stress were associated with higher scores on general psychopathology and burnout. Age at joining the MBBS course and having a doctor in the family did not affect stress significantly. Students who displayed positive coping strategies had less stress and general psychopathology. PMID:28659697
Reliability analysis of composite structures
NASA Technical Reports Server (NTRS)
Kan, Han-Pin
1992-01-01
A probabilistic static stress analysis methodology has been developed to estimate the reliability of a composite structure. Closed form stress analysis methods are the primary analytical tools used in this methodology. These structural mechanics methods are used to identify independent variables whose variations significantly affect the performance of the structure. Once these variables are identified, scatter in their values is evaluated and statistically characterized. The scatter in applied loads and the structural parameters are then fitted to appropriate probabilistic distribution functions. Numerical integration techniques are applied to compute the structural reliability. The predicted reliability accounts for scatter due to variability in material strength, applied load, fabrication and assembly processes. The influence of structural geometry and mode of failure are also considerations in the evaluation. Example problems are given to illustrate various levels of analytical complexity.
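The numerical-integration step, computing reliability from the scatter in strength and applied load, is often approximated by Monte Carlo sampling. A minimal stress-strength sketch follows, with illustrative distributions that are not taken from the report.

```python
import numpy as np

def reliability(n_samples=200_000, seed=1):
    """Estimate P(strength > stress) when both quantities scatter."""
    rng = np.random.default_rng(seed)
    # Illustrative distributions (hypothetical, same units of MPa):
    # lognormal material strength, normal applied stress.
    strength = rng.lognormal(mean=np.log(500), sigma=0.08, size=n_samples)
    stress = rng.normal(loc=350, scale=40, size=n_samples)
    return np.mean(strength > stress)

R = reliability()
print(R)
```

The same sampling loop extends naturally to the additional scatter sources the abstract lists (fabrication, assembly, geometry) by drawing each from its fitted distribution.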
Critical Thinking and the Use of Nontraditional Instructional Methodologies.
Orique, Sabrina B; McCarthy, Mary Ann
2015-08-01
The purpose of this study was to examine the relationship between critical thinking and the use of concept mapping (CM) and problem-based learning (PBL) during care plan development. A quasi-experimental study with a pretest-posttest design was conducted using a convenience sample (n = 49) of first-semester undergraduate baccalaureate nursing students. Critical thinking was measured using the Holistic Critical Thinking Scoring Rubric. Data analysis consisted of a repeated measures analysis of variance with post hoc mean comparison tests using the Bonferroni method. Findings indicated that mean critical thinking at phase 4 (CM and PBL) was significantly higher, compared with phase 1 (baseline), phase 2 (PBL), and phase 3 (CM [p < 0.001]). The results support the utilization of nontraditional instructional (CM and PBL) methodologies in undergraduate nursing curricula. Copyright 2015, SLACK Incorporated.
[Production chain supply management for public hospitals: a logistical approach to healthcare].
Infante, Maria; dos Santos, Maria Angélica Borges
2007-01-01
Despite their importance for hospital operations, discussions of healthcare organization logistics and supply and materials management are notably lacking in Brazilian literature. This paper describes a methodology for organizing the supply of medical materials in public hospitals, based on an action-research approach. Interventions were based on the assumption that a significant portion of problems in Brazil's National Health System (SUS) facilities derive from the fact that their clinical and administrative departments do not see themselves as belonging to the same production chain - neither the hospital nor the supply department is aware of what the other produces. The development of the methodology and its main steps are presented and discussed, against a background of recent literature and total quality and supply chain management concepts.
Tracking problem solving by multivariate pattern analysis and Hidden Markov Model algorithms.
Anderson, John R
2012-03-01
Multivariate pattern analysis can be combined with Hidden Markov Model algorithms to track the second-by-second thinking as people solve complex problems. Two applications of this methodology are illustrated with a data set taken from children as they interacted with an intelligent tutoring system for algebra. The first "mind reading" application involves using fMRI activity to track what students are doing as they solve a sequence of algebra problems. The methodology achieves considerable accuracy at determining both what problem-solving step the students are taking and whether they are performing that step correctly. The second "model discovery" application involves using statistical model evaluation to determine how many substates are involved in performing a step of algebraic problem solving. This research indicates that different steps involve different numbers of substates and these substates are associated with different fluency in algebra problem solving. Copyright © 2011 Elsevier Ltd. All rights reserved.
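The HMM side of this methodology rests on standard decoding algorithms such as Viterbi, which recovers the most likely sequence of hidden (here, cognitive) states from observations. A minimal log-space implementation with hypothetical two-state parameters:

```python
import numpy as np

def viterbi(obs, pi, A, B):
    """Most likely hidden-state path for an HMM (log-space Viterbi)."""
    n_states = len(pi)
    T = len(obs)
    logd = np.log(pi) + np.log(B[:, obs[0]])           # initialization
    back = np.zeros((T, n_states), dtype=int)
    for t in range(1, T):
        scores = logd[:, None] + np.log(A)             # scores[i, j]: i -> j
        back[t] = np.argmax(scores, axis=0)            # best predecessor of j
        logd = scores[back[t], np.arange(n_states)] + np.log(B[:, obs[t]])
    path = [int(np.argmax(logd))]
    for t in range(T - 1, 0, -1):                      # backtrack
        path.append(int(back[t, path[-1]]))
    return path[::-1]

# Two hypothetical cognitive states emitting 2 observable codes.
pi = np.array([0.6, 0.4])
A = np.array([[0.8, 0.2], [0.3, 0.7]])                 # transition probabilities
B = np.array([[0.9, 0.1], [0.2, 0.8]])                 # emission probabilities
print(viterbi([0, 0, 1, 1, 1], pi, A, B))              # → [0, 0, 1, 1, 1]
```

In the paper's setting the emissions are multivariate fMRI patterns rather than discrete codes, but the decoding machinery is the same.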
Creativity and psychopathology: a systematic review.
Thys, Erik; Sabbe, Bernard; De Hert, Marc
2014-01-01
The possible link between creativity and psychopathology has been a long-time focus of research up to the present day. However, the research results in this field are heterogeneous and contradictory. Links between creativity and specific psychiatric disorders have been confirmed and refuted in different studies. This disparity is partly explained by the methodological challenges peculiar to this field. In this systematic review of the literature from 1950, research articles in the field of creativity and psychopathology are presented, focusing on the methodology and results of the collected studies. This review confirms the methodological problems and the heterogeneity of the study designs and results. The assessment of psychopathology, but more so of creativity, remains a fundamental challenge. On the whole, study results cautiously confirm an association between creativity and both bipolar disorder and schizotypy. The research on creativity and psychopathology is hampered by serious methodological problems. Study results are to be interpreted with caution and future research needs more methodological rigor. © 2014 S. Karger AG, Basel.
Analyzing Problem's Difficulty Based on Neural Networks and Knowledge Map
ERIC Educational Resources Information Center
Kuo, Rita; Lien, Wei-Peng; Chang, Maiga; Heh, Jia-Sheng
2004-01-01
This paper proposes a methodology to calculate both the difficulty of basic problems and the difficulty of solving a problem. The difficulty of a problem is calculated according to the process of constructing it, including Concept Selection, Unknown Designation, and Proposition Construction. Some necessary measures observed…
Root Source Analysis/ValuStream[Trade Mark] - A Methodology for Identifying and Managing Risks
NASA Technical Reports Server (NTRS)
Brown, Richard Lee
2008-01-01
Root Source Analysis (RoSA) is a systems engineering methodology that has been developed at NASA over the past five years. It is designed to reduce costs, schedule, and technical risks by systematically examining critical assumptions and the state of the knowledge needed to bring to fruition the products that satisfy mission-driven requirements, as defined for each element of the Work (or Product) Breakdown Structure (WBS or PBS). This methodology is sometimes referred to as the ValuStream method, as inherent in the process is the linking and prioritizing of uncertainties arising from knowledge shortfalls directly to the customer's mission-driven requirements. RoSA and ValuStream are synonymous terms. RoSA is not simply an alternate or improved method for identifying risks. It represents a paradigm shift. The emphasis is placed on identifying very specific knowledge shortfalls and assumptions that are the root sources of the risk (the why), rather than on assessing the WBS product(s) themselves (the what). In so doing, RoSA looks forward to anticipate, identify, and prioritize knowledge shortfalls and assumptions that are likely to create significant uncertainties/risks (as compared to Root Cause Analysis, which is most often used to look back to discover what was not known, or was assumed, that caused the failure). Experience indicates that RoSA, with its primary focus on assumptions and the state of the underlying knowledge needed to define, design, build, verify, and operate the products, can identify critical risks that historically have been missed by the usual approaches (i.e., the design review process and classical risk identification methods). Further, the methodology answers four critical questions for decision makers and risk managers: 1. What's been included? 2. What's been left out? 3. How has it been validated? 4. Has the real source of the uncertainty/risk been identified, i.e., is the perceived problem the real problem?
Users of the RoSA methodology have characterized it as a true bottom-up risk assessment.
Evaluation in health: participatory methodology and involvement of municipal managers
de Almeida, Cristiane Andrea Locatelli; Tanaka, Oswaldo Yoshimi
2016-01-01
ABSTRACT OBJECTIVE To analyze the scope and limits of the use of a participatory evaluation methodology with municipal health managers and administrators. METHODS Qualitative research with health policymakers and managers of the Comissão Intergestores Regional (CIR – Regional Interagency Commission) of a health region of the state of Sao Paulo in Brazil. Representatives from seven member cities participated in seven workshops facilitated by the researchers, with the aim of assessing a specific problem of the care line, which would be used as a tracer of the system's integrality. The analysis of the collected empirical material was based on the hermeneutic-dialectic methodology and aimed at evaluating the applied participatory methodology according to its capacity to promote an assessment process that can be used to support municipal management. RESULTS With the participatory approach to evaluation, we were able to promote in-depth discussions with the group, especially related to the construction of integral care and to the inclusion of the user's perspective in decision-making, linked to the search for solutions to concrete problems of managers. Joint exploration opened up the possibility of using data from electronic information systems, as well as information coming directly from the users of the services, to enhance discussions and negotiations between partners. The participants doubted that this type of evaluation could be replicated without direct monitoring by academia, given the difficulty of organizing the process amid everyday routines already consumed by emergencies and political issues. CONCLUSIONS Evaluations of programs and services carried out within the Regional Interagency Commission, starting from local interest and facilitating the involvement of its members through the use of participatory methodologies, can contribute to the construction of integral care.
To the extent that the act of evaluating remains invested with greater significance for the local actors, their involvement with evaluations at the federal level can also be stimulated. PMID:27509011
Fernandez-Lozano, Carlos; Gestal, Marcos; Munteanu, Cristian R; Dorado, Julian; Pazos, Alejandro
2016-01-01
The design of experiments and the validation of the results achieved with them are vital in any research study. This paper focuses on the use of different Machine Learning approaches for regression tasks in the field of Computational Intelligence, and especially on the correct comparison of the results provided by different methods, as those techniques are complex systems that require further study to be fully understood. A methodology commonly accepted in Computational Intelligence is implemented in an R package called RRegrs. This package includes ten simple and complex regression models to carry out predictive modeling using Machine Learning and well-known regression algorithms. The framework for experimental design presented herein is evaluated and validated against RRegrs. Our results differ for three out of five state-of-the-art simple datasets, and the selection of the best model according to our proposal is statistically significant and relevant. It is important to use a statistical approach to indicate whether the differences between this kind of algorithms are statistically significant. Furthermore, our results with three real complex datasets report different best models than the previously published methodology. Our final goal is to provide a complete methodology covering the different steps needed to compare the results obtained in Computational Intelligence problems, as well as in other fields such as bioinformatics and cheminformatics, given that our proposal is open and modifiable.
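The fold-wise statistical comparison such a methodology advocates can be sketched language-agnostically (RRegrs itself is an R package; the Python below is an illustration with hypothetical models): cross-validate each model on the same folds, then test the paired per-fold differences.

```python
import numpy as np
from math import sqrt

def kfold_rmse(model_fit, X, y, k=5):
    """Per-fold RMSE of a model under k-fold cross-validation."""
    idx = np.arange(len(y))
    errs = []
    for fold in np.array_split(idx, k):
        tr = np.setdiff1d(idx, fold)
        predict = model_fit(X[tr], y[tr])
        errs.append(sqrt(np.mean((predict(X[fold]) - y[fold]) ** 2)))
    return np.array(errs)

def fit_mean(X, y):            # baseline: always predict the training mean
    m = y.mean()
    return lambda Xq: np.full(len(Xq), m)

def fit_linear(X, y):          # ordinary least squares with intercept
    A = np.c_[np.ones(len(y)), X]
    beta, *_ = np.linalg.lstsq(A, y, rcond=None)
    return lambda Xq: np.c_[np.ones(len(Xq)), Xq] @ beta

rng = np.random.default_rng(7)
X = rng.standard_normal((200, 3))
y = X @ np.array([1.0, -2.0, 0.5]) + 0.3 * rng.standard_normal(200)

e1 = kfold_rmse(fit_mean, X, y)
e2 = kfold_rmse(fit_linear, X, y)
d = e1 - e2
# Paired t statistic over the folds: large values mean the linear model wins.
t = d.mean() / (d.std(ddof=1) / sqrt(len(d)))
print(round(t, 2))
```

Pairing by fold removes fold-to-fold variation from the comparison, which is why the paired statistic (rather than comparing two overall means) is the appropriate test here.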
Innovation design of medical equipment based on TRIZ.
Gao, Changqing; Guo, Leiming; Gao, Fenglan; Yang, Bo
2015-01-01
Medical equipment is closely related to personal health and safety, which makes it a concern for equipment users, and competition among medical equipment manufacturers is intense. Innovative design is the key to success for these enterprises. The design of medical equipment usually spans vastly different domains of knowledge, so applying modern design methodology to medical equipment and technology invention is an urgent requirement. TRIZ (a Russian abbreviation translatable as 'theory of inventive problem solving') originated in Russia and comprises problem-solving methods developed from worldwide patent analysis, including the Conflict Matrix, Substance-Field Analysis, Standard Solutions, and Effects. As an engineering example, an infusion system is analyzed and redesigned using TRIZ, generating an innovative idea that frees the caretaker from having to watch the infusion bag. The research in this paper shows the process of applying TRIZ to medical device inventions and demonstrates that TRIZ is an inventive problem-solving methodology that can be used widely in medical device development.
A risk assessment methodology for critical transportation infrastructure.
DOT National Transportation Integrated Search
2002-01-01
Infrastructure protection typifies a problem of risk assessment and management in a large-scale system. This study offers a methodological framework to identify, prioritize, assess, and manage risks. It includes the following major considerations: (1...
Methodology for nonwork travel analysis in suburban communities.
DOT National Transportation Integrated Search
1994-01-01
The increase in the number of nonwork trips during the past decade has contributed substantially to congestion and to environmental problems. Data collection methodologies, descriptive information, and reliable models of nonwork travel behavior are n...
ERIC Educational Resources Information Center
Jakovljevic, Maria; Ankiewicz, Piet; De swardt, Estelle; Gross, Elna
2004-01-01
Traditional instructional methodology in the Information System Design (ISD) environment lacks explicit strategies for promoting the cognitive skills of prospective system designers. This contributes to the fragmented knowledge and low motivational and creative involvement of learners in system design tasks. In addition, present ISD methodologies,…
2017-06-15
the methodology of reducing the online algorithm-selection problem to a contextual bandit problem, which is yet another interactive learning…
[KH2016a] Kuan-Hao Huang and Hsuan-Tien Lin. Linear upper confidence bound algorithm for contextual bandit problem with piled rewards. In Proceedings…
Decomposition of timed automata for solving scheduling problems
NASA Astrophysics Data System (ADS)
Nishi, Tatsushi; Wakatake, Masato
2014-03-01
A decomposition algorithm for scheduling problems based on a timed automata (TA) model is proposed. The problem is represented as an optimal state transition problem for TA. The model comprises the parallel composition of submodels such as jobs and resources. The procedure of the proposed methodology can be divided into two steps. The first step is to decompose the TA model into several submodels using a decomposability condition. The second step is to combine the individual solutions of the subproblems for the decomposed submodels by the penalty function method. A feasible solution for the entire model is derived through iterated computation, solving the subproblem for each submodel. The proposed methodology is applied to solve flowshop and jobshop scheduling problems. Computational experiments demonstrate the effectiveness of the proposed algorithm compared with a conventional TA scheduling algorithm without decomposition.
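The penalty-function coordination of subproblem solutions can be illustrated with a deliberately tiny toy: two quadratic subproblems coupled by a consistency constraint, solved alternately under an increasing penalty weight. The timed-automata machinery itself is not reproduced; the toy only shows the coordination pattern.

```python
def solve_decomposed(rho_schedule=(1, 10, 100, 1000), inner=2000):
    """Coordinate two subproblems coupled by x == y via a quadratic penalty.

    Subproblem 1: min (x-1)^2, subproblem 2: min (y-3)^2 (toy stand-ins for
    the decomposed submodels). The true consensus solution is x = y = 2.
    """
    x, y = 0.0, 0.0
    for rho in rho_schedule:
        # Alternating exact minimization of (x-1)^2 + (y-3)^2 + rho*(x-y)^2.
        for _ in range(inner):
            x = (1 + rho * y) / (1 + rho)   # exact minimum over x, y fixed
            y = (3 + rho * x) / (1 + rho)   # exact minimum over y, x fixed
    return x, y

x, y = solve_decomposed()
print(round(x, 3), round(y, 3))   # both approach the consensus value 2
```

Gradually increasing the penalty weight, rather than starting with a huge one, keeps each subproblem well-conditioned; the same trade-off appears when penalizing coupling constraints between TA submodels.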
Ager, Alastair; Bancroft, Carolyn; Berger, Elizabeth; Stark, Lindsay
2018-01-01
Gender-based violence (GBV) is a significant problem in conflict-affected settings. Understanding local constructions of such violence is crucial to developing preventive and responsive interventions to address this issue. This study reports on a secondary analysis of archived data collected as part of formative qualitative work - using a group participatory ranking methodology (PRM) - informing research on the prevalence of GBV amongst IDPs in northern Uganda in 2006. Sixty-four PRM group discussions were held with women, with men, with girls (aged 14 to 18 years), and with boys (aged 14 to 18 years) selected on a randomized basis across four internally displaced persons (IDP) camps in Lira District. Discussions elicited problems facing women in the camps, and - through structured participatory methods - consensus ranking of their importance and narrative accounts explaining these judgments. Amongst forms of GBV faced by women, rape was ranked as the greatest concern amongst participants (with a mean problem rank of 3.4), followed by marital rape (mean problem rank of 4.5) and intimate partner violence (mean problem rank of 4.9). Girls ranked all forms of GBV as higher priority concerns than other participants. Discussions indicated that these forms of GBV were generally considered normalized within the camp. Gender roles and power, economic deprivation, and physical and social characteristics of the camp setting emerged as key explanatory factors in accounts of GBV prevalence, although these played out in different ways with respect to differing forms of violence. All groups acknowledged GBV to represent a significant threat - among other major concerns such as transportation, water, shelter, food and security - for women residing in the camps. Given evidence of the significantly higher risk in the camp of intimate partner violence and marital rape, the relative prominence of the issue of rape in all rankings suggests normalization of violence within the home. 
Programs targeting reduction in GBV need to address community-identified root causes such as economic deprivation and social norms related to gender roles. More generally, PRM appears to offer an efficient means of identifying local constructions of prevailing challenges in a manner that can inform programming.
Cost-benefit analysis of space technology
NASA Technical Reports Server (NTRS)
Hein, G. F.; Stevenson, S. M.; Sivo, J. N.
1976-01-01
A discussion of the implications and problems associated with the use of cost-benefit techniques is presented. Knowledge of these problems is useful in structuring a decision-making process. A methodology of cost-benefit analysis is presented for the evaluation of space technology. The use of the methodology is demonstrated with an evaluation of ion thrusters for north-south stationkeeping aboard geosynchronous communication satellites. A critique of the concept of consumer surplus for measuring benefits is also presented.
ERIC Educational Resources Information Center
Lee, Chwee Beng; Ling, Keck Voon; Reimann, Peter; Diponegoro, Yudho Ahmad; Koh, Chia Heng; Chew, Derwin
2014-01-01
Purpose: The purpose of this paper is to argue for the need to develop pre-service teachers' problem solving ability, in particular, in the context of real-world complex problems. Design/methodology/approach: To argue for the need to develop pre-service teachers' problem solving skills, the authors describe a web-based problem representation…
Using the CPGI to Determine Problem Gambling Prevalence in Australia: Measurement Issues
ERIC Educational Resources Information Center
Jackson, Alun C.; Wynne, Harold; Dowling, Nicki A.; Tomnay, Jane E.; Thomas, Shane A.
2010-01-01
Most states and territories in Australia have adopted the Problem Gambling Severity Index (PGSI) of the Canadian Problem Gambling Index as the standard measure of problem gambling in their prevalence studies and research programs. However, notwithstanding this attempted standardisation, differences in sampling and recruitment methodologies and in…
Using Problem-Based Learning to Enhance Team and Player Development in Youth Soccer
ERIC Educational Resources Information Center
Hubball, Harry; Robertson, Scott
2004-01-01
Problem-based learning (PBL) is a coaching and teaching methodology that develops knowledge, abilities, and skills. It also encourages participation, collaborative investigation, and the resolution of authentic, "ill-structured" problems through the use of problem definition, teamwork, communication, data collection, decision-making,…
NASA Astrophysics Data System (ADS)
Omoragbon, Amen
Although the Aerospace and Defense (A&D) industry is a significant contributor to the United States' economy, national prestige, and national security, it experiences significant cost and schedule overruns. This problem is related to the differences between technology acquisition assessments and aerospace vehicle conceptual design. Acquisition assessments evaluate broad sets of alternatives with mostly qualitative techniques, while conceptual design tools evaluate narrow sets of alternatives with multidisciplinary tools. In order for these two fields to communicate effectively, a common platform for both concerns is desired. This research is an original contribution to a three-part solution to this problem. It discusses the decomposition step of an innovative technology and sizing tool generation framework. It identifies complex multidisciplinary system definitions as a bridge between acquisition and conceptual design. It establishes complex multidisciplinary building blocks that can be used to build synthesis systems as well as technology portfolios. It also describes a graphical user interface designed to aid in the decomposition process. Finally, it demonstrates an application of the methodology to a relevant acquisition and conceptual design problem posed by the US Air Force.
A methodology to find the elementary landscape decomposition of combinatorial optimization problems.
Chicano, Francisco; Whitley, L Darrell; Alba, Enrique
2011-01-01
A small number of combinatorial optimization problems have search spaces that correspond to elementary landscapes, where the objective function f is an eigenfunction of the Laplacian that describes the neighborhood structure of the search space. Many problems are not elementary; however, the objective function of a combinatorial optimization problem can always be expressed as a superposition of multiple elementary landscapes if the underlying neighborhood used is symmetric. This paper presents theoretical results that provide the foundation for algebraic methods that can be used to decompose the objective function of an arbitrary combinatorial optimization problem into a sum of subfunctions, where each subfunction is an elementary landscape. Many steps of this process can be automated, and indeed a software tool could be developed that assists the researcher in finding a landscape decomposition. This methodology is then used to show that the subset sum problem is a superposition of two elementary landscapes, and to show that the quadratic assignment problem is a superposition of three elementary landscapes.
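The wave-equation characterization of an elementary landscape can be checked by brute force on a tiny instance. The sketch below is a toy illustration, not the paper's algebraic decomposition machinery: it verifies that onemax under the single-bit-flip neighborhood satisfies Grover's condition (the neighborhood average of f equals f(x) + (k/d)(f̄ − f(x)), with k = 2 and d = n), while its square, being a superposition of elementary landscapes, does not.

```python
from itertools import product

def onemax(x):
    """Count of ones in a bit string."""
    return sum(x)

def neighbors(x):
    """Hamming-distance-1 neighborhood (single bit flip)."""
    for i in range(len(x)):
        y = list(x)
        y[i] ^= 1
        yield tuple(y)

def is_elementary(f, n, k):
    """Check Grover's wave equation over all length-n bit strings:
    avg over N(x) of f equals f(x) + (k/d)*(f_bar - f(x)), with d = |N(x)| = n."""
    d = n
    space = list(product((0, 1), repeat=n))
    f_bar = sum(f(x) for x in space) / len(space)
    for x in space:
        avg = sum(f(y) for y in neighbors(x)) / d
        if abs(avg - (f(x) + (k / d) * (f_bar - f(x)))) > 1e-9:
            return False
    return True

# onemax is an elementary landscape (k = 2); its square is a superposition and is not
assert is_elementary(onemax, 4, 2)
assert not is_elementary(lambda x: sum(x) ** 2, 4, 2)
```

The same brute-force check generalizes to any symmetric neighborhood, though of course the paper's point is that the decomposition can be found algebraically without enumerating the search space.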
On the generalized VIP time integral methodology for transient thermal problems
NASA Technical Reports Server (NTRS)
Mei, Youping; Chen, Xiaoqin; Tamma, Kumar K.; Sha, Desong
1993-01-01
The paper describes the development and applicability of a generalized VIrtual-Pulse (VIP) time integral method of computation for thermal problems. Unlike past approaches for general heat transfer computations, and motivated by the advent of high-speed computing technology and the importance of parallel computation for the efficient use of computing environments, the developments described in this paper address the need for explicit computational procedures with improved accuracy and stability characteristics. As a consequence, a new and effective VIP methodology is described which inherits these improved characteristics. Illustrative numerical examples are provided to demonstrate the developments and validate the results obtained for thermal problems.
A robust optimization methodology for preliminary aircraft design
NASA Astrophysics Data System (ADS)
Prigent, S.; Maréchal, P.; Rondepierre, A.; Druot, T.; Belleville, M.
2016-05-01
This article focuses on a robust optimization of an aircraft preliminary design under operational constraints. According to engineers' know-how, the aircraft preliminary design problem can be modelled as an uncertain optimization problem whose objective (the cost or the fuel consumption) is almost affine, and whose constraints are convex. It is shown that this uncertain optimization problem can be approximated in a conservative manner by an uncertain linear optimization program, which enables the use of the techniques of robust linear programming of Ben-Tal, El Ghaoui, and Nemirovski [Robust Optimization, Princeton University Press, 2009]. This methodology is then applied to two real cases of aircraft design and numerical results are presented.
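The conservative linearization the authors rely on can be illustrated with the standard robust counterpart of a linear constraint under box (interval) uncertainty in its coefficients. The function below is a generic sketch of that counterpart; the names and numbers are illustrative, not taken from the article:

```python
def robust_feasible(x, a_nom, a_dev, b):
    """Worst-case feasibility of a.x <= b when each coefficient a_i may
    deviate from its nominal value a_nom_i by up to a_dev_i (box uncertainty).
    Standard robust counterpart: sum_i (a_nom_i * x_i + a_dev_i * |x_i|) <= b."""
    worst = sum(an * xi + ad * abs(xi) for an, ad, xi in zip(a_nom, a_dev, x))
    return worst <= b

# a design that is feasible nominally (1*1 + 1*2 = 3.0 <= 3.2) can still
# fail once worst-case coefficient deviations are accounted for (3.3 > 3.2)
assert robust_feasible([1.0, 2.0], [1.0, 1.0], [0.1, 0.1], 4.0)
assert not robust_feasible([1.0, 2.0], [1.0, 1.0], [0.1, 0.1], 3.2)
```

Because the worst case of an interval-uncertain linear constraint is itself expressible with linear terms (via the absolute values), the robust problem stays a linear program, which is what enables the Ben-Tal/El Ghaoui/Nemirovski machinery cited above.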
DESIGNING PROCESSES FOR ENVIRONMENTAL PROBLEMS
Designing for the environment requires consideration of environmental impacts. The Generalized WAR Algorithm is the methodology that allows the user to evaluate the potential environmental impact of the design of a chemical process. In this methodology, chemicals are assigned val...
Using soft systems methodology to develop a simulation of out-patient services.
Lehaney, B; Paul, R J
1994-10-01
Discrete event simulation is an approach to modelling a system in the form of a set of mathematical equations and logical relationships, usually used for complex problems, which are difficult to address by using analytical or numerical methods. Managing out-patient services is such a problem. However, simulation is not in itself a systemic approach, in that it provides no methodology by which system boundaries and system activities may be identified. The investigation considers the use of soft systems methodology as an aid to drawing system boundaries and identifying system activities, for the purpose of simulating the outpatients' department at a local hospital. The long term aims are to examine the effects that the participative nature of soft systems methodology has on the acceptability of the simulation model, and to provide analysts and managers with a process that may assist in planning strategies for health care.
Brain Dynamics: Methodological Issues and Applications in Psychiatric and Neurologic Diseases
NASA Astrophysics Data System (ADS)
Pezard, Laurent
The human brain is a complex dynamical system generating the EEG signal. Numerical methods developed to study complex physical dynamics have been used to characterize EEG since the mid-eighties. This endeavor raised several issues related to the specificity of EEG. First, theoretical and methodological studies should address the major differences between the dynamics of the human brain and those of physical systems. Second, this approach to the EEG signal should prove relevant for dealing with physiological or clinical problems. A set of studies performed in our group is presented here within the context of these two problematic aspects. After a discussion of methodological drawbacks, we review numerical simulations related to the high dimension and spatial extension of brain dynamics. Experimental studies in neurologic and psychiatric diseases are then presented. We conclude that although it is now clear that brain dynamics change in relation to clinical situations, methodological problems remain largely unsolved.
Improving engineering effectiveness
NASA Technical Reports Server (NTRS)
Fiero, J. D.
1985-01-01
Methodologies to improve engineering productivity were investigated. The rocky road to improving engineering effectiveness is reviewed using a specific semiconductor engineering organization as a case study. The organization had a performance problem regarding new product introductions. With the help of a consultant acting as a change agent, the engineering team used a systems approach to work through the variables that were significantly affecting their output. Critical factors for improving this engineering organization's effectiveness, and the roles and responsibilities of management, the individual engineers, and the internal consultant, are discussed.
Accountancy, teaching methods, sex, and American College Test scores.
Heritage, J; Harper, B S; Harper, J P
1990-10-01
This study examines the significance of sex, methodology, academic preparation, and age as related to development of judgmental and problem-solving skills. Sex, American College Test (ACT) Mathematics scores, Composite ACT scores, grades in course work, grade point average (GPA), and age were used in studying the effects of teaching method on 96 students' ability to analyze data in financial statements. Results reflect positively on accounting students compared to the general college population and the women students in particular.
NASA Astrophysics Data System (ADS)
Dehbozorgi, Mohammad Reza
2000-10-01
Improvements in power system reliability have always been of interest to both power companies and customers. Since there are no sizable electrical energy storage elements in electrical power systems, the generated power should match the load demand at any given time. Failure to meet this balance may cause severe system problems, including loss of generation and system blackouts. This thesis proposes a methodology which can respond to either loss of generation or loss of load. It is based on switching of electric water heaters using power system frequency as the controlling signal. The proposed methodology raises several associated problems, which the thesis addresses: the controller must be interfaced with the existing thermostat control; when loads must be switched on, the water in the tank should not be overheated; and rapid switching of blocks of load, or chattering, must be considered. The contributions of the thesis are: (A) A system has been proposed which makes a significant portion of the distributed loads connected to a power system behave in a predetermined manner to improve the power system response during disturbances. (B) The action of the proposed system is transparent to the customers. (C) The thesis proposes a simple analysis for determining the amount of such loads which might be switched and relates this amount to the size of the disturbances which can occur in the utility. (D) The proposed system acts without any formal communication links, relying solely on the embedded information present system-wide. (E) The methodology of the thesis proposes switching of water heater loads based on a simple, localized frequency set-point controller. The thesis has identified the consequent problem of rapid switching of distributed loads, which is referred to as chattering. (F) Two approaches have been proposed to reduce chattering to tolerable levels.
(G) A frequency controller has been designed and built according to the specifications required to switch electric water heater loads in response to power system disturbances. (H) A cost analysis for building and installing the distributed frequency controller has been carried out. (I) The proposed equipment and methodology has been implemented and tested successfully. (Abstract shortened by UMI.)
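One way to picture a localized frequency set-point controller and its anti-chattering requirement is a hysteresis band: the heater sheds load when frequency falls below a trip threshold and restores it only after frequency recovers past a higher reset threshold. The class below is a hypothetical sketch; the 50 Hz nominal and the offset values are placeholders, not the thesis's numbers.

```python
class WaterHeaterFrequencyController:
    """Localized frequency set-point switch with a hysteresis band.
    All set-point values here are illustrative placeholders."""

    def __init__(self, f_nominal=50.0, trip_offset=0.2, reset_offset=0.1):
        self.f_trip = f_nominal - trip_offset    # shed heater load below this
        self.f_reset = f_nominal - reset_offset  # restore only above this
        self.heater_on = True

    def update(self, f_measured):
        if self.heater_on and f_measured < self.f_trip:
            self.heater_on = False   # under-frequency event: shed load
        elif not self.heater_on and f_measured > self.f_reset:
            self.heater_on = True    # frequency has recovered: restore load
        return self.heater_on

ctrl = WaterHeaterFrequencyController()
states = [ctrl.update(f) for f in (50.0, 49.7, 49.85, 49.95)]
# the 49.85 Hz sample sits inside the hysteresis band, so the heater does not
# switch back on prematurely -- this gap is what suppresses chattering
```

With a single threshold instead of a band, a frequency hovering near the set-point would toggle the load on every sample, which is exactly the chattering problem the thesis identifies.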
NASA Astrophysics Data System (ADS)
Nebot, Àngela; Mugica, Francisco
2012-10-01
Fuzzy inductive reasoning (FIR) is a modelling and simulation methodology derived from the General Systems Problem Solver. It compares favourably with other soft computing methodologies, such as neural networks, genetic or neuro-fuzzy systems, and with hard computing methodologies, such as AR, ARIMA, or NARMAX, when it is used to predict future behaviour of different kinds of systems. This paper contains an overview of the FIR methodology, its historical background, and its evolution.
Majumdar, Angshul; Gogna, Anupriya; Ward, Rabab
2014-08-25
We address the problem of acquiring and transmitting EEG signals in Wireless Body Area Networks (WBAN) in an energy efficient fashion. In WBANs, the energy is consumed by three operations: sensing (sampling), processing and transmission. Previous studies only addressed the problem of reducing the transmission energy. For the first time, in this work, we propose a technique to reduce sensing and processing energy as well: this is achieved by randomly under-sampling the EEG signal. We depart from previous Compressed Sensing based approaches and formulate signal recovery (from under-sampled measurements) as a matrix completion problem. A new algorithm to solve the matrix completion problem is derived here. We test our proposed method and find that the reconstruction accuracy of our method is significantly better than state-of-the-art techniques; and we achieve this while saving sensing, processing and transmission energy. Simple power analysis shows that our proposed methodology consumes considerably less power compared to previous CS based techniques.
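The intuition behind recovering an under-sampled signal via matrix completion is that low rank makes missing entries redundant. The paper's recovery algorithm is not reproduced here; as a minimal illustration under an exact rank-1 assumption, every missing entry is determined by products and ratios of observed ones:

```python
def rank1_complete(M, observed):
    """Fill missing entries of an exactly rank-1 matrix from observed ones.
    For rank-1 data M[i][j] = u[i]*v[j], the identity
    M[i][j] = M[i][l] * M[k][j] / M[k][l] holds whenever the three entries
    on the right are observed -- the low-rank structure fixes the missing value."""
    n, m = len(M), len(M[0])
    out = [row[:] for row in M]
    for i in range(n):
        for j in range(m):
            if not observed[i][j]:
                for k in range(n):
                    for l in range(m):
                        if (k != i and l != j and observed[i][l]
                                and observed[k][j] and observed[k][l]):
                            out[i][j] = M[i][l] * M[k][j] / M[k][l]
                            break
                    else:
                        continue
                    break
    return out

# exact rank-1 data: M[i][j] = u[i] * v[j], with the diagonal "unmeasured"
u, v = [1.0, 2.0, 3.0], [4.0, 5.0, 6.0]
M = [[u[i] * v[j] if i != j else 0.0 for j in range(3)] for i in range(3)]
observed = [[i != j for j in range(3)] for i in range(3)]
R = rank1_complete(M, observed)
```

Real EEG matrices are only approximately low rank and the observations are noisy, so practical methods (including the paper's) solve an optimization problem rather than apply this closed-form identity; the sketch only shows why under-sampling does not destroy the information.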
OPUS: Optimal Projection for Uncertain Systems. Volume 1
1991-09-01
unified control-design methodology that directly addresses these technology issues. In particular, optimal projection theory addresses the need for... effects, and limited identification accuracy in a 1-g environment. The principal contribution of OPUS is a unified design methodology that... characterizing solutions to constrained control-design problems. Transforming OPUS into a practical design methodology requires the development of
Suggested criteria for evaluating systems engineering methodologies
NASA Technical Reports Server (NTRS)
Gates, Audrey; Paul, Arthur S.; Gill, Tepper L.
1989-01-01
Systems engineering is the application of mathematical and scientific principles to practical ends in the life-cycle of a system. A methodology for systems engineering is a carefully developed, relatively complex procedure or process for applying these mathematical and scientific principles. There are many systems engineering methodologies (or possibly many versions of a few methodologies) currently in use in government and industry. These methodologies are usually designed to meet the needs of a particular organization. It has been observed, however, that many technical and non-technical problems arise when inadequate systems engineering methodologies are applied by organizations to their systems development projects. Various criteria for evaluating systems engineering methodologies are discussed. Such criteria are developed to assist methodology-users in identifying and selecting methodologies that best fit the needs of the organization.
Fuzzy Linear Programming and its Application in Home Textile Firm
NASA Astrophysics Data System (ADS)
Vasant, P.; Ganesan, T.; Elamvazuthi, I.
2011-06-01
In this paper, new fuzzy linear programming (FLP) based methodology using a specific membership function, named as modified logistic membership function is proposed. The modified logistic membership function is first formulated and its flexibility in taking up vagueness in parameter is established by an analytical approach. The developed methodology of FLP has provided a confidence in applying to real life industrial production planning problem. This approach of solving industrial production planning problem can have feedback with the decision maker, the implementer and the analyst.
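A logistic membership function maps a fuzzy parameter onto a degree of satisfaction between 0 and 1. The exact modified form used in the paper may differ from what is shown here; the sketch below is one common S-shaped choice, with a parameter gamma controlling how sharply vagueness is resolved between the fully acceptable bound a and the fully unacceptable bound b:

```python
import math

def logistic_membership(x, a, b, gamma=13.8):
    """Degree of satisfaction of a fuzzy parameter x in [a, b]:
    close to 1 at the fully acceptable bound a, close to 0 at the fully
    unacceptable bound b, and exactly 0.5 at the midpoint.
    (One common S-shaped form; the paper's exact formulation may differ.)"""
    t = 2.0 * (x - a) / (b - a) - 1.0   # rescale x to [-1, 1]
    return 1.0 / (1.0 + math.exp(gamma * t))
```

In an FLP formulation, each vague constraint bound gets such a membership function, and the solver trades off the decision maker's aspiration level against these degrees of satisfaction, which is what allows the feedback loop between decision maker, implementer, and analyst described above.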
Direct Maximization of Protein Identifications from Tandem Mass Spectra*
Spivak, Marina; Weston, Jason; Tomazela, Daniela; MacCoss, Michael J.; Noble, William Stafford
2012-01-01
The goal of many shotgun proteomics experiments is to determine the protein complement of a complex biological mixture. For many mixtures, most methodological approaches fall significantly short of this goal. Existing solutions to this problem typically subdivide the task into two stages: first identifying a collection of peptides with a low false discovery rate and then inferring from the peptides a corresponding set of proteins. In contrast, we formulate the protein identification problem as a single optimization problem, which we solve using machine learning methods. This approach is motivated by the observation that the peptide and protein level tasks are cooperative, and the solution to each can be improved by using information about the solution to the other. The resulting algorithm directly controls the relevant error rate, can incorporate a wide variety of evidence and, for complex samples, provides 18–34% more protein identifications than the current state of the art approaches. PMID:22052992
Problems in Classifying Mild Cognitive Impairment (MCI): One or Multiple Syndromes?
Díaz-Mardomingo, María del Carmen; García-Herranz, Sara; Rodríguez-Fernández, Raquel; Venero, César; Peraita, Herminia
2017-01-01
As conceptual, methodological, and technological advances applied to dementias have evolved the construct of mild cognitive impairment (MCI), one problem encountered has been its classification into subtypes. Here, we aim to revise the concept of MCI and its subtypes, addressing the problems of classification not only from the psychometric point of view or by using alternative methods, such as latent class analysis, but also considering the absence of normative data. In addition to the well-known influence of certain factors on cognitive function, such as educational level and cultural traits, recent studies highlight the relevance of other factors that may significantly affect the genesis and evolution of MCI: subjective memory complaints, loneliness, social isolation, etc. The present work contemplates the most relevant attempts to clarify the issue of MCI categorization and classification, combining our own data with that of recent studies which suggest the role of relevant psychosocial factors in MCI. PMID:28862676
McGreevy, P D; Della Torre, P K; Evans, D L
2003-01-01
Interactive software has been developed on CD-ROM to facilitate learning of problem formulation, diagnostic methodology, and therapeutic options in dog and cat behavior problems. Students working in small groups are presented with a signalment, a case history, and brief description of the problem behavior as perceived by the client. Students then navigate through the case history by asking the client questions from an icon-driven question pad. Animated video responses to the questions are provided. Students are then required to rate the significance of the questions and answers with respect to the development of the unwelcome behavior. Links to online self-assessments and to resource materials about causation and treatment options are provided to assist students in their decision-making process. The activity concludes with a software-generated e-mail submission that includes the recorded history, diagnosis, and recommended treatment for assessment purposes.
Bouça-Machado, Raquel; Rosário, Madalena; Alarcão, Joana; Correia-Guedes, Leonor; Abreu, Daisy; Ferreira, Joaquim J
2017-01-25
Over the past decades there has been a significant increase in the number of published clinical trials in palliative care. However, empirical evidence suggests that there are methodological problems in the design and conduct of studies, which raises questions about the validity and generalisability of the results and the strength of the available evidence. We sought to evaluate the methodological characteristics and assess the quality of reporting of clinical trials in palliative care. We performed a systematic review of published clinical trials assessing therapeutic interventions in palliative care. Trials were identified using MEDLINE (from its inception to February 2015). We assessed methodological characteristics and described the quality of reporting using the Cochrane Risk of Bias tool. We retrieved 107 studies. The most common medical field studied was oncology, and 43.9% of trials evaluated pharmacological interventions. Symptom control and physical dimensions (e.g. intervention on pain, breathlessness, nausea) were the palliative care-specific issues most studied. We found under-reporting of key information, in particular on random sequence generation, allocation concealment, and blinding. While the number of clinical trials in palliative care has increased over time, methodological quality remains suboptimal, which compromises the quality of studies. Therefore, a greater effort is needed to enable the appropriate performance of future studies and increase the robustness of evidence-based medicine in this important field.
Grant, Nina; Lawrence, Megan; Preti, Antonio; Wykes, Til; Cella, Matteo
2017-08-01
People with a diagnosis of schizophrenia have significant social and functional difficulties. Social cognition has been found to influence these outcomes, and in recent years interventions targeting this domain have been developed. This paper reviews the existing literature on social cognition interventions for people with a diagnosis of schizophrenia, focussing on: i) comparing focussed (i.e. targeting only one social cognitive domain) and global interventions, and ii) the methodological quality of studies. A systematic search was conducted on PubMed and PsycInfo. Studies were included if they were randomised controlled trials, participants had a diagnosis of schizophrenia or schizoaffective disorder, and the intervention targeted at least one of four social cognition domains (i.e. theory of mind, affect recognition, social perception and attribution bias). All papers were assessed for methodological quality. Information on the intervention, control condition, study methodology and the main findings of each study was extracted and critically summarised. Data from 32 studies fulfilled the inclusion criteria, considering a total of 1440 participants. Taking part in social cognition interventions produced significant improvements in theory of mind and affect recognition compared to both passive and active control conditions. Results were less clear for social perception and attributional bias. Focussed and global interventions had similar results on outcomes. Overall study methodological quality was modest. There was very limited evidence that social cognition interventions result in functional outcome improvement. The evidence considered suggests that social cognition interventions may be a valuable approach for people with a diagnosis of schizophrenia. However, evidence quality is limited by measure heterogeneity, modest study methodology and short follow-up periods.
The findings point to a number of recommendations for future research, including measurement standardisation, appropriately powered studies and investigation of the impact of social cognition improvements on functioning problems. Copyright © 2017 The Authors. Published by Elsevier Ltd.. All rights reserved.
Applying Lakatos' Theory to the Theory of Mathematical Problem Solving.
ERIC Educational Resources Information Center
Nunokawa, Kazuhiko
1996-01-01
The relation between Lakatos' theory and issues in mathematics education, especially mathematical problem solving, is investigated by examining Lakatos' methodology of a scientific research program. (AIM)
Fernandez, Ana Maria; Vera-Villarroel, Pablo; Sierra, Juan Carlos; Zubeidat, Ihab
2007-01-01
The authors studied gender differences in response to hypothetical infidelity in Spanish students. Using a forced-choice methodology, the authors asked a sample of 266 participants to indicate which kind of infidelity would be more distressing: emotional or sexual. Men were significantly more distressed by sexual infidelity than were women, and women were significantly more distressed by emotional infidelity than were men. Results supported the hypothesis that particular infidelity types, which resemble adaptive problems that human beings faced in the past, contribute to the psychology of jealousy. The results are consistent with previous cross-cultural research.
Tenerife revisited: the critical role of dentistry.
Brannon, R B; Morlang, W M
2001-05-01
The authors record the contribution of dentistry to the identification of victims of one of the most significant disasters in the history of aviation-the March 1977 collision of two Boeing 747 jumbo jets in the Canary Islands, which resulted in 583 fatalities. Dental identification was the primary method of victim identification because a high percentage of the bodies were severely burned. Virtually all aspects of the U.S. identification efforts have been reported with the exception of the valuable role of dentistry. The dental team's organization, methodology, and significant contributions to forensic dentistry and a variety of remarkable problems that the team encountered are documented.
On the Analysis of Two-Person Problem Solving Protocols.
ERIC Educational Resources Information Center
Schoenfeld, Alan H.
Methodological issues in the use of protocol analysis for research into human problem solving processes are examined through a case study in which two students were videotaped as they worked together to solve mathematical problems "out loud." The students' chosen strategic or executive behavior in examining and solving a problem was…
Problem? "No Problem!" Solving Technical Contradictions
ERIC Educational Resources Information Center
Kutz, K. Scott; Stefan, Victor
2007-01-01
TRIZ (pronounced TREES), the Russian acronym for the theory of inventive problem solving, enables a person to focus his attention on finding genuine, potential solutions in contrast to searching for ideas that "may" work through a happenstance way. It is a patent database-backed methodology that helps to reduce time spent on the problem,…
ERIC Educational Resources Information Center
Cormas, Peter C.
2016-01-01
Preservice teachers (N = 27) in two sections of a sequenced, methodological and process integrated mathematics/science course solved a levers problem with three similar learning processes and a problem-solving approach, and identified a problem-solving approach through one different learning process. Similar learning processes used included:…
A TAPS Interactive Multimedia Package to Solve Engineering Dynamics Problem
ERIC Educational Resources Information Center
Sidhu, S. Manjit; Selvanathan, N.
2005-01-01
Purpose: To expose engineering students to using modern technologies, such as multimedia packages, to learn, visualize and solve engineering problems, such as in mechanics dynamics. Design/methodology/approach: A multimedia problem-solving prototype package is developed to help students solve an engineering problem in a step-by-step approach. A…
Hardware proofs using EHDM and the RSRE verification methodology
NASA Technical Reports Server (NTRS)
Butler, Ricky W.; Sjogren, Jon A.
1988-01-01
Examined is a methodology for hardware verification developed by the Royal Signals and Radar Establishment (RSRE) in the context of SRI International's Enhanced Hierarchical Design Methodology (EHDM) specification/verification system. The methodology utilizes a four-level specification hierarchy with the following levels: functional level, finite automata model, block model, and circuit level. The properties of a level are proved as theorems in the level below it. This methodology is applied to a 6-bit counter problem and is critically examined. The specifications are written in EHDM's specification language, Extended Special, and suggestions are made for improving both the RSRE methodology and the EHDM system.
phMRI: methodological considerations for mitigating potential confounding factors
Bourke, Julius H.; Wall, Matthew B.
2015-01-01
Pharmacological Magnetic Resonance Imaging (phMRI) is a variant of conventional MRI that adds pharmacological manipulations in order to study the effects of drugs, or uses pharmacological probes to investigate basic or applied (e.g., clinical) neuroscience questions. Issues that may confound the interpretation of results from various types of phMRI studies are briefly discussed, and a set of methodological strategies that can mitigate these problems are described. These include strategies that can be employed at every stage of investigation, from study design to interpretation of resulting data, and additional techniques suited for use with clinical populations are also featured. Pharmacological MRI is a challenging area of research that has both significant advantages and formidable difficulties, however with due consideration and use of these strategies many of the key obstacles can be overcome. PMID:25999812
How Root Cause Analysis Can Improve the Value Methodology
DOE Office of Scientific and Technical Information (OSTI.GOV)
Wixson, James Robert
2002-05-01
Root cause analysis (RCA) is an important methodology that can be integrated with the VE Job Plan to generate superior results from the VE Methodology. The point at which RCA is most appropriate is after the function analysis and FAST Model have been built and functions for improvement have been chosen. These functions are then subjected to a simple but rigorous RCA to get to the root cause of their deficiencies, whether it is high cost/poor value, poor quality, or poor reliability. Once the most probable causes for these problems have been arrived at, better solutions for improvement can be developed in the creativity phase because the team better understands the problems associated with these functions.
Helicopter-V/STOL dynamic wind and turbulence design methodology
NASA Technical Reports Server (NTRS)
Bailey, J. Earl
1987-01-01
Aircraft and helicopter accidents due to severe dynamic wind and turbulence continue to present challenging design problems. The development of the current set of design analysis tools for aircraft wind and turbulence design began in the 1940s and 1950s. The areas of helicopter dynamic wind and turbulence modeling, and of vehicle response to severe dynamic wind inputs (microburst-type phenomena) during takeoff and landing, remain major unsolved design problems owing to a lack of both environmental data and computational methodology. The development of helicopter and V/STOL dynamic wind and turbulence response computation methodology is reviewed, the current state of the design art in industry is outlined, and comments on design methodology are made which may serve to improve future flight vehicle design.
Ambient Vibration Testing for Story Stiffness Estimation of a Heritage Timber Building
Min, Kyung-Won; Kim, Junhee; Park, Sung-Ah; Park, Chan-Soo
2013-01-01
This paper investigates the dynamic characteristics of a historic wooden structure by ambient vibration testing, presenting a novel story stiffness estimation methodology for the purpose of vibration-based structural health monitoring. In the ambient vibration testing, measured structural responses are analyzed by two output-only system identification methods (i.e., frequency domain decomposition and stochastic subspace identification) to estimate modal parameters. The proposed story stiffness estimation methodology is based on an eigenvalue problem derived from a vibratory rigid body model. Using the identified natural frequencies, the eigenvalue problem is efficiently solved and uniquely yields the story stiffness. It is noteworthy that application of the proposed methodology is not necessarily confined to the wooden structure exemplified in the paper. PMID:24227999
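The inverse eigenvalue step can be made concrete with a uniform two-story shear model; this model, and the stiffness/mass numbers below, are illustrative assumptions, not the paper's heritage structure. The identified natural frequencies fix the trace of the system matrix, which in turn yields the story stiffness:

```python
import math

def two_story_freqs(k, m):
    """Natural frequencies (rad/s) of a uniform two-story shear model:
    M = m*I, K = [[2k, -k], [-k, k]]; det(K - w^2 M) = 0 solved in closed form."""
    tr, det = 3.0 * k / m, (k / m) ** 2
    disc = math.sqrt(tr * tr - 4.0 * det)
    return math.sqrt((tr - disc) / 2.0), math.sqrt((tr + disc) / 2.0)

def stiffness_from_freqs(w1, w2, m):
    """Invert the eigenvalue problem: w1^2 + w2^2 = trace(inv(M) K) = 3k/m."""
    return m * (w1 ** 2 + w2 ** 2) / 3.0

# hypothetical story stiffness (N/m) and floor mass (kg); the frequencies a
# system identification would report are enough to recover the stiffness
k_true, m = 2.0e6, 5.0e4
w1, w2 = two_story_freqs(k_true, m)
k_est = stiffness_from_freqs(w1, w2, m)
```

With non-uniform stiffness or more stories the inversion is no longer a one-line trace identity, but the principle is the same: the identified frequencies are the eigenvalues, and the stiffness parameters are solved from the characteristic equation.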
Reliability based design optimization: Formulations and methodologies
NASA Astrophysics Data System (ADS)
Agarwal, Harish
Modern products ranging from simple components to complex systems should be designed to be optimal and reliable. The challenge of modern engineering is to ensure that manufacturing costs are reduced and design cycle times are minimized while achieving requirements for performance and reliability. If the market for the product is competitive, improved quality and reliability can generate very strong competitive advantages. Simulation based design plays an important role in designing almost any kind of automotive, aerospace, and consumer products under these competitive conditions. Single discipline simulations used for analysis are being coupled together to create complex coupled simulation tools. This investigation focuses on the development of efficient and robust methodologies for reliability based design optimization in a simulation based design environment. Original contributions of this research are the development of a novel efficient and robust unilevel methodology for reliability based design optimization, the development of an innovative decoupled reliability based design optimization methodology, the application of homotopy techniques in unilevel reliability based design optimization methodology, and the development of a new framework for reliability based design optimization under epistemic uncertainty. The unilevel methodology for reliability based design optimization is shown to be mathematically equivalent to the traditional nested formulation. Numerical test problems show that the unilevel methodology can reduce computational cost by at least 50% as compared to the nested approach. The decoupled reliability based design optimization methodology is an approximate technique to obtain consistent reliable designs at lesser computational expense. Test problems show that the methodology is computationally efficient compared to the nested approach. 
A framework for performing reliability based design optimization under epistemic uncertainty is also developed. A trust region managed sequential approximate optimization methodology is employed for this purpose. Results from numerical test studies indicate that the methodology can be used for performing design optimization under severe uncertainty.
Craig, Hugh; Berretta, Regina; Moscato, Pablo
2016-01-01
In this study we propose a novel, unsupervised clustering methodology for analyzing large datasets. This new, efficient methodology converts the general clustering problem into a community detection problem in graphs by using the Jensen-Shannon distance, a dissimilarity measure originating in Information Theory. Moreover, we use graph theoretic concepts for the generation and analysis of proximity graphs. Our methodology is based on a newly proposed memetic algorithm (iMA-Net) for discovering clusters of data elements by maximizing the modularity function in proximity graphs of literary works. To test the effectiveness of this general methodology, we apply it to a text corpus dataset, which contains frequencies of approximately 55,114 unique words across all 168 plays written in the Shakespearean era (16th and 17th centuries), to analyze and detect clusters of similar plays. Experimental results and comparison with state-of-the-art clustering methods demonstrate the remarkable performance of our new method for identifying high quality clusters which reflect the commonalities in the literary style of the plays. PMID:27571416
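The Jensen-Shannon distance underlying the proximity graphs can be computed directly from word-frequency vectors. A minimal sketch (illustrative counts, not the 168-play corpus or the authors' iMA-Net code):

```python
import numpy as np

def jensen_shannon_distance(p, q):
    """Jensen-Shannon distance (base-2) between two discrete distributions."""
    p = np.asarray(p, dtype=float); p = p / p.sum()
    q = np.asarray(q, dtype=float); q = q / q.sum()
    m = 0.5 * (p + q)  # mixture distribution

    def kl(a, b):  # Kullback-Leibler divergence, ignoring zero-probability terms
        mask = a > 0
        return np.sum(a[mask] * np.log2(a[mask] / b[mask]))

    return np.sqrt(0.5 * kl(p, m) + 0.5 * kl(q, m))

# Hypothetical word counts for two plays over a tiny shared vocabulary.
play_a = [10, 0, 5, 2]
play_b = [8, 1, 4, 3]
d = jensen_shannon_distance(play_a, play_b)
assert 0.0 <= d <= 1.0  # with base-2 logs the distance is bounded in [0, 1]
```

In the methodology described above, pairwise distances like `d` would populate a dissimilarity matrix from which proximity graphs are built for community detection.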
[Problem-based learning, a strategy to employ it].
Guillamet Lloveras, Ana; Celma Vicente, Matilde; González Carrión, Pilar; Cano-Caballero Gálvez, Ma Dolores; Pérez Ramírez, Francisca
2009-02-01
The Virgen de las Nieves University School of Nursing has adopted the methodology of Problem-Based Learning (ABP in its Spanish acronym) as a supplementary method to gain specific transversal competencies. In so doing, all basic required/obligatory subjects necessary for a degree have been partially affected. With the objective of identifying and addressing the structural and cultural barriers which could impede the success or effectiveness of its adoption, a strategic analysis of the School was carried out. This analysis was aimed at: a) identifying the strengths and weaknesses of the School for adopting the Problem-Based Learning methodology; b) describing the structural problems and needs involved in carrying out this teaching innovation; c) discovering the needs professors have regarding knowledge and skills related to Problem-Based Learning; d) preparing students by informing them about the characteristics of Problem-Based Learning; e) evaluating the results obtained by means of professor and student opinions; and f) adopting the improvements identified. The stages followed were: strategic analysis, preparation, pilot program, adoption and evaluation.
A Nursing Process Methodology.
ERIC Educational Resources Information Center
Ryan-Wenger, Nancy M.
1990-01-01
A nursing methodology developed by the faculty at The Ohio State University teaches nursing students problem-solving techniques applicable to any nursing situation. It also provides faculty and students with a basis for measuring students' progress and ability in applying the nursing process. (Author)
Software production methodology tested project
NASA Technical Reports Server (NTRS)
Tausworthe, R. C.
1976-01-01
The history and results of a 3 1/2-year study in software development methodology are reported. The findings of this study have become the basis for DSN software development guidelines and standard practices. The article discusses accomplishments, discoveries, problems, recommendations and future directions.
STUDYING FOREST ROOT SYSTEMS - AN OVERVIEW OF METHODOLOGICAL PROBLEMS
The study of tree root systems is central to understanding forest ecosystem carbon and nutrient cycles, nutrient and water uptake, C allocation patterns by trees, soil microbial populations, adaptation of trees to stress, soil organic matter production, etc. Methodological probl...
Project management practices in engineering university
NASA Astrophysics Data System (ADS)
Sirazitdinova, Y.; Dulzon, A.; Mueller, B.
2015-10-01
The article presents an analysis of the use of project management methodology in Tomsk Polytechnic University, in particular the experience with the course Project Management, which started 15 years ago. The article discusses the advantages of project management methodology for engineering education and for administration of the university in general, as well as the problems impeding extensive implementation of this methodology in teaching, research and management in the university.
NASA Technical Reports Server (NTRS)
David, J. W.; Mitchell, L. D.
1982-01-01
Difficulties arise in the solution methodology used to deal with the highly nonlinear rotor equations that result when dynamic coupling is included. A solution methodology is selected to solve the nonlinear differential equations. The selected method was verified to give good results even at large nonlinearity levels. The transfer matrix methodology is extended to the solution of nonlinear problems.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Krummel, J.R.; Markin, J.B.; O'Neill, R.V.
Regional analyses of the interaction between human populations and natural resources must integrate landscape scale environmental problems. An approach that considers human culture, environmental processes, and resource needs offers an appropriate methodology. With this methodology, we analyze problems of food availability in African cattle-keeping societies. The analysis interrelates cattle biomass, forage availability, milk and blood production, crop yields, gathering, food subsidies, population, and variable precipitation. While an excess of cattle leads to overgrazing, cattle also serve as valuable food storage mechanisms during low rainfall periods. Food subsidies support higher population levels but do not alter drought-induced population fluctuations. Variable precipitation patterns require solutions that stabilize year-to-year food production and also address problems of overpopulation.
Automating the packing heuristic design process with genetic programming.
Burke, Edmund K; Hyde, Matthew R; Kendall, Graham; Woodward, John
2012-01-01
The literature shows that one-, two-, and three-dimensional bin packing and knapsack packing are difficult problems in operational research. Many techniques, including exact, heuristic, and metaheuristic approaches, have been investigated to solve these problems and it is often not clear which method to use when presented with a new instance. This paper presents an approach which is motivated by the goal of building computer systems which can design heuristic methods. The overall aim is to explore the possibilities for automating the heuristic design process. We present a genetic programming system to automatically generate a good quality heuristic for each instance. It is not necessary to change the methodology depending on the problem type (one-, two-, or three-dimensional knapsack and bin packing problems), and it therefore has a level of generality unmatched by other systems in the literature. We carry out an extensive suite of experiments and compare with the best human designed heuristics in the literature. Note that our heuristic design methodology uses the same parameters for all the experiments. The contribution of this paper is to present a more general packing methodology than those currently available, and to show that, by using this methodology, it is possible for a computer system to design heuristics which are competitive with the human designed heuristics from the literature. This represents the first packing algorithm in the literature able to claim human competitive results in such a wide variety of packing domains.
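For context, a representative human-designed heuristic of the kind the evolved programs are benchmarked against is first-fit decreasing for one-dimensional bin packing (a standard textbook heuristic, not code from the paper):

```python
def first_fit_decreasing(items, capacity):
    """Sort items largest-first, then place each in the first bin with room."""
    bins = []
    for item in sorted(items, reverse=True):
        for b in bins:
            if sum(b) + item <= capacity:
                b.append(item)
                break
        else:  # no existing bin fits: open a new one
            bins.append([item])
    return bins

bins = first_fit_decreasing([4, 8, 1, 4, 2, 1], capacity=10)
assert all(sum(b) <= 10 for b in bins)
assert len(bins) == 2  # total size is 20, so two bins is optimal here
```

A genetic programming hyper-heuristic, as described above, would instead evolve the item-placement rule itself rather than fixing it by hand.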
NASA Astrophysics Data System (ADS)
Mwakabuta, Ndaga Stanslaus
Electric power distribution systems play a significant role in providing continuous and "quality" electrical energy to different classes of customers. In the context of the present restrictions on transmission system expansions and the new paradigm of "open and shared" infrastructure, new approaches to distribution system analyses, economic and operational decision-making need investigation. This dissertation includes three layers of distribution system investigations. In the basic level, improved linear models are shown to offer significant advantages over previous models for advanced analysis. In the intermediate level, the improved model is applied to solve the traditional problem of operating cost minimization using capacitors and voltage regulators. In the advanced level, an artificial intelligence technique is applied to minimize cost under Distributed Generation injection from private vendors. Soft computing techniques are finding increasing applications in solving optimization problems in large and complex practical systems. The dissertation focuses on Genetic Algorithm for investigating the economic aspects of distributed generation penetration without compromising the operational security of the distribution system. The work presents a methodology for determining the optimal pricing of distributed generation that would help utilities make a decision on how to operate their system economically. This would enable modular and flexible investments that have real benefits to the electric distribution system. Improved reliability for both customers and the distribution system in general, reduced environmental impacts, increased efficiency of energy use, and reduced costs of energy services are some advantages.
The Dogma of "The" Scientific Method.
ERIC Educational Resources Information Center
Wivagg, Dan; Allchin, Douglas
2002-01-01
Points out major problems with the scientific method as a model for learning about methodology in science and suggests teaching about the scientists' toolbox to remedy problems with the conventional scientific method. (KHR)
Fuzzy multi objective transportation problem – evolutionary algorithm approach
NASA Astrophysics Data System (ADS)
Karthy, T.; Ganesan, K.
2018-04-01
This paper deals with the fuzzy multi objective transportation problem. A fuzzy optimal compromise solution is obtained by using a Fuzzy Genetic Algorithm. A numerical example is provided to illustrate the methodology.
Recent advances in computational-analytical integral transforms for convection-diffusion problems
NASA Astrophysics Data System (ADS)
Cotta, R. M.; Naveira-Cotta, C. P.; Knupp, D. C.; Zotin, J. L. Z.; Pontes, P. C.; Almeida, A. P.
2017-10-01
A unifying overview of the Generalized Integral Transform Technique (GITT) as a computational-analytical approach for solving convection-diffusion problems is presented. This work is aimed at bringing together some of the most recent developments on both accuracy and convergence improvements on this well-established hybrid numerical-analytical methodology for partial differential equations. Special emphasis is given to novel algorithm implementations, all directly connected to enhancing the eigenfunction expansion basis, such as a single domain reformulation strategy for handling complex geometries, an integral balance scheme in dealing with multiscale problems, the adoption of convective eigenvalue problems in formulations with significant convection effects, and the direct integral transformation of nonlinear convection-diffusion problems based on nonlinear eigenvalue problems. Then, selected examples are presented that illustrate the improvement achieved in each class of extension, in terms of convergence acceleration and accuracy gain, which are related to conjugated heat transfer in complex or multiscale microchannel-substrate geometries, multidimensional Burgers equation model, and diffusive metal extraction through polymeric hollow fiber membranes. Numerical results are reported for each application and, where appropriate, critically compared against the traditional GITT scheme without convergence enhancement schemes and commercial or dedicated purely numerical approaches.
Robust optimization modelling with applications to industry and environmental problems
NASA Astrophysics Data System (ADS)
Chaerani, Diah; Dewanto, Stanley P.; Lesmana, Eman
2017-10-01
Robust Optimization (RO) modeling is one of the existing methodologies for handling data uncertainty in optimization problems. The main challenge in this RO methodology is how and when we can reformulate the robust counterpart of an uncertain problem as a computationally tractable optimization problem, or at least approximate the robust counterpart by a tractable problem. By definition, the robust counterpart depends heavily on how we choose the uncertainty set; as a consequence, we can meet this challenge only if this set is chosen in a suitable way. RO has developed rapidly: since 2004, a new approach called Adjustable Robust Optimization (ARO) has been available to handle uncertain problems in which some decision variables must be decided as "wait and see" variables, unlike classic RO, which models all decision variables as "here and now" decisions. In ARO, the uncertain problem can be considered a multistage decision problem, and the decision variables involved become wait-and-see decision variables. In this paper we present applications of both RO and ARO. We briefly present all results to underline the importance of RO and ARO in many real life problems.
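To illustrate how a robust counterpart becomes tractable for a simple uncertainty set, consider a toy LP with box uncertainty in a constraint row (all numbers hypothetical, not taken from the paper). For nonnegative variables, the worst case of a·x <= b over a in [a_nom - delta, a_nom + delta] is attained at a_nom + delta, so the robust counterpart is again a plain LP:

```python
import numpy as np
from scipy.optimize import linprog

# Nominal problem: max x1 + x2  s.t.  a . x <= 10,  x >= 0,
# where a is only known to lie in the box [a_nom - delta, a_nom + delta].
a_nom = np.array([2.0, 3.0])
delta = np.array([0.5, 0.5])

# For x >= 0 the robust counterpart tightens the row to (a_nom + delta) . x <= 10.
res = linprog(c=[-1.0, -1.0],               # linprog minimizes, so negate
              A_ub=[a_nom + delta],
              b_ub=[10.0],
              bounds=[(0, None), (0, None)])

x = res.x
# The robust solution stays feasible for every coefficient realization in the box.
assert np.dot(a_nom + delta, x) <= 10.0 + 1e-6
```

Here the tractability hinges entirely on the choice of the uncertainty set, which is exactly the point made above; ellipsoidal sets, for instance, would lead to a conic rather than linear counterpart.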
Evaluating Writing Programs: Paradigms, Problems, Possibilities.
ERIC Educational Resources Information Center
McLeod, Susan H.
1992-01-01
Describes two methodological approaches (qualitative and quantitative) that grow out of two different research examples. Suggests the problems these methods present. Discusses the ways in which an awareness of these problems can help teachers to understand how to work with researchers in designing useful evaluations of writing programs. (PRA)
[Scientific and methodologic approaches to evaluating medical management for workers of Kazakhstan].
2012-01-01
The article covers topical problems of workers' health preservation. Complex research results enabled to evaluate and analyze occupational risks in leading industries of Kazakhstan, for improving scientific and methodologic approaches to medical management for workers subjected to hazardous conditions.
Global/local methods research using a common structural analysis framework
NASA Technical Reports Server (NTRS)
Knight, Norman F., Jr.; Ransom, Jonathan B.; Griffin, O. H., Jr.; Thompson, Danniella M.
1991-01-01
Methodologies for global/local stress analysis are described including both two- and three-dimensional analysis methods. These methods are being developed within a common structural analysis framework. Representative structural analysis problems are presented to demonstrate the global/local methodologies being developed.
Terminology and Methodology Related to the Use of Heart Rate Responsivity in Infancy Research
ERIC Educational Resources Information Center
Woodcock, James M.
1971-01-01
Methodological problems in measuring and interpreting infantile heart rate reactivity in research are discussed. Various ways of describing cardiac activity are listed. Attention is given to the relationship between resting state and heart rate responsivity. (Author/WY)
Employee Turnover: An Empirical and Methodological Assessment.
ERIC Educational Resources Information Center
Muchinsky, Paul M.; Tuttle, Mark L.
1979-01-01
Reviews research on the prediction of employee turnover. Groups predictor variables into five general categories: attitudinal (job satisfaction), biodata, work-related, personal, and test-score predictors. Consistent relationships between common predictor variables and turnover were found for four categories. Eight methodological problems/issues…
ERIC Educational Resources Information Center
And Others; Rynders, John E.
1978-01-01
For many years, the educational capabilities of Down's syndrome persons have been underestimated because a large number of studies purporting to give an accurate picture of Down's syndrome persons' developmental capabilities have had serious methodological flaws. (Author)
Analysis of oil-pipeline distribution of multiple products subject to delivery time-windows
NASA Astrophysics Data System (ADS)
Jittamai, Phongchai
This dissertation defines the operational problems of, and develops solution methodologies for, the distribution of multiple products through an oil pipeline subject to delivery time-window constraints. A multiple-product oil pipeline is a pipeline system composed of pipes, pumps, valves and storage facilities used to transport different types of liquids. Typically, products delivered by pipelines are petroleum of different grades moving either from production facilities to refineries or from refineries to distributors. Time-windows, which are generally used in logistics and scheduling, are incorporated in this study. The distribution of multiple products through an oil pipeline subject to delivery time-windows is modeled as a multicommodity network flow structure and mathematically formulated. The main focus of this dissertation is the investigation of operating issues and problem complexity of single-source pipeline problems, along with a solution methodology to compute an input schedule that yields minimum total time violation of due delivery time-windows. The problem is proved to be NP-complete. A heuristic approach, a reversed-flow algorithm, is developed based on pipeline flow reversibility to compute the input schedule for the pipeline problem. This algorithm is implemented in no longer than O(T·E) time. This dissertation also extends the study to examine some operating attributes and problem complexity of multiple-source pipelines. The multiple-source pipeline problem is also NP-complete. A heuristic algorithm modified from the one used in single-source pipeline problems is introduced. This algorithm can also be implemented in no longer than O(T·E) time. Computational results are presented for both methodologies on randomly generated problem sets. The computational experience indicates that reversed-flow algorithms provide good solutions in comparison with the optimal solutions.
Only 25% of the problems tested were more than 30% greater than optimal values and approximately 40% of the tested problems were solved optimally by the algorithms.
NASA Astrophysics Data System (ADS)
Dib, Alain; Kavvas, M. Levent
2018-03-01
The characteristic form of the Saint-Venant equations is solved in a stochastic setting by using a newly proposed Fokker-Planck Equation (FPE) methodology. This methodology computes the ensemble behavior and variability of the unsteady flow in open channels by directly solving for the flow variables' time-space evolutionary probability distribution. The new methodology is tested on a stochastic unsteady open-channel flow problem, with an uncertainty arising from the channel's roughness coefficient. The computed statistical descriptions of the flow variables are compared to the results obtained through Monte Carlo (MC) simulations in order to evaluate the performance of the FPE methodology. The comparisons show that the proposed methodology can adequately predict the results of the considered stochastic flow problem, including the ensemble averages, variances, and probability density functions in time and space. Unlike the large number of simulations performed by the MC approach, only one simulation is required by the FPE methodology. Moreover, the total computational time of the FPE methodology is smaller than that of the MC approach, which could prove to be a particularly crucial advantage in systems with a large number of uncertain parameters. As such, the results obtained in this study indicate that the proposed FPE methodology is a powerful and time-efficient approach for predicting the ensemble average and variance behavior, in both space and time, for an open-channel flow process under an uncertain roughness coefficient.
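A minimal sketch of the Monte Carlo baseline the FPE methodology is compared against, assuming a simple Manning-type flow relation with hypothetical channel values (not the paper's full Saint-Venant solver):

```python
import numpy as np

rng = np.random.default_rng(0)

# Propagate an uncertain Manning roughness n through the flow relation
# V = (1/n) * R**(2/3) * S**0.5  (hypothetical hydraulic radius and slope).
R, S = 1.2, 0.001
n_samples = rng.uniform(0.025, 0.035, size=100_000)  # uncertain roughness
V = (1.0 / n_samples) * R ** (2.0 / 3.0) * np.sqrt(S)

mean_V, var_V = V.mean(), V.var()
# Each of the 100,000 samples is one full model evaluation; the FPE approach
# replaces this ensemble with a single solve for the evolving probability density.
```

The contrast drawn in the abstract is visible here: the MC estimate of the mean and variance requires many repeated model runs, whereas the FPE methodology obtains the distribution of the flow variables from one simulation.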
A Novel Performance Evaluation Methodology for Single-Target Trackers.
Kristan, Matej; Matas, Jiri; Leonardis, Ales; Vojir, Tomas; Pflugfelder, Roman; Fernandez, Gustavo; Nebehay, Georg; Porikli, Fatih; Cehovin, Luka
2016-11-01
This paper addresses the problem of single-target tracker performance evaluation. We consider the performance measures, the dataset and the evaluation system to be the most important components of tracker evaluation and propose requirements for each of them. The requirements are the basis of a new evaluation methodology that aims at a simple and easily interpretable tracker comparison. The ranking-based methodology addresses tracker equivalence in terms of statistical significance and practical differences. A fully-annotated dataset with per-frame annotations with several visual attributes is introduced. The diversity of its visual properties is maximized in a novel way by clustering a large number of videos according to their visual attributes. This makes it the most rigorously constructed and annotated dataset to date. A multi-platform evaluation system allowing easy integration of third-party trackers is presented as well. The proposed evaluation methodology was tested on the VOT2014 challenge on the new dataset and 38 trackers, making it the largest benchmark to date. Most of the tested trackers are indeed state-of-the-art since they outperform the standard baselines, resulting in a highly-challenging benchmark. An exhaustive analysis of the dataset from the perspective of tracking difficulty is carried out. To facilitate tracker comparison a new performance visualization technique is proposed.
NASA Astrophysics Data System (ADS)
Neumann, Karl
1987-06-01
In the methodological discussion of recent years it has become apparent that many research problems, including problems relating to the theory of educational science, cannot be solved by using quantitative methods. The multifaceted aspects of human behaviour and all its environment-bound subtle nuances, especially the process of education or the development of identity, cannot fully be taken into account within a rigid neopositivist approach. In employing the paradigm of symbolic interactionism as a suitable model for the analysis of processes of education and formation, research generally has to start out from complex reciprocal social interactions instead of unambiguous causal connections. In analysing several particular methodological problems, the article demonstrates some weaknesses of quantitative approaches and then shows the advantages in and the necessity for using qualitative research tools.
NASA Astrophysics Data System (ADS)
González-Gómez, David; Jeong, Jin Su; Airado Rodríguez, Diego; Cañada-Cañada, Florentina
2016-06-01
"Flipped classroom" teaching methodology is a type of blended learning in which the traditional class setting is inverted: lecture is shifted outside of class, while classroom time is used to solve problems or carry out practical work through discussion and peer collaboration among students and instructors. This relatively new instructional methodology claims that flipping your classroom engages students more effectively with the learning process, achieving better teaching results. Thus, this research aimed to evaluate the effects of the flipped classroom on students' performance and their perception of this new methodology. This study was conducted in a general science course, in the sophomore year of the Primary Education bachelor degree in the Training Teaching School of the University of Extremadura (Spain), during the 2014/2015 course. In order to assess the suitability of the proposed methodology, the class was divided into two groups. For the first group, a traditional methodology was followed, and it was used as a control. The "flipped classroom" methodology was used in the second group, where the students were given diverse materials, such as video lessons and reading materials, to review at home before class. Online questionnaires were also provided to assess the progress of the students before the class. Finally, the results were compared in terms of students' achievements, and a post-task survey was conducted to learn the students' perceptions. A statistically significant difference was found on all assessments, with the flipped class students performing higher on average. In addition, most students had a favorable perception of the flipped classroom, noting the ability to pause, rewind and review lectures, as well as increased individualized learning and increased teacher availability.
NASA Astrophysics Data System (ADS)
Dos Santos Neta, Maria Luiza
2017-02-01
In secondary education, astronomy topics, when covered at all, are taught within the physics discipline using methodologies that do not contribute to the development of meaningful learning, even though such learning is fundamental for making sense of everyday events. In this context, this study analyzes the contribution of a physics teaching proposal designed to promote meaningful learning of astronomy topics among secondary school students of a public school of the state teaching network located in the city of Sirinhaém, on the southern coast of Pernambuco. The research combined qualitative and quantitative features and followed these methodological procedures: the application of a pre-test, a didactic intervention structured as the sequential stages of the Cycle of Experience, and a post-test based on a problem situation. The central theme was astronomy content, with emphasis on the oceanic tides, and meaningful learning was stimulated at each stage through the exhibition of videos, slides, group discussions and written activities. The pre-test results showed that the students' prior knowledge of the theme to be studied, the oceanic tides, was inadequate for beginning the study of the phenomenon. However, after the didactic intervention, a comparison of the post-test results against the pre-test showed that the students' prior knowledge had reached conditions appropriate for understanding the event and for use in problem situations that demand such understanding. These results suggest that the Cycle of Experience should be applied frequently as a didactic sequence, since its use potentiates the construction of meaningful learning.
[Function of the present systematic evaluation in establishment of guidance for clinical practice].
Yang, Jin-Hong; Hu, Jing; Yang, Feng-Chun; Zhang, Ning; Wang, Bing; Li, Xin
2012-07-01
Treatment of insomnia with acupuncture is taken as an example to explore the significance of, and the problems existing in, current systematic evaluations for the establishment of clinical practice guidelines. Fifteen systematic evaluation articles in English and Chinese were retrieved and studied carefully, and their basic information was analyzed. Through study of the process of establishing clinical practice guidelines, the research focused on the possible contributions of these articles to the guidelines, as well as on the cautions needed when reusing them, given the problems that still exist. It is held that systematic evaluation has great significance for the establishment of guidelines with respect to applicable populations, recommended standards of diagnosis and therapeutic evaluation, extended recommendations, and methodology. Great importance should also be attached to the direct application of research results and to the interpretation of evaluation results, and the data should be rechecked when necessary. Systematic evaluation of the literature can thus play a strong guiding role in the development of clinical practice guidelines.
Efficient Robust Optimization of Metal Forming Processes using a Sequential Metamodel Based Strategy
NASA Astrophysics Data System (ADS)
Wiebenga, J. H.; Klaseboer, G.; van den Boogaard, A. H.
2011-08-01
The coupling of Finite Element (FE) simulations to mathematical optimization techniques has contributed significantly to product improvements and cost reductions in the metal forming industries. The next challenge is to bridge the gap between deterministic optimization techniques and the industrial need for robustness. This paper introduces a new and generally applicable structured methodology for modeling and solving robust optimization problems. Stochastic design variables or noise variables are taken into account explicitly in the optimization procedure. The metamodel-based strategy is combined with a sequential improvement algorithm to efficiently increase the accuracy of the objective function prediction. This is only done at regions of interest containing the optimal robust design. Application of the methodology to an industrial V-bending process resulted in valuable process insights and an improved robust process design. Moreover, a significant improvement of the robustness (>2σ) was obtained by minimizing the deteriorating effects of several noise variables. The robust optimization results demonstrate the general applicability of the robust optimization strategy and underline the importance of including uncertainty and robustness explicitly in the numerical optimization procedure.
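The sequential metamodel strategy can be sketched as follows, assuming a simple one-variable quadratic surrogate rather than the metamodel and FE simulations used in the paper; `expensive_simulation` is a hypothetical stand-in for a forming simulation:

```python
import numpy as np

def expensive_simulation(x):
    """Hypothetical stand-in for a costly FE forming simulation."""
    return (x - 0.7) ** 2 + 0.1

X = list(np.linspace(0.0, 2.0, 4))        # initial design of experiments
for _ in range(5):                        # sequential improvement iterations
    Y = [expensive_simulation(x) for x in X]
    coeffs = np.polyfit(X, Y, 2)          # fit a quadratic metamodel
    x_new = -coeffs[1] / (2 * coeffs[0])  # predicted optimum of the surrogate
    X.append(float(np.clip(x_new, 0.0, 2.0)))  # sample there and refit next pass

# The infill points concentrate around the region of interest near the optimum.
```

The key design choice highlighted above is that accuracy is improved only where it matters: each new expensive evaluation is placed at the surrogate's current predicted optimum rather than spread over the whole design space.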
Bayesian design of decision rules for failure detection
NASA Technical Reports Server (NTRS)
Chow, E. Y.; Willsky, A. S.
1984-01-01
The formulation of the decision making process of a failure detection algorithm as a Bayes sequential decision problem provides a simple conceptualization of the decision rule design problem. As the optimal Bayes rule is not computable, a methodology that is based on the Bayesian approach and aimed at a reduced computational requirement is developed for designing suboptimal rules. A numerical algorithm is constructed to facilitate the design and performance evaluation of these suboptimal rules. The result of applying this design methodology to an example shows that this approach is potentially a useful one.
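The Bayes-optimal rule is not computable, but a classical suboptimal sequential rule in the same spirit is Wald's sequential probability ratio test (SPRT). The sketch below is illustrative only, not the paper's design methodology; the residual model and thresholds are assumptions:

```python
import math

def sprt(samples, mu0=0.0, mu1=1.0, sigma=1.0,
         a=math.log(99), b=math.log(1 / 99)):
    """Decide between 'no failure' (mean mu0) and 'failure' (mean mu1)
    from Gaussian residuals, stopping as soon as a threshold is crossed."""
    llr = 0.0
    for t, x in enumerate(samples, 1):
        # Incremental log-likelihood ratio for Gaussian observations.
        llr += (mu1 - mu0) * (x - (mu0 + mu1) / 2) / sigma ** 2
        if llr >= a:
            return "failure", t
        if llr <= b:
            return "no failure", t
    return "undecided", len(samples)

# Deterministic residuals sitting exactly at the failure mean.
decision, t = sprt([1.0] * 20)
assert decision == "failure"
```

Like the suboptimal rules discussed above, the SPRT trades optimality for a trivially small computational requirement per observation, with thresholds `a` and `b` set from the tolerated error probabilities.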
Development of Contemporary Problem-Based Learning Projects in Particle Technology
ERIC Educational Resources Information Center
Harris, Andrew T.
2009-01-01
The University of Sydney has offered an undergraduate course in particle technology using a contemporary problem based learning (PBL) methodology since 2005. Student learning is developed through the solution of complex, open-ended problems drawn from modern chemical engineering practice. Two examples are presented; i) zero emission electricity…
The Study of Socio-Biospheric Problems.
ERIC Educational Resources Information Center
Scott, Andrew M.
Concepts, tools, and a methodology are needed which will permit the analysis of emergent socio-biospheric problems and facilitate their effective management. Many contemporary problems may be characterized as socio-biospheric; for example, pollution of the seas, acid rain, the growth of cities, and an atmosphere loaded with carcinogens. However,…
Atwood's Machine as a Tool to Introduce Variable Mass Systems
ERIC Educational Resources Information Center
de Sousa, Celia A.
2012-01-01
This article discusses an instructional strategy which explores eventual similarities and/or analogies between familiar problems and more sophisticated systems. In this context, the Atwood's machine problem is used to introduce students to more complex problems involving ropes and chains. The methodology proposed helps students to develop the…
Methodology of modeling and measuring computer architectures for plasma simulations
NASA Technical Reports Server (NTRS)
Wang, L. P. T.
1977-01-01
A brief introduction to plasma simulation using computers and the difficulties encountered on currently available computers is given. Through the use of an analysis and measurement methodology, SARA, the control flow and data flow of a particle simulation model, REM2-1/2D, are exemplified. After recursive refinements the total execution time may be greatly shortened and a fully parallel data flow can be obtained. From this data flow, a matched computer architecture or organization could be configured to achieve the computation bound of an application problem. A sequential type simulation model, an array/pipeline type simulation model, and a fully parallel simulation model of the code REM2-1/2D are proposed and analyzed. This methodology can be applied to other application problems which have an implicitly parallel nature.
Towards a Methodology for Identifying Program Constraints During Requirements Analysis
NASA Technical Reports Server (NTRS)
Romo, Lilly; Gates, Ann Q.; Della-Piana, Connie Kubo
1997-01-01
Requirements analysis is the activity that involves determining the needs of the customer, identifying the services that the software system should provide, and understanding the constraints on the solution. The result of this activity is a natural language document, typically referred to as the requirements definition document. The problems that exist in defining requirements in large-scale software projects include synthesizing knowledge from various domain experts and communicating this information across multiple levels of personnel. One approach that addresses part of this problem is called context monitoring and involves identifying the properties of, and relationships between, objects that the system will manipulate. This paper examines several software development methodologies, discusses the support that each provides for eliciting such information from experts and specifying the information, and suggests refinements to these methodologies.
Montero, Javier; Dib, Abraham; Guadilla, Yasmina; Flores, Javier; Santos, Juan Antonio; Aguilar, Rosa Anaya; Gómez-Polo, Cristina
2018-02-01
The aim of this study was to compare the perceived competence for treating prosthodontic patients of two samples of fourth-year dental students: those educated using traditional methodologies and those educated using problem-based learning (PBL). Two cohorts of fourth-year dental students at a dental school in Spain were surveyed: the traditional methods cohort (n=46) was comprised of all students in academic years 2012 and 2013, and the PBL cohort (n=57) was comprised of all students in academic years 2014 and 2015. Students in both cohorts reported the number of prosthodontic treatments they carried out per year and their perceived level of competence in performing such treatments. The results showed that the average number of treatments performed was similar for the two cohorts, except the number of metal-based removable partial dentures was significantly higher for students in the traditional (0.8±1.0) than the PBL (0.4±0.6) cohort. The level of perceived competence to treat complete denture patients for the combined cohorts was significantly higher (7.3±1.1) than that for partial acrylic dentures (6.7±1.5) and combined dentures (5.7±1.3). Students' clinical competence in prosthodontics mainly depended on number of treatments performed as the operator as well as the assistant. Students in the traditional methods cohort considered themselves to be significantly more competent at treating patients for removable partial and fixed prostheses (7.8±1.1 and 7.6±1.1, respectively) than did students in the PBL cohort (6.4±1.5 and 6.6±1.5, respectively). Overall, however, the study found that practical experiences were more important than the teaching method used to achieve students' perceived competence.
ERIC Educational Resources Information Center
Mosher, Paul H.
1979-01-01
Reviews the history, literature, and methodology of collection evaluation or assessment in American research libraries; discusses current problems, tools, and methodology of evaluation; and describes an ongoing collection evaluation program at the Stanford University Libraries. (Author/MBR)
NASA Astrophysics Data System (ADS)
Echer, L.; Marczak, R. J.
2018-02-01
The objective of the present work is to introduce a methodology capable of modelling welded components for structural stress analysis. The modelling technique was based on the recommendations of the International Institute of Welding; however, some geometrical features of the weld fillet were used as design parameters in an optimization problem. Namely, the weld leg length and thickness of the shell elements representing the weld fillet were optimized in such a way that the first natural frequencies were not changed significantly when compared to a reference result. Sequential linear programming was performed for T-joint structures corresponding to two different structural details: with and without full penetration weld fillets. Both structural details were tested in scenarios of various plate thicknesses and depths. Once the optimal parameters were found, a modelling procedure was proposed for T-shaped components. Furthermore, the proposed modelling technique was extended for overlapped welded joints. The results obtained were compared to well-established methodologies presented in standards and in the literature. The comparisons included results for natural frequencies, total mass and structural stress. By these comparisons, it was observed that some established practices produce significant errors in the overall stiffness and inertia. The methodology proposed herein does not share this issue and can be easily extended to other types of structure.
NASA Astrophysics Data System (ADS)
Ghosh, Sukanya; Roy, Souvanic; Sanyal, Manas Kumar
2016-09-01
With the help of a case study, this article explores current practices in the implementation of a governmental affordable housing programme for the urban poor in a slum in India. The work shows that the issues associated with such programmes must be addressed with a suitable methodology, since the complexities involve not only quantitative data but qualitative data as well. Hard System Methodologies (HSM), which are conventionally applied to such issues, deal with real, well-defined problems that can be solved directly. Since most of the issues of the affordable housing programme found in the case study are subjective and complex in nature, Soft System Methodology (SSM) was tried for a better representation of subjective points of view. The article explores the drawing of a Rich Picture as an SSM approach for better understanding and analysing the complex issues and constraints of the affordable housing programme, so that further exploration of the issues is possible.
Expert System Development Methodology (ESDM)
NASA Technical Reports Server (NTRS)
Sary, Charisse; Gilstrap, Lewey; Hull, Larry G.
1990-01-01
The Expert System Development Methodology (ESDM) provides an approach to developing expert system software. Because of the uncertainty associated with this process, an element of risk is involved. ESDM is designed to address the issue of risk and to acquire the information needed for this purpose in an evolutionary manner. ESDM presents a life cycle in which a prototype evolves through five stages of development. Each stage consists of five steps, leading to a prototype for that stage. Development may proceed to a conventional development methodology (CDM) at any time if enough has been learned about the problem to write requirements. ESDM produces requirements so that a product may be built with a CDM. ESDM is considered preliminary because it has not yet been applied to actual projects. It has been retrospectively evaluated by comparing it to the methods used in two ongoing expert system development projects that did not explicitly choose to use this methodology but which provided useful insights into actual expert system development practices and problems.
Hybrid Fourier pseudospectral/discontinuous Galerkin time-domain method for wave propagation
NASA Astrophysics Data System (ADS)
Pagán Muñoz, Raúl; Hornikx, Maarten
2017-11-01
The Fourier Pseudospectral time-domain (Fourier PSTD) method was shown to be an efficient way of modelling acoustic propagation problems as described by the linearized Euler equations (LEE), but is limited to real-valued frequency independent boundary conditions and predominantly staircase-like boundary shapes. This paper presents a hybrid approach to solve the LEE, coupling Fourier PSTD with a nodal Discontinuous Galerkin (DG) method. DG exhibits almost no restrictions with respect to geometrical complexity or boundary conditions. The aim of this novel method is to allow the computation of complex geometries and to be a step towards the implementation of frequency dependent boundary conditions by using the benefits of DG at the boundaries, while keeping the efficient Fourier PSTD in the bulk of the domain. The hybridization approach is based on conformal meshes to avoid spatial interpolation of the DG solutions when transferring values from DG to Fourier PSTD, while the data transfer from Fourier PSTD to DG is done utilizing spectral interpolation of the Fourier PSTD solutions. The accuracy of the hybrid approach is presented for one- and two-dimensional acoustic problems and the main sources of error are investigated. It is concluded that the hybrid methodology does not introduce significant errors compared to the Fourier PSTD stand-alone solver. An example of a cylinder scattering problem is presented and accurate results have been obtained when using the proposed approach. Finally, no instabilities were found during long-time calculation using the current hybrid methodology on a two-dimensional domain.
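The efficiency of the Fourier PSTD component of the hybrid scheme rests on spectral differentiation of periodic data. As a minimal sketch of that core operation (the grid size and test function are illustrative assumptions, not taken from the paper):

```python
import numpy as np

# Hedged sketch: spectral differentiation on a periodic grid, the basic
# operation behind Fourier pseudospectral (PSTD) solvers. Differentiation
# in Fourier space is multiplication by i*k.
def spectral_derivative(u, dx):
    """Return d/dx of periodic samples u via the FFT."""
    n = len(u)
    k = 2.0 * np.pi * np.fft.fftfreq(n, d=dx)   # angular wavenumbers
    return np.real(np.fft.ifft(1j * k * np.fft.fft(u)))

n = 64
x = np.linspace(0.0, 2.0 * np.pi, n, endpoint=False)
du = spectral_derivative(np.sin(x), x[1] - x[0])
# For band-limited data the result matches cos(x) to near machine precision.
err = np.max(np.abs(du - np.cos(x)))
```

This near machine-precision accuracy for smooth periodic fields is what makes Fourier PSTD attractive in the bulk of the domain, while the DG method handles the geometry and boundary conditions the spectral scheme cannot.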
The implementation of problem-based learning in health service management training programs.
Stankunas, Mindaugas; Czabanowska, Katarzyna; Avery, Mark; Kalediene, Ramune; Babich, Suzanne Marie
2016-10-03
Purpose Strengthening management capacity within the health care sector could have a significant impact on population health. However, many training programs in this area are still delivered using a classic lecture-based approach. The purpose of this paper is to evaluate and better understand the feasibility of using a problem-based learning (PBL) approach in health services management training programs. Design/methodology/approach A PBL teaching approach (based on the Maastricht University model) was tested with second-year postgraduate students from the Master in Public Health Management program at the Lithuanian University of Health Sciences. Students' opinions about PBL were investigated using a questionnaire with eight open-ended questions. Thematic content analysis was chosen to reflect the search for patterns across the data. Findings Respondents stated that the main advantage of PBL was that it was a more interesting and effective way of learning: "It is easier to remember, when you study by yourself and discuss with all peers". In addition, it was mentioned that PBL initiated a rapid exchange of ideas and sharing of personal experience. Students stressed that PBL was a good tool for developing other skills as well, such as "public speaking, communication, logic thinking". All students recommended delivering all other courses in the health services management program using PBL methodologies. Originality/value Findings from our study suggest that PBL may be an effective approach to teaching health services management. Potential problems in implementation are noted.
Udod, Sonia A; Racine, Louise
2017-12-01
Drawing on the findings of a grounded theory study aimed at exploring how power is exercised in nurse-manager relationships in the hospital setting, this paper examines the empirical and pragmatic adequacy of grounded theory as a methodology to advance the concept of empowerment in the area of nursing leadership and management. The evidence on staff nurse empowerment has highlighted the magnitude of individual and organisational outcomes, but has not fully explicated the micro-level processes underlying how power is exercised, shared or created within the nurse-manager relationship. Although grounded theory is a widely adopted nursing research methodology, it remains less used in nursing leadership research because of the dominance of quantitative approaches. Grounded theory methodology provides the empirical and pragmatic relevance to inform nursing practice and policy. Grounded theory is a relevant qualitative approach to use in leadership research, as it provides a fine and detailed analysis of the processes underlying complexity and bureaucracy. Discursive paper. A critical examination of the empirical and pragmatic relevance of grounded theory, as developed by Corbin and Strauss, as a method for analysing and solving problems in nurses' practice is provided. This paper provides evidence to support the empirical and pragmatic adequacy of grounded theory methodology. Although the application of the ontological, epistemological and methodological assumptions of grounded theory is challenging, this methodology is useful for addressing real-life problems in nursing practice by developing theoretical explanations of nurse empowerment, or the lack thereof, in the workplace. Grounded theory represents a relevant methodology to inform nursing leadership research. Grounded theory is anchored in the reality of practice.
The strength of grounded theory is to provide results that can be readily applied to clinical practice and policy as they arise from problems that affect practice and that are meaningful to nurses. © 2017 John Wiley & Sons Ltd.
Enhanced Methodologies to Enumerate Persons Experiencing Homelessness in a Large Urban Area.
Troisi, Catherine L; D'Andrea, Ritalinda; Grier, Gary; Williams, Stephen
2015-10-01
Homelessness is a public health problem, and persons experiencing homelessness are a vulnerable population. Estimates of the number of persons experiencing homelessness inform funding allocations and services planning and directly determine the ability of a community to intervene effectively in homelessness. The point-in-time (PIT) count presents a logistical problem in large urban areas, particularly those covering a vast geographical area. Working together, academia, local government, and community organizations improved the methodology for the count. Specific enhancements include use of incident command system (ICS), increased number of staging areas/teams, specialized outreach and Special Weapons and Tactics teams, and day-after surveying to collect demographic information. This collaboration and enhanced methodology resulted in a more accurate estimate of the number of persons experiencing homelessness and allowed comparison of findings for 4 years. While initial results showed an increase due to improved counting, the number of persons experiencing homelessness counted for the subsequent years showed significant decrease during the same time period as a "housing first" campaign was implemented. The collaboration also built capacity in each sector: The health department used ICS as a training opportunity; the academics enhanced their community health efforts; the service sector was taught and implemented more rigorous quantitative methods; and the community was exposed to public health as a pragmatic and effective discipline. Improvements made to increase the reliability of the PIT count can be adapted for use in other jurisdictions, leading to improved counts and better evaluation of progress in ending homelessness. © The Author(s) 2015.
Bartolazzi, Armando; Bellotti, Carlo; Sciacchitano, Salvatore
2012-01-01
In the last decade, the β-galactosyl binding protein galectin-3 has been the object of extensive molecular, structural, and functional studies aimed at clarifying its biological role in cancer. Multicenter studies have also contributed to discovering the potential clinical value of galectin-3 expression analysis in distinguishing, preoperatively, benign from malignant thyroid nodules. As a consequence, galectin-3 is receiving significant attention as a tumor marker for thyroid cancer diagnosis, but some conflicting results, mostly owing to methodological problems, have been published. The possibility of applying, preoperatively, a reliable galectin-3 test method on fine needle aspiration biopsy (FNA)-derived thyroid cells represents an important achievement. When correctly applied, the method consistently reduces the gray area of thyroid FNA cytology, helping to avoid unnecessary thyroid surgery. Although the efficacy and reliability of the galectin-3 test method have been extensively proved in several studies, its translation to the clinical setting requires well-standardized reagents and procedures. After a decade of experimental work on galectin-3-related basic and translational research projects, the major methodological problems that may potentially impair the diagnostic performance of galectin-3 immunotargeting are highlighted and discussed in detail. A standardized protocol for reliable galectin-3 expression analysis is finally provided. The aim of this contribution is to improve the clinical management of patients with thyroid nodules, promoting the preoperative use of a reliable galectin-3 test method as an ancillary technique to conventional thyroid FNA cytology. The final goal is to decrease unnecessary thyroid surgery and its related social costs.
Hospital Management Between The Modern Image And Aging
NASA Astrophysics Data System (ADS)
Dadulescu, Ana-Maria
2015-09-01
Hospital management has experienced significant progress with the evolution of the Romanian health system reform. Strides have been made in resource allocation and cost control, and new systems for classification, evaluation and monitoring (DRGs, SIUI, CaPeSaRo) have been implemented, some taken from other countries and adapted to local conditions, but not always integrated with the other components and sometimes incompletely implemented and developed. This material does not offer definitive solutions to current problems. It briefly addresses the main aspects of hospital activity and points out some failures with which hospital managers are presently faced. Once the problems are identified, prerequisites are created for solving them, opening channels for research, for the development of new methodologies, and for the correction of existing deficient workflows.
16S rRNA beacons for bacterial monitoring during human space missions.
Larios-Sanz, Maia; Kourentzi, Katerina D; Warmflash, David; Jones, Jeffrey; Pierson, Duane L; Willson, Richard C; Fox, George E
2007-04-01
Microorganisms are unavoidable in space environments and their presence has, at times, been a source of problems. Concerns about disease during human space missions are particularly important considering the significant changes the immune system incurs during spaceflight and the history of microbial contamination aboard the Mir space station. Additionally, these contaminants may have adverse effects on instrumentation and life-support systems. A sensitive, highly specific system to detect, characterize, and monitor these microbial populations is essential. Herein we describe a monitoring approach that uses 16S rRNA targeted molecular beacons to successfully detect several specific bacterial groupings. This methodology will greatly simplify in-flight monitoring by minimizing sample handling and processing. We also address and provide solutions to target accessibility problems encountered in hybridizations that target 16S rRNA.
Guina, Jeffrey; Nahhas, Ramzi W.; Goldberg, Adam J.; Farnsworth, Seth
2016-01-01
Background: Trauma is commonly associated with substance-related problems, yet associations between specific substances and specific posttraumatic stress disorder symptoms (PTSSs) are understudied. We hypothesized that substance-related problems are associated with PTSS severities, interpersonal traumas, and benzodiazepine prescriptions. Methods: Using a cross-sectional survey methodology in a consecutive sample of adult outpatients with trauma histories (n = 472), we used logistic regression to examine substance-related problems in general (primary, confirmatory analysis), as well as alcohol, tobacco, and illicit drug problems specifically (secondary, exploratory analyses) in relation to demographics, trauma type, PTSSs, and benzodiazepine prescriptions. Results: After adjusting for multiple testing, several factors were significantly associated with substance-related problems, particularly benzodiazepines (AOR = 2.78; 1.99 for alcohol, 2.42 for tobacco, 8.02 for illicit drugs), DSM-5 PTSD diagnosis (AOR = 1.92; 2.38 for alcohol, 2.00 for tobacco, 2.14 for illicit drugs), most PTSSs (especially negative beliefs, recklessness, and avoidance), and interpersonal traumas (e.g., assaults and child abuse). Conclusion: In this clinical sample, there were consistent and strong associations between several trauma-related variables and substance-related problems, consistent with our hypotheses. We discuss possible explanations and implications of these findings, which we hope will stimulate further research, and improve screening and treatment. PMID:27517964
Shomaker, Lauren B.; Furman, Wyndol
2010-01-01
This study examined how current parent-adolescent relationship qualities and adolescents’ representations of relationships with parents were related to friendship interactions in 200 adolescent-close friend dyads. Adolescents and friends were observed discussing problems during a series of structured tasks. Negative interactions with mothers were significantly related to adolescents’ greater conflict with friends, poorer focus on tasks, and poorer communication skills. Security of working models (as assessed by interview) was significantly associated with qualities of friendship interactions, whereas security of attachment styles (as assessed by questionnaire) was not. More dismissing (vs. secure) working models were associated with poorer focus on problem discussions and weaker communication skills with friends, even after accounting for gender differences and current parent-adolescent relationship qualities. We discuss possible mechanisms for the observed links between dimensions of parent-adolescent relationships and friendships. We also consider methodological and conceptual differences between working model and style measures of attachment representations. PMID:20174459
Vecchi, Veronica; Hellowell, Mark; Gatti, Stefano
2013-05-01
This paper is concerned with the cost-efficiency of Private Finance Initiatives (PFIs) in the delivery of hospital facilities in the UK. We outline a methodology for identifying the "fair" return on equity, based on the Weighted Average Cost of Capital (WACC) of each investor. We apply this method to assess the expected returns on a sample of 77 contracts signed between 1997 and 2011 by health care provider organisations in the UK. We show that expected returns are in general in excess of the WACC benchmarks. The findings highlight significant problems in current procurement practices and the methodologies by which bids are assessed. To minimise the financial impact of hospital investments on health care systems, a regulatory regime must ensure that expected returns are set at the "fair" rate. Copyright © 2012 Elsevier Ireland Ltd. All rights reserved.
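The benchmarking idea described above can be sketched in a few lines: compare an investor's expected return on equity against a "fair" level derived from the Weighted Average Cost of Capital. All figures and parameter names below are illustrative assumptions, not values from the study.

```python
# Hedged sketch of WACC-based return benchmarking. The numbers are
# hypothetical; the study's actual per-investor inputs are not reproduced.

def wacc(cost_equity, cost_debt, equity, debt, tax_rate):
    """After-tax WACC: weighted cost of equity plus tax-shielded cost of debt."""
    total = equity + debt
    return (equity / total) * cost_equity + (debt / total) * cost_debt * (1.0 - tax_rate)

# Illustrative PFI-style capital structure: highly leveraged project finance.
benchmark = wacc(cost_equity=0.12, cost_debt=0.05, equity=10.0, debt=90.0, tax_rate=0.30)
expected_irr = 0.08                    # hypothetical expected return on a contract
excess = expected_irr - benchmark      # positive => returns above the "fair" rate
```

Under these assumed inputs the benchmark works out to 4.35%, so an expected return of 8% would sit well above the "fair" rate, which is the pattern the paper reports across its sample of contracts.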
CNN based approach for activity recognition using a wrist-worn accelerometer.
Panwar, Madhuri; Dyuthi, S Ram; Chandra Prakash, K; Biswas, Dwaipayan; Acharyya, Amit; Maharatna, Koushik; Gautam, Arvind; Naik, Ganesh R
2017-07-01
In recent years, significant advancements have taken place in human activity recognition using various machine learning approaches. However, feature engineering has dominated conventional methods, involving the difficult process of optimal feature selection. This problem has been mitigated by a novel methodology based on a deep learning framework, which automatically extracts the useful features and reduces the computational cost. As a proof of concept, we have attempted to design a generalized model for the recognition of three fundamental movements of the human forearm performed in daily life, where data is collected from four different subjects using a single wrist-worn accelerometer sensor. The proposed model is validated under different pre-processing and noisy data conditions, evaluated using three possible methods. The results show that our proposed methodology achieves an average recognition rate of 99.8%, as opposed to conventional methods based on K-means clustering, linear discriminant analysis and support vector machine.
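The automatic feature extraction that replaces hand-engineered features in such a model can be illustrated with the building blocks of a 1D CNN layer. This is a hedged sketch only: the filter weights, window size, and layer count are illustrative assumptions, and the study's actual architecture is not reproduced here.

```python
import numpy as np

# Hedged sketch: one convolutional layer + ReLU + global average pooling
# over an accelerometer window, yielding one learned feature per filter.
def conv1d_valid(signal, kernel):
    """'Valid'-mode 1D cross-correlation, as computed by CNN conv layers."""
    n, k = len(signal), len(kernel)
    return np.array([np.dot(signal[i:i + k], kernel) for i in range(n - k + 1)])

def relu(x):
    return np.maximum(x, 0.0)

def feature_map(window, kernels):
    """Apply each filter, rectify, and pool to a single scalar feature."""
    return np.array([relu(conv1d_valid(window, k)).mean() for k in kernels])

window = np.sin(np.linspace(0, 4 * np.pi, 128))   # stand-in accelerometer axis
kernels = [np.array([1.0, -1.0]),                 # difference (edge-like) filter
           np.ones(5) / 5.0]                      # smoothing filter
features = feature_map(window, kernels)           # shape (2,)
```

In a trained network the kernel weights are learned from labelled movement data rather than fixed by hand, which is precisely what removes the manual feature-selection step the abstract describes.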
Clinical governance and operations management methodologies.
Davies, C; Walley, P
2000-01-01
The clinical governance mechanism, introduced since 1998 in the UK National Health Service (NHS), aims to deliver high quality care with efficient, effective and cost-effective patient services. Scally and Donaldson recognised that new approaches are needed, and operations management techniques comprise potentially powerful methodologies for understanding the process of care, which can be applied both within and across professional boundaries. This paper summarises four studies in hospital Trusts which took approaches to improving process that were different from, and less structured than, business process re-engineering (BPR). The problems were then amenable to change at a relatively low cost and over a short timescale, producing significant improvements to patient care. This less structured approach to operations management avoided incurring the overhead costs of large-scale and costly change, such as new information technology (IT) systems. The most successful changes were brought about by formal tools to control the quantity, content and timing of changes.
Childhood obesity in Asia: the value of accurate body composition methodology.
Hills, Andrew P; Mokhtar, Najat; Brownie, Sharon; Byrne, Nuala M
2014-01-01
Childhood obesity, a significant global public health problem, affects an increasing number of low- and middle-income countries, including in Asia. The obesity epidemic has been fuelled by the rapid nutrition and physical activity transition, with the availability of more energy-dense, nutrient-poor foods and lifestyles of many children dominated by physical inactivity. During the growing years, the pace and quality of growth are best quantified by a combination of anthropometric and body composition measures. However, where normative data are available, they have typically been collected on Caucasian children. To better define and characterise overweight and obesity in Asian children, and to monitor nutrition and physical activity interventions, there is a need to increase the use of standardized anthropometric and body composition methodologies. The current paper reports on initiatives facilitated by the International Atomic Energy Agency (IAEA) and outlines future research needs for the prevention and management of childhood obesity in Asia.
Alzheimer’s Disease Drug Development in 2008 and Beyond: Problems and Opportunities
Becker, Robert E.; Greig, Nigel H.
2008-01-01
Recently, a number of Alzheimer’s disease (AD) multi-center clinical trials (CT) have failed to provide statistically significant evidence of drug efficacy. To test for possible design or execution flaws we analyzed in detail CTs for two failed drugs that were strongly supported by preclinical evidence and by proven CT AD efficacy for other drugs in their class. Studies of the failed commercial trials suggest that methodological flaws may contribute to the failures and that these flaws lurk within current drug development practices ready to impact other AD drug development [1]. To identify and counter risks we considered the relevance to AD drug development of the following factors: (1) effective dosing of the drug product, (2) reliable evaluations of research subjects, (3) effective implementation of quality controls over data at research sites, (4) resources for practitioners to effectively use CT results in patient care, (5) effective disease modeling, (6) effective research designs. New drugs currently under development for AD address a variety of specific mechanistic targets. Mechanistic targets provide AD drug development opportunities to escape from many of the factors that currently undermine AD clinical pharmacology, especially the problems of inaccuracy and imprecision associated with using rated outcomes. In this paper we conclude that many of the current problems encountered in AD drug development can be avoided by changing practices. Current problems with human errors in clinical trials make it difficult to differentiate drugs that fail to evidence efficacy from apparent failures due to Type II errors. This uncertainty and the lack of publication of negative data impede researchers’ abilities to improve methodologies in clinical pharmacology and to develop a sound body of knowledge about drug actions. We consider the identification of molecular targets as offering further opportunities for overcoming current failures in drug development. PMID:18690832
A Social-Medical Approach to Violence in Colombia
Franco, Saul
2003-01-01
Violence is the main public health problem in Colombia. Many theoretical and methodological approaches to solving this problem have been attempted from different disciplines. My past work has focused on homicide violence from the perspective of social medicine. In this article I present the main conceptual and methodological aspects and the chief findings of my research over the past 15 years. Findings include a quantitative description of the current situation and the introduction of the category of explanatory contexts as a contribution to the study of Colombian violence. The complexity and severity of this problem demand greater theoretical discussion, more plans for action and a faster transition between the two. Social medicine may make a growing contribution to this field. PMID:14652328
Isolation of Extracellular Vesicles: General Methodologies and Latest Trends
Konoshenko, Maria Yu.; Laktionov, Pavel P.
2018-01-01
Background Extracellular vesicles (EVs) play an essential role in the communication between cells and transport of diagnostically significant molecules. A wide diversity of approaches utilizing different biochemical properties of EVs and a lack of accepted protocols make data interpretation very challenging. Scope of Review This review consolidates the data on the classical and state-of-the-art methods for isolation of EVs, including exosomes, highlighting the advantages and disadvantages of each method. Various characteristics of individual methods, including isolation efficiency, EV yield, properties of isolated EVs, and labor consumption are compared. Major Conclusions A mixed population of vesicles is obtained in most studies of EVs for all used isolation methods. The properties of an analyzed sample should be taken into account when planning an experiment aimed at studying and using these vesicles. The problem of adequate EVs isolation methods still remains; it might not be possible to develop a universal EV isolation method but the available protocols can be used towards solving particular types of problems. General Significance With the wide use of EVs for diagnosis and therapy of various diseases the evaluation of existing methods for EV isolation is one of the key problems in modern biology and medicine. PMID:29662902
Handbook for industrial noise control
NASA Technical Reports Server (NTRS)
1981-01-01
The basic principles of sound, measuring techniques, and instrumentation associated with general purpose noise control are discussed. Means for identifying and characterizing a noise problem so that subsequent work may provide the most efficient and cost effective solution are outlined. A methodology for choosing appropriate noise control materials and the proper implementation of control procedures is detailed. The most significant NASA sponsored contributions to the state of the art development of optimum noise control technologies are described including cases in which aeroacoustics and related research have shed some light on ways of reducing noise generation at its source.
Endovascular Neurosurgery: Personal Experience and Future Perspectives.
Raymond, Jean
2016-09-01
From Luessenhop's early clinical experience until the present day, experimental methods have been introduced to make progress in endovascular neurosurgery. A personal historical narrative, spanning the 1980s to 2010s, with a review of past opportunities, current problems, and future perspectives. Although the technology has significantly improved, our clinical culture remains a barrier to methodologically sound and safe innovative care and progress. We must learn how to safely practice endovascular neurosurgery in the presence of uncertainty and verify patient outcomes in real time. Copyright © 2016 Elsevier Inc. All rights reserved.
Adaptive simplification of complex multiscale systems.
Chiavazzo, Eliodoro; Karlin, Ilya
2011-03-01
A fully adaptive methodology is developed for reducing the complexity of large dissipative systems. This represents a significant step toward extracting essential physical knowledge from complex systems, by addressing the challenging problem of the minimal number of variables needed to exactly capture the system dynamics. An accurate reduced description is achieved by constructing a hierarchy of slow invariant manifolds, with an embarrassingly simple implementation in any dimension. The method is validated with the autoignition of the hydrogen-air mixture, where a reduction to a cascade of slow invariant manifolds is observed.
A vertebrate case study of the quality of assemblies derived from next-generation sequences
2011-01-01
The unparalleled efficiency of next-generation sequencing (NGS) has prompted widespread adoption, but significant problems remain in the use of NGS data for whole genome assembly. We explore the advantages and disadvantages of chicken genome assemblies generated using a variety of sequencing and assembly methodologies. NGS assemblies are equivalent in some ways to a Sanger-based assembly yet deficient in others. Nonetheless, these assemblies are sufficient for the identification of the majority of genes and can reveal novel sequences when compared to existing assembly references. PMID:21453517
ERIC Educational Resources Information Center
Bird, Anne Marie; Ross, Diane
1984-01-01
A brief history of research in sport psychology based on Lander's (1982) analysis is presented. A systematic approach to theory building is offered. Previous methodological inadequacies are identified using examples of observational learning and anxiety. (Author/DF)
Structural Equation Modeling of School Violence Data: Methodological Considerations
ERIC Educational Resources Information Center
Mayer, Matthew J.
2004-01-01
Methodological challenges associated with structural equation modeling (SEM) and structured means modeling (SMM) in research on school violence and related topics in the social and behavioral sciences are examined. Problems associated with multiyear implementations of large-scale surveys are discussed. Complex sample designs, part of any…
Modeling, Analyzing, and Mitigating Dissonance Between Alerting Systems
NASA Technical Reports Server (NTRS)
Song, Lixia; Kuchar, James K.
2003-01-01
Alerting systems are becoming pervasive in process operations, which may result in the potential for dissonance or conflict in information from different alerting systems that suggests different threat levels and/or actions to resolve hazards. Little is currently available to help in predicting or solving the dissonance problem. This thesis presents a methodology to model and analyze dissonance between alerting systems, providing both a theoretical foundation for understanding dissonance and a practical basis from which specific problems can be addressed. A state-space representation of multiple alerting system operation is generalized that can be tailored across a variety of applications. Based on the representation, two major causes of dissonance are identified: logic differences and sensor error. Additionally, several possible types of dissonance are identified. A mathematical analysis method is developed to identify the conditions for dissonance originating from logic differences. A probabilistic analysis methodology is developed to estimate the probability of dissonance originating from sensor error, and to compare the relative contribution to dissonance of sensor error against the contribution from logic differences. A hybrid model, which describes the dynamic behavior of the process with multiple alerting systems, is developed to identify dangerous dissonance space, from which the process can lead to disaster. Methodologies to avoid or mitigate dissonance are outlined. Two examples are used to demonstrate the application of the methodology. First, a conceptual In-Trail Spacing example is presented. The methodology is applied to identify the conditions for possible dissonance, to identify relative contribution of logic difference and sensor error, and to identify dangerous dissonance space. Several proposed mitigation methods are demonstrated in this example. 
In the second example, the methodology is applied to address the dissonance problem between two air traffic alert and avoidance systems: the existing Traffic Alert and Collision Avoidance System (TCAS) vs. the proposed Airborne Conflict Management system (ACM). Conditions on ACM resolution maneuvers are identified to avoid dynamic dissonance between TCAS and ACM. Also included in this report is an Appendix written by Lee Winder about recent and continuing work on alerting systems design. The application of Markov Decision Process (MDP) theory to complex alerting problems is discussed and illustrated with an abstract example system.
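The state-space view of dissonance described above can be illustrated with a toy example (hypothetical thresholds and alert levels, not the actual TCAS or ACM logic): two alerting logics defined over a shared state variable disagree on a subset of states, and that dissonance region can be enumerated directly.

```python
# Illustrative sketch: two threshold-based alerting logics over a shared
# 1-D state (separation distance, arbitrary units). States where the
# logics issue different outputs form the "dissonance" region.

def alert_a(sep):
    """System A: alert when separation drops below 5.0."""
    return "alert" if sep < 5.0 else "clear"

def alert_b(sep):
    """System B: alert below 3.0, advisory between 3.0 and 6.0."""
    if sep < 3.0:
        return "alert"
    if sep < 6.0:
        return "advisory"
    return "clear"

def dissonant_states(states):
    """Return the states on which the two logics disagree."""
    return [s for s in states if alert_a(s) != alert_b(s)]

states = [i * 0.5 for i in range(16)]   # separations 0.0 .. 7.5
print(dissonant_states(states))  # -> [3.0, 3.5, 4.0, 4.5, 5.0, 5.5]
```

Here the disagreement stems purely from logic differences; modeling sensor error would add a probabilistic layer on top of the same state-space comparison.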
Towards lexicographic multi-objective linear programming using grossone methodology
NASA Astrophysics Data System (ADS)
Cococcioni, Marco; Pappalardo, Massimo; Sergeyev, Yaroslav D.
2016-10-01
Lexicographic Multi-Objective Linear Programming (LMOLP) problems can be solved in two ways: preemptive and nonpreemptive. The preemptive approach requires the solution of a series of LP problems, with changing constraints (each time the next objective is added, a new constraint appears). The nonpreemptive approach is based on a scalarization of the multiple objectives into a single-objective linear function by a weighted combination of the given objectives. It requires the specification of a set of weights, which is not straightforward and can be time consuming. In this work we present both mathematical and software ingredients necessary to solve LMOLP problems using a recently introduced computational methodology (allowing one to work numerically with infinities and infinitesimals) based on the concept of grossone. The ultimate goal of such an attempt is an implementation of a simplex-like algorithm, able to solve the original LMOLP problem by solving only one single-objective problem and without the need to specify finite weights. The expected advantages are therefore obvious.
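The preemptive approach described above can be sketched with off-the-shelf LP software (a plain floating-point sketch using scipy, not the grossone-based method the paper develops): each objective is minimized in priority order, and its optimum is frozen as an equality constraint before the next objective is considered.

```python
import numpy as np
from scipy.optimize import linprog

def lex_linprog(objectives, A_ub, b_ub, bounds):
    """Preemptive lexicographic LP: minimize each objective in priority
    order, pinning earlier optima as equality constraints."""
    A_eq, b_eq = [], []
    res = None
    for c in objectives:
        res = linprog(c, A_ub=A_ub, b_ub=b_ub,
                      A_eq=np.array(A_eq) if A_eq else None,
                      b_eq=np.array(b_eq) if b_eq else None,
                      bounds=bounds, method="highs")
        A_eq.append(c)          # freeze this objective at its optimum
        b_eq.append(res.fun)
    return res.x

# Priority 1: maximize x + y; priority 2: maximize y (minimize negatives),
# subject to x + y <= 4, x, y >= 0.
x = lex_linprog([[-1.0, -1.0], [0.0, -1.0]],
                A_ub=[[1.0, 1.0]], b_ub=[4.0],
                bounds=[(0, None), (0, None)])
print(x)  # -> [0. 4.]
```

This is exactly the "series of LP problems with changing constraints" the abstract mentions; the grossone methodology aims to replace the whole cascade with a single solve.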
Data based identification and prediction of nonlinear and complex dynamical systems
NASA Astrophysics Data System (ADS)
Wang, Wen-Xu; Lai, Ying-Cheng; Grebogi, Celso
2016-07-01
The problem of reconstructing nonlinear and complex dynamical systems from measured data or time series is central to many scientific disciplines including physical, biological, computer, and social sciences, as well as engineering and economics. The classic approach to phase-space reconstruction through the methodology of delay-coordinate embedding has been practiced for more than three decades, but the paradigm is effective mostly for low-dimensional dynamical systems. Often, the methodology yields only a topological correspondence of the original system. There are situations in various fields of science and engineering where the systems of interest are complex and high dimensional with many interacting components. A complex system typically exhibits a rich variety of collective dynamics, and it is of great interest to be able to detect, classify, understand, predict, and control the dynamics using data that are becoming increasingly accessible due to the advances of modern information technology. To accomplish these goals, especially prediction and control, an accurate reconstruction of the original system is required. Nonlinear and complex systems identification aims at inferring, from data, the mathematical equations that govern the dynamical evolution and the complex interaction patterns, or topology, among the various components of the system. With successful reconstruction of the system equations and the connecting topology, it may be possible to address challenging and significant problems such as identification of causal relations among the interacting components and detection of hidden nodes. The "inverse" problem thus presents a grand challenge, requiring new paradigms beyond the traditional delay-coordinate embedding methodology. The past fifteen years have witnessed rapid development of contemporary complex graph theory with broad applications in interdisciplinary science and engineering. 
The combination of graph, information, and nonlinear dynamical systems theories with tools from statistical physics, optimization, engineering control, applied mathematics, and scientific computing enables the development of a number of paradigms to address the problem of nonlinear and complex systems reconstruction. In this Review, we describe the recent advances in this forefront and rapidly evolving field, with a focus on compressive sensing based methods. In particular, compressive sensing is a paradigm developed in recent years in applied mathematics, electrical engineering, and nonlinear physics to reconstruct sparse signals using only limited data. It has broad applications ranging from image compression/reconstruction to the analysis of large-scale sensor networks, and it has become a powerful technique to obtain high-fidelity signals for applications where sufficient observations are not available. We will describe in detail how compressive sensing can be exploited to address a diverse array of problems in data based reconstruction of nonlinear and complex networked systems. The problems include identification of chaotic systems and prediction of catastrophic bifurcations, forecasting future attractors of time-varying nonlinear systems, reconstruction of complex networks with oscillatory and evolutionary game dynamics, detection of hidden nodes, identification of chaotic elements in neuronal networks, reconstruction of complex geospatial networks and nodal positioning, and reconstruction of complex spreading networks with binary data. A number of alternative methods, such as those based on system response to external driving, synchronization, and noise-induced dynamical correlation, will also be discussed. Due to the high relevance of network reconstruction to biological sciences, a special section is devoted to a brief survey of the current methods to infer biological networks.
Finally, a number of open problems including control and controllability of complex nonlinear dynamical networks are discussed. The methods outlined in this Review are principled on various concepts in complexity science and engineering such as phase transitions, bifurcations, stabilities, and robustness. The methodologies have the potential to significantly improve our ability to understand a variety of complex dynamical systems ranging from gene regulatory systems to social networks toward the ultimate goal of controlling such systems.
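As a minimal illustration of the compressive-sensing idea surveyed in this Review, the following sketch recovers a sparse vector from under-determined linear measurements using ISTA (iterative soft-thresholding), a basic l1-minimization scheme; the sizes, seed, and regularization weight are arbitrary illustrative choices.

```python
import numpy as np

# Recover a k-sparse vector x from m < n measurements y = A x via ISTA.
rng = np.random.default_rng(0)
n, m, k = 50, 25, 3                      # signal length, measurements, sparsity
x_true = np.zeros(n)
x_true[rng.choice(n, k, replace=False)] = [1.5, -2.0, 1.0]
A = rng.standard_normal((m, n)) / np.sqrt(m)
y = A @ x_true

lam = 0.01                               # l1 regularization weight
step = 1.0 / np.linalg.norm(A, 2) ** 2   # 1/L, L = Lipschitz const of gradient
x = np.zeros(n)
for _ in range(5000):
    g = A.T @ (A @ x - y)                # gradient of 0.5 * ||A x - y||^2
    z = x - step * g
    x = np.sign(z) * np.maximum(np.abs(z) - step * lam, 0.0)  # soft threshold

print(np.linalg.norm(x - x_true) < 0.1)  # -> True (sparse signal recovered)
```

The same recovery principle underlies the network-reconstruction applications: the unknown coupling coefficients form a sparse vector, and measured time-series data supply the linear measurements.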
Song, Fujian; Loke, Yoon K; Walsh, Tanya; Glenny, Anne-Marie; Eastwood, Alison J; Altman, Douglas G
2009-04-03
To investigate basic assumptions and other methodological problems in the application of indirect comparison in systematic reviews of competing healthcare interventions. Survey of published systematic reviews. Inclusion criteria: systematic reviews published between 2000 and 2007 in which an indirect approach had been explicitly used. Identified reviews were assessed for comprehensiveness of the literature search, method for indirect comparison, and whether assumptions about similarity and consistency were explicitly mentioned. The survey included 88 review reports. In 13 reviews, indirect comparison was informal. Results from different trials were naively compared without using a common control in six reviews. Adjusted indirect comparison was usually done using classic frequentist methods (n=49) or more complex methods (n=18). The key assumption of trial similarity was explicitly mentioned in only 40 of the 88 reviews. The consistency assumption was not explicit in most cases where direct and indirect evidence were compared or combined (18/30). Evidence from head-to-head comparison trials was not systematically searched for, or not included, in nine cases. Identified methodological problems were an unclear understanding of underlying assumptions, inappropriate search and selection of relevant trials, use of inappropriate or flawed methods, lack of objective and validated methods to assess or improve trial similarity, and inadequate comparison or inappropriate combination of direct and indirect evidence. Adequate understanding of the basic assumptions underlying indirect and mixed treatment comparison is crucial to resolving these methodological problems.
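The adjusted indirect comparison the survey refers to (commonly, Bucher's method) is simple to state: the effect of A versus C is estimated through a common comparator B, and the variances of the two direct estimates add. A minimal sketch with hypothetical effect estimates:

```python
import math

def indirect_comparison(d_ab, se_ab, d_cb, se_cb):
    """Adjusted indirect comparison of A vs C through common comparator B:
    d_AC = d_AB - d_CB, with the variances of the direct estimates summing."""
    d_ac = d_ab - d_cb
    se_ac = math.sqrt(se_ab**2 + se_cb**2)
    ci = (d_ac - 1.96 * se_ac, d_ac + 1.96 * se_ac)
    return d_ac, se_ac, ci

# Hypothetical log odds ratios: A vs B = -0.5 (SE 0.2), C vs B = -0.2 (SE 0.15)
d, se, ci = indirect_comparison(-0.5, 0.2, -0.2, 0.15)
print(round(d, 2), round(se, 2))  # -> -0.3 0.25
```

Note how the indirect standard error (0.25) exceeds either direct one; the similarity assumption flagged in the survey is what licenses the subtraction in the first place.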
Global-local methodologies and their application to nonlinear analysis
NASA Technical Reports Server (NTRS)
Noor, Ahmed K.
1989-01-01
An assessment is made of the potential of different global-local analysis strategies for predicting the nonlinear and postbuckling responses of structures. Two postbuckling problems of composite panels are used as benchmarks and the application of different global-local methodologies to these benchmarks is outlined. The key elements of each of the global-local strategies are discussed and future research areas needed to realize the full potential of global-local methodologies are identified.
Jitendra, Asha K; Petersen-Brown, Shawna; Lein, Amy E; Zaslofsky, Anne F; Kunkel, Amy K; Jung, Pyung-Gang; Egan, Andrea M
2015-01-01
This study examined the quality of the research base related to strategy instruction priming the underlying mathematical problem structure for students with learning disabilities and those at risk for mathematics difficulties. We evaluated the quality of methodological rigor of 18 group research studies using the criteria proposed by Gersten et al. and 10 single case design (SCD) research studies using criteria suggested by Horner et al. and the What Works Clearinghouse. Results indicated that 14 group design studies met the criteria for high-quality or acceptable research, whereas SCD studies did not meet the standards for an evidence-based practice. Based on these findings, strategy instruction priming the mathematics problem structure is considered an evidence-based practice using only group design methodological criteria. Implications for future research and for practice are discussed. © Hammill Institute on Disabilities 2013.
NASA Astrophysics Data System (ADS)
Çakır, Süleyman
2017-10-01
In this study, a two-phase methodology for resource allocation problems under a fuzzy environment is proposed. In the first phase, the imprecise Shannon's entropy method and the acceptability index are suggested, for the first time in the literature, to select input and output variables to be used in the data envelopment analysis (DEA) application. In the second phase, an interval inverse DEA model is executed for resource allocation in the short run. In an effort to exemplify the practicality of the proposed fuzzy model, a real case application has been conducted involving 16 cement firms listed in Borsa Istanbul. The results of the case application indicated that the proposed hybrid model is a viable procedure to handle input-output selection and resource allocation problems under fuzzy conditions. The presented methodology can also lend itself to different applications such as multi-criteria decision-making problems.
ERIC Educational Resources Information Center
Lee, Young-Jin
2017-01-01
Purpose: The purpose of this paper is to develop a quantitative model of problem solving performance of students in the computer-based mathematics learning environment. Design/methodology/approach: Regularized logistic regression was used to create a quantitative model of problem solving performance of students that predicts whether students can…
Integrative Problem-Centered Therapy: Toward the Synthesis of Family and Individual Psychotherapies.
ERIC Educational Resources Information Center
Pinsof, William M.
1983-01-01
Presents an overview of the Integrative Problem-Centered Therapy (IPCT) Model, and describes its core principles and premises, and basic methodological steps. The IPCT provides a technique for applying individual and family therapy and behavioral, communicational, and psychodynamic orientations to client problems. Its goal is to create efficient…
Use of Problem-Based Learning in the Teaching and Learning of Horticultural Production
ERIC Educational Resources Information Center
Abbey, Lord; Dowsett, Eric; Sullivan, Jan
2017-01-01
Purpose: Problem-based learning (PBL), a relatively novel teaching and learning process in horticulture, was investigated. Proper application of PBL can potentially create a learning context that enhances student learning. Design/Methodology/Approach: Students worked on two complex ill-structured problems: (1) to produce fresh baby greens for a…
The Problem-Solving Approach of Environmental Education.
ERIC Educational Resources Information Center
Connect, 1983
1983-01-01
The problem-solving approach in environmental education (EE), reports on EE programs and activities in selected foreign countries, and a report on the Asian Subregional Workshop on Teacher Training in EE are provided in this newsletter. The nature of the problem-solving approach and brief discussions of such methodologies as group discussion,…
Adaptive finite element methods for two-dimensional problems in computational fracture mechanics
NASA Technical Reports Server (NTRS)
Min, J. B.; Bass, J. M.; Spradley, L. W.
1994-01-01
Some recent results obtained using solution-adaptive finite element methods in two-dimensional problems in linear elastic fracture mechanics are presented. The focus is on the basic issues of adaptive finite element methods; the new methodology is validated by computing demonstration problems and comparing the resulting stress intensity factors to analytical results.
Problem Solving Frameworks for Mathematics and Software Development
ERIC Educational Resources Information Center
McMaster, Kirby; Sambasivam, Samuel; Blake, Ashley
2012-01-01
In this research, we examine how problem solving frameworks differ between Mathematics and Software Development. Our methodology is based on the assumption that the words used frequently in a book indicate the mental framework of the author. We compared word frequencies in a sample of 139 books that discuss problem solving. The books were grouped…
Backtrack Programming: A Computer-Based Approach to Group Problem Solving.
ERIC Educational Resources Information Center
Scott, Michael D.; Bodaken, Edward M.
Backtrack problem-solving appears to be a viable alternative to current problem-solving methodologies. It appears to have considerable heuristic potential as a conceptual and operational framework for small group communication research, as well as functional utility for the student group in the small group class or the management team in the…
How to Arrive at Good Research Questions?
ERIC Educational Resources Information Center
Gafoor, K. Abdul
2008-01-01
Identifying an area of research, a topic, deciding on a problem, and formulating it into a researchable question are very difficult stages in the whole research process, at least for beginners. Few books on research methodology elaborate the various processes involved in problem selection and clarification. Viewing research and problem selection as…
Music Education Preservice Teachers' Confidence in Resolving Behavior Problems
ERIC Educational Resources Information Center
Hedden, Debra G.
2015-01-01
The purpose of this study was to investigate whether there would be a change in preservice teachers' (a) confidence concerning the resolution of behavior problems, (b) tactics for resolving them, (c) anticipation of problems, (d) fears about management issues, and (e) confidence in methodology and pedagogy over the time period of a one-semester…
Addict Life Stories: An Exploration of the Methodological Grounds for the Study of Social Problems.
ERIC Educational Resources Information Center
Kaplan, Charles D.
1982-01-01
Explores the use of sociological life histories to study social problems such as drug addiction. The factors influencing the fluctuating popularity of this research technique within the social sciences are examined. The impact of the researcher's direct exposure to the interviewee's problems on research results is discussed. (AM)
Pre-Service and In-Service Teachers' Metacognitive Knowledge about Problem-Solving Strategies
ERIC Educational Resources Information Center
Metallidou, Panayiota
2009-01-01
The present study is based on the methodology of Antonietti, A., Ignazi, S., & Perego, P. (2000). Metacognitive knowledge about problem-solving methods. "British Journal of Educational Psychology, 70", 1-16, with the aim of examining primary school teachers' metacognitive knowledge about problem-solving strategies. A sample of 338 in-service (172) and…
Wind Tunnel to Atmospheric Mapping for Static Aeroelastic Scaling
NASA Technical Reports Server (NTRS)
Heeg, Jennifer; Spain, Charles V.; Rivera, J. A.
2004-01-01
Wind tunnel to Atmospheric Mapping (WAM) is a methodology for scaling and testing a static aeroelastic wind tunnel model. The WAM procedure employs scaling laws to define a wind tunnel model and wind tunnel test points such that the static aeroelastic flight test data and wind tunnel data will be correlated throughout the test envelopes. This methodology extends the notion that a single test condition - combination of Mach number and dynamic pressure - can be matched by wind tunnel data. The primary requirements for effecting this extension are matching flight Mach numbers, maintaining a constant dynamic pressure scale factor, and setting the dynamic pressure scale factor in accordance with the stiffness scale factor. The scaling is enabled by capabilities of the NASA Langley Transonic Dynamics Tunnel (TDT) and by relaxation of scaling requirements present in the dynamic problem that are not critical to the static aeroelastic problem. The methodology is exercised in two example scaling problems: an arbitrarily scaled wing and a practical application to the scaling of the Active Aeroelastic Wing flight vehicle for testing in the TDT.
Philip, Bobby; Berrill, Mark A.; Allu, Srikanth; ...
2015-01-26
We describe an efficient and nonlinearly consistent parallel solution methodology for solving coupled nonlinear thermal transport problems that occur in nuclear reactor applications over hundreds of individual 3D physical subdomains. Efficiency is obtained by leveraging knowledge of the physical domains, the physics on individual domains, and the couplings between them for preconditioning within a Jacobian-Free Newton Krylov method. Details of the computational infrastructure that enabled this work, namely the open source Advanced Multi-Physics (AMP) package developed by the authors, are described. The details of verification and validation experiments, and parallel performance analysis in weak and strong scaling studies demonstrating the achieved efficiency of the algorithm, are presented. Moreover, numerical experiments demonstrate that the preconditioner developed is independent of the number of fuel subdomains in a fuel rod, which is particularly important when simulating different types of fuel rods. Finally, we demonstrate the power of the coupling methodology by considering problems with couplings between surface and volume physics and coupling of nonlinear thermal transport in fuel rods to an external radiation transport code.
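The Jacobian-Free Newton Krylov idea at the core of the solver can be illustrated on a much smaller problem (a single 1-D nonlinear heat equation, not the AMP multi-domain setting), using scipy's matrix-free implementation: only the residual is supplied, and Jacobian-vector products are approximated internally by finite differences.

```python
import numpy as np
from scipy.optimize import newton_krylov

# Steady 1-D nonlinear problem  u'' = u**3 - 1  with u(0) = u(1) = 0,
# discretized on a uniform grid. No Jacobian is ever formed.
n = 50
h = 1.0 / (n + 1)

def residual(u):
    upad = np.concatenate(([0.0], u, [0.0]))        # Dirichlet boundaries
    lap = (upad[:-2] - 2 * upad[1:-1] + upad[2:]) / h**2
    return lap - u**3 + 1.0

u = newton_krylov(residual, np.zeros(n), f_tol=1e-9)
print(np.max(np.abs(residual(u))) < 1e-7)  # -> True
```

The paper's contribution sits one level up: a physics-aware preconditioner for the inner Krylov solves, exploiting the subdomain structure so that cost stays independent of the number of fuel subdomains.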
Suzuki, Atsuko
2004-06-01
A review of the cross-cultural research on gender in psychology since 1990 reveals (1) conceptual confusion of the definitions of sex, gender, man, and woman; (2) diversification, refinement, reification, and a problem-solving orientation in the research topics; and (3) the possibility of the elucidation of the psychological sex-difference mechanism in relation to the biological sex differences. A comparison of 1990 and 2000 cross-cultural psychological articles published in "Sex Roles" found that overall, the research is Western-centered and some methodological problems remain to be solved concerning the measures and the sampling. These findings lead to the following suggestions for cross-cultural research on gender to resolve the problems and contribute to the development of psychology in general: (1) use of an operational definition for conceptual equivalence; (2) conducting more etic-approach research; (3) avoiding ethnocentric or androcentric research attitudes; (4) use of a theoretical framework; (5) strict examination of methodologies; and (6) examination of the specific context of participants in terms of cultural diversity, dynamics of husband-wife relationships, and relationships with husbands and fathers.
Physiology-based face recognition in the thermal infrared spectrum.
Buddharaju, Pradeep; Pavlidis, Ioannis T; Tsiamyrtzis, Panagiotis; Bazakos, Mike
2007-04-01
The current dominant approaches to face recognition rely on facial characteristics that are on or over the skin. Some of these characteristics have low permanency, can be altered, and their phenomenology varies significantly with environmental factors (e.g., lighting). Many methodologies have been developed to address these problems to various degrees. However, the current framework of face recognition research has a potential weakness due to its very nature. We present a novel framework for face recognition based on physiological information. The motivation behind this effort is to capitalize on the permanency of innate characteristics that are under the skin. To establish feasibility, we propose a specific methodology to capture facial physiological patterns using the bioheat information contained in thermal imagery. First, the algorithm delineates the human face from the background using the Bayesian framework. Then, it localizes the superficial blood vessel network using image morphology. The extracted vascular network produces contour shapes that are characteristic of each individual. The branching points of the skeletonized vascular network are referred to as Thermal Minutia Points (TMPs) and constitute the feature database. To render the method robust to facial pose variations, we collect five different pose images for each subject stored in the database (center, mid-left profile, left profile, mid-right profile, and right profile). During the classification stage, the algorithm first estimates the pose of the test image. Then, it matches the local and global TMP structures extracted from the test image with those of the corresponding pose images in the database. We have conducted experiments on a multipose database of thermal facial images collected in our laboratory, as well as on the time-gap database of the University of Notre Dame.
The experimental results show that the proposed methodology has merit, especially with respect to the problem of low permanence over time. More importantly, the results demonstrate the feasibility of the physiological framework in face recognition and open the way for further methodological and experimental research in the area.
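The TMP extraction step has a simple core: on a skeletonized binary vessel map, branching points are skeleton pixels with three or more skeleton neighbors in the 8-neighborhood. A toy numpy sketch (illustrative, not the paper's implementation):

```python
import numpy as np

def branching_points(skel):
    """Return a boolean mask of branching points of a 0/1 skeleton:
    skeleton pixels with >= 3 skeleton neighbors (8-connectivity)."""
    s = np.pad(skel, 1)
    # Sum of the 8 neighbors of every pixel (zero padding makes the
    # wrap-around of np.roll harmless at the borders).
    nbrs = sum(np.roll(np.roll(s, dy, 0), dx, 1)
               for dy in (-1, 0, 1) for dx in (-1, 0, 1)
               if (dy, dx) != (0, 0))[1:-1, 1:-1]
    return (skel == 1) & (nbrs >= 3)

# A small Y-shaped skeleton: the junction pixel is the only branching point.
skel = np.array([
    [1, 0, 0, 0, 1],
    [0, 1, 0, 1, 0],
    [0, 0, 1, 0, 0],
    [0, 0, 1, 0, 0],
    [0, 0, 1, 0, 0],
])
print(np.argwhere(branching_points(skel)))  # -> [[2 2]]
```

In the full pipeline these points would be extracted from the morphologically skeletonized vascular network and then matched, fingerprint-minutia style, against the stored pose images.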
NASA Astrophysics Data System (ADS)
Fekete, Tamás
2018-05-01
Structural integrity calculations play a crucial role in designing large-scale pressure vessels. Used in the electric power generation industry, these kinds of vessels undergo extensive safety analyses and certification procedures before being deemed feasible for future long-term operation. The calculations are nowadays directed and supported by international standards and guides based on state-of-the-art results of applied research and technical development. However, their ability to predict a vessel's behavior under accidental circumstances after long-term operation is largely limited by the strong dependence of the analysis methodology on empirical models that are correlated to the behavior of structural materials and their changes during material aging. Recently, a new scientific engineering paradigm, structural integrity, has been developing that is essentially a synergistic collaboration between a number of scientific and engineering disciplines, modeling, experiments and numerics. Although the application of the structural integrity paradigm has contributed greatly to improving the accuracy of safety evaluations of large-scale pressure vessels, the predictive power of the analysis methodology has not yet improved significantly. This is because existing structural integrity calculation methodologies are based on the widespread and commonly accepted 'traditional' engineering thermal stress approach, which is essentially based on the weakly coupled model of thermomechanics and fracture mechanics. Recently, research was initiated at MTA EK with the aim of reviewing and evaluating current methodologies and models applied in structural integrity calculations, including their scope of validity.
The research intends to come to a better understanding of the physical problems that are inherently present in the pool of structural integrity problems of reactor pressure vessels, and to ultimately find a theoretical framework that could serve as a well-grounded theoretical foundation for a new modeling framework of structural integrity. This paper presents the first findings of the research project.
Ensemble-based data assimilation and optimal sensor placement for scalar source reconstruction
NASA Astrophysics Data System (ADS)
Mons, Vincent; Wang, Qi; Zaki, Tamer
2017-11-01
Reconstructing the characteristics of a scalar source from limited remote measurements in a turbulent flow is a problem of great interest for environmental monitoring, and is challenging due to several aspects. Firstly, the numerical estimation of the scalar dispersion in a turbulent flow requires significant computational resources. Secondly, in actual practice, only a limited number of observations are available, which generally makes the corresponding inverse problem ill-posed. Ensemble-based variational data assimilation techniques are adopted to solve the problem of scalar source localization in a turbulent channel flow at Reτ = 180 . This approach combines the components of variational data assimilation and ensemble Kalman filtering, and inherits the robustness from the former and the ease of implementation from the latter. An ensemble-based methodology for optimal sensor placement is also proposed in order to improve the condition of the inverse problem, which enhances the performances of the data assimilation scheme. This work has been partially funded by the Office of Naval Research (Grant N00014-16-1-2542) and by the National Science Foundation (Grant 1461870).
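The ensemble Kalman component of the scheme described above can be sketched in a few lines (a generic stochastic EnKF analysis step with illustrative numbers, not the authors' ensemble-variational implementation): the forecast ensemble supplies the covariances, and perturbed observations keep the analysis spread statistically consistent.

```python
import numpy as np

rng = np.random.default_rng(1)

def enkf_update(X, y, H, r):
    """Stochastic EnKF analysis step.
    X: (n_state, n_ens) forecast ensemble; y: observation value;
    H: (n_obs, n_state) observation operator; r: obs error variance."""
    n = X.shape[1]
    A = X - X.mean(axis=1, keepdims=True)         # state anomalies
    S = H @ A                                      # observation-space anomalies
    # Kalman gain K = P H^T (H P H^T + R)^-1 with P = A A^T / (n - 1)
    K = (A @ S.T) @ np.linalg.inv(S @ S.T + (n - 1) * r * np.eye(S.shape[0]))
    yp = y + rng.normal(0.0, np.sqrt(r), size=(S.shape[0], n))  # perturbed obs
    return X + K @ (yp - H @ X)

X = rng.normal(0.0, 1.0, size=(3, 200))            # 3 states, 200 members
H = np.array([[1.0, 0.0, 0.0]])                    # observe first component
Xa = enkf_update(X, 2.0, H, r=0.25)
# The analysis mean of the observed component moves toward the observation.
print(abs(Xa[0].mean() - 2.0) < abs(X[0].mean() - 2.0))  # -> True
```

In the source-reconstruction setting, the state vector would hold the unknown source parameters, and the sensor-placement step shapes H so that this update is as well conditioned as possible.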
Qualitative Research in PBL in Health Sciences Education: A Review
ERIC Educational Resources Information Center
Jin, Jun; Bridges, Susan
2016-01-01
Context: Qualitative methodologies are relatively new in health sciences education research, especially in the area of problem-based learning (PBL). A key advantage of qualitative approaches is the ability to gain in-depth, textured insights into educational phenomena. Key methodological issues arise, however, in terms of the strategies of…
Assessing Personality and Mood With Adjective Check List Methodology: A Review
ERIC Educational Resources Information Center
Craig, Robert J.
2005-01-01
This article addresses the benefits and problems in using adjective check list methodology to assess personality. Recent developments in this assessment method are reviewed, emphasizing seminal adjective-based personality tests (Gough's Adjective Check List), mood tests (Lubin's Depressive Adjective Test, Multiple Affect Adjective Check List),…
Interpreting Qualitative Data: A Methodological Inquiry.
ERIC Educational Resources Information Center
Newman, Isadore; MacDonald, Suzanne
The methodology of interpretation of qualitative data was explored using a grounded theory approach to the synthesis of data, examining the construction of categories in particular. The focus is on ways of organizing data and attaching meaning, as research problems embedded in cultural context are explored. A qualitative research training task…
Extension Agents and Conflict Narratives: A Case of Laikipia County, Kenya
ERIC Educational Resources Information Center
Bond, Jennifer
2016-01-01
Purpose: This work investigated the narratives of development extensionists in relation to natural resource conflict, in order to understand the competing discourses surrounding the wicked problems of natural resource management in Laikipia County, Kenya. Methodology: Q methodology was used to elicit the conflict narratives present among extension…
Home/Work: Engaging the Methodological Dilemmas and Possibilities of Intimate Inquiry
ERIC Educational Resources Information Center
Laura, Crystal T.
2010-01-01
The paucity of solutions to the persistent problem of youth entanglement with the school-to-prison pipeline demands that educational researchers experiment with research differently. In this methodological article, I briefly sketch the beginnings of an "intimate" approach to educational inquiry that researchers can use to connect with…
Training Evaluation: An Analysis of the Stakeholders' Evaluation Needs
ERIC Educational Resources Information Center
Guerci, Marco; Vinante, Marco
2011-01-01
Purpose: In recent years, the literature on program evaluation has examined multi-stakeholder evaluation, but training evaluation models and practices have not generally taken this problem into account. The aim of this paper is to fill this gap. Design/methodology/approach: This study identifies intersections between methodologies and approaches…
Chalmers, Charlotte; Leathem, Janet; Bennett, Simon; McNaughton, Harry; Mahawish, Karim
2017-11-26
To investigate the efficacy of problem solving therapy for reducing the emotional distress experienced by younger stroke survivors. A non-randomized waitlist controlled design was used to compare outcome measures for the treatment group and a waitlist control group at baseline and post-waitlist/post-therapy. After the waitlist group received problem solving therapy, an analysis was completed on the pooled outcome measures at baseline, post-treatment, and three-month follow-up. Changes on outcome measures between baseline and post-treatment were not significantly different between the treatment group (n = 13) and the waitlist control group (n = 16) (between-subject design). The pooled data (n = 28) indicated that receiving problem solving therapy significantly reduced participants' levels of depression and anxiety and increased quality of life from baseline to follow-up (within-subject design); however, methodological limitations, such as the lack of a control group, reduce the validity of this finding. The between-subject results suggest that there was no significant difference between those who received problem solving therapy and a waitlist control group between baseline and post-waitlist/post-therapy. The within-subject design suggests that problem solving therapy may be beneficial for younger stroke survivors when they are given some time to learn and implement the skills in their day-to-day lives. However, additional research with a control group is required to investigate this further. This study provides limited evidence for the provision of support groups for younger stroke survivors post stroke; however, it remains unclear what type of support this should be. Implications for Rehabilitation: Problem solving therapy is no more effective for reducing post-stroke distress than a wait-list control group. Problem solving therapy may be perceived as helpful and enjoyable by younger stroke survivors.
Younger stroke survivors may use the skills learnt from problem solving therapy to solve problems in their day to day lives. Younger stroke survivors may benefit from age appropriate psychological support; however, future research is needed to determine what type of support this should be.
Norberg, Melissa M; Ham, Lindsay S; Olivier, Jake; Zamboanga, Byron L; Melkonian, Alexander; Fugitt, Jessica L
2016-07-02
Pregaming is a high-risk drinking behavior associated with increased alcohol consumption and alcohol-related problems. Quantity of alcohol consumed does not fully explain the level of problems associated with pregaming; yet, limited research has examined factors that may interact with pregaming behavior to contribute to the experience of alcohol-related problems. The current study examined whether the use of two emotion regulation strategies influences pregaming's contribution to alcohol-related problems. Undergraduates (N = 1857) aged 18-25 years attending 19 different colleges completed an online survey in 2008-2009. Linear mixed models were used to test whether emotion regulation strategies moderate the association between pregaming status (pregamers vs. non/infrequent pregamers) and alcohol-related problems, controlling for alcohol consumption, demographic covariates, and site as a random effect. Greater use of cognitive reappraisal was associated with fewer alcohol problems. Expressive suppression interacted with pregaming status: there was no relationship between pregaming status and alcohol problems for students who rarely used expressive suppression, whereas the relationship was statistically significant for students who occasionally to frequently used expressive suppression. Findings suggest that the relationship between pregaming and alcohol-related problems is complex. Accordingly, future studies should utilize event-level methodology to understand how emotion regulation strategies influence alcohol-related problems. Further, clinicians should tailor alcohol treatments to help students increase their use of cognitive reappraisal and decrease their use of suppression.
An approach to solve replacement problems under intuitionistic fuzzy nature
NASA Astrophysics Data System (ADS)
Balaganesan, M.; Ganesan, K.
2018-04-01
Due to the imprecision inherent in day-to-day problems, researchers use fuzzy sets in their discussions of replacement problems. The aim of this paper is to solve replacement theory problems with triangular intuitionistic fuzzy numbers. An effective methodology based on a fuzziness index and a location index is proposed to determine the optimal solution of the replacement problem. A numerical example is presented to validate the proposed method.
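The flavor of such a ranking can be sketched briefly. The `location_index` and `fuzziness_index` definitions below are common illustrative choices, not necessarily the paper's exact formulation, and the maintenance-cost numbers are hypothetical.

```python
from dataclasses import dataclass

@dataclass
class TriIFN:
    """Triangular intuitionistic fuzzy number: (a1, a2, a3) is the membership
    triangle and (b1, a2, b3) the non-membership triangle, b1 <= a1, b3 >= a3."""
    a1: float
    a2: float
    a3: float
    b1: float
    b3: float

    def location_index(self):
        return self.a2  # the modal (most plausible) value

    def fuzziness_index(self):
        # Average half-spread of the two triangles
        return ((self.a3 - self.a1) + (self.b3 - self.b1)) / 4.0

def defuzzify(x, k=0.5):
    """Crisp score: location index penalized by fuzziness, k in [0, 1]."""
    return x.location_index() - k * x.fuzziness_index()

# Hypothetical fuzzy annual maintenance costs for years 1-3; a replacement
# rule would compare these crisp scores against the cost of a new machine.
costs = [TriIFN(8, 10, 12, 7, 13), TriIFN(12, 15, 18, 10, 20), TriIFN(18, 22, 26, 16, 28)]
crisp = [defuzzify(c) for c in costs]
```

Once each fuzzy cost is reduced to a crisp score, the classical replacement-age computation proceeds as in the deterministic theory.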
Results of the Software Process Improvement Efforts of the Early Adopters in NAVAIR 4.0
2007-12-01
and customer satisfaction. AIRSpeed utilizes a structured problem-solving methodology called DMAIC (Define, Measure, Analyze, Improve, Control), widely used in business. DMAIC leads project teams through the logical steps from problem definition to problem resolution. Each phase has a specific set…
NASA Technical Reports Server (NTRS)
Newman, P. A.; Hou, G. J.-W.; Jones, H. E.; Taylor, A. C., III; Korivi, V. M.
1992-01-01
How a combination of various computational methodologies could reduce the enormous computational costs envisioned in using advanced CFD codes in gradient-based optimized multidisciplinary design (MdD) procedures is briefly outlined. Implications of these MdD requirements upon advanced CFD codes are somewhat different from those imposed by single-discipline design. A means for satisfying these MdD requirements for gradient information is presented which appears to permit: (1) some leeway in the CFD solution algorithms which can be used; (2) an extension to 3-D problems; and (3) straightforward use of other computational methodologies. Many of these observations have previously been discussed as possibilities for doing parts of the problem more efficiently; the contribution here is observing how they fit together in a mutually beneficial way.
System testing of a production Ada (trademark) project: The GRODY study
NASA Technical Reports Server (NTRS)
Seigle, Jeffrey; Esker, Linda; Shi, Ying-Liang
1990-01-01
The use of the Ada language and design methodologies that utilize its features has a strong impact on all phases of the software development project lifecycle. At the National Aeronautics and Space Administration/Goddard Space Flight Center (NASA/GSFC), the Software Engineering Laboratory (SEL) conducted an experiment in parallel development of two flight dynamics systems in FORTRAN and Ada. The teams found some qualitative differences between the system test phases of the two projects. Although planning for system testing and conducting of tests were not generally affected by the use of Ada, the solving of problems found in system testing was generally facilitated by Ada constructs and design methodology. Most problems found in system testing were not due to difficulty with the language or methodology but to lack of experience with the application.
Optimal use of human and machine resources for Space Station assembly operations
NASA Technical Reports Server (NTRS)
Parrish, Joseph C.
1988-01-01
This paper investigates the issues involved in determining the best mix of human and machine resources for assembly of the Space Station. It presents the current Station assembly sequence, along with descriptions of the available assembly resources. A number of methodologies for optimizing the human/machine tradeoff have been developed, but Space Station assembly poses some unique issues that have not yet been addressed. These include a strong constraint on available EVA time for early flights and a phased deployment of assembly resources over time. A methodology for applying the previously developed decision methods to the special case of the Space Station is presented. This methodology emphasizes the application of multiple qualitative and quantitative techniques, including simulation and decision analysis, to produce an objective, robust solution to the tradeoff problem.
Leijten, Fenna R M; van den Heuvel, Swenne G; Ybema, Jan Fekke; van der Beek, Allard J; Robroek, Suzan J W; Burdorf, Alex
2014-09-01
This study aimed to assess the influence of chronic health problems on work ability and productivity at work among older employees, using different methodological approaches in the analysis of longitudinal studies. Data from employees, aged 45-64, of the longitudinal Study on Transitions in Employment, Ability and Motivation were used (N=8411). Using three annual online questionnaires, we assessed the presence of seven chronic health problems, work ability (scale 0-10), and productivity at work (scale 0-10). Three linear regression generalized estimating equations were used. The time-lag model analyzed the relation of health problems with work ability and productivity at work after one year; the autoregressive model adjusted for work ability and productivity in the preceding year; and the third model assessed the relation of incidence and recovery with changes in work ability and productivity at work within the same year. Workers with health problems had lower work ability at one-year follow-up than workers without these health problems, varying from a 2.0% reduction with diabetes mellitus to a 9.5% reduction with psychological health problems relative to the overall mean (time-lag model). Work ability of persons with health problems decreased slightly more during one-year follow-up than that of persons without these health problems, ranging from 1.4% with circulatory to 5.9% with psychological health problems (autoregressive model). The incidence of health problems was related to larger decreases in work ability (from 0.6% with diabetes mellitus to 19.0% with psychological health problems) than recovery was related to increases (from a 1.8% decrease with circulatory to an 8.5% increase with psychological health problems) (incidence-recovery model). Only workers with musculoskeletal and psychological health problems had lower productivity at work at one-year follow-up than workers without those health problems (1.2% and 5.6%, respectively; time-lag model).
All methodological approaches indicated that chronic health problems were associated with decreased work ability and, to a much lesser extent, lower productivity at work. The choice for a particular methodological approach considerably influenced the strength of the associations, with the incidence of health problems resulting in the largest decreases in work ability and productivity at work.
Markham, Francis; Young, Martin; Doran, Bruce; Sugden, Mark
2017-05-23
Many jurisdictions regularly conduct surveys to estimate the prevalence of problem gambling in their adult populations. However, the comparison of such estimates is problematic due to methodological variations between studies. Total consumption theory suggests that an association between mean electronic gaming machine (EGM) and casino gambling losses and problem gambling prevalence estimates may exist. If this is the case, then changes in EGM losses may be used as a proxy indicator for changes in problem gambling prevalence. To test for this association this study examines the relationship between aggregated losses on electronic gaming machines (EGMs) and problem gambling prevalence estimates for Australian states and territories between 1994 and 2016. A Bayesian meta-regression analysis of 41 cross-sectional problem gambling prevalence estimates was undertaken using EGM gambling losses, year of survey and methodological variations as predictor variables. General population studies of adults in Australian states and territory published before 1 July 2016 were considered in scope. 41 studies were identified, with a total of 267,367 participants. Problem gambling prevalence, moderate-risk problem gambling prevalence, problem gambling screen, administration mode and frequency threshold were extracted from surveys. Administrative data on EGM and casino gambling loss data were extracted from government reports and expressed as the proportion of household disposable income lost. Money lost on EGMs is correlated with problem gambling prevalence. An increase of 1% of household disposable income lost on EGMs and in casinos was associated with problem gambling prevalence estimates that were 1.33 times higher [95% credible interval 1.04, 1.71]. There was no clear association between EGM losses and moderate-risk problem gambling prevalence estimates. Moderate-risk problem gambling prevalence estimates were not explained by the models (I 2 ≥ 0.97; R 2 ≤ 0.01). 
The present study adds to the weight of evidence that EGM losses are associated with the prevalence of problem gambling. No patterns were evident among moderate-risk problem gambling prevalence estimates, suggesting that this measure is either subject to pronounced measurement error or lacks construct validity. The high degree of residual heterogeneity raises questions about the validity of comparing problem gambling prevalence estimates, even after adjusting for methodological variations between studies.
NASA Technical Reports Server (NTRS)
Taylor, Arthur C., III; Hou, Gene W.
1992-01-01
Fundamental equations of aerodynamic sensitivity analysis and approximate analysis for the two dimensional thin layer Navier-Stokes equations are reviewed, and special boundary condition considerations necessary to apply these equations to isolated lifting airfoils on 'C' and 'O' meshes are discussed in detail. An efficient strategy which is based on the finite element method and an elastic membrane representation of the computational domain is successfully tested, which circumvents the costly 'brute force' method of obtaining grid sensitivity derivatives, and is also useful in mesh regeneration. The issue of turbulence modeling is addressed in a preliminary study. Aerodynamic shape sensitivity derivatives are efficiently calculated, and their accuracy is validated on two viscous test problems, including: (1) internal flow through a double throat nozzle, and (2) external flow over a NACA 4-digit airfoil. An automated aerodynamic design optimization strategy is outlined which includes the use of a design optimization program, an aerodynamic flow analysis code, an aerodynamic sensitivity and approximate analysis code, and a mesh regeneration and grid sensitivity analysis code. Application of the optimization methodology to the two test problems in each case resulted in a new design having a significantly improved performance in the aerodynamic response of interest.
NASA Astrophysics Data System (ADS)
Reem, Daniel; De Pierro, Alvaro
2017-04-01
Many problems in science and engineering involve, as part of their solution process, the minimization of a separable function which is the sum of two convex functions, one of them possibly non-smooth. Recently a few works have discussed inexact versions of several accelerated proximal methods aiming at solving this minimization problem. This paper shows that inexact versions of a method of Beck and Teboulle (the fast iterative shrinkage-thresholding algorithm, FISTA) preserve, in a Hilbert space setting, the same (non-asymptotic) rate of convergence under certain assumptions on the decay rate of the error terms. The notion of inexactness discussed here seems to be rather simple but, interestingly, when comparing to related works, closely related decay rates of the error terms yield closely related convergence rates. The derivation sheds some light on the somewhat mysterious origin of some parameters which appear in various accelerated methods. A consequence of the analysis is that the accelerated method is perturbation resilient, making it suitable, in principle, for the superiorization methodology. By taking this into account, we re-examine the superiorization methodology and significantly extend its scope. This work was supported by FAPESP 2013/19504-9. The second author was also supported by CNPq grant 306030/2014-4.
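For reference, the exact (error-free) version of the Beck-Teboulle method on a LASSO-type problem looks as follows; the inexact variants analyzed in the paper would perturb the gradient and/or proximal steps. This is a standard finite-dimensional textbook sketch, not the paper's Hilbert-space analysis.

```python
import numpy as np

def soft_threshold(v, t):
    """Proximal operator of t * ||.||_1 (soft thresholding)."""
    return np.sign(v) * np.maximum(np.abs(v) - t, 0.0)

def fista(A, b, lam, n_iter=500):
    """Minimize 0.5*||Ax - b||^2 + lam*||x||_1 with Beck-Teboulle acceleration."""
    L = np.linalg.norm(A, 2) ** 2       # Lipschitz constant of the smooth gradient
    x = np.zeros(A.shape[1])
    y, t = x.copy(), 1.0
    for _ in range(n_iter):
        grad = A.T @ (A @ y - b)
        x_new = soft_threshold(y - grad / L, lam / L)    # proximal (shrinkage) step
        t_new = (1.0 + np.sqrt(1.0 + 4.0 * t * t)) / 2.0
        y = x_new + ((t - 1.0) / t_new) * (x_new - x)    # momentum extrapolation
        x, t = x_new, t_new
    return x

# Sparse recovery toy problem: 3 nonzero coefficients out of 100
rng = np.random.default_rng(1)
A = rng.normal(size=(40, 100))
x_true = np.zeros(100)
x_true[[3, 17, 42]] = [1.5, -2.0, 1.0]
b = A @ x_true
x_hat = fista(A, b, lam=0.1)
```

The `t` sequence is the "somewhat mysterious" parameter family the abstract alludes to; it is what lifts the convergence rate of the objective from O(1/k) to O(1/k²).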
Community resilience assessment and literature analysis.
Weiner, John M; Walsh, John J
2015-01-01
Earlier and current disaster-related research emphasised the sociological/behavioural perspective. This led to a significant amount of literature devoted to the descriptive context of natural, man-made and technological disasters and their sequelae. This paper considers a next step involving a more expanded approach to research methodology. The phases include: (1) the development of a comprehensive database of ideas provided by authors of scholarly and scientific papers; (2) the development of computer-supported algorithms to prepare an array of scenarios representing relationships, gaps and inconsistencies in existing knowledge; (3) a process for evaluating the scenarios to determine a feasible and interesting next research strategy or programmatic action that will provide an enhanced description of the problems as well as possible insights into their correction by interventions. The intent is to develop interventions as an essential component of better prevention, mitigation, rehabilitation, reconstruction and problem-solving for those affected by disaster events. To illustrate this approach, community resilience, a relatively new and important idea, was studied. The phrase was used to describe relationships and omissions. The ideas associated with this central idea were considered in the building of a new instrument for the evaluation of community vulnerability and readiness. This methodology addresses the time constraints realised by practitioners and investigators. The methods should eliminate tedious clerical functions and focus on the intellectual functions, representing optimal use of human energy.
Schry, Amie R; White, Susan W
2013-11-01
Many college students use alcohol, and most of these students experience problems related to their use. Emerging research indicates that socially anxious students face heightened risk of experiencing alcohol-related problems, although the extant research on alcohol use and social anxiety in this population has yielded inconsistent findings. This meta-analysis was conducted to examine the relationship between social anxiety and alcohol variables in college students. A literature search was used to identify studies on college students that included measures of social anxiety and at least one of the alcohol variables of interest. All analyses were conducted using random effects models. We found that social anxiety was negatively correlated with alcohol use variables (e.g., typical quantity and typical frequency), but significantly positively correlated with alcohol-related problems, coping, conformity, and social motives for alcohol use, and positive and negative alcohol outcome expectancies. Several moderators of effect sizes were found to be significant, including methodological factors such as sample ascertainment approach. Given that social anxiety was negatively related to alcohol use but positively related to alcohol-related problems, research is needed to address why individuals high in social anxiety experience more problems as a result of their alcohol use. Avoidance of social situations among socially anxious students should also be taken into account when measuring alcohol use. The primary limitation of this study is the small number of studies available for inclusion in some of the analyses. © 2013 Elsevier Ltd. All rights reserved.
A decision model for planetary missions
NASA Technical Reports Server (NTRS)
Hazelrigg, G. A., Jr.; Brigadier, W. L.
1976-01-01
Many techniques developed for the solution of problems in economics and operations research are directly applicable to problems involving engineering trade-offs. This paper investigates the use of utility theory for decision making in planetary exploration space missions. A decision model is derived that accounts for the objectives of the mission (science), the cost of flying the mission, and the risk of mission failure. A simulation methodology for obtaining the probability distribution of science value and costs as a function of spacecraft and mission design is presented, and an example application of the decision methodology is given for various potential alternatives in a comet Encke mission.
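A decision model of this kind can be sketched with a short Monte Carlo computation of expected utility. The utility form, the distributions, and the two mission alternatives below are hypothetical placeholders, not values from the paper; the point is only how science value, cost, and failure risk combine into a single comparable score.

```python
import numpy as np

rng = np.random.default_rng(3)

def utility(science, cost, risk_aversion=0.002):
    """Hypothetical concave utility: value science (arbitrary units),
    penalize cost (in $M) linearly."""
    return (1.0 - np.exp(-risk_aversion * science)) - 0.001 * cost

def expected_utility(p_success, science_dist, cost_dist, n=100_000):
    """Monte Carlo expected utility of one mission design alternative."""
    success = rng.random(n) < p_success
    science = np.where(success, science_dist(n), 0.0)  # no science on failure
    cost = cost_dist(n)                                # cost is sunk either way
    return utility(science, cost).mean()

# Two hypothetical comet-mission alternatives: cheap/risky vs costly/reliable
eu_a = expected_utility(0.80, lambda n: rng.normal(900, 150, n),
                        lambda n: rng.normal(250, 20, n))
eu_b = expected_utility(0.95, lambda n: rng.normal(800, 100, n),
                        lambda n: rng.normal(400, 30, n))
```

The alternative with the larger expected utility would be preferred; the concave science term encodes diminishing returns, and failure risk enters simply by zeroing the science payoff on failed trials.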
Cooperative vehicle routing problem: an opportunity for cost saving
NASA Astrophysics Data System (ADS)
Zibaei, Sedighe; Hafezalkotob, Ashkan; Ghashami, Seyed Sajad
2016-09-01
In this paper, a novel methodology is proposed to solve a cooperative multi-depot vehicle routing problem (VRP). We establish a mathematical model for a multi-owner VRP in which each owner (i.e. player) manages single or multiple depots. The basic idea consists of offering owners the option of cooperatively managing the VRP to save costs. We present cooperative game theory techniques for allocating the cost savings obtained from various coalitions of owners. The methodology is illustrated with a numerical example in which different coalitions of the players are evaluated, along with the results of cooperation and of the cost saving allocation methods.
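One standard cooperative-game allocation that could serve here is the Shapley value, which averages each owner's marginal cost over all orders in which owners could join the grand coalition. The coalition costs below are hypothetical stand-ins for solving the routing problem once per coalition; the paper evaluates several allocation methods, not necessarily this one alone.

```python
from itertools import permutations
from math import factorial

# Hypothetical optimal routing costs for every coalition of owners A, B, C
# (in practice each value comes from solving that coalition's VRP).
COSTS = {
    frozenset(): 0.0,
    frozenset({"A"}): 100.0,
    frozenset({"B"}): 120.0,
    frozenset({"C"}): 90.0,
    frozenset({"A", "B"}): 190.0,
    frozenset({"A", "C"}): 160.0,
    frozenset({"B", "C"}): 180.0,
    frozenset({"A", "B", "C"}): 240.0,
}

def shapley_allocation(players, cost):
    """Average marginal cost over all join orders (the Shapley value)."""
    alloc = {p: 0.0 for p in players}
    for order in permutations(players):
        coalition = frozenset()
        for p in order:
            alloc[p] += cost[coalition | {p}] - cost[coalition]
            coalition |= {p}
    n_orders = factorial(len(players))
    return {p: v / n_orders for p, v in alloc.items()}

alloc = shapley_allocation(["A", "B", "C"], COSTS)
# Each owner's saving relative to operating alone
savings = {p: COSTS[frozenset({p})] - alloc[p] for p in alloc}
```

The allocation is efficient by construction (the shares sum to the grand-coalition cost of 240), and every owner pays less than its stand-alone cost, which is what makes the cooperation individually rational.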
Solution methods for one-dimensional viscoelastic problems
NASA Technical Reports Server (NTRS)
Stubstad, John M.; Simitses, George J.
1987-01-01
A recently developed differential methodology for the solution of one-dimensional nonlinear viscoelastic problems is presented. Using the example of an eccentrically loaded cantilever beam-column, the results from the differential formulation are compared to results generated using a previously published integral solution technique. It is shown that the results obtained from these distinct methodologies exhibit a surprisingly high degree of correlation with one another. A discussion of the various factors affecting the numerical accuracy and rate of convergence of the two procedures is also included. Finally, the influences of some 'higher order' effects, such as straining along the centroidal axis, are discussed.
Robust approximate optimal guidance strategies for aeroassisted orbital transfer missions
NASA Astrophysics Data System (ADS)
Ilgen, Marc R.
This thesis presents the application of game theoretic and regular perturbation methods to the problem of determining robust approximate optimal guidance laws for aeroassisted orbital transfer missions with atmospheric density and navigated state uncertainties. The optimal guidance problem is reformulated as a differential game problem with the guidance law designer and Nature as opposing players. The resulting equations comprise the necessary conditions for the optimal closed loop guidance strategy in the presence of worst case parameter variations. While these equations are nonlinear and cannot be solved analytically, the presence of a small parameter in the equations of motion allows the method of regular perturbations to be used to solve the equations approximately. This thesis is divided into five parts. The first part introduces the class of problems to be considered and presents results of previous research. The second part then presents explicit semianalytical guidance law techniques for the aerodynamically dominated region of flight. These guidance techniques are applied to unconstrained and control constrained aeroassisted plane change missions and Mars aerocapture missions, all subject to significant atmospheric density variations. The third part presents a guidance technique for aeroassisted orbital transfer problems in the gravitationally dominated region of flight. Regular perturbations are used to design an implicit guidance technique similar to the second variation technique but that removes the need for numerically computing an optimal trajectory prior to flight. This methodology is then applied to a set of aeroassisted inclination change missions. In the fourth part, the explicit regular perturbation solution technique is extended to include the class of guidance laws with partial state information. 
This methodology is then applied to an aeroassisted plane change mission using inertial measurements and subject to uncertainties in the initial value of the flight path angle. A summary of performance results for all these guidance laws is presented in the fifth part of this thesis along with recommendations for further research.
Bulfone, Giampiera; Galletti, Caterina; Vellone, Ercole; Zanini, Antonietta; Quattrin, Rosanna
2008-01-01
The process nurses adopt to solve patients' problems is known in the literature as "Problem Solving". Problem Solving Abilities include Diagnostic Reasoning, Prognostic Judgment and Decision Making. Nursing students apply Problem Solving to the Nursing Process, the mental and operative approach that nurses use to plan nursing care. The purpose of the present study is to examine whether there is a positive relationship between the number of Educational Tutorial Strategies (Briefing, Debriefing and Discussion according to the Objective Structured Clinical Examination Methodology) used with nursing students and their learning of Problem Solving Abilities (Diagnostic Reasoning, Prognostic Judgment and Decision Making). The study design was retrospective, descriptive and comparative. The Problem Solving Instrument, specifically developed for this study and tested for reliability and validity, was used to collect data from a sample of 106 nursing care plans elaborated by second-year students of the Bachelor Degree in Nursing of the University of Udine. Nursing care plans were elaborated at three consecutive times, after the students had participated in different Educational Tutorial Strategies. Results showed that students who took part in a greater number of Educational Tutorial Strategies significantly increased their Problem Solving Abilities. The results demonstrate the importance of using Educational Tutorial Strategies in nursing education to teach these skills.
Workplace bullying and subsequent health problems.
Nielsen, Morten Birkeland; Magerøy, Nils; Gjerstad, Johannes; Einarsen, Ståle
2014-07-01
Cross-sectional studies demonstrate that exposure to bullying in the workplace is positively correlated with self-reported health problems. However, these studies do not provide a basis for conclusions on the extent to which bullying leads to increased health problems or whether health problems increase the risk of being bullied. To provide better indications of a causal relationship, knowledge from prospective studies on the association between bullying in the workplace and health outcomes is therefore summarised. We conducted a systematic literature review of original articles from central literature databases on longitudinal associations between bullying in the workplace and health. Average associations between bullying and health outcomes were calculated using meta-analysis. A consistent finding across the studies is that exposure to bullying is significantly positively related to mental health problems (OR = 1.68; 95% CI 1.35-2.09) and somatic symptoms (OR = 1.77; 95% CI 1.41-2.22) over time. Mental health problems are also associated with subsequent exposure to bullying (OR = 1.74; 95% CI 1.44-2.12). Bullying is thus positively related to mental health problems and somatic symptoms, and the association between mental health problems and subsequent bullying indicates a self-reinforcing process between mental health and bullying. The methodological quality of the studies conducted is relatively sound. However, based on the existing knowledge base there are no grounds for conclusions regarding an unambiguous causal relationship between bullying and health.
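Pooled odds ratios of this kind typically come from inverse-variance weighting of study-level log odds ratios; a minimal DerSimonian-Laird random-effects pooling can be sketched as follows. The study-level ORs and standard errors below are hypothetical, not the review's data.

```python
import math

def pool_log_or(log_ors, variances):
    """DerSimonian-Laird random-effects pooling of log odds ratios.
    Returns the pooled OR and its 95% confidence interval."""
    w = [1.0 / v for v in variances]
    fixed = sum(wi * y for wi, y in zip(w, log_ors)) / sum(w)
    # Cochran's Q and the between-study variance tau^2
    q = sum(wi * (y - fixed) ** 2 for wi, y in zip(w, log_ors))
    df = len(log_ors) - 1
    c = sum(w) - sum(wi * wi for wi in w) / sum(w)
    tau2 = max(0.0, (q - df) / c)
    # Re-weight with tau^2 added to each study's variance
    w_star = [1.0 / (v + tau2) for v in variances]
    pooled = sum(wi * y for wi, y in zip(w_star, log_ors)) / sum(w_star)
    se = math.sqrt(1.0 / sum(w_star))
    return math.exp(pooled), (math.exp(pooled - 1.96 * se), math.exp(pooled + 1.96 * se))

# Hypothetical study-level ORs (bullying -> later mental health problems)
ors = [1.5, 1.9, 1.6, 2.1]
ses = [0.15, 0.20, 0.25, 0.30]  # standard errors of the log-ORs
pooled_or, ci = pool_log_or([math.log(o) for o in ors], [s * s for s in ses])
```

Pooling is done on the log scale because log odds ratios are approximately normal; the result is exponentiated back to an OR at the end.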
Efficient Computation of Info-Gap Robustness for Finite Element Models
DOE Office of Scientific and Technical Information (OSTI.GOV)
Stull, Christopher J.; Hemez, Francois M.; Williams, Brian J.
2012-07-05
A recent research effort at LANL proposed info-gap decision theory as a framework by which to measure the predictive maturity of numerical models. Info-gap theory explores the trade-offs between accuracy, that is, the extent to which predictions reproduce the physical measurements, and robustness, that is, the extent to which predictions are insensitive to modeling assumptions. Both accuracy and robustness are necessary to demonstrate predictive maturity. However, conducting an info-gap analysis can present a formidable challenge, from the standpoint of the required computational resources. This is because a robustness function requires the resolution of multiple optimization problems. This report offers an alternative, adjoint methodology to assess the info-gap robustness of Ax = b-like numerical models solved for a solution x. Two situations that can arise in structural analysis and design are briefly described and contextualized within the info-gap decision theory framework. The treatments of the info-gap problems, using the adjoint methodology are outlined in detail, and the latter problem is solved for four separate finite element models. As compared to statistical sampling, the proposed methodology offers highly accurate approximations of info-gap robustness functions for the finite element models considered in the report, at a small fraction of the computational cost. It is noted that this report considers only linear systems; a natural follow-on study would extend the methodologies described herein to include nonlinear systems.
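The robustness function at the heart of an info-gap analysis can be illustrated, for a small Ax = b model with interval uncertainty on the entries of A, by the brute-force sampling approach that the report's adjoint methodology is designed to avoid. The model, uncertainty set, and tolerance below are hypothetical.

```python
import numpy as np

rng = np.random.default_rng(2)

# Nominal model A x = b and its nominal solution
A_nom = np.array([[4.0, 1.0], [1.0, 3.0]])
b = np.array([1.0, 2.0])
x_nom = np.linalg.solve(A_nom, b)

def worst_case_error(h, n_samples=2000):
    """Approximate the worst-case solution error over the interval
    uncertainty set U(h) = {A : |A_ij - A_nom_ij| <= h} by sampling."""
    worst = 0.0
    for _ in range(n_samples):
        A = A_nom + rng.uniform(-h, h, size=A_nom.shape)
        worst = max(worst, np.linalg.norm(np.linalg.solve(A, b) - x_nom))
    return worst

def robustness(tolerance, h_grid):
    """Info-gap robustness: the largest uncertainty horizon h whose
    worst-case performance still satisfies the tolerance."""
    h_hat = 0.0
    for h in h_grid:
        if worst_case_error(h) <= tolerance:
            h_hat = h
        else:
            break
    return h_hat

h_hat = robustness(tolerance=0.1, h_grid=np.linspace(0.05, 1.0, 20))
```

Each point of the robustness curve embeds a worst-case (inner optimization) problem, here crudely approximated by 2000 samples; this nesting is exactly the cost that an adjoint formulation can collapse for linear systems.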
[Methodology of Screening New Antibiotics: Present Status and Prospects].
Trenin, A S
2015-01-01
Due to the extensive spread of pathogen resistance to available pharmaceuticals and serious problems in the treatment of various infectious and tumor diseases, new antibiotics are urgently needed. The review considers the basic methodological approaches to the chemical synthesis of antibiotics and to screening for new antibiotics among natural products, mainly microbial secondary metabolites. Because natural compounds are highly diverse, screening such substances offers a good opportunity to discover antibiotics of varied chemical structure and mechanism of action. This approach, followed by chemical or biological transformation, can provide health care with new effective pharmaceuticals. The review concentrates mainly on the screening of natural products and the associated methodological problems: isolation of microbial producers from their habitats, cultivation of microorganisms producing the relevant substances, isolation and chemical characterization of microbial metabolites, and identification of the biological activity of the metabolites. Particular attention is paid to problems of microbial secondary metabolism and the design of new models for screening biologically active compounds. Recent achievements in the field of antibiotics and the most promising directions for future investigation are discussed. The basic methodological approach to isolating and cultivating producers remains relevant and needs constant improvement. Screening efficiency can be increased by more rapid chemical identification of antibiotics and by the design of new screening models based on the detection of biological activity.
Aircraft optimization by a system approach: Achievements and trends
NASA Technical Reports Server (NTRS)
Sobieszczanski-Sobieski, Jaroslaw
1992-01-01
Recently emerging methodology for optimal design of aircraft treated as a system of interacting physical phenomena and parts is examined. The methodology is found to coalesce into methods for hierarchic, non-hierarchic, and hybrid systems, all dependent on sensitivity analysis. A separate category of methods has also evolved independent of sensitivity analysis, hence suitable for discrete problems. References and numerical applications are cited. Massively parallel computer processing is seen as an enabling technology for practical implementation of the methodology.
NASA Technical Reports Server (NTRS)
Noor, Ahmed K.
1986-01-01
An assessment is made of the potential of different global-local analysis strategies for predicting the nonlinear and postbuckling responses of structures. Two postbuckling problems of composite panels are used as benchmarks and the application of different global-local methodologies to these benchmarks is outlined. The key elements of each of the global-local strategies are discussed and future research areas needed to realize the full potential of global-local methodologies are identified.
NASA Technical Reports Server (NTRS)
Landgrebe, D. A. (Principal Investigator); Hixson, M. M.; Davis, B. J.; Bauer, M. E.
1978-01-01
The author has identified the following significant results. A stratification was performed and sample segments were selected for an initial investigation of multicrop problems, in order to support the development and evaluation of procedures for using LACIE and other technologies to classify corn and soybeans, to identify factors likely to affect classification performance, and to evaluate problems encountered and techniques applicable to the crop estimation problem in foreign countries. Two types of samples, low density and high density, supporting these requirements were selected as a research data set for an initial evaluation of technical issues. Judging by the geographic locations of the strata, the system appears logical and the various segments seem to represent different conditions. This result supports not only the variables and the methodology employed in the stratification, but also the validity of the data sets employed.
NASA Astrophysics Data System (ADS)
Tuckerman, Mark
2006-03-01
One of the computational grand challenge problems is to develop methodology capable of sampling conformational equilibria in systems with rough energy landscapes. If met, many important problems, most notably protein folding, could be significantly impacted. In this talk, two new approaches for addressing this problem will be presented. First, it will be shown how molecular dynamics can be combined with a novel variable transformation designed to warp configuration space in such a way that barriers are reduced and attractive basins stretched. This method rigorously preserves equilibrium properties while leading to very large enhancements in sampling efficiency. Extensions of this approach to the calculation and exploration of free energy surfaces will be discussed. Next, a new very large time-step molecular dynamics method will be introduced that overcomes the resonances which plague many molecular dynamics algorithms. The performance of the methods is demonstrated on a variety of systems including liquid water, long polymer chains, simple protein models, and oligopeptides.
Fractional-order TV-L2 model for image denoising
NASA Astrophysics Data System (ADS)
Chen, Dali; Sun, Shenshen; Zhang, Congrong; Chen, YangQuan; Xue, Dingyu
2013-10-01
This paper proposes a new fractional order total variation (TV) denoising method, which provides a much more elegant and effective way of treating problems of algorithm implementation, the ill-posed inverse, regularization parameter selection, and the blocky effect. Two fractional order TV-L2 models are constructed for image denoising. The majorization-minimization (MM) algorithm is used to decompose these two complex fractional TV optimization problems into a set of linear optimization problems, which can be solved by the conjugate gradient algorithm. The final adaptive numerical procedure is given. Finally, we report experimental results which show that the proposed methodology avoids the blocky effect and achieves state-of-the-art performance. In addition, two medical image processing experiments are presented to demonstrate the validity of the proposed methodology.
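The MM strategy described in the abstract can be sketched in the integer-order, 1-D case. The paper's fractional-order, 2-D formulation replaces the first-difference operator with a fractional difference and solves the inner systems by conjugate gradients; this sketch uses a dense solve and invented parameters for brevity:

```python
import numpy as np

def tv_denoise_mm(y, lam=1.0, iters=50, eps=1e-6):
    """1-D TV-L2 denoising by majorization-minimization (MM).

    Minimises 0.5*||y - x||^2 + lam*sum(|D x|), where D is the first
    difference operator. Each MM step majorises |u| by the quadratic
    u^2/(2*|u_k|) + |u_k|/2, turning the TV problem into a re-weighted
    linear system (I + lam * D^T W_k D) x = y. A fractional-order variant
    would swap D for a fractional difference operator; the MM outer loop
    is unchanged.
    """
    n = len(y)
    D = np.diff(np.eye(n), axis=0)        # (n-1) x n first-difference matrix
    x = y.copy()
    for _ in range(iters):
        w = 1.0 / (np.abs(D @ x) + eps)   # MM re-weighting from the current iterate
        A = np.eye(n) + lam * D.T @ (w[:, None] * D)
        x = np.linalg.solve(A, y)         # the paper uses conjugate gradients here
    return x

rng = np.random.default_rng(0)
clean = np.concatenate([np.zeros(20), np.ones(20)])   # piecewise-constant signal
noisy = clean + 0.2 * rng.standard_normal(40)
denoised = tv_denoise_mm(noisy, lam=1.0)
```

On a piecewise-constant signal the re-weighted solve flattens each segment while keeping the jump, which is exactly the behavior (minus the blocky artifacts) that the fractional-order generalisation aims to improve on textured regions.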
Cognitive Models for Integrating Testing and Instruction, Phase II. Methodology Program.
ERIC Educational Resources Information Center
Quellmalz, Edys S.; Shaha, Steven
The potential of a cognitive model task analysis scheme (CMS) that specifies features of test problems shown by research to affect performance is explored. CMS describes the general skill area and the generic task or problem type. It elaborates features of the problem situation and required responses found by research to influence performance.…
ERIC Educational Resources Information Center
Jitendra, Asha K.; Petersen-Brown, Shawna; Lein, Amy E.; Zaslofsky, Anne F.; Kunkel, Amy K.; Jung, Pyung-Gang; Egan, Andrea M.
2015-01-01
This study examined the quality of the research base related to strategy instruction priming the underlying mathematical problem structure for students with learning disabilities and those at risk for mathematics difficulties. We evaluated the quality of methodological rigor of 18 group research studies using the criteria proposed by Gersten et…
ERIC Educational Resources Information Center
Glenn, Margaret K.; Diaz, Sebastian R.; Hawley, Carolyn
2009-01-01
Professionals in the field of addictions view problems associated with recovery management across multiple domains. This exploratory study utilized concept mapping and pattern matching methodology to conceptualize the resulting 7 domains of concern for treatment and aftercare of problem and pathological gamblers. The information can be used by…
A Research Methodology for Studying What Makes Some Problems Difficult to Solve
ERIC Educational Resources Information Center
Gulacar, Ozcan; Fynewever, Herb
2010-01-01
We present a quantitative model for predicting the level of difficulty subjects will experience with specific problems. The model explicitly accounts for the number of subproblems a problem can be broken into and the difficulty of each subproblem. Although the model builds on previously published models, it is uniquely suited for blending with…
Solution-adaptive finite element method in computational fracture mechanics
NASA Technical Reports Server (NTRS)
Min, J. B.; Bass, J. M.; Spradley, L. W.
1993-01-01
Some recent results obtained using a solution-adaptive finite element method in linear elastic two-dimensional fracture mechanics problems are presented. The focus is on the basic issue of using the adaptive finite element method to validate applications of the new methodology to fracture mechanics problems, by computing demonstration problems and comparing the resulting stress intensity factors to analytical results.
Assessing copper pinhole leaks in residential plumbing.
Edwards, M; Rushing, J C; Kvech, S; Reiber, S
2004-01-01
Pinhole leaks in copper tubes are a major problem for homeowners, and an aggressive, conscientious effort by utilities is recommended to diagnose the problem and identify potential solutions. In a case study at one utility, pinhole leak frequency data were compiled and a methodology was followed that might prove to be a useful guide for those facing similar problems.
Are Funny Groups Good at Solving Problems? A Methodological Evaluation and Some Preliminary Results.
ERIC Educational Resources Information Center
Pollio, Howard R.; Bainum, Charlene Kubo
1983-01-01
Observed college students (N=195) divided according to sex and measures of wittiness to determine the effects of humor on problem solving in groups. Results showed that group composition was not a crucial issue in problem-solving performance, but that humorous group interaction was, and that it did not interfere with ongoing task performance. (LLL)
A Methodology for Validation of High Resolution Combat Models
1988-06-01
[Extraction fragment from the thesis front matter and text. Recoverable contents: sections on the teleological problem, the epistemological problem, and the uncertainty principle. "The Teleological Problem" concerns how a model by its nature formulates an explicit cause-and-effect relationship that excludes others; the epistemological discussion concerns the role of "experts" in establishing the standard for reality, and notes that generalization from personal experience is often hampered by its parochial aspects.]
ERIC Educational Resources Information Center
Yakubova, Gulnoza; Hughes, Elizabeth M.; Hornberger, Erin
2015-01-01
The purpose of this study was to determine the effectiveness of a point-of-view video modeling intervention to teach mathematics problem-solving when working on word problems involving subtracting mixed fractions with uncommon denominators. Using a multiple-probe across students design of single-case methodology, three high school students with…
A Model for Ubiquitous Serious Games Development Focused on Problem Based Learning
ERIC Educational Resources Information Center
Dorneles, Sandro Oliveira; da Costa, Cristiano André; Rigo, Sandro José
2015-01-01
The possibility of using serious games with problem-based learning opens up huge opportunities to connect the everyday experiences of students with learning. In this context, this article presents a model for developing serious, ubiquitous games, focusing on the problem-based learning methodology. The model allows teachers to create games…
Problem Behaviour at Early Age--Basis for Prediction of Asocial Behaviour
ERIC Educational Resources Information Center
Krneta, Dragoljub; Ševic, Aleksandra
2015-01-01
This paper analyzes the results of the study of prevalence of problem behaviour of students in primary and secondary schools. The starting point is that it is methodologically and logically justified to look for early forms of problem behaviour of students, because it is likely that adult convicted offenders at an early school age manifested forms…
Teachers' Perceptions of Employment-Related Problems: A Survey of Teachers in Two States.
ERIC Educational Resources Information Center
Cutrer, Susan S.; Daniel, Larry G.
This study was conducted to determine the degree to which a randomly selected sample of teachers in Mississippi and Louisiana (N=291) experience various types of work-related problems. It provides an opportunity to either confirm or deny the findings of previous studies, many of them limited by various methodological problems. Data were collected…
Anderson, Malcolm I; Parmenter, Trevor R; Mok, Magdalena
2002-09-01
This study used a modern theory of stress as a framework to strengthen the understanding of the relationship between neurobehavioural problems of TBI, family functioning and psychological distress in spouse/caregivers. The research was an ex post facto design utilising a cross-sectional methodology. Path analysis was used to determine the structural effect of neurobehavioural problems on family functioning and psychological distress. Forty-seven female and 17 male spouse/caregivers of partners with severe TBI were recruited. Spouse/caregivers who reported partners with TBI as having high levels of behavioural and cognitive problems experienced high levels of unhealthy family functioning. High levels of unhealthy family functioning were related to high levels of distress in spouse/caregivers, as family functioning had a moderate influence on psychological distress. Furthermore, indirect effects of behavioural and cognitive problems operating through family functioning intensified the level of psychological distress experienced by spouse/caregivers. Additionally, spouse/caregivers who reported high levels of behavioural, communication and social problems in their partners also experienced high levels of psychological distress. This study was significant because the impact of TBI on the spouse/caregiver from a multidimensional perspective is an important and under-researched area in the brain injury and disability field.
NASA Technical Reports Server (NTRS)
Weissenberger, S. (Editor)
1973-01-01
A systems engineering approach is reported for the problem of reducing the number and severity of California's wildland fires. Prevention methodologies are reviewed, and cost-benefit models are developed for making preignition decisions.
Ceberio, Josu; Calvo, Borja; Mendiburu, Alexander; Lozano, Jose A
2018-02-15
In the last decade, many works in combinatorial optimisation have shown that, due to the advances in multi-objective optimisation, the algorithms from this field could be used for solving single-objective problems as well. In this sense, a number of papers have proposed multi-objectivising single-objective problems in order to use multi-objective algorithms in their optimisation. In this article, we follow up this idea by presenting a methodology for multi-objectivising combinatorial optimisation problems based on elementary landscape decompositions of their objective function. Under this framework, each of the elementary landscapes obtained from the decomposition is considered as an independent objective function to optimise. In order to illustrate this general methodology, we consider four problems from different domains: the quadratic assignment problem and the linear ordering problem (permutation domain), the 0-1 unconstrained quadratic optimisation problem (binary domain), and the frequency assignment problem (integer domain). We implemented two widely known multi-objective algorithms, NSGA-II and SPEA2, and compared their performance with that of a single-objective GA. The experiments conducted on a large benchmark of instances of the four problems show that the multi-objective algorithms clearly outperform the single-objective approaches. Furthermore, a discussion on the results suggests that the multi-objective space generated by this decomposition enhances the exploration ability, thus permitting NSGA-II and SPEA2 to obtain better results in the majority of the tested instances.
The method of belief scales as a means for dealing with uncertainty in tough regulatory decisions.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Pilch, Martin M.
Modeling and simulation is playing an increasing role in supporting tough regulatory decisions, which are typically characterized by variabilities and uncertainties in the scenarios, input conditions, failure criteria, model parameters, and even model form. Variability exists when there is a statistically significant database that is fully relevant to the application. Uncertainty, on the other hand, is characterized by some degree of ignorance. A simple algebraic problem was used to illustrate how various risk methodologies address variability and uncertainty in a regulatory context. These traditional risk methodologies include probabilistic methods (including frequentist and Bayesian perspectives) and second-order methods where variabilities and uncertainties are treated separately. Representing uncertainties with (subjective) probability distributions and using probabilistic methods to propagate subjective distributions can lead to results that are not logically consistent with available knowledge and that may not be conservative. The Method of Belief Scales (MBS) is developed as a means to logically aggregate uncertain input information and to propagate that information through the model to a set of results that are scrutable, easily interpretable by the nonexpert, and logically consistent with the available input information. The MBS, particularly in conjunction with sensitivity analyses, has the potential to be more computationally efficient than other risk methodologies. The regulatory language must be tailored to the specific risk methodology if ambiguity and conflict are to be avoided.
Consultative Committee on Road Traffic Fatalities: trauma audit methodology.
McDermott, F T; Cordner, S M; Tremayne, A B
2000-10-01
Since 1992 the Consultative Committee on Road Traffic Fatalities in Victoria has identified deficiencies and errors in the management of 559 road traffic fatalities in which the patients were alive on arrival of ambulance services. The Committee also assessed the preventability of deaths. Reproducibility of results using its methodology has been shown to be statistically significant. The Committee's findings and recommendations, the latter made in association with the learned Colleges and specialist Societies, led to the establishment of a Ministerial Taskforce on Trauma and Emergency Services. As a consequence, in 2000, a new trauma care system will be implemented in Victoria. This paper presents a case example demonstrating the Committee's methodology. The Committee has two 12-member multidisciplinary evaluative panels. A retrospective evaluation was made of the complete ambulance, hospital and autopsy records of eligible fatalities. The clinical and pathological findings were analysed using a comprehensive data proforma, a narrative summary and the complete records. In the resulting multidisciplinary discussion, problems were identified and the potential preventability of death was assessed. In the present case example the Committee identified 16 management deficiencies, of which 11 were assessed as having contributed to the patient's death; the death, however, was judged to be non-preventable. The presentation of this example demonstrating the Committee's methodology may be of assistance to hospital medical staff undertaking their own major trauma audit.
Zhu, Qiaohao; Carriere, K C
2016-01-01
Publication bias can significantly limit the validity of meta-analysis when trying to draw conclusions about a research question from independent studies. Most research on detecting and correcting publication bias in meta-analysis focuses mainly on funnel-plot-based methodologies or selection models. In this paper, we formulate publication bias as a truncated distribution problem, and propose new parametric solutions. We develop methodologies for estimating the underlying overall effect size and the severity of publication bias. We distinguish the two major situations in which publication bias may be induced: (1) small effect size or (2) large p-value. We consider both fixed- and random-effects models, and derive estimators for the overall mean and the truncation proportion. These estimators are obtained using maximum likelihood estimation and the method of moments under fixed- and random-effects models, respectively. We carried out extensive simulation studies to evaluate the performance of our methodology, and to compare it with the non-parametric Trim and Fill method based on the funnel plot. We find that our methods based on the truncated normal distribution perform consistently well, both in detecting and in correcting publication bias under various situations.
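The truncation view can be sketched for the simplest case: only effects above a significance cutoff are published, and the within-study standard deviation is known (an assumption made here for brevity; the paper develops estimators for fixed- and random-effects models and for the truncation proportion as well). All values below are simulated:

```python
import math
import random
from statistics import mean

def phi_cdf(z):
    """Standard normal CDF via the error function."""
    return 0.5 * (1.0 + math.erf(z / math.sqrt(2.0)))

def truncated_mle_mean(xs, cutoff, sigma=1.0):
    """MLE of the underlying mean when only values above `cutoff` are
    observed (left-truncated normal, known sigma). The naive mean of the
    published values overestimates the true effect; the truncated
    likelihood corrects for the unpublished mass below the cutoff."""
    n, sx, sx2 = len(xs), sum(xs), sum(x * x for x in xs)

    def nll(mu):
        quad = (sx2 - 2.0 * mu * sx + n * mu * mu) / (2.0 * sigma ** 2)
        survive = 1.0 - phi_cdf((cutoff - mu) / sigma)  # P(X > cutoff)
        return quad + n * math.log(survive)

    # Golden-section search for the minimiser (nll is unimodal in mu).
    lo, hi = cutoff - 5.0 * sigma, max(xs)
    g = (math.sqrt(5.0) - 1.0) / 2.0
    while hi - lo > 1e-6:
        m1, m2 = hi - g * (hi - lo), lo + g * (hi - lo)
        if nll(m1) < nll(m2):
            hi = m2
        else:
            lo = m1
    return 0.5 * (lo + hi)

random.seed(1)
true_mu, cutoff = 0.3, 0.5
# "Publish" only the effects that cleared the cutoff:
published = [x for x in (random.gauss(true_mu, 1.0) for _ in range(20000))
             if x > cutoff]
naive = mean(published)                       # biased upward by the truncation
mu_hat = truncated_mle_mean(published, cutoff)
```

The gap between `naive` and `mu_hat` is exactly the severity of publication bias that the paper's estimators quantify.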
Methodological considerations for studying social processes.
Patterson, Barbara; Morin, Karen
2012-01-01
To discuss the nature of and considerations in the study of social processes. Social processes include the elements of time, change and human interaction and many phenomena of interest to nurse researchers. Despite the significance of social processes for nursing practice and the labelling in many studies of phenomena as processes, there seems to be an inability to describe processes fully. The paper includes a presentation of two methodological approaches for illuminating the dynamics of social processes: participant observation and prospective time-series designs. Strengths and limitations of the two paradigmatically different approaches are offered. The method an investigator chooses should be considered selectively and appropriately according to the nature of the problem, what is known about the phenomena to be studied, and the investigator's world view and theoretical perspective. The conceptualisation of process can also influence the methodological choice. Capturing a social process in its entirety with either a qualitative or quantitative approach can be a difficult task. The focus of this paper is an initiation and expansion of the dialogue about which methods provide the best insight into social processes. This knowledge may offer opportunities for nurse researchers to design and implement interventions for individuals as they progress through life events.
IDR: A Participatory Methodology for Interdisciplinary Design in Technology Enhanced Learning
ERIC Educational Resources Information Center
Winters, Niall; Mor, Yishay
2008-01-01
One of the important themes that emerged from the CAL'07 conference was the failure of technology to bring about the expected disruptive effect to learning and teaching. We identify one of the causes as an inherent weakness in prevalent development methodologies. While the problem of designing technology for learning is irreducibly…
Qualitative Analysis of Comic Strip Culture: A Methodological Inquiry.
ERIC Educational Resources Information Center
Newman, Isadore; And Others
The paper is a methodological inquiry into the interpretation of qualitative data. It explores a grounded-theory approach to the synthesis of data and examines, in particular, the construction of categories. It focuses on ways of organizing and attaching meaning to data, as research problems embedded in a cultural context are explored. A…
Qualitative Analysis of a Synthetic Culture: A Methodological Inquiry.
ERIC Educational Resources Information Center
MacDonald, Suzanne; And Others
The study is a methodological inquiry into the interpretation of qualitative data. It explores a grounded theory approach to the synthesis of data, and examines, in particular, construction of categories. It focuses on ways of organizing data and attaching meaning, as research problems embedded in cultural context are explored. A qualitative…
ERIC Educational Resources Information Center
Chae, Yoojin; Goodman, Gail S.; Bederian-Gardner, Daniel; Lindsay, Adam
2011-01-01
Scientific studies of child maltreatment victims' memory abilities and court experiences have important legal, psychological, and clinical implications. However, state-of-the-art research on child witnesses is often hindered by methodological challenges. In this paper, we address specific problems investigators may encounter when attempting such…
ERIC Educational Resources Information Center
Jennings, Jerry L.; Apsche, Jack A.; Blossom, Paige; Bayles, Corliss
2013-01-01
Although mindfulness has become a mainstream methodology in mental health treatment, it is a relatively new approach with adolescents, and perhaps especially youth with sexual behavior problems. Nevertheless, clinical experience and several empirical studies are available to show the effectiveness of a systematic mindfulness-based methodology for…
Scale in Education Research: Towards a Multi-Scale Methodology
ERIC Educational Resources Information Center
Noyes, Andrew
2013-01-01
This article explores some theoretical and methodological problems concerned with scale in education research through a critique of a recent mixed-method project. The project was framed by scale metaphors drawn from the physical and earth sciences and I consider how recent thinking around scale, for example, in ecosystems and human geography might…
[Strengthening the methodology of study designs in scientific researches].
Ren, Ze-qin
2010-06-01
Many problems in study design have seriously affected the validity of scientific research. We must understand research methodology, especially clinical epidemiology and biostatistics, and recognize the urgency of selecting and implementing the right study design. Only then can we strengthen research capability and improve the overall quality of scientific research.
Soft-Systems Methodology. Mendip Papers.
ERIC Educational Resources Information Center
Kowszun, J.
This paper provides an introduction to a particular systems-theoretical approach to problem-solving in the management of education usually referred to as soft-systems methodology (SSM), developed by Peter Checkland in the 1970s. SSM should provide a powerful tool for managers in education at any level who have a strategic role because it can be…
The Idea of National HRD: An Analysis Based on Economics and Theory Development Methodology
ERIC Educational Resources Information Center
Wang, Greg G.; Swanson, Richard A.
2008-01-01
Recent human resource development (HRD) literature focuses attention on national HRD (NHRD) research and represents problems in both HRD identity and research methodology. Based on a review of development economics and international development literature, this study analyzes the existing NHRD literature with respect to the theory development…
Systemic Sustainability in RtI Using Intervention-Based Scheduling Methodologies
ERIC Educational Resources Information Center
Dallas, William P.
2017-01-01
This study evaluated a scheduling methodology referred to as intervention-based scheduling to address the problem of practice regarding the fidelity of implementing Response to Intervention (RtI) in an existing school schedule design. Employing panel data, this study used fixed-effects regressions and first differences ordinary least squares (OLS)…
A Systematic Review of Brief Functional Analysis Methodology with Typically Developing Children
ERIC Educational Resources Information Center
Gardner, Andrew W.; Spencer, Trina D.; Boelter, Eric W.; DuBard, Melanie; Jennett, Heather K.
2012-01-01
Brief functional analysis (BFA) is an abbreviated assessment methodology derived from traditional extended functional analysis methods. BFAs are often conducted when time constraints in clinics, schools or homes are of concern. While BFAs have been used extensively to identify the function of problem behavior for children with disabilities, their…
Level-Set Methodology on Adaptive Octree Grids
NASA Astrophysics Data System (ADS)
Gibou, Frederic; Guittet, Arthur; Mirzadeh, Mohammad; Theillard, Maxime
2017-11-01
Numerical simulations of interfacial problems in fluids require a methodology capable of tracking surfaces that can undergo changes in topology and of imposing jump boundary conditions in a sharp manner. In this talk, we will discuss recent advances in the level-set framework, in particular one that is based on adaptive grids.
A Framework and a Methodology for Developing Authentic Constructivist e-Learning Environments
ERIC Educational Resources Information Center
Zualkernan, Imran A.
2006-01-01
Semantically rich domains require operative knowledge to solve complex problems in real-world settings. These domains provide an ideal environment for developing authentic constructivist e-learning environments. In this paper we present a framework and a methodology for developing authentic learning environments for such domains. The framework is…
Neuroethics and animals: methods and philosophy.
Takala, Tuija; Häyry, Matti
2014-04-01
This article provides an overview of the six other contributions in the Neuroethics and Animals special section. In addition, it discusses the methodological and theoretical problems of interdisciplinary fields. The article suggests that interdisciplinary approaches without established methodological and theoretical bases are difficult to assess scientifically. This might cause these fields to expand without actually advancing.
Greiffenhagen, Christian; Mair, Michael; Sharrock, Wes
2015-09-01
Across the disciplinary frontiers of the social sciences, studies by social scientists treating their own investigative practices as sites of empirical inquiry have proliferated. Most of these studies have been retrospective, historical, after-the-fact reconstructions of social scientific studies mixing interview data with the (predominantly textual) traces that investigations leave behind. Observational studies of in situ work in social science research are, however, relatively scarce. Ethnomethodology was an early and prominent attempt to treat social science methodology as a topic for sociological investigations and, in this paper, we draw out what we see as its distinctive contribution: namely, a focus on troubles as features of the in situ, practical accomplishment of method, in particular, the way that research outcomes are shaped by the local practices of investigators in response to the troubles they encounter along the way. Based on two case studies, we distinguish methodological troubles as problems and methodological troubles as phenomena to be studied, and suggest the latter orientation provides an alternate starting point for addressing social scientists' investigative practices. © London School of Economics and Political Science 2015.
NASA Astrophysics Data System (ADS)
Ramachandran, C. S.; Balasubramanian, V.; Ananthapadmanabhan, P. V.
2011-03-01
Atmospheric plasma spraying is used extensively to make thermal barrier coatings from 7-8% yttria-stabilized zirconia powders. The main problem faced in the manufacture of yttria-stabilized zirconia coatings by the atmospheric plasma spraying process is the selection of the optimum combination of input variables for achieving the required coating qualities. This problem can be solved by developing empirical relationships between the process parameters (input power, primary gas flow rate, stand-off distance, powder feed rate, and carrier gas flow rate) and the coating quality characteristics (deposition efficiency, tensile bond strength, lap shear bond strength, porosity, and hardness) through effective and strategic planning and execution of experiments by response surface methodology. This article highlights the use of response surface methodology, designing a five-factor, five-level central composite rotatable design matrix with full replication for planning and conducting the experiments and developing the empirical relationships. Further, response surface methodology was used for the selection of optimum process parameters to achieve the desired quality of yttria-stabilized zirconia coating deposits.
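The design machinery here is standard response surface methodology. A sketch with two factors instead of the study's five (the response values are synthetic, not spray data) shows a rotatable central composite design and a least-squares quadratic fit:

```python
import numpy as np
from itertools import product

def central_composite_design(k=2, alpha=None, n_center=3):
    """Central composite design in coded units for k factors: 2^k factorial
    corner points, 2k axial (star) points at +-alpha, and replicated
    centre points. alpha = (2^k)^(1/4) makes the design rotatable."""
    if alpha is None:
        alpha = (2 ** k) ** 0.25
    corners = np.array(list(product([-1.0, 1.0], repeat=k)))
    axial = np.vstack([alpha * np.eye(k), -alpha * np.eye(k)])
    centers = np.zeros((n_center, k))
    return np.vstack([corners, axial, centers])

def fit_quadratic(X, y):
    """Least-squares fit of the full second-order model
    y = b0 + b1*x1 + b2*x2 + b11*x1^2 + b22*x2^2 + b12*x1*x2."""
    x1, x2 = X[:, 0], X[:, 1]
    M = np.column_stack([np.ones(len(y)), x1, x2, x1 ** 2, x2 ** 2, x1 * x2])
    coef, *_ = np.linalg.lstsq(M, y, rcond=None)
    return coef

X = central_composite_design(k=2)
# Synthetic "coating quality" response following a known quadratic law:
y = 5.0 + 2.0 * X[:, 0] - 1.0 * X[:, 1] - 0.5 * X[:, 0] ** 2 + 0.3 * X[:, 0] * X[:, 1]
coef = fit_quadratic(X, y)
```

The fitted surface is then interrogated (gradients, desirability functions, or a grid search over the coded region) to pick the optimum parameter combination, which is the role response surface methodology plays in the article.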
NASA Technical Reports Server (NTRS)
Halyo, Nesim
1987-01-01
A combined stochastic feedforward and feedback control design methodology was developed. The objective of the feedforward control law is to track the commanded trajectory, whereas the feedback control law tries to maintain the plant state near the desired trajectory in the presence of disturbances and uncertainties about the plant. The feedforward control law design is formulated as a stochastic optimization problem and is embedded into the stochastic output feedback problem, where the plant contains unstable and uncontrollable modes. An algorithm to compute the optimal feedforward is developed. In this approach, the use of error integral feedback, dynamic compensation, and control rate command structures is an integral part of the methodology. An incremental implementation is recommended. Results on the eigenvalues of the implemented versus designed control laws are presented. The stochastic feedforward/feedback control methodology is used to design a digital automatic landing system for the ATOPS Research Vehicle, a Boeing 737-100 aircraft. The system control modes include localizer and glideslope capture and track, and flare to touchdown. Results of a detailed nonlinear simulation of the digital control laws, actuator systems, and aircraft aerodynamics are presented.
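The division of labour between the feedforward term (tracking the commanded trajectory) and the feedback term (rejecting disturbances) can be illustrated on a toy scalar plant; the plant coefficients, gain, reference, and disturbance below are all hypothetical and unrelated to the paper's stochastic design.

```python
# Toy discrete-time plant x[k+1] = a*x[k] + b*u[k] + w[k].
# Feedforward inverts the nominal model to follow a reference r;
# feedback corrects for the unmodelled disturbance w.
a, b = 0.9, 0.5
K = 0.8                                  # feedback gain (hypothetical)
r = [0.1 * k for k in range(21)]         # ramp reference trajectory

x = 0.0
errors = []
for k in range(20):
    u_ff = (r[k + 1] - a * r[k]) / b     # model-inversion feedforward
    u_fb = -K * (x - r[k])               # proportional feedback
    w = 0.05 if k >= 10 else 0.0         # step disturbance at k = 10
    x = a * x + b * (u_ff + u_fb) + w
    errors.append(abs(x - r[k + 1]))

print(max(errors[:10]))  # near-zero: feedforward alone tracks exactly
```

Before the disturbance the feedforward tracks the reference essentially exactly; afterwards the feedback bounds the tracking error rather than eliminating it, which is the role the paper assigns to error integral feedback.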
Semicompeting risks in aging research: methods, issues and needs
Varadhan, Ravi; Xue, Qian-Li; Bandeen-Roche, Karen
2015-01-01
A semicompeting risks problem involves two types of events: a nonterminal event and a terminal event (death). Typically, the nonterminal event is the focus of the study, but the terminal event can preclude the occurrence of the nonterminal event. Semicompeting risks are ubiquitous in studies of aging. Examples of semicompeting risk dyads include: dementia and death, frailty syndrome and death, disability and death, and nursing home placement and death. Semicompeting risk models can be divided into two broad classes: models based only on observable quantities (class O) and those based on potential (latent) failure times (class L). The classical illness-death model belongs to class O. This model is a special case of the multistate models, which have been an active area of methodology development. During the past decade and a half, there has also been a flurry of methodological activity on semicompeting risks based on latent failure times (class L models). These advances notwithstanding, the semicompeting risks methodology has not penetrated biomedical research in general, and gerontological research in particular. Some possible reasons for this lack of uptake are: the methods are relatively new and sophisticated, the conceptual problems associated with potential failure time models are difficult to overcome, there is a paucity of expository articles aimed at educating practitioners, and readily usable software is unavailable. The main goals of this review article are: (i) to describe the major types of semicompeting risks problems arising in aging research, (ii) to provide a brief survey of the semicompeting risks methods, (iii) to suggest appropriate methods for addressing the problems in aging research, (iv) to highlight areas where more work is needed, and (v) to suggest ways to facilitate the uptake of the semicompeting risks methodology by the broader biomedical research community. PMID:24729136
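A minimal Monte Carlo sketch of the illness-death structure described above, using constant hazards that are purely illustrative; it shows only how the terminal event can preclude the nonterminal one. For exponential hazards the latent-time construction used here reproduces the observable illness-death probabilities, but in general the two model classes differ.

```python
import random

# Illness-death model: healthy --l01--> ill --l12--> dead,
# and healthy --l02--> dead directly (death precludes illness).
l01, l02, l12 = 0.3, 0.2, 0.5   # hypothetical constant transition hazards

random.seed(0)
n = 20000
ill_first = 0
for _ in range(n):
    t_ill = random.expovariate(l01)     # time to illness while healthy
    t_death = random.expovariate(l02)   # time to death while healthy
    if t_ill < t_death:
        ill_first += 1                  # nonterminal event is observed

# With constant hazards, P(illness before death) = l01/(l01+l02) = 0.6
print(round(ill_first / n, 3))
```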
Numerical Determination of Critical Conditions for Thermal Ignition
NASA Technical Reports Server (NTRS)
Luo, W.; Wake, G. C.; Hawk, C. W.; Litchford, R. J.
2008-01-01
The determination of ignition or thermal explosion in an oxidizing porous body of material, as described by a dimensionless reaction-diffusion equation of the form ∂u/∂t = ∇²u + λe^(-1/u) over the bounded region Ω, is critically reexamined from a modern perspective using numerical methodologies. First, the classic stationary model is revisited to establish the proper reference frame for the steady-state solution space, and it is demonstrated how the resulting nonlinear two-point boundary value problem can be reexpressed as an initial value problem for a system of first-order differential equations, which may be readily solved using standard algorithms. Then, the numerical procedure is implemented and thoroughly validated against previous computational results based on sophisticated path-following techniques. Next, the transient nonstationary model is attacked, and the full nonlinear form of the reaction-diffusion equation, including a generalized convective boundary condition, is discretized and expressed as a system of linear algebraic equations. The numerical methodology is implemented as a computer algorithm, and validation computations are carried out as a prelude to a broad-ranging evaluation of the assembly problem and identification of the watershed critical initial temperature conditions for thermal ignition. This numerical methodology is then used as the basis for studying the relationship between the shape of the critical initial temperature distribution and the corresponding spatial moments of its energy content integral and an attempt to forge a fundamental conjecture governing this relation. Finally, the effects of dynamic boundary conditions on the classic storage problem are investigated and the groundwork is laid for the development of an approximate solution methodology based on adaptation of the standard stationary model.
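The reformulation of the steady two-point boundary value problem as an initial value problem can be sketched with a simple shooting method. For brevity the sketch uses the classical Frank-Kamenetskii slab approximation θ'' = -δe^θ rather than the paper's e^(-1/u) source term, and the parameter value is illustrative (the slab critical value is δ ≈ 0.878).

```python
import math

def shoot(theta0, delta, n=2000):
    """Integrate theta'' = -delta*exp(theta), theta(0) = theta0,
    theta'(0) = 0 across 0 <= x <= 1 with RK4; return theta(1)."""
    def f(y, v):
        return v, -delta * math.exp(y)
    h = 1.0 / n
    y, v = theta0, 0.0
    for _ in range(n):
        k1y, k1v = f(y, v)
        k2y, k2v = f(y + 0.5 * h * k1y, v + 0.5 * h * k1v)
        k3y, k3v = f(y + 0.5 * h * k2y, v + 0.5 * h * k2v)
        k4y, k4v = f(y + h * k3y, v + h * k3v)
        y += h / 6 * (k1y + 2 * k2y + 2 * k3y + k4y)
        v += h / 6 * (k1v + 2 * k2v + 2 * k3v + k4v)
    return y

delta = 0.5                    # subcritical: below the slab value ~0.878
lo, hi = 0.0, 1.0              # bracket for the centre value theta(0)
for _ in range(60):
    mid = 0.5 * (lo + hi)
    if shoot(mid, delta) < 0.0:
        lo = mid               # undershoots the theta(1) = 0 boundary
    else:
        hi = mid
theta_centre = 0.5 * (lo + hi)
print(round(theta_centre, 3))  # lower (stable) solution branch
```

Shooting on the centre temperature converts the BVP into repeated IVP solves, which is exactly the kind of reexpression the abstract describes; above the critical δ the bisection would find no root, signalling ignition.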
Dewey, Colin N
2012-01-01
Whole-genome alignment (WGA) is the prediction of evolutionary relationships at the nucleotide level between two or more genomes. It combines aspects of both colinear sequence alignment and gene orthology prediction, and is typically more challenging to address than either of these tasks due to the size and complexity of whole genomes. Despite the difficulty of this problem, numerous methods have been developed for its solution because WGAs are valuable for genome-wide analyses, such as phylogenetic inference, genome annotation, and function prediction. In this chapter, we discuss the meaning and significance of WGA and present an overview of the methods that address it. We also examine the problem of evaluating whole-genome aligners and offer a set of methodological challenges that need to be tackled in order to make the most effective use of our rapidly growing databases of whole genomes.
Efficacy of a Self-Help Treatment for At-Risk and Pathological Gamblers.
Boudreault, Catherine; Giroux, Isabelle; Jacques, Christian; Goulet, Annie; Simoneau, Hélène; Ladouceur, Robert
2018-06-01
Available evidence suggests that self-help treatments may reduce problem gambling severity, but inconsistencies in results across clinical trials leave the extent of their benefits unclear. Moreover, no self-help treatment has yet been validated within a French Canadian setting. The current study therefore assesses the efficacy of a French-language self-help treatment comprising three motivational telephone interviews spread over an 11-week period and a cognitive-behavioral self-help workbook. At-risk and pathological gamblers were randomly assigned to the treatment group (n = 31) or the waiting list (n = 31). Relative to the waiting list, the treatment group showed a statistically significant reduction in the number of DSM-5 gambling disorder criteria met, gambling habits, and gambling consequences at Week 11. Perceived self-efficacy and life satisfaction also significantly improved after 11 weeks for the treatment group, but not for the waiting list group. At Week 11, 13% of participants had dropped out of the study. All significant changes reported for the treatment group were maintained throughout the 1, 6 and 12-month follow-ups. Results support the efficacy of the self-help treatment in reducing problem gambling severity and gambling behaviour and in improving overall functioning among a sample of French Canadian problem gamblers over the short, medium and long term. Findings from this study lend support to the appropriateness of self-help treatments for problem gamblers and help clarify inconsistencies found in the literature. The low dropout rate is discussed with respect to the advantages of the self-help format. Clinical and methodological implications of the results are put forth.
Application of Design Methodologies for Feedback Compensation Associated with Linear Systems
NASA Technical Reports Server (NTRS)
Smith, Monty J.
1996-01-01
The work that follows is concerned with the application of design methodologies for feedback compensation associated with linear systems. In general, the intent is to provide a well-behaved closed loop system in terms of stability and robustness (internal signals remain bounded under a certain amount of uncertainty) and simultaneously achieve an acceptable level of performance. The approach here has been to convert the closed loop system and control synthesis problem into the interpolation setting. The interpolation formulation then serves as our mathematical representation of the design process. Lifting techniques have been used to solve the corresponding interpolation and control synthesis problems. Several applications using this multiobjective design methodology have been included to show the effectiveness of these techniques. In particular, the mixed H2/H∞ performance criterion and its associated algorithm have been used on several examples, including an F-18 HARV (High Angle of Attack Research Vehicle) for sensitivity performance.
Decision-theoretic methodology for reliability and risk allocation in nuclear power plants
DOE Office of Scientific and Technical Information (OSTI.GOV)
Cho, N.Z.; Papazoglou, I.A.; Bari, R.A.
1985-01-01
This paper describes a methodology for allocating reliability and risk to various reactor systems, subsystems, components, operations, and structures in a consistent manner, based on a set of global safety criteria which are not rigid. The problem is formulated as a multiattribute decision analysis paradigm; the multiobjective optimization, which is performed on a PRA model and reliability cost functions, serves as the guiding principle for reliability and risk allocation. The concept of noninferiority is used in the multiobjective optimization problem. Finding the noninferior solution set is the main theme of the current approach. The assessment of the decision maker's preferences could then be performed more easily on the noninferior solution set. Some results of the methodology applications to a nontrivial risk model are provided, and several outstanding issues such as generic allocation and preference assessment are discussed.
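The noninferior (Pareto-optimal) set at the heart of this allocation approach can be illustrated with a small filter over hypothetical (cost, risk) alternatives, both to be minimized; the numbers are invented for illustration and not taken from the paper's PRA model.

```python
def noninferior(points):
    """Return the noninferior (Pareto) subset of (cost, risk) pairs.

    A point is inferior if some other point is at least as good in
    both objectives and strictly better in at least one."""
    def dominated(p, q):
        return (q[0] <= p[0] and q[1] <= p[1]) and (q[0] < p[0] or q[1] < p[1])
    return [p for p in points if not any(dominated(p, q) for q in points)]

# Hypothetical (reliability cost, residual risk) pairs for design options
options = [(1.0, 9.0), (2.0, 7.0), (3.0, 7.5), (4.0, 4.0), (5.0, 4.5)]
print(noninferior(options))  # → [(1.0, 9.0), (2.0, 7.0), (4.0, 4.0)]
```

Restricting the decision maker's preference assessment to this filtered set, rather than to all alternatives, is the simplification the paper exploits.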
Discrete Adjoint-Based Design for Unsteady Turbulent Flows On Dynamic Overset Unstructured Grids
NASA Technical Reports Server (NTRS)
Nielsen, Eric J.; Diskin, Boris
2012-01-01
A discrete adjoint-based design methodology for unsteady turbulent flows on three-dimensional dynamic overset unstructured grids is formulated, implemented, and verified. The methodology supports both compressible and incompressible flows and is amenable to massively parallel computing environments. The approach provides a general framework for performing highly efficient and discretely consistent sensitivity analysis for problems involving arbitrary combinations of overset unstructured grids which may be static, undergoing rigid or deforming motions, or any combination thereof. General parent-child motions are also accommodated, and the accuracy of the implementation is established using an independent verification based on a complex-variable approach. The methodology is used to demonstrate aerodynamic optimizations of a wind turbine geometry, a biologically-inspired flapping wing, and a complex helicopter configuration subject to trimming constraints. The objective function for each problem is successfully reduced and all specified constraints are satisfied.
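The independent verification "based on a complex-variable approach" mentioned above is commonly the complex-step derivative approximation, df/dx ≈ Im f(x+ih)/h, which avoids the subtractive cancellation of finite differences; the test function below is illustrative, not taken from the flow solver.

```python
import cmath, math

def complex_step(f, x, h=1e-30):
    """Complex-step derivative: accurate to machine precision for an
    analytic f, since no subtraction of nearly equal values occurs."""
    return f(complex(x, h)).imag / h

f = lambda z: cmath.exp(z) * cmath.sin(z)       # analytic test function
x0 = 0.7
exact = math.exp(x0) * (math.sin(x0) + math.cos(x0))
print(abs(complex_step(f, x0) - exact))          # agrees to machine precision
```

Comparing adjoint-computed sensitivities against complex-step values of the discrete objective is a standard way to establish discrete consistency of an implementation.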
Environmental exposure effects on composite materials for commercial aircraft
NASA Technical Reports Server (NTRS)
Hoffman, D. J.
1978-01-01
Activities reported include completion of the program design tasks, resolution of a high fiber volume problem and resumption of specimen fabrication, fixture fabrication, and progress on the analysis methodology and the definition of the typical aircraft environment. Program design activities, including test specimens, specimen holding fixtures, flap-track fairing tailcones, and ground exposure racks, were completed. The problem experienced in obtaining acceptable fiber volume fraction results on two of the selected graphite epoxy material systems was resolved with an alteration to the bagging procedure called out in BAC 5562. The revised bagging procedure, involving fewer bleeder plies, produces acceptable results. All required laminates for the contract have now been laid up and cured. Progress in the area of analysis methodology has centered on defining the environment that a commercial transport aircraft undergoes. The selected methodology is analogous to fatigue life assessment.
An immersed boundary method for modeling dirty geometry data
NASA Astrophysics Data System (ADS)
Onishi, Keiji; Tsubokura, Makoto
2017-11-01
We present a robust, fast, and low-preparation-cost immersed boundary method (IBM) for simulating incompressible high-Reynolds-number flow around highly complex geometries. The method is achieved by dispersing the momentum through an axial linear projection and by an approximate domain assumption that satisfies mass conservation around the cells containing the wall. This methodology has been verified against analytical theory and wind tunnel experiment data. Next, we simulate the problem of flow around a rotating object and demonstrate the applicability of this methodology to moving-geometry problems. This methodology offers a promising route to obtaining quick solutions on next-generation large-scale supercomputers. This research was supported by MEXT as ``Priority Issue on Post-K computer'' (Development of innovative design and production processes) and used computational resources of the K computer provided by the RIKEN Advanced Institute for Computational Science.
Publication Bias in Methodological Computational Research.
Boulesteix, Anne-Laure; Stierle, Veronika; Hapfelmeier, Alexander
2015-01-01
The problem of publication bias has long been discussed in research fields such as medicine. There is a consensus that publication bias is a reality and that solutions should be found to reduce it. In methodological computational research, including cancer informatics, publication bias may also be at work. The publication of negative research findings is certainly also a relevant issue, but has attracted very little attention to date. The present paper aims at providing a new formal framework to describe the notion of publication bias in the context of methodological computational research, facilitate and stimulate discussions on this topic, and increase awareness in the scientific community. We report an exemplary pilot study that aims at gaining experiences with the collection and analysis of information on unpublished research efforts with respect to publication bias, and we outline the encountered problems. Based on these experiences, we try to formalize the notion of publication bias.
A negotiation methodology and its application to cogeneration planning
DOE Office of Scientific and Technical Information (OSTI.GOV)
Wang, S.M.; Liu, C.C.; Luu, S.
Power system planning has become a complex process in utilities today. This paper presents a methodology for integrated planning with multiple objectives. The methodology uses a graphical representation, the Goal-Decision Network (GDN), to capture the planning knowledge. The planning process is viewed as a negotiation process that applies three negotiation operators to search for beneficial decisions in a GDN. The negotiation framework is also applied to the problem of planning for cogeneration interconnection. Simulation results are presented to illustrate the cogeneration planning process.
Data Mining for Financial Applications
NASA Astrophysics Data System (ADS)
Kovalerchuk, Boris; Vityaev, Evgenii
This chapter describes Data Mining in finance by discussing financial tasks, specifics of methodologies and techniques in this Data Mining area. It includes time dependence, data selection, forecast horizon, measures of success, quality of patterns, hypothesis evaluation, problem ID, method profile, attribute-based and relational methodologies. The second part of the chapter discusses Data Mining models and practice in finance. It covers use of neural networks in portfolio management, design of interpretable trading rules and discovering money laundering schemes using decision rules and relational Data Mining methodology.
An integrated methodology to assess the benefits of urban green space.
De Ridder, K; Adamec, V; Bañuelos, A; Bruse, M; Bürger, M; Damsgaard, O; Dufek, J; Hirsch, J; Lefebre, F; Pérez-Lacorzana, J M; Thierry, A; Weber, C
2004-12-01
The interrelated issues of urban sprawl, traffic congestion, noise, and air pollution are major socioeconomic problems faced by most European cities. A methodology is currently being developed for evaluating the role of green space and urban form in alleviating the adverse effects of urbanisation, mainly focusing on the environment but also accounting for socioeconomic aspects. The objectives and structure of the methodology are briefly outlined and illustrated with preliminary results obtained from case studies performed on several European cities.
Park, Bongki; Noh, Hyeonseok; Choi, Dong-Jun
2018-06-01
Xerostomia (dry mouth) causes many clinical problems, including oral infections, speech difficulties, and impaired chewing and swallowing of food. Many cancer patients have complained of xerostomia induced by cancer therapy. The aim of this systematic review is to assess the efficacy of herbal medicine for the treatment of xerostomia in cancer patients. Randomized controlled trials investigating the use of herbal medicines to treat xerostomia in cancer patients were included. We searched 12 databases without restrictions on time or language. The risk of bias was assessed using the Cochrane Risk of Bias Tool. Twenty-five randomized controlled trials involving 1586 patients met the inclusion criteria. A total of 24 formulas were examined in the included trials. Most of the included trials reported their methodology insufficiently. Five formulas were shown to significantly improve the salivary flow rate compared to comparators. Regarding the grade of xerostomia, all formulas, with the exception of a Dark Plum gargle solution with normal saline, significantly reduced the severity of dry mouth. Adverse events were reported in 4 trials, and adverse effects of herbal medicine were reported in 3 trials. We found that herbal medicines had potential benefits for improving salivary function and reducing the severity of dry mouth in cancer patients. However, methodological limitations and relatively small sample sizes reduce the strength of the evidence. More high-quality trials reporting sufficient methodological detail are needed to strengthen the evidence on the effectiveness of herbal medicines.
Neurophenomenology revisited: second-person methods for the study of human consciousness
Olivares, Francisco A.; Vargas, Esteban; Fuentes, Claudio; Martínez-Pernía, David; Canales-Johnson, Andrés
2015-01-01
In the study of consciousness, neurophenomenology was originally established as a novel research program attempting to reconcile two apparently irreconcilable methodologies in psychology: qualitative and quantitative methods. Its potential relies on Francisco Varela’s idea of reciprocal constraints, in which first-person accounts and neurophysiological data mutually inform each other. However, since its first conceptualization, neurophenomenology has encountered methodological problems. These problems have emerged mainly because of the difficulty of obtaining and analyzing subjective reports in a systematic manner. More recently, however, several interview techniques for describing subjective accounts have been developed, collectively known as “second-person methods.” Second-person methods refer to interview techniques that solicit both verbal and non-verbal information from participants in order to obtain systematic and detailed subjective reports. Here, we examine the potential for employing second-person methodologies in the neurophenomenological study of consciousness and propose three practical ideas for developing a second-person neurophenomenological method. First, we describe the second-person methodologies available in the literature for analyzing subjective reports, identifying specific constraints on the status of first-, second- and third-person methods. Second, we analyze two experimental studies that explicitly incorporate second-person methods for traversing the “gap” between phenomenology and neuroscience. Third, we analyze the challenges that second-person accounts face in establishing an objective methodology for comparing results across different participants and interviewers: this is the “validation” problem. Finally, we synthesize the common aspects of the interview methods described above.
In conclusion, our arguments emphasize that second-person methods represent a powerful approach for closing the gap between the experiential and the neurobiological levels of description in the study of human consciousness. PMID:26074839
Managing cognitive impairment in the elderly: conceptual, intervention and methodological issues.
Buckwalter, K C; Stolley, J M; Farran, C J
1999-11-11
With the aging of society, the incidence of dementia in the elderly is increasing, resulting in growing numbers of individuals with cognitive impairment. Nurses and other researchers have investigated issues concerning the management of cognitive impairment. This article highlights conceptual, intervention and methodological issues associated with this phenomenon. Cognitive change is a multivariate construct that includes alterations in a variety of information processing mechanisms such as problem solving ability, memory, perception, attention and learning, and judgement. Although there is a large body of research, conceptual, intervention and methodological issues remain. Much of the clinical research on cognitive impairment is atheoretical, with this issue only recently being addressed. While many clinical interventions have been proposed, few have been adequately tested. There are also various methodological concerns, such as small sample sizes and limited statistical power, study design issues (experimental vs. non-experimental), and internal and external validity problems. Clearly, additional research designed to intervene with these difficult behaviors is needed. A variety of psychosocial, environmental and physical parameters must be considered in the nursing care of persons with cognitive impairment. Special attention has been given to interventions associated with disruptive behaviors. Interventions are complex, and knowledge must be integrated from both the biomedical and behavioral sciences in order to deal effectively with the numerous problems that can arise over a long and changing clinical course. Some researchers and clinicians have suggested that a new culture regarding dementia care is needed, one that focuses on changing attitudes and beliefs about persons with dementia and one that changes how organizations deliver that care.
This review identifies key conceptual, intervention and methodological issues and recommends how these issues might be addressed in the future.
Magness, Scott T.; Puthoff, Brent J.; Crissey, Mary Ann; Dunn, James; Henning, Susan J.; Houchen, Courtney; Kaddis, John S.; Kuo, Calvin J.; Li, Linheng; Lynch, John; Martin, Martin G.; May, Randal; Niland, Joyce C.; Olack, Barbara; Qian, Dajun; Stelzner, Matthias; Swain, John R.; Wang, Fengchao; Wang, Jiafang; Wang, Xinwei; Yan, Kelley; Yu, Jian
2013-01-01
Fluorescence-activated cell sorting (FACS) is an essential tool for studies requiring isolation of distinct intestinal epithelial cell populations. Inconsistent or lack of reporting of the critical parameters associated with FACS methodologies has complicated interpretation, comparison, and reproduction of important findings. To address this problem, a comprehensive multicenter study was designed to develop guidelines that limit experimental and data reporting variability and provide a foundation for accurate comparison of data between studies. Common methodologies and data reporting protocols for tissue dissociation, cell yield, cell viability, FACS, and postsort purity were established. Seven centers tested the standardized methods by FACS-isolating a specific crypt-based epithelial population (EpCAM+/CD44+) from murine small intestine. Genetic biomarkers for stem/progenitor (Lgr5 and Atoh1) and differentiated cell lineages (lysozyme, mucin2, chromogranin A, and sucrase isomaltase) were interrogated in target and control populations to assess intra- and intercenter variability. Wilcoxon's rank sum test on gene expression levels showed limited intracenter variability between biological replicates. Principal component analysis demonstrated significant intercenter reproducibility among four centers. Analysis of data collected by standardized cell isolation methods and data reporting requirements readily identified methodological problems, indicating that standard reporting parameters facilitate post hoc error identification. These results indicate that the complexity of FACS isolation of target intestinal epithelial populations can be highly reproducible between biological replicates and different institutions by adherence to common cell isolation methods and FACS gating strategies.
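The Wilcoxon rank-sum comparison used above for replicate gene-expression levels can be sketched in a few lines via the equivalent Mann-Whitney U statistic; tie handling is omitted for brevity, and the expression values are hypothetical.

```python
def rank_sum_u(x, y):
    """Mann-Whitney U from the Wilcoxon rank-sum statistic (no ties)."""
    pooled = sorted(x + y)
    ranks = {v: i + 1 for i, v in enumerate(pooled)}   # 1-based ranks
    w = sum(ranks[v] for v in x)                       # rank sum of group x
    return w - len(x) * (len(x) + 1) / 2               # U statistic

# Hypothetical expression levels for two biological replicates
a = [2.1, 2.4, 2.2, 2.6]
b = [2.3, 2.8, 2.9, 3.0]
print(rank_sum_u(a, b))  # → 2.0
```

Small U relative to len(a)*len(b) indicates the two samples are well separated; U near len(a)*len(b)/2 indicates overlap, i.e., limited intracenter variability.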
This study can be considered a foundation for continued method development and a starting point for investigators that are developing cell isolation expertise to study physiology and pathophysiology of the intestinal epithelium. PMID:23928185
Meta-Analyses and Orthodontic Evidence-Based Clinical Practice in the 21st Century
Papadopoulos, Moschos A.
2010-01-01
Introduction: Aim of this systematic review was to assess the orthodontic-related issues which currently provide the best evidence as documented by meta-analyses, by critically evaluating and discussing the methodology used in these studies. Material and Methods: Several electronic databases were searched, and hand-searching was also performed, in order to identify the corresponding meta-analyses investigating orthodontic-related subjects. In total, 197 studies were retrieved initially. After applying specific inclusion and exclusion criteria, 27 articles were identified as meta-analyses treating orthodontic-related subjects. Results: Many of these 27 papers were of sufficient quality and followed appropriate meta-analytic approaches to quantitatively synthesize data and presented adequately supported evidence. However, the methodology used in some of them presented weaknesses, limitations or deficiencies. Consequently, the topics in orthodontics which currently provide the best evidence include some issues related to Class II or Class III treatment, treatment of transverse problems, external apical root resorption, dental anomalies, such as congenitally missing teeth and tooth transposition, frequency of severe occlusal problems, nickel hypersensitivity, obstructive sleep apnea syndrome, and computer-assisted learning in orthodontic education. Conclusions: Only a few orthodontic-related issues have so far been investigated by means of meta-analyses. In addition, for some of the issues investigated in the corresponding meta-analyses no definite conclusions could be drawn, due to significant methodological deficiencies of these studies. According to this investigation, it can be concluded that at the beginning of the 21st century there is evidence for only a few orthodontic-related issues as documented by meta-analyses, and more well-conducted, high-quality research studies are needed to produce strong evidence to support evidence-based clinical practice in orthodontics.
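The quantitative synthesis step of a meta-analysis is, in its simplest fixed-effect form, an inverse-variance weighted average of the study effects; the effect sizes and standard errors below are hypothetical, and real syntheses would also assess heterogeneity (e.g., random-effects models).

```python
import math

def fixed_effect(effects, ses):
    """Inverse-variance fixed-effect pooled estimate and its standard error."""
    weights = [1.0 / se ** 2 for se in ses]
    pooled = sum(w * e for w, e in zip(weights, effects)) / sum(weights)
    return pooled, math.sqrt(1.0 / sum(weights))

# Hypothetical standardized mean differences from three trials
effects = [0.40, 0.25, 0.55]
ses = [0.10, 0.20, 0.25]
pooled, se = fixed_effect(effects, ses)
print(round(pooled, 3), round(se, 3))
```

Note how the most precise study (smallest standard error) dominates the pooled estimate, pulling it toward 0.40.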
PMID:21673839
Large scale nonlinear programming for the optimization of spacecraft trajectories
NASA Astrophysics Data System (ADS)
Arrieta-Camacho, Juan Jose
Despite the availability of high fidelity mathematical models, the computation of accurate optimal spacecraft trajectories has never been an easy task. While simplified models of spacecraft motion can provide useful estimates of energy requirements, sizing, and cost, the actual launch window and maneuver scheduling must rely on more accurate representations. We propose an alternative for the computation of optimal transfers that uses an accurate representation of the spacecraft dynamics. Like other methodologies for trajectory optimization, this alternative is able to consider all major disturbances. In contrast, it can handle equality and inequality constraints explicitly throughout the trajectory; it requires neither the derivation of costate equations nor the identification of the constrained arcs. The alternative consists of two steps: (1) discretizing the dynamic model using high-order collocation at Radau points, which displays numerical advantages, and (2) solving the resulting Nonlinear Programming (NLP) problem using an interior point method, which does not suffer from the performance bottleneck associated with identifying the active set, as required by sequential quadratic programming methods; in this way the methodology exploits the availability of sound numerical methods and next-generation NLP solvers. In practice the methodology is versatile; it can be applied to a variety of aerospace problems such as homing, guidance, and aircraft collision avoidance; it is particularly well suited for low-thrust spacecraft trajectory optimization. Examples are presented which consider the optimization of a low-thrust orbit transfer subject to the main disturbances due to Earth's gravity field together with lunar and solar attraction. Another example considers the optimization of a multiple asteroid rendezvous problem. In both cases, the ability of our proposed methodology to consider non-standard objective functions and constraints is illustrated.
Future research directions are identified, involving the automatic scheduling and optimization of trajectory correction maneuvers. The sensitivity information provided by the methodology is expected to be invaluable in such research pursuit. The collocation scheme and nonlinear programming algorithm presented in this work, complement other existing methodologies by providing reliable and efficient numerical methods able to handle large scale, nonlinear dynamic models.
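A toy version of the discretize-then-optimize idea sketched above: a double-integrator transfer is discretized in time, and the resulting finite-dimensional problem is solved here with a quadratic terminal penalty and plain gradient descent, standing in for Radau collocation and an interior-point NLP solver; every number is illustrative.

```python
# Double integrator x' = v, v' = u: steer from rest at x = 0 toward
# x = 1, v = 0 over T = 2 s, minimizing sum(u^2), after forward-Euler
# discretization into N control parameters.
N, dt, mu = 20, 0.1, 1000.0          # steps, step size, penalty weight
u = [0.0] * N

def terminal(u):
    """Simulate the discretized dynamics; return terminal (x, v)."""
    x = v = 0.0
    for uk in u:
        x += dt * v
        v += dt * uk
    return x, v

lr = 0.001
for _ in range(20000):
    xN, vN = terminal(u)
    # Exact gradient of J = sum(u^2) + mu*((xN-1)^2 + vN^2), using
    # d xN/d u_j = dt^2*(N-1-j) and d vN/d u_j = dt for these dynamics.
    u = [uj - lr * (2 * uj
                    + 2 * mu * (xN - 1.0) * dt * dt * (N - 1 - j)
                    + 2 * mu * vN * dt)
         for j, uj in enumerate(u)]

xN, vN = terminal(u)
print(round(xN, 2), round(vN, 2))    # terminal state close to (1, 0)
```

The penalty leaves a small terminal residual; a real implementation would enforce the boundary conditions as hard NLP constraints, which is exactly what the collocation-plus-interior-point formulation provides.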
Methodological variations and their effects on reported medication administration error rates.
McLeod, Monsey Chan; Barber, Nick; Franklin, Bryony Dean
2013-04-01
Medication administration errors (MAEs) are a problem, yet methodological variation between studies presents a potential barrier to understanding how best to increase safety. Using the UK as a case study, we systematically summarised methodological variations in MAE studies and their effects on reported MAE rates. Nine healthcare databases were searched for quantitative observational MAE studies in UK hospitals. Methodological variations were analysed, and a meta-analysis of MAE rates was performed using studies that used the same definitions. Odds ratios (OR) were calculated to compare MAE rates between intravenous (IV) and non-IV doses, and between paediatric and adult doses. We identified 16 unique studies reporting three MAE definitions, 44 MAE subcategories and four different denominators. Overall adult MAE rates were 5.6% of a total of 21 533 non-IV opportunities for error (OE) (95% CI 4.6% to 6.7%) and 35% of a total of 154 IV OEs (95% CI 2% to 68%). MAEs were five times more likely in IV than non-IV doses (pooled OR 5.1; 95% CI 3.5 to 7.5). Including timing errors of ±30 min increased the MAE rate from 27% to 69% of 320 IV doses in one study. Five studies were unclear as to whether the denominator included dose omissions; omissions accounted for 0%-13% of IV doses and 1.8%-5.1% of non-IV doses. Wide methodological variations exist even within one country, some with significant effects on reported MAE rates. We have made recommendations for future MAE studies; these may be applied both within and outside the UK.
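An odds ratio like the pooled OR of 5.1 above is computed from a 2x2 table of errors versus error-free opportunities; the counts below are hypothetical (not the study's pooled data), and the interval is a simple Wald 95% CI on the log scale.

```python
import math

def odds_ratio(a, b, c, d):
    """OR and Wald 95% CI for a 2x2 table:
       a = errors among IV doses,     b = error-free IV doses,
       c = errors among non-IV doses, d = error-free non-IV doses."""
    or_ = (a * d) / (b * c)
    se = math.sqrt(1 / a + 1 / b + 1 / c + 1 / d)  # SE of log(OR)
    lo = math.exp(math.log(or_) - 1.96 * se)
    hi = math.exp(math.log(or_) + 1.96 * se)
    return or_, lo, hi

# Hypothetical: 40 MAEs in 100 IV doses vs 120 MAEs in 1000 non-IV doses
or_, lo, hi = odds_ratio(40, 60, 120, 880)
print(round(or_, 2), round(lo, 2), round(hi, 2))
```

Pooling ORs across studies would additionally require an inverse-variance weighting of the per-study log-ORs.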
Nelson, Kristen C; Andow, David A; Banker, Michael J
2009-01-01
Societal evaluation of new technologies, specifically nanotechnology and genetically engineered organisms (GEOs), challenges current practices of governance and science. Employing environmental risk assessment (ERA) for governance and oversight assumes we have a reasonable ability to understand consequences and predict adverse effects. However, traditional ERA has come under considerable criticism for its many shortcomings, and current governance institutions have demonstrated limitations in transparency, public input, and capacity. Problem Formulation and Options Assessment (PFOA) is a methodology founded on three key concepts in risk assessment (science-based consideration, deliberation, and multi-criteria analysis) and three in governance (participation, transparency, and accountability). Developed through a series of international workshops, the PFOA process emphasizes engagement with stakeholders in iterative stages, from identification of the problem(s) through comparison of multiple technology solutions that could be used in the future, with their relative benefits, harms, and risks. It provides "upstream public engagement" in a deliberation informed by science that identifies values for improved decision making.
Alternative approximation concepts for space frame synthesis
NASA Technical Reports Server (NTRS)
Lust, R. V.; Schmit, L. A.
1985-01-01
A structural synthesis methodology for the minimum mass design of three-dimensional frame-truss structures under multiple static loading conditions and subject to limits on displacements, rotations, stresses, local buckling, and element cross-sectional dimensions is presented. A variety of approximation concept options are employed to yield near optimum designs after no more than 10 structural analyses. Available options include: (A) formulation of the nonlinear mathematical programming problem in either reciprocal section property (RSP) or cross-sectional dimension (CSD) space; (B) two alternative approximate problem structures in each design space; and (C) three distinct assumptions about element end-force variations. Fixed element, design element linking, and temporary constraint deletion features are also included. The solution of each approximate problem, in either its primal or dual form, is obtained using CONMIN, a feasible directions program. The frame-truss synthesis methodology is implemented in the COMPASS computer program and is used to solve a variety of problems. These problems were chosen so that, in addition to exercising the various approximation concept options, the results could be compared with previously published work.
NASA Astrophysics Data System (ADS)
Gupta, Mahima; Mohanty, B. K.
2017-04-01
In this paper, we develop a methodology to derive the level of compensation numerically in multiple criteria decision-making (MCDM) problems under a fuzzy environment. The degree of compensation depends on the tranquility and anxiety experienced by the decision-maker while taking the decision: higher tranquility leads to higher realisation of the compensation, whereas an increased level of anxiety reduces the amount of compensation in the decision process. This work determines the level of tranquility (or anxiety) using the concept of fuzzy sets and its various level sets. The concepts of indexing of fuzzy numbers, risk barriers and the tranquility level of the decision-maker are used to derive his/her risk-prone or risk-averse attitude in each criterion. The aggregation of the risk levels in each criterion gives the amount of compensation in the entire MCDM problem. Inclusion of the compensation leads us to model the MCDM problem as a binary integer programming (BIP) problem, whose solution gives the compensatory decision for the MCDM problem. The proposed methodology is illustrated through a numerical example.
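Once criterion-level compensations are aggregated into per-alternative scores, the resulting BIP can be very small; a brute-force sketch with made-up scores and a hypothetical budget constraint (not the paper's numerical example) is:

```python
from itertools import product

# Illustrative only: pick exactly one of four alternatives to maximize a
# compensation-adjusted score under a hypothetical budget; values invented.
scores = [7.2, 8.1, 6.5, 8.0]   # aggregated, compensation-adjusted scores
costs = [3, 5, 2, 4]
budget = 4

best = None
for x in product((0, 1), repeat=4):           # enumerate binary decisions
    if sum(x) == 1 and sum(c * xi for c, xi in zip(costs, x)) <= budget:
        value = sum(s * xi for s, xi in zip(scores, x))
        if best is None or value > best[0]:
            best = (value, x)
print(best)  # the feasible alternative with the highest compensated score
```

For realistic problem sizes a dedicated integer programming solver would replace the enumeration, but the model structure is the same.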
Assessment Problems and Ensuring of Decent Work in the Russian Regions
ERIC Educational Resources Information Center
Simonova, Marina V.; Sankova, Larisa V.; Mirzabalaeva, Farida I.; Shchipanova, Dina Ye.; Dorozhkin, Vladimir E.
2016-01-01
The relevance of the research problem is inspired by the need to ensure decent work principles in Russia. The purpose of this article is to develop evaluation methodologies and identify areas to implement key principles of decent work at the regional level in modern Russia. A leading approach to study this problem is the development of a new…
DOE Office of Scientific and Technical Information (OSTI.GOV)
Wagner, R.A.
1980-12-01
This comparison study involves a preliminary verification of finite element calculations. The methodology of the comparison study consists of solving four example problems with both the SPECTROM finite element program and the MARC-CDC general purpose finite element program. The results show close agreement for all example problems.
Methodological Issues in the Measurement of Non-Random Family Problem Solving Interaction.
ERIC Educational Resources Information Center
Kieren, Dianne K.; Hurlbut, Nancy L.
The family is an obvious group for whom problem solving effectiveness holds importance. Problem solving interaction refers to the manner in which the behavior of family members is organized to resolve situations in which there is an unachieved but attainable goal, and the means to overcoming the barriers to achieving the goal are not apparent, but…
TRIZ: A Bridge Between Applied and Industrial Physics
NASA Astrophysics Data System (ADS)
Savransky, Semyon
1997-03-01
TRIZ provides a methodology for creative engineering design. TRIZ was founded in Russia by Genrich S. Altshuller, who with co-workers analysed about 1,500,000 worldwide patents. The major TRIZ principles are [1,2]: 1. All engineering systems evolve uniformly; many other systems (economic, educational, etc.) exhibit the same evolution trends. 2. Any inventive problem represents a conflict between new requirements and an old system. TRIZ comprises various systematic techniques for finding a quasi-ideal answer to an inventive problem by resolving the conflict, based on knowledge of system evolution. Usually the hidden root of a technical problem is a physical contradiction that can be resolved using the lists of effects. TRIZ experts use a knowledge base of applied physics to provide solutions to industrial problems. Many companies around the world cite a phenomenal increase in the productivity and quality of solutions to tough engineering problems through the use of TRIZ. [1] G. S. Altshuller, B. L. Zlotin, A. V. Zusman and V. I. Filatov, The New Ideas Search: From Intuition to Technology (in Russian). Kishinev, 1989, 381 p. [2] S. D. Savransky and C. Stephan, TRIZ: Methodology of Inventive Problem Solving. The Industrial Physicist (December 1996).
Alteration of histological gastritis after cure of Helicobacter pylori infection.
Hojo, M; Miwa, H; Ohkusa, T; Ohkura, R; Kurosawa, A; Sato, N
2002-11-01
It is still disputed whether gastric atrophy or intestinal metaplasia improves after the cure of Helicobacter pylori infection. The aim was to clarify the histological changes after the cure of H. pylori infection through a literature survey. Fifty-one selected reports from 1066 relevant articles were reviewed. The extracted data were pooled according to histological parameters of gastritis based on the (updated) Sydney system. Activity improved more rapidly than inflammation. Eleven of 25 reports described significant improvement of atrophy. Atrophy was not improved in one of four studies with a large sample size (> 100 samples) and in two of five studies with a long follow-up period (> 12 months), suggesting that disagreement between the studies was not totally due to sample size or follow-up period. Methodological flaws, such as patient selection, and statistical analysis based on the assumption that atrophy improves continuously and generally in all patients might be responsible for the inconsistent results. Four of 28 studies described significant improvement of intestinal metaplasia. Activity and inflammation were improved after the cure of H. pylori infection. Atrophy did not improve generally among all patients, but improved in certain patients. Improvement of intestinal metaplasia was difficult to analyse due to methodological problems including statistical power.
An almost-parameter-free harmony search algorithm for groundwater pollution source identification.
Jiang, Simin; Zhang, Yali; Wang, Pei; Zheng, Maohui
2013-01-01
The spatiotemporal characterization of unknown sources of groundwater pollution is frequently encountered in environmental problems. This study adopts a simulation-optimization approach that combines a contaminant transport simulation model with a heuristic harmony search algorithm to identify unknown pollution sources. In the proposed methodology, an almost-parameter-free harmony search algorithm is developed. The performance of this methodology is evaluated on an illustrative groundwater pollution source identification problem, and the identified results indicate that the proposed almost-parameter-free harmony search algorithm-based optimization model can give satisfactory estimations, even when irregular geometry, erroneous monitoring data, and a shortage of prior information on potential source locations are considered.
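The basic harmony search loop underlying such simulation-optimization studies can be sketched as follows (a generic textbook variant with fixed parameters, not the paper's almost-parameter-free algorithm; the objective here is a stand-in for the transport-model misfit):

```python
import random

def harmony_search(f, lo, hi, hms=10, hmcr=0.9, par=0.3, iters=2000, seed=1):
    """Generic 1-D harmony search with fixed parameters (the paper's
    variant adapts these internally to be almost parameter-free)."""
    rng = random.Random(seed)
    memory = [[x, f(x)] for x in (rng.uniform(lo, hi) for _ in range(hms))]
    bw = 0.01 * (hi - lo)                       # pitch-adjustment bandwidth
    for _ in range(iters):
        if rng.random() < hmcr:                 # consider harmony memory
            x = rng.choice(memory)[0]
            if rng.random() < par:              # pitch adjustment
                x += rng.uniform(-bw, bw)
        else:                                   # random consideration
            x = rng.uniform(lo, hi)
        x = min(max(x, lo), hi)
        worst = max(range(hms), key=lambda i: memory[i][1])
        if f(x) < memory[worst][1]:             # replace worst harmony
            memory[worst] = [x, f(x)]
    return min(memory, key=lambda m: m[1])

best_x, best_f = harmony_search(lambda x: (x - 2.0) ** 2, -10.0, 10.0)
print(best_x, best_f)
```

In a source-identification setting, `f` would run the contaminant transport model and return the misfit to the monitoring data.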
Conceptual and methodological concerns in the theory of perceptual load.
Benoni, Hanna; Tsal, Yehoshua
2013-01-01
The present paper provides a short critical review of the theory of perceptual load. It closely examines the basic tenets and assumptions of the theory and identifies major conceptual and methodological problems that have been largely ignored in the literature. The discussion focuses on problems in the definition of the concept of perceptual load, on the circularity in the characterization and manipulation of perceptual load and the confusion between the concept of perceptual load and its operationalization. The paper also selectively reviews evidence supporting the theory as well as inconsistent evidence which proposed alternative dominant factors influencing the efficacy of attentional selection.
NASA Technical Reports Server (NTRS)
Anderson, W. J.
1980-01-01
The considered investigations deal with some of the more important present day and future bearing requirements, and design methodologies available for coping with them. Solutions to many forthcoming bearing problems lie in the utilization of the most advanced materials, design methods, and lubrication techniques. Attention is given to materials for rolling element bearings, numerical analysis techniques and design methodology for rolling element bearing load support systems, lubrication of rolling element bearings, journal bearing design for high speed turbomachinery, design and energy losses in the case of turbulent flow bearings, and fluid film bearing response to dynamic loading.
Towards Perfectly Absorbing Boundary Conditions for Euler Equations
NASA Technical Reports Server (NTRS)
Hayder, M. Ehtesham; Hu, Fang Q.; Hussaini, M. Yousuff
1997-01-01
In this paper, we examine the effectiveness of absorbing layers as non-reflecting computational boundaries for the Euler equations. The absorbing-layer equations are simply obtained by splitting the governing equations in the coordinate directions and introducing absorption coefficients in each split equation. This methodology is similar to that used by Berenger for the numerical solutions of Maxwell's equations. Specifically, we apply this methodology to three physical problems shock-vortex interactions, a plane free shear flow and an axisymmetric jet- with emphasis on acoustic wave propagation. Our numerical results indicate that the use of absorbing layers effectively minimizes numerical reflection in all three problems considered.
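The idea of introducing absorption coefficients in a boundary layer can be illustrated in one dimension (a toy damped scalar advection sketch, not the split Euler or Maxwell formulation used in the paper; all numbers are illustrative):

```python
# 1-D upwind advection u_t + u_x = -sigma(x) * u with an absorbing layer on
# the right: the absorption coefficient sigma ramps up smoothly from zero so
# an outgoing pulse is damped before it can interact with the boundary.
n, layer, dx, dt = 200, 40, 1.0, 0.5
sigma = [0.0] * (n - layer) + [0.2 * ((i + 1) / layer) ** 2
                               for i in range(layer)]

u = [0.0] * n
u[50] = 1.0                                   # outgoing pulse
for _ in range(400):
    new = u[:]
    for i in range(1, n):
        new[i] = u[i] - dt * (u[i] - u[i - 1]) / dx - dt * sigma[i] * u[i]
    u = new

residual = max(abs(v) for v in u)             # what survives in the domain
print(residual)
```

The smooth ramp of `sigma` matters: an abrupt jump in the absorption coefficient would itself generate reflections at the layer interface.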
DOE Office of Scientific and Technical Information (OSTI.GOV)
Chang, Justin; Karra, Satish; Nakshatrala, Kalyana B.
2016-07-26
It is well-known that the standard Galerkin formulation, which is often the formulation of choice under the finite element method for solving self-adjoint diffusion equations, does not meet maximum principles and the non-negative constraint for anisotropic diffusion equations. Recently, optimization-based methodologies that satisfy maximum principles and the non-negative constraint for steady-state and transient diffusion-type equations have been proposed. To date, these methodologies have been tested only on small-scale academic problems. The purpose of this paper is to systematically study the performance of the non-negative methodology in the context of high performance computing (HPC). PETSc and TAO libraries are, respectively, used for the parallel environment and optimization solvers. For large-scale problems, it is important for computational scientists to understand the computational performance of current algorithms available in these scientific libraries. The numerical experiments are conducted on state-of-the-art HPC systems, and a single-core performance model is used to better characterize the efficiency of the solvers. Furthermore, our studies indicate that the proposed non-negative computational framework for diffusion-type equations exhibits excellent strong scaling for real-world large-scale problems.
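At its core, the non-negative methodology replaces each linear solve by a bound-constrained optimization problem; a toy projected-gradient sketch of such a solve (illustrative only, at miniature scale; the paper uses TAO's large-scale solvers) is:

```python
# Toy bound-constrained quadratic solve: minimize 0.5*x'Ax - b'x subject to
# x >= 0 by projected gradient descent. A miniature stand-in for the
# optimization problems the non-negative methodology hands to TAO.
A = [[4.0, 1.0], [1.0, 3.0]]
b = [1.0, -2.0]           # drives the unconstrained minimizer negative

x = [0.0, 0.0]
step = 0.1                # < 2 / lambda_max(A), so the iteration converges
for _ in range(500):
    grad = [sum(A[i][j] * x[j] for j in range(2)) - b[i] for i in range(2)]
    x = [max(0.0, xi - step * g) for xi, g in zip(x, grad)]
print(x)  # second component is pinned at the non-negativity bound
```

The unconstrained solution of this system has a negative component; the projection step is what enforces the non-negative constraint the Galerkin formulation would violate.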
High-Order Moving Overlapping Grid Methodology in a Spectral Element Method
NASA Astrophysics Data System (ADS)
Merrill, Brandon E.
A moving overlapping mesh methodology that achieves spectral accuracy in space and up to second-order accuracy in time is developed for solution of unsteady incompressible flow equations in three-dimensional domains. The targeted applications are in aerospace and mechanical engineering domains and involve problems in turbomachinery, rotary aircrafts, wind turbines and others. The methodology is built within the dual-session communication framework initially developed for stationary overlapping meshes. The methodology employs semi-implicit spectral element discretization of equations in each subdomain and explicit treatment of subdomain interfaces with spectrally-accurate spatial interpolation and high-order accurate temporal extrapolation, and requires few, if any, iterations, yet maintains the global accuracy and stability of the underlying flow solver. Mesh movement is enabled through the Arbitrary Lagrangian-Eulerian formulation of the governing equations, which allows for prescription of arbitrary velocity values at discrete mesh points. The stationary and moving overlapping mesh methodologies are thoroughly validated using two- and three-dimensional benchmark problems in laminar and turbulent flows. The spatial and temporal global convergence, for both methods, is documented and is in agreement with the nominal order of accuracy of the underlying solver. Stationary overlapping mesh methodology was validated to assess the influence of long integration times and inflow-outflow global boundary conditions on the performance. In a turbulent benchmark of fully-developed turbulent pipe flow, the turbulent statistics are validated against the available data. Moving overlapping mesh simulations are validated on the problems of two-dimensional oscillating cylinder and a three-dimensional rotating sphere. The aerodynamic forces acting on these moving rigid bodies are determined, and all results are compared with published data. 
Scaling tests, with both methodologies, show near linear strong scaling, even for moderately large processor counts. The moving overlapping mesh methodology is utilized to investigate the effect of an upstream turbulent wake on a three-dimensional oscillating NACA0012 extruded airfoil. A direct numerical simulation (DNS) at Reynolds Number 44,000 is performed for steady inflow incident upon the airfoil oscillating between angle of attack 5.6° and 25° with reduced frequency k=0.16. Results are contrasted with subsequent DNS of the same oscillating airfoil in a turbulent wake generated by a stationary upstream cylinder.
Future Research Needs in Learning Disabilities.
ERIC Educational Resources Information Center
Senf, Gerald M.
This paper deals with future research needs and problems in learning disabilities, and is divided into the following two broad categories: (1) supporting conditions, which involve necessary prerequisites to the research effort; and (2) procedural considerations, which deal with methodological concerns. First, the problems posed by supporting…
Seductive Details in Multimedia Messages
ERIC Educational Resources Information Center
Rey, Gunter Daniel
2011-01-01
The seductive detail principle asserts that people learn more deeply from a multimedia presentation when interesting but irrelevant adjuncts are excluded rather than included. However, critics could argue that studies about this principle contain methodological problems. The present experiment attempts to overcome these problems. Students (N = 108)…
Medical Problem-Solving: A Critique of the Literature.
ERIC Educational Resources Information Center
McGuire, Christine H.
1985-01-01
Prescriptive, decision-analysis of medical problem-solving has been based on decision theory that involves calculation and manipulation of complex probability and utility values to arrive at optimal decisions that will maximize patient benefits. The studies offer a methodology for improving clinical judgment. (Author/MLW)
A Systems Approach to Research in Vocational Education.
ERIC Educational Resources Information Center
Miller, Larry E.
1991-01-01
A methodology to address "soft system" problems (those that are unstructured or fuzzy) has these steps: (1) mapping the problem; (2) constructing a root definition; (3) applying conceptual models; (4) comparing models to the real world; and (5) finding and implementing feasible solutions. (SK)
Interdisciplinary Analysis and Global Policy Studies.
ERIC Educational Resources Information Center
Meeks, Philip
This paper examines ways in which interdisciplinary and multidisciplinary analysis of global policy studies can increase understanding of complex global problems. Until recently, social science has been the discipline most often turned to for techniques and methodology to analyze social problems and behaviors. However, because social science…
[Problems of world outlook and methodology of science integration in biological studies].
Khododova, Iu D
1981-01-01
Problems of world outlook and the methodology of natural-science knowledge are considered based on an analysis of tendencies in the development of the membrane theory of cell processes and the use of principles of biological membrane functioning in solving scientific and applied problems pertaining to different branches of chemistry and biology. The notion of scientific knowledge integration is defined as the interpenetration of approaches, methods and ideas of different branches of knowledge and the enrichment of their content on this basis, resulting in knowledge augmentation in each field taken separately. These processes are accompanied by the appearance of new branches of knowledge, sciences "on the junction", and their subsequent differentiation. The analysis of some gnoseological situations shows that the integration of sciences contributes to the coordination and partial agreement of the thinking styles of different specialists, and places demands on the personality of a scientist, in particular requiring high professional mobility. Problems of organizing scientific activity are considered, which draw the social sciences into the integration processes. The role of philosophy in the integration processes is emphasized.
Organizing Blended Learning for Students on the Basis of Learning Roadmaps
ERIC Educational Resources Information Center
Andreeva, Nadezhda M.; Artyukhov, Ivan P.; Myagkova, Elena G.; Pak, Nikolay I.; Akkasynova, Zhamilya K.
2018-01-01
The relevance of the problem of organizing blended learning for students is related to the sharpening contradiction between the high potential of this educational technology and the poor methodological elaboration of its use in actual learning practice. With regard to this, the paper is aimed at providing grounds for the methodological system of…
Remote sensing for site characterization
Kuehn, Friedrich; King, Trude V.; Hoerig, Bernhard; Peters, Douglas C.
2000-01-01
This volume, Remote Sensing for Site Characterization, describes the feasibility of aircraft- and satellite-based methods of revealing environmental-geological problems. A balanced ratio between explanations of the methodological/technical side and presentations of case studies is maintained. The comparison of case studies from North America and Germany shows how the respective territorial conditions lead to distinct methodological approaches.
Prototyping a Microcomputer-Based Online Library Catalog. Occasional Papers Number 177.
ERIC Educational Resources Information Center
Lazinger, Susan S.; Shoval, Peretz
This report examines and evaluates the application of prototyping methodology in the design of a microcomputer-based online library catalog. The methodology for carrying out the research involves a five-part examination of the problem on both the theoretical and applied levels, each of which is discussed in a separate section as follows: (1) a…
ERIC Educational Resources Information Center
Nickerson, Carol A.; McClelland, Gary H.
1988-01-01
A methodology is developed based on axiomatic conjoint measurement to accompany a fertility decision-making model. The usefulness of the model is then demonstrated via an application to a study of contraceptive choice (N=100 male and female family-planning clinic clients). Finally, the validity of the model is evaluated. (TJH)
Development of regional stump-to-mill logging cost estimators
Chris B. LeDoux; John E. Baumgras
1989-01-01
Planning logging operations requires estimating the logging costs for the sale or tract being harvested. Decisions need to be made on equipment selection and its application to terrain. In this paper a methodology is described that has been developed and implemented to solve the problem of accurately estimating logging costs by region. The methodology blends field time...
The Garbage Crisis: Environmental Issues for Adult ESL Learners.
ERIC Educational Resources Information Center
Heffernan, Helen
This module on the garbage crisis is for intermediate and advanced learners of English as a Second Language. It seeks to inform learners about this issue and to give them an opportunity to direct their concerns about the environment into positive action. The guide uses the problem-posing methodology of Paulo Freire. This methodology has three…
Analyzing Media: Metaphors as Methodologies.
ERIC Educational Resources Information Center
Meyrowitz, Joshua
Students have little intuitive insight into the process of thinking and structuring ideas. The image of metaphor for a phenomenon acts as a kind of methodology for the study of the phenomenon by (1) defining the key issues or problems; (2) shaping the type of research questions that are asked; (3) defining the type of data that are searched out;…
Teaching Instrumentation and Methodology in Human Motion Analysis
2001-10-25
TEACHING INSTRUMENTATION AND METHODOLOGY IN HUMAN MOTION ANALYSIS. V. Medved, Faculty of Physical Education, University of Zagreb, Zagreb, Croatia. ...the introduction of teaching curricula to impart the appropriate knowledge. Problems are discussed of educating professionals and disseminating... At the University of Zagreb, undergraduate teaching of locomotion biomechanics is provided only at the Faculty of Physical Education. Following a need to teach...
ERIC Educational Resources Information Center
Stephen, Timothy D.
2011-01-01
The problem of how to rank academic journals in the communication field (human interaction, mass communication, speech, and rhetoric) is one of practical importance to scholars, university administrators, and librarians, yet there is no methodology that covers the field's journals comprehensively and objectively. This article reports a new ranking…
ERIC Educational Resources Information Center
Brember, V. L.
1985-01-01
Presents Checkland's soft systems methodology, discusses it in terms of the systems approach, and illustrates how it was used to relate evidence of user survey to practical problems of library management. Difficulties in using methodology are described and implications for library management and information science research are presented. (8…
ERIC Educational Resources Information Center
Vitale, Michael R.; Romance, Nancy
Adopting perspectives based on applications of artificial intelligence proven in industry, this paper discusses methodological strategies and issues that underlie the development of such software environments. The general concept of an expert system is discussed in the context of its relevance to the problem of increasing the accessibility of…
Testing Methodology in the Student Learning Process
ERIC Educational Resources Information Center
Gorbunova, Tatiana N.
2017-01-01
The subject of the research is to build methodologies to evaluate the student knowledge by testing. The author points to the importance of feedback about the mastering level in the learning process. Testing is considered as a tool. The object of the study is to create the test system models for defence practice problems. Special attention is paid…
Didactic Aspects of the Academic Discipline "History and Methodology of Mathematics"
ERIC Educational Resources Information Center
Sun, Hai; Varankina, Vera I.; Sadovaya, Victoriya V.
2017-01-01
The purpose of this article is to develop the content and methods, as well as the analysis of the approbation of the program of the academic discipline "History and methodology of mathematics" for graduate students of the Master's program of mathematical program tracks. The leading method in the study of this problem was the method of…
ERIC Educational Resources Information Center
Mukan, Nataliya; Kravets, Svitlana
2015-01-01
In the article the methodology of comparative analysis of public school teachers' continuing professional development (CPD) in Great Britain, Canada and the USA has been presented. The main objectives are defined as theoretical analysis of scientific and pedagogical literature, which highlights different aspects of the problem under research;…
NASA Astrophysics Data System (ADS)
Zhang, Hongjuan; Kurtz, Wolfgang; Kollet, Stefan; Vereecken, Harry; Franssen, Harrie-Jan Hendricks
2018-01-01
The linkage between root zone soil moisture and groundwater is either neglected or simplified in most land surface models. The fully-coupled subsurface-land surface model TerrSysMP, which includes variably saturated groundwater dynamics, is used in this work. We test and compare five data assimilation methodologies for assimilating groundwater level data via the ensemble Kalman filter (EnKF) to improve root zone soil moisture estimation with TerrSysMP. Groundwater level data are assimilated in the form of pressure head or soil moisture (set equal to porosity in the saturated zone) to update state vectors. In the five assimilation methodologies, the state vector contains either (i) pressure head, or (ii) log-transformed pressure head, or (iii) soil moisture, or (iv) pressure head for the saturated zone only, or (v) a combination of pressure head and soil moisture: pressure head for the saturated zone and soil moisture for the unsaturated zone. These methodologies are evaluated in synthetic experiments performed for different climate conditions, soil types and plant functional types to simulate various root zone soil moisture distributions and groundwater levels. The results demonstrate that EnKF cannot properly handle strongly skewed pressure distributions, which are caused by extreme negative pressure heads in the unsaturated zone during dry periods. This problem can only be alleviated by methodologies (iii), (iv) and (v). The last approach gives the best results and avoids unphysical updates related to strongly skewed pressure heads in the unsaturated zone. If groundwater level data are assimilated by methodology (iii), EnKF fails to update the state vector when, for (almost) all realizations, the observation does not bring significant new information.
Synthetic experiments for the joint assimilation of groundwater levels and surface soil moisture support methodology (v) and show great potential for improving the representation of root zone soil moisture.
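The EnKF analysis step used in such studies can be sketched for a one-dimensional, directly observed state (a generic stochastic EnKF; variable names and numbers are illustrative, not the TerrSysMP implementation):

```python
import random

def enkf_update(ensemble, obs, obs_std, rng):
    """Stochastic EnKF analysis step for a 1-D state with direct
    observation: each member is nudged toward a perturbed observation by
    the Kalman gain built from the ensemble variance."""
    n = len(ensemble)
    mean = sum(ensemble) / n
    var = sum((x - mean) ** 2 for x in ensemble) / (n - 1)
    gain = var / (var + obs_std ** 2)          # Kalman gain
    return [x + gain * (obs + rng.gauss(0.0, obs_std) - x) for x in ensemble]

rng = random.Random(0)
prior = [rng.gauss(5.0, 1.0) for _ in range(100)]       # e.g. pressure head
posterior = enkf_update(prior, obs=6.0, obs_std=0.2, rng=rng)
print(sum(posterior) / len(posterior))  # pulled close to the observation
```

The skewed-distribution failure described in the abstract arises because this update is built on means and covariances, which poorly summarize strongly non-Gaussian pressure-head ensembles.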
Boldt, Lea J.; Kochanska, Grazyna; Yoon, Jeung Eun; Nordling, Jamie Koenig
2014-01-01
We examined children’s attachment security with their mothers and fathers in a community sample (N = 100). At 25 months, mothers, fathers, and trained observers completed Attachment Q-Set (AQS). At 100 months, children completed Kerns Security Scale (KSS) for each parent. Children’s adaptation (behavior problems and competence in broader ecologies of school and peer group, child- and parent-reported) was assessed at 100 months. By and large, the child’s security with the mother and father was modestly to robustly concordant across both relationships, depending on the assessment method. Observers’ AQS security scores predicted children’s self-reported security 6 years later. For children with low AQS security scores with mothers, variations in security with fathers had significant implications for adaptation: Those whose security with fathers was also low reported the most behavior problems and were seen as least competent in broader ecologies, but those whose security with fathers was high reported few problems and were seen as competent. Security with fathers, observer-rated and child-reported, predicted children’s higher competence in broader ecologies. A cumulative index of the history of security from toddler age to middle childhood, integrating measures across both relationships and diverse methodologies, was significantly associated with positive adaptation at 100 months. PMID:24605850
Problem-Based Learning in Accounting
ERIC Educational Resources Information Center
Dockter, DuWayne L.
2012-01-01
Seasoned educators use an assortment of student-centered methods and tools to enhance their students' learning environment. With respect to methodologies used in accounting, educators have utilized and created new forms of problem-based learning exercises, including case studies, simulations, and other projects, to help students become more active…
Problems with the Construct and Measurement of Social Anxiety.
ERIC Educational Resources Information Center
Leary, Mark R.
There has been little agreement regarding appropriate definitions of social anxiety. Many existing definitions are conceptually problematic, and these problems have had serious methodological implications. Social anxiety may be defined as anxiety resulting from the prospect or presence of interpersonal evaluation in real or imagined social…
Generalised Assignment Matrix Methodology in Linear Programming
ERIC Educational Resources Information Center
Jerome, Lawrence
2012-01-01
Discrete Mathematics instructors and students have long struggled with various labelling and scanning algorithms for solving many important problems. This paper shows how to solve a wide variety of Discrete Mathematics and OR problems using assignment matrices and linear programming, specifically using Excel Solver, although the same…
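The assignment-matrix approach can be sketched outside a spreadsheet as well. The sketch below uses `scipy.optimize.linprog` in place of Excel Solver (an assumption, not the paper's tool) and a hypothetical 3x3 cost matrix; because the assignment polytope has integral vertices, the LP relaxation already returns a 0/1 solution.

```python
import numpy as np
from scipy.optimize import linprog

# Hypothetical cost matrix: cost[i][j] = cost of worker i doing task j
cost = np.array([[4.0, 1.0, 3.0],
                 [2.0, 0.0, 5.0],
                 [3.0, 2.0, 2.0]])
n = cost.shape[0]

# Decision variables x[i,j], flattened row-major.
# Each row of the assignment matrix sums to 1 (one task per worker),
# and each column sums to 1 (one worker per task).
A_eq = np.zeros((2 * n, n * n))
for i in range(n):
    A_eq[i, i * n:(i + 1) * n] = 1.0   # row-sum constraint for worker i
    A_eq[n + i, i::n] = 1.0            # column-sum constraint for task i
b_eq = np.ones(2 * n)

res = linprog(cost.ravel(), A_eq=A_eq, b_eq=b_eq, bounds=(0, 1))
assignment = res.x.reshape(n, n).round().astype(int)
total_cost = float(cost.ravel() @ res.x)
```

For this matrix the optimum assigns worker 0 to task 1, worker 1 to task 0, and worker 2 to task 2, at total cost 5.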
What Is a Psychological Misconception? Moving toward an Empirical Answer
ERIC Educational Resources Information Center
Bensley, D. Alan; Lilienfeld, Scott O.
2015-01-01
Studies of psychological misconceptions have often used tests with methodological shortcomings, unknown psychometric properties, and ad hoc methods for identifying misconceptions, creating problems for estimating frequencies of specific misconceptions. To address these problems, we developed a new test, the Test of Psychological Knowledge and…
How School Staff Understand the Relationship between Problem Behaviours and Language Difficulties
ERIC Educational Resources Information Center
Ramsay, Janet; Cowell, Naina; Gersch, Irvine
2018-01-01
This exploratory study adopted a mixed methods methodology, a critical realist ontological stance and a constructionist epistemological position to consider how special educational needs coordinators and pastoral managers in mainstream high schools understand the relationship between problem behaviours and language development. Semi-structured…
Frameworks of Managerial Competence: Limits, Problems and Suggestions
ERIC Educational Resources Information Center
Ruth, Damian
2006-01-01
Purpose: To offer a coherent critique of the concept of managerial frameworks of competence through the exploration of the problems of generalizability and abstraction and the "scientific" assumptions of management. Design/methodology/approach: Employs the ecological metaphor of intellectual landscape and extends it to examining the…
NASA Astrophysics Data System (ADS)
Romo, David Ricardo
Foreign Object Debris/Damage (FOD) has been an issue for military and commercial aircraft manufacturers since the early days of aviation and aerospace. Aerospace is currently growing rapidly, and the chances of FOD presence are growing as well. One of the principal causes in manufacturing is human error. The cost associated with human error in commercial and military aircraft accounts for approximately 4 billion dollars per year. This problem is currently addressed with prevention programs, elimination techniques, designation of FOD areas, controlled access, restrictions on personal items entering designated areas, tool accountability, and the use of technology such as Radio Frequency Identification (RFID) tags. None of these efforts has shown a significant reduction in occurrences in manufacturing processes. On the contrary, a repetitive pattern of occurrence is present, and the associated cost has not declined in a significant manner. To address the problem, this thesis proposes a new approach using statistical analysis. The effort of this thesis is to create a predictive model using historical categorical data from an aircraft manufacturer, focusing only on human-error causes. Contingency tables, the natural logarithm of the odds, and a probability transformation are used to provide the predicted probabilities for each aircraft. A case study is presented to illustrate the applied methodology. As a result, this approach is able to predict the possible outcomes of FOD by workstation/area, and to give monthly predictions per workstation. This thesis is intended to be the starting point of statistical data analysis regarding FOD in human factors. Its purpose is to identify the areas where human error is the primary cause of FOD occurrence in order to design and implement accurate solutions.
The advantages of the proposed methodology range from reduced production costs, quality issues and repair costs to shorter assembly process time. Finally, a more reliable process is achieved, and the proposed methodology may be applied to other aircraft.
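The contingency-table-to-probability chain described above can be sketched in a few lines. The workstation names, counts, and smoothing constant below are illustrative assumptions, not the thesis's data.

```python
import math

# Hypothetical monthly counts per workstation: (FOD events, clean inspections)
counts = {
    "wing_assembly": (12, 188),
    "fuselage_join": (4, 296),
    "final_line":    (25, 375),
}

def fod_probability(events, non_events, smoothing=0.5):
    """Odds from one row of the contingency table, log-transformed and
    mapped back to a probability. Haldane-Anscombe smoothing (+0.5)
    avoids log(0) for workstations with no recorded events."""
    odds = (events + smoothing) / (non_events + smoothing)
    log_odds = math.log(odds)                  # natural logarithm of the odds
    return 1.0 / (1.0 + math.exp(-log_odds))   # logistic (inverse-logit) transform

predicted = {ws: fod_probability(*c) for ws, c in counts.items()}
```

Ranking `predicted` then flags the workstations where human-error-driven FOD is most likely, which is the kind of targeting the thesis aims at.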
A Methodology For Measuring Resilience in a Satellite-Based Communication Network
2014-03-27
solving the Travelling Salesman Problem (TSP) (Solnon p. 1). Based upon swarm intelligence, in a travelling salesman problem ants are sent out from… developed for the Travelling Salesman Problem (TSP) in 1992 (Solnon p. 1), this metaheuristic shows its roots in the original formulations. Given v, the… is lost. To tackle this problem, a common LEO orbit type is examined, the polar orbit. Polar LEO satellites travel from the south pole to the…
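The ant-colony metaheuristic these excerpts reference can be sketched on a toy TSP instance. The distances, parameters, and pheromone-update rule below are illustrative assumptions, not the thesis's model.

```python
import random

def ant_tour(dist, pher, rng, alpha=1.0, beta=2.0):
    """Construct one ant's tour: the next city is chosen with probability
    proportional to pheromone**alpha * (1/distance)**beta (roulette wheel)."""
    n = len(dist)
    tour, unvisited = [0], set(range(1, n))
    while unvisited:
        i = tour[-1]
        weights = [(j, (pher[i][j] ** alpha) * (1.0 / dist[i][j]) ** beta)
                   for j in unvisited]
        r = rng.random() * sum(w for _, w in weights)
        for j, w in weights:
            r -= w
            if r <= 0:
                break
        tour.append(j)
        unvisited.remove(j)
    return tour

def tour_length(dist, tour):
    return sum(dist[tour[k]][tour[(k + 1) % len(tour)]] for k in range(len(tour)))

# Toy 4-city symmetric instance (distances are assumed for illustration)
dist = [[0, 2, 9, 10], [2, 0, 6, 4], [9, 6, 0, 8], [10, 4, 8, 0]]
n = len(dist)
pher = [[1.0] * n for _ in range(n)]
rng = random.Random(7)
best_len, best_tour = float("inf"), None
for _ in range(50):
    tour = ant_tour(dist, pher, rng)
    length = tour_length(dist, tour)
    if length < best_len:
        best_len, best_tour = length, tour
    # Evaporate, then deposit pheromone inversely proportional to tour length
    pher = [[0.9 * p for p in row] for row in pher]
    for k in range(n):
        a, b = tour[k], tour[(k + 1) % n]
        pher[a][b] += 1.0 / length
        pher[b][a] += 1.0 / length
```

Over the iterations, pheromone accumulates on the edges of short tours, biasing later ants toward them; on this instance the optimal tour has length 23.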
Palesh, Oxana; Peppone, Luke; Innominato, Pasquale F; Janelsins, Michelle; Jeong, Monica; Sprod, Lisa; Savard, Josee; Rotatori, Max; Kesler, Shelli; Telli, Melinda; Mustian, Karen
2012-01-01
Sleep problems are highly prevalent in cancer patients undergoing chemotherapy. This article reviews existing evidence on the etiology, associated symptoms, and management of sleep problems associated with chemotherapy treatment during cancer. It also discusses limitations and methodological issues of current research. The existing literature suggests that subjectively and objectively measured sleep problems are highest during the chemotherapy phase of cancer treatment. One possible mechanism reviewed here involves the rise in circulating proinflammatory cytokines and the associated disruption of circadian rhythm in the development and maintenance of sleep dysregulation in cancer patients during chemotherapy. Various approaches to the management of sleep problems during chemotherapy are discussed, with behavioral intervention showing promise. Exercise, including yoga, also appears to be effective and safe, at least for subclinical levels of sleep problems in cancer patients. Numerous challenges are associated with conducting research on sleep in cancer patients during chemotherapy treatments, and they are discussed in this review. Dedicated intervention trials, methodologically sound and sufficiently powered, are needed to test current and novel treatments of sleep problems in cancer patients receiving chemotherapy. Optimal management of sleep problems in patients with cancer receiving treatment may improve not only the well-being of patients but also their prognosis, given the emerging experimental and clinical evidence suggesting that sleep disruption might adversely impact treatment and recovery from cancer. PMID:23486503
Bringing Lean Six Sigma to the Supply Chain Classroom: A Problem-Based Learning Case
ERIC Educational Resources Information Center
Miller, Keith E.; Hill, Craig; Miller, Antoinette R.
2016-01-01
The article describes a project that employs problem-based learning (PBL) to teach the Lean Six Sigma (LSS) methodology as part of an undergraduate or graduate business course. It is scalable to a variety of course delivery and schedule formats, and uses data sets that can create distinct problem-solving scenarios for up to 16 student teams. It…
Portable parallel stochastic optimization for the design of aeropropulsion components
NASA Technical Reports Server (NTRS)
Sues, Robert H.; Rhodes, G. S.
1994-01-01
This report presents the results of Phase 1 research to develop a methodology for performing large-scale Multi-disciplinary Stochastic Optimization (MSO) for the design of aerospace systems ranging from aeropropulsion components to complete aircraft configurations. The current research recognizes that such design optimization problems are computationally expensive and require the use of either massively parallel or multiple-processor computers. The methodology also recognizes that many operational and performance parameters are uncertain, and that uncertainty must be considered explicitly to achieve optimum performance and cost. The objective of this Phase 1 research was to initiate the development of an MSO methodology that is portable to a wide variety of hardware platforms, while achieving efficient, large-scale parallelism when multiple processors are available. The first effort in the project was a literature review of available computer hardware and of portable, parallel programming environments. The second effort was to implement the MSO methodology for an example problem using the portable parallel programming language, Parallel Virtual Machine (PVM). The third and final effort was to demonstrate the example on a variety of computers, including a distributed-memory multiprocessor, a distributed-memory network of workstations, and a single-processor workstation. Results indicate the MSO methodology is well suited to large-scale aerospace design problems. Nearly perfect linear speedup was demonstrated for computation of optimization sensitivity coefficients on both a 128-node distributed-memory multiprocessor (the Intel iPSC/860) and a network of workstations (speedups of almost 19 times achieved for 20 workstations). Very high parallel efficiencies (75 percent for 31 processors and 60 percent for 50 processors) were also achieved for computation of aerodynamic influence coefficients on the Intel.
Finally, the multi-level parallelization strategy that will be needed for large-scale MSO problems was demonstrated to be highly efficient. The same parallel code instructions were used on both platforms, demonstrating portability. There are many applications to which MSO can be applied, including NASA's High-Speed Civil Transport and advanced propulsion systems. The use of MSO will reduce design and development time and testing costs dramatically.
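The speedup and efficiency figures reported above can be related through Amdahl's law. This is an illustrative reading of those numbers, not an analysis from the report: a 75 percent efficiency on 31 processors corresponds to a speedup of 23.25, from which the implied serial fraction of the workload can be back-calculated.

```python
def amdahl_speedup(serial_frac, n):
    """Amdahl's law: speedup attainable on n processors when a fraction
    serial_frac of the work cannot be parallelized."""
    return 1.0 / (serial_frac + (1.0 - serial_frac) / n)

def implied_serial_fraction(speedup, n):
    """Invert Amdahl's law to infer the serial fraction from a measured speedup."""
    return (n / speedup - 1.0) / (n - 1.0)

# 75% efficiency on 31 processors -> measured speedup of 0.75 * 31 = 23.25
f = implied_serial_fraction(0.75 * 31, 31)   # implied serial fraction ~1.1%
```

A serial fraction of roughly one percent is consistent with the near-linear speedups quoted for the sensitivity-coefficient computations.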
Learning dominance relations in combinatorial search problems
NASA Technical Reports Server (NTRS)
Yu, Chee-Fen; Wah, Benjamin W.
1988-01-01
Dominance relations are commonly used to prune unnecessary nodes in search graphs, but they are problem-dependent and cannot be derived by a general procedure. The authors identify machine learning of dominance relations as a goal and describe the applicable learning mechanisms. A study of learning dominance relations using learning by experimentation is described. This system has been able to learn dominance relations for the 0/1-knapsack problem, an inventory problem, the reliability-by-replication problem, the two-machine flow shop problem, a number of single-machine scheduling problems, and a two-machine scheduling problem. The same methodology can likely be extended to learn dominance relations in general.
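The kind of dominance relation such a system learns for the 0/1-knapsack problem can be shown directly: one partial solution dominates another when it weighs no more and is worth no less, so the dominated state can be pruned. A minimal sketch with hypothetical item data:

```python
def knapsack_best_value(items, capacity):
    """Enumerate (weight, value) states item by item, pruning any state
    dominated by another (lower-or-equal weight, higher-or-equal value):
    the classic dominance relation for the 0/1-knapsack problem."""
    states = [(0, 0)]
    for w, v in items:
        # Extend every surviving state by taking or skipping the item
        merged = states + [(sw + w, sv + v) for sw, sv in states
                           if sw + w <= capacity]
        merged.sort(key=lambda s: (s[0], -s[1]))  # by weight, best value first
        pruned, best_v = [], -1
        for sw, sv in merged:
            if sv > best_v:          # kept only if no lighter state beats it
                pruned.append((sw, sv))
                best_v = sv
        states = pruned
    return max(v for _, v in states)

# Hypothetical items as (weight, value) pairs
best = knapsack_best_value([(3, 4), (4, 5), (2, 3)], capacity=6)
```

With capacity 6 the optimum takes the second and third items for a value of 8; the pruning keeps the state list small without discarding any optimal state.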
An improved exploratory search technique for pure integer linear programming problems
NASA Technical Reports Server (NTRS)
Fogle, F. R.
1990-01-01
The development of a heuristic method for the solution of pure integer linear programming problems is documented. The procedure draws its methodology from the ideas of Hooke and Jeeves type 1 and 2 exploratory searches, greedy procedures, and neighborhood searches. It uses an efficient rounding method to obtain its first feasible integer point from the optimal continuous solution obtained via the simplex method. Since the method is based entirely on simple addition or subtraction of one from each variable of a point in n-space and the subsequent comparison of candidate solutions against a given set of constraints, it offers significant complexity improvements over existing techniques. It also obtains the same optimal solution found by the branch-and-bound technique in 44 of 45 small to moderate-size test problems. Two example problems are worked in detail to show the inner workings of the method. Furthermore, using an established weighted scheme for comparing the computational effort involved in an algorithm, the algorithm is compared to the more established and rigorous branch-and-bound method. A computer implementation of the procedure, in PC-compatible Pascal, is also presented and discussed.
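The plus/minus-one exploratory step described above can be sketched as a simple integer neighborhood search. The toy objective and constraints are assumptions, and for brevity the simplex-based rounded start is replaced by the origin, so this is only a sketch of the search step, not the documented procedure.

```python
def neighborhood_search(obj, feasible, x0):
    """Hooke-Jeeves-style exploratory search over integer points: repeatedly
    try +1/-1 moves on each coordinate, accepting any feasible improving
    move, until no single-step move improves the objective."""
    x = list(x0)
    improved = True
    while improved:
        improved = False
        for i in range(len(x)):
            for step in (1, -1):
                cand = list(x)
                cand[i] += step
                if feasible(cand) and obj(cand) > obj(x):
                    x, improved = cand, True
    return x

# Hypothetical pure ILP: maximize 3x + 2y  s.t.  x + y <= 4, 2x + y <= 6, x, y >= 0
obj = lambda p: 3 * p[0] + 2 * p[1]
feasible = lambda p: (p[0] >= 0 and p[1] >= 0
                      and p[0] + p[1] <= 4 and 2 * p[0] + p[1] <= 6)
best = neighborhood_search(obj, feasible, [0, 0])
```

On this instance the search climbs to the integer optimum (2, 2) with objective value 10; in general such a local search can stall at a local optimum, which is why the documented method pairs it with careful rounding and greedy moves.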
Neger, Emily N; Prinz, Ronald J
2015-07-01
Parental substance abuse is a serious problem affecting the well-being of children and families. The co-occurrence of parental substance abuse and problematic parenting is recognized as a major public health concern. This review focuses on 21 outcome studies that tested dual treatment of substance abuse and parenting. A summary of theoretical conceptualizations of the connections between substance abuse and parenting provides a backdrop for the review. Outcomes of the dual treatment studies were generally positive with respect to reduction of parental substance use and improvement of parenting. Research in this area varied in methodological rigor and needs to overcome challenges regarding design issues, sampling frame, and complexities inherent in such a high-risk population. This area of work can be strengthened by randomized controlled trials, use of mixed-methods outcome measures, consideration of parent involvement with child protective services, involvement of significant others in treatment, provision of concrete supports for treatment attendance and facilitative public policies. Copyright © 2015 Elsevier Ltd. All rights reserved.
A Constrained Genetic Algorithm with Adaptively Defined Fitness Function in MRS Quantification
NASA Astrophysics Data System (ADS)
Papakostas, G. A.; Karras, D. A.; Mertzios, B. G.; Graveron-Demilly, D.; van Ormondt, D.
MRS signal quantification is a rather involved procedure and has attracted the interest of the medical engineering community regarding the development of computationally efficient methodologies. Significant contributions based on Computational Intelligence tools, such as Neural Networks (NNs), have demonstrated good performance, but not without drawbacks already discussed by the authors. On the other hand, a preliminary application of Genetic Algorithms (GA) has already been reported in the literature by the authors regarding the peak detection problem encountered in MRS quantification using the Voigt line shape model. This paper investigates a novel constrained genetic algorithm involving a generic, adaptively defined fitness function, which extends the simple genetic algorithm methodology to the case of noisy signals. The applicability of this new algorithm is scrutinized through experimentation on artificial MRS signals interleaved with noise, regarding its signal fitting capabilities. Although extensive experiments with real-world MRS signals are necessary, the performance shown here illustrates the method's potential to be established as a generic MRS metabolite quantification procedure.
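A constrained GA fitting a noisy synthetic peak can be sketched as follows. This is not the paper's algorithm: the Voigt line shape is simplified to a single Gaussian, constraints are enforced by gene clipping rather than the paper's adaptive fitness, and all parameters are assumed for illustration.

```python
import math
import random

def gauss_peak(params, x):
    """Single-peak model, simplified here to a Gaussian (assumption)."""
    a, c, w = params
    return a * math.exp(-((x - c) ** 2) / (2.0 * w * w))

def neg_sse(params, xs, signal):
    """Fitness: negative sum of squared errors between model and samples."""
    return -sum((gauss_peak(params, x) - s) ** 2 for x, s in zip(xs, signal))

def fit_peak_ga(xs, signal, bounds, pop_size=60, gens=80, seed=3):
    """Minimal real-coded GA: constraints enforced by clipping each gene to
    its bounds; elitist selection keeps the top half each generation."""
    rng = random.Random(seed)
    fit = lambda p: neg_sse(p, xs, signal)
    clip = lambda p: [min(max(g, lo), hi) for g, (lo, hi) in zip(p, bounds)]
    pop = [clip([rng.uniform(lo, hi) for lo, hi in bounds])
           for _ in range(pop_size)]
    for _ in range(gens):
        pop.sort(key=fit, reverse=True)
        parents = pop[: pop_size // 2]
        children = []
        while len(children) < pop_size - len(parents):
            p1, p2 = rng.sample(parents, 2)
            children.append(clip([(a + b) / 2.0 + rng.gauss(0.0, 0.05)
                                  for a, b in zip(p1, p2)]))
        pop = parents + children
    return max(pop, key=fit)

# Synthetic noisy peak: amplitude 2.0, centre 1.0, width 0.5 (all assumed)
noise = random.Random(0)
xs = [i / 10.0 for i in range(-20, 41)]
signal = [gauss_peak([2.0, 1.0, 0.5], x) + noise.gauss(0.0, 0.02) for x in xs]
best = fit_peak_ga(xs, signal, bounds=[(0.1, 5.0), (-2.0, 4.0), (0.1, 2.0)])
```

The recovered (amplitude, centre, width) triple approximates the generating parameters; the paper's contribution lies in adapting the fitness function itself to the noise, which this fixed-fitness sketch omits.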
NASA Astrophysics Data System (ADS)
Avilova, I. P.; Krutilova, M. O.
2018-01-01
Economic growth is the main determinant of the trend towards increased greenhouse gas (GHG) emissions. Therefore, the reduction of emissions and the stabilization of GHG levels in the atmosphere have become urgent tasks for avoiding the worst predicted consequences of climate change. GHG emissions in the construction industry account for a significant part of industrial GHG emissions and are expected to increase consistently. The problem could be successfully addressed with the help of both economic and organizational restrictions, based on enhanced algorithms for calculating and penalizing environmental harm in the building industry. This study aims to quantify the GHG emissions caused by different constructive schemes of RC frameworks in concrete casting. The results show that the proposed methodology allows a comparative analysis of alternative projects in residential housing, taking into account the environmental damage caused by the construction process. The study was carried out in the framework of the Program of flagship university development on the base of Belgorod State Technological University named after V.G. Shoukhov.
NASA Astrophysics Data System (ADS)
de Vito, Rossella; Portoghese, Ivan; Pagano, Alessandro; Fratino, Umberto; Vurro, Michele
2017-12-01
Increasing pressure affects water resources, especially in the agricultural sector, with cascading impacts on energy consumption. This is particularly relevant in the Mediterranean area, which faces significant water scarcity problems, further exacerbated by the crucial economic role of agricultural production. Assessing the sustainability of water resource use is thus essential to preserving ecosystems and maintaining high levels of agricultural productivity. This paper proposes an integrated methodology based on the Water-Energy-Food Nexus to evaluate the multi-dimensional implications of irrigation practices. Three different indices are introduced, based on an analysis of the most influential factors. The methodology is then implemented in a catchment located in Puglia (Italy) and a comparative analysis of the three indices is presented. The results highlight that economic land productivity is a key driver of irrigated agriculture, and that groundwater is highly affordable compared to surface water and is thus often dangerously perceived as freely available.
D'Agostino, Fabio; Del Core, Marianna; Cappello, Simone; Mazzola, Salvatore; Sprovieri, Mario
2015-10-01
Here, we describe the methodologies adopted to ensure that natural seawater, used as "influent water" for the land test, complies with the requirements that must be fulfilled to demonstrate the efficacy of the new ballast water treatment system (BWTS). The new BWTS was located on the coast of SW Sicily (Italy), and the sampled seawater showed bacteria and plankton concentrations two orders of magnitude lower than required. Integrated approaches are described for the preparation of massive cultures of bacteria (Alcanivorax borkumensis and Marinobacter hydrocarbonoclasticus), algae (Tetraselmis suecica), rotifers (Brachionus plicatilis), and crustaceans (Artemia salina) suitable for ensuring that 200 m(3) of water fulfilled the international guidelines of MEPC.174(58)G8. These methodologies allowed us to prepare the "influent water" in good agreement with the guidelines and without the specific problems arising from natural conditions (seasons, weather, etc.) that significantly affect the concentrations of organisms at sea. This approach also offered the chance to reliably run land tests once every two weeks.
Wolfs, Vincent; Villazon, Mauricio Florencio; Willems, Patrick
2013-01-01
Applications such as real-time control, uncertainty analysis and optimization require an extensive number of model iterations. Full hydrodynamic sewer models are not sufficient for these applications due to the excessive computation time, so simplifications are required. A lumped conceptual modelling approach results in much faster calculation. The process of identifying and calibrating the conceptual model structure could, however, be time-consuming. Moreover, many conceptual models lack accuracy, or do not account for backwater effects. To overcome these problems, a modelling methodology was developed which is suited for semi-automatic calibration. The methodology is tested for the sewer system of the city of Geel in the Grote Nete river basin in Belgium, using both synthetic design storm events and long time series of rainfall input. A MATLAB/Simulink(®) tool was developed to guide the modeller through the step-wise model construction, significantly reducing the time required for the conceptual modelling process.
Strategies and Methodologies for Developing Microbial Detoxification Systems to Mitigate Mycotoxins
Zhu, Yan; Hassan, Yousef I.; Lepp, Dion; Shao, Suqin; Zhou, Ting
2017-01-01
Mycotoxins, the secondary metabolites of mycotoxigenic fungi, have been found in almost all agricultural commodities worldwide, causing enormous economic losses in livestock production and severe human health problems. Compared to traditional physical adsorption and chemical reactions, interest in biological detoxification methods that are environmentally sound, safe and highly efficient has increased significantly in recent years. However, researchers in this field have been facing tremendous unexpected challenges and are eager to find solutions. This review summarizes and assesses the research strategies and methodologies in each phase of the development of microbiological solutions for mycotoxin mitigation. These include screening of functional microbial consortia from natural samples, isolation and identification of single colonies with biotransformation activity, investigation of the physiological characteristics of isolated strains, identification and assessment of the toxicities of biotransformation products, purification of functional enzymes, and the application of mycotoxin decontamination to feed/food production. A full understanding and appropriate application of this toolbox should aid the development of novel microbiological solutions for mycotoxin detoxification. PMID:28387743
NASA Astrophysics Data System (ADS)
Miola, Apollonia; Ciuffo, Biagio
2011-04-01
Maritime transport plays a central role in the transport sector's sustainability debate. Its contribution to air pollution and greenhouse gases is significant. An effective policy strategy to regulate air emissions requires their robust estimation in terms of quantification and location. This paper provides a critical analysis of the ship emission modelling approaches and data sources available, identifying their limits and constraints. It classifies the main methodologies on the basis of the approach followed (bottom-up or top-down) for the evaluation and geographic characterisation of emissions. The analysis highlights the uncertainty of results from the different methods. This is mainly due to the level of uncertainty connected with the sources of information that are used as inputs to the different studies. This paper describes the sources of the information required for these analyses, paying particular attention to AIS data and to the possible problems associated with their use. One way of reducing the overall uncertainty in the results could be the simultaneous use of different sources of information. This paper presents an alternative methodology based on this approach. As a final remark, it can be expected that new approaches to the problem together with more reliable data sources over the coming years could give more impetus to the debate on the global impact of maritime traffic on the environment that, currently, has only reached agreement via the "consensus" estimates provided by IMO (2009).
Evaluation of a primary school drug drama project: methodological issues and key findings.
Starkey, F; Orme, J
2001-10-01
This paper describes the impact evaluation of a primary school drug drama project developed by a health promotion service and a theatre's education department in England. The project targeted 10-11 year olds in 41 schools with an interactive drama production and workshop day on the attitudes, choices, decisions and risks of alcohol, tobacco and illegal drug use. Parents were also involved through parents' evenings and watching children's performances. The research consisted of both process evaluation (consultation with pupils, teachers, parents, actors and health promotion staff on the project itself) and impact evaluation, which looked at potential changes in children's knowledge, attitudes and decision-making skills. This paper reports findings of the impact evaluation from six of the schools participating in the project. The impact evaluation consisted of pre- and post-project testing using a 'draw and write' and a problem-solving exercise. These findings suggest that the project had a significant impact on the children's knowledge of the names of specific illegal drugs and on their awareness that alcohol and cigarettes are also drugs, and that it encouraged the children to think in less stereotypical terms about drugs and drug users. The problem-solving exercise, involving decision-making scenarios, showed small but positive trends between pre- and post-project solutions in more than half of the response categories. Methodological difficulties relating to evaluating such a project are discussed.
Development of Methodology for Programming Autonomous Agents
NASA Technical Reports Server (NTRS)
Erol, Kutluhan; Levy, Renato; Lang, Lun
2004-01-01
A brief report discusses the rationale for, and the development of, a methodology for generating computer code for autonomous-agent-based systems. The methodology is characterized as enabling an increase in the reusability of generated code among and within such systems, thereby making it possible to reduce the time and cost of system development. The methodology is also characterized as reducing the incidence of software errors attributable to the human failure to anticipate distributed behaviors caused by the software. A major conceptual problem addressed in the development of the methodology was how to efficiently describe the interfaces between several layers of agent composition using a language that is both familiar to engineers and descriptive enough to specify such interfaces unambiguously.
One Controller at a Time (1-CAT): A mimo design methodology
NASA Technical Reports Server (NTRS)
Mitchell, J. R.; Lucas, J. C.
1987-01-01
The One Controller at a Time (1-CAT) methodology for designing digital controllers for Large Space Structures (LSS's) is introduced and illustrated. The flexible mode problem is first discussed. Next, desirable features of a LSS control system design methodology are delineated. The 1-CAT approach is presented, along with an analytical technique for carrying out the 1-CAT process. Next, 1-CAT is used to design digital controllers for the proposed Space Based Laser (SBL). Finally, the SBL design is evaluated for dynamical performance, noise rejection, and robustness.